[For Developers] Connecting Front-end Developers and Designers with the Token System

March 15
# Tech
# Front-End
# Token System

Intro

The earlier article What is the Token System discussed what a token system is, including the definition of tokens and how to organize them to meet our customization needs.

A token system is an approach to abstracting, structuring, and organizing visual styles using tokens. It has three layers: reference tokens, system tokens, and component tokens, organized hierarchically to allow customization at different levels. This structure lets us safely and quickly modify any layer to meet design needs without side effects.
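As a sketch of how the three layers chain together, a token file might look like this (the token names and alias syntax are illustrative, not the actual files from this project):

```json
{
  "ref": { "palette": { "blue-500": { "value": "#1677ff" } } },
  "sys": { "color": { "primary": { "value": "{ref.palette.blue-500}" } } },
  "button": { "color-bg": { "value": "{sys.color.primary}" } }
}
```

A change to `ref.palette.blue-500` propagates to the system and component layers, while `button.color-bg` can still be overridden on its own without touching the layers above it.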

The previous article introduced how designers can use tokens in their workflow. This article will focus on the developer's perspective, and explain how to integrate tokens into the development process.

Code Structure

First, let's take a look at the code structure. The repository is available on GitHub.

ezreal-ui

This is a monorepo, and pnpm is used to manage dependencies.

There are two packages:

  • token (@ezreal-ui/token): stores the token files and compilation logic.
  • react (@ezreal-ui/react): a demo component library used to showcase how the tokens are consumed.

I named this demo Ezreal because he is my favorite character in League of Legends. 😀

I will introduce the purpose of these two packages in the following sections.

Token Compilation

First, let's focus on the @ezreal-ui/token package (referred to as the style package below), which stores token files and compilation logic.

Let's review the workflow of the development phase:

DEV Stage CI&CD Workflow

We have two things to do:

  • Build and publish the style package via CI/CD after UED pushes token file changes to the remote repository.
  • Update the style package locally after running git pull when the token files have changed.

This article will focus on token compilation, namely how to convert tokens into style files that can be used by components. The next article will discuss how to automate these steps and link them together through CI/CD.

Prerequisites

Style Dictionary

Before diving into the token compilation, let's take a look at a tool library for token compilation, Style Dictionary.

Style Dictionary is a build system that allows you to define styles once, in a way for any platform or language to consume. A single place to create and edit your styles, and a single command exports these rules to all the places you need them - iOS, Android, CSS, JS, HTML, sketch files, style documentation, or anything you can think of.

Using Style Dictionary, we can transform our design system's style and color definitions into code for multiple platforms and languages, speeding up development and keeping the design system consistent.

Style Dictionary provides many useful APIs for manipulating tokens, such as:

  • extend(config): creates a fully initialized Style Dictionary instance from a configuration object or configuration file path.
  • registerTransform(transform): registers a custom transform that modifies token names, attributes, or values.
  • registerFormat(format): registers a custom format that controls how output files are rendered.
  • buildAllPlatforms(): builds the output for every platform defined in the configuration.
  • ...

Here's a simple example that shows how to use Style Dictionary to compile tokens into CSS.

Assuming we have a token file like this:

Simple Example

We can extend it with the StyleDictionary.extend API:

Simple Example

We can transform the values of some tokens with the StyleDictionary.registerTransform API:

Simple Example

We can compile tokens to CSS with the StyleDictionary.registerFormat and buildAllPlatforms APIs:

Simple Example

Style Dictionary's API is powerful, and I won't cover every method one by one; you can refer to the documentation for more information. In general, Style Dictionary lets us modify, format, and compile tokens however we need.
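To make the transform and format concepts concrete, here are standalone, simplified versions of the two kinds of callbacks. The names and token shapes are hypothetical; in the real pipeline these would be registered via StyleDictionary.registerTransform and StyleDictionary.registerFormat rather than called directly.

```javascript
// A "value" transform: append a px unit to a bare numeric size token.
function sizePxTransformer(token) {
  return `${token.value}px`;
}

// A format: render a flat list of tokens as CSS custom properties.
function cssVariablesFormatter(tokens) {
  const lines = tokens.map((t) => `  --${t.name}: ${t.value};`);
  return `:root {\n${lines.join('\n')}\n}\n`;
}

const demoTokens = [
  { name: 'color-primary', value: '#1677ff' },
  { name: 'spacing-sm', value: sizePxTransformer({ value: 8 }) },
];

console.log(cssVariablesFormatter(demoTokens));
```

In the real build, Style Dictionary resolves aliases and walks the token tree before such callbacks run; the sketch only shows the shape of the transformation.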

Build Style Files

In the previous article, the section For UED explains how tokens fit into UEDs' workflow, and the section Sync Tokens mentions that after UED completes the design, the tokens are pushed to a remote repository such as GitHub.

Setting of the Figma Plugin

There is a field called Branch, which refers to a Git branch; the diagram uses the branch feat/alpha-publish. UED and FE should work on the same branch for design and development so that changes made by both sides stay synchronized.

There is another field called File Path, which specifies where the plugin stores the token files when it pushes them to GitHub.

Thus, when UED uses the plugin configured as above to push token files, the files will be pushed to packages/token/src/tokens/cache/cache.json on the feat/alpha-publish branch.

Because the file path points to a single file, the plugin stores and pushes all tokens in one JSON file named cache.json. Although the plugin supports exporting to multiple files, that feature is only available in the pro version.

Now that we have the token files, let's take a look at how to compile them with Style Dictionary when UED pushes changes.

Here is what we need to do, including:

  1. Splitting files
  2. Transforming values
  3. Handling details
  4. Generating style files

1. Splitting Files

Since we have exported all tokens to a single file named cache.json, it would be better to split them into a clear structure for easier token handling.

The token files will be split into two categories: component token files and core token files. The component token folder contains all component token files, while the core token folder contains all reference and system token files:

token(@ezreal-ui/token)

Simply put, the splitting logic is as follows:

  1. Read cache.json
  2. Clear previous splitting results
  3. Generate the core directory and generate the sys.json and ref.json token files based on the content read from cache.json
  4. Generate the component directory and generate token files for each component based on the content read from cache.json.

The complete splitting logic can be viewed in the code.
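The steps above can be sketched as follows. This is a simplified stand-in for the real script: the key names and file layout are assumptions, and the file writes are reduced to comments.

```javascript
// Split the raw cache.json content into core and component groups.
function splitTokens(cache) {
  const core = {};
  const components = {};
  for (const [key, group] of Object.entries(cache)) {
    if (key === 'ref' || key === 'sys') {
      core[key] = group; // would become core/ref.json or core/sys.json
    } else {
      components[key] = group; // would become component/<key>.json
    }
  }
  return { core, components };
}

// A tiny cache.json stand-in (token names are hypothetical).
const cache = {
  ref: { palette: { 'blue-500': { value: '#1677ff' } } },
  sys: { color: { primary: { value: '{ref.palette.blue-500}' } } },
  button: { 'color-bg': { value: '{sys.color.primary}' } },
};

const { core, components } = splitTokens(cache);
```

The real script additionally clears the previous results before writing, so stale files from removed components don't linger.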

2. Transforming Values

Although we now have the token files split up, some of the values in them cannot be used directly. We need the StyleDictionary.registerTransform API mentioned earlier to perform a few transforms, mainly:

  1. Adding units, such as px, to some numeric values
  2. Standardizing color models, using RGB uniformly
  3. Handling colors for dark and light modes.
  4. ...
Transforming Values Code Snippet

You can find the code here. It makes a series of StyleDictionary.registerTransform calls to perform the various transform operations.
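As a rough illustration of the first two transforms, here are simplified stand-ins (not the package's actual code):

```javascript
// 1. Add a px unit to bare numbers; leave other values untouched.
function addPxUnit(value) {
  return typeof value === 'number' ? `${value}px` : value;
}

// 2. Normalize a hex color to rgb() notation.
function hexToRgb(hex) {
  const n = parseInt(hex.replace('#', ''), 16);
  return `rgb(${(n >> 16) & 0xff}, ${(n >> 8) & 0xff}, ${n & 0xff})`;
}

console.log(addPxUnit(8)); // 8px
console.log(hexToRgb('#1677ff')); // rgb(22, 119, 255)
```

The dark/light handling follows the same pattern: a matcher selects color tokens, and the transformer rewrites the value per mode.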

3. Handling Details

After transforming values, some details need to be handled, including:

  1. Add headers to each automatically generated style file to indicate that these files should not be manually modified.
  2. Split the final generated reference and system style files by token type, such as color and spacing.
  3. Add prefixes to all the style variables generated based on tokens.
Handling Details Code Snippet

You can find the code here.
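A minimal sketch of these details (the header text and prefix are hypothetical, not the repository's exact strings):

```javascript
// Header prepended to every generated file to warn against manual edits.
const FILE_HEADER = [
  '// Do not edit directly.',
  '// This file is generated from the token files by the build script.',
].join('\n');

function withHeader(content) {
  return `${FILE_HEADER}\n\n${content}`;
}

// Prefix every generated variable name to avoid collisions.
function prefixVariable(name, prefix = 'ezreal') {
  return `--${prefix}-${name}`;
}

console.log(withHeader(`${prefixVariable('button-color-bg')}: rgb(22, 119, 255);`));
```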

4. Generating Style Files

After completing all the preliminary work above, we can start generating the style files. We chose Less as the format for the final exported style files because it provides powerful capabilities, such as variables and functions.

Additionally, the layering of the style files follows the token hierarchy structure we discussed earlier, namely reference, system, and component.

Generating Style Files Code Snippet

You can find the code here. packages/token/src/scripts/style-dictionary.ts is the entry point of our program.
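Conceptually, the emission step boils down to rendering each flattened token as a Less variable, something like the following (a simplified stand-in; in the package this runs inside a registered Style Dictionary format):

```javascript
// Render flattened tokens as Less variable declarations.
function toLess(tokens, prefix = 'ezreal') {
  return tokens.map((t) => `@${prefix}-${t.name}: ${t.value};`).join('\n') + '\n';
}

const refTokens = [
  { name: 'color-blue-500', value: 'rgb(22, 119, 255)' },
  { name: 'spacing-sm', value: '8px' },
];

console.log(toLess(refTokens));
// @ezreal-color-blue-500: rgb(22, 119, 255);
// @ezreal-spacing-sm: 8px;
```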

At this point, we have completed the conversion from tokens to style files. Additionally, to make it easier for users to import the styles, I added an index.less file and copied all these files to the dist directory. For more details, please refer to the code.
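The aggregated index.less might look roughly like this (the paths are illustrative, not the repository's exact layout):

```less
// index.less (illustrative): re-export every generated layer
@import './core/ref.less';
@import './core/sys.less';
@import './component/button.less';
```

Consumers then only need a single import to pull in every layer.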

index.less

You can find all scripts in the package.json file.

Package.json

Token Usage

Now that we have obtained the style files converted from tokens, let's take a look at how to use them.

Let's check out the package.json files of these two packages.

Package.json

@ezreal-ui/token specifies its exported content through the exports field. Because we are using a monorepo, @ezreal-ui/react can reference the sibling package @ezreal-ui/token by declaring it as workspace:*.
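For illustration, the relevant package.json fields might look something like this (the actual values in the repository may differ):

```json
{
  "name": "@ezreal-ui/token",
  "exports": {
    ".": "./dist/index.less",
    "./dist/*": "./dist/*"
  }
}
```

The react package then lists `"@ezreal-ui/token": "workspace:*"` in its dependencies, so pnpm links the local package during development and replaces the specifier with a real version on publish.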

Let's take a look at the structure of @ezreal-ui/react:

Project Structure

Since we are mainly focused on styles, let's take a look at the contents of index.less under Button directory.

Button Style

We use the @import at-rule provided by Less to import @ezreal-ui/token, which gives us access to all the variables it defines. These include common variables, such as prefix and ezreal-vars-prefix, as well as variables generated from the tokens, such as --ezreal-button-outlined-color-bg-disabled and --ezreal-button-outlined-padding-y.

In addition, we use the each function provided by Less to generate style code in a loop, which reduces repetition and improves coding efficiency. You can refer to the Less documentation to become familiar with these functions.
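As a sketch of the pattern (the selectors, variable names, and token names here are hypothetical, not the Button component's actual code):

```less
@import '@ezreal-ui/token';

// Generate one rule per status from a list of hypothetical status names.
@statuses: success, warning, error;

each(@statuses, {
  .@{prefix}-button-@{value} {
    background-color: ~"var(--@{ezreal-vars-prefix}-color-@{value})";
  }
});
```

Each iteration binds @value to the current list item, so adding a status to the list automatically produces its rule.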

In the End

In conclusion, we have discussed how to integrate tokens into the development process from the developer's perspective. We have also covered important details about how to use Style Dictionary to manipulate tokens, such as transforming values, adding headers, and generating style files. Finally, we briefly demonstrated how to use the generated style files in a component library.

By using tokens within a design system, developers can greatly improve their workflow and increase consistency across projects. A token system also helps with maintaining and updating styles, and makes it easier to collaborate with other developers and designers.

In the next article, we will focus on using automation to link and automate the compilation, testing, publishing, and updating of tokens.

Related Reads