
How do library and tooling authors keep track of conformance and changes? #247

Open

nclsndr opened this issue Aug 20, 2024 · 5 comments

nclsndr commented Aug 20, 2024

Lately, I reviewed and benchmarked many of the OSS solutions for working with DTF tokens (Tokens Studio, Cobalt/Terrazzo, Design Token Validator from Anima,...). Side note: I focused on web tech, JS and TS.

What struck me was the level of repetition we all suffer from. Every library that parses DTF tokens needs to start by defining cumbersome types and primitives, then parse and implement some sort of visitor pattern for both the tokens and their aliases. I can tell you almost all of us have made mistakes interpreting some decisions, or missed an update from the spec, at some point 🙈.

Taking a step back, I feel like we would all benefit from an npm package that would host the types and the most implementation-agnostic definitions. I wrote the package design-token-format-module in late 2022; at that time it was not clear where it could be used and what kind of APIs it could offer.
With the latest version, targeting the Live Draft spec, I tried to focus on the most non-debatable pieces for anyone to pick from.
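To make the idea concrete, here is a minimal sketch of the kind of implementation-agnostic primitives such a package could export. All names (`DesignToken`, `AliasValue`, `isAliasValue`) are invented for illustration; they are not from design-token-format-module or any published package.

```typescript
// Hypothetical shared primitives for DTF tokens (illustrative names only).

// An alias reference, e.g. "{color.brand.primary}".
type AliasValue = `{${string}}`;

// A token carries an optional $type, a $value (concrete or alias), and
// an optional $description, per the DTCG draft's reserved members.
interface DesignToken<T extends string, V> {
  $type?: T;
  $value: V | AliasValue;
  $description?: string;
}

type ColorToken = DesignToken<"color", string>;
type DimensionToken = DesignToken<"dimension", string>;

// Tiny runtime helper: does this $value look like an alias reference?
function isAliasValue(value: unknown): value is AliasValue {
  return (
    typeof value === "string" && value.startsWith("{") && value.endsWith("}")
  );
}

const token: ColorToken = { $type: "color", $value: "{color.brand.primary}" };
console.log(isAliasValue(token.$value)); // true
console.log(isAliasValue("#ff0000")); // false
```

The value of centralizing even this much is that every tool agrees on the same `$`-prefixed member names and alias syntax, instead of re-deriving them from the prose spec.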

Questions for library/tool authors:

  • How do you envision keeping track of spec changes and conformance in your codebase(s)?
  • Do you feel a shared package of primitives would help?

Question for DTCG committee:
Would it make sense for the design-tokens organization on GitHub to host some repositories with the code implementing the spec for different languages / platforms? If so, I'd be happy to transfer the npm namespace to the community.

drwpow (Contributor) commented Aug 20, 2024

What struck me was the level of repetition we all suffer from.

I’m just asserting my opinion here, but I think this is ideally how Open Source works. Having more implementations, and more options, only benefits users.

Take YAML, for instance. It has a central specification defining how it should work. But the group that wrote that specification isn’t responsible for porting it into every programming language; that’s done by hundreds (if not thousands) of individual developers. The end result is, yes, a lot of “repetition.” But each individual implementation works extremely well for the users it’s serving.

I always view this as a feature, not a bug. Anyone being able to create their own library and open source it is always a great thing. And the community is always benefitted by more open source projects, not fewer. OSS is a lot of work! And there’s a real resource cost to trying to centralize implementation and put the burden on the standards committee, rather than distributing the load across developers, with many people being able to do part of the lifting.

But that may just be my viewpoint—I just don’t see GitHub projects “competing” with one another. I view it all under the same umbrella of collaboration and mutual sharing. That is, as long as it’s open-licensed and there’s not money/profit involved 😉

Taking this step back, I feel like we would all benefit from an npm package that would host the types and the most implementation-agnostic definitions.

I do agree with this though! Though maybe not an npm package, since that only benefits JS/TS tooling (and I’ve seen Rust, Go, and Python DTCG tools, to name a few).

I’d personally love to see an official JSON Schema implementation, hosted by the committee. I think that’s a good balance of:

  1. Low-effort (relatively-speaking)
  2. Beneficial to users and tooling maintainers (TS types could be generated from that, theoretically)
  3. Universal to all programming languages (fits in with the design goal of DTCG being JSON to begin with)

IIRC there are some barriers to this working fully with the current design, but it may be achievable in the near future? I haven’t looked into it in a while, admittedly.
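For a sense of what such an official schema could cover, here is a rough sketch of a per-type schema for a color token. This is purely illustrative and not an official or proposed DTCG schema; the hex pattern, the alias pattern, and the choice of `oneOf` are all assumptions.

```json
{
  "$schema": "https://json-schema.org/draft/2020-12/schema",
  "title": "Color token (illustrative sketch, not an official schema)",
  "type": "object",
  "required": ["$value"],
  "properties": {
    "$type": { "const": "color" },
    "$value": {
      "oneOf": [
        {
          "type": "string",
          "pattern": "^#([0-9a-fA-F]{6}|[0-9a-fA-F]{8})$",
          "description": "concrete hex color value"
        },
        {
          "type": "string",
          "pattern": "^\\{[^}]+\\}$",
          "description": "alias reference to another token"
        }
      ]
    },
    "$description": { "type": "string" }
  }
}
```

TypeScript types could then be generated from fragments like this (e.g. via json-schema-to-typescript), which is what makes the schema-first approach language-neutral.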

nclsndr (Author) commented Aug 20, 2024

Thanks for the input, @drwpow

Having more implementations, and more options, only benefits users.

Completely agree. What I miss as a DTF tool developer is a source of truth in code that I could read, and that would help me understand spec changes faster than reading the “human” version. Having the definitions available through a package manager is the cherry on the cake.

Though maybe not an npm package, since that only benefits JS/TS tooling

My first thought was to have a repo per language; JS/TS was my starting point.

I’d love to personally see an official JSONSchema implementation

That's exactly my next step for the library; I'll explore it over the coming week.

nclsndr (Author) commented Aug 21, 2024

@drwpow, an update on the JSON Schema investigation with this PR

On the bright side:
I managed to get decent JSON Schemas for each type and their values. Validation passes with aliasing for both top-level and nested refs.

On the downside, I don't envision a way to generate the JSON Schema for the whole tree.
The issue is quite simple: we don't have a discriminator - like $type, if it were required - while traversing the token tree. Hence, it becomes easy to produce false positives.

So, I feel like the JSON Schema is a step that covers only the bare definitions.
Each language will still need to build up the validation business rules for:

  • $type resolution
  • alias value resolution
  • alias from/to type mapping (e.g. border.color points to a color token)
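To illustrate the kind of business rules that stay outside the schema, here is a minimal sketch of alias value resolution over a token tree, with cycle detection. The tree shape and function names are assumptions for illustration; $type inheritance from groups and from/to type checking are omitted to keep it short.

```typescript
// Illustrative alias resolution over a DTF-like token tree (not a spec API).

type Node = { $type?: string; $value?: unknown; [key: string]: unknown };

// Walk a dot-separated path ("border.color") down the tree.
function getNode(tree: Node, path: string): Node | undefined {
  return path.split(".").reduce<any>((n, key) => (n ? n[key] : undefined), tree);
}

// Follow "{a.b}" alias values to their final concrete value,
// throwing on unknown tokens and circular references.
function resolveValue(
  tree: Node,
  path: string,
  seen: Set<string> = new Set()
): unknown {
  if (seen.has(path)) throw new Error(`Circular alias: ${path}`);
  seen.add(path);
  const node = getNode(tree, path);
  if (!node || !("$value" in node)) throw new Error(`Unknown token: ${path}`);
  const value = node.$value;
  if (typeof value === "string" && value.startsWith("{") && value.endsWith("}")) {
    return resolveValue(tree, value.slice(1, -1), seen);
  }
  return value;
}

const tree: Node = {
  color: { brand: { $type: "color", $value: "#336699" } },
  border: { color: { $type: "color", $value: "{color.brand}" } },
};

console.log(resolveValue(tree, "border.color")); // "#336699"
```

None of this traversal logic is expressible in plain JSON Schema, which is exactly why each language ends up re-implementing it.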

romainmenke (Contributor) commented Aug 22, 2024

Is a shared test suite something that could help?

Each test could consist of:

  • a tokens json file
  • the address/id of the token that is the focus of the test
  • one or more expected representations of that token

Then tools could run their implementation against the test suite and check the result of parsing against the expected representation that matches their domain.
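A single entry in such a suite might look like the sketch below. The fixture shape (`tokens` / `focus` / `expected` keyed by domain) is invented here to illustrate the proposal, not an existing format.

```typescript
// Hypothetical shared-test-suite entry and a minimal consumer (illustrative).

interface TokenTestCase {
  tokens: Record<string, unknown>; // contents of the tokens JSON file
  focus: string; // address/id of the token under test
  expected: Record<string, unknown>; // expected representations, keyed by domain
}

const testCase: TokenTestCase = {
  tokens: { color: { brand: { $type: "color", $value: "#336699" } } },
  focus: "color.brand",
  expected: {
    raw: { $type: "color", $value: "#336699" },
    css: { value: "#336699" },
  },
};

// A tool would parse `tokens`, look up `focus`, and compare its own output
// against the expected representation matching its domain.
function lookup(tokens: any, path: string): any {
  return path.split(".").reduce((n, k) => n?.[k], tokens);
}

const parsed = lookup(testCase.tokens, testCase.focus);
console.log(parsed.$value === (testCase.expected.raw as any).$value); // true
```

Keying `expected` by domain lets a CSS-oriented tool and a raw-parser check against the same fixture while asserting only the representation relevant to it.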


For all things web we have the web platform tests.

For my own work on CSS tooling I also try to publish and maintain shared tests so that other packages can more easily have correct implementations. (e.g. https://github.com/romainmenke/css-tokenizer-tests)

Such shared tests reduce the amount of work that each project needs to do and make it easier to have interop.

nclsndr (Author) commented Aug 26, 2024

Is a shared test suite something that could help?

That would definitely help. I was thinking about a simple seed at first, but its benefit was quite small.
A suite structured the way you propose would produce far more value for users.
I'll think about it moving forward; any help is welcome.

I didn't know about the Web Platform Tests initiative! Thanks for sharing.
Even though design tokens are pretty far from being native browser features, it's still a source of inspiration.
