
RFC: Oklab in react native #871

Open
AntoineThibi wants to merge 1 commit into main from add-oklab-color-space

Conversation

AntoineThibi

Adding the Oklab color space to React Native

View the rendered RFC


NickGerleman commented Jan 24, 2025

I've been doing some very recent work related to colors (and funnily enough just left a TODO for oklab). We are planning to replace existing ViewConfig processors with native parsing as part of some broader styling changes. facebook/react-native#48913

A general thing I wondered: is this something a designer would ever use? Perceptual color spaces are really neat because Euclidean distance corresponds to visual difference, which would have some good properties for interpolation I think (or special color math), but my assumption has been that it wouldn't often show up in real-world use-cases.
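To illustrate that property, here is a rough TypeScript sketch (not tied to any React Native API; it assumes the colors have already been converted to Oklab coordinates, and the sRGB conversion matrices from Björn Ottosson's reference implementation are omitted):

```ts
// Rough sketch: interpolation and difference directly in Oklab coordinates.

type Oklab = { L: number; a: number; b: number };

// Per-axis linear interpolation; because the space is (close to) perceptually
// uniform, the halfway point also looks like the visual midpoint.
function mixOklab(x: Oklab, y: Oklab, t: number): Oklab {
  return {
    L: x.L + (y.L - x.L) * t,
    a: x.a + (y.a - x.a) * t,
    b: x.b + (y.b - x.b) * t,
  };
}

// Plain Euclidean distance doubles as a perceptual difference metric
// (essentially deltaEOK from CSS Color 4).
function deltaEOK(x: Oklab, y: Oklab): number {
  return Math.hypot(x.L - y.L, x.a - y.a, x.b - y.b);
}

const black: Oklab = { L: 0, a: 0, b: 0 };
const white: Oklab = { L: 1, a: 0, b: 0 };
// The midpoint lands on a perceptual mid-gray (L = 0.5).
console.log(mixOklab(black, white, 0.5), deltaEOK(black, white));
```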

I think some of the proposal is a bit inaccurate about the current state of P3 and color functions. As far as I'm aware, support for these has not landed. Instead we have a global API to treat sRGB colors against P3 primaries, which I really heavily dislike (instead of letting people use P3 primaries like on the web, we added a global vibrant mode on iOS). We are somewhat close to being able to raise the Android minSdkVersion to 26, which would simplify the implementation there as well (since RN could then rely on color longs). I think that may happen this year.

Another gotcha I noticed is that a lot of this is happening at the Fabric layer. Right now, Android's props parsing path is effectively parallel to this, but there is work this half to correct that, and building these primitives in shared C++ is the right long-term path.

AntoineThibi force-pushed the add-oklab-color-space branch from f9a8544 to 97e935c on January 30, 2025 at 08:50

tychota commented Jan 30, 2025

I've been doing some very recent work related to colors (and funnily enough just left a TODO for oklab). We are planning to replace existing ViewConfig processors with native parsing as part of some broader styling changes.

When we met with @cipolleschi in Paris, I told him how I feel that colors in RN were designed with only RGB in mind (see 1 and 2), and how I was torn between:

  • adding yet another layer of complexity on top of color (after @ryanlntn's work on P3) without refactoring, which would lead to less maintainable code
  • doing the refactoring ourselves, which is tricky and would likely conflict with Meta's internal uses

If you plan to improve the parsing and maybe the color data structure yourself, I'm really happy.
Looking forward to seeing if I can somehow help with this, but having Meta rework this internally looks like the proper way to go. We will adapt the implementation (and there is no deadline or client-related short-term need anyway, so I strongly want to take the time to do this right).

A general thing I wondered: is this something a designer would ever use? Perceptual color spaces are really neat because Euclidean distance corresponds to visual difference, which would have some good properties for interpolation I think (or special color math), but my assumption has been that it wouldn't often show up in real-world use-cases.

I don't see designers using this directly in most cases.

For me there are two exceptions:

  • accessibility-focused apps, where you want to ensure contrast is always good and existing polar color models like HSL fall short
  • white-label apps (e.g. variants of a sports app for multiple competitions, where you want the UI to look the same but with a different hue); see the sketch below
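To make the white-label case concrete, a rough sketch (hypothetical values, not an RN API): re-theme by rotating only the hue in Oklch, so perceived lightness and chroma, and therefore contrast, are preserved across variants.

```ts
// Rough sketch: re-theming a brand color by changing only the hue in Oklch.

type Oklch = { L: number; C: number; h: number }; // h in degrees

function rotateHue(color: Oklch, degrees: number): Oklch {
  const h = (((color.h + degrees) % 360) + 360) % 360;
  return { ...color, h };
}

// Hypothetical brand color expressed in Oklch (values made up for the example).
const brandBlue: Oklch = { L: 0.55, C: 0.15, h: 250 };
// Same lightness and chroma, shifted toward green for another competition's theme.
const variant = rotateHue(brandBlue, 120);
```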

But I agree; if we look at today's ecosystem:

  • Photoshop is using Oklab internally to power gradients
  • Tailwind is using Oklab internally to generate their new color scheme

Evil Martians seem to be using Oklab directly (see 3), and if you want, I can try to reach @ai to get feedback on their experience with it at Evil Martians.

I think some of the proposal is a bit inaccurate about the current state of P3 and color functions. As far as I'm aware, support for these has not landed. Instead we have a global API to treat sRGB colors against P3 primaries, which I really heavily dislike (instead of letting people use P3 primaries like on the web, we added a global vibrant mode on iOS). We are somewhat close to being able to raise the Android minSdkVersion to 26, which would simplify the implementation there as well (since RN could then rely on color longs). I think that may happen this year.

Yeah, my understanding is the same:

  • P3 was merged only on iOS
  • the way it works was "fine" for the short term, but it does not look like the proper way to move forward
  • to merge P3 on Android, we need a way to properly check whether wide gamut is supported, and this requires an API level bump

I see Oklab as a good way to support P3:

  • users will define P3 colors using P3 primaries
  • if we are sure P3 is supported (probably all iOS devices, and some Android devices), we use the color as is
  • else we use Oklab internally to do gamut mapping, as browsers do (see 4); see the sketch below
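A rough sketch of that idea follows. The real CSS Color 4 algorithm also involves clipping and a deltaEOK tolerance; here only the chroma-reduction loop is shown, and the in-gamut predicate (which would convert Oklch to sRGB or Display P3 and check that all channels are within [0, 1]) is passed in rather than implemented.

```ts
// Rough sketch: gamut mapping in Oklch by reducing chroma only.
// Lightness and hue are preserved; chroma is binary-searched down until the
// provided `inGamut` predicate accepts the color.

type Oklch = { L: number; C: number; h: number };

function mapToGamut(color: Oklch, inGamut: (c: Oklch) => boolean): Oklch {
  if (inGamut(color)) return color;
  let lo = 0;       // C = 0 is a gray, always inside the gamut
  let hi = color.C; // the original chroma is outside it
  for (let i = 0; i < 20; i++) {
    const mid = (lo + hi) / 2;
    if (inGamut({ ...color, C: mid })) lo = mid;
    else hi = mid;
  }
  return { ...color, C: lo };
}
```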

Another gotcha I noticed is that a lot of this is happening at the Fabric layer. Right now, Android's props parsing path is effectively parallel to this, but there is work this half to correct that, and building these primitives in shared C++ is the right long-term path.

Not sure I understand this part, sorry. I would like to get more context on this to see the implications.
