A color library, built for vision.
CSS Color 4 parsing, 21 perceptual spaces, OKLCh manipulation, color scales, built-in palettes, blend modes, wide-gamut support — plus Brettel/Viénot CVD simulation, Fidaner daltonization, palette distinguishability, WCAG 2.1 and APCA contrast.
Type a color.
Any CSS string, named color, hex, rgb, hsl, oklch — whatever
swatch() accepts. Every panel below tracks this value
live; the URL stays in sync so you can share state.
Through other eyes.
Roughly 1 in 12 men and 1 in 200 women have some form of color vision deficiency. Most are missing (or have a shifted version of) one of the three cone types in the retina, which collapses the normal three-dimensional color space down to two dimensions — a whole family of colors that look different to you all map to the same sensation for them.
Viénot, Brettel and Mollon (1999) worked out exactly where that collapse happens: for each deficiency, all the colors a dichromat can't tell apart lie along straight lines in LMS cone space, and every line hits a shared "confusion plane". To simulate what the dichromat sees, you project the original color onto that plane — keeping only the information their eyes could actually recover. Drag the severity slider to walk the anomalous-trichromat continuum from full color (0) to full dichromacy (1), covering the milder shifted-cone cases in between.
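The projection above can be sketched in a few lines. This is an illustrative implementation using the matrices commonly cited from the Viénot 1999 paper, not necessarily the library's exact numbers: convert linear sRGB to LMS cone space, replace the missing cone's response with its projection onto the confusion plane, convert back, and interpolate by severity.

```typescript
// Viénot–Brettel–Mollon (1999) dichromacy simulation, sketched.
// Works on linear-light sRGB triples in 0..1.

type RGB = [number, number, number];

const RGB2LMS = [
  [17.8824, 43.5161, 4.11935],
  [3.45565, 27.1554, 3.86714],
  [0.0299566, 0.184309, 1.46709],
];
const LMS2RGB = [
  [0.0809444479, -0.130504409, 0.116721066],
  [-0.0102485335, 0.0540193266, -0.113614708],
  [-0.000365296938, -0.00412161469, 0.693511405],
];

function mul(m: number[][], v: RGB): RGB {
  return [0, 1, 2].map(
    i => m[i][0] * v[0] + m[i][1] * v[1] + m[i][2] * v[2]) as RGB;
}

// severity 0 = normal trichromat, 1 = full dichromat; values in between
// interpolate along the anomalous-trichromat continuum.
function simulate(rgb: RGB, kind: "protan" | "deutan", severity = 1): RGB {
  const [l, m, s] = mul(RGB2LMS, rgb);
  // Replace the missing cone's signal with its confusion-plane projection;
  // the surviving coordinates pass through untouched.
  const lms: RGB =
    kind === "protan"
      ? [2.02344 * m - 2.52581 * s, m, s]
      : [l, 0.494207 * l + 1.24827 * s, s];
  const sim = mul(LMS2RGB, lms);
  return rgb.map((c, i) =>
    Math.min(1, Math.max(0, c + severity * (sim[i] - c)))) as RGB;
}
```

Note that the projection is built so the achromatic axis survives: white simulates to white, while a pure red collapses onto the red–green confusion line.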
Restore lost information.
If you're building a "color blind mode" for an app or game,
these are the colors you'd actually put on screen.
daltonize() rewrites a color so the distinctions
a CVD viewer would normally miss get pushed into channels they
can still see — reds that would vanish for a deutan, for
instance, shift toward yellows and blues.
The table reads left to right per row. Original (sim) is the problem — what a CVD viewer currently sees with your unmodified color. Daltonized is the corrected color you display instead. Daltonized (sim) is the proof — what the CVD viewer actually sees after correction. It should look more distinct than the first column, meaning the lost information was recovered.
The math is Onur Fidaner's error redistribution: compute what the dichromat loses (original minus simulated), then add a weighted slice of that lost signal back into the surviving channels.
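That redistribution step can be sketched directly. The function below takes the original color and its CVD simulation (for example, from a Viénot-style projection) and shifts the lost signal into the surviving channels; the 0.7 weights match Fidaner's original deutan correction matrix, though a real implementation may tune them per deficiency.

```typescript
// Fidaner-style error redistribution, sketched on linear sRGB in 0..1.

type RGB = [number, number, number];

function daltonize(original: RGB, simulated: RGB): RGB {
  // What the dichromat loses: the per-channel simulation error.
  const err = original.map((c, i) => c - simulated[i]);
  // Redistribute: push the lost red signal into green and blue,
  // where a deutan viewer still has discrimination.
  const shift: RGB = [
    0,                     // red channel: nothing added back
    0.7 * err[0] + err[1], // green absorbs most of the lost red
    0.7 * err[0] + err[2], // blue absorbs the rest
  ];
  return original.map((c, i) =>
    Math.min(1, Math.max(0, c + shift[i]))) as RGB;
}
```

If simulation loses nothing (original equals simulated), the color passes through unchanged; the more the simulation diverges, the harder the correction pushes.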
WCAG 2.1 and APCA.
Live ratio, AA/AAA × normal/large/UI badges, and Andrew Somers' signed APCA Lc. Hit auto-fix to nudge the foreground until it clears your target ratio while preserving hue.
The quick brown fox
jumps over the lazy dog. 1234567890 — Pangram.
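The WCAG 2.1 side of the badge logic is compact enough to sketch straight from the spec: compute relative luminance for each color, then take the ratio with the 0.05 flare terms.

```typescript
// WCAG 2.1 contrast ratio, per the spec's relative luminance definition.
// Inputs are gamma-encoded sRGB triples in 0..1.

type RGB = [number, number, number];

function relativeLuminance([r, g, b]: RGB): number {
  const lin = (c: number) =>
    c <= 0.03928 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
  return 0.2126 * lin(r) + 0.7152 * lin(g) + 0.0722 * lin(b);
}

function contrastRatio(a: RGB, b: RGB): number {
  const [hi, lo] = [relativeLuminance(a), relativeLuminance(b)]
    .sort((x, y) => y - x);
  return (hi + 0.05) / (lo + 0.05); // always >= 1, max 21
}

// AA requires 4.5:1 for normal text (3:1 for large text and UI
// components); AAA requires 7:1 (4.5:1 large).
```

APCA is a separate, signed model (polarity matters: dark-on-light and light-on-dark score differently) and is not shown here.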
Lighten, saturate, spin.
Immutable OKLCh-space adjustments. Lighten and saturate amounts are in perceptual lightness (L) and chroma (C) units on a 0–1 scale; spin rotates hue in degrees. Click promote to lift the derived color into the root and keep working from there.
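The immutable-adjustment pattern looks like this on a bare OKLCh record. Function names here are illustrative, not the library's API, and the sRGB round-trip is omitted; the point is that every operation returns a fresh color and never mutates its input.

```typescript
// Immutable adjustments on an OKLCh record: l is 0..1 lightness,
// c is chroma, h is hue in degrees.

interface OKLCh { l: number; c: number; h: number; }

const clamp01 = (x: number) => Math.min(1, Math.max(0, x));

const lighten = (col: OKLCh, amt: number): OKLCh =>
  ({ ...col, l: clamp01(col.l + amt) });

const saturate = (col: OKLCh, amt: number): OKLCh =>
  ({ ...col, c: Math.max(0, col.c + amt) });

const spin = (col: OKLCh, deg: number): OKLCh =>
  ({ ...col, h: ((col.h + deg) % 360 + 360) % 360 }); // wraps both ways
```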
Mix toward white, black, or grey.
Tint mixes the root color toward white in OKLab; shade toward black; tone toward mid-grey. The amount runs from 0 (unchanged) to 1 (fully mixed). Click any chip to copy its hex.
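Mechanically, all three are the same straight-line interpolation in OKLab toward an anchor on the neutral axis. A minimal sketch on bare OKLab records, with mid-grey taken as L = 0.5 (an assumption for illustration, not necessarily the library's anchor):

```typescript
// Tint / shade / tone as linear interpolation in OKLab.

interface OKLab { L: number; a: number; b: number; }

function mix(from: OKLab, to: OKLab, amount: number): OKLab {
  const t = Math.min(1, Math.max(0, amount)); // 0 = unchanged, 1 = fully mixed
  return {
    L: from.L + t * (to.L - from.L),
    a: from.a + t * (to.a - from.a),
    b: from.b + t * (to.b - from.b),
  };
}

// Anchors all sit on the neutral axis (a = b = 0).
const WHITE: OKLab = { L: 1, a: 0, b: 0 };
const BLACK: OKLab = { L: 0, a: 0, b: 0 };
const GREY:  OKLab = { L: 0.5, a: 0, b: 0 };

const tint  = (c: OKLab, amt: number) => mix(c, WHITE, amt);
const shade = (c: OKLab, amt: number) => mix(c, BLACK, amt);
const tone  = (c: OKLab, amt: number) => mix(c, GREY, amt);
```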
Compositing blend modes.
W3C Compositing and Blending Level 1 modes, applied per-channel in linear sRGB. Pick a mode and a second color to see the result.
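The separable modes from that spec are simple per-channel functions B(Cb, Cs) of backdrop and source. A few of them, sketched on linear sRGB triples:

```typescript
// Separable blend modes from W3C Compositing and Blending Level 1,
// applied per channel. cb = backdrop, cs = source, both in 0..1.

type RGB = [number, number, number];
type BlendFn = (cb: number, cs: number) => number;

const modes: Record<string, BlendFn> = {
  normal:   (_cb, cs) => cs,
  multiply: (cb, cs) => cb * cs,
  screen:   (cb, cs) => cb + cs - cb * cs,
  // overlay is hard-light with the operands swapped
  overlay:  (cb, cs) =>
    cb <= 0.5 ? 2 * cb * cs : 1 - 2 * (1 - cb) * (1 - cs),
  darken:   (cb, cs) => Math.min(cb, cs),
  lighten:  (cb, cs) => Math.max(cb, cs),
};

function blend(backdrop: RGB, source: RGB, mode: string): RGB {
  return backdrop.map((cb, i) => modes[mode](cb, source[i])) as RGB;
}
```

The non-separable modes (hue, saturation, color, luminosity) operate on the whole color rather than channel-by-channel and are not shown.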
Sets that belong together.
Click any chip to copy its hex; shift-click to promote it to the root color.
Continuous color ramps.
Enter comma-separated color stops, or pick a built-in palette. The scale interpolates in the chosen color space and renders chips across the domain. Click any chip to copy its hex.
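Under the hood this kind of scale is piecewise interpolation over the stop list. A minimal sketch, generic over whatever space the stops are expressed in (the interpolation is just done channel-by-channel on the triples you hand it):

```typescript
// A color scale as piecewise-linear interpolation over stops.
// Stops are channel triples in the chosen space; t runs over 0..1.

type Triple = [number, number, number];

function scale(stops: Triple[], t: number): Triple {
  const x = Math.min(1, Math.max(0, t)) * (stops.length - 1);
  const i = Math.min(stops.length - 2, Math.floor(x));
  const f = x - i; // fractional position between stop i and stop i+1
  return stops[i].map((c, k) => c + f * (stops[i + 1][k] - c)) as Triple;
}

// Render n evenly spaced chips across the domain.
const chips = (stops: Triple[], n: number): Triple[] =>
  Array.from({ length: n }, (_, i) => scale(stops, i / (n - 1)));
```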
What's it called?
The nearest CSS named color by CIEDE2000 distance. Useful for quick labels, accessibility descriptions, and color communication.
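The lookup itself is a nearest-neighbor search over the named palette. A sketch over a tiny illustrative subset of the CSS names, using ΔE OK (Euclidean distance in OKLab, via Björn Ottosson's published conversion matrices) as a simpler stand-in for the CIEDE2000 metric described above:

```typescript
// Nearest named color by perceptual distance (illustrative subset).

type RGB = [number, number, number]; // gamma-encoded sRGB, 0..1

function srgbToOklab([r, g, b]: RGB): [number, number, number] {
  const lin = (c: number) =>
    c <= 0.04045 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
  const [R, G, B] = [lin(r), lin(g), lin(b)];
  const l = Math.cbrt(0.4122214708 * R + 0.5363325363 * G + 0.0514459929 * B);
  const m = Math.cbrt(0.2119034982 * R + 0.6806995451 * G + 0.1074417512 * B);
  const s = Math.cbrt(0.0883024619 * R + 0.2817188376 * G + 0.6299787005 * B);
  return [
    0.2104542553 * l + 0.793617785 * m - 0.0040720468 * s,
    1.9779984951 * l - 2.428592205 * m + 0.4505937099 * s,
    0.0259040371 * l + 0.7827717662 * m - 0.808675766 * s,
  ];
}

const NAMED: Record<string, RGB> = { // a handful of the 148 CSS names
  red: [1, 0, 0], orange: [1, 165 / 255, 0], yellow: [1, 1, 0],
  green: [0, 128 / 255, 0], blue: [0, 0, 1],
  white: [1, 1, 1], black: [0, 0, 0],
};

function nearestName(rgb: RGB): string {
  const target = srgbToOklab(rgb);
  let best = "", bestD = Infinity;
  for (const [name, value] of Object.entries(NAMED)) {
    const lab = srgbToOklab(value);
    const d = Math.hypot(...lab.map((c, i) => c - target[i]));
    if (d < bestD) { bestD = d; best = name; }
  }
  return best;
}
```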
Blackbody color.
Drag the slider to generate a color from a blackbody radiator at a given Kelvin temperature (Krystek 1985). The inverse reading shows the estimated correlated color temperature of the current root color (McCamy 1992).
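The inverse reading is the simpler of the two to show: McCamy's approximation is a single cubic in the (x, y) chromaticity of the color, reasonable roughly between 2,000 K and 12,500 K. (The forward direction, Kelvin to color, uses Krystek's 1985 rational fit and is not sketched here.)

```typescript
// McCamy (1992): correlated color temperature from CIE 1931 (x, y).

function cctMcCamy(x: number, y: number): number {
  const n = (x - 0.332) / (0.1858 - y); // inverse-slope line parameter
  return 449 * n ** 3 + 3525 * n ** 2 + 6823.3 * n + 5520.33; // Kelvin
}
```

As a sanity check, D65 white (x = 0.3127, y = 0.3290) comes out near its nominal 6,500 K, and Illuminant A (x = 0.4476, y = 0.4074) near its defined 2,856 K.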
Distinguishable under CVD.
A palette is only useful if every color stays visibly different from every other — including for viewers with color vision deficiency (CVD). Two swatches that look obviously distinct to you can collapse into near-identical greys for a deutan or protan, quietly breaking charts, legends, and status indicators.
Drop one color per line, pick the deficiency you're worried
about, and set a minimum ΔE (perceptual distance — anything
below ~10 tends to read as "the same color at a glance"). Swatch
simulates the palette under that deficiency and flags any pair
that falls under the threshold. Each unsafe pair gets a nudge
button wired to nearestDistinguishable, which walks
one color away in OKLCh until the gap clears.
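The flagging pass is plain pairwise logic once simulation and distance are in hand. A sketch with both plugged in as parameters (stand-ins for the library's CVD simulation and ΔE metric):

```typescript
// Flag palette pairs that collapse below a ΔE threshold under simulation.

type Color = [number, number, number];

function unsafePairs(
  palette: Color[],
  simulate: (c: Color) => Color,          // CVD simulation to test under
  deltaE: (a: Color, b: Color) => number, // perceptual distance metric
  minDeltaE: number,
): [number, number][] {
  const sims = palette.map(simulate);
  const flagged: [number, number][] = [];
  for (let i = 0; i < sims.length; i++)
    for (let j = i + 1; j < sims.length; j++)
      if (deltaE(sims[i], sims[j]) < minDeltaE) flagged.push([i, j]);
  return flagged; // index pairs that need fixing (or nudging)
}
```

The nudge step then takes one color of each flagged pair and walks it through OKLCh until the simulated distance clears the threshold.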
How far apart?
ΔE ("delta E") is a single number that answers "how different do these two colors look?" — not in RGB units, but in units roughly calibrated to human perception. A ΔE of 1 is the smallest difference a trained eye can reliably spot; 2–3 is noticeable; 10+ is unmistakably a different color.
Six modes, three eras. ΔE 76 is the original 1976 formula — Euclidean distance in Lab, fast but crude around blues. CIE94 adds weighting for lightness and chroma. CIEDE2000 is the industry standard with full hue-rotation correction. CMC is its textile-industry sibling. HyAB (Abasi & Fairchild) uses a hybrid L1+L2 metric for better large-difference ranking. ΔE OK is the modern take — plain Euclidean distance in OKLab, which was designed to make that simple formula perceptually uniform.
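The two Euclidean variants bookending that list are short enough to write out: ΔE 76 is straight-line distance in CIE Lab (where L runs 0–100), and ΔE OK is the identical formula in OKLab, where the space itself carries the perceptual correction.

```typescript
// ΔE 76 and ΔE OK: the same Euclidean formula, different spaces.

type Lab = [number, number, number];

const deltaE76 = ([L1, a1, b1]: Lab, [L2, a2, b2]: Lab): number =>
  Math.hypot(L1 - L2, a1 - a2, b1 - b2);

// Identical math; only the input space differs. OKLab L is on a 0..1
// scale, so ΔE OK values come out roughly 100× smaller than ΔE 76.
const deltaEOK = deltaE76;
```

The heavier metrics (CIE94, CIEDE2000, CMC, HyAB) all start from these same Lab differences and add weighting or hue-rotation terms on top.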