How come certain random strings produce various colors when entered as background colors in HTML? For example:
...produces a document with a red background across all browsers and platforms.
Interestingly, while chucknorri produces a red background as well, chucknorr produces a yellow background.
What's going on here?
The WHATWG HTML spec has the exact algorithm for parsing a legacy color value:
The code Netscape Classic used for parsing color strings is open source:
For example, notice that each character is parsed as a hex digit and then shifted into a 32-bit integer without any overflow check. Only eight hex digits fit in a 32-bit integer, which is why only the last eight characters of each component are considered. After the hex digits are parsed into 32-bit integers, they are truncated to 8-bit integers by repeatedly dividing them by 16 until they fit, which is also why leading zeros are ignored.
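The behaviour described above can be sketched in Python. This is a paraphrase of the described logic, not the actual Netscape source; the function name and structure are my own:

```python
def netscape_color(s):
    """Sketch of the described behaviour (a paraphrase, not the Netscape code).

    Each character is parsed as a hex digit and shifted into a 32-bit
    accumulator with no overflow check, so only the last 8 hex digits of each
    component survive. The three components are then divided by 16 in lockstep
    until all of them fit in 8 bits, which is why leading zeros vanish.
    """
    HEX = '0123456789abcdef'
    s = s.lower()
    s += '0' * (-len(s) % 3)              # pad so the string splits evenly
    n = len(s) // 3
    parts = []
    for comp in (s[:n], s[n:2 * n], s[2 * n:]):
        value = 0
        for ch in comp:
            digit = HEX.index(ch) if ch in HEX else 0    # non-hex counts as 0
            value = ((value << 4) | digit) & 0xFFFFFFFF  # 32-bit wrap-around
        parts.append(value)
    while any(v > 0xFF for v in parts):
        parts = [v >> 4 for v in parts]   # divide all three by 16 together
    return '#%02x%02x%02x' % tuple(parts)
```

Running it on the question's strings gives `#c00000` (red) for `chucknorris` and `#c0c000` (yellow) for `chucknorr`.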
Most browsers will simply replace any non-hex character in your color string with a zero.
ChuCknorris translates to c00c0000000. Since 11 characters is not a multiple of 3, the browser pads the string with zeros and divides it into three equal sections, representing the red, green and blue values: c00c 0000 0000. The extra characters in each section are then ignored, which makes the final result #c00000, a dark red.
Note that this does not apply to CSS color parsing, which follows the CSS standard.
<p><font color='#c00000'>Same as above</font></p>
<p><span style="color: chucknorris">Black</span></p>
I'm sorry to disagree, but according to the rules for parsing a legacy color value posted by @Yuhong Bao, chucknorris DOES NOT equate to #CC0000, but rather to #C00000, a very similar but slightly different hue of red. I used the Firefox ColorZilla add-on to verify this.
The rules state:
make the string a length that is a multiple of 3 by adding 0s: chucknorris0
separate the string into 3 equal length strings: chuc knor ris0
truncate each string to 2 characters: ch kn ri
replace each non-hex character in place with a 0, keeping the valid hex digits: C0 00 00
I was able to use these rules to correctly interpret the following strings:
Applying the same rules to adamlevine might seem to give ADE0E0, but it actually yields AD0E0E: adamlevine00 splits into adam levi ne00 and truncates to ad le ne, and since each non-hex character is replaced in place with a 0 (rather than dropped), le and ne both become 0E, giving #AD0E0E.
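A minimal Python sketch of the four rules, with the non-hex substitution applied in place (the function name is mine, and the sketch skips the spec's extra handling of components longer than 8 characters):

```python
def parse_color(s):
    """Sketch of the four rules above (the helper name is mine).

    The crucial detail is the last step: each non-hex character is replaced
    in place with a 0 rather than being dropped, which is why adamlevine
    yields #AD0E0E and not #ADE0E0.
    """
    s += '0' * (-len(s) % 3)                       # 1. pad to a multiple of 3
    n = len(s) // 3
    parts = [s[:n], s[n:2 * n], s[2 * n:]]         # 2. split into 3 equal strings
    parts = [p[:2] for p in parts]                 # 3. truncate each to 2 chars
    parts = [''.join(c if c.lower() in '0123456789abcdef' else '0'
                     for c in p) for p in parts]   # 4. replace non-hex with 0
    return '#' + ''.join(parts).upper()
```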
It's a holdover from the Netscape days:
Missing digits are treated as 0[...]. An incorrect digit is simply interpreted as 0. For example the values #F0F0F0, F0F0F0, F0F0F, #FxFxFx and FxFxFx are all the same.
From this blog post, which covers it in great detail, including varying lengths of color values, etc.
If we apply the rules in turn from the blog post, we get the following:
Replace all invalid hexadecimal characters with 0s
chucknorris becomes c00c0000000
Pad the string out to the next length divisible by 3 (11 -> 12)
c00c 0000 0000
Split into three equal groups, with each component representing the corresponding colour component of an RGB colour:
RGB (c00c, 0000, 0000)
Truncate each of the arguments from the right down to 2 characters
Which gives the result
RGB (c0, 00, 00) = #C00000 or RGB(192, 0, 0)
Here's a JSFiddle demonstrating the bgcolor attribute in action, to produce this "amazing" colour swatch:
This also answers the other part of the question; why does bgcolor="chucknorr" produce a yellow colour? Well, if we apply the rules, the string is:
c00c00000 => c00 c00 000 => c0 c0 00 => RGB(192, 192, 0)
Which gives a light yellow-gold colour. As the string starts off as 9 characters, we keep the second c this time around, hence it ends up in the final colour value.
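The steps from the blog post can be sketched in Python (the function name is mine; this ignores the spec's handling of components longer than 8 characters):

```python
def legacy_color(s):
    """Sketch of the blog post's steps (the helper name is mine)."""
    HEX = set('0123456789abcdefABCDEF')
    s = ''.join(c if c in HEX else '0' for c in s)  # replace non-hex with 0
    s += '0' * (-len(s) % 3)                        # pad to a multiple of 3
    n = len(s) // 3
    r, g, b = s[:n], s[n:2 * n], s[2 * n:]          # split into R, G, B
    return '#' + r[:2] + g[:2] + b[:2]              # truncate each to 2 chars
```

Running it on the strings discussed here gives `#c00000` for `chucknorris` and `#c0c000` for `chucknorr`.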
I originally encountered this when someone pointed out you could do color="crap" and it, well, comes out brown.