Jan 5, 2012, 02:53 PM
This is my new app: Colorblind Camera. It is a universal app, but it obviously only runs on the iPad 2 because the iPad 1 lacks a camera.
It lets you view the world through the eyes of a person with colorblindness. It would be a great app for graphic and web designers who need to make sure their products work well for everyone.
Here is a link to it in the store. (http://itunes.apple.com/us/app/colorblind-camera/id491236183?mt=8&ls=1)
A couple of screens of the iPad version.
This is my first app and it was a lot of fun to make. I already have a couple more planned, so any comments would be appreciated. Always looking for feedback.
Jan 5, 2012, 03:39 PM
Is this just a channel thing? The whole image just appears off, not a color-mixing issue... at least that I notice.
I am colorblind. It's interesting to try to see it work, but I'm hesitant to download it.
And being a designer, people always have to ask: what's this color... and that one? How do you pick out your colors? I do it by doing the math on the color breakdown.
Jan 5, 2012, 04:41 PM
It is a pixel shader that is run on the input from the camera. The actual math is pretty simple, if you think about it geometrically.
Here is how it works. A person with normal vision sees red as full red, blue as full blue, and green as full green. If you break these down as RGB values, you get the following (obviously).
Red = (255, 0, 0)
Green = (0, 255, 0)
Blue = (0, 0, 255)
Now let's take tritanopia, which is blue-yellow colorblindness. Someone with this condition doesn't see red as FULL red; they see it a bit differently. Based on several sources on the net, these are what red, green, and blue look like to someone with tritanopia.
Red = (255, 28, 0)
Green = (115, 237, 255)
Blue = (0, 127, 133)
Now, here is where the math part comes in. If you think of the color spectrum as a 3D geometric space, where each primary color is an axis in that space, any color can be described as a point along those axes. The 3D color space of a colorblind person is different, with the axes pointing in different directions, at different orientations, etc., but the same rules apply. Any color they see will just be a point in their 3D color space.
Knowing this, we just need to transform the normal color space into the color space of a colorblind person. Luckily, linear algebra makes this pretty easy (it is just a single matrix/vector multiplication). We then apply that same transform to every pixel from the camera, and you get your output. :-)
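To make the matrix idea concrete, here is a minimal CPU sketch of that transform in Python/NumPy (the app itself runs this as a pixel shader on the GPU, which isn't shown here; the function name `simulate` and the normalization are my own, and the matrix values are just the tritanopia primaries from the table above used as columns):

```python
import numpy as np

# Each COLUMN is what a person with tritanopia sees for pure red,
# green, and blue (the RGB triples from the post), scaled to 0..1.
TRITANOPIA = np.array([
    [255, 115,   0],
    [ 28, 237, 127],
    [  0, 255, 133],
], dtype=np.float64) / 255.0

def simulate(rgb):
    """Map one normal-vision RGB color (0-255) to its tritanopia-simulated RGB."""
    out = TRITANOPIA @ np.asarray(rgb, dtype=np.float64)
    return tuple(int(round(c)) for c in np.clip(out, 0, 255))

# Pure red comes out as the tritanopia-red from the table:
print(simulate((255, 0, 0)))   # -> (255, 28, 0)
```

Building the matrix with the simulated primaries as columns means multiplying it by (255, 0, 0) picks out the first column exactly, by (0, 255, 0) the second, and so on, which is the geometric picture above: the matrix just re-aims the three axes. In the shader, the same thing is one `mat3 * vec3` per pixel.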
Does that make sense?
Jan 5, 2012, 05:23 PM
OK, I need to read through it some more and look at the actual number values (being a print designer I work in CMYK; not RGB too often).
For me, though I'm sure others are different, I see most colors normally but get confused (for instance) about whether something, like a shirt, is red or green, while the rest of the stuff looks "normal". Maybe I will look into the math you presented some more. Maybe I'll ask my brother to look at the app too; he's also colorblind.
Hard to explain right now, gotta run and make dinner too.