Okay, if this is possible, please don't patent and/or steal the idea. I'm just curious whether it could work. The idea:
A lot of us who own Apple's excessively glossy displays often find ourselves in places where the lighting makes the product hard to use because of glare or reflections. With active noise cancellation, headphones can analyze ambient sound and produce waves opposite to the annoying sound, cancelling the noise altogether! Could something similar work for displays? The idea is that your iSight camera would analyze the light around you (you could turn it off, of course!), work out where that light would reflect on the screen, combine that with the image on your display, and produce an image that makes the glare invisible. I'm no physicist (in fact I'm only 16), so obviously this has no scientific backing. The way I imagine it: wherever light is glaring on the display (they could even build cameras into the display, à la Microsoft Surface), the screen would get brighter, and wherever it isn't, the screen would get dimmer. It would also need to run an operation (could the GPU do this?) to make the brightness look uniform across the whole screen.
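Here's a rough sketch of what I mean, in Python with numpy. Everything here is made up for illustration (there's no real API for getting a per-pixel glare map from the iSight camera, so `glare_map` is a hypothetical input), but the math is the idea:

```python
import numpy as np

def compensate_glare(frame, glare_map, max_boost=1.6):
    """Brighten the image where glare hits, then renormalize so the
    overall brightness still looks uniform.

    frame:     HxWx3 floats in [0, 1], the image the display wants to show
    glare_map: HxW floats in [0, 1], estimated glare intensity per pixel
               (hypothetical; would somehow come from the camera)
    """
    # Boost luminance in proportion to the estimated glare at each pixel.
    boost = 1.0 + (max_boost - 1.0) * glare_map
    compensated = frame * boost[..., None]

    # Renormalize so nothing clips and the result still looks uniform.
    peak = compensated.max()
    if peak > 1.0:
        compensated = compensated / peak
    return np.clip(compensated, 0.0, 1.0)
```

The renormalization step is the "operation to make the brightness seem uniform": boosting only the glare spots would look patchy, so everything gets rescaled together.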
This is the other idea I had, although I can't explain it well because I'm not a scientist: the display would analyze the light glaring on the screen and run an operation to get rid of the glare, combining the first method with blending in the opposite color (I think that's the term; I might mean wavelength or spectrum, sorry) to produce a true image without any glare.
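A sketch of this second idea, same caveats as before: light adds, so if what your eye sees is roughly the displayed light plus the reflected light, the display could pre-subtract the estimated reflection before showing the image:

```python
import numpy as np

def subtract_reflection(frame, reflection):
    """Pre-subtract the estimated reflected light so that
    (displayed + reflected) adds back up to the intended image.

    frame:      HxWx3 floats in [0, 1], the intended image
    reflection: HxWx3 floats in [0, 1], estimated reflected light per
                pixel (hypothetical camera-derived input)
    """
    target = frame - reflection
    # The physics catch: a display can't emit negative light. Dark pixels
    # under bright glare clip to zero and stay washed out, so this can
    # only ever partially cancel a reflection.
    return np.clip(target, 0.0, 1.0)
```

That clipping is why this wouldn't cancel glare as completely as noise-cancelling headphones cancel sound: a speaker can push air in either direction, but a screen can only add light, never remove it.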
This beats the "antiglare display" approach in that the colors would still have that "pop" glossy fans love, plus the lack of glare that some matte fans go for. For photographers who rely on matte displays, the matte option would still have to exist. Is this possible? Thoughts on the idea?
EDIT: Another random idea from the mind of Michael.