Colour balance doesn't have an exact answer. The colours we see in everyday objects depend not only on the physical characteristics of the object but also on the spectrum of the light illuminating it and on how our brains interpret the light it reflects. Take a "white" object outside on a cloudy day and it will look bluer than on a sunny day; take it inside under incandescent lights and it will look yellower. But our brains do a lot of correcting for this: the difference we perceive looking at the same object at different times is much smaller than the difference between unmodified photographs taken in each situation and compared side by side.

So how should we colour balance photographs from such different situations? Should they all be identical, corrected to some theoretical ideal of what the colour "should" be? Should they accurately reflect the light that was actually bouncing around when the photograph was taken? Or something else? And let's not forget that we're taking light that existed on a full spectrum and reducing it to combinations of just three colour channels, then displaying it on a device that cannot actually reproduce all the colours our eyes can perceive.

Ultimately, because of the inexact nature of human perception, photography is an art, not a science. There's no "right" answer, only what we perceive as better, and that can change depending on the context.
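To make the "no exact answer" point concrete, here is a sketch of one common, and admittedly crude, automatic answer: the grey-world assumption, which guesses that the average colour of a scene is neutral grey and scales each channel to match. The function name and the sample pixel values are my own illustrations, not taken from any particular camera or library.

```python
# Grey-world white balance: assume the scene averages to neutral grey,
# then scale each channel so its mean matches the overall mean.
# A deliberately crude illustration; real cameras use far richer models.

def grey_world_balance(pixels):
    """pixels: list of (r, g, b) tuples with float values in [0, 1]."""
    n = len(pixels)
    # Per-channel means of the image.
    means = [sum(p[c] for p in pixels) / n for c in range(3)]
    # Target: all three channel means equal (i.e. the average is grey).
    grey = sum(means) / 3
    gains = [grey / m if m > 0 else 1.0 for m in means]
    # Apply the gains, clipping to keep values in [0, 1].
    return [tuple(min(1.0, p[c] * gains[c]) for c in range(3))
            for p in pixels]

# A hypothetical scene shot under warm light: blue channel suppressed.
warm = [(0.9, 0.8, 0.4), (0.6, 0.5, 0.2), (0.3, 0.3, 0.1)]
balanced = grey_world_balance(warm)
```

Note that this embodies exactly one of the philosophies above: it corrects towards a theoretical ideal rather than preserving the light that was actually present, and it fails badly on scenes that genuinely aren't grey on average (a sunset, a green forest).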