No doubt. And yet "seems logical" really doesn't trump decades of research results that contradict that logic. What you're failing to take into account is that the improved context is more than overshadowed by the increased difficulty of actually hitting the target.
You've got that backwards. Microsoft has been doing this *wrong* since Windows first came on the scene and has actually been going further off course since then: first by dynamically hiding individual items within menus, and more recently by overhauling the entire command mechanism and breaking decades' worth of users' motor memories.
There are three menuing systems in common use. The most effective and efficient is provably the one used by Mac OS. Coming in second are systems that use pervasive context menus. With some refinements that haven't actually made it into any shipping system, these can be as little as 10% slower than the screen-edge menu bar of the Mac; in practical use, they can end up as much as 75% slower. And yet they still eclipse the efficiency of window-hosted menu bars so significantly that it's laughable.
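For the curious, the "harder to hit" point above is the standard Fitts's law argument: a screen-edge target behaves as if it were enormously deep, because the cursor pins against the edge and overshoot costs nothing, while a window-hosted menu title is a small target you have to decelerate into and stop inside. Here's a minimal sketch of that comparison, assuming made-up pixel distances and illustrative a/b constants (real values would have to be measured per device and per user):

import math

def fitts_time(distance_px: float, width_px: float,
               a: float = 0.1, b: float = 0.15) -> float:
    """Shannon formulation of Fitts's law: MT = a + b * log2(D/W + 1).

    a and b are device- and user-dependent constants; the defaults
    here are illustrative placeholders, not measured data.
    """
    return a + b * math.log2(distance_px / width_px + 1)

# A window-hosted menu title is a small target (~20 px deep) you must
# stop inside. A screen-edge menu lets the cursor pin against the edge,
# so its effective depth is practically unbounded (modeled here as a
# few thousand pixels of free overshoot room).
travel = 600  # assumed pointer travel in pixels

window_menu = fitts_time(travel, width_px=20)
edge_menu = fitts_time(travel, width_px=2000)

print(f"window-hosted menu: {window_menu:.3f} s")
print(f"screen-edge menu:   {edge_menu:.3f} s")

Even with these toy numbers, the edge target's index of difficulty comes out at a small fraction of the in-window one, which is exactly the kind of gap the pointing studies keep measuring.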
So where has it been proven? I use both Macs and PCs extensively, and as a user who is accustomed to both systems, I don't see an advantage one way or the other. I do get slowed down sometimes on the Mac with multiple monitors, especially when the menu bar isn't active for the app I want, so I need to bring that app to the front first. But that is rare and minor.
If there really have been studies showing that an edge menu is better than a window menu (which is quite believable by itself), was the entire context of the operating system and monitor configuration tested as well? Did the testing consider lots of applications open across several monitors, or only a single app on a single monitor?
Just asking for facts, because as I said, I really don't think there is any difference beyond users' personal preference.