You do understand that vectors don't exist as vectors once they're rendered, right?
You don't say... And here I thought our displays weren't pixel-based...
They become bitmaps and exist in texture memory?
Nope, they don't. They become bitmaps in framebuffer memory, not texture memory. Why render your vectors to textures before pushing them out to the framebuffer? We're not doing 3D graphics here, where a flat surface needs to be mapped onto a non-flat object on screen. We're rendering flat images to the screen. Just respect the Z-buffer and alpha blending and render straight to the framebuffer.
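To make the "alpha blending straight to the framebuffer" point concrete, here's a minimal sketch (my own toy code, not any particular API) of the classic per-channel "source over destination" blend you'd apply to each pixel as you write it:

```c
#include <stdint.h>

/* Toy per-channel alpha blend: result = src*a + dst*(1-a),
   with alpha as 8-bit fixed point in [0, 255]. This is the
   whole trick behind compositing a flat layer onto the
   framebuffer without any texture round-trip. */
static uint8_t blend_channel(uint8_t src, uint8_t dst, uint8_t alpha)
{
    return (uint8_t)((src * alpha + dst * (255 - alpha)) / 255);
}
```

Call it once per color channel per pixel as you rasterize; alpha 255 replaces the destination outright, alpha 0 leaves it untouched.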
At some point you have to convert the mathematical construct into pixels. Doing that in real time versus loading a ready-made bitmap is where the performance issues are.
Especially when you have very complex renderings with lots of layers. Just look at Adobe Flash: basic vectors, no problem; complex illustrations, lots of problems.
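The layers argument can be put in back-of-envelope numbers (a toy cost model of mine, not anything measured from Flash): re-rasterizing vectors touches every pixel of every layer each frame, while a cached bitmap is a single blit of the final pixels.

```c
/* Toy model: pixels of work per frame.
   Re-rasterizing L vector layers shades L * W * H pixels;
   blitting one cached bitmap copies W * H pixels, regardless
   of how many layers went into it. */
static long vector_cost(long layers, long width, long height)
{
    return layers * width * height;
}

static long blit_cost(long width, long height)
{
    return width * height;
}
```

With one layer the two are even; with fifty overlapping layers the vector path does fifty times the per-pixel work, which is roughly why "complex illustrations, lots of problems".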
Yawn... again, we've been doing this stuff on far weaker computers for quite a long time. You're making it seem harder on the CPU than it is.
The reason why Sierra chose to do their games that way (at least through King's Quest 4) is a lack of storage space. In King's Quest 5, they switched to EGA-based bitmaps/sprites.
Nope, all the EGA-era games used the PIC format; the bitmaps came with the VGA days. And the sprites were never vector-based to begin with, even in AGI, as far as I recall. The point is: if a 286 running at 8 MHz could render vector graphics, I very much doubt a quad-core Core i7 aided by a bunch of GPU cores running in parallel can't do it.
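For anyone who hasn't looked inside one: from what I recall, an AGI PIC resource is basically a list of draw commands (pen moves, lines, flood fills) replayed into a low-res buffer (160 pixels wide, if memory serves). The heavy lifting is a handful of integer primitives, e.g. Bresenham's line, which is exactly the kind of thing an 8 MHz 286 chews through fine. A sketch of that primitive (my code, not Sierra's):

```c
#include <stdlib.h>

#define PIC_W 160
#define PIC_H 168   /* AGI's visual buffer height, as best I remember */

static unsigned char pic[PIC_W * PIC_H];

/* Classic all-quadrant integer Bresenham line: the sort of
   primitive a PIC interpreter replays from stored draw
   commands. No floats, no divides in the loop. */
static void draw_line(int x0, int y0, int x1, int y1, unsigned char color)
{
    int dx = abs(x1 - x0), sx = x0 < x1 ? 1 : -1;
    int dy = -abs(y1 - y0), sy = y0 < y1 ? 1 : -1;
    int err = dx + dy;
    for (;;) {
        pic[y0 * PIC_W + x0] = color;
        if (x0 == x1 && y0 == y1)
            break;
        int e2 = 2 * err;
        if (e2 >= dy) { err += dy; x0 += sx; }
        if (e2 <= dx) { err += dx; y0 += sy; }
    }
}
```

Replay a few hundred of these per picture and you've reconstructed a full-screen background from a few kilobytes of commands, which was the whole storage win.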
Have you ever heard your CPU fans go into overdrive when you open a Flash page? There are reasons for that.
Nope, only when I let Flash decode H.264. For vector-based games, CPU usage and temperature aren't high at all. Remember, Flash vector-graphics games have been around since the Pentium days. The pre-MMX Pentium days.