Originally posted by Tokyo
I've known it as "chip" too, but "computer" seems widely used. Essentially, the RISC chip was originally designed because instruction sets (the list of instructions that the CPU understands and can execute) were getting too long and complex. Chip designers realized that many of the commands were so complex and arcane that they were rarely used by programmers, and that the computer could execute essentially the same process with a string of simpler commands.
As an oversimplified example, instead of having both "add" and "multiply" commands, you could get away with only the "add" command; whenever you need to multiply, just add several times instead. The multiple-add process takes longer than a single multiply, but if you hardly ever multiply things anyway, you don't lose much. And because you have reduced your instruction set, your chip has fewer transistors, is smaller and more efficient, and runs faster in normal operation.
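To make that concrete, here is a tiny Python sketch of the idea (purely illustrative; real RISC chips do still have a multiply instruction):

def multiply(a, b):
    # Emulate "multiply" using nothing but repeated "add", the RISC idea in miniature.
    result = 0
    for _ in range(b):
        result += a
    return result

print(multiply(6, 7))  # prints 42, just more slowly than a real multiply instruction would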
Keep in mind, however, that CISC and RISC refer as much to architecture as they do to instruction sets. Die size, pipeline length, cache layout, and so on are now part of the CISC/RISC difference, and the two kinds of instruction sets are becoming less and less different from each other, as Catfish_Man noted.
Macs moved to RISC in 1994 with the first Power Macintosh models, which necessitated a change of software, just like today's switch to OS X, though not quite as radical. The new RISC Macs ran "PowerPC" software, and the old software was "68K." A 68K emulator was built in so that old software could still run, and for a while software was written in "fat" versions so both the old CISC and new RISC Macs could run it.
Windows machines are primarily CISC; the x86 chips from Intel and AMD use a CISC instruction set, although their modern designs decode those instructions into simpler, RISC-like operations internally.
"ZISC" refers to neural network technology, a "learning computer" that analyzes data and builds recognition of patterns. They are trained, not programmed, and therefore do not have the instruction sets RISC or CISC machines do.
DaveGee's link is an excellent one; much more can be found with a simple Google search. You probably should not expect this to work its way into your home computer for a while, and when it does, it will probably be auxiliary technology rather than the CPU, until it is developed quite a bit further.
Tokyo