
michaelltd

macrumors regular
Mar 30, 2005
142
0
So, the PS3/360 is the only console that can deliver the experience I want. What I really want is a JRPG that is hand drawn/cel shaded and in 1080p natively. Nowhere in sight it seems. Sigh.

Eternal Sonata and Tales of Vesperia for 360.

Interestingly, it seems that thus far the 360, of all consoles, has the most JRPGs.
 

darkwing

macrumors 65816
Original poster
Jan 6, 2004
1,210
0
michaelltd and BOSS10L:

Yes I know, and I mentioned this in a previous post. Namco is my favorite RPG maker (mostly for Tales) and they're doing everything on the 360, which makes me wonder if they're upset with Sony for some reason. The 360 has sold about 560k units in Japan and the PS3 is over 2 million, so it makes no sense that next gen RPGs would be on the 360. I even heard that Vesperia may come out in America first as the install base is greater for the 360.

I'm going to pick up both games (as well as Beautiful Katamari since I'm a Katamari whore) and keep them shrink wrapped. I know Eternal Sonata is coming out on PS3 in Japan, but there is no mention of that port coming to the US so I may have to play the 360 version. No word on Vesperia coming to the PS3 either. So, I can find a used cheap 360 someday down the road to play those games if they don't come out on my console of choice. It is my fondest hope that they release future games for the PS3 first. Their Japanese website still lists "new RPG" for a ps3 but this has been there for months. I always thought that would be Vesperia, but I guess not now!

As for JackAxe:

0:

It was the trade-in conspiracy I tell you!

1:

Chili Con Carnage is awesome. It stereotypes a little, but don't we all?

I don't know what WiiSP is.

2:

I always knew the Wii could do more than the GC, but the architecture is still the same. You can overclock things and make the busses wider, but your top-end is limited. Intel figured this out when they went to the Pentium M and finally the Core architecture. The industry needed a change. The 1000x 386 just wasn't cutting it anymore.

Apple tried to also push this route with the G5 -- they figured that if people could write very streamlined code, the deep pipelines would be very efficient and really maximize performance. Of course, Intel went a slightly different route by tackling a lot of these issues in hardware with their new architecture, and after years of Apple telling us PPC was better than Intel, they switched. (reality bubble burst.)

Sony is banking on the fact that many complex physics, AI, and even graphical manipulation routines are highly subject to being parallelized, therefore they're hoping that as the industry moves in this direction the PS3 will be the system of choice. Things of this nature are essential for maximum throughput. Of course, this also led to the whole "wah wah the ps3 is too hard to develop for" and is why Sony is pushing middleware out like there's no tomorrow. In fact, recently the Crysis guys said they're porting their engine over to PS3 and the 360 but not the Wii. Of course, they say it's because the Wii doesn't see a lot of developers using 3rd party engines on it. He also mentions the limitations of the Wii as well. Hrm.

http://www.ripten.com/2008/04/07/ps3360-cry-engine-2-to-look-near-high-settings-on-pc/

This sorta goes against what you're saying Factor 5 is claiming. Just because they say they can port their engine to the Wii doesn't mean they have. Show me the money! :p

While it is true that you can scale a game back and get it to run on something slower, there are some games you simply cannot do that with and still have the same gameplay experience. Graphics/resolution isn't the only thing that goes into games.

Besides, arguing that the Wii is "underpowered" or not as powerful as the 360/PS3 is pointless since we all already know that apparently.

Where do you see that the Gamecube's GPU is smaller than the Wii's GPU? Just because the chip may be physically bigger doesn't mean the die is. Besides, I am using the term "overclocked" to mean faster busses, wider paths, more cache, etc. Same basic architecture but just more capacity. It's like going from a Pentium to a Pentium 2 or 3. It's just more cache, another new instruction set for certain types of calculations, and faster clocks. Big deal.

Microsoft did something interesting as well in this area. They also made a new architecture for their console, but made the development feel like a familiar PC. Pretty good move, imho. The stuff that they did with their memory controller to help with the limitations of UMA was quite clever, and while I still feel that UMA limits the top-end capabilities of the system, the cost savings is rather nice. Plus the devs get flexibility on how much memory to assign to graphics and how much to the CPU.

Why would you need a 128-bit memory interface anyway? (memory interface sounds like addressing to me.) In reality, the Wii should be extremely happy with a 32 bit architecture (and probably 28 address lines) because of how much memory it has. Same goes for the PS3/360. I always laugh at idiots who think 64-bit computing is somehow better. If you're working with a 10 gig scan in photoshop or doing a lot of complex calculations with 64-bit numbers (64-bit data bus) then sure, but if you're editing your 30k resume then certainly not. When you have to store addresses as 64 bit, you are effectively cutting your cache in half. Not a smart idea!
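
Just to make the pointer-size thing concrete -- generic C here, nothing console-specific, and the 32 KB below is just a typical L1 data cache size, not any particular chip:

#include <stdio.h>

/* A pointer-heavy node: on a 32-bit build each pointer is 4 bytes,
   on a 64-bit build each pointer is 8 bytes, so the struct roughly
   doubles (12 bytes vs 24 bytes after padding). */
struct node {
    struct node *next;
    struct node *prev;
    int          key;
};

int main(void)
{
    const size_t l1_bytes = 32 * 1024;   /* typical L1 data cache, purely illustrative */
    printf("sizeof(node) = %zu bytes\n", sizeof(struct node));
    printf("nodes that fit in 32K: %zu\n", l1_bytes / sizeof(struct node));
    return 0;
}

Compile that 32-bit and then 64-bit and the second number roughly halves, which is all I mean by "cutting your cache in half" for pointer-heavy data.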

As far as I can tell, the Wii has the same GPU as the cube except it does more of the same thing. I think video cards are due for an architecture redesign because you're seeing things like dual and quad SLI cards coming out so people can push more of the same thing out.

3:

Agreed on the Wind Waker graphics. Zelda games haven't really felt "fun" to me since the SNES game. It's all the same thing. Wander around, pick up items, back-track, and 0 story. Wind Waker actually provided a bit of a story, as does TP. There's never any character development in Zelda games, though, and that's what I really like to see in games. I like JRPGs for that reason. Though they tend to go on and on about their feelings too much. Wind Waker gave me a break from all that because I had something else to look at. Plus, I really enjoy exploration based games (where TP does not at all shine because it's so restrictive in where you can go) and so being out on the boat and sailing to different little islands was really quite awesome.

Brawl feels exactly like the GC game, and I never did play the N64 one. I've never actually played an N64. It's a fun game just for all the things you can obtain/unlock.

4:

I might be open to selling the O2 and all its accessories. PM me. ;)

I also didn't play Doom on the SNES, but I was quite impressed they managed to get it to run on the system. Mode7 was apparently quite powerful for its day.

5:

Pick up the GC version of Paper Mario and prepare to vomit. I agree the Wii version looks fun, but so did the cube version... I am hoping to find it for $6 like I said someday. :p I really have never enjoyed Mario in 3D. I just like the traditional platforming aspect of it.

I find my TV does a nice job on SD signals and dvds look great. The only problem is that digital cable where I live compresses the heck out of SD and you can sometimes see artifacts.

The difference between 720p and 480p is quite noticeable. I'm not unhappy playing a 720p game except that Sony declared that "true next gen" was 1080p and have yet to deliver most games in that format. The devs are doing 720p and porting the rubbish to the PS3. I'm rather sick of FPS games, too. It seems that a good game coming out these days for any system is a shining beacon of hope in an otherwise dismal market. Luckily I have a ton of JRPGs still shrink-wrapped from the PS2 days. ;)

Lately I'm playing Okami for Wii (which I really like... can't believe I missed it for the PS2!) and still working on Ar Tonelico. Okami's controls are actually nice. You only have to point when the action is frozen, so it works well.

I agree on the devs having to learn all the tricks. I'm hoping that by 2010 all games for PS3 will be in 1080p and running well.

7:

I think marrying a girl who is into video games is going to be awesome. Plus she's going into med school, which I'll be paying for, so she can buy me a Corvette when she becomes a rich doctor. Unless socialized medicine takes over and she can't make more than $50k a year. (Political injection.)

I am looking to possibly upgrade my MBP that I got in November 2006 when the next models come out. I may also get a PC laptop and run Linux. I'm growing quite tired of Apple's inability to fix their notebook wakeup problems. Seems to be related to using external monitors and trying to switch to the internal.

8:

There was no reason at all for Nintendo to not make TP on the GC widescreen except that they are total jerks.

Don't you mean your cube was bundled with Melee? ;)

I had a few intense moments in Resistance Fall of Man. The last level was probably the craziest thing I've ever been through in an FPS. Endless waves of air and ground based enemies attacking you and your fellow soldiers while you're trying to take out a forcefield generator thingie and pieces of the roof are falling everywhere. I was quite impressed. F.E.A.R gave me a run for my money in several places, too. Sometimes you have to sit up, grit your teeth, and sweat a little... but I don't want to do that all the time. :p

9:

It looks like the link you gave me requires a PC to run, and I don't honestly have one. I have a headless windows box at work that I use for running coLinux because they won't let us use Linux on machines here. I'd like to get VMWare up and running on it one of these days instead.

10:

Don't let him stew you. You just walk south and exit the pot. Pretty simple really. Fatty can't move more than a few feet without wheezing.

11:

I think the reason I like HD gaming so much is because I gave up on PC gaming a long time ago. I don't want to have to upgrade all the time to play the latest games like my girlfriend's brothers are always doing. With a console I just play and everything works. After years of PS1/2 on a big screen TV, though, I want something nicer to look at.

I don't really care about a game being replayable or not. If I don't feel it's worth $60, I wait until it's cheaper. A lot of games are going for the "quality over quantity" model but I think that's just an excuse to be lazy. Heavenly Sword was hailed as being a great game but short, but games like that don't interest me, so maybe I'll pick it up when it goes greatest hits someday.

My threshold seems to be about 720p. Anything lower looks like garbage for the most part.

I read a forum on video games sometimes to see news about consoles, and someone said when he looks at a Wii he sees "bright colors and muddy textures" and I agree. I think that's all a Wii owner has to look forward to this generation. The other systems are capable of much more than that graphically as well as in gameplay. So, I feel the Wii simply isn't future-proof; or modern-proof for that matter. That isn't to say it can't still look good, but as devs figure out more with the PS3/360 it'll look further and further dated.

I used to think that the Wii would be surpassed in sales eventually, but now I'm not so sure. I will unfortunately wind up going with the system that gives me the most JRPGs so I guess that's where I'm stuck at. :p

But yes, I'm waiting for many games to be $20 instead of $50-60. Like Paper Mario, Mario Galaxy, etc... ;)
 

atszyman

macrumors 68020
Sep 16, 2003
2,437
16
The Dallas 'burbs
Why would you need a 128-bit memory interface anyway? (memory interface sounds like addressing to me.) In reality, the Wii should be extremely happy with a 32 bit architecture (and probably 28 address lines) because of how much memory it has. Same goes for the PS3/360. I always laugh at idiots who think 64-bit computing is somehow better. If you're working with a 10 gig scan in photoshop or doing a lot of complex calculations with 64-bit numbers (64-bit data bus) then sure, but if you're editing your 30k resume then certainly not. When you have to store addresses as 64 bit, you are effectively cutting your cache in half. Not a smart idea!

When they talk about a 128-bit memory interface they are talking about how wide the data path is, not addressing. So assuming that the memory bus is running at least the same clock speed, if not faster, they get at least double the memory bandwidth, and are able to fetch two 64-bit numbers or four 32-bit numbers in one memory fetch rather than having to do two (or four).
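
Back-of-the-envelope version, with a completely made-up clock speed just to show the scaling (none of these numbers are meant to be the Wii's actual specs):

#include <stdio.h>

int main(void)
{
    const double clock_mhz = 100.0;          /* hypothetical memory clock, millions of transfers/sec */
    const int widths_bits[] = { 32, 64, 128 };

    for (int i = 0; i < 3; i++) {
        double bytes_per_transfer = widths_bits[i] / 8.0;
        double peak_mb_per_s = bytes_per_transfer * clock_mhz;   /* rough peak, MB/s */
        printf("%3d-bit bus: %.0f MB/s peak\n", widths_bits[i], peak_mb_per_s);
    }
    return 0;
}

Same clock, double the width, double the peak number. Whether the GPU can actually keep the bus that busy is a separate question.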

I highly doubt that they even use 32 bits for addressing anywhere on the Wii, since that would give them at least 4 GB of addressable memory (assuming a byte is the lowest addressable quantity; they could address in larger units to reach more memory), and IIRC there isn't 4 GB of memory anywhere on the Wii (unless you count the SD card slot, but there aren't 32 pins on SD cards, so there are other ways to handle larger addresses).
 

darkwing

macrumors 65816
Original poster
Jan 6, 2004
1,210
0
When they talk about a 128-bit memory interface they are talking about how wide the data path is, not addressing. So assuming that the memory bus is running at least the same clock speed, if not faster, they get at least double the memory bandwidth, and are able to fetch two 64-bit numbers or four 32-bit numbers in one memory fetch rather than having to do two (or four).

I highly doubt that they even use 32 bits for addressing anywhere on the Wii, since that would give them at least 4 GB of addressable memory (assuming a byte is the lowest addressable quantity; they could address in larger units to reach more memory), and IIRC there isn't 4 GB of memory anywhere on the Wii (unless you count the SD card slot, but there aren't 32 pins on SD cards, so there are other ways to handle larger addresses).

Atszyman,

I've just never heard the term "memory interface" used like that. I've always heard databus or data-width. BTW, it's very common for architectures to have 32-bit addressing but fewer address lines. The NEC VR4181, a system-on-a-chip based on a MIPS R4000 core, had 21 address lines but 32-bit addresses. Why? Because it was a 32-bit processor and having 21-bit registers for addresses would be silly. :) They actually use some of the upper bits to help determine the behavior of the MMU. IIRC, they used the upper 2 bits as follows:

00 - Physical addresses, not cached.
01 - Userspace virtual addresses.
10 - Physical addresses, cached.
11 - Virtual addresses, cached.

That's going off memory from work I haven't touched in 4 years, so don't get mad at me if I'm wrong. But yes, it had only 21 address lines. The max RAM it supported, IIRC, was 8 megawords (2 banks of 4x16). We used an 8 MB chip with 4x16 addressing.
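
In code, the decode really is just a look at the top two bits. Toy example only, and the same caveat applies: the mapping below is my recollection of that table, so don't treat it as gospel for the actual part:

#include <stdint.h>
#include <stdio.h>

/* Classify a 32-bit virtual address by its upper 2 bits, per the
   (possibly mis-remembered) table above. */
static const char *classify(uint32_t vaddr)
{
    switch (vaddr >> 30) {
    case 0:  return "physical, not cached";
    case 1:  return "userspace virtual";
    case 2:  return "physical, cached";
    default: return "virtual, cached";
    }
}

int main(void)
{
    uint32_t samples[] = { 0x00400000u, 0x40001000u, 0x80200000u, 0xC0000000u };
    for (int i = 0; i < 4; i++)
        printf("0x%08X -> %s\n", (unsigned)samples[i], classify(samples[i]));
    return 0;
}

The nice part of a scheme like that is the behavior comes straight from the address bits; nothing extra to set up per access.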

SD cards are serial and even support SPI, which is something that any embedded systems engineer should be familiar with. That or I2C. You wouldn't want to address data that way on a CPU's memory bus. :)
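
For anyone who hasn't done it, bit-banged SPI is about as simple as serial gets. Rough sketch with hypothetical pin macros (mode 0, MSB first) -- fill in whatever GPIO accessors your part actually has:

#include <stdint.h>
#include <stdio.h>

/* Hypothetical GPIO hooks -- stand-ins for real pin twiddling. */
#define SCK_HIGH()   ((void)0)   /* drive the clock pin high */
#define SCK_LOW()    ((void)0)   /* drive the clock pin low  */
#define MOSI_SET(b)  ((void)(b)) /* drive the data-out pin to (b) */
#define MISO_READ()  (0)         /* sample the data-in pin */

/* Clock one byte out and one byte in, MSB first, SPI mode 0. */
static uint8_t spi_xfer(uint8_t out)
{
    uint8_t in = 0;
    for (int i = 7; i >= 0; i--) {
        MOSI_SET((out >> i) & 1);   /* present the next bit while the clock is low */
        SCK_HIGH();                 /* rising edge: both sides sample */
        in = (uint8_t)((in << 1) | (MISO_READ() & 1));
        SCK_LOW();
    }
    return in;
}

int main(void)
{
    printf("got back 0x%02X\n", spi_xfer(0xA5));   /* arbitrary test byte */
    return 0;
}

The SD spec layers a command/response protocol on top of that, but the wire-level part really is just shifting bits.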

I also can't find anything that says the Wii is 128-bit except for some people claiming it on forum posts. I think Nintendo is more closed about their hardware, though. I recall someone mentioning there is something on the Wii's motherboard that nobody knows the purpose of. Probably a charge pump to blow the GPU if you ever pirate their games. :p
 

JackAxe

macrumors 68000
Jul 6, 2004
1,535
0
In a cup of orange juice.
Ok, I'm going to put the Wii-tech thing in another post, just to cut this post in half. :)

0:

I saw that the second Symphonia is headed for the Wii. They just had to re-use the Cube engine, but it's still cool. Visually I like its style.

1:
Never, I never do that, I'm an upstanding citizen. :eek:

WiiSP === PSP to Wii port. AKA EVIL!


2.
---- >>

* See following post titled, Wii Nerd Speak, seeing past Nintendo's ambiguity.*

---- >>

3.

Yeah, some Zelda games were really nothing special, some were just wacko! I really enjoyed the Minish Cap, though it was too easy. If you haven't tried Phantom Hourglass, it's worth trying. The story wasn't great, but overall it's easily one of my favorite Zeldas. It has some graphic adventure elements, which actually made me laugh out loud. I really like Ocarina of Time, because it had extras, like getting the better Goron sword -- I miss that they didn't offer this kind of option in TP.

I loved the boat sailing, etc. It was so open, and yet there was still a clear path in the game, which kept it going. I still need to finish TP BTW.

I feel the same about Brawl. It's why I gave my bundled copy away. I'll eventually buy the Wii game just to have it around, it's a good way to kill some time.

4:

OK. I'm actually seriously interested in your O2, because I want to tinker, but it would be after I buy a few things that are important right now. :)

Mode 7 always seemed like cheese to me, but then I played Mario Kart SNES. AHHHhhhhh ~i~

I just read that Doom SNES used the Super FX "2" chip, so it was apparently more uber and they were able to create a port on the SNES that was truer to the original than even the more powerful consoles at the time.

5:

I'll pass. If it's like Mario and Luigi Partners in Time, I'd never finish it. *hides, since some love that game.* I haven't finished the Wii version yet, but I would recommend it to anyone. The gigantic 8-bit character mode just rocks!

http://image.com.com/gamespot/images/2007/098/reviews/933012_20070409_screen003.jpg

This game looks great on a HD set by the way, even the crappy ones.

Digital Cable sucks. I was on Cox for 8 years and the TV quality dropped to vomit almost over night. This was in an area that was all fiber. And for a period of time, it was the worst broadcast I had seen in my life, if I don't count bad reception.

Now I'm on Dish Network, errr, it's better, but it still sucks sometimes. Some networks don't fork out the dough for some of their content, so it gets overly compressed. I guess if I were a bigger sports nut, I would have bought a HD set already, since those broadcasts are always pristine and get the higher priorities, if not the highest.
--

Some content is noticeably better at 720p, but for me it depends. As an arteEest I'm really critical about visuals, which is why I generally say the difference is marginal. My LCD TV, as an example, doesn't really look any better at 720p than it does at 480p -- I've played a 360 on it -- EVIL. I still saw jaggedness.

For games that have more complicated menus, like WoW, or RTS games, I do prefer a higher res screen. For games derived on consoles, I'm happy either way, I just don't want the obvious crap that some developers have insulted Wii owners with.

--

I drove to a few different stores when Okami was supposedly out on 15th and no one had it in yet. :( I really freaking want this game and to toot my horn, I'm really good at drawing with the Wiimote, I will own ALL. :O

Just come back to PC gaming, we're already at 1600p. 1080p, that's so PC yesteryear. :p But of course 480p is so PC of the stone-age, when dirt ruled the world. :eek: OK, besides a few games like Warcraft, PC gaming seems to be kind of in a drought now for good games, at least for games I'm interested in playing.

On this 1080p thing and performance. My friend's 360 finally died -- he was the last of the peeps I knew that bought one. Anyways, I asked him about getting his replacement and he said he wasn't going to do so, and instead he's going to wait for the PS3 MGS bundle. When I asked why, he said it was because of frame-rate issues when running at 1080p. I guess the 360 still needs some refinement.

6 - 7:

Make sure it's not an act. I thought my wife was into games, only to find out after we got married that she really didn't like them; just some, and it was more because me and my friends were into them. It's all a sham I tell you. No girl really likes games. :p ¬¬ *hides*

I never had a problem with my TI-Book(R.I.P.) from 2002, going in and out of sleep, until it decided just to sleep forever. Could it be something with the Asus boards? I've heard various issues with them and recall how bad the iBooks were when Apple first used them. I know they've been good PC boards, but still.

Why don't you run Linux on your MPro? OO

8:

There was a good reason for releasing Zelda on the Wii first, and that was to get people to buy one. :D It was a jerk move, but just like Sony upgrading the Memory Stick to the MS Pro and stating that none of their previous devices were compatible with it, just to force everyone to upgrade. This to me seemed like BS, considering that each MS has its own I/O. For PDAs, they just needed to release a firmware update; they did so for much older VAIOs at the time.

I guess, I don't even recall what I wrote. :) The Black cube was the best BTW. :eek:

I hear you. It's a memorable experience that's good to revisit every once in a while. =)

9:

But, but, but, you can run Windows on your Mac, but then again, that's nasty. But, but, but, System Shock is worth it. OO The game was originally DOS and Mac, but I can't find the Mac version anywhere, but there's always DOSBOX if you like to tinker.

10:

The first time I left south from the pot, but then I went back in a few more times trying to get the stein, I wanted to see what would happen if he stewed me. :)

11:

I hear you and would probably be in the same boat if I had stopped gaming on comps. I'm actually more into all of the post processing effects they can do nowadays, which are really giving games that extra visual impact that higher-rez games didn't offer a few years back.

DVDs are 480, they don't look like garbage. =P It's just your TV, buy a new one and send me your current one. :D

Bright colors and muddy textures describes a "PS2." Considering the Wii has been degraded to the PSP/PS2 development teams by some of these western-retard-publishers, I can see why someone would make that assessment. I'm going to avoid these games. I've already played games that look way better than that description.

When you say future-proof, I can agree on a technical level, but that really doesn't seem to matter, since as of now the Wii has enough momentum in sales to buy its future-proofing. Speaking of dated, the PS2 is still doing well. :)

If MS keeps on buying its JRPGs, then it will probably get the most, but I'm pretty sure you're safe with your current WiiS3 setup, since the 360 is a dud in Japan.

Paper Mario Wii and Galaxy are both worth their cost though IMO, you just need to play them for long bouts of time; yes, I know that's an oxymoron. :eek:

<]=)
 

JackAxe

macrumors 68000
Jul 6, 2004
1,535
0
In a cup of orange juice.
Wii Nerd Speak, seeing past Nintendo's ambiguity

Ramblings continued, this takes me too much time, I'm slow. :(

I used quotes again, since this is a new post.

I always knew the Wii could do more than the GC, but the architecture is still the same. You can overclock things and make the busses wider, but your top-end is limited. Intel figured this out when they went to the Pentium M and finally the Core architecture. The industry needed a change. The 1000x 386 just wasn't cutting it anymore.

Here, read through these articles -- and they're not just blogs, or posts based on speculation by Nerds, like meee:
From Nintendo;
http://www.nintendo.com/wii/what/iwataasks/volume-1/part-1

Here's from Arstechnica;
http://arstechnica.com/articles/culture/wii.ars

What Intel did was not reinvent the Pentium; they actually took a step backwards, and not a little step, A BIG STEP -- the P4 with its newer "NetBurst" core was very inefficient. The Pentium M/Core, as you know, isn't based on a Pentium 4, nor is it based on a radical new design; it's actually based on the P6 core of a "Pentium 3," or to go further back, a Pentium Pro. When I stated the Celeron over-clock thing, I wasn't just being sarcastic. There's actually quite a bit of truth in that statement, just as much truth as your over-clocked Gekko comment.

Similar to the Pentium M, Nintendo had to completely rethink their approach with the Wii's CPU. They also wanted to create a lower-power chip that still had higher performance. Doing as you mentioned would not have made that possible. They talk about this in the above Nintendo link. They had to completely redesign the chip while using existing technology as the base -- which the entire industry does, especially Intel. This is nothing new, and by no means does it mean that each new chip is simply an over-clocked version of its predecessor; that's simplifying it too much.


"The Wii's CPU, while retaining compatibility with the GameCube's G3-inspired Gekko chip, will have additional features and performance that developers will be able to take advantage of."

-arstechnica


"Using state-of-the-art technology in unprecedented ways is far more complex, difficult, and requires more technological know-how than simply using the technology to improve performance. The Wii system is far more complex than that of the Nintendo 64 and GameCube. Furthermore, since Wii is compatible with GameCube software, we have not only tried to create something new, but we have also retained some of the old functionality. Honestly, this was not an easy task, but I think we can proudly present to the world a new console that will have so much appeal for so many."

-Takeda

And a 1000x386? Even the latest Core is still an x86 offshoot. It's a 9th generation x86; it's a 100,000,000x386. :)

Apple tried to also push this route with the G5 -- they figured that if people could write very streamlined code, the deep pipelines would be very efficient and really maximize performance. Of course, Intel went a slightly different route by tackling a lot of these issues in hardware with their new architecture, and after years of Apple telling us PPC was better than Intel, they switched. (reality bubble burst.)

The G5 was then and is still an excellent chip, even when compared to the latest Core and it was certainly a much better proc than the first Intel Cores offered by Apple, when referring to performance -- I wish I could slap everyone that fell for that 2X hype, when Apple was comparing 2 procs to 1 proc.

And when I say above that a Pentium 4 was inefficient, I was being nice. Even a mid-range G5 with its superior SIMD was way more powerful than the fastest Intel XEON(NetBurst core). The only x86 proc that was as good as a G5 and better, was AMD's Opteron. It took Intel how many years to catch up. If they had just stuck with the p6 core, they would have been able to keep up with IBM and AMD.

Apple wasn't spinning a farce about the PPC being better, because in many cases it was quite true, especially with the G5. Recall the G3: it was way faster than a Pentium 2. The G4 wasn't as fast as a P3 when it came to integer, but it had a way better SIMD than even a P4. Only Intel's latest SIMD, SSE4, is as good as, or better than, Altivec. When Intel went the MHz route with the P4, they basically lost the performance race for that period of time.

The reason Apple moved away from the Power PC was power-efficiency, and distribution. Intel could offer both of these things. I can't imagine Apple ever having another drought like they had during the G4s, or with their portables -- this is another reason I didn't bother upgrading my old TI-Book, since every new PB was just another speed-bumped G4.

IBM obviously wasn't giving Apple a much-needed low-powered G5 for notebooks -- whereas they could have -- and with Nintendo, Microsoft, and Sony all using PowerPCs, I can only imagine that Apple would have encountered more shortages. The G5 had also hit its MHz peak in its current form, but I can only imagine how fast things would be if Apple had moved to a G6 -- or whatever they wanted to call it -- instead of the new Intels. The new IBM Power 6 is the fastest processor on the planet that I know of, and the G5 was derived from the Power4. I can only dream, but I also want a new Mac portable... *rambles* :]


Sony is banking on the fact that many complex physics, AI, and even graphical manipulation routines are highly subject to being parallelized, therefore they're hoping that as the industry moves in this direction the PS3 will be the system of choice. Things of this nature are essential for maximum throughput. Of course, this also led to the whole "wah wah the ps3 is too hard to develop for" and is why Sony is pushing middleware out like there's no tomorrow. In fact, recently the Crysis guys said they're porting their engine over to PS3 and the 360 but not the Wii. Of course, they say it's because the Wii doesn't see a lot of developers using 3rd party engines on it. He also mentions the limitations of the Wii as well. Hrm.


I could not imagine the Crysis engine on a Wii in its current uberness, but it could be ported in a lesser form if there were a demand. Here, Yerli says it better and of course he reaffirms what we both know, that the Wii is limited compared to other platforms;
http://gameinformer.com/News/Story/200608/N06.0830.2058.31148.htm

These things are all great, and visually will give our eyes a feast, but if the development cost remains too high with a greater chance of diminishing returns, we'll see more developers/publishers opting for the less-expensive/safer solutions. As an example, Capcom's decision to move Monster Hunter 3 to the Wii instead of the PS3.

And since Crysis is a FPS, the only other controller I would want to play it with besides a mouse, would be... *it's not a dual-thumb stick* :]


This sorta goes against what you're saying Factor 5 is claiming. Just because they say they can port their engine to the Wii doesn't mean they have. Show me the money! :p

While it is true that you can scale a game back and get it to run on something slower, there are some games you simply cannot do that with and still have the same gameplay experience. Graphics/resolution isn't the only thing that goes into games.

Besides, arguing that the Wii is "underpowered" or not as powerful as the 360/PS3 is pointless since we all already know that apparently.

Yerli's comment from the above link fits well here. :)

"With the Nintendo Wii the approach will be similar. We have this great controller, we have the limited power of the console, How we can make a confined space or large outdoor level, whatever, how can we make the best out of the controller that’s giving the experience that we want to give? Completely fluid interactivity – how can we do that? I think it would be a completely different approach, and it deserves to be as well. So, if it our decision to make Crysis for Wii, if and I don’t want to be quoted saying we’ll do it. But if – if we would do it, it would have to be a completely optimal version, but it would be great. (laughs)"


Behold the Factor 5 grooviness;
http://wii.ign.com/articles/851/851287p1.html

And just go back and play Rogue Squadron 2 if you haven't; the game looks great and it definitely does not fit your earlier bright and muddy texture comment. ;)

And since I don't work for them, I can't show you anything. =P

No argument from me about performance. But I tend to agree with Yerli's comment about gameplay;

"You can achieve anything with every hardware. I think it’s a matter of artistic direction, how you use the limitations. That ultimately is the experience you want to give."

-Yerli

And on Factor 5, I still get annoyed that their arcade approach to Star Wars space simulators has completely killed the series IMO. :(.. I really really really want a new X-Wing... But I really like that they also see the pointer as the biggest innovation on the Wii -- did I tell you I love the Wii's pointer... :eek:

Where do you see that the Gamecube's GPU is smaller than the Wii's GPU? Just because the chip may be physically bigger doesn't mean the die is. Besides, I am using the term "overclocked" to mean faster busses, wider paths, more cache, etc. Same basic architecture but just more capacity. It's like going from a Pentium to a Pentium 2 or 3. It's just more cache, another new instruction set for certain types of calculations, and faster clocks. Big deal.


Here be the GPU size thing.

Wii ---- > O Cube ---- > o See. :D

Here's the Wiki about the size, but, it's Wiki, so it's.. Wiki;
http://en.wikipedia.org/wiki/Hollywood_(graphics_chip)

See my ramblings above; all chips are based on existing tech, or some roadmap. You're pretty much describing any CPU. There was a HUGE difference in performance between the P2 and P3 BTW, that was a big deal. ;) Even the Cell has its influences and similarities with earlier chips like the Cray and even GPUs. It may be new, but it's not radically new -- and as we all know, it sucks for general purpose computing.

Check out this nifty chart on the difference between x86 chips;
http://en.wikipedia.org/wiki/X86_architecture

There's cache, new instructions, super scalars, multi-core, monkeys... No matter how you generalize it, these things can make a huge difference in performance. Look at the early XScales vs the later ones. There was a massive bottle neck on the PXA250, which Intel didn't really fix until the PXA270 -- their PXA255 fix didn't really do squat.

Microsoft did something interesting as well in this area. They also made a new architecture for their console, but made the development feel like a familiar PC. Pretty good move, imho. The stuff that they did with their memory controller to help with the limitations of UMA was quite clever, and while I still feel that UMA limits the top-end capabilities of the system, the cost savings is rather nice. Plus the devs get flexibility on how much memory to assign to graphics and how much to the CPU.

Truly a PC friendly development system. Direct X has been good for MS and they were good for PC gaming, at least before the XBox.

But still, even the tech they're using is based on existing tech. I know what you're getting at, though -- the Wii took a more traditional path than Sony or MS -- but as Nintendo states, that's not really true, since low power was their goal, which required more than just a moderate tech change.

Why would you need a 128-bit memory interface anyway? (memory interface sounds like addressing to me.) In reality, the Wii should be extremely happy with a 32 bit architecture (and probably 28 address lines) because of how much memory it has. Same goes for the PS3/360. I always laugh at idiots who think 64-bit computing is somehow better. If you're working with a 10 gig scan in photoshop or doing a lot of complex calculations with 64-bit numbers (64-bit data bus) then sure, but if you're editing your 30k resume then certainly not. When you have to store addresses as 64 bit, you are effectively cutting your cache in half. Not a smart idea!


I mentioned it, because it's more than just an over-clock -- which I know you're just using as a blanket-term.

Anyways, why? For the graphics. It's important for bandwidth and does make a HUGE difference in the real world, which I've learned just from putting together gaming rigs -- and on this note, if the Wii's GPU could not make use of that bandwidth, there would have been no need for Nintendo to use it, and they would have saved money. Most budget video cards from a few years back were only 64-bit, which cut cost. Just to share, my GeForce 2 Ultra from 1999 had 128-bit memory -- it rocked for its time and put ATI to shame. Even the very first consumer 3D accelerator had a 128-bit bus, the Nvidia Riva from 1997. ~i~ And some think this stuff is new. :p


And you don't need to preach to me about 64-bit memory addressing -- I'm old enough. :]

I've been using Photoshop since version 2 and professionally since version 3 -- I know it in and out, like my foot. I've worked on billboard art --- HUGE--- and I'm the kind of guy that always ran out of layers before Photoshop 6 -- I even found a bug in PS 5 which allowed me to exceed the 100 layer limit (or it was 99, it's been years.), which would instantly crash PS and corrupt the file -- Painter saved my butt when this happened. But PS is NOTHING compared to video editing/post processing effect, or 3D rendering work. These things are PIGs.

The 64-bit thing doesn't humor me as much as those that think Photoshop is the main reason why some of us need multiple procs, when by today's standard of hardware, Photoshop is a rather tame app; even my old G4 still does an acceptable job with it -- 2 gigs sucks though, since PS has to share that with the OS.

As far as I can tell, the Wii has the same GPU as the cube except it does more of the same thing. I think video cards are due for an architecture redesign because you're seeing things like dual and quad SLI cards coming out so people can push more of the same thing out.

That makes sense; everything does point to the Wii's GPU being very similar to the Cube's, but I'm still an optimist that they did add a few extra undisclosed goodies. It did require ATI to design it, so maybe it's like the CPU. And I agree about video cards in general. SLI is like going back to the Voodoo days, but with a way larger power requirement. Ahhh, the memowies.


Anyways, I think that if Nintendo continues to do well and becomes the next PS2 -- not graphically *grumbles* -- it would be hilarious and a huge slap to Sony and MS at the end of this generation, if the Wii really had no differences from the Cube besides a few minor changes. Then Nintendo could just go "BTW..." But I'll give them a bit more credit than that, though I concede that the changes aren't huge between the Cube and Wii, especially now that I've read some newer docs on it and further speculation from others. The guys that made BWii, which I like and which is visually one of the better Wii games, stated the Wii is twice that of a Cube. It's not what I had optimistically hoped for, but it's still better than 1.5. It's a Cube 2.0. :p And they also said they didn't use the Wii to its fullest potential when making their game. The same was said by Retro about Prime and even Nintendo about Galaxy -- which looks great in some areas, so I'm only expecting things to get better.

<]=)
 

atszyman

macrumors 68020
Sep 16, 2003
2,437
16
The Dallas 'burbs
Atszyman,

I've just never heard the term "memory interface" used like that. I've always heard databus or data-width. BTW, it's very common for architectures to have 32-bit addressing but fewer address lines. The NEC VR4181, a system-on-a-chip based on a MIPS R4000 core, had 21 address lines but 32-bit addresses. Why? Because it was a 32-bit processor and having 21-bit registers for addresses would be silly. :) They actually use some of the upper bits to help determine the behavior of the MMU. IIRC, they used the upper 2 bits as follows:

I think "memory interface was a poor choice due to the fact it could be assumed to be so many different things, data bus width, address width, memory type (DDR, DDR2, QDR) or any number of other factors.

SD cards are serial and even support SPI, which is something that any embedded systems engineer should be familiar with. That or I2C. You wouldn't want to address data that way on a CPU's memory bus. :)

I know serial interfaces all too well for my own good.


I also can't find anything that says the Wii is 128-bit except for some people claiming it on forum posts. I think Nintendo is more closed about their hardware, though. I recall someone mentioning there is something on the Wii's motherboard that nobody knows the purpose of. Probably a charge pump to blow the GPU if you ever pirate their games. :p

Which I think is why it's poor terminology. I would have to assume they are talking about the data bus, since even 64 bits of memory address would be well beyond what any system today needs; 16 exabytes of addressable space is a bit of overkill for a home console. However, if the memory and address bus are shared (like the PCI bus), then technically it could handle 128-bit addressing, but to have that amount of memory for $250 would be a steal.
 

JackAxe

macrumors 68000
Jul 6, 2004
1,535
0
In a cup of orange juice.
Memory interface this. :)

When I was thumbing through Wii specs, the little that is known, that term was used. I regret not being more vague. I should have just listed 128-bit. :p

<]=)
 

darkwing

macrumors 65816
Original poster
Jan 6, 2004
1,210
0
0:

Yeah I really like Symphonia's style, too. Of course, the 480i presentation stinks because the faces are so blurry you can't see them unless it's up close. I forgive the visuals because they're only really noticeable in certain cut scenes, but I think the GC could have done better and hopefully they will make an effort. (Tales of the Abyss looks much better than Symphonia even though it isn't cel shaded, which is something I really like.)

1:

Why can't you just tape a PSP to the Wiimote and swing it around?

2:

Being instruction-compatible doesn't make the current generation an "x86 off-shoot." Similar yes, but sort of like the way my 2006 GTO (Holden Monaro) is similar to a model T.

From checking benchmarks at the time, I was finding that single-core comparisons of G5s to P4s were quite often in the P4's favor, unless it was things like Photoshop tests. (And the Ms which were starting to come out were actually beating the G5.) The thing that really put me off PPC altogether was benchmarks comparing compile times. Of course, this is completely integer, and in many cases it's open source software not targeted specifically for a certain chip. It boils down to the PS3 argument, which is that it's capable of being faster if the investment goes into making the software in such a way that it works well for the chip. Of course, a lot of things benchmarked for the G5 were done with GCC, which was horrible compared to IBM's own compiler (supposedly), so maybe we'll never know.

I'd actually like to see the new C/C++ specs include a standard for branch hinting and other human-knowable optimizations.
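
In the meantime it's all compiler-specific. GCC has __builtin_expect, which is what the Linux kernel wraps in its likely()/unlikely() macros -- quick sketch:

#include <stdio.h>

/* GCC-specific branch hints; there's no standard C/C++ spelling for this,
   which is exactly the gripe above. */
#define likely(x)   __builtin_expect(!!(x), 1)
#define unlikely(x) __builtin_expect(!!(x), 0)

static int parse_byte(int c)
{
    if (unlikely(c < 0)) {          /* error path: tell the compiler it's rare */
        fprintf(stderr, "bad input\n");
        return -1;
    }
    return c * 2;                   /* hot path stays the straight-line case */
}

int main(void)
{
    printf("%d\n", parse_byte(21));
    return 0;
}

It's a hint about block layout and static prediction, not a guarantee, and getting it wrong can hurt, which is probably part of why nobody has standardized it yet.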

Agreed that Intel was able to deliver more mobile solutions. Honestly, I think IBM wanted to get rid of Apple because they had the CPUs for the 360, the Wii, and the PS3. That ships a lot more units than Apple does. At least, at the time it did. Apple's gaining marketshare fast (probably due to the Intel switch forcing them to keep up with the latest tech, as an article here on MR.com suggested) and so maybe IBM is starting to regret that decision, but honestly I feel that Apple and IBM just had different roadmaps and weren't compatible. It's probably for the best.

I used to love PPC, but as a software engineer who runs builds several times a day, I hated waiting for my G4 to chug along while my Linux box was probably 3-4 times faster. My MBP does it just fine, however. As previously stated, this is probably due to raw integer power of the cpu and no real optimizations in mind for the compiler.

I think that due to the Wii's success this generation, incredible visuals and physics are going to go by the wayside and the gaming industry will have to wait another ~8 years to raise the bar I was hoping to see now. Obviously what I want doesn't match up with the market, and that's too bad for me. I'll still play whatever's out there, because I'm a consumer whore. :) We saw some amazing stuff with the graphically-weak PS2, and I think we'll see some good stuff with the Wii as time goes on. It's just that people will say "this looks great for the Wii" instead of "this looks great."

You could port anything to anything in a stripped down form. Porting Crysis to the Wii would be like porting the NES version of Donkey Kong to a wrist-watch.

Isn't Rogue Squadron 2 the game everyone makes fun of? Or was that Red Something-or-other?

I might pick up that cel-shaded rated M game. I've heard it was pretty funny. A girl who worked at a Gamestop told me that it was comical because of the violence coupled with the "graphics of the Wii." She was a huge xbox fan, though. She's the one who whispered "come back tomorrow at 10:30 if you want a Wii, but I'm not supposed to tell you that."

The link to the GPU showed me where you get this 128-bit stuff for the Wii. It's a GPU thing, not a CPU thing. Quite standard on GPUs, too, actually. The reason is that, like Altivec, they use 128-bit SIMD registers. I didn't realize the Hollywood had hardware AES/SHA-1. That's cool stuff. I recently implemented both to perform a Kerberos-like ticket exchange/auth on an 8-bit microcontroller. Can you say slow? I knew that you could. Actually it isn't too bad. Probably about 1.5 ms, which is good for that chip. Some of the code I was given to start with was taking up like 30K, all because they unrolled some loop. (wtf?) I rolled it back up, and it took only like 2.5K. Fun stuff. Seriously people, when the code inside the loop is over 2K, unrolling it 16 times to save an increment isn't the best of ideas, especially when you only have 16K of ROM. (The large code obviously didn't load into the chip.)
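
To illustrate what I mean -- made-up round function, but this is the shape of the thing:

#include <stdint.h>
#include <stdio.h>

/* Hypothetical stand-in for the real crypto round: imagine a couple of
   kilobytes of code here, not one line. */
#define ROUND(s)  do { (s)[0] ^= (uint8_t)((s)[1] + 0x5A); } while (0)

/* Rolled: the body sits in ROM once, plus a counter and a branch. */
void transform_rolled(uint8_t *s)
{
    for (int i = 0; i < 16; i++)
        ROUND(s);
}

/* "Optimized": the body is pasted 16 times, so ROM usage is ~16x the body
   size just to skip an increment and a compare -- a terrible trade when the
   body is ~2K and the part only has 16K of ROM. */
void transform_unrolled(uint8_t *s)
{
    ROUND(s); ROUND(s); ROUND(s); ROUND(s);
    ROUND(s); ROUND(s); ROUND(s); ROUND(s);
    ROUND(s); ROUND(s); ROUND(s); ROUND(s);
    ROUND(s); ROUND(s); ROUND(s); ROUND(s);
}

int main(void)
{
    uint8_t state[2] = { 1, 2 };
    transform_rolled(state);
    transform_unrolled(state);
    printf("state[0] = 0x%02X\n", state[0]);
    return 0;
}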

I didn't realize the GPU had the stuff that talked to WiiConnect24 in it. That explains why the GPUs were burning up. :p

When I speak of architecture changes, I don't mean necessarily bolting on more cache or wider busses. That's what I refer to as "overclocking." AMD added hypertransport which made a nice benefit. It got rid of the xbridge (north or south, I forget) and put more things on the die. Nice benefit! No more FSB, either. That's a huge architecture change. Even SIMD stuff, while cool, is only "meh" if everything else stays the same.

I think you're the first person I've ever heard who says Photoshop is a "tame app." Bravo! I've often wondered why graphic artists complained about needing more power for Photoshop compared to people who do things like 3d rendering. Photoshop is only rendering 1 frame, right? As compared to what... millions? I'm no good at graphics. I can't even draw a straight line with the shift-key held down, so I don't really know what I'm talking about here and won't pretend to.

I found the Broadway's datasheet here: (apologies if it's fake)

http://microblog.routed.net/wp-content/uploads/2006/10/750cl.pdf

This specifies, as I was saying, that there is a 64-bit databus that also supports 32-bit mode. (I didn't say it also supported 32 bit mode, but cool.) Doesn't even support 64-bit addressing which doesn't surprise me.

The articles you sent me regarding the "new direction" of the Wii don't really say anything at all to prove the CPU isn't just more of the same from the GC.

http://arstechnica.com/news.ars/post/20061031-8112.html

3:

I sometimes play Minish Cap in my DS. I have it in my nintendo-bag. It is easy, but I like it. Reminds me of the SNES Zelda I liked so much. My friend liked Phantom Hourglass though he wasn't intending to get it. When I can find it for under $20 I may pick it up. I like that Biggoron (or whatever it's called) sword in OOT. I forget if that's the Goron one or not. I got to the water temple in OOT in 2004 and only finished it about 3 weeks ago. Same for WW. I need to catch up on my Zeldas. :p I'm kind of having fun with TP because of the story. Even though it's horribly predictable it's still fun to see it play out, and Midna's attitude is quite fun. I still can't get the shield attack to work without getting ticked off at the crappy Wii controllers :p but the rest of the game plays well. I don't seem to have issues with not swinging the sword when I want to anymore.

You said Brawl again instead of Melee. :p

4:

I didn't know SNES Doom had a Super FX2 in it. That's pretty cool to know. Thanks.

5:

My friend actually bought that Partners in Time nonsense. I still make fun of his masculinity (or lack thereof) for it.

You'll still see jaggedness even with 1080p sometimes. I notice it on Gran Turismo 5P; during the closeups of the cars, sometimes the lines around the windshield cause a break in the lighting. It isn't "jagged" per se, but it is "crappy." Overall I see a lot less of it and things look better. If you want to believe 480p looks just as good because it's all the Wii can do, then go ahead. :p

I hate digital cable. I won't buy anything made by motorola, either. Their software people are all idiots. Both RAZR phones I've owned and both Moto cable boxes I've used have serious software issues. They crash or reboot a lot on their own, and sometimes they won't respond to buttons for 10 seconds or even more (2 mins one time on the cable box) but will queue up whatever I pressed and then run through it all afterwards.

I know PCs are more capable than the PS3 but they will never be as stable (too many hardware options) or as cheap (running at 1600p.) I'd be a PC gamer except for two reasons: 1) I like JRPGs, and 2) I won't give a dime to Microsoft if I can't help it. I don't own a PC and never intend to, except for Linux of course.

Microsoft bottlenecked the 360 with the UMA, as much as people want to argue about it, so it'll never have the throughput of the PS3. However, MS is in a good position because they can release a new console in a couple of years that will destroy the PS3 and if Sony follows suit, it'll look bad because they didn't follow the "10-year plan" they spoke of. If the new MS console has full backwards compatibility with the 360, then I think it'll work out well for them. We'll just have to wait and see.

6-7:

I don't run Linux on my MBP because I don't want to deal with it. I like OSX graphically, and while it isn't as efficient as Linux it's good enough for just about anything. I like having the different experience and I like showing people the pretty side of computing (as opposed to Linux) and how easy it is to use OSX. I have been starting to think about getting rid of OSX, but I'm not there yet. Mostly out of laziness. I only recently feel like I've recovered from school and that ended almost 2 years ago. :p (I worked my way through.)

My girlfriend loves games but won't have a lot of time to play since she's finishing up school and will be in med school soon. sigh.

8:

It was a jerk move, and it's why my friend vowed to never buy a Wii, though he's reconsidering for MP3 since he really likes the series. I just beat the Omega Pirate yesterday on my first try in MP1. Once I finish 2 I'll give MP3 a real run-through unless I throw the Wiimote in the garbage disposal. Actually I'll probably throw the sensor bar in there since it isn't the Wiimote's fault.

9:

No Windows. Sorry!

10:

You have to get the stein the first time, iirc.

11:

I said anything below 720p for gaming looks like garbage. The reason TV never suffered from a lot of jaggies is because of the built-in anti-aliasing effect you get when filming live video. (At least, that's what I've noticed.) DVDs look fine, but BDs are the best. ;)

I never really noticed the PS2 making over-use of bright colors. They always seemed kinda dull to me. :p

I know MS is buying its JRPGs and it stinks. Still, that's the way the free market works. If I buy a used 360 someday to play them, then I won't feel so bad since I didn't actually give MS the money. I won't bother with XBL since I don't really care for online gaming anyway.
 

darkwing

macrumors 65816
Original poster
Jan 6, 2004
1,210
0
MacRumorUser said:
I was only messing, in fairness you both have made some really interesting reading material keep it up....

We try. :) BTW, MRU, you should PM me your aim/msn since the one I added like a year ago has never shown you online even once. :p

Antares said:
I know, right?

The way you guys are waving your Wii's around....seriously.

Heh, heh. I love a good debate. I echo, MRU. Keep it up.

Hey, when you got it, flaunt it!

You guys are just jealous, because this thread is so freaking long. :)

JackAxe said:
I'm just playing Okami, well, until it froze, :( but I'm working on my ramble. :eek:

<]=)

Okami froze? Ack. It's a good game. The only problem I have is with the (surprise) controls! I am unable to get the power slash to go where I want it to go about 50% of the time even when holding down Z, because it tends to make the line at the wrong angle.

I did some more reading up on the control strip, and it does seem that it's only going to work some distance away, so maybe I will have to get a box or something and stick it in the middle of the room.
 

JackAxe

macrumors 68000
Jul 6, 2004
1,535
0
In a cup of orange juice.
I played it again and it didn't crash. :) I couldn't recall last night, but today I remembered that Paper Mario Wii had frozen on me once.

The angle thing is weird. It seems to be like that since the table is at an angle, so I approach it that way.

Anyways, freaking awesome game. I'm glad I waited, but then again, this game merits a PS2 purchase. Too bad bout Clover.

The box sucks, but it really does work, even on my friend's really nasty wall setup with his plasma above the mantle.

<]=)
 

JackAxe

macrumors 68000
Jul 6, 2004
1,535
0
In a cup of orange juice.
Super LONG. I've been scribbling on this off and on. It's become a second job. :eek:

0:

Play it on a CRT, it helps the fuzziness, or it adds fuzziness. :) At least games like WW and Prime supported progressive scan. For games like Rogue Squadron 3 (EVIL) it was an absolute must. At 480i the game's visuals were sometimes impossible to see. I had to fly around with my targeting viewer always on, which kind of took the fun out of the game, since this is the kind of game I want to see all of the visual uberness. But this was their stupid fault, since RS2 looked great at 480i all of the way through.

1:
You can do that with your PSP, I recommend using duct-tape and super-glue. Let me know how it turns out. O.O

2.
It's all related to a degree, we can definitely both agree on that. :]

Performance is definitely relative to what you're working with, to be mr. obvious. Oo For me it was rendering in Mental Ray and AfterEffects that pushed me to a G5, oh and that 30" screen had just been released.

Rendering in Mental Ray on a G5 was night and day when compared to the best Xeon at that time. Opterons were the only x86 to best the G5 in rendering, but the difference between the G5 and Opterons wasn't as big as the G5/Opterons vs Xeons. The same goes for After Effects when rendering out a final video. It could chew through about 4 or 5 frames for every one on a Xeon. Video work was always better on Macs, because of Altivec, but nowadays with this Intel switch, it's kind of moot.

The one area of performance that has always bothered me is OpenGL, and it was shown to be an OS X problem. I'm pretty sure Leopard is better than Vista though? It wasn't bad, but apps like Maya are really GPU intensive, so it really hit home.

You and your compiling. My friend's a programmer -- a few of them actually, I remember carpooling with him and he would have his notebook on, but closed as it compiled whatever he had been working on that day. I thought that was funny -- this was in 98.

When Flash was really slow on Macs, so during the G4 DP 500 days, (Which I had at my old job) I had to request a PC. I got a P3 733, which was about 3 times faster at building the SWFs. At home, I did all Flash work on my PC until 2002, when Apple finally offered something better(At that time.) for this sort of work.

And since you're a Linux guy, on a "fan"-tastic MacBook Pro, and a programmer, you probably have your reservations about Flash and a few kind words, since there's so much crap out there compared to the good stuff -- I can assure you I don't like the crap either. :)

---

Good point about the looks good for a Wii comment, that's exactly it. Give me more games like Okami and I'll be quite content. That game is spectacular for a PS2. Physics shmyshics, see, you don't need a powerhouse to simulate real physics;
http://spacewar.oversigma.com/ :)

I don't really care either way about physics, as long as the game is fun, or it's able to simulate a flight-simulator as an example -- physics rock here, but the best ones I played were back in the DOS days. My only gripe has been that so many developers have focused too much on the technical aspect of making a game, instead of making a game for the sake of making one, if that makes sense?


But I'm still amazed at the post processing work that these newer GPUs are achieving, especially the bigger consoles, and more so than physics.

I think even if the PS3 does become the leader now, much of what was promised visually still wouldn't become a reality until next generation, if not the next. I recall how MS and Sony were boasting the HD capabilities of the PS2 and XBox and we know how that turned out. As of now, it takes a PC with several uber GPUs and CPUs to actually achieve what Sony and MS had over-promised for this generation. And lastly, Konami's remarks about MGS4 -- which to me looks amazing -- not being able to achieve what they had imagined due to the PS3's limitations. (I don't totally buy this though, since they'll probably figure out a better approach to do something next time around.)

-------

NES and game watch, that's an EVIL comparison, even I wouldn't go that low. :eek:

As shown to me by the vastly technically inferior System Shock, it's not the tech that makes a game, it's the studio and their abilities. Looking Glass did more on a 386 than studios nowadays can do with the highest-end gaming rig. I'll also put sound above visuals for some types of games, as being more important. In System Shock, it was Shodan's voice (in stereo) -- a woman's voice -- that scared the bejesus out of me. It helped make the whole experience believable.

I make fun of Rogue Squadron 3 -- it was bad in parts (controls and bad visual choices). But RS2 is technically one of the best looking games of last generation. The textures on the X-Wings, no matter how close, still looked good.

Which game is the cel-shaded one? If it's a game we were discussing, my brain has dumped it from memory. :eek:

For repetitive stuff, 128-bit is cool and saves me time -- that's my consumer/power-user stance. And yes, I knew it was a GPU thing, that's why I posted a follow-up to clarify. :)
It only makes sense to take the hardware route for security. Nintendo hates piracy. :p

I'm not as nerdy as you. :p I have a friend that has been programming embedded processors -- don't know if I used the right term -- since the eighties, you guys would probably have hours of ramblings.



Burning up? :) I hadn't heard of this. I found this though;
http://wii.qj.net/Wii-overheat-causes-video-output-error/pg/49/aid/77236

----

On the Photoshop thing, in some cases it might as well be 1 vs a million frames. This actually describes AfterEffects, which is pretty much Photoshop for video.

I think your shift-key is broken BTW. :]

This can be a huge ramble, so I'm going to try and keep it really short, which, as you can see from my posts.. :eek:

Photoshop only gets hog-worthy when working on printable art with loads of layers. But with OS X's ability to handle its caching, which gives CS2 and later access to all of a system's memory, even if it's beyond the 32-bit limit -- Barefeats discovered this -- and OS X's ability to give each 32-bit app its own 2-gig block -- given there's enough RAM, and of course computers that eat through 2D like it's nothing -- Photoshop rarely taxes a system like it used to. It's been over 2 years since I last heard my hard drives in a frenzy.

3D rendering can pound any computer, and I mean any, into the ground. Applications like Maya have to wear many hats and of course rely on apps like Photoshop to fill in the voids, but each part of Maya is generally as complex as Photoshop, and in some cases more so -- PS, of course, has absolutely no rival for what it does.

3D rendering is one area that truly benefits from both 64-bit addressing and calculations. Even companies like Pixar, with their vast and powerful render farms, still required 70 to 100 hours just to render one frame of some of the more complex scenes in movies like Cars. I've never done anything even remotely close to that, but even the corporate cheese I made has really taxed my systems. It took about 12 hours for my G5 and 4 other PCs (3 borrowed at the time) to render out 151 frames of 720p animation.

Anyways, there's way, way more to this than my limited ramblings, which are probably flawed. I've been learning Maya since version 3 and I still don't have a grasp of all of its features. Studios will have a team for each aspect of the program. I'm just one guy, so I don't have that luxury and 3D isn't my strong suit. *excuses*


---

I've seen all of the speculation that it's just a (750CL) G3, and I agree that it's most likely "based" on that chip. But I can tell you for a fact that the Broadway is not that chip. The Gekko has a "SIMD," therefore the Broadway must also. The 750CL does not have a SIMD. This goes for all G3s, and it's kind of what makes a G4 a G4. And you can go meh about SIMD, but from my experience, Altivec has saved me countless days over the years. I could not work without it. Just because you have no perceived benefit from it doesn't mean it's not important; it's certainly WAY more important than a PPU for most users.

Why is 64-bit addressing even relevant for a console? None of the current consoles have a need for it and none of them use it. Intel's lower end and earlier Core procs are only 32-bit, so I really don't see how it's important for consoles at this time. By the time it is important, we'll have holo-decks.

That article states that they had to use the latest techniques to fabricate the Broadway in order to meet their needs of a low-power CPU which also retained higher performance. I thought that was pretty clear and I even quoted it. And as we both know, every processor is more of the same, but as Nintendo stated, it took more know-how to take their approach.


3:
I only died at the end boss of Minish, and it was for a dumb reason. I quickly shut off and started up again to clear that blemish, then killed the end guy quickly. Way too easy, but soo much fun. :)

I finished OOT the first month, but didn't complete LttP until after I finished Minish Cap, so 12 years later on my DS. :eek: I owned it on the SNES.

You suck with the shield, you should buy a new one, or just practice non stop. :D Midna is one of my favorite game characters to date. I'm so glad they didn't include the demon Tingle in this game.

4:

I found it while looking over Doom, since I had forgot it was even on the SNES.


5:

Oh, I was forced at gunpoint to buy Partners in Time... *hides copy*


I don't believe that 480 can look better than 1080p. What I'm trying to convey is that 1080p and even 720p are often underutilized for their resolutions. Like I mentioned, videos rip detail out, whereas if it were a still image of that resolution -- even somewhat compressed -- there would be way more noticeable detail. For games it's as you mentioned: the developers are cutting corners to save on performance, corners that on a lower-rez screen would actually be masked in some cases, like the following;

800- looks awesome, most imperfections have been masked by the lower rez:
http://kotaku.com/photogallery/mcla421/1001234977?viewSize=thumb800x800

1280 - Looks great, but even looking past the obvious jpeg compression of the image, you can see many imperfections(Cut corners), like in the clothes, wheels, and so on:
http://kotaku.com/photogallery/mcla421/1001234977?viewSize=thumb1280x1280

Digital cable can suck it. My last box was a Moto. It was slower than dirt and needed resetting too often. Now I'm on Dish's dual box thing and it's not bad, just loud. If I don't update it, it freaks out.

I signed you up for Microsoft's mailing list, they'll need your credit card. :] Those reasons you mentioned are why MS and others have now focused on consoles.

If MS releases another console too soon, they'll never make a profit. That makes me happy for some reason. :)

6-7:

OS X is purdy. It makes my eyes salivate. I've seen Linux, but never used it. The closest I get is the terminal window or DOSBox these days.

Good to hear about working your way through school. I really respect that. I know too many people that were handed their education, cars, houses, etc.. Not me. *jealous* I had some schooling, life drawing, painting, etc., but ended up sidestepping into art from customer service. Back then, this was how lots of guys got going; nowadays it's way too saturated. It took forever to pay off my wife's student loans though. It seems as though you're better prepared than I was.

All an excuse... :p I tell you that none of them like it, they have their own agenda and it's EVIL... ¬¬ Just bug her during her long exams, she'll appreciate it.


8:

He'll come around, after all, the Wii is a virus according to Epic. But at least Nintendo didn't dump the Cube version and given their stance in the industry at that time, it was all business.

Glue your sensor bar to the middle of your screen, I assure you it will work better. O.O I don't recall the omega pirate, just that EVIL Ridley in MP1. I didn't use my missile, so kept on dying. Blah. I'm still at 99% in MP3 at the end.

9:

No apology, especially for that. There's DOSBox. I'm just going to keep on trying... :D I got it up and running; it takes about 100% CPU for my machine to run it smoothly, and it uses the mouse to point, which I had totally forgotten about -- just can't mouse-look. Here, read this guy's review to see why I'm so obsessive about this game;
http://www.mobygames.com/game/dos/system-shock/reviews/reviewerId,13878/


10:

I'll give it some time and figure it out. He didn't like me just swiping it. :)

11:

Even Okami looks like garbage... :( BD, as in Blu-ray I'm assuming. Yeah, they look great on a great HD set, just not the crappy ones. :p

PS2 games are dull. :p OK, I kid.


The PS3 will probably get those ports sooner than later, if not, EVIL... MS is like Nintendo with their EVIL business agenda. :]

I like online PC gaming, but only with friends. I'm too shy. (O O)

<]=)
 

darkwing

macrumors 65816
Original poster
Jan 6, 2004
1,210
0
0:

This is true. However, I don't own a CRT. :p Well, I have an SGI monitor that goes with my O2...

1:

Sony had to warranty the duct tape because the PSP flew through the window.

2:

I was never too impressed with the OpenGL drivers on OSX. I wrote a little 2D Raiden-like shooter once for a school project as a proof of concept for cross-platform game development, and we demoed it on my 1.5 GHz G4 PB, my friend's P4 laptop with Linux, and another friend's P4 laptop with Windows. The OSX version used about 6% of the CPU while they were only using about 2-3%. I profiled the code and most of the work was being done in an OpenGL call. Ick.
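
If you ever want to eyeball that sort of thing without a real profiler, just timing the chunks of the frame loop gets you most of the way there. A throwaway sketch -- not my actual project, the stubs are placeholders:

/* Crude frame-timing sketch (illustrative only): time each stage of the
 * loop to see which part is eating the CPU.  The stubs stand in for
 * whatever the engine actually does. */
#include <stdio.h>
#include <sys/time.h>

static double now_ms(void)
{
    struct timeval tv;
    gettimeofday(&tv, NULL);
    return tv.tv_sec * 1000.0 + tv.tv_usec / 1000.0;
}

static void update_game(void) { /* stub: game logic would go here */ }
static void draw_frame(void)  { /* stub: OpenGL calls + buffer swap */ }

int main(void)
{
    double t0 = now_ms();
    update_game();
    double t1 = now_ms();
    draw_frame();
    double t2 = now_ms();
    printf("logic %.2f ms, render %.2f ms\n", t1 - t0, t2 - t1);
    return 0;
}

In a real game you'd run that every frame and average it, but even one pass makes it obvious which half of the loop is the hog.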

Back in 98 he ran long compiles on battery power? Ick. I bet that baked his legs, too.

Never was impressed with Flash on OSX, either. Always used 100% of the CPU back in my G4 days as opposed to like 5% on Windows. Sheesh.

I don't mind flash, because it has rather limited purposes and its proponents aren't billing it as a replacement for a real language like C. (If Java was so great, why is the JVM written in C?)

That game you posted a link to is doing only a couple of calculations. It doesn't make a good point. :p Even Mario Kart has more physics than that.
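
Rough idea of what I mean by "a couple of calculations" -- a back-of-the-envelope Spacewar-style gravity step, with made-up constants, not anything from an actual game:

/* Spacewar-style physics sketch (illustrative numbers): pull a ship
 * toward a star at the origin and integrate its velocity each frame.
 * Build with: gcc physics.c -lm */
#include <math.h>
#include <stdio.h>

int main(void)
{
    double x = 200.0, y = 0.0;      /* ship position */
    double vx = 0.0, vy = 1.5;      /* ship velocity */
    const double G = 500.0;         /* made-up gravitational constant */
    const double dt = 0.1;          /* timestep per frame */

    for (int frame = 0; frame < 5; frame++) {
        double r2 = x * x + y * y;
        double r  = sqrt(r2);
        double a  = G / r2;          /* acceleration toward the star */
        vx += -a * (x / r) * dt;     /* accelerate toward the origin */
        vy += -a * (y / r) * dt;
        x  += vx * dt;               /* Euler-integrate the position */
        y  += vy * dt;
        printf("frame %d: pos (%.1f, %.1f)\n", frame, x, y);
    }
    return 0;
}

That's basically the whole simulation in that game. A kart game is juggling collisions, drifting, item trajectories, and a dozen other things on top of that every frame.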

"My only gripe has been that so many developers have focused too much on the technical aspect of making a game, instead of making a game for the sake of making one, if that makes sense?"

This is true, however the flip-side can also be said. Too many games are being made for the sake of making them so nobody is focusing on new and innovative ways to play. The more power you have, the more capability, and thus the more possibilities of finding new ways to play. Personally I think Sony should have made the PSeye higher resolution. I think there was a lot more that could be done with full body movement and image processing than can be done with the Wii's motion/pointing and the PS3 was the machine to do it. Even with the cheap camera they could still get things done, but higher resolution means more possibilities right? :p

Konami's remarks about MGS4 and "limitations" were made back in the day when devs were complaining the PS3 was too hard and Sony wasn't providing enough support. I haven't heard of such stories for a while, but I could be wrong. I actually picked up MGS3 last week to play sometime. I quit playing 2 because Raiden pissed me off. If I want to listen to some Shakespeare quoting hippy I'd drive up to UC Berkeley.

In regards to your opinion that it isn't the tech it's what a studio does with it, I agree! However, I go back to my previous point. Suppose the PS3 existed when they made System Shock, and they were going to push the limits of what it could do. It'd be a much better game than was made on the 386. If a studio wants to make a great game only for the Wii, they could make a better game for a more powerful system. Maybe it'd cost more to make, but then they'd sell it for more. After all, if the game is so cool, people will buy it. ($100 halo 3 limited editions ftl.)

I never played any of the Rogue Squadron games. I don't even know what they're about.

Tales of Symphonia and Wind Waker are the cel shaded games we were talking about, if I recall.

I don't think Nintendo uses the optical tech they use to prevent piracy. They say that, but in reality I think it was to avoid paying DVD royalties. :)

I find most people who have been programming embedded systems since the 80s still write software like it's the 80s. Ick. (I'm 29.) So bring him into our conversation and we'll find out. ;)


There are videos of the wii gpu damage here:

http://www.youtube.com/watch?v=H_rIHnqpk4c
http://www.youtube.com/watch?v=CQojKNQM8Ik&feature=related

You can find more. Wiiconnect24 damages the GPU and causes vertical bands to appear in games. I thought they had it fixed early on, but a video I found said that he bought his in May 2007 so maybe I'll turn off WC24. One thing I give props to Sony for is that they actually pulled off a new console without major widespread hardware problems. Some Wiis came DOA asking for a factory software disc or something. We all know about the 360's problems.

Any modern operating system could give a 2-gig block of RAM to an application. I don't understand why you are calling this a "feature of OSX." Doesn't Maya support distributed rendering? Maybe you could get a Mac Pro cluster going on. :)

I never said that Altivec wasn't important, did I? I don't know where you are inferring that from! Broadway must have SIMD since the Gekko did, but the Gekko is known to have had the SIMD instructions added to it. Using the "latest techniques" to fabricate the CPU doesn't mean the architecture is any different. It just means they made it a different way. :p
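
Since we keep going back and forth on SIMD: the whole point is just doing several identical float operations in one instruction. A quick sketch of GCC-style AltiVec on PPC -- nothing from either of our projects, and the build flags are just the usual GCC-on-PPC ones:

/* AltiVec sketch (illustrative): one vec_add does four float additions
 * at once.  Build on a G4/G5 with: gcc -maltivec -mabi=altivec simd.c */
#include <altivec.h>
#include <stdio.h>

int main(void)
{
    vector float a = {1.0f, 2.0f, 3.0f, 4.0f};
    vector float b = {10.0f, 20.0f, 30.0f, 40.0f};
    vector float c = vec_add(a, b);     /* four adds in one instruction */

    float out[4] __attribute__((aligned(16)));
    vec_st(c, 0, out);                  /* store the vector back to memory */

    printf("%.1f %.1f %.1f %.1f\n", out[0], out[1], out[2], out[3]);
    return 0;
}

Filters and codecs are basically that, repeated millions of times, which is why it saves you days in AfterEffects.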

3:

I never beat Minish Cap. I haven't beat a Zelda game since Link's Awakening. That game wasn't easy, either.

Meta Ridley killed me last night on my first try. I didn't realize that you can't dash when you lose L-lock on something. I only lost one energy tank before he started ramming me. I'll try again. I won't get the 100% ending because I am missing 2 missile expansions and I don't even care. :p

Tingle is retarded and is another reason Nintendo really ticks me off. It's one thing to appeal to gamers of all ages, but don't try to appeal to gamers of all levels of intelligence. I actually quit putting my GBA cable in when I play WW because I hated Tingle.

(skip 4)

5:

You should have taken the bullet.

I think you and I have different philosophies. I would never make the argument that 800 looks better because it masks imperfections. I would make the argument that the developer should have worked harder to not have imperfections so it would look perfect at 1280. If a developer needs to cut corners then they probably have bad code somewhere. At least, when it comes to a pair of jeans coming out right, that is.

6-7:

The only thing I was handed was my father recommending me for a job that had some tuition reimbursement. Maybe I could have gotten it on my own, and maybe not. Who knows?

She says I always seem to pester her about something or another during her exams. Not sure why, really.

8:

He'll get a Wii at some point, but only when there are a couple of games he really wants to play. He's not big on motion control, either. He had an easier time getting used to MP3 than I did, though.

One thing I hate about FPS games is that it takes too darn long to turn around. Seriously, I can turn around 4-5 times in the time it takes Samus to.

Glue my sensor bar to the middle of my screen? It looks ugly enough on top of my tv.

9:

The game does sound pretty cool. I could run it on a Windows machine at work or something. If I only had a CD of it. :p BTW, the review mentions the computer taunting you. I wonder if that's where Portal got that from. The computer makes the game.

11:

Okami doesn't have a lot of jaggies because it, like Symphonia, is a very blurry game. WW is cel shaded as well, but not blurry, but suffers from jaggies. Ick.

I gave up on playing MMORPGs when my friend died. We used to play together.
 

JackAxe

macrumors 68000
Jul 6, 2004
1,535
0
In a cup of orange juice.
I had a really busy week. :eek:

0:
A CRT from that era, it's probably a BUBBLE tube... Recycle it! :p

1:
It probably hit someone, you should check.

2:
I recall when Ars Technica ran OpenGL benchmarks using a MacBook Pro with OS X and Windows installed. Under OS X, it was something like 40% slower. I guess this was the price of a nicer GUI. Tiger helped with Maya noticeably and I've heard that Leopard has also improved performance.

The notebook was actually in his trunk. :) It was only on battery for about 30 minutes, but come to think of it, that probably wasn't wise.

Flash will never replace C, but like Java, its just an interrupted language for cross platform support to state what you already know. :) I use Flash to build demos, which are purposed for trade-shows and online. I keep my listeners in track and will only have one running when things are in the background, I always try to keep things happy for my G4 here.

But that game was like the first one to use physics, so nyahhh. :D

I agree with your flip side assessments. Nintendo is focusing on a new way to play, it may be simple in some respects and just a bundle of older techs, but it certainly waggled things up. A combination of those techs would be better for golf, but it would tire me out faster. So my lazy side is scared of such a tech. :eek: Yep, higher rez can open up possibilities, when coupled with the right developer. ;)

--
The MGS4 comment was actually made in April of this year;
http://kotaku.com/381412/kojima-disa...l-gear-solid-4

But still, I think they'll be able to do more next time, it's pretty much a given.

I've never played an MGS game btw. That makes me weird. :cool: So I have no reference for your hippy comments. :D

--

--
No disagreement that System Shock, or any other game couldn't be made better with less limitations. Shock on a PS3 with mouse-look and no sacrifices to what made it great would be f*cking awesome.

My ideal game would have no technical limitations like the old days, but also none of the budget and time limitations that today's developers have to deal with, since things just take longer.

The only thing limited about Halo, was its controls. =P *hides*

--

Rogue Squadron is fun, but it's just an arcade Star Wars flight game.

Oh yea. See, my mind is going. :eek:

I was commenting on AES/Sha-1, since it's for security right? And Nintendo should just release a DVD channel and get it over with. But it seems they're avoiding it as you say. :]

He's older and has lots of pent-up anger... :D The same is true for artists. We all get stuck in some period, it seems.

That's bad memory. I had a similar problem on my old Formac video card. I'm happy that it appears that Sony is once again making quality products. They lost it for a few years.

True. I didn't explain that very well. :) Recall that 10.3 offered greater-than-32-bit memory addressing. At that time, Apple had a feature listed on their site which stated that if your system has more than 2 gigs of RAM, each open 32-bit application can have its own 2-gig block, if available. This feature rocks, especially for my version of AfterEffects.

And YEP on the distributed computing. A new 8-core Mac Pro would be at least 3x faster than my G5 when rendering in Maya; that would be awesome.

Nope, but you went meh, as in meh, that usually says meh about something, as in meh. :p They made it in a lower-power happy way, with lots of rainbows and bunnies. :]

3:

I've been playing Link's Awakening. This game is QUITE odd with its Mario elements, but still fun.

If you get 100% ending, you'll be cool. :cool:

LOL, I so wish I could kill Tingle... I think they got the message that most did not like him, since they canceled his solo game from what I recall.

5:
Yeah, it would have been less painful...

When I looked at the jeans, I saw that they used a lower-rez texture; the same goes for the nasty shoe. The helmet bothers me also, since it seems they're using a bump-map and it just rendered funky. But at least the bike looks good, not counting the wheels. And no argument that the developers should have worked harder, but if their code was all clean and they had the time and money to get things right, then it's either a console limitation or they're just lazy.


6-7:

All good. :]

Trust me, it's the way they are... ¬¬ They seem to have fluctuating tolerances, which can vary widely day to day. ¬¬

8:

You just suck with a pointer and MP3 :) You should watch Invasion of the Body Snatchers, they know how to point... I'm OK with the motion, just like I am with repetitive button pressing. As long as it fits the circumstances.

Yes about the bar, and use an industrial glue that eats through glass and plastic and drips. O.O

9:

It's more than cool Oo, but finding the time to play it and getting past its visuals would probably be a big deterrent nowadays. Sooooooo many developers have borrowed from Looking Glass Technology.

I found this last week, marked it, but forgot to bid: :(
http://cgi.ebay.com.au/SYSTEM-SHOCK...719657QQihZ006QQcategoryZ149281QQcmdZViewItem

11:
I forgave WW, since so many parts of the game were just so wonderful to look at. I really like their painter style, even at lower-rez. WW on Wii will be awesome, I hope it comes to be, and yes it would look even better on a PS3... =P

You guys are lucky, I still know tooo many monkeys that are addicted to WoW.

Cool, my post is shorter than normal.
<]=)
 

darkwing

macrumors 65816
Original poster
Jan 6, 2004
1,210
0
0:

It seems to work well. It's got a pretty case!

2:

OSX has some inherent inefficiencies. Its thread creation times are abysmal compared to Linux or even Windows. Also, because it's a microkernel, it may be easier for Apple to maintain, but the cost of all the message passing it does really adds up. Linux has the modular code base with a build system which yields no run-time overhead. Quite ingenious, really. (Ingenious in that it's simple and obvious, so they stuck with it because it works.)
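
If you ever want to see the thread-creation cost for yourself, a crude create-and-join loop shows it -- just a sketch, and the numbers will vary wildly by machine and OS:

/* Crude thread-creation micro-benchmark sketch (not rigorous): time how
 * long it takes to spawn and join N do-nothing threads.
 * Build with: gcc threads.c -lpthread */
#include <pthread.h>
#include <stdio.h>
#include <sys/time.h>

static void *noop(void *arg) { return arg; }

int main(void)
{
    const int N = 1000;
    struct timeval start, end;

    gettimeofday(&start, NULL);
    for (int i = 0; i < N; i++) {
        pthread_t t;
        pthread_create(&t, NULL, noop, NULL);
        pthread_join(t, NULL);
    }
    gettimeofday(&end, NULL);

    double ms = (end.tv_sec - start.tv_sec) * 1000.0 +
                (end.tv_usec - start.tv_usec) / 1000.0;
    printf("%d create+join pairs: %.1f ms (%.3f ms each)\n", N, ms, ms / N);
    return 0;
}

Run the same binary on Linux and OSX on comparable hardware and the per-thread number tells the story.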

I think you mean "interpreted" not "interrupted."

It isn't the higher resolution that opens more possibilities but the higher amount of power that does. If you think about it, Nintendo could have released the Wiimote for the GameCube and it wouldn't have mattered. I think their success is because the Wii basically looks like an iPod. :) I guess it's the problem where peripherals never get the support they deserve because not everyone has them unless they came with the system.

If Kojima's team overestimated the PS3 that's their fault. Frankly, I don't think of it as a "dream machine" only the best thing out there. My suspicion, however, is that given the size and detail of maps in games like Dragon Quest 8 (ps2) or Oblivion there's no reason that he couldn't make the maps the size he wanted. It sounds like he's an idiot who is either mad at Sony or employed very poor programmers.

The Wii doesn't have a DVD drive in it, so it couldn't play DVDs even if Nintendo wanted it to. AES/SHA-1 is for security. AES is an encryption algorithm and SHA-1 is a hashing algorithm.
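
For a flavor of the hashing side -- this obviously isn't Nintendo's code, just a quick OpenSSL sketch of fingerprinting a buffer the way you'd fingerprint a disc sector, so any tampering changes the digest:

/* SHA-1 sketch using OpenSSL (illustrative): hash a buffer and print the
 * 20-byte digest.  Build with: gcc sha.c -lcrypto */
#include <openssl/sha.h>
#include <stdio.h>
#include <string.h>

int main(void)
{
    const char *data = "pretend this is a disc sector";
    unsigned char digest[SHA_DIGEST_LENGTH];    /* 20 bytes for SHA-1 */

    SHA1((const unsigned char *)data, strlen(data), digest);

    for (int i = 0; i < SHA_DIGEST_LENGTH; i++)
        printf("%02x", digest[i]);
    printf("\n");
    return 0;
}

AES is the other half: it encrypts the data, and the hash is what you'd check to catch tampering.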

I had issues with my PS2 when I bought it. I'd play dvds and they'd skip or pause during playback. It wasn't the disc, though. I brought it into the store and it played fine on their unit. The PS2 that lasted me flawlessly for over 7 years was my 3rd unit.

I do recall that feature in 10.3, but the point is that it's still a 32 bit OS because each app can only have 2 gigs. :)
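
If you ever want to watch an app bump into that ceiling, just grab memory until malloc gives up -- a throwaway sketch; where it actually stops depends on the OS and on how the binary was built (32- vs 64-bit):

/* Throwaway sketch: grab 256 MB chunks until malloc fails.  A 32-bit
 * process tops out somewhere under 4 GB of address space no matter how
 * much physical RAM the machine has; a 64-bit build keeps on going. */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

int main(void)
{
    const size_t chunk = 256UL * 1024 * 1024;   /* 256 MB per grab */
    int grabbed = 0;

    for (int i = 0; i < 64; i++) {              /* cap the test at 16 GB */
        void *p = malloc(chunk);
        if (p == NULL)
            break;
        memset(p, 0, chunk);                    /* touch it so it's really backed */
        grabbed++;
    }
    printf("allocated %d MB before stopping\n", grabbed * 256);
    return 0;
}

The OS can juggle way more physical RAM than that across all the running apps, which is the distinction we keep circling.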

3:

I didn't know there was a 100% ending to Link's Awakening. I thought the Mario elements were fun and well done.

5:

I make a living squeezing all the requirements into underpowered hardware. However, when you run into problems like that it tends to cost time and money to get things going properly.

9:

Too bad you missed the auction. You poor pitiful soul.

11:

WW ran in progressive mode, so I don't see why it would really look much better on the Wii. :p

I don't consider us "lucky." I meant that he died in real life, not in the game. There is a difference you know. ;)
 

JackAxe

macrumors 68000
Jul 6, 2004
1,535
0
In a cup of orange juice.
:eek: :( I recall you mentioning that about your friend at one other time. I feel like dirt. :(

0:
SGI did have great designs for their time; they're still nicer than most PC vendors'.

2:

Maybe I meant interruption. :eek:

I agree on more power. But I don't necessarily agree on the iPod comment -- unless you're joking, :) but I do agree that it is part of it. The Wiimote is what interested me most and that seems to be the feeling of other peeps I know.

Yeah, it seems they didn't focus on the big picture first and then ran into a limit. I guess they didn't want to make as many visual sacrifices as GTA IV.

The Wii does have a DVD drive. The Wii's game discs are DL DVDs. The only difference between the Wii's DVD drive and others is that its firmware is set to only recognize Wii discs. Wii mod-chips generally unlock DVD playback;
http://www.wii-modchips.com/drive-chips.htm

Nintendo just needs to offer a firmware update. For reference, the SuperDrive in my G5 is a DL drive, but when I bought my system, Apple had firmware-locked it to a gimped (slower, no DL burning support) version, since Tiger with DL support hadn't been released. It just took some searching, and sure enough, others had already provided the tools to properly flash its firmware and unlock its features. Apple ticks me off sometimes.

My friend's PS2 had read issues after a few months, but not with the replacement.

Mostly 32-bit, but obviously the memory support was greater than 32-bit addressing. :) I put 5 gigs in my G5 back then and thought I would upgrade sooner rather than later, but have yet to do so. It turned out to be a good amount -- memory is so freaking cheap now. I paid over a grand for 32 megs back in the mid-nineties, and I got it used from a friend.

3:

Me neither about the 100%. I'm only to the mountain dungeon... I've been playing it on my DS.... I need to buy it... :eek:

5:
But it's more rewarding when you do figure things out and achieve goals with limits imposed.

9:
It burns.... The only reason I have to live now, is food.

11:
There would be a difference in texture quality. :p

<]=)
 

darkwing

macrumors 65816
Original poster
Jan 6, 2004
1,210
0
2:

The Wiimote stinks. :p Here's a question. It seems when I play SSBB with the GC controller the Wiimote shuts off, but in Mario Kart it does not. I am really sad about Mario Kart. I won the gold cup on 3 of the 50cc races and the game just feels sluggish as heck. I'm hoping that it feels a lot faster when I get to 150cc but it's like they want you to go slow. Those half pipe things are just stupid.

One thing I notice is that red shells always seem to hit whatever you're dragging even if you're in a sharp turn, where on the DS if you were turning sharply the shell would hit you on the side. That was kind of neat because you could follow someone and still hit them at the right time.

I didn't realize there was a firmware update to make the Wii read regular DVDs. The difference between the discs was the reading at constant angular velocity. I guess with the firmware update you can have the drive spin at a constant speed. That's cool to know. One reason they also may not want it to play DVDs is that it isn't able to deal with the heat. Games don't spin the disc 100% of the time, but playing a DVD would. Just speculating, of course.
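
Back-of-the-envelope on the constant-angular-velocity thing: at a fixed RPM the linear speed under the laser scales with the radius, so the outer edge reads a couple of times faster than the inner edge. Approximate radii and a made-up spindle speed:

/* CAV sketch: at a fixed RPM, linear read speed = 2*pi*radius*rev/s, so
 * the outer edge of a disc is read much faster than the inner edge.
 * Radii are approximate DVD data-area values; the RPM is made up. */
#include <stdio.h>

int main(void)
{
    const double rpm     = 1500.0;      /* made-up spindle speed */
    const double inner_m = 0.024;       /* ~24 mm inner data radius */
    const double outer_m = 0.058;       /* ~58 mm outer data radius */
    const double rev_per_s = rpm / 60.0;
    const double pi = 3.14159265358979;

    double v_inner = 2.0 * pi * inner_m * rev_per_s;  /* m/s at inner edge */
    double v_outer = 2.0 * pi * outer_m * rev_per_s;  /* m/s at outer edge */

    printf("inner edge: %.2f m/s, outer edge: %.2f m/s (%.1fx)\n",
           v_inner, v_outer, v_outer / v_inner);
    return 0;
}

Video playback traditionally wants a steady data rate (constant linear velocity), which means varying the spin speed as the laser moves outward -- the opposite trade-off from CAV.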

I used one of those super drive firmware updates on my old Powerbook to unlock 2x dvd writing instead of just 1x. It worked fine.

3:

I'll never play Link's Awakening again. Good riddance. :p

5:

Yes, and often times the end product won't suffer. It's a different argument, though. If I'm creating a device to ablate tissue and enforce certain safety guidelines, and I do that with cheaper hardware, I've met specifications that are set in stone. I can only meet those specifications, whether I use 100% of my resources or only 5% of them. On the other hand, a game has many variables that determine how good it is. Imagine Mario Kart on the PS3. It could have been a lot better.

9:

Look at the bright side. After you eat the food, you can squeeze one out in the shape (and smell) of a Wiimote. ;)

11:

I didn't even notice textures in WW. Isn't everything a solid color? :p
 

JackAxe

macrumors 68000
Jul 6, 2004
1,535
0
In a cup of orange juice.
I just checked my Wiimotes, they smell like plastic. Maybe it's the user. :D *I KID*

I still haven't put enough time in on the game, so I haven't noticed any thing that really bugs. I've been too busy smiling while turning my little plastic wheel. :)

That link I provided talks about how the mod-chip enables DVD playback. It modifies the scrambled chicken-seed and sectors o doom on the fly, or something along that extent, but in tech-speak.

I recall that firmware update, and I assumed Apple would do the same for my Pioneer, but they didn't.

Same here about Link, but not until I finish it. :)

Mario Kart would not be better on a PS3... Why? Because Mario Kart DS is still the best one. ~i~ AHhHhhHhhh It just goes to show that more power and better controls don't always make for a better game. MK on PS3... EVIL! I can imagine this raped version of a Bowser being used.

Or I can do the deed, flush, and move on. And as I mentioned earlier about the smell, you might want to open some windows, or check your hands. :eek: *hides while kidding from a distance*

:confused: You didn't see the textures, or are you just joking? :)

<]=)
 

darkwing

macrumors 65816
Original poster
Jan 6, 2004
1,210
0
That probably is PS3 Bowser. :) DS is the best MK by far, even without analog controls. All I'm saying about the PS3 MK is that if the same game were on the PS3 as the Wii, even if the same graphics were used, it would have to be 720p to meet specs so it would look better at least for that. :) MK seems to blur in the distance, so that helps with aliasing issues. And the ps3 controller would do just as well as the wiimote in this type of game.

I thought a cool idea would be a social networking Wii game where you cram the remote up your butt and get stimulated by other users online. Should be fun.

As for WW, I haven't really played it enough to recall a lot of texturing I guess. If there are, then they were rather simple. The game didn't seem to have a lot of colors really. Don't get me wrong, I like the way it looked. I just don't recall it being detailed.
 