s'more Sony news....

Nintendo is doing really badly now. They have just posted their first yearly loss since like 1960 something.

So did a lot of Japanese companies.

Sony's entire non-PlayStation division has been posting losses for years.

Microsoft has lost several hundred million on Xbox.

Xbox is not going anywhere; Microsoft has more than enough money to keep it in the marketplace. Sony is also very entrenched now. Heck, Xbox is starting to pick up the pace in Japan while GC stays pretty much even. In Europe GC is all but dead, and in North America it's not doing so hot either. Yes, they sold more consoles, but at a considerably higher loss per system since the price drop.

Yes, Xbox is here to stay. Yes, PS2 is heavily entrenched in the marketplace.

But no, Xbox is not picking up any pace in Japan.

And no, GCN is not staying even.

And no, a lot of people over at EmuTalk (which is predominantly EU) will disagree that GCN is dead in Europe.

And yes, it's doing badly in the US compared to PS2, but it's now keeping pace with Xbox (though it already has a deficit by comparison).

And no, they aren't having a 'higher loss', they're having a 'lower gain' per console.

Even VINCE at Beyond3D (staunch Sony loyalist and Nintendo hater) posted numbers that showed GCN roughly 50,000-100,000 systems ahead of Xbox worldwide. Note, that's worldwide - not in the US, or in the UK, or wherever - but aggregate sales EVERYWHERE. GCN's sound thrashing of Xbox in Japan is largely to blame for this.

The boost it did get will die off in a couple of months (some people are probably seeing this as the console's last days, kinda like what happened with DC).

We'll see about the boost. It's leveled off now, but it has yet to fall at all. And people have been making doomsday predictions about GCN for two years now; there's nothing new there.

Nintendo is not pushing any software out the door. And the only stuff that sells is their own, which doesn't make 3rd parties very happy. Just take a look at any gaming mag and you will see that a good majority of the "big name" games out there are being made for PS2 and Xbox only. Hardly ever do you see one for GC as well.

Bull SHIT Nintendo isn't pushing any software. Have you seen any of the Star Wars: Rebel Strike commercials? Or how about the print ads that are appearing in all the magazines? And Mario Kart: Double Dash!! is right around the corner - do you really think that arguably Nintendo's most famous and popular franchise won't do anything? It's already getting print ads, and I have yet to see a REMOTELY bad (p)review of the game.

GCN has the icky problem of being the 'bastard child' right in the middle - you have the 'everyone has one!' PS2 and the 'WOW it's powerful!' Xbox... with the nicely balanced GCN right in the middle. Dev houses don't like that. Hell, scalability is already hard to include in games - having two quality levels is reasonable enough for a console game, but THREE?

Oh, and BTW, just FYI, SoulCalibur II sold more on GameCube than Xbox... and sold the LEAST on PS2 so far.

Personally, it looks to me like Nintendo is doing the same thing Sega did with DC: not enough mass-market games to attract the non-hardcore public. Really, the only thing keeping Nintendo as healthy as they are is the GBA. But it's starting to lose a bit of steam, since it has lately taken on a bit of a "SNES port" look.

Which is funny because the GBA-SP is selling more than PS2 now.

As far as hardware goes, I have to hand it to Microsoft for designing one of the best consoles ever. The Xbox is the kind of console that all makers should be creating. It's powerful, full of GOOD features, and MS actually did their research and listened to customers (as with the controller). Heck, I'd say the Xbox is the best console (hardware-wise) if only it had a proper VGA output on it. Yes, the design might not be to everyone's liking, but then again I'm not too fond of the design of the GC, PS2, and yes, even the Saturn.

You've gotta be kidding me. Xbox has a severely underpowered CPU, insanely high-latency memory, and low aggregate memory bandwidth compared to PS2 and GCN. And the controllers? Both have good and bad elements, but neither compares to GCN's fantastic pad. And I have one word for you: WaveBird.

And um... if MS had listened to gamers, they would never have released that horrid excuse for a default pad in the first place.

Also, about Sony's new portable: I DON'T WANT NO FUCKING PDA CRAP IN IT. Look at what Nokia did with the N-Gage. I want just a console (though an MP3 player would be nice and easy to do with Memory Stick).

N-Gage isn't bad because it's a gaming system with a phone mixed in with it, it's bad because it's a phone with a gaming system mixed in with it.

Think about that.

Anyway, Nintendo IS now losing money on each GC. Before the price drop, they were basically breaking even. Personally, I'd prefer to play NEW games on the current systems than remakes of games I could just as easily play on my NES, SNES, etc.

OK. When Nintendo dropped to $150 they were just about breaking even.

Do you REALLY FUCKING THINK that after ALL THIS TIME the GCN's production cost hasn't gone down?!

Reality check.

::holds up mirror::
 
Originally posted by racketboy@Nov 7, 2003 @ 04:59 PM

wow that's a long post

luckily there were paragraph breaks...

Yeah, if it's something I care about I tend to say a lot. :blush:
 
"You've gotta be kidding me. Xbox has a severely underpowered CPU, insanely high-latency memory, and low aggregate memory bandwidth compared to PS2 and GCN."

Xbox is undoubtedly the most powerful of the three. I can't believe you'd even take a swing at that. Having the fastest CPU doesn't matter for a 3D gaming console. By that kind of reasoning, Gamecube is pathetic, underpowered. Also, you claim the Xbox suffers from high-latency memory. That's not true, what is your basis for comparison here? Does the Dreamcast suffer from insanely high memory latency? If anything, an RDRAM-based solution (like PS2) would have more trouble with that, since the technology was designed for high bandwidth and the latency is often higher than comparable DDR SDRAM - this was one of its faults. Obviously having the highest aggregate memory bandwidth isn't all-important either, or PS2 would kill GCN and Xbox easily.

As for the controllers, a Controller S is very similar in analog and d-pad location to a GCN controller. But the buttons on a GCN controller are just strange... I prefer a more orderly, standard layout of some sort.
 
Xbox is undoubtedly the most powerful of the three. I can't believe you'd even take a swing at that. Having the fastest CPU doesn't matter for a 3D gaming console. By that kind of reasoning, Gamecube is pathetic, underpowered.

All of Xbox's power is in its GPU. It's an unbalanced system.

Oh, and GCN's CPU is more powerful than Xbox's. Just FYI.

And PS2's CPU is overpowered compared to its rasteriser.

Also, you claim the Xbox suffers from high-latency memory. That's not true, what is your basis for comparison here? Does the Dreamcast suffer from insanely high memory latency? If anything, an RDRAM-based solution (like PS2) would have more trouble with that, since the technology was designed for high bandwidth and the latency is often higher than comparable DDR SDRAM - this was one of its faults. Obviously having the highest aggregate memory bandwidth isn't all-important either, or PS2 would kill GCN and Xbox easily.

Um...

PS2's memory latency is actually extremely low, only slightly higher than GCN's.

DRDRAM in and of itself isn't high-latency. It's the SDRAM-like PC implementation that has problems. In the PS2, the DRDRAM is in exactly two chips, each about a centimetre away from the CPU. Both chips can be active at all times, and the signal takes a minuscule amount of time to cross the RAM and return to the Emotion Engine. In PCs, the signal has to go across the mainboard, through ALL memory modules (lengthwise), then back around, and only one chip per channel can be active at a time - changing active chips adds a latency penalty.

GCN's RAM is designed to be insanely low-latency. AFAIR it's the lowest latency DRAM ever created - its spec even rivals most SRAM, which is why it's known as 1T-SRAM instead of DRAM.

Xbox's DDR, on the other hand, is a UMA and is being addressed by three or four different devices at any given time. Real tests done by ERP at Beyond3D found that Xbox's memory latency is more than double that of PS2's.

As for the controllers, a controller-s is very similar in analog and dpad location to a GCN controller. But the buttons on a GCN controller are just strange... I prefer a more orderly, standard layout of some sort.

It's still bulky, the stick isn't as comfortable for me (shape-wise), and there are only two shoulder buttons. Plus the controller-S's black and white buttons are placed very awkwardly.

The only true, consistent advantage Xbox has over GCN is memory - 64MB > 40MB.
 
Not all of the GCN's memory is MoSys 1T-SRAM, though much of it is. The sustainable latency isn't uber-low. The Xbox may be UMA, but the other components that share the memory aren't going to be using it nearly as much as the GPU - which is obviously efficient enough, since it solidly trounces the GC in performance. It also allows developers great flexibility. The CPU, a 733MHz Intel chip, provides good performance while being very bandwidth-efficient. At MOST it'd use up to 1GB/sec. I've looked at the performance of the GCN's 485MHz IBM "Gekko" chip, and it's not outstanding. In fact, it looks pretty weak standing at around 1125 MIPS, for starters.

http://www.pcvsconsole.com/hank/answer.php?file=164
 
The GC and Xbox have one huge advantage over PS2 (and DC had this too). Both systems' GPUs can take compressed texture maps directly from memory, while the PS2 needs to have uncompressed texture maps sent to the GPU. This GREATLY reduces how many texture maps the PS2 can utilize each frame. An article I read years ago stated that the PS2 could use up to about 10MB of textures per frame, while the DC, with its PVR compressed textures, could do up to 26MB of textures per frame even with slower memory speeds. That makes a big difference when you want to show lots of different detailed textures. The GC and Xbox can natively use DXT-compressed textures (since they are basically derivatives of DirectX-based cards) and as such have the potential to use more textures while using less bandwidth.

Also, in modern consoles the CPU doesn't play as important a role as it used to. These days it's mostly relegated to being the manager for the system, dealing with user input and AI.
 
Not all of the GCN's memory is MoSys 1T-SRAM, though much of it is. The sustainable latency isn't uber-low.

Oh hell yes it is. You don't think sustained 6.2ns is low?

The Xbox may be UMA, but the other components that share the memory aren't going to be using it nearly as much as the GPU - which is obviously efficient enough, since it solidly trounces the GC in performance.

They don't use it as much... but I'm not talking about bandwidth, I'm talking about LATENCY. i.e. the amount of time between when the memory is addressed and when it returns the requested data. Xbox's UMA means a lot of things are addressing the same memory at the same time, across different bus segments, which slows things down.

And XGPU doesn't "trounce" Flipper. ERP over at Beyond3D said that there are many, many things XGPU does faster than Flipper, BUT he also said he could think of a few cases off the top of his head where Flipper would scream past XGPU to an extreme extent.

It also allows developers great flexibility. The CPU, a 733MHz Intel chip, provides good performance while being very bandwidth-efficient. At MOST it'd use up to 1GB/sec. I've looked at the performance of the GCN's 485MHz IBM "Gekko" chip, and it's not outstanding. In fact, it looks pretty weak standing at around 1125 MIPS, for starters.

The test used to find that figure wasn't using Gekko's most important feature, paired singles. It can effectively perform two 32-bit floating-point operations per clock cycle. And I don't care how much bandwidth the P3/Celeron hybrid is using - it's all about latency, and Xbox's UMA has AWFUL latency.
 
Yes or no: are you saying that the Gekko chip is more powerful than a 733MHz PIII with 128KB of L2 cache? Remember, the PIII has some features it can take advantage of too, like SSE. The XGPU is still overall more powerful than Flipper, and I'm not saying Flipper is a slouch. Yes, each one might excel at certain things, but the NVIDIA chip is more powerful overall. I'm not an NVIDIA/Intel/MS fanboy. Far from it. But I can't believe you'd hammer the Xbox's hardware. Do what everyone else does and attack the software instead.

Anyway, 6.2ns is only for the 3MB total of frame buffer + texture cache. As Gameboy said, hardware texture compression greatly helps modern GPUs, but there's still a limit. The 24MB of main memory is still decent, but has a sustainable latency of around 10ns. Then there's the 16MB "A-memory", which is 81MHz DRAM - very slow. That said, I think the GCN's hardware is very efficient and great for the manufacturing cost.
 
Originally posted by Des-ROW@Nov 8, 2003 @ 11:17 PM

The EmotionEngine is more powerful than the Gekko or the XCPU ^^!

Yeah, and the MegaDrive is more powerful than the Dreamcast... :sarcasm:
 
QUOTE (Des-ROW @ Nov 8, 2003 @ 11:17 PM)

The EmotionEngine is more powerful than the Gekko or the XCPU ^^!

Yeah, and the MegaDrive is more powerful than the Dreamcast...

Dude... 75 million polygons per second.. NO PROBLEM! :wanker
 
Originally posted by Cloud121@Nov 9, 2003 @ 12:45 AM

Yeah, and the MegaDrive is more powerful than the Dreamcast... :sarcasm:

Haha, the kind of answer I would expect from an ignoramus!
 
The newer console (in this case Xbox) is overall the most powerful.

Why is this so hard for some people to accept?

It just can't show them to you or do anything with them.

Kinda like the Xbox with Microsoft's original claim of "300 Million Polygons per second", and then later 125 million... :rolleyes:
 
Yes or no: are you saying that the Gekko chip is more powerful than a 733MHz PIII with 128KB of L2 cache? Remember, the PIII has some features it can take advantage of too, like SSE. The XGPU is still overall more powerful than Flipper, and I'm not saying Flipper is a slouch. Yes, each one might excel at certain things, but the NVIDIA chip is more powerful overall. I'm not an NVIDIA/Intel/MS fanboy. Far from it. But I can't believe you'd hammer the Xbox's hardware. Do what everyone else does and attack the software instead.

Hey, I don't deny that Xbox is most powerful taken as a whole, but it's unbalanced and there ARE cases where GameCube can outperform it, WITH better output. Hell, I can even think of an example off the top of my head - Star Wars: Rebel Strike.

Anyway, 6.2ns is only for the 3MB total of frame buffer + texture cache. As Gameboy said, hardware texture compression greatly helps modern GPUs, but there's still a limit. The 24MB of main memory is still decent, but has a sustainable latency of around 10ns. Then there's the 16MB "A-memory", which is 81MHz DRAM - very slow. That said, I think the GCN's hardware is very efficient and great for the manufacturing cost.

Texture compression is for bandwidth, not latency. And PS2 does support *some* texture compression - 4- and 8-bit CLUT - but it's limited.

And 10ns sustained is still obscenely low. You do realise SDRAM rated at '10ns' can't possibly sustain that? It's a maximum.

And yes, the A-RAM is really, REALLY slow. *sigh* N should've skipped the A-RAM and included more 1T.

Yeah, and the MegaDrive is more powerful than the Dreamcast...

Actually, the Emotion Engine is quite a lot more powerful and flexible than Gekko and XCPU.

But only the EE as a whole; the CPU (r5900i) can't compare at all.

Having two Vector Units bolted on doesn't hurt, you know. =)

Kinda like the Xbox with Microsoft's original claim of "300 Million Polygons per second", and then later 125 million...

Not defending Microsoft or anything here, but IIRC that original 300 million claim was vertices, not polygons.
 
Not defending Microsoft or anything here, but IIRC that original 300 million claim was vertices, not polygons.

I think this is the favorite BS marketing tactic of 3D console makers (except Nintendo; they've actually given real-world performance estimates on their spec sheets) - give your vertex-processing capability and don't say anything about fill rate, giving a big impressive number that means nothing by itself.
 
I think this is the favorite BS marketing tactic of 3D console makers (except Nintendo; they've actually given real-world performance estimates on their spec sheets) - give your vertex-processing capability and don't say anything about fill rate, giving a big impressive number that means nothing by itself.

I agree... I was just arguing with someone about Sony's 75 million figure recently - never mind the fact that it's nearly impossible to actually display that many polygons on almost any of today's display hardware (not counting obscured polygons). Correct me if I'm wrong, but I believe their claim was actually polygons, not vertices.
 