Microsoft selects IBM processor for next Xbox

I guess IBM was just willing to offer them a better deal than Intel was. Neither IBM nor Intel is really hurting, but IBM doesn't sell half as many processors as Intel does; ergo, Intel can afford to keep its prices higher.
 
Originally posted by it290@Nov 13, 2003 @ 12:13 PM

I'm sure they could port DirectX to that architecture if they wanted to (wouldn't be that much effort for them, really), which would also allow them to release a Mac DX version. That would be quite useful for them since they've been trying to kill OpenGL for quite some time now.

That's one thing that I'm really amazed that Microsoft hasn't done.
 
I've seen speculation that MS want IBM chips because it's IBM who are creating the Cell processor for the PS3. Probably bollocks, because I thought the Cell was pretty much canned for the PS3 - too complex.
 
It's funny how similar the DirectX logo is to the Xbox one to begin with. Maybe there are plans for this on the next machine. To my eyes, OpenGL looks smoother and more engrossing than DirectX.
 
As far as I can tell, both DirectX and OpenGL are pretty much the same in terms of capabilities. The reason a lot of cards get different results these days is that for most PC drivers the OpenGL bits aren't as well developed as the D3D bits. They both do the same thing, really.
 
DirectX evolves much more rapidly than OpenGL, and many of the more modern features are only available through vendor-specific extensions. The upcoming OpenGL 2.0 will bring features like pixel shaders, but after that it probably won't take long until it starts lagging behind DirectX again (that's one advantage of a standard controlled by a single entity).
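
To give a feel for what those vendor-specific extensions look like in practice, here's roughly how a GL app discovers them at runtime (a minimal sketch; the extension names are real, the helper is just illustrative):

[code]
// Sketch: runtime extension discovery in OpenGL. GL_ARB_fragment_program is
// an ARB-standardized extension, while GL_NV_register_combiners is
// Nvidia-only - exactly the vendor-specific features mentioned above.
#include <GL/gl.h>
#include <string.h>

bool has_extension(const char *name)
{
    // Assumes a current GL context. glGetString returns one big
    // space-separated list; strstr is the naive but era-typical check
    // (a robust version would tokenize on spaces to avoid prefix matches).
    const char *all = (const char *)glGetString(GL_EXTENSIONS);
    return all != NULL && strstr(all, name) != NULL;
}
[/code]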
 
True, antime. Hardware companies might give their input, but in the end MS says "we do this", and all the manufacturers have to meet or exceed those requirements to be compatible. So in recent times it has certainly sped capabilities along. OpenGL has its advantages - it can be more vendor-specific - but some of that is also a disadvantage.

Racketboy: Since the GPU will probably be capable of running NV2x code, they could just emulate the XCPU, if they had to. We don't know what kind of chip they are going to use, only that IBM is going to build it. I wonder what Sony is going to do for a GPU this time around. 8MB, and triple the bus bandwidth? LOL...
 
Ah, the old DX vs. OGL debate. I thought pixel shaders were already supported in GL? Hmm, maybe not. Vendor-specific rendering paths are definitely an advantage of OpenGL - this is something that will be used to great effect in Doom 3. OpenGL is also better for professional apps, as it has some extensions that are very useful there that aren't really supported by DX. The big advantage is portability, but that's pretty much a given.
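
To make the render-path idea concrete, the selection logic presumably looks something like this (a hypothetical sketch - the extension names are real, but the path labels are just how I'd name them):

[code]
// Hypothetical sketch of choosing a render path from available extensions,
// the way Doom 3 is said to pick between its vendor-specific back ends.
#include <GL/gl.h>
#include <string.h>

enum RenderPath { PATH_FIXED, PATH_NV20, PATH_R200, PATH_ARB2 };

static bool has_ext(const char *name)
{
    const char *all = (const char *)glGetString(GL_EXTENSIONS);
    return all != NULL && strstr(all, name) != NULL;
}

RenderPath select_render_path()
{
    if (has_ext("GL_ARB_fragment_program")) return PATH_ARB2;   // standardized
    if (has_ext("GL_ATI_fragment_shader"))  return PATH_R200;   // ATI-only
    if (has_ext("GL_NV_register_combiners")) return PATH_NV20;  // Nvidia-only
    return PATH_FIXED;  // fixed-function fallback
}
[/code]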

I think DX has definitely had the edge in games for some time now, but the way MS's standards work kind of pisses me off as well. For example, I bought a Ti4600 for one of my comps a while ago, back when it was the top-of-the-line card. Nowadays it's still a card with a decent amount of horsepower, and speed-wise it's about even with the card in my other machine - a Radeon 9500 Pro. But the Ti4600 is in some ways 'obsolete', since it doesn't support those nifty DX9 features that the 9500 does, even though it has enough raw power to run most of today's games very well.
 
DirectX evolves much more rapidly than OpenGL, and many of the more modern features are only available through vendor-specific extensions.

From what I've heard, a lot of the cutting-edge DX features may as well be vendor-specific extensions with the way they're implemented. I seem to recall hearing of a DX point release that didn't do anything special except to change some assumptions about shader programs so that the R300 wasn't constrained by limitations designed around the NV30. Or some similar nonsense - my memory of this could stand some refreshing.

Hardware companies might give their input, but in the end MS says "we do this"

Or in the case of DirectDraw, "we don't do this".

I thought pixel shaders were already supported in GL?

AFAIK, core API support for shader programming is planned for OpenGL 1.5 (and programmable shaders are a key consideration in the ongoing development of OpenGL 2.0), and standardized extensions have been around for a while IIRC.
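
And loading an assembly-level fragment program through the standardized extension looks roughly like this (a sketch - it assumes the ARB entry points have already been fetched via wglGetProcAddress/glXGetProcAddress, since they aren't in the core headers):

[code]
// Minimal ARB_fragment_program setup. GL_FRAGMENT_PROGRAM_ARB and
// GL_PROGRAM_FORMAT_ASCII_ARB come from glext.h.
#include <GL/gl.h>
#include <GL/glext.h>
#include <string.h>

static const char *src =
    "!!ARBfp1.0\n"
    "MOV result.color, fragment.color;\n"  // pass the interpolated color through
    "END\n";

void load_fragment_program()
{
    GLuint prog;
    glGenProgramsARB(1, &prog);
    glBindProgramARB(GL_FRAGMENT_PROGRAM_ARB, prog);
    glProgramStringARB(GL_FRAGMENT_PROGRAM_ARB, GL_PROGRAM_FORMAT_ASCII_ARB,
                       (GLsizei)strlen(src), src);
    glEnable(GL_FRAGMENT_PROGRAM_ARB);
}
[/code]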
 
The shader situation in DX9 between ATI and Nvidia is one of the big blunders of recent history. Basically, when MS started to develop the DX9 spec, for some reason Nvidia didn't join in to give their input. As the spec was developed, ATI designed the R300 core to follow the standard. By the time Nvidia came to the table, they had already developed much of their proprietary shader extensions and such, and unfortunately for them much of it was incompatible with the DX9 standard. So the performance boost ATI saw with DX9 came from their cards sticking to the standard: straight HLSL-compiled shaders could be fed to the cards directly, while Nvidia had to write a complete shader recompiler to make those shaders run decently on their hardware.
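
To put the "straight HLSL compiled shaders" bit in concrete terms, the compile step at issue looks something like this in D3DX (a sketch; the trivial shader is just a placeholder, and AFAIK the NV30-oriented ps_2_a profile only showed up in later SDK updates):

[code]
// Sketch of compiling HLSL against a target profile. "ps_2_0" is the
// baseline DX9 pixel shader profile the R300 was designed around;
// "ps_2_a" was added later to match NV30's different limits.
#include <d3dx9.h>
#include <string.h>

const char *hlsl_src =
    "float4 main() : COLOR { return float4(1, 0, 0, 1); }";

ID3DXBuffer *code = NULL, *errors = NULL;
HRESULT hr = D3DXCompileShader(hlsl_src, (UINT)strlen(hlsl_src),
                               NULL, NULL,          // no #defines, no #includes
                               "main", "ps_2_0",    // entry point, target profile
                               0, &code, &errors, NULL);
[/code]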

When DX9 was being developed, Nvidia was basically the king of the consumer 3D crop, so they arrogantly thought that whatever they did would become the standard to follow. By the time they realized that DX9 wouldn't use their version of the new shaders, it was too late to change the standard, and they had to scramble like crazy to adapt to it.
 
It's a good thing Nvidia DOESN'T dictate what goes. Anyway, that's Nvidia's blunder, not MS's or ATI's problem.
 
Yup. Nvidia basically got cocky, thinking that whatever they had in store for pixel and vertex shaders in the new generation of cards was what everyone would use. But since they didn't join the DX9 spec development until very late, ATI had time to convince MS to use a simpler implementation and then design their cards around it. Nvidia is basically playing catch-up.
 
Sorry, I was thinking of something older; it was NV25 vs. R200 and the point release in question was DX 8.1, which added "Pixel Shader 1.4" to support R200's capabilities. That it's a point release suggests mere differences in implementation detail, but it does contain substantial changes to the programming model.
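
For the curious, the model change is visible even in trivial shaders (from memory, so the syntax may be slightly off):

[code]
// From-memory sketch of the DX 8.1 programming-model change.
// ps_1_1: sampling is hard-wired to the texcoord register number.
const char *ps11 =
    "ps_1_1\n"
    "tex t0\n"        // sample texture stage 0 with texcoord set 0
    "mov r0, t0\n";   // r0 is the final color

// ps_1_4 (R200): texld writes into a general temp; the destination register
// selects the texture, decoupling samplers from texcoord sets.
const char *ps14 =
    "ps_1_4\n"
    "texld r0, t0\n";  // sample texture 0 using texcoord set 0; r0 is output
[/code]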
 