- Snow (simply too many quads needed)
The problem is not the quads, it's the time needed to sort them. I confess I only played the PC version of MGS, but the snow looked like plain 1-bit colour quads, driven by a basic "snow physics" model like the ones in 90s demos, only in 3D. That doesn't look very hard to implement... the biggest problem would be stuff like priority against the scenery.
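To make the "demo-style snow physics" idea concrete, here's a minimal sketch of what such a particle update could look like. Everything here (struct layout, wrap distance, drift) is illustrative, not taken from MGS:

```c
#include <stddef.h>

/* Toy "snow physics" in the 90s-demo style: each flake falls at its
   own speed with a little sideways drift, and wraps back to the top
   of the play volume when it passes the floor. */
typedef struct { float x, y, z, fall, drift; } Flake;

void snow_step(Flake *f, size_t n, float floor_y)
{
    for (size_t i = 0; i < n; i++) {
        f[i].y -= f[i].fall;    /* gravity */
        f[i].x += f[i].drift;   /* wind */
        if (f[i].y < floor_y)
            f[i].y += 100.0f;   /* respawn near the top of the volume */
    }
}
```

The expensive part, as noted above, isn't this update loop but sorting the resulting quads for correct priority each frame.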
- Underwater transparency
That would be, well, hard. Not impossible. We'd draw the scene below the water into the framebuffer, then apply a window effect, store that framebuffer result, then draw the remaining polys (handling the clipping problems, of course) over the previous "picture". The framerate would be cut to 30fps because of the two rendering passes.
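A rough software sketch of the intermediate step, the "window effect" applied over the underwater part of the stored framebuffer. I'm assuming RGB555 pixels and modelling the effect as a simple 50/50 blend with a water colour; the resolution and function names are made up for illustration:

```c
#include <stdint.h>

#define W 320
#define H 224

/* Average two RGB555 pixels. Masking off each channel's low bit
   before the shift keeps the channels from bleeding into each other. */
static uint16_t blend555(uint16_t a, uint16_t b)
{
    return ((a & 0x7BDE) >> 1) + ((b & 0x7BDE) >> 1);
}

/* Tint everything below `waterline` with `water_rgb`, standing in
   for the window effect applied between the two rendering passes. */
void water_pass(uint16_t fb[H][W], int waterline, uint16_t water_rgb)
{
    for (int y = waterline; y < H; y++)
        for (int x = 0; x < W; x++)
            fb[y][x] = blend555(fb[y][x], water_rgb);
}
```

The remaining polys would then be drawn over this result in the second pass, which is where the 30fps cap comes from.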
- Motion blur in cutscenes
We could mock something up with direct framebuffer access. We'd store the previous frame and apply a motion filter to the new frame using, err... the DSP? The slave SH2? It depends. I foresee less than 30fps in these events, though, since the algorithm is heavy.
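The motion filter itself could be as simple as averaging each new frame with the stored previous one. A minimal sketch, again assuming RGB555 pixels and linear buffers (the real thing would have to live on the DSP or slave SH2 as discussed, and the buffer layout here is illustrative):

```c
#include <stdint.h>
#include <stddef.h>

/* Blend the freshly rendered frame `cur` with the stored previous
   frame `prev` (50/50 per RGB555 channel, via the channel-masking
   trick), then save the unblurred frame as the new `prev`. */
void motion_blur(uint16_t *cur, uint16_t *prev, size_t npix)
{
    for (size_t i = 0; i < npix; i++) {
        uint16_t c = cur[i], p = prev[i];
        prev[i] = c;    /* keep the clean frame for next time */
        cur[i] = ((c & 0x7BDE) >> 1) + ((p & 0x7BDE) >> 1);
    }
}
```

Even at one read-modify-write per pixel, touching the whole framebuffer every frame is exactly the kind of cost that would drag cutscenes under 30fps.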
...Now, the ninja suit, _that_ I have no idea how to do, especially in real time during gameplay ^^;
--
Isn't the impossibility of using the 5 layers at the same time due to memory limitations? As in, you could activate them, but there wouldn't be enough bandwidth to send all the data needed, and the screens would be corrupted. So the marketing would be correct: the screens can be activated, they just can't be used ^^;