Jayextee Posted March 30, 2001

I was thinking, for Doom3, one thing I'd really like to see is some kind of interpolation between rendered frames. Blurring, if you will. Freeze-frame a video during an action scene and things look blurred, yet play through the same scene and everything's clear. I wondered, IF this kind of effect made it into Doom3: (a) what kind of specifications would a guy need to run an engine that does this 60 times a second? and (b) would you actually consider it worth doing, if the specs were right? Just a crazy idea I had - I consider it one of the last things that needs doing before a *true* photorealistic graphical standard is met.
Zaldron Posted March 30, 2001

The blurring's actually a lot harder than that. There are several ways to do it, but the most important are in-render and filter.

In-render is the easiest for most cards. It makes several copies of the object, places them at extrapolated positions, and gives them decreasing amounts of opacity. This looks cheap most of the time, is limited to a fixed number of "samples", and makes things slower (5x, 6x the polycount/texture blitting).

Filter is normally a software thing. The only video card with hardware support for it is the VooDoo5, with its T-Buffer. The T-Buffer lets you store the color of each pixel and mix it with the next frame's color in the same pixel, which gives you smooth, high-quality blurring. The problem's obvious: you need hardware support. Software z-buffer motion blur is pretty slow, from what I can see in 3DSMAX. The only way to make this happen is through hardware, and it looks like the GF3 can't handle these kinds of operations (although pixel shaders could be more flexible than I imagine, but I doubt Carmack would program such a painful feature). I suppose the next incarnations of NVIDIA technology will be able to do post-rendering filters (glows, blur, distortion, etc.) once they merge the defunct 3dfx's tech with their own...
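To make the two approaches concrete, here's a minimal sketch in classic OpenGL (fitting for the era), under some assumptions: apply_object_transform_at() and draw_object() are hypothetical hooks into a renderer, not anything from id's code, and the accumulation-buffer path only stands in for what the T-Buffer would do in hardware. The first function is the "in-render" method (redraw the object several times along its motion path with decreasing opacity); the second is the "filter" method (blend each finished frame with an accumulation of the previous ones).

#include <GL/gl.h>

/* Hypothetical hooks into the engine -- placeholders, not real id Software code. */
extern void apply_object_transform_at(float t);  /* set the modelview for the object at time t */
extern void draw_object(void);                   /* issue the object's polygons */

/* In-render motion blur: draw 'samples' ghost copies of the object spread
 * across the shutter interval, fading out the older ones.  Cost scales
 * linearly with the number of samples (the 5x-6x polycount hit mentioned above). */
void draw_object_with_ghosts(float now, float shutter, int samples)
{
    int i;
    glEnable(GL_BLEND);
    glBlendFunc(GL_SRC_ALPHA, GL_ONE);                    /* additive ghost copies */
    for (i = 0; i < samples; i++) {
        float t     = now - shutter * (float)i / (float)samples;
        float alpha = 1.0f - (float)i / (float)samples;   /* older copy = fainter */
        glPushMatrix();
        apply_object_transform_at(t);
        glColor4f(1.0f, 1.0f, 1.0f, alpha);
        draw_object();
        glPopMatrix();
    }
    glDisable(GL_BLEND);
}

/* Filter-style motion blur: after rendering a frame, mix it with the stored
 * history of previous frames, roughly what the T-Buffer does per pixel.
 * 'decay' in [0,1) controls how long the trails persist (0 = no blur). */
void present_frame_with_accum_blur(float decay)
{
    glAccum(GL_MULT, decay);            /* fade the frames already accumulated     */
    glAccum(GL_ACCUM, 1.0f - decay);    /* mix in the frame just rendered          */
    glAccum(GL_RETURN, 1.0f);           /* copy the blend back to the color buffer */
}

The catch with the second version is the one Zaldron points out: on 2001-era consumer cards the accumulation buffer generally fell back to software, so without dedicated hardware like the T-Buffer it was far too slow for a game running 60 times a second.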