3D Ray Tracing
March 19, 2018 at 8:59 pm #17894
It looks like Microsoft has raised the bar with this new 3D rendering API. I have not had time to absorb the implications, but at first glance this could do wonders for game engines, and may even have implications for better and less costly 3D modelling and rendering engines. (I really must learn Blender this year, as there is nothing cheaper than free!)
Games should benefit by doing away with pre-rendered cut-scenes; instead, such things could be done on the fly.
March 20, 2018 at 5:53 pm #17932
keith with the teef | Participant | @thinktank | Forumite Points: 221
They've been on about this for years, all the way back to my 6800 Ultra.
My current card, a 1070, pumps out wild amounts of FLOPS.
I think we've got some way to go before game engines can really turn on the charm with ray tracing, because we don't yet have the compute power.
Although I feel this will come sooner rather than later: the 20 TFLOPS zone.
It's funny: my media center has a throwaway GT 610 card in it, but it matches my old king-of-the-hill 6800 Ultra.
I guess my 1070 will be throwaway compute power in 10 years' time. Unfortunately I will be 63.
April 28, 2018 at 5:45 pm #20145
keith with the teef | Participant | @thinktank | Forumite Points: 221
This ray tracing has really caught my attention.
In some demos, significantly more detail is unlocked, obviously because rasterisation is just an approximation.
But I never thought the impact would be so big, as today's games are amazing.
Also, I have a strong suspicion that the next Tomb Raider game will support real Nvidia ray tracing.
Watch this space. 🙂
April 28, 2018 at 6:40 pm #20151
Bob Williams | Participant | @bullstuff2 | Forumite Points: 1,849
Quoting keith with the teef: "They bin on about this for years. All the way back to my 6800 ultra. […] I guess my 1070 will be throw away compute power in 10 years time. Unfortunately I will be 63."
A callow youth, Keith!😊😆 That ain’t old…
“If you think this Universe is bad, you should see some of the others.”
― Philip K. Dick, legendary SF writer.
April 29, 2018 at 1:04 am #20169
The problem with real-time rendering for the man at home is the amount of work you have to do to get something looking half decent. Bear in mind that, unlike full-on renderers (Blender included), real-time work relies on low-poly meshes with highly optimised texture maps. Creating those maps can take an awfully long time.
Having said that, some of the stuff produced on the Unreal Engine is absolutely amazing, but I shudder to think of the preparation time.
April 29, 2018 at 8:22 am #20176
I need to read the API again, but to me it did not seem to require improved modelling, triangulation or UV mapping; its main focus appears to be providing more efficient culling, so that only the rays that will actually appear in a scene need to be traced, thus making ray tracing more affordable.
The way I think of it, old-school rasterisation works from the viewer towards the object and culls against the z-buffer as it goes. The M$ API ray traces only the visible pixels, back from the object to the viewer. Incidentally, this ought to automatically handle occlusion effects (the sort of fuzzy edges that objects can have), and it should also automatically generate shadow maps.
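That per-pixel idea can be sketched in a few lines. This is my own toy illustration in plain Python (nothing to do with the actual DXR API): one primary ray per pixel from the camera, then a second "shadow ray" from each hit point towards the light, which is how the shadowing falls out for free.

```python
import math

def intersect_sphere(origin, direction, center, radius):
    """Nearest positive hit distance along a unit-length ray, or None."""
    oc = [o - c for o, c in zip(origin, center)]
    b = 2.0 * sum(d * e for d, e in zip(direction, oc))
    c = sum(e * e for e in oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0.0:
        return None
    s = math.sqrt(disc)
    for t in ((-b - s) / 2.0, (-b + s) / 2.0):
        if t > 1e-6:
            return t
    return None

def trace(width, height, sphere_center, sphere_radius, light_pos):
    """One primary ray per pixel from a camera at the origin looking down -z.
    Each hit fires a shadow ray towards the light: '#' lit, 'x' shadowed,
    '.' background."""
    image = []
    for y in range(height):
        row = []
        for x in range(width):
            # Map the pixel to a point on an image plane at z = -1.
            px = (x + 0.5) / width * 2.0 - 1.0
            py = 1.0 - (y + 0.5) / height * 2.0
            norm = math.sqrt(px * px + py * py + 1.0)
            d = (px / norm, py / norm, -1.0 / norm)
            t = intersect_sphere((0.0, 0.0, 0.0), d, sphere_center, sphere_radius)
            if t is None:
                row.append('.')
                continue
            hit = tuple(t * k for k in d)
            # Shadow ray from the hit point towards the light, offset
            # slightly to avoid self-intersection at the surface.
            to_light = [l - h for l, h in zip(light_pos, hit)]
            dist = math.sqrt(sum(v * v for v in to_light))
            sdir = tuple(v / dist for v in to_light)
            start = tuple(h + 1e-4 * v for h, v in zip(hit, sdir))
            # With a single convex object, any hit means the light is occluded.
            blocked = intersect_sphere(start, sdir, sphere_center, sphere_radius)
            row.append('x' if blocked is not None else '#')
        image.append(''.join(row))
    return image
```

Real implementations obviously trace against whole acceleration structures rather than one sphere, but the per-pixel backward trace and the free shadow query are the same shape.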
My guess would be that its first impact will be on very much improved in-game graphics.
April 29, 2018 at 5:05 pm #20203
Wheels-Of-Fire | Participant | @grahamdearsley | Forumite Points: 621
The first time I looked at ray tracing it was on an Atari ST with POV software.
A single scene at 320×200 in 16 colours with 2 lights took 18 hours to render.
Have things improved since then? 😁
April 29, 2018 at 10:47 pm #20212
For the RT stuff, it's low poly, with all the detail in the textures, which means putting a hell of a lot more effort into spec, normal, bump and reflection maps than you need to for a full 3D path tracer.
Path tracing, OTOH, tends to be based on highly detailed geometry and physically based shaders.
This is an issue when moving to RT, simply because the required detail isn't there when converting; i.e. the artist has to make standout, technically superior textures for RT compared with what the path-tracer artist has to do.
April 30, 2018 at 8:10 am #20217
Dan, you may be interested in this technical article on Kingdom Come: Deliverance (Google the title for some quite stunning eye-candy YouTube RT porn). This game is 'old-school' but sets a pretty high bar for realism. As the article says, they used a texel (texture pixel) count of 256 texels/sq metre of scene. While that requires a lot of work, it requires even better database design to be able to build up scenes from a palette of texel objects.
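To put rough numbers on that texel budget (my own back-of-envelope figures, not from the article; I'm reading the density as linear, i.e. texels per metre of edge, and assuming uncompressed RGBA at 4 bytes per texel):

```python
def texture_size(width_m, height_m, texels_per_m, bytes_per_texel=4):
    """Texel count and uncompressed memory for one rectangular surface
    at a given linear texel density (RGBA8 assumed: 4 bytes/texel)."""
    tx = int(width_m * texels_per_m)
    ty = int(height_m * texels_per_m)
    return tx * ty, tx * ty * bytes_per_texel

# A hypothetical 4 m x 2.5 m wall at the two densities discussed:
texels_256, bytes_256 = texture_size(4.0, 2.5, 256)  # 1024 x 640 texels
texels_512, bytes_512 = texture_size(4.0, 2.5, 512)  # four times the memory
```

Doubling the linear density quadruples the texel count and memory, which is why the jump from 256 to 512 is such a big ask for both artists and hardware.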
Where I anticipate the M$ API benefiting us is that it will make the 512 texels/metre target more achievable.
April 30, 2018 at 9:25 am #20224
Just a follow-up comment: while your comment about texture detail is of course correct, it really only applies to background LoD work, where probably an average normal map is baked into the textures. Foreground action work HAS to be done with polygons, distorted where necessary by skeletons and whatever process is used to transfer the result to the game engine (e.g. Maya -> CryEngine).
April 30, 2018 at 8:45 pm #20253
I don’t mean to sound overly critical. RT is a different skill set entirely from my own. For example, this https://www.youtube.com/watch?v=6oo293kIGPQ looks awesome in many parts, but I’d struggle to say it’s photo-realistic.
In a few years, I think the two (path tracing and RT) will come together, both as the programming and the hardware improve, but at the moment, the best path traced anims still leave the best RT behind.
May 1, 2018 at 7:18 am #20259
Dan, the problem with game RT rendering is that it has to make compromises to meet the insane target of rendering an unscripted scene in less than 0.017 seconds on a high mid-range CPU and GPU. A typical Blender or other cut-scene render can take as long as necessary, using as large a render farm as it needs, and just save out the finished frame.
The frame rate is the killer that drives reality compromises, as a game that achieves under 60 fps gets very critical review comments. Level of Detail (LoD) modelling (polygon count per object goes down with distance) is a typical compromise, but you will still hear complaints about 'popping' as objects move from one LoD to another. There are many other artificial (unnatural) tricks that developers have to use to reduce render time. Even then, a glance at any face in any RT scene shows up as glaringly unreal; as you know, flesh is very difficult to render even in pre-renders, and RT just cannot afford the frame time to do anything except teeter on the edge of the uncanny valley.
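That 'popping' comes from hard distance thresholds. A common mitigation, sketched here with made-up distances rather than any particular engine's scheme, is to add a small hysteresis margin so an object hovering near a boundary doesn't flip LoD every frame:

```python
def select_lod(distance, thresholds=(10.0, 30.0, 80.0)):
    """Pick a LoD index from camera distance: 0 = full detail.
    Thresholds are illustrative metres, not from any real engine."""
    for lod, limit in enumerate(thresholds):
        if distance < limit:
            return lod
    return len(thresholds)

def select_lod_hysteresis(distance, current_lod,
                          thresholds=(10.0, 30.0, 80.0), margin=2.0):
    """Same selection, but only switch LoD once the distance clears the
    boundary by a margin, suppressing per-frame popping at the edge."""
    naive = select_lod(distance, thresholds)
    if naive == current_lod:
        return current_lod
    # The boundary being crossed sits at the lower of the two level indices.
    boundary = thresholds[min(naive, current_lod)]
    if abs(distance - boundary) < margin:
        return current_lod  # inside the dead zone: hold the current level
    return naive
```

The dead zone trades a little extra polygon cost near boundaries for visual stability; real engines also cross-fade or morph between levels.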
If Moore's law holds, maybe another 10 years will see some of these tricks becoming unnecessary, and it will become more common to ease the burden on artists by using natural scans of scenes and objects, as shown in the demo you linked. I'd also see some of these techniques spilling back into the world of Blender, as artistic time constraints also restrict time spent on design and creativity. I think it could be 20 years before pre-renders and RT renders become virtually indistinguishable, but that of course does not allow for the impact of 8K screens or 48-bit true colour!
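For what the 10- and 20-year guesses imply, here is the naive doubling arithmetic (illustrative only; it assumes performance doubles every two years, which real GPU scaling may not follow):

```python
def projected_speedup(years, doubling_period_years=2.0):
    """Growth factor if performance doubles every fixed period
    (the classic Moore's-law reading)."""
    return 2.0 ** (years / doubling_period_years)

# On that assumption: 10 years -> 32x today's compute, 20 years -> 1024x.
```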
I suspect it is a 'competition' in which pre-rendering will always be slightly ahead.