Thursday, August 12, 2021

Intel Hires Former Nvidia Researcher Who Helped Develop Ray Tracing Tech

(Image is from Moana, rendered via Intel OSPray, an open-source ray tracing engine.)
Intel has hired Anton Kaplanyan to help drive the company’s ray tracing efforts and its broader graphics research. Kaplanyan has devoted his career to ray tracing and to a range of other rendering improvements: he worked on Nvidia’s ray tracing and RTX research program and also spent time at Oculus Research.

Intel is serious about re-entering the graphics market, but it has a lot of catching up to do to match the decades of experience AMD and Nvidia have in working with developers and building high-end graphics hardware.

Kaplanyan has written a blog post laying out his plans for the future:

Recent advances in neural graphics, ray tracing, cloud computing, perceptual and differentiable rendering, materials and lighting, appearance modeling, as well as in commoditized content creation require a transformation of the current graphics platform. This gives us an opportunity to not only completely reinvent how we co-design graphics hardware and software with these technologies in the driver’s seat, but also to look at a larger picture of a potentially novel graphics ecosystem that can include heterogenous distributed systems and multi-level compute for movie-quality visuals on a wide range of consumer devices.

This might sound like marketing jargon at first glance, but there’s real depth here. Kaplanyan’s reference to “neural graphics” points to ongoing work on a neural graphics pipeline, including efforts to integrate generative adversarial network techniques directly into graphics engines. This description of OpenDR, an approximate differentiable renderer, explains the concept as follows: “OpenDR can take color and vertices as input to produce pixels in an image and from those pixels, it retains derivatives such that it can determine exactly what inputs contributed to the final pixel colors. In this way, it can ‘de-render’ an image back into colors and vertices.”
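
To make that “de-render” idea concrete, here is a toy Python sketch of the underlying principle. It uses the JAX library purely as an autodiff tool; this is not OpenDR’s actual API, just an illustration of how a pixel color can remain differentiable with respect to its inputs.

```python
# Toy illustration of differentiable rendering (not OpenDR's API): a pixel's
# color is a differentiable function of vertex colors, so autodiff can report
# how much each vertex contributed to that pixel.
import jax
import jax.numpy as jnp

def shade_pixel(vertex_colors, barycentric_weights):
    # vertex_colors: (3, 3) array, one RGB color per triangle vertex
    # barycentric_weights: (3,) position of the pixel inside the triangle
    # The pixel color is a differentiable blend of the vertex colors.
    return barycentric_weights @ vertex_colors

vertex_colors = jnp.array([[1.0, 0.0, 0.0],   # red vertex
                           [0.0, 1.0, 0.0],   # green vertex
                           [0.0, 0.0, 1.0]])  # blue vertex
weights = jnp.array([0.5, 0.3, 0.2])          # where the pixel sits in the triangle

pixel = shade_pixel(vertex_colors, weights)

# The retained derivative: how much each vertex color contributed to the pixel.
# This is the kind of information a differentiable renderer uses to "de-render"
# an image back toward its inputs.
contribution = jax.jacobian(shade_pixel)(vertex_colors, weights)

print(pixel)         # blended RGB value, e.g. [0.5 0.3 0.2]
print(contribution)  # (3, 3, 3) array: d(pixel RGB) / d(vertex colors)
```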

Anton Sergeyevich Kaplanyan, VP of Intel Graphics Research

I can’t say for certain what Kaplanyan has in mind when he talks about the commoditization of content creation, but I suspect it refers to things like the in-browser character creation tools Epic has shown off as part of Unreal Engine 5. What all of this adds up to is a great deal of practical work on cutting-edge methods of displaying pixels on screens.

This kind of research is needed more than ever. GPU power consumption continues to rise; rumors have suggested next-generation cards from Nvidia and Intel could hit 400-500W. Those figures seem high, but GPU power consumption has been climbing for years, and ray tracing only makes the problem harder. We’ve already seen how technologies like Nvidia’s DLSS and AMD’s FSR can reduce the cost of ray tracing by rendering at a lower internal resolution and upscaling the result. This lets the end-user target a lower native resolution and divert the freed-up GPU horsepower to ray tracing if desired. Other techniques, such as coarse pixel shading, try to save GPU power by rendering less important parts of the scene, such as backgrounds, in less detail.
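
As a rough, back-of-the-envelope illustration of that trade-off (the specific resolutions and the simplifying assumption that shading cost scales with pixel count are mine, not from the article):

```python
# Simplifying assumption for illustration only: per-frame shading cost scales
# roughly with the number of pixels rendered before upscaling.
native_4k    = 3840 * 2160   # pixels shaded when rendering natively at 4K
internal_qhd = 2560 * 1440   # pixels shaded when upscaling from 1440p to 4K

savings = 1 - internal_qhd / native_4k
print(f"Rendering internally at 1440p shades about {savings:.0%} fewer pixels,")
print("budget that can be redirected to ray tracing before the image is upscaled.")
```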

It’s going to be a few years yet before ray tracing goes truly mainstream. It looks as though we’ll need to wait at least one more generation before AMD and Nvidia push plausible ray tracing down to the $250 price point, and the budget market may take even longer. Nonetheless, the combined advances of machine learning, ray tracing, and new rendering techniques represent a significant change compared with rasterization alone.

Exactly how much of a part Intel has to play in all of this will depend on how good the company’s graphics hardware is — but we should have some answers to that question before too much longer.

Source: ExtremeTech https://ift.tt/3CFFdtd
