Intel Released Open Image Denoise

2019-02-01

Intel

Denoising matters a great deal to CG practitioners. As a key step in post-processing, it removes the visual noise caused by insufficient ray tracing samples; it is also one of the most time-consuming steps, and the quality of the denoising directly affects the quality of the output images and sequences. Ideally, enough rays would be traced through every pixel of the screen to converge to a clean image, but real-world hardware cannot do this within a reasonable or real-time budget. The purpose of denoising is to correct and reconstruct such under-sampled images.

Recently, Intel introduced Open Image Denoise (abbreviated OIDN), an open-source, high-performance, high-quality denoising library for ray-traced images that lets the CPU take on the denoising work. Open Image Denoise gives users a high-quality, efficient, and easy-to-use denoising method that significantly reduces ray tracing render times. The library filters out the Monte Carlo noise inherent in path tracing and other stochastic ray tracing methods, and can reduce the number of samples required per pixel by multiple orders of magnitude, depending on the target quality.

Intel has not yet released detailed technical documentation, but Open Image Denoise is built on the Intel Math Kernel Library for Deep Neural Networks (MKL-DNN). It requires a CPU supporting the Intel 64 architecture and the SSE4.2 instruction set, and it uses modern instruction sets such as SSE4, AVX2, and AVX-512 to achieve higher denoising performance. It runs on laptops, workstations, and HPC render nodes.
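As a toy illustration (not OIDN itself, whose internals Intel has not detailed), the sketch below shows why Monte Carlo rendering is noisy at low sample counts: the error of a pixel estimate only falls as 1/sqrt(samples), so halving the noise by brute force costs four times the samples. That quadratic cost is exactly what a denoiser lets you avoid. The shading function f(x) = x² and every name here are invented purely for the demo.

```python
# Toy demo: Monte Carlo pixel noise shrinks only as 1/sqrt(samples).
# Nothing here uses OIDN; it just illustrates the cost a denoiser avoids.
import math
import random

def render_pixel(spp, rng):
    """Estimate a 'pixel' as the Monte Carlo average of a toy shading
    function f(x) = x*x over [0, 1]; the exact answer is 1/3."""
    total = 0.0
    for _ in range(spp):
        x = rng.random()
        total += x * x
    return total / spp

def rms_error(spp, trials=2000, seed=0):
    """Root-mean-square error of the pixel estimator at a given
    samples-per-pixel count, averaged over many independent renders."""
    rng = random.Random(seed)
    exact = 1.0 / 3.0
    err2 = sum((render_pixel(spp, rng) - exact) ** 2 for _ in range(trials))
    return math.sqrt(err2 / trials)

# Quadrupling the samples per pixel roughly halves the noise (1/sqrt(n)),
# so each halving of visible grain costs about 4x the render time.
e4, e16, e64 = rms_error(4), rms_error(16), rms_error(64)
print(f"RMS noise  spp=4: {e4:.4f}  spp=16: {e16:.4f}  spp=64: {e64:.4f}")
```

Running it shows the noise dropping by roughly a factor of two each time the sample count quadruples, which is why skipping even one such quadrupling via denoising is a large win.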
Not only does it run efficiently in offline rendering, it can also support interactive ray tracing, depending on the hardware used. The renderings above show results produced with Open Image Denoise.

For comparison, NVIDIA uses GPU-based deep learning to predict the final rendered image from a partially completed result. The resulting AI denoiser produces high-quality images in a fraction of the time of existing methods, coming very close to the fully converged reference. The NVIDIA team has also used AI to tackle aliasing in real-time game rendering, training a neural network to recognize jagged artifacts and replace them with smooth, anti-aliased pixels for a sharper image.

So which is stronger, the CPU-based approach or the GPU-based one? There is no direct head-to-head comparison of the two technologies yet, but judging from the published results, both work well; the main difference may simply be speed.

