
Intel Released Open Image Denoise

Denoising matters a great deal to CG practitioners. It is an important part of post-processing that removes the visual noise caused by insufficient ray tracing samples. It is also often the most time-consuming stage, and the quality of the denoising directly affects the quality of the output images and sequences.

In an ideal world, enough light rays would be traced to every pixel on the screen, but in practice computing power is not yet advanced enough to do this within a reasonable or real-time budget. The purpose of denoising is to correct and reconstruct such under-sampled images.

To address this denoising problem, Intel recently released Open Image Denoise (abbreviated OIDN), an open-source, high-performance, high-quality denoising library for ray-traced images that lets the CPU take on part of the denoising work.

Open Image Denoise gives users a high-quality, efficient, and easy-to-use denoising solution that can significantly reduce the rendering time of ray tracing software.

The library filters out the Monte Carlo noise inherent in path tracing and other ray tracing methods, and it can reduce the number of samples required per pixel by multiple orders of magnitude, depending on the image quality you are targeting. A minimal usage sketch follows below.
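For readers curious what integrating the library looks like, here is a minimal sketch based on OIDN's public C++ API (oidn::newDevice, a "RT" filter, setImage, execute). The image dimensions and buffers are placeholders, and details may vary between OIDN versions.

```cpp
// Minimal sketch: denoising a noisy beauty pass with Open Image Denoise (C++ API).
// Assumes the OIDN headers and library are installed; buffer contents are placeholders.
#include <OpenImageDenoise/oidn.hpp>
#include <vector>
#include <iostream>

int main()
{
    const int width  = 1920;
    const int height = 1080;

    // Noisy RGB float image from the renderer, and a buffer for the denoised result.
    std::vector<float> color(width * height * 3);
    std::vector<float> output(width * height * 3);

    // Create and commit a CPU device.
    oidn::DeviceRef device = oidn::newDevice();
    device.commit();

    // "RT" is the generic ray tracing denoising filter.
    oidn::FilterRef filter = device.newFilter("RT");
    filter.setImage("color",  color.data(),  oidn::Format::Float3, width, height);
    filter.setImage("output", output.data(), oidn::Format::Float3, width, height);
    filter.set("hdr", true);   // the input is an HDR image
    filter.commit();

    // Run the denoiser.
    filter.execute();

    // Check for errors.
    const char* errorMessage;
    if (device.getError(errorMessage) != oidn::Error::None)
        std::cout << "Error: " << errorMessage << std::endl;

    return 0;
}
```

In a production integration, auxiliary albedo and normal buffers can also be passed to the filter to help preserve edges and fine detail.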

Intel has not yet published detailed technical documentation, but Open Image Denoise is built on the Intel Math Kernel Library for Deep Neural Networks (MKL-DNN). It requires a CPU with the Intel 64 architecture and at least the SSE4.2 instruction set, and it uses modern instruction sets such as SSE4, AVX2, and AVX-512 for higher denoising performance.

It runs on laptops, workstations, and the compute nodes of HPC systems. It is efficient enough not only for offline rendering but also, depending on the hardware used, for interactive ray tracing.

Let's take a look at a few renderings denoised with Open Image Denoise, in addition to those shown above.

NVIDIA, by comparison, uses GPU-based deep learning to predict the final rendered image from partially completed results. The resulting AI solution can denoise in a fraction of the time of existing methods while producing high-quality images that come very close to the fully converged reference.

The NVIDIA team has also used AI to tackle jagged edges in in-game rendering: a neural network is trained to recognize the artifacts and replace them with smooth, anti-aliased pixels, resulting in a cleaner image.

One approach is CPU-based and the other is GPU-based, so which is better? For now there is no head-to-head comparison of the two technologies. Judging from the results each has shown, both deliver good quality; perhaps the real difference will come down to speed!
