Intel Released Open Image Denoise

2019-09-30

Intel

Denoising matters a great deal to CG practitioners. It is a key part of post-processing, removing the visible noise caused by an insufficient number of ray-tracing samples, and it is also one of the most time-consuming steps; the quality of the denoising directly affects the quality of the final images and sequences. Ideally, enough rays would be traced for every pixel on screen, but in practice computing power is not yet advanced enough to do this at a reasonable or real-time cost. The purpose of denoising is to correct and reconstruct such under-sampled images.

To address this problem, Intel recently released Open Image Denoise (OIDN), an open-source, high-performance, high-quality denoising library for ray-traced images that lets the CPU take over part of the denoising work. Open Image Denoise gives users a high-quality, efficient, and easy-to-use denoising method that significantly reduces render times in ray-tracing renderers. The library filters out the Monte Carlo noise inherent in path tracing and other ray-tracing methods, and can reduce the required number of samples per pixel by multiple orders of magnitude, depending on the target quality you want to achieve.

Intel has not yet published detailed technical documentation, but Open Image Denoise is built on the Intel Math Kernel Library for Deep Neural Networks (MKL-DNN). It requires a CPU supporting the Intel 64 architecture and the SSE4.2 instruction set, and it uses modern instruction sets such as SSE4, AVX2, and AVX-512 to accelerate denoising. It runs on laptops, workstations, and HPC compute nodes.
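The "orders of magnitude fewer samples" claim follows from basic Monte Carlo statistics: pixel noise shrinks only as the square root of the samples per pixel (spp), so cutting noise in half by brute force quadruples render time, which is exactly the cost a denoiser avoids. The toy sketch below (illustrative only, not OIDN code) measures this scaling for a single pixel whose samples are uniform random values:

```python
import random
import statistics

def render_pixel(spp, rng):
    # Toy "path tracer": the true radiance is 0.5, and each sample is a
    # noisy estimate drawn uniformly from [0, 1]; the pixel value is the mean.
    return sum(rng.random() for _ in range(spp)) / spp

def estimate_noise(spp, trials=2000, seed=1):
    # Re-render the same pixel many times and measure the spread of the
    # estimates, i.e. the visible Monte Carlo noise at this sample count.
    rng = random.Random(seed)
    estimates = [render_pixel(spp, rng) for _ in range(trials)]
    return statistics.pstdev(estimates)

noise_16 = estimate_noise(16)      # noise at a low, fast sample count
noise_1600 = estimate_noise(1600)  # 100x more samples -> only ~10x less noise
print(noise_16 / noise_1600)       # ratio near sqrt(1600 / 16) = 10
```

Because 100x more samples buys only about 10x less noise, a denoiser that produces a clean image from a low-spp input effectively replaces that entire extra rendering cost.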
Not only does it run efficiently in offline rendering, it can also support interactive ray tracing, depending on the hardware used. The renderings above show Open Image Denoise in action.

For comparison, NVIDIA uses GPU-based deep learning to predict the final rendered image from a partially converged result. The resulting AI denoiser produces high-quality images that closely approximate the fully converged reference in a fraction of the time required by existing methods. The NVIDIA team has also applied AI to the aliasing problem in game rendering, training a neural network to recognize jagged artifacts and replace them with smooth, anti-aliased pixels for a cleaner image.

One approach is CPU-based and the other GPU-based, so which is stronger? There is no head-to-head comparison of the two yet, but judging from the results each has shown, both perform well. The deciding difference may simply come down to speed.



