How does the studio directly produce the final shot? (1)


The Road to Innovation in Virtual Production

How does the studio directly produce the final shot? The production team of Disney's new series The Mandalorian used real-time rendering technology to construct digital virtual scenes in advance.


The Mandalorian, a Star Wars spin-off series launched on Disney's streaming platform

The Mandalorian, the first Star Wars spin-off series, premiered on Disney+ on November 12, 2019, and was also the first blockbuster series made with Unreal Engine.

The Mandalorian is set five years after Return of the Jedi, after the fall of the Empire and before the emergence of the First Order. The plot follows the travels of a lone bounty hunter in the outer reaches of the galaxy, far from the authority of the New Republic.




Epic Games, the developer of Unreal Engine, appeared in the behind-the-scenes credits

It is worth noting that Epic Games, the developer of Unreal Engine, appeared in the thanks section of The Mandalorian's credits, followed by the entire virtual production team.

Epic Games is best known for hit games like Fortnite, and its Unreal Engine powers AAA titles such as Kingdom Hearts III, Dragon Ball FighterZ, and Star Wars Jedi: Fallen Order. In recent years, Epic has been working with Lucasfilm to bring the real-time rendering capabilities of Unreal Engine into live-action production for Disney's streaming content.


In many shots of The Mandalorian, the background is displayed directly on an LED wall. Jon Favreau, the producer and screenwriter of The Mandalorian, is also a pioneer in using game engines for virtual production. At the SIGGRAPH 2019 computer graphics conference in Los Angeles, Favreau shared an efficient way to use a game engine for virtual previsualization (previs).

"We used the V-cam system, which is essentially making movies in VR. All results would deliver to the editor. It is like we are editing a part of the movie in advance, the purpose of which is to achieve the pre-conceived goal."Jon Favreau said.


Using VR and real-time rendering technology to view the film's virtual environment

An evolution from a virtual preview tool to a final image production tool

The virtual production system no longer provides only virtual previews. On The Mandalorian set, the crew used LED video walls as the background for live in-camera composites. The SFX team projects pre-rendered content, such as environments, onto the LED walls as a dynamic green screen. The virtual stage is a cube of virtual content enclosed by four LED walls, driven by Unreal Engine, so the cinematographer can capture final images directly in camera. The content displayed on the LED walls is adjusted and transformed in real time according to the position of the tracked camera.
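The core idea of adjusting the wall content to the camera position can be illustrated with a small geometric sketch. The snippet below is a simplified illustration, not production code from the show: it treats a single LED wall as a rectangle in the z = 0 plane and computes the asymmetric (off-axis) view frustum that keeps the wall's imagery in correct perspective as the tracked camera moves. The function name and the wall layout are assumptions made for this example.

```python
def wall_frustum(eye, wall_width, wall_height, near=0.1):
    """Compute an asymmetric (off-axis) view frustum for an LED wall.

    The wall is modeled as a rectangle in the z = 0 plane, centered
    horizontally at x = 0 with its bottom edge at y = 0. `eye` is the
    tracked camera position (x, y, z), with z > 0 in front of the wall.
    Returns (left, right, bottom, top) extents at the near plane, i.e.
    the arguments of a glFrustum-style projection.
    """
    ex, ey, ez = eye
    if ez <= 0:
        raise ValueError("camera must be in front of the wall (z > 0)")
    scale = near / ez  # similar triangles: project wall edges onto the near plane
    left = (-wall_width / 2 - ex) * scale
    right = (wall_width / 2 - ex) * scale
    bottom = (0.0 - ey) * scale
    top = (wall_height - ey) * scale
    return left, right, bottom, top

# A camera centered on a 6 m x 3 m wall sees a symmetric frustum;
# moving it to the right makes the frustum asymmetric, so the wall
# image must be re-rendered every frame from the new viewpoint.
centered = wall_frustum((0.0, 1.0, 4.0), wall_width=6.0, wall_height=3.0)
off_right = wall_frustum((2.0, 1.0, 4.0), wall_width=6.0, wall_height=3.0)
```

Re-rendering with this per-frame frustum is what keeps parallax and perspective correct in camera, which is why the background looks real instead of like a flat projection.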


Live actors and LED walls in the virtual studio

"We present real-time rendered content in the camera and get composing shots directly. Considering zooming and other requirements, for some types of cinematography, we are not only able to interact on the spot but also can see the lighting, interactive light, layout, background, horizon, and others directly in the camera. Hence, these don't need to wait for the post-production." Jon Favreau noted.

