How does the studio directly produce the final shot? (2)
The rising expressive power of real-time rendering
Jon Favreau's biggest concern on The Mandalorian was whether a game engine could reach Star Wars-level visual-effects rendering on a television budget. But when he saw a desert environment reminiscent of Episode II, he set that worry aside.
The sandy environment was built from photo-scanned assets and rendered in real time
In the virtual studio, all scene elements and lighting can be swapped and upgraded freely while the sense of immersion is maintained. Some of The Mandalorian's virtual environments directly reused assets created in Unreal Engine for video games, saving a great deal of asset-building time and opening up the possibility of asset sharing between the two industries.
Real-time rendering can now convincingly present the highly reflective materials common in science-fiction productions
The virtual scenes on set sometimes fool even the people standing there. Jon Favreau recalled, "Someone came to the studio and said, 'I thought you weren't going to build the whole set here.' I said, 'No, we didn't build it; in fact, there are only tables here.' Because the LED wall renders from the camera's perspective, there is correct parallax. But even glancing at it casually from beside the camera, you would still think you were looking at a live-action set."
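The parallax Favreau describes comes from rendering the wall's imagery from the tracked camera's position, which amounts to an off-axis (asymmetric-frustum) projection onto the screen plane. Below is a minimal illustrative sketch of that computation, following Kooima's well-known generalized perspective projection; this is not Unreal Engine's actual implementation, and the screen-corner and eye coordinates in the example are invented for demonstration.

```python
import numpy as np

def off_axis_frustum(pa, pb, pc, pe, near):
    """Asymmetric frustum extents for a planar screen (e.g. an LED wall).

    pa, pb, pc: lower-left, lower-right, upper-left screen corners (world space)
    pe:         tracked camera/eye position
    near:       near-plane distance
    Returns (left, right, bottom, top) measured at the near plane.
    """
    pa, pb, pc, pe = (np.asarray(p, dtype=float) for p in (pa, pb, pc, pe))
    vr = pb - pa; vr /= np.linalg.norm(vr)            # screen right axis
    vu = pc - pa; vu /= np.linalg.norm(vu)            # screen up axis
    vn = np.cross(vr, vu); vn /= np.linalg.norm(vn)   # screen normal (toward eye)
    va, vb, vc = pa - pe, pb - pe, pc - pe            # eye-to-corner vectors
    d = -np.dot(va, vn)                               # eye-to-screen distance
    left   = np.dot(vr, va) * near / d
    right  = np.dot(vr, vb) * near / d
    bottom = np.dot(vu, va) * near / d
    top    = np.dot(vu, vc) * near / d
    return left, right, bottom, top

# Centered camera: a symmetric frustum.
print(off_axis_frustum([-1, -1, 0], [1, -1, 0], [-1, 1, 0], [0, 0, 2], 1.0))
# Camera moved to the right: the frustum shifts left, producing parallax.
print(off_axis_frustum([-1, -1, 0], [1, -1, 0], [-1, 1, 0], [1, 0, 2], 1.0))
```

As the tracked camera moves, the frustum skews in the opposite direction, so the image on the wall shifts exactly as a real window onto a deep set would; viewers standing next to the camera still see a plausible scene, which is the effect described in the quote above.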
Stepping into the virtual studio, the actor enters both the virtual world and the final shot
The changes in Hollywood led by virtual production technology
In addition to accelerating production cycles and improving budget efficiency, virtual production has brought revolutionary convenience to actors and crews. Actors can see the environment in real time as they perform and interact with it.
"This not only helps the cinematography but also helps the actors understand their surroundings, such as where the horizon is. It also provides interactive lighting," Jon Favreau said, describing it as "a huge breakthrough."
Virtual production can also help fantasy characters interact with real actors
In addition to The Mandalorian, film projects such as The Lion King and The Jungle Book have already begun using game engines in production, though mainly at the previsualization stage.
Steven Spielberg and Denis Villeneuve also used Unity to help achieve the visual effects of Ready Player One and Blade Runner 2049, respectively. This approach is gradually replacing the traditional storyboarding process of hand-drawn, comic-book-style panels.
Spielberg at the monitor, viewing a real-time composite of the footage being shot
In the virtual production pipeline, VR technology is rapidly becoming a viable option for large studios and production companies. Jon Favreau used virtual reality extensively in the recent live-action remake of The Lion King, essentially building the film's entire world in VR.
The Lion King used extensive real-time virtual previsualization
The popularization of virtual production technology
In terms of budget, virtual production and LED display equipment are still relatively expensive at this stage.
Completing real-time compositing and film production at a lower cost
Jon Favreau said that virtual production is a major leap forward for the film industry, allowing creators to make creative decisions before shooting begins rather than reworking things after completion. Now more people can see what the shot will look like directly during cinematography, and more people can contribute their own ideas and learn from each other because they can see the final result in advance.
"In the past, when you left a green-screen set, you no longer cared about it. But now we have so many talented people and more than a hundred years of accumulated filmmaking experience. Why should we give that up just because the shooting system changes? Let's carry on the craft of film artists and develop tools that fit the times."
Virtual production technology will gradually become a routine part of film production