Why do you need to use a render farm?


With the rapid development of 3D movies in recent years, VFX films have attracted more and more attention. Judging from the global box office, the top five films of all time, Avengers: Endgame, Avatar, Titanic, Star Wars: The Force Awakens, and Avengers: Infinity War, are all inseparable from VFX production. And Disney's 3D animated Frozen series swept up $2.574 billion worldwide, setting a new record for animated films at the global box office.

(Image: Frozen 2, via internet)

So why do you need to use a render farm? Rendering is indispensable to every one of these films, and that rendering is almost always done on render farms of various sizes.

Rendering comes near the end of the 3D animation pipeline, and it is one of the most time-consuming steps in post-production. An 80-minute animated film can easily take thousands to tens of thousands of machine hours to render.

In the animated film COCO, 29,000 lights were used in the train station scene, 18,000 in the cemetery, and 2,000 RenderMan practical lights in the Land of the Dead. Using the RenderMan API, the production team also created 700 special point-cloud lights, which expanded into 8.2 million lights. Lighting on this scale is a nightmare for rendering artists, and at first the pipeline simply could not handle it: early test renders of these light-filled shots took 1,000 hours per frame! Continued optimization brought that down to 125 hours, then 75 hours, and by final production a single frame rendered in about 50 hours. At 24 frames per second, a film has 1,440 frames per minute; for a 90-minute film that is 1,440 × 90 = 129,600 frames, and 129,600 × 50 = 6,480,000 hours, which works out to roughly 740 years of nonstop rendering on a single machine.
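A quick back-of-the-envelope script makes the arithmetic explicit. This is just a sketch of the calculation above, assuming 24 fps, a 90-minute running time, and 50 hours per frame:

```python
# Back-of-the-envelope render-time math using the COCO figures above.

FPS = 24                   # frames per second
HOURS_PER_YEAR = 24 * 365  # 8,760 hours

def total_render_time(minutes: float, hours_per_frame: float, fps: int = FPS):
    """Return (frame count, total machine hours, years on one machine)."""
    frames = int(minutes * 60 * fps)
    hours = frames * hours_per_frame
    return frames, hours, hours / HOURS_PER_YEAR

frames, hours, years = total_render_time(minutes=90, hours_per_frame=50)
print(frames)        # 129600 frames
print(hours)         # 6480000 machine hours
print(round(years))  # ~740 years on a single machine
```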

(Image: COCO, via internet)

"To render Avatar, Weta Digital used a 10,000-square-foot server farm. There are 4,000 servers with a total of 35,000 processor cores. The average time to render a single frame is 2 hours, 160 minutes of video. The overall rendering time takes 2880000 hours, which is equivalent to 328 years of work for one server! "

(Image: Avatar 2, via internet)

Rendering on this scale would be nearly impossible on ordinary computers. That is why these production teams built their own render farms, relying on powerful machines in enormous numbers to get through such a huge workload.

For small teams, studios, freelancers, and individual producers, rendering takes just as long, but tying up their own computers for that long is a time sink they cannot accept. On top of that come uncontrollable factors such as computer crashes, which often leave VFX artists unable to finish within the producer's deadline because of rendering problems. And the labor, materials, and electricity required to keep machines rendering add up to a significant expense for any production company.

This is why working with a render farm makes so much sense: it saves the production staff's time and brings a large number of machines to bear, so the work can be finished in a short time.

The emergence of cloud rendering technology has made render farms even more mature. With cloud rendering, you upload your assets to the cloud, and the rendering company's cloud computing system renders them remotely.

Another advantage of cloud rendering is that even small studios can take on the rendering of high-quality material. For the film industry, this technological shift is no less significant than the change cloud computing itself brought to the IT industry.

Taking Fox Renderfarm as an example, no matter where they are, VFX artists can complete the whole job with a cloud rendering "trilogy": 1. Upload your project; 2. Wait for the rendering to complete; 3. Download the results to your local machine. In addition, they can call on more machines when deadlines are urgent, arrange the order in which jobs render, and monitor rendering progress at any time. The price is also relatively low, which is why the service has been adopted by many film and TV VFX companies, rendering studios, and individual designers. So why not start a free trial on Fox Renderfarm now and experience how cloud rendering can speed up your work?
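For illustration only, here is what that "trilogy" could look like as a script. The service URL, endpoints, and field names below are hypothetical placeholders invented for this sketch, not Fox Renderfarm's actual API; in practice you would use the provider's own client or web interface.

```python
import time
import requests  # third-party HTTP library: pip install requests

API = "https://render-service.example.com/v1"       # hypothetical endpoint, not a real service
HEADERS = {"Authorization": "Bearer YOUR_API_KEY"}  # hypothetical auth scheme

# 1. Upload your project.
with open("scene_project.zip", "rb") as f:
    job = requests.post(f"{API}/jobs", headers=HEADERS, files={"project": f}).json()

# 2. Wait for the rendering to complete (poll the job status once a minute).
while requests.get(f"{API}/jobs/{job['id']}", headers=HEADERS).json()["status"] != "finished":
    time.sleep(60)

# 3. Download the rendered frames to the local machine.
output = requests.get(f"{API}/jobs/{job['id']}/output", headers=HEADERS)
with open("rendered_frames.zip", "wb") as f:
    f.write(output.content)
```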
