The Future of 3D Rendering is in the Cloud

- Authored by Shaun Swanson 


When we think of 3D animation, we imagine an artist sitting at a workstation plugging away in software like 3ds Max or Cinema 4D. We think of pushing polygons around, adjusting UVs and keyframes. We imagine the beautifully rendered final output. What we don't often think about is the hardware it takes to render our art.

Illustrators may get by rendering stills on their own workstation. But rendering frames for animation requires multiple computers to get the job done in a timely manner. Traditionally, this meant companies would build and manage their own render farms. But having the power to render animations on-site comes with a considerable price. The more computers there are to take care of, the less time there is to spend on animation and other artistic tasks. If a render farm is large enough, it will require hiring dedicated IT personnel.

Well-funded studios might buy brand-new machines to serve as render slaves. But for smaller studios and freelancers, render farms are often built from machines too old to serve as workstations anymore. Keeping state-of-the-art software running on old machines is often challenging. Even when newer equipment is used, there is still a high energy cost associated with operating it. The electricity required by several processors cranking out frames non-stop quickly becomes expensive. Not to mention, those machines get hot: even a single rack of slaves will need some type of climate control.
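To make that concrete, here is a rough back-of-the-envelope estimate of the power bill for a small farm. Every figure in it (node count, wattage, electricity price) is an assumption chosen purely for illustration, not data from the article.

```python
# Illustrative electricity cost for a small farm; every number here is an
# assumption for the sake of the example, not a figure from the article.
nodes = 10                 # render machines
watts_per_node = 500       # assumed average draw under full load
hours_per_month = 24 * 30  # rendering around the clock
price_per_kwh = 0.12       # USD, varies widely by region

kwh = nodes * watts_per_node * hours_per_month / 1000
print(f"{kwh:.0f} kWh/month ~= ${kwh * price_per_kwh:.0f}/month, before cooling")
```

Under these assumed numbers, that works out to roughly 3,600 kWh and a few hundred dollars per month before any climate control is factored in.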

These issues have led many animation companies to see the benefits of rendering in the cloud. As high-speed internet access becomes available across the globe, moving large files online has become commonplace. You can upload a project to a render service that takes on the headaches for you: they monitor the system for crashes, install updates and patches, and worry about energy costs. Plus, there is the speed advantage. Companies dedicated to rendering can devote more resources to their equipment, so their farms have more nodes and their hardware is newer and faster.
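On the artist's side, that workflow mostly comes down to packaging the project so every texture and cache travels with the scene. Below is a minimal Python sketch of that packaging step; the file names and folder layout are hypothetical, and the actual upload then happens through whatever portal or client application the chosen service provides.

```python
# A minimal sketch of the "prepare for upload" step, assuming a hypothetical
# scene file and asset folder; it is not any particular service's API.
import zipfile
from pathlib import Path

def package_scene(scene_file: str, asset_dir: str, archive_path: str) -> Path:
    """Bundle the scene file and its assets into one archive for upload."""
    archive = Path(archive_path)
    with zipfile.ZipFile(archive, "w", zipfile.ZIP_DEFLATED) as zf:
        zf.write(scene_file, arcname=Path(scene_file).name)
        for asset in Path(asset_dir).rglob("*"):
            if asset.is_file():
                # Keep the folder layout so texture paths still resolve on the farm.
                rel = asset.relative_to(asset_dir).as_posix()
                zf.write(asset, arcname=f"{Path(asset_dir).name}/{rel}")
    return archive

# Hypothetical usage -- the resulting archive is what gets uploaded through
# the service's web portal or client application:
# package_scene("shot_010.c4d", "assets", "shot_010_upload.zip")
```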

The solution cloud rendering provides couldn't come soon enough. There seems to be no end to the increasing demands placed on render hardware. Artists and directors are constantly pushing the limits of 3D animation, and scenes that would have been shot traditionally a few years ago are now created with computer graphics to give directors more control. With modern 3D software it's easier to meet those creative demands: crashing waves in a fluid simulation, thousands of knights rushing toward the camera, or millions of trees swaying in the wind can be cooked up on a single workstation. But even as software improves, processing those complex scenes takes more power than ever before.

It's not only artistic demands from content creators putting render hardware through its paces. The viewing public has enjoyed huge improvements in display resolution in recent years, and the definition of high definition keeps expanding. What many consider full HD at 1920 x 1080 is old news; many platforms now support 4K at 4096 x 2160 (DCI 4K; consumer UHD displays use 3840 x 2160). In many cases, that's large enough for high-resolution printing!
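The arithmetic behind that jump is simple but unforgiving: all else being equal, render cost scales roughly with the number of pixels that have to be sampled and shaded. A quick calculation using the resolutions above:

```python
# Pixel counts for the resolutions mentioned above.
full_hd = 1920 * 1080   # 2,073,600 pixels per frame
dci_4k = 4096 * 2160    # 8,847,360 pixels per frame
print(f"4K carries {dci_4k / full_hd:.1f}x the pixels of full HD")  # ~4.3x
```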

The public is also getting used to higher frame rates. For years, the industry standard for film has been 24 frames per second (fps), but in 2012 Peter Jackson shot The Hobbit: An Unexpected Journey at 48 fps. While some prefer the classic look of 24 fps film, animators have to prepare for higher frame rates becoming standard.
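The math for the farm is equally blunt: doubling the frame rate doubles the number of frames to compute. A short illustrative calculation (the 90-minute runtime is an assumed example, not a figure from the article):

```python
# Illustrative frame counts for a hypothetical 90-minute feature
# (the runtime is an assumption for the sake of the example).
runtime_seconds = 90 * 60
print(runtime_seconds * 24)   # 129,600 frames at 24 fps
print(runtime_seconds * 48)   # 259,200 frames at 48 fps -- twice the render load
```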

Everything is pointing towards cloud rendering becoming the norm. While studios can put up with the hassle of their own render farms, there is little need to when companies like Rayvision can do it better. Managing your own render farm could soon be as uncommon as hosting your own website. It's something that is simply better done by a dedicated company. Welcome to the age of cloud rendering.

About the Author: Shaun Swanson has fifteen years of experience in 3D rendering and graphic design. He has used several software packages and has a broad knowledge of digital art ranging from entertainment to product design.

This article was originally posted at http://goarticles.com/article/The-Future-of-3D-Rendering-Is-in-the-Cloud/9416094/
