A Guide to the Career Growth of VFX Compositors (1)
As the leading cloud rendering services provider in the CG industry, Fox Renderfarm has an outstanding team with over 20 years' experience in the CG industry. Team members come from Disney, Lucasfilm, DreamWorks, Sony, and more. With professional services and industry-leading innovations, they serve leading VFX companies and animation studios from over 50 countries and regions, including two Oscar winners.
In this article, we will share a guide to the career growth of VFX compositors from Zhiyong Zhang, Compositing Technical Director at Base FX (Base Media is a leading visual effects and animation studio in Asia).
Part 1 Introduction
What is compositing?
There are many different explanations of compositing on the Internet. Since it is difficult to define in the abstract, I will analyze compositing through some example shots.
Take the scene in "Monster Hunt": the foreground is made with a 3D model, and the background is a combination of concept paintings and photographs. That means doing some matte painting (MP), color adjustment, and element placement, and adding the live-action footage. Finally, all the pictures are integrated into one, which is the compositing process.
The professional value of compositing
Compositing is the final stage of visual effects in post-production. You need to blend and match three-dimensional elements with the live-action plate through compositing techniques and methods, and add 2D or 2.5D elements to make the picture more interesting. Compositing assembles everything the entire visual effects team has made. Hence, compositors are responsible to the entire team, striving to adjust each shot to its final optimal effect. This is the professional value of compositing.
The artistic charm of compositing
For me, every adjustment of color, defocus, interactive light, and so on makes the scene more realistic or better-looking. Every step of the operation improves the scene, bringing compositors hope and happiness. It is awesome to do better in the time allowed and finally show something beautiful. This is the artistic charm of compositing.
Part 2 Six basic skills of compositors
The first basic skill is erasing (paint-out). Many plates shot on green screen carry tracking markers, or contain things that could not be removed from the set. We need to paint these out, which is very basic but important.
The second skill is rotoscoping and chroma keying. Chroma keying extracts the subject from a green screen, while rotoscoping is essentially keying without a green screen: the matte is drawn by hand, frame by frame.
The third skill is color matching. When adding an element to a live-action plate, changing the color of the element is really about matching the plate material, not free creation.
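As a rough illustration of the idea behind color matching, a simple statistical transfer can shift a CG element's per-channel mean and contrast toward the plate. This is a minimal numpy sketch of the concept, not any studio's actual pipeline:

```python
import numpy as np

def match_color(element, plate):
    """Shift each RGB channel of a CG element so its mean and standard
    deviation match those of the live-action plate (simple statistical
    color transfer)."""
    matched = np.empty_like(element, dtype=np.float64)
    for c in range(3):
        e = element[..., c].astype(np.float64)
        p = plate[..., c].astype(np.float64)
        # Scale contrast, then shift to the plate's mean level
        scale = p.std() / max(e.std(), 1e-8)
        matched[..., c] = (e - e.mean()) * scale + p.mean()
    return matched
```

In practice a compositor does this by eye with grade nodes, but the goal is the same: the element's blacks, whites, and color cast should sit where the plate's do.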
The fourth skill is matching the defocus, or depth of field. The plate itself will have defocus, so the compositor also needs to understand photography; otherwise it may become a bottleneck. The ideas of the director and the director of photography are presented in the footage, and we use compositing to re-express in CG what they want to express. We are matching rather than recreating.
The fifth skill is tracking. Most compositors do 2D tracking, such as attaching billboards or screen patches, and sometimes they also do 3D tracking.
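The core of 2D tracking is locating the same patch of pixels frame after frame. A minimal numpy sketch using brute-force normalized cross-correlation illustrates the principle; production trackers are far more sophisticated (sub-pixel refinement, model fitting, occlusion handling):

```python
import numpy as np

def track_patch(frame, template):
    """Find the (row, col) of the best match for a small grayscale
    template inside a frame, by normalized cross-correlation."""
    th, tw = template.shape
    t = template - template.mean()
    t_norm = np.sqrt(np.sum(t ** 2))
    best_score, best_pos = -np.inf, (0, 0)
    for y in range(frame.shape[0] - th + 1):
        for x in range(frame.shape[1] - tw + 1):
            w = frame[y:y + th, x:x + tw]
            wz = w - w.mean()
            # Normalized correlation: 1.0 means a perfect match
            score = np.sum(wz * t) / (np.sqrt(np.sum(wz ** 2)) * t_norm + 1e-8)
            if score > best_score:
                best_score, best_pos = score, (y, x)
    return best_pos
```

Running this for each frame of a sequence yields the track a compositor would use to pin a billboard or screen patch to the footage.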
The last one is noise (grain) matching. For some demanding projects, it is necessary to check the red, green, and blue channels one by one, or even zoom into local areas for comparison. It also depends on the delivery platform. Generally, if a project is broadcast on TV, the noise is not easy to see; in a cinema, as long as the projection is bright enough, the noise in some shots can be seen easily. Noise matching makes the picture more realistic.
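At its simplest, grain matching means measuring the plate's noise level and adding a comparable amount of synthetic grain to the clean CG render. The numpy sketch below is an illustration only; real film grain is correlated across channels and frequencies, and compositing packages ship dedicated grain tools:

```python
import numpy as np

def add_matching_grain(cg, plate_noise_std, seed=0):
    """Add Gaussian grain to a clean CG image (values in 0..1) so its
    noise level roughly matches the measured grain of the plate."""
    rng = np.random.default_rng(seed)
    grain = rng.normal(0.0, plate_noise_std, size=cg.shape)
    # Clip back to valid image range after adding grain
    return np.clip(cg + grain, 0.0, 1.0)
```

The plate's noise level itself can be estimated from a flat, featureless region of the footage by taking the standard deviation of its pixel values.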
Part 3 Standards for cinema-level film compositing requirements
The standards here are not officially stipulated standards but are just some common standards that I personally summarize based on experience. I divided the cinema standards for film compositing into two types: technical requirements and artistic requirements.
First of all, check whether the color of the original plate has been changed. If you are not careful, or have shifted other parts of the frame, including the color of the plate, for some other reason, it may cause client dissatisfaction.
Secondly, check whether the maximum and minimum values of the frame are consistent with the source material. Of course, these values are judged not per shot but across all the material of a project.
Thirdly, whether the metadata is correct. If editorial requires metadata to be accurately restored, it needs to be written back to the material correctly. Generally, at the beginning of the project, you need to agree with DI or editorial on how it should be set back.
Fourthly, whether the motion appears to stutter is also a technical requirement. Any retiming should be discussed with the editor. If editorial did not ask for retiming, what looks like a stutter may simply be a fast cut between the front and rear scenes, which is one kind of editing method; if you smooth it out on your own, reverting to the original cut may cause problems. The last point is whether there is any obvious flaw in the picture, such as a black border on the edge of the frame or a stretched edge on the live-action plate.
The artistic requirements are significantly different from the technical requirements above. We check whether the detail in the keyed edges is enough to maintain the authenticity of the picture. Within a normal budget, the edge details can be kept authentic; if the edges look fake, the whole picture may fall short.
Are the bright and dark parts of the frame matching the plate within an acceptable range? The dark and bright parts here refer more to brightening everything in the viewing environment to see whether the added element has the same color as the live-action plate and whether the degree of color cast is the same. Budget aside, as a compositor the only requirement is to do things better, so try to match it.
Then, whether the compositing elements and the defocus and depth of field of the background match the live-action plate. The depth of field of a plate is sometimes very shallow and sometimes very deep; generally there is a range, determined by the lens and the camera. When we add an element into it, we change its defocus according to its depth.
In addition, we must also consider whether the simulation of lens effects meets the standard of the real shoot. With a normal spherical lens, the defocused bokeh is circular (1:1); with an anamorphic lens, the bokeh is vertically stretched, closer to a 2:1 oval. While compositing, this needs special attention to match.
The next thing to consider is whether added smoke, flames, and other atmospheric elements are truly fused into the shot. When their light and shadow, defocus, and sharpness match the whole, the elements that really melt into the picture look realistic.
The last thing to consider is interactive effects: the corresponding interaction between the live-action material and the CG elements. When we composite, in addition to shadows there are light interactions, and we should consider whether to add them.
In the process of compositing, we need to give ourselves opportunities to practice and refine our work; otherwise we cannot make progress. Audiences are very smart now, having watched a lot of blockbuster movies. Therefore, within the time and cost constraints of cinema-level films, we should try our best to be realistic. If the audience feels satisfied in the cinema, the work is a success.
How does the studio directly produce the final shot? (2)
Behind The Scenes
The rising expressive power of real-time rendering
Jon Favreau's biggest concern about The Mandalorian was whether the game engine could reach the visual-effects rendering level of Star Wars on a TV budget. But when he saw a sandy environment that looked like something out of Episode II, he abandoned that doubt.
A sandy environment built from photo-scanned assets and rendered in real-time
In the virtual studio, all the scene elements and lights can be switched and updated freely while the sense of reality is maintained. Part of the virtual environment of The Mandalorian directly uses assets created in Unreal Engine for video games, saving a lot of asset construction time and opening up the possibility of asset sharing between the two industries.
Real-time rendering can now convincingly present the highly reflective materials often involved in science fiction productions
The virtual scenes on the set sometimes fool the eyes of the people standing there. Jon Favreau said, "There was a person who came to the studio and said, 'I thought you wouldn't build the whole set here.' I said, 'No, we didn't build it here; in fact, there are only tables.' Because the LED wall is rendered relative to the camera, there is parallax. But even if you glance at it casually from beside the camera, you will still think you are looking at a live-action set."
Entering the virtual studio, the actor has entered the virtual world and the final scene
The changes of Hollywood led by virtual production technology
In addition to shortening the production cycle and stretching the budget further, virtual production has also brought revolutionary convenience to actors and other departments. Actors can see the environment in real-time as they perform and interact.
"This not only helps the cinematography but also helps the actors understand the surroundings, such as the horizon. It also provides interactive lighting," Jon Favreau said, describing it as "a huge breakthrough".
Virtual production can also help fantasy characters interact with real people
In addition to The Mandalorian, film projects such as The Lion King and The Jungle Book have already begun to use game engines in production, though still mainly at the previs stage.
Besides, Steven Spielberg and Denis Villeneuve used Unity to help achieve their visual effects in the production of Ready Player One and Blade Runner 2049 respectively. This method is gradually replacing the usual storyboard approach of hand-drawn, comic-book-like panels.
Spielberg at the monitor, viewing the real-time composite of the footage being shot
In the virtual production process, VR technology is rapidly becoming a viable option for large studios and production companies. Director Jon Favreau used a lot of virtual reality technology in the recent remake of The Lion King, essentially creating the movie's entire world in virtual reality.
Lion King uses a lot of real-time virtual preview technology
The popularization of virtual production technology
In terms of budget, virtual production and LED display equipment are still relatively expensive at this stage.
Complete real-time compositing and film production at a lower cost
Jon Favreau indicated that virtual production technology is a major leap forward for the film industry, allowing creators to make creative decisions before a project starts rather than reworking things during or after completion. Nowadays, more people can see what the shot looks like directly during cinematography. More people can contribute their own ideas and learn from each other because they can see the final intent in advance.
"In the past, when you went to a set and left the green screen behind, you no longer cared about it. But now, we have so many talented people and have accumulated more than one hundred years of filmmaking experience. Why should we give that up just because we change the shooting system? Let's continue to inherit the skills of film artists and develop tools that adapt to the times."
Virtual production technology will gradually become the routine process of film production in the future
How does the studio directly produce the final shot? (1)
The Road to Innovation in Virtual Production
How does a studio directly produce the final shot? The production team of Disney's new series The Mandalorian used real-time rendering technology to construct digital virtual scenes in advance.
The Mandalorian, the Star Wars spin-off series launched on Disney's streaming platform
The Mandalorian, the first live-action Star Wars spin-off series, premiered on Disney+ on November 12, 2019, and was also the first blockbuster series made with Unreal Engine.
The timeline of The Mandalorian is set five years after Return of the Jedi, after the fall of the Empire and before the emergence of the First Order. The plot revolves around the travels of a lone bounty hunter in the outer reaches of the galaxy, far from the authority of the New Republic.
Epic Games, the development company of Unreal Engine, appeared in the credits
It is worth noting that Epic Games, the developer of the Unreal Engine, appeared on the list of thanks for the production of The Mandalorian, followed by a list of the entire virtual production team.
Epic Games is best known for hit games such as Fortnite, while its Unreal Engine powers AAA titles including Kingdom Hearts III, Dragon Ball FighterZ, and Star Wars Jedi: Fallen Order. In recent years, the company has been working with Lucasfilm to bring the real-time rendering capabilities of Unreal Engine into the development of Disney's streaming content for live-action production.
The backgrounds of many shots in The Mandalorian are presented directly on the LED wall. Jon Favreau is not only the producer and screenwriter of The Mandalorian but also a pioneer in using game engines for virtual production. At the SIGGRAPH 2019 computer graphics conference in Los Angeles, Jon Favreau shared an efficient way to use a game engine to help a film perform a virtual preview (previs).
"We used the V-cam system, which is essentially making movies in VR. All the results are delivered to the editor. It is as if we are editing part of the movie in advance, the purpose of which is to achieve the pre-conceived goal," Jon Favreau said.
Use VR technology and real-time rendering technology to view the virtual environment of the movie
An evolution from the virtual preview tool to the final image production tool
The virtual production system no longer only provides virtual preview services for the film. On The Mandalorian set, the crew used an LED video wall as the background for live in-camera composites. The SFX team projects pre-rendered content, such as environments, onto the LED wall as a dynamic green screen. The virtual studio is a cube of virtual content enclosed by four LED walls. It is driven by Unreal Engine, and the photographer can shoot final images directly in camera. The content displayed on the LED wall is adjusted and transformed in real-time according to the position of the camera.
Live actors and LED walls in the virtual studio
"We present real-time rendered content in the camera and get composited shots directly. Considering zooming and other requirements, for some types of cinematography we are not only able to interact on the spot but can also see the lighting, interactive light, layout, background, horizon, and more directly in the camera. Hence, these don't need to wait for post-production," Jon Favreau noted.
The Witcher 3 Character Model Making Process
Fox Renderfarm, the best render farm in the CG industry, brings you a sharing of the character model making process and ideas. Many people encounter a lot of problems when they first come into contact with CG character art. For example, given a new character model to design, where should you start? What should it be made of? Even with very experienced artists teaching from experience, beginners may still find it impossible to start.
So how should this situation be resolved? We take the character Vesemir in The Witcher 3: Wild Hunt as an example. Why this character? Because Vesemir's body shape, materials, clothing, hair, and so on cover many types. The case contains a complete process, from reference gathering through final production, that can apply to most project work.
Before production, it is necessary to clarify the production goals: the model's style, its level of detail, and the final quality to be achieved are all very important. Taking Vesemir in The Witcher 3: Wild Hunt as an example, you need to gather references before modeling: armor references, material and texture references, and even the style of the clothing and its sewing methods. Everything that needs to be modeled needs detailed reference.
The basic structure of the model is very important. Although the character's clothing covers the body, the structure of the body cannot be ignored; be sure the body structure is correct. (On other projects, saved base body models can save some time here.)
The same clothes worn by different people give a different feel. The red lines outline the silhouette of the desired character, which is then used to modify the model.
This is also the most critical step in making a model. If the shape is not accurate enough, then even with clothes on, the model will look fake and the whole will be uncoordinated.
The picture above shows an incorrect example. On cloth, different materials in different areas have different thicknesses, and the wrinkle direction differs accordingly. The model should also account for the volume and structure of these elements.
Regarding topology and UVs, the focus is on the low-poly model. Especially for the human body, the edge loops and faces should follow the direction of muscles and bones. An accessory may have a complicated shape but be extremely small, so you should not give it too many polygons. The low-poly model will always lose some detail compared with the high-poly, so it is necessary to balance between the two.
The production of textures is a very important part. You need to correctly analyze how surfaces change in the real world, then reproduce those changes with layers, and understand how each texture drives a different attribute of the object's shader. Substance Painter is very convenient for texture production. Then, depending on the production pipeline, the maps correspond to the object's normal, color, specular, and glossiness attributes (a specular workflow), or its normal, color, metallic, and roughness attributes (a metalness workflow).
Make the base textures in Substance Painter, then add different details according to each prop, such as breakage and wear, and distinguish the materials of the props: metal, leather, weapons, etc.
The lighting used a set of previously made lights (which can save some time), and the texture can be produced while testing and observing the effect.
For the final render to look realistic, the details are very important.
Why do you need to use a render farm?
With the rapid development of 3D movies in recent years, VFX films have received more and more attention. Looking at the box office, the top five films, Avengers: Endgame, Avatar, Titanic, Star Wars: The Force Awakens, and Avengers: Infinity War, are all inseparable from VFX production. And Disney's 3D animated Frozen films swept up $2.574 billion worldwide, setting a new record for the global animated film box office.
image via internet
Why do you need to use a render farm? Rendering is indispensable behind these movies, and it is almost always done by render farms of all sizes.
Rendering is the final stage of 3D animation production, and it is a very time-consuming step. An 80-minute animated movie can often take thousands to tens of thousands of machine-hours to render.
In the animated film Coco, 29,000 lights were used in the train station scene and 18,000 in the cemetery, while 2,000 RenderMan practical lights were used in the Land of the Dead. The production team used the RenderMan API to create 700 special point cloud lights, which expanded to 8.2 million lights. Light counts on this scale are a nightmare for rendering artists; a render of this size at first seemed impossible to complete. In early tests, these light-filled shots actually took 1,000 hours per frame! Continued research shortened that to 125 hours, then 75 hours, and finally about 50 hours per frame in final production. A movie runs at 24 frames per second, or 1,440 frames per minute. For a 90-minute film, that is 1,440 × 90 = 129,600 frames, and 129,600 × 50 = 6,480,000 hours, which works out to roughly 740 years on a single machine.
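The arithmetic above can be checked in a few lines:

```python
# Back-of-envelope render budget for a 90-minute film at 24 fps,
# using the roughly 50-hours-per-frame figure quoted for Coco's
# heaviest shots.
FPS = 24
MINUTES = 90
HOURS_PER_FRAME = 50

frames = FPS * 60 * MINUTES            # 129,600 frames
total_hours = frames * HOURS_PER_FRAME # 6,480,000 machine-hours
years = total_hours / (24 * 365)       # centuries on a single machine
```

This is exactly why the work is spread across thousands of machines in a render farm: divided over, say, 10,000 nodes, those centuries collapse into weeks.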
image via internet
"To render Avatar, Weta Digital used a 10,000-square-foot server farm: 4,000 servers with a total of 35,000 processor cores. The average time to render a single frame was 2 hours. For the film's 160 minutes of footage, overall rendering took 2,880,000 hours, equivalent to 328 years of work for a single server!"
image via internet
If the rendering were done on ordinary computers alone, this would be almost impossible. So the production team created its own render farm, whose powerful machines and sheer numbers could complete such a huge amount of work.
For small teams, studios, freelancers, and individual producers, rendering takes just as long. Tying up their own computers for days of rendering is time-consuming and hard to accept. On top of that come uncontrollable factors such as computer crashes, so VFX artists often cannot finish within the producer's deadline because of rendering problems. Moreover, the labor, hardware, and electricity required to keep render machines running are a big cost for a production company!
So it is very necessary to work with render farms. Not only does it save the production staff's time, it also puts a large number of machines to work within a limited window, completing the job in a short time.
The emergence of cloud rendering technology has made render farms more mature. Cloud rendering means uploading assets to the cloud and rendering remotely on the rendering company's cloud computing system.
Another advantage of cloud rendering is that small studios can also afford to render high-quality output. In the film industry, this technological change is no less significant than the change cloud computing itself brought to the IT industry.
Taking Fox Renderfarm as an example, no matter where the VFX artist is, they can use the cloud rendering service to complete the "trilogy": 1. Upload your project; 2. Wait for the rendering to complete; 3. Download the result to the local machine. In addition, they can call on more machines depending on urgency, arrange the order of rendering jobs, and monitor rendering progress at any time. At the same time, the price is relatively low, which has been accepted by many film and television VFX companies, rendering studios, and individual designers. So why not have a free trial on Fox Renderfarm now and experience the amazing cloud rendering services that speed up your rendering?
Selected Advertising Works from shots Awards The Americas 2020
The shots Awards The Americas 2020, an extension of the shots brand that celebrates and rewards truly exceptional creative work, announced its shortlist earlier this year. Let's get inspired by the shortlist for Best Use of Animation in a Commercial.
- AT&T - Train
- Agency: BBDO
- Production Company: Furlined
- Director: Dougal Wilson
“Train” combines elements from Westerns and family films to let viewers know they can get movies and more with AT&T unlimited plans. The film uniquely tells the tale of a Wild West train heist in two distinct parts, mashing together suspenseful live action followed by whimsical stop-motion animation to surprise and delight.
The animation director on the project was Paul Harrod, who was the award-winning production designer on Wes Anderson’s Oscar-nominated Isle of Dogs.
- Corona - The History of 'La Cerveza Mas Fina'
- Agency: Observatory
- Director: Nicolas Ménard
- Production Company: Nexus Studios
'The History of La Cerveza Mas Fina,’ created by creative agency Observatory in Los Angeles, is a Corona Spanish-language campaign for the Mexican market.
From post-revolution Mexico and the birth of Corona, the film leads us through the country’s Golden Age of Cinema and Peso Crisis to see Corona’s international growth, before we arrive in modern-day Mexico.
Nexus Studios worked with Andy Gent’s team at Arch Model Studio to design, create and animate a cast of 71 miniature characters, 8 beautifully detailed sets and 350 individually made miniature glass bottles.
Photo credit: Nexus Studios
Please watch the video via this link: https://vimeo.com/361198280
- IKEA Canada - Stuff Monster
- Agency: Rethink
- Director: Mark Zibert
- Production Company: Scouts Honour
IKEA Canada reveals its approach to a more sustainable retail future. In its advertising campaign “Stuff Monster”, IKEA shows the beautiful possibilities of re-using our old furniture.
The monster in the ad is a metaphor for the belongings we accumulate that weigh us down. The ad follows the monster, which is literally made up of old IKEA furniture, as it gives away pieces of itself to be reused. When the monster sees its belongings being given new life it becomes lighter and happier, ultimately revealing the human behind the monster as she sets out the last of her furniture with a ‘free’ sign.
- 2K - Gift of Mayhem
- Director: Eddie Alcazar
- Production Company: Chromista
With a background in game development, and a full stop-motion feature currently in production, director Eddie Alcazar created a stop-motion holiday diorama for Borderlands 3, a ‘shoot and loot’ game developed by Gearbox Software.
Eddie brought a powerful hybrid of experience to both the creative and highly technical aspects of the piece. “Stop motion is a very old and time-intensive art form, but with this project, we had the opportunity to bring it into an incredibly contemporary context and to an audience who may be less familiar with the medium,” the director says.
- BMW - Legend
- Agency: Goodby Silverstein and Partners
- Director: Dante Ariola
- Production Company: MJZ/USA
Behold the Legend of Ol’ McLanden. Not every hero has what it takes to battle the bulls of the bank, beat Roy the Destroyer, and outrace the Rogues of Razorback Ridge. Then again, not every hero is sitting behind the wheel of the largest BMW ever created. Journey with McLanden and the epic X7, as they make everyday legendary.
Watch the making-of:
- Xfinity - A Holiday Reunion
- Agency: Goodby Silverstein and Partners
- Director: Lance Acord
- Production Company: Park Pictures
37 years ago, E.T. landed in theaters and in the hearts of moviegoers. Now, thanks to Xfinity, the ‘Extra-Terrestrial’ is back to bring everyone some nostalgic feel-good in time for the holidays.
Comcast NBCUniversal is continuing the story of friendship by reuniting ET and Elliott – played by a grown-up Henry Thomas. In the story, the iconic characters from the Universal Pictures classic tell the updated tale of connection using technology from Xfinity and Sky. Watch the making-of:
- Dick's Sporting Goods-The New Kid
- Agency: Anomaly
- Director: Shawn Levy
- Production Company: Pacific Rim Films
DICK’S Sporting Goods presents “The New Kid.” See our hero discover the true magic of sports when her first overnight turns out to be anything but a silent night.
- DisneyJunior - Mickey Mouse & The Magical Holiday Bag
- Agency: Walt Disney Studios Motion Pictures
- Director: Harry Chaskin
- Production Company: Stoopid Buddy Stoodios
A collaboration between Disney Junior and Stoopid Buddy Stoodios, the interstitials find the sensational six celebrating the holidays together with a visit to a wintery cabin where they are met with the magic of the season.
- Seamless - Fueled This City
- Agency: BBH
- Director: Andy Baker
- Production Company: Hornet
New York’s No.1 delivery app, Seamless, turns 20 this year. Which means we’ve delivered through it all. Y2K. Blizzards. Rainbow bagels. Everything and anything. All to fuel this city and every New Yorker in it.
- Crown Royal - Apple
- Agency: Anomaly
- Director: Ingi Erlingsson
- Production Company: Golden Wolf
Golden Wolf, an Emmy-nominated studio founded in 2013, takes us to another dimension with this spot for Crown Royal Apple.
- Oreo - Oreo x Game of Thrones Title Sequence
- Agency: 360i
- Director: Andy Hall | (Elastic)
- Production Company: Elastic
Mondelez’s Oreo has meticulously reconstructed the stop-motion titles of the Game of Thrones franchise with 2,750 cookies to promote one of its largest-ever brand collaborations.
The reworked version of the lengthy title sequence was produced by Elastic, the creators of the original Game of Thrones sequence.
You can see the whole list here. Winners will be announced via a live stream on 15th May at SHOTS.NET. Let's wait and see.
KeyShot 9 Tutorial: Christmas Scene Rendering
This article is a 3D rendering tutorial about a Christmas scene, shared by 3D artist Drown and organized by Fox Renderfarm, the leading cloud render farm in the CG industry. Because there are so many items in the scene, this tutorial focuses on the lighting of the scene and the techniques for creating a Christmas atmosphere; material adjustments are only briefly described.
Use KeyShot 9 to open the arranged Christmas scene, add a camera, adjust to a suitable angle, and save. The next step is to give the main subject, the crystal ball, a glass material, and switch the lighting mode to product mode so that light can pass through the glass ball and illuminate the interior details.
The tablecloth in the scene and the curtains in the background use red velvet; KeyShot 9's built-in velvet material was chosen to bring out the atmosphere of the Christmas scene.
Next, we need to find a suitable HDRI. Because it is a Christmas scene, we need an indoor HDRI with many small bright light sources to simulate the atmosphere of decorative chandeliers on Christmas Eve. After applying the HDRI, rotate it to an appropriate angle and adjust the brightness, contrast, hue, and saturation to balance the warm and cool tones of the scene.
Enter the image panel, change the image style to photographic, and adjust the exposure, contrast, and so on to further enhance the warm-cool and light-dark contrast of the scene.
Next, open the HDRI canvas and manually add small warm light sources to cover the cool incandescent lights in the original image, further strengthening the Christmas Eve atmosphere. Take care to keep some cool light in the scene so that it does not become too warm. At the same time, adjust the brightness of the small warm sources appropriately and change their blending mode to Alpha so that they properly cover the cool incandescent lights in the original image.
In order to highlight the crystal ball of the subject, add a sphere to the scene and set the material as a spotlight. After adjusting the position, illuminate the crystal ball.
After further fine-tuning of the various lighting parameters, the atmosphere of the picture is in place, but the image is now a little too warm and lacks a bit of realism.
Add a cool rectangular light on the right side of the frame to balance the warm atmosphere so that the image does not look false. Be sure to change the rectangular light's blending mode to Blend; otherwise the glass ball will directly reflect a pure white rectangle and the realism of the picture will be lost.
Once the atmosphere is in place, you can start adjusting the materials, beginning with the snow inside the crystal ball. Use a noise texture node to add bumps and snow color to its surface. If the viewport becomes sluggish, hide the other items in the scene, including the spotlight.
Next, use a suitable texture map to adjust the material of the lounge chair inside the crystal ball. Anything can be placed here, not necessarily a chair, as long as the material and color fit. A brighter color scheme is recommended to match the atmosphere of the scene.
Next, adjust the material of the Christmas tree.
After lighting the bulb material on the Christmas tree, check the current effect. Note that the crystal ball must be given a transparent glass material, otherwise light will not penetrate into the scene inside it.
Use a matching wood texture map to adjust the wooden base material, or change it to your taste, for example by using a black plastic base instead.
Then open the textures of the glass ball and add the fingerprint texture that ships with KeyShot 9. After adjusting its position and size, turn off texture repeat and set the fingerprint texture as the diffuse texture. Connect the fingerprint texture to a label and use it as the label's color. The brightness must be adjusted here, and reduced as much as possible, so that the fingerprints are only faintly visible.
Use a scratch map, adjusting its contrast and color to control the opacity of the effect on the glass. Repeat this step to create both scratches and bumps on the glass. Keep the scratches subtle, otherwise the glass will look fake.
Next, adjust the material of the crystal ball's name tag.
Then use Photoshop to create a piece of text. Use a black background, pure white type, and a horizontal composition. After saving the black-and-white image, use the 3D > Generate Normal Map option in Photoshop's Filter menu to generate a bump texture.
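Photoshop's Generate Normal Map filter derives surface normals from image brightness. A rough approximation of that idea (an illustrative sketch, not Photoshop's actual algorithm) treats the black-and-white text as a heightmap and converts its gradients into a tangent-space normal map:

```python
import numpy as np

def normal_map_from_height(height, strength=1.0):
    """Approximate a Generate-Normal-Map filter: finite-difference
    gradients of the grayscale height become the XY of the normal."""
    gy, gx = np.gradient(height.astype(float))
    nz = np.ones_like(height) / strength
    n = np.stack([-gx, -gy, nz], axis=-1)
    n /= np.linalg.norm(n, axis=-1, keepdims=True)
    # Pack [-1, 1] into [0, 1] RGB, the way normal maps are stored.
    return n * 0.5 + 0.5

# White text on black: a flat 16x16 image with one bright square "glyph".
text = np.zeros((16, 16))
text[6:10, 6:10] = 1.0
nmap = normal_map_from_height(text, strength=2.0)
```

Flat black areas map to the neutral normal color (0.5, 0.5, 1.0), while the edges of the white lettering get tilted normals, which is what produces the embossed look on the tag.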
Drag the texture onto the tag's material, create a new material for the lettering, and adjust its color to your preference. Then use the black-and-white image as the opacity map and the normal map as the bump map. This completes the material adjustment for the name tag.
Simply adjust the material of the tag's board; its color should be kept darker so the tag is not too conspicuous. Then use the fuzz node in the geometry settings to add fluff to the tablecloth; the surface material of the node can directly reuse the original red velvet material.
To prevent the fluff from passing through the models resting on the cloth, render a simple top view, then paint a black-and-white mask in Photoshop following the positions of the items. Soften the edges of the mask with a Gaussian Blur filter, then use it as the length texture of the fuzz node. Where the mask is black, the fluff length is 0, so after the geometry node is re-executed the fluff no longer passes through the models.
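The blurred mask simply scales the fluff length down to zero under each object, tapering off at the softened edges. A minimal sketch of the idea (illustrative Python, not a KeyShot feature; a box blur stands in here for Photoshop's Gaussian Blur):

```python
import numpy as np

def box_blur(mask, r=2):
    """Cheap stand-in for a Gaussian Blur: average each pixel over a
    (2r+1) x (2r+1) window using edge padding and summation."""
    padded = np.pad(mask, r, mode="edge")
    out = np.zeros_like(mask, dtype=float)
    for dy in range(-r, r + 1):
        for dx in range(-r, r + 1):
            out += padded[r + dy: r + dy + mask.shape[0],
                          r + dx: r + dx + mask.shape[1]]
    return out / (2 * r + 1) ** 2

def fuzz_length(mask, max_len=5.0):
    """Length texture: black (0) = no fluff, white (1) = full length."""
    return mask * max_len

# Top-view mask: a black square where an object sits on the tablecloth.
mask = np.ones((32, 32))
mask[12:20, 12:20] = 0.0          # object footprint painted black
soft = box_blur(mask, r=2)        # soften the edge so lengths taper off
lengths = fuzz_length(soft)
```

Under the object the length is exactly zero, so no strand can poke through, while the open tablecloth keeps the full fluff length.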
The curtain can use a color gradient node; change the gradient type to the view direction to enhance the sense of volume in the fabric. The background wall should be neither too dark nor too bright, otherwise it becomes too eye-catching and destroys the atmosphere of the scene.
For the candle's wrapping paper, use a glass material with a reduced refractive index to achieve the look of transparent plastic wrap. Then use a translucent material for the candles themselves; the colors can be chosen freely.
When coloring the various gift boxes, it is recommended to give each box its own color; for the cool colors, blue works well to echo the cool light in the scene.
Next, adjust the surfaces of the gift boxes to create a bumpy texture. The effect does not need to be complicated; simpler is fine.
Next, add a sphere inside the crystal ball to create the falling-snow effect. Finally, fine-tune the composition and lighting, then render the image.
Fox's Got Talent 3D ‘Easter Egg’ Challenge
Wanna show your talent on the world stage while winning big prizes?
Today’s the day! Share your fun stories or crazy ideas about ‘Easter Eggs’ through 3D renders with us! The top 3 artworks will be featured and promoted across multiple online channels, and their creators will receive generous render coupons from the world-leading cloud render farm!
Theme: Easter Egg
Spring has sprung, and Easter is coming soon. Speaking of Easter eggs, you may picture rabbits holding colorful eggs, or think of all kinds of candies and chocolates. And if you are a movie fan, lots of hidden surprises will spring to mind… Whatever Easter Egg means to you, set your imagination free, create a 3D render, and tell us your story.
Enjoy your creation and happy render!
Time for entries: Feb. 26th - Mar. 30th (UTC+8)
Winners announcement time: Apr. 6th (UTC+8)
3 artworks will be selected and awarded with fast and easy cloud rendering services provided by Fox Renderfarm.
- Fox Renderfarm: render credits worth US $500
- Fox Renderfarm: render credits worth US $300
- Fox Renderfarm: render credits worth US $200
Besides, the winning artworks will gain a great amount of exposure and publicity.
- Interview with Fox Renderfarm
- Advertisement and promotion on our official website, social media accounts, and newsletters.
- Fox Renderfarm works closely with multiple excellent CG studios and artists worldwide; come and let your talent shine on the global stage!
How to submit
Join the CG & VFX Artist Facebook group and post your artwork in the group with the tags FGT3D and FGT3DEasterEgg2020. Or send your artwork to FGT3D@foxrenderfarm.com with your name and/or your studio’s name.
- Your entry must relate to the challenge’s theme (we strongly encourage you to set your imagination free)
- Your entry must be a 3D rendered image
- Your entry can be created by one artist or a group
- There’s no limitation on styles and the choices of software and plugins
- Your entry must be original art created specifically for the challenge (no existing projects)
- Minimal use of third-party assets is allowed, as long as they are not the main focus of your scene (third-party textures and materials are not covered by this rule and can be used freely)
- No fanart allowed
- Feel free to enhance your rendering
- Images that depict hate, racism, sexism or other discriminatory factors are not allowed
- Works must be submitted before the deadline
Render Farm Reviews from Well-known Indian Movies
As a leading render farm and the largest online render farm in the CG industry, Fox Renderfarm has earned a good reputation for its quality performance, great customer service and flexible pricing scheme. We have provided our best cloud rendering services for over 160,000 happy customers from 50+ countries and regions.
Fox Renderfarm frequently receives good reviews from our customers. On this page, we showcase some testimonials from our beloved Indian clients.
It’s been a breeze having Fox Renderfarm as a render partner. The service has always been very prompt on our requirements. However, I wish we were always given a higher average number of blades.
——Asif Sayed, the Vice President Operations at Famulus Media & Entertainment
Baahubali: The Beginning
India's top-grossing blockbuster of 2015, and the country's biggest-budget historical epic of the year.
“Fox Renderfarm have been very helpful to us during our crunch time on the feature. The pricing structure offered to us was competitive for the quantity of work we were producing with render nodes being always available to us for use. Within around a month we rendered approx. 80,000 frames, totaling 27,000 render hours on the Fox Renderfarm render farm.”
——Subhrojyoti Banerjee, the senior VFX artist of Makuta VFX
——A K Madhavan, the Founder & CEO of Assemblage Entertainment
Have you ever thought about creating ArchViz in gaming engine Unreal Engine?
Today, we bring you an interactive ArchViz work made by Jesús Gómez San Emeterio, nominee for the Interactive Category in the CGarchitect.com 2019 Awards. In our talk with Jesús, he revealed how he finally made the immersive experience possible after months of “trial and error” in UE.
What’s your view towards interactive ArchViz? Do you prefer still imagery or this novel form? Looking forward to your answers~
Master Bedroom by Jesús Gómez San Emeterio
3D Artist Ehsan Darvishi and Car Render Challenge
Hum3D.com’s ‘Car Render Challenge’ is one of the fantastic render challenges that artists passionate about both 3D creation and cars should not miss! As the sponsor, Fox Renderfarm is amazed that 3D creation not only demonstrates visual beauty but is also used as a powerful storytelling tool.
The 1st prize winner, Ehsan Darvishi, created a stunning Chevrolet Corvette 1960 with exquisite reflections and fine lighting from the diner behind it. He shared with us, step by step, how he did the modeling and achieved the sophisticated lighting in 3ds Max.
Anyone interested should not miss our interview: Creating the Sophisticated Chevrolet Corvette 1960 in 3ds Max
Spin VFX’s Free Tool Nuke Gizmos 2.0 Released
On January 6th, 2020, visual effects facility Spin VFX released version 2.0 of Nuke Gizmos, its popular open-source collection of tools for Foundry's Nuke compositing software.
Former Spin VFX lead compositor Erwan Leroy has posted a video overview of Nuke Gizmos, the studio's open-source collection of compositing tools. Version 2.0, released late last month, updates the spill correction tool.
All of these tools are used in the studio's own production work, and they cover a range of common compositing tasks, including camera projection, relighting, light wrapping, color matching, and spill correction.
A versatile collection of Nuke tools for production of hit TV series
Spin VFX is known for its work on shows like The Expanse, Stranger Things, and Game of Thrones, and it first open-sourced Nuke Gizmos in late 2018.
The toolset comprises 13 individual tools and gizmos designed to extend or replace Nuke's native functionality.
Highlights include ReProject_3D, which performs camera projection based on a rendered point-position pass, and Relight_Simple, a relighting node that needs only a normal map to get started.
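A relighting node of this kind typically evaluates a simple shading model against per-pixel normals from the render. As a hedged sketch of the general technique (not Spin VFX's actual implementation), a Lambert relight from a normal pass looks like this:

```python
import numpy as np

def relight(normals, light_dir, light_color=(1.0, 1.0, 1.0)):
    """Minimal Lambert relight: intensity = max(0, N . L) per pixel."""
    l = np.asarray(light_dir, dtype=float)
    l /= np.linalg.norm(l)
    ndotl = np.clip(np.einsum("ijk,k->ij", normals, l), 0.0, None)
    return ndotl[..., None] * np.asarray(light_color)

# A tiny 2x2 "normal pass": most pixels face the camera, one faces sideways.
normals = np.zeros((2, 2, 3))
normals[..., 2] = 1.0             # all facing +Z by default
normals[0, 1] = [1.0, 0.0, 0.0]   # this pixel faces +X

lit = relight(normals, light_dir=[0.0, 0.0, 1.0])
```

Pixels facing the light receive full intensity, pixels facing away go dark, which is why a normal map alone is enough to start relighting a plate.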
The light wrap node Lightwrap_Exponential provides a "more physical response" than Nuke's default light-wrap setups, and when expanding a mask, Erode_Fine provides finer control than Nuke's Erode filter.
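Light wrap in general works by bleeding background light onto the foreground near its alpha edges, with a falloff controlling how far the wrap reaches into the subject. The sketch below illustrates only that general idea; it is not the Lightwrap_Exponential implementation, and the exponential edge-weight formula is an assumption for the example:

```python
import numpy as np

def light_wrap(fg_rgb, fg_alpha, bg_rgb, exponent=2.0, amount=0.5):
    """Sketch of a light wrap: background light bleeds onto the
    foreground near its alpha edges. The weight peaks at partial
    alpha and falls off exponentially toward the opaque interior."""
    wrap = (1.0 - fg_alpha) ** exponent * fg_alpha
    return fg_rgb + bg_rgb * wrap[..., None] * amount

# 1x3 strip of alpha: transparent, edge, fully opaque.
alpha = np.array([[0.0, 0.5, 1.0]])
fg = np.zeros((1, 3, 3))          # black foreground
bg = np.ones((1, 3, 3))           # bright background
out = light_wrap(fg, alpha, bg)
```

Only the semi-transparent edge pixel picks up background light; fully transparent and fully opaque pixels are left untouched, which keeps the wrap confined to the silhouette.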
The collection also includes tools for matching black and white points between images, suppressing specific RGB or CMYK colors, and adding synthetic camera grain, glow, or chromatic aberration to a composite.
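In its simplest, linear form, black/white point matching remaps an image's darkest and brightest values onto those of a reference plate. A minimal sketch of that linear-levels idea (illustrative only, not Spin VFX's tool):

```python
import numpy as np

def match_black_white(src, ref):
    """Remap src's min/max (black/white points) onto ref's min/max,
    the linear-levels version of a grade-matching operation."""
    s_lo, s_hi = src.min(), src.max()
    r_lo, r_hi = ref.min(), ref.max()
    return (src - s_lo) / (s_hi - s_lo) * (r_hi - r_lo) + r_lo

src = np.array([0.1, 0.4, 0.9])    # element with lifted blacks
ref = np.array([0.0, 0.5, 1.0])    # plate to match against
out = match_black_white(src, ref)
```

After the remap, the element's black and white points sit exactly on the plate's, so the two images share the same contrast range before finer grading.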
Version 2.0 of Nuke Gizmos also includes an updated version of Spill Correct, which can be used to remove color spill artifacts when keying blue-screen or green-screen footage.
The files can be downloaded on GitHub here: Spin Nuke Gizmos – Releases
China Film Administration Release New License Logo
Maya 2018 Arnold's Method of Making Motion Vector Blur in AOV Channel
How the Redshift Proxy Renders the Subdivision
Arnold\ V-Ray\ Octane\ Corona\ RedShift, Which Renderer Is The Best?
Corona Renderer Learning - Denoising
Arnold Render Farm | Fox Render Farm
Is This Simulated By Houdini? No! This Is Max!