3D Tutorials: How to Make the Dogs in Togo (3)

2020-08-31

VFX

The digital environment in the film

Everything we covered above concerned the production of the CG sled dogs. The film also contains many natural environments, some of which are fully synthetic digital landscapes, while others are live-action plates shot in Alberta, Canada, and then enhanced. The two main locations were the Little McKinley mountain scenes, shot at Fortress Mountain, Alberta, and the frozen Norton Sound, shot at Abraham Lake, Alberta.

Real shot material

  1. Crossing the frozen lake: shot in Alberta, completed with a great deal of roto

Abraham Lake in Alberta was chosen as the location for the unstable frozen lake because the blue ice there is very clean and clear and the area is vast. Unfortunately, the weather on the day of the shoot did not cooperate, and two days later the lake was covered in snow. Fortunately, the production team took plenty of reference photos within those two days.

Nearly 95% of the shots involve the sled dogs and their owner, Leonhard Seppala. To make production easier, the whole dog team, the sled, the harnesses, the fur, the clothes, and Leonhard Seppala's hair all had to be rotoed separately. And because the frosty atmosphere reads stronger on the dogs at the back of the team than on the ones in front, the roto had to be built up in layers.

A real shot of the sled dogs, with the environment replaced and the atmosphere enhanced

Under normal circumstances it is enough to project the roto layers along the character's or vehicle's motion path and layer them on top, but here the hair detail is very complex. The roto had to be projected onto cards, with a sled rig built to bind the cards, and then each layer of particle FX, snow, atmosphere, and so on added. In addition, the snow and ice thrown up by the sled and the running dogs was simulated in Houdini and filled out with some 2D elements.

Real shot

Final shot

  2. Cracked ice

The ice-breaking shots were very complicated to design. The live action was shot on a flat ice surface, so shots of ice tipping over could only be done in full CG. For the non-CG shots, the team deliberately disturbed the camera track to make the ice surface feel more active: within limits that would not break the parallax, they used 2D techniques to constrain a locked camera to a floating ice block, or moved the camera on the ice plate and added 2D float, making the shots feel less stable.
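To picture that last trick: a small amount of smooth pseudo-random 2D float added to a plate or camera reads as instability without breaking the parallax. Here is a minimal sketch of the idea in Python; the frequencies and amplitudes are invented, not DNEG's values.

```python
import math

def float_2d(frame, amp=4.0, freq=0.35):
    """Return a smooth per-frame (x, y) pixel offset for fake camera float."""
    # Sum two incommensurate sine waves per axis so the motion never visibly repeats.
    x = amp * (math.sin(freq * frame) + 0.5 * math.sin(1.7 * freq * frame + 2.0))
    y = amp * (math.cos(0.8 * freq * frame) + 0.5 * math.sin(2.3 * freq * frame))
    return x, y

# Per-frame offsets, ready to feed a 2D transform on the plate.
for f in range(5):
    print(f, float_2d(f))
```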

Moreover, the plates themselves were shot on a flat ice surface under light that kept changing over time, so every version of the material looked different. Keeping these plates consistent, and art-directing the ice surface to fragment piece by piece, was also a long and complicated process.

  3. Simulating huge ice blocks

In the sequence where the dog team crosses the frozen lake, huge blocks of ice gradually rear up as the surface breaks. The production team guided the shapes of the huge ice blocks with procedural methods as much as possible. With that many blocks there is no way to rig each piece individually, and it is impossible to sculpt, texture, and look-dev every one; doing so would lock down their shapes and sizes, and any change would mean going back a step and rebuilding. So they created a new system, Cascade, which lets the layout and environment departments build huge-ice layouts on a shot-by-shot basis.

The Layout department created a very basic proxy shape in Maya, used Maya's curve tool to draw a huge ice silhouette, extruded it, placed it in the scene, added rig constraints, and set up rough animation on the floating ocean. The environment team built a very practical toolset that procedurally models the huge ice blocks from basic geometry, generating broken edge detail, internal bubbles, cracks, and faults in the ice layer. With the new tools, the team's routine was essentially working on shots all day and rendering on the farm at night.
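The Layout step described above might look something like this minimal maya.cmds sketch: draw a closed outline curve, fill it, and extrude it into a slab. This is an illustrative guess, not DNEG's Cascade toolset; the function name and numbers are hypothetical.

```python
import maya.cmds as cmds

def make_ice_proxy(points, thickness=2.0):
    """Build a rough ice-block proxy from a closed outline drawn in the XZ plane."""
    # Draw the top outline of the block as a closed linear curve.
    outline = cmds.curve(point=points + [points[0]], degree=1)
    # Fill the outline into a flat polygon cap.
    cap = cmds.planarSrf(outline, polygon=1)[0]
    # Extrude the cap downward to give the slab its thickness.
    cmds.polyExtrudeFacet(cap, localTranslateZ=-thickness)
    return cap

proxy = make_ice_proxy([(0, 0, 0), (6, 0, 1), (7, 0, 5), (2, 0, 6)])
```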

If the size or shape of a huge ice block needs to change, the Layout step simply redraws the curve and hands it down the pipeline. The FX step also simulates the interaction between the huge ice blocks and the water, including details such as bubbles and splashes, and the final render is done in Clarisse.

  4. Other mountain environments

In the film, Togo's growth is told through Seppala's memories, part of which take place in the mountains. The production team shot a lot of very beautiful mountain footage and modified it from there. For example, when shooting at Fortress Mountain, Alberta, the director liked the environment but felt there were too many trees, so some were removed in post. As another example, in the shots of the settlers' houses, the Alaskan mountains do not exist in the plates and had to be added afterwards.

Real shot

Final shot

When designing the backgrounds, rather than the tedious work of painting a large number of digital matte paintings, the production team adopted a 3D approach combining high-poly digital models, LiDAR scanning, and photogrammetry.

In the end, the mountains seen close to camera went through full sculpting, texturing, and look-dev, with trees, leaves, and rocks rendered in Clarisse. Overall, there really was a great deal of background work to get through.


3D Tutorials: How to Make the Dogs in Togo (2)

2020-08-28

VFX

The dogFACS system for sled dog facial capture

To produce the final sled dog animation, DNEG also explored motion capture for dogs and spent time developing a Facial Action Coding System (FACS) specifically for sled dog faces.

  1. Research and development of sled dog motion capture

In the early stages of production, DNEG motion-captured well-trained sled dogs at Animatrix Studios. We usually see actors wearing special suits for motion capture; this time the same process was applied to sled dogs. Figuring out how to fit the gear and place tracking markers on fur was technically challenging. Although this mocap data was not used in the final production, it was a valuable learning experience for the team.

  2. Research and development: the dogFACS system

To understand the facial muscles of sled dogs in detail and settle the direction of the facial rigging work as early as possible, DNEG started dogFACS research very early: since a FACS exists for human faces, one could be built for sled dogs too. The researchers categorized all the sled dog expressions and mapped them onto the dogFACS system, with action units such as "upper lip raise" and "nose wrinkle". A growl, for example, is the combination of three action units: "upper lip raise", "nose wrinkle", and "lip wrinkle". A facial control scheme built on these units lets animators activate or counteract individual expressions, creating more detailed and believable animation.
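As a minimal sketch of that control scheme, assume a hypothetical rig where each action unit is an additive weight; the unit names follow the article, everything else is illustrative.

```python
ACTION_UNITS = ["upper_lip_raise", "nose_wrinkle", "lip_wrinkle", "mouth_corner_pull"]

def compose_expression(*units, strength=1.0):
    """Combine individual action units into one expression pose."""
    weights = {au: 0.0 for au in ACTION_UNITS}
    for au in units:
        weights[au] = strength
    return weights

# Per the article, a growl is three action units fired together.
growl = compose_expression("upper_lip_raise", "nose_wrinkle", "lip_wrinkle")

# The animator can then dial units up or down individually to refine
# or counteract the pose, e.g. soften the nose wrinkle:
growl["nose_wrinkle"] = 0.6
```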

DNEG's rigging department redeveloped its entire quadruped rigging system to a higher standard of realistic performance. It includes a new front-leg module with a "limb pinning" feature, so animators can mimic the way most animals plant a front leg while weight is on it, and a rebuilt spine setup that improves the realism and functionality of the animation. Overall, the rigging took a lot of effort: it required close collaboration with the animation department to develop and refine the quadruped rigging standard, always drawing inspiration and direction from reference research.

  3. Roaring time

There is a scene in which a teammate named Ilsa snarls at Togo, and Togo immediately subdues her. Note that this seemingly real performance is actually a CG shot.

In the actual shoot, the dog handler used a "snarl rig", which holds the sled dog's mouth open with a rubber band and a prosthetic. The dog playing Ilsa was reportedly very docile and cooperated with the whole process, but her eyes and constantly wagging tail gave away an extremely happy state; she didn't look angry at all.

Real shot material

The effect after replacing the CG head

The production team decided to solve it with a live-action body plus a CG head, and designed concept images of the dogs looking angry and fierce. The process was very challenging for the riggers, modelers, and animators, who had to nail the subtle facial animation of the dogs. Ilsa's head was built in even more detail and precision than Togo's; after all, the shot focuses mainly on Ilsa's performance.

Real shot material

Final shot

The most useful application of the dogFACS system is this close-up of Ilsa's snarl. FACS expressions blend linearly, while a dog's mouth and nose can move freely, so to get convincing micro-motion the team added many small details around her nose. Reportedly, when the snarl animation test was first shown, it was played side by side with the live-action plate, and most viewers could not tell which one was fake.

Real shot material

Final shot

  4. Sled dogs slipping on the ice

There is a sequence in the film where the sled team braves the snow and cold to cross the unstable frozen lake under Togo's lead. On the way back the ice is breaking apart, and the dogs push forward fast while their feet keep slipping.

The DNEG team gathered online reference of sled dogs slipping or falling to find the unstable feeling of dogs standing on ice. These clips were assembled with the Time Editor into an animation rough cut, built up like building blocks into a result suitable for the overall shot, and they also helped determine the dogs' performance in certain specific shots. In terms of animation quality and detail, this method amounts to a fairly advanced form of blocking.
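For readers unfamiliar with Maya's Time Editor, the block-assembly step might look roughly like the sketch below. The clip names, frame ranges, and track layout are hypothetical; this is a guess at the kind of workflow described, not DNEG's actual setup.

```python
import maya.cmds as cmds

# Create a composition to hold the rough cut.
comp = cmds.timeEditorComposition("slipBlocking")

# Lay pre-made motion clips end to end like building blocks
# (track index -1 asks for a new track in the composition).
for start, source in [(1, "run_cycle"), (40, "slip_fall"), (65, "recover")]:
    cmds.timeEditorClip(source, animSource=source, startTime=start,
                        track="{}:-1".format(comp))
```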

  5. Animating multiple sled dogs

Beyond the slipping shots, there are 11 sled dogs in Togo's team, and a team behaves differently from an individual dog. DNEG therefore built a rig feature that measures the distance between the dogs in the team; if a gap gets too wide, that part of the rig displays in red. The system is not super accurate, but it still gives the team a workable range.
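At its core the check reduces to flagging any adjacent pair of dogs whose gap exceeds a threshold. A minimal sketch, with made-up positions and threshold rather than DNEG's rig values:

```python
import math

def flag_spacing(positions, max_gap=2.5):
    """Return the indices of gaps that the rig should display in red."""
    flagged = []
    for i in range(len(positions) - 1):
        gap = math.dist(positions[i], positions[i + 1])
        if gap > max_gap:
            flagged.append(i)  # the gap between dog i and dog i+1 is too wide
    return flagged

team = [(0.0, 0.0, i * 2.0) for i in range(11)]  # 11 dogs roughly in a line
team[5] = (0.0, 0.0, 14.0)                       # one dog drifts too far out
print(flag_spacing(team))                        # -> [4]
```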

While working on the animation, the production team also did some layout development: they set up many different cycles at different speeds for the dogs' actions and fed them into the rig, so layout results could be switched and loaded freely. This is much more convenient than the usual method of loading the rig first and setting up the animation afterwards.

The Layout department can then set up the sled animation from the chosen dog motion cycles and speeds, staying efficient while keeping the result as believable as possible. This part of their work also seems very convenient.


3D Tutorials: How to Make the Dogs in Togo (1)

2020-08-27

Behind The Scenes

The visual effects of this film were produced by DNEG, Lola Visual Effects, and Soho VFX. The most impressive things in the whole film are the loyal, brave sled dogs and the local weather. DNEG began production after the shoot wrapped in October 2018, and the work lasted about a year: 872 artists across 4 different branch teams completed 778 VFX shots. Below, we introduce DNEG's work (the CG dogs and the digital environments) in three parts.

3D scanning of sled dogs

The husky sled dogs we see in the film are not necessarily real: some are full CG, and some are real dogs with a CG head replacement. The CG sled dogs had to match the live-action dogs, which involved 3D scanning real sled dogs, collecting reference material, redesigning the hair tools, building muscles and bones, and recreating the dogs' snarling look.

Live shot

  1. Testing

DNEG had built a Rocket Raccoon with 800,000 hairs for Avengers: Endgame and had tested wolves on earlier projects, but it had never done a dog, nor handled fur/hair production on this scale in a film.

For the "Togo" project, they updated the internal hair tool Furball. Through some development and optimization work, they first created the largest amount of hair on a single dog as much as possible, and then met the hair of 11 dogs in 1 shot. In addition to the real appearance, it is necessary to simulate the state of hair with water, ice, and snow and its rendering effect.

  2. 3D scanning of sled dogs

During the shoot, DNEG had Clear Angle run two photogrammetry scans of 30 to 40 sled dogs in the studio, one in their summer coats and one in their winter coats. A handler would bring a single dog into the studio, let it get familiar with the environment, then lead it to the designated center spot. Each scan had to be completed in one pass; if the dog got scared off, there was no second chance.

Next to the Clear Angle shed there was a separate shed equipped with an animation reference camera, which captured the dogs' detailed dynamics and mannerisms outside the actual shoot and provided reference material for building the CG character rigs and assets.

  3. From data to sled dogs

A 3D scan of a sled dog yields feet, legs, head, and a rough body volume, but no fur/hair data, and without that the muscle mass and tissue under the coat cannot be analyzed. DNEG's solution was to measure the fur by hand with a tape at specific spots on the dog's neck, back, tail, and so on, in the small shed mentioned above.

However, this approach leans on the idealized dogs of anatomy textbooks rather than being strictly data-driven, so specific details of the real sled dogs had to be added on top.
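As a toy illustration of what those tape measurements enable (not DNEG's pipeline): scan points can be pushed inward along their normals by the locally measured fur depth to approximate the skin surface under the coat. All numbers below are invented.

```python
import numpy as np

def estimate_skin(points, normals, fur_depth):
    """points, normals: (N, 3) arrays; fur_depth: (N,) fur thickness in metres."""
    # Offset each scanned point inward along its normal by the local fur depth.
    return points - normals * fur_depth[:, None]

# Example: three scan samples with 3-5 cm of fur.
pts  = np.array([[0.0, 0.5, 0.0], [0.1, 0.52, 0.0], [0.2, 0.5, 0.0]])
nrms = np.array([[0.0, 1.0, 0.0]] * 3)
skin = estimate_skin(pts, nrms, np.array([0.03, 0.05, 0.04]))
```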

Real shot

Final shot

The final build is a custom muscle and skeletal system: Ziva Dynamics for the muscle and fat, Maya's nCloth for the skin, the fur tool Furball for grooming, and Houdini Vellum for hair-follicle dynamics.

The production team found that much of the fur's movement actually comes from the fat: the fur itself is quite stiff and does not sway on its own; rather, the fat and muscle slide around underneath, creating the impression that the fur is moving. They also had to simulate how the harness pulls on the fur once the dogs are geared up.

Real shot

Final shot


How does the studio directly produce the final shot? (2)

2020-06-17

Behind The Scenes

The rising expressive power of Real-time rendering

Jon Favreau's biggest concern on *The Mandalorian* was whether a game engine could reach Star Wars-level visual effects rendering on a TV budget. But when he saw a rendered patch of desert reminiscent of Episode II, he dropped that worry.

The sandy environment is built from photo-scanned assets and rendered in real-time

In the virtual studio, all scene elements and lights can be switched and adjusted freely while maintaining the feel of virtual reality. Part of *The Mandalorian*'s virtual environment directly reuses assets created in Unreal Engine for video games, saving a lot of asset-building time and opening up asset sharing between the two industries.

Real-time rendering has been able to better present the highly reflective materials often involved in science fiction dramas

The virtual scenes on set sometimes fool the eyes of the people standing there. Jon Favreau said, "Someone came to the studio and said, 'I thought you weren't building the whole set here.' I said, no, we didn't build it; there are really only tables here. Because the LED wall renders from the camera's point of view, there is parallax. But even glancing at it casually from the side, you would still think you were looking at a practical set."

Stepping into the virtual studio, the actor steps into the virtual world and the final scene

The changes in Hollywood led by virtual production technology

In addition to shortening the production cycle and stretching the budget further, virtual production also brings revolutionary convenience to the actors and other teams: actors can see the environment in real-time as they perform and interact.

"This not only helps the cinematography but also helps the actors understand the surroundings, such as the horizon. It also provides interactive lighting", Jon Favreau described it as “a huge breakthrough”.

Virtual production can also help fantasy characters interact with real people

In addition to *The Mandalorian*, film projects such as The Lion King and The Jungle Book had already begun using game engines, but mainly at the previsualization stage.

Steven Spielberg and Denis Villeneuve also used Unity to help achieve their visual effects on Ready Player One and Blade Runner 2049 respectively. This approach is gradually replacing the usual storyboarding method, which resembles hand-drawn comic books.

Spielberg at the monitor, viewing the real-time composite of the shoot

In virtual production, VR technology is rapidly becoming a viable option for large studios and production companies. Director Jon Favreau used a great deal of virtual reality technology in the recent live-action remake of The Lion King, essentially building the film's entire world in virtual reality.

Lion King uses a lot of real-time virtual preview technology

The popularization of virtual production technology

In terms of budget, virtual production and LED display equipment are still relatively expensive at this stage.  

Complete real-time compositing and film production at a lower cost

Jon Favreau said virtual production is a major leap for the film industry, allowing creators to make creative decisions before shooting starts rather than retrofitting them mid-process or after completion. Now more people can see what the shot will look like right on set, contribute their own ideas, and learn from each other, because they can see the final intent in advance.

"In the past, when you went to a scene and left the green screen, you no longer care about it. But now, we have so many talented people and have accumulated more than one hundred years of filmmaking experience. Why should we give up just because we change the shooting system? Let’s continue to inherit the skills of film artists and develop tools that adapt to the times.”

Virtual production technology will gradually become the routine process of film production in the future


Virtual Production: Past, Present, and Future (3)

2020-06-09

Behind The Scenes

The Evolution of Virtual Production

Virtual production was applied in the industry early on, though with little coverage. In the early 1990s, Hollywood used virtual production to make dynamic storyboards, the forerunner of today's previz.

The 360-degree frozen-moment "bullet time" technique used in The Matrix

Titanic was released in the United States on December 19, 1997, followed by The Matrix in 1999, The Matrix Reloaded and The Matrix Revolutions in 2003, and Transformers in 2007 with its sequels. These films applied and promoted virtual production well, especially motion capture and motion control.

Combination of motion capture and real-time rendering

Virtual production became well known to audiences after the release of Avatar in 2009.

On December 16, 2009, Avatar and its Pandora brought audiences a thing of beauty and made virtual production familiar to the public. The director could see the composited scene directly in the monitor instead of the traditional green screen, which felt refreshing.

Avatar takes virtual production to the mainstream

When an entire movie is shot against green or blue screens, more than 2,000 shots need post-production, including editing, VFX, and sound, and the crew must match the shots one by one against the script. Demand drove the technology forward: after Avatar was completed, virtual production was integrated into a commercial module.

The Outlook of Virtual Production

Although virtual production still has many technical problems to solve, the trend is established, and it will prosper alongside the rising global box office.

The live version of The Lion King fully uses virtual production

The momentum of Hollywood VFX films has led more and more directors and filmmakers to discover this field and use the new technology in their work.

A material library for virtual production is needed in the near future.

Cultural and historical monuments, natural scenery, and the like are already frequent visitors to the movie screen. However, shooting them means going on location at high cost, and the actors' schedules are also difficult to coordinate.

Real-time compositing scenes in the monitor

Virtual production can solve this problem and reduce production costs.

For example, to shoot the Colosseum of ancient Rome with virtual production, the actors perform on a virtual stage while the arena scene, beasts included, is pulled straight from the asset library for the shoot.

Movies are an art of creation, and using photos as backgrounds cannot meet the needs of cinematography. Three-dimensional models need to be built with real textures, and lighting, such as an early morning, dusk, or noon effect, simulated as required. These models then go into the library, are loaded into the virtual production system as the creative work demands, and the camera can flexibly choose whatever angle the script requires.

Building the material library is a huge digital project: it must not only hold all kinds of assets but also support clear retrieval and loading, and it requires continuous investment and construction. A framework can be built by dedicated groups, technical standards formulated, and conforming assets incorporated into the database. It could also be commercialized for a fee, putting the library on a healthy track so it grows steadily.

Only green screen props are needed to simulate the interaction with the virtual scenes

In the future, more and more traditional studios will be converted into virtual studios. Within human control, shooting can run 24 hours a day without being delayed by any weather.

Here is a basic point about virtual production: sets built on traditional stages are fixed, and replacing a physical set takes a day or several days, whereas swapping a virtual scene takes only minutes. Modules for the performance area can even be assembled outside the venue in advance.

The production of Disney's The Mandalorian brings virtual production into the LED era

Actors' schedules are very tight, and any accident, such as illness or injury, disrupts the shooting plan. Virtual production prepares the script's scenes in advance and can quickly call up different scenes as plans change, coordinating the crew's process and making filming more humane and more efficient.

Virtual production has very good prospects. Its continuous improvement builds a solid platform for raising the quality of filming: creativity can be presented in a better way, bringing audiences a better viewing experience.


Virtual Production: Past, Present, and Future (2)

2020-06-08

Behind The Scenes

The Birth of Virtual Production

In terms of timing and breadth of adoption, television applied the technology earlier than film.

In 1978, Eugene L. proposed the concept of the "Electronic Studio Setting", pointing out that future program production could be completed in an empty studio with only personnel and cameras, with the sets and props generated automatically by electronic systems.

Virtual studio technology widely used by current mainstream TV media

After 1992, virtual studio technology became a reality. As a new technology, the virtual studio became a hotspot in TV studio engineering, and at the 1994 IBC exhibition it debuted in various TV broadcasts.

The virtual background changes with the camera in the virtual studio

The Virtual Studio System (VSS) is a new TV program production system that has emerged in recent years with the rapid development of computer technology and chroma keying.

An F1 program presents demonstrations through virtual scenes and props

In a VSS, the camera's position and the scene are transmitted to the virtual system in real-time. The green (or blue) screen is removed by chroma keying and replaced with a pre-built virtual three-dimensional model; the host or actor in the picture is then merged with the 3D virtual scene into a new image. Finally, the composited video can be shown on TV in real-time.
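The keying step itself can be sketched in a few lines. Below is a bare-bones illustration on 8-bit RGB numpy arrays; a real VSS keyer adds spill suppression, soft mattes, and much more.

```python
import numpy as np

def chroma_key(frame, background, threshold=60):
    """Replace green-dominant pixels of `frame` with `background`."""
    r = frame[..., 0].astype(int)
    g = frame[..., 1].astype(int)
    b = frame[..., 2].astype(int)
    # A pixel counts as "screen" where green clearly dominates red and blue.
    matte = (g - np.maximum(r, b)) > threshold
    out = frame.copy()
    out[matte] = background[matte]
    return out
```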

Using virtual studio technology for remote virtual dialogue

The virtual scene does not actually exist, but the technology that seamlessly blends it with the live-action characters makes the virtual studio a reality.

The world's first reality show using virtual production


Virtual Production: Past, Present, and Future (1)

2020-06-05

Behind The Scenes

This article is brought to you by the TPN-Accredited cloud render farm, Fox Renderfarm. We hope it helps you learn about film and TV production.

Virtual production is an emerging film and TV production system comprising a variety of computer-aided filmmaking methods designed to enhance creativity and save time. With the help of real-time software such as Unreal Engine, traditional linear workflows can become parallel ones, blurring the boundaries between pre-production, production, and post-production and making the whole process more fluid and collaborative.

Virtual production on a real-time LED video wall

Since Avatar's release in 2009, virtual production has emerged as a new field born of the film industry's digital revolution, bringing a new perspective to viewers and professionals alike.

The virtual production of Avatar

The large-scale use of green screen (or blue screen) filmmaking has made it more and more important for directors to see a real-time composite of the performance and the virtual space. Because virtual production lets directors give instant direction during live shooting, the technology is gradually being applied to production.

Real-time compositing on the green screen

The Definition of Virtual Production

Let's describe virtual production simply. When an actor performs in front of a green (or blue) screen, the screen is replaced in real-time by a pre-built virtual scene, and the director's monitor shows the composited result that could previously be seen only late in post-production. This is virtual production.

Virtual production sets and motion capture environments

The History of Virtual Production

The development of virtual production is inseparable from progress in computing. In the 1940s and 1950s, computers were mainframes; in the 1960s and 1970s they gradually shrank into minicomputers. Owing to the limits of hardware, software, and the talent base, virtual production advanced slowly during this period.

IBM-PC launched by IBM

Microsoft was founded in 1975 and later developed the Windows operating system; Apple was established in 1976, running its own System x.xx/Mac OS; and in 1981, IBM introduced the IBM PC, which greatly simplified the hardware architecture and targeted ordinary users. Because operating systems were still in the DOS era, adoption remained slow.

Spielberg at the monitor, viewing the virtual production effect of Ready Player One

In 1982, SGI was established in the United States. Jurassic Park, Titanic, Toy Story, The Lord of the Rings, and others are all closely connected with it.

The Lord of the Rings set & VFX compositing

In 1995, SGI acquired Alias and Wavefront, the predecessors of the Maya software. In 1991, Microsoft introduced the multi-language version of Windows 3.0, and a few years later Windows 95 arrived, delivering a true graphical interface. From there, the development of computers entered the fast lane.


Behind the Scenes: Spy House Production (3)

2020-06-04

Behind The Scenes

Next, we will share the production process of two other materials:

Making wood material

Pay attention to the following points when making wooden materials:

• When laying out the UVs, keep the shells as square and straight as possible, so the textures will not be distorted later.

• Also consider the utilization of UV space, and give priority (and resolution) to the parts the camera can see; this improves the quality of the material.

The wooden structure's texture is simple to produce, but it is necessary to find plenty of references that match the scene's situation. For example, what does a wooden structure look like after a long time in a damp underground environment? How corroded would wood in that environment be? To sell that feeling, good references are essential.

Substance Painter wood texture production

• Import the baked normal map and prepared materials into Substance Painter. Before painting anything, create two fill layers with only the Color channel enabled. One layer puts the AO map into Color and is set to Multiply, to deepen the dark areas; the other puts the Curvature map into Color and is set to Overlay, to brighten the edges (the math behind these two blend modes is sketched below). Once this setup is done, texture painting begins.
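For reference, the two blend modes named above compute roughly as follows on normalized 0-1 values; a minimal numpy sketch, independent of Substance Painter:

```python
import numpy as np

def multiply(base, blend):
    # Multiply only ever darkens: used with the AO map to deepen occluded areas.
    return base * blend

def overlay(base, blend):
    # Overlay darkens where the base is dark and brightens where it is light:
    # used with the curvature map to pop the edges.
    return np.where(base < 0.5,
                    2.0 * base * blend,
                    1.0 - 2.0 * (1.0 - base) * (1.0 - blend))
```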

Things to watch in the wood texture: consider the actual situation of the scene. Take the scene we made (the spy house): after a long time in a dark, humid environment, corrosion is relatively serious, so the worn edges of the wood should not be especially bright, and even the bright edge-wear marks should be kept darker. Then add scratches, and wear in the spots that get handled often, with a little brightness there. Finally, add accumulated dust and stains in the places rarely touched. Once all this is done, you basically have a good, realistic texture.

Making the paper and book materials

When making a single-page paper with no thickness, please pay attention to the following points:

  1. When laying out the UVs, keep the shells as square and straight as possible, so the textures will not be distorted later.

  2. The paper texture is simple to produce: no high-poly model is needed, just bake the low-poly normal map.

How to make paper in Substance Painter:

Import the baked normal map and prepared materials into Substance Painter. Create a regular (non-fill) layer, enable only the Color channel, put the image into Base Color, and set UV Wrap to None. Then, in the 3D/2D view (F1), position the picture where it belongs.

One thing to watch in the book texture: separate the cover and the pages from the start. If they are built as one piece, they will be hard to tell apart when texturing.

Marmoset Toolbag 3 rendering

Finally, import everything into Marmoset Toolbag 3 for rendering. First lay out the light sources: add an HDRI suited to the scene, and adjust the child light to a position that gives the model a 45-degree shadow.

Because the scene is fairly large, more lights were added; when rendering, pay attention to the size and illumination range of the fill lights to keep them from interfering with each other.

We hope our learning and sharing helps you. Thank you.


Behind the Scenes: Spy House Production (2)

2020-06-03

Behind The Scenes

Material production

After baking all the normal, AO, and curvature maps, import them into Substance Painter to build the materials. Note that normals baked in Marmoset Toolbag 3 and Maya use the OpenGL convention in Substance Painter, while normals baked in 3ds Max use DirectX. The first step after importing the model is always to check that the normal direction is correct.
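The two conventions differ only in the sign of the green (Y) channel, so a map baked in the wrong mode can be converted by inverting green. A quick sketch using Pillow and numpy; the file names are hypothetical:

```python
from PIL import Image
import numpy as np

img = np.array(Image.open("normal_directx.png").convert("RGB"))
img[..., 1] = 255 - img[..., 1]   # flip the green channel: DirectX -> OpenGL
Image.fromarray(img).save("normal_opengl.png")
```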

After a successful import, bake the remaining maps from the texture set's mesh-map baking panel.

The following uses the sofa's leather to walk through the basic material process.

Base color of the leather

Choose a suitable material map, paste it into the color channel, and adjust the other parameters appropriately to make the effect more natural.

Bump height

Create a new fill layer with only the height channel, paste in the height map corresponding to the base-color layer, and add a Levels filter to adjust the height. At the same time, add anchor points so the color follows the height.

AO layer

Paste the model's AO map into the AO channel: the dark parts of the edges are deepened and the bumps look more three-dimensional.

Roughness

Adjust the roughness of the sofa surface. Areas in frequent contact with objects wear more, such as the edges of the cushions, the inner and upper sides of the armrests, the bottom of the sofa, and the top of the backrest. Grunge textures can be added to make the effect more natural.

Edge fading

Create a fill layer and add a generator to produce the faded edge-wear effect. A single generator looks very stiff, so add a few more layers to break up the continuous effect and make the wear more natural; finally, add hand-painted details.

Details added

Use the Grunge map to add scratches, stains, and other details to the surface.

Dust

There are two kinds of dust: dust accumulated in corners, and a thin film of dust on surfaces. Create a fill layer with the color, roughness, and height channels, then use a generator to place the dust, adjusting its parameters until the effect looks natural; hand-painted details can be added as appropriate.

Gap treatment

Manually paint in dark tones to treat the gaps and other details.


White Snake's Special Effects Behind The Scenes And Animated Performances

2019-03-22

White Snake

Recently, White Snake has sparked heated discussion in Chinese animation circles.

As an original animated film built on folk legend and Chinese-style aesthetics, White Snake's box office has surged twice since its release week. Below, we reveal the special effects and animation performances behind the scenes.

Role display

Animator behind-the-scenes performance

Behind-the-scenes secrets of the special effects production

The effects in this shot simulate a weightless underwater environment. The main difficulties are the character's high-speed movement and the accompanying effects, and the cloth dynamics have to match the environment. Different solutions were tested, driving fluid by collision and emitting fluid from the character's body; in the end, the advantages of both were combined into the final solution.

In addition, a special force field was created around the character itself so that the surrounding ink changes speed and feeds back into the character's action. Finally, a suitable physical environment was set for the ink, instead of the default vacuum, to keep the dynamics reasonably correct.
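A toy sketch of why that non-vacuum setting matters: a drag force proportional to velocity makes the ink slow down and settle as it would in water instead of coasting forever. This only illustrates the physics, not the production setup; all values are invented.

```python
def step(pos, vel, dt=1.0 / 24.0, drag=2.0, gravity=(0.0, -0.2, 0.0)):
    """Advance one ink particle by one frame with drag and weak gravity."""
    acc = tuple(g - drag * v for g, v in zip(gravity, vel))
    vel = tuple(v + a * dt for v, a in zip(vel, acc))
    pos = tuple(p + v * dt for p, v in zip(pos, vel))
    return pos, vel

# With drag=0 (a vacuum) the particle keeps its speed; with drag it decays.
p, v = (0.0, 0.0, 0.0), (3.0, 0.0, 0.0)
for frame in range(3):
    p, v = step(p, v)
    print(frame, p, v)
```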

In the early stage of the character effects, a large number of tests and asset classifications were carried out. Force-field expressions, noise maps, and some in-house tools were used to simulate and control the underwater state, and after several iterated versions the final result was achieved.

In this shot, the difficulty of the bead-absorption effect lies in its irregular movement and shape. To keep it dynamic, natural, and on-model, the FX team built a complex force field to guide the particles and fluids that form the main body, then derived 10 layers of elements on top to enrich the effect. Finally, the lighting department composited the final result.

In the pool shots, the creative team needed a strange, evil feeling. FX used FLIP to achieve the pool water's dynamics, processed the simulation into as much usable information as possible, converted the particles, volumes, and geometry into renderable layers from that data, and achieved the final result in compositing.

