Creation Analysis and Sharing of the Work ‘Lotus Flower Is Immortal’

Introduction: In February 2017, 3D artist Liu Jingjing took part in the V-Ray rendering competition organized by HXSD and Chaos Group. She won the competition and here shares the process of creating the work, analyzing both its strengths and its shortcomings. Software: 3ds Max, ZBrush, V-Ray, Photoshop
Final renderings:

About the topic
This work is based on a Chinese style and uses the lotus as its entry point. The subsurface scattering (SSS) of the lotus flower is very distinctive: the petals are translucent rather than opaque, which makes the flower a good subject to showcase. With that original intention, the theme of the work was set.
The character was originally a little monk.

The early little monk...
After the model was built and the materials were being set up in 3ds Max, it became clear that the monk's outfit was too plain: the lotus was white and the clothes were all the same color, so they did not set each other off well. The character model was therefore almost completely redone.
Modeling
1. Lotus model
A lot of pictures of lotus were collected and one of them was chosen as a reference.

The lotus model was done entirely in ZBrush. Adding the petals was the most tedious part, since all the petals have almost the same shape. A faint bump texture can be sculpted on top to give the lotus some detail; these details can then be baked out as bump maps for later use.
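Since the article notes that the petals are nearly identical, a single sculpted petal can be instanced around the flower center. The sketch below is my own illustration of that idea (not the author's actual ZBrush workflow): it places petal copies in concentric rings, staggering each ring and tilting outer petals open, the way lotus petals are arranged.

```python
import math

def petal_transforms(rings=3, petals_per_ring=8, base_radius=1.0):
    """Place near-identical petal instances in concentric rings.

    Returns one (x, y, z, yaw_deg, tilt_deg) tuple per petal: outer
    rings widen, sit lower, tilt open further, and are rotated half a
    step so the petals stagger like a real lotus.
    """
    transforms = []
    step = 360.0 / petals_per_ring
    for ring in range(rings):
        radius = base_radius * (1.0 + 0.6 * ring)  # rings widen outward
        tilt = 20.0 + 25.0 * ring                  # outer petals open more
        offset = 0.5 * step * ring                 # stagger successive rings
        for i in range(petals_per_ring):
            yaw = offset + i * step
            x = radius * math.cos(math.radians(yaw))
            y = radius * math.sin(math.radians(yaw))
            z = -0.15 * ring                       # outer rings sit lower
            transforms.append((x, y, z, yaw, tilt))
    return transforms
```

With the defaults this yields 24 transforms; in practice each one would drive an instanced copy of the sculpted petal mesh.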

2. Character Model
The character was blocked out quickly in ZBrush using DynaMesh for the head, hands, and body fabric, then retopologized and sculpted at high resolution. Because the character is a child, the hands are chubbier, referencing a baby's hands and arms.

Apsaras costumes were an important reference for me.

There is no special trick to the modeling, only patience: the cloak, skirt, belt, and so on were done one by one...
The high-poly sculpting went quickly, so the cloth was not scrutinized very carefully; there is room for improvement there.

3. Decoration
The shapes of the ornaments are relatively simple. The beads, earrings, flower ornament, and headwear were mostly blocked out in 3ds Max, and some were imported into ZBrush for sculpting. The character's decorations draw on elements of Tibetan Buddhism, such as the red, green, and yellow colors, turquoise, and bead earrings.

The decorations were added step by step as the work drew to a close, with new ones added along the way.

UV unwrapping and mapping
There are very few textures in this work; most of the models are solid colors with a little bump, so UVs were unwrapped automatically with ZBrush's own UV Master plug-in. If more textures had needed careful painting, manual unwrapping would still have been required.

The lotus, face, hands, and a small number of fabrics need diffuse textures, which were painted in ZBrush. Once everything was done, the high-poly models were decimated and exported to 3ds Max in preparation for lighting, materials, and rendering.
Lighting, materials and rendering
Lighting, materials, and rendering are the focus. I wanted contrast in the picture: a white, bright lotus against a strongly colored (relatively dark) character. So I placed the main light source close to the lotus and added many fill lights for the character.
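The effect of moving the key light close to the lotus can be reasoned about with the inverse-square law. The distances below are hypothetical, chosen only to illustrate the principle behind the author's light placement, not taken from the scene file:

```python
def illuminance(intensity, distance):
    """Point-light illuminance falls off with the square of distance."""
    return intensity / (distance ** 2)

# Same key light: 0.5 units from the lotus, 2.0 units from the character.
lotus = illuminance(100.0, 0.5)      # 400.0
character = illuminance(100.0, 2.0)  # 25.0
ratio = lotus / character            # the lotus receives 16x more light
```

Quadrupling the distance costs a factor of sixteen, which is why the character then needs dedicated fill lights rather than a brighter key.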

It may look as if the lotus is too dark and the character's head too bright; this is the result of repeated tests of the lights, the GI values, and the materials and textures.
The lotus uses a VRayFastSSS2 material, tuned together with the position and intensity of the lights to achieve a crystal-clear effect.
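The "crystal clear" look comes from light being attenuated as it travels through the petal, so thin edges glow when backlit while the thick base stays opaque. This Beer-Lambert sketch illustrates that principle only; it is not the shader's actual code, and the thicknesses are assumed values:

```python
import math

def transmission(thickness, scatter_radius):
    """Beer-Lambert attenuation: fraction of light exiting the far side."""
    return math.exp(-thickness / scatter_radius)

# Hypothetical petal, scatter radius 1.0 mm:
edge = transmission(0.3, 1.0)  # thin edge: most light passes through
base = transmission(3.0, 1.0)  # thick base: almost fully attenuated
```

This is also why the light's position matters so much: only geometry sitting between the light and the camera shows the translucent glow.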
I wanted the lotus to look slightly self-illuminated, so I tried adding a VRayMtlWrapper on top of the SSS material to raise its GI generation multiplier, testing until the result was acceptable.
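The wrapper trick can be thought of as scaling how much indirect light a surface contributes back into the scene. This toy sketch uses assumed names and numbers to show the idea; it is not V-Ray's implementation:

```python
def gi_bounce(albedo, incoming, generate_gi=1.0):
    """Energy a surface re-emits into global illumination.

    Raising generate_gi above 1.0 makes the surface brighten its
    surroundings more than a physically correct bounce would, faking
    a soft self-illuminated glow without adding an actual light.
    """
    return [channel * incoming * generate_gi for channel in albedo]

lotus_rgb = (0.9, 0.88, 0.92)             # near-white petal albedo
normal = gi_bounce(lotus_rgb, 1.0)        # physically plausible bounce
boosted = gi_bounce(lotus_rgb, 1.0, 2.5)  # wrapper-boosted contribution
```

Because only GI generation is scaled, the petal's own shading stays physically based; only its influence on nearby surfaces is exaggerated.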

The materials for the other parts are simpler to set up: color, texture, bump, reflection, translucency, and so on.

Hair and other
The character's hair is simply draped and was not difficult to make; it uses Ornatrix, a currently popular hair plug-in.
Finally, two small dragonflies were added as decoration.

Water drops were added to the lotus.

Post-production
A simple background and smoke were added, and the apsaras were made to look as if they were flying naturally. Finally, the whole image was color-graded, with local adjustments to shading and contrast, until I was satisfied with it.
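The final contrast and tonal pass boils down to a simple curve. This sketch shows the math behind a Photoshop-style contrast/brightness adjustment as my own illustration; the values are assumptions, not the author's actual grade:

```python
def adjust(value, contrast=1.0, brightness=0.0):
    """Pivot contrast around mid-grey, add brightness, clamp to [0, 1]."""
    out = (value - 0.5) * contrast + 0.5 + brightness
    return max(0.0, min(1.0, out))

# Hypothetical grade: +20% contrast deepens shadows and lifts highlights
shadows = adjust(0.2, contrast=1.2)     # ~0.14, darker
highlights = adjust(0.8, contrast=1.2)  # ~0.86, brighter
```

Local shading and contrast adjustments apply the same curve through a mask, so only the selected region is affected.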
