The VR Rigging Tool Masterpiece Motion Has Been Released

As one of the world's best-known collaborative art creation tools and art communities, Masterpiece VR gives users worldwide the ability to easily create 3D content in virtual reality. Now the team has taken practical action to solve a familiar problem: whether you are a newcomer or a practicing artist with some experience, rigging, skinning, and posing a model takes considerable time.
Recently, Masterpiece VR released a new VR rigging tool, Masterpiece Motion. This "motion creativity" software is billed as the first professional VR software of its kind for content creation, providing a convenient way to rig models for animation and games, and it works with any 3D model. Click on the video to get an idea.

Users can draw bones in a 3D environment, import an existing rig, or try the software's auto-rig feature. After the rig is created, skin weights can be painted and adjusted in VR, and character poses can be modified on the rig directly in the VR environment.

The rigging process is simpler

Drawing bones in a virtual 3D environment is a convenient and simple process. Import a pre-rigged model or use the auto-rig feature to speed up this step.
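Masterpiece Motion does not publish its internals, but "drawing bones" in any rigging tool amounts to building a parent/child skeleton hierarchy. A minimal sketch in Python, with purely illustrative bone names and translation-only transforms:

```python
class Bone:
    """A single joint in a rig: a name, an optional parent, and a local offset."""
    def __init__(self, name, parent=None, local_offset=(0.0, 0.0, 0.0)):
        self.name = name
        self.parent = parent
        self.local_offset = local_offset

    def world_position(self):
        """Compose offsets up the parent chain to get the bone's world position."""
        if self.parent is None:
            return self.local_offset
        px, py, pz = self.parent.world_position()
        lx, ly, lz = self.local_offset
        return (px + lx, py + ly, pz + lz)

# Drawing a bone from an existing joint creates a parent/child link:
root = Bone("pelvis")
spine = Bone("spine", parent=root, local_offset=(0.0, 0.5, 0.0))
head = Bone("head", parent=spine, local_offset=(0.0, 0.6, 0.0))
print(head.world_position())  # head sits 0.5 + 0.6 = 1.1 units above the pelvis
```

Moving a parent bone moves every descendant with it, which is why posing a rig is so much faster than moving mesh vertices directly.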
The skinning process is more convenient

Connect the rig to the mesh for smooth motion, resize the tool to control the skin weights, and move the bones to preview the pose while editing the skin.
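The article does not say how Masterpiece Motion deforms the mesh, but skin weights in most tools drive linear blend skinning: each vertex follows a weighted average of the transforms of the bones that influence it. A minimal sketch, with bone transforms reduced to translations for brevity and all values illustrative:

```python
def skin_vertex(rest_position, influences):
    """Linear blend skinning (translation-only for brevity): the deformed
    position is the rest position plus the weight-blended translation of
    every influencing bone. Weights are assumed to sum to 1."""
    x, y, z = rest_position
    dx = dy = dz = 0.0
    for weight, (tx, ty, tz) in influences:
        dx += weight * tx
        dy += weight * ty
        dz += weight * tz
    return (x + dx, y + dy, z + dz)

# A vertex near the elbow, influenced 70/30 by two bones:
deformed = skin_vertex(
    (1.0, 0.0, 0.0),
    [(0.7, (0.0, 0.2, 0.0)),   # upper-arm bone moved up by 0.2
     (0.3, (0.0, 1.0, 0.0))],  # forearm bone moved up by 1.0
)
print(deformed)  # y rises by 0.7*0.2 + 0.3*1.0 = 0.44
```

Painting skin weights in VR is simply editing those per-vertex weight lists, which is why moving a bone immediately previews the resulting deformation.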
The posing process is more intuitive

By grabbing and moving the bones of a 3D model, you can easily create dynamic poses for characters, creatures, and more, then save and export multiple poses.
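Masterpiece Motion's export format is not documented in this article, but in most pipelines a saved pose is just a snapshot of per-bone transforms under a name. A hypothetical sketch of a multi-pose export, using an invented JSON layout and illustrative bone names:

```python
import json

# A pose library: pose name -> bone name -> Euler rotation in degrees.
# This layout is an assumption for illustration, not Masterpiece Motion's format.
pose_library = {
    "wave": {"shoulder_R": (0.0, 0.0, 80.0), "elbow_R": (0.0, 0.0, 45.0)},
    "rest": {"shoulder_R": (0.0, 0.0, 0.0), "elbow_R": (0.0, 0.0, 0.0)},
}

def export_poses(poses, path):
    """Save multiple named poses to a single JSON file."""
    with open(path, "w") as f:
        json.dump(poses, f, indent=2)

export_poses(pose_library, "poses.json")
```

Storing poses as data rather than baked meshes keeps them reusable: the same pose file can be applied to any character that shares the rig's bone names.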



