Create flame effects with Phoenix FD in 3ds Max


Phoenix FD 2.2 is a powerful and full-featured fluid simulation plug-in available for both Maya and 3ds Max. This article walks through building a basic flame effect. The workflow uses Particle Flow as the flame source, then tunes the dynamics parameters and the exposure settings; with these steps, Phoenix FD can easily produce a realistic flame.

[Screenshot: the finished flame effect]

Overall settings

The scene setup is very simple. We need a PHXSimulator as the simulation volume, a PHXSource helper, and a Particle Flow system as the source of the flame.

Preparation

Scene units have a great influence on fluid simulation! Make sure your flame is not simulated at an unreasonable scale (for example, 100 km x 100 km). If you want exactly the same results as this tutorial, I suggest you use my scene unit settings, or download the final scene file for this tutorial.
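If you prefer to set the units by script, here is a minimal MAXScript sketch using the standard 3ds Max units interface; the centimeter values are an illustrative assumption, so match them to the settings shown in the screenshots below.

    units.SystemType = #Centimeters   -- system unit type (assumed for illustration)
    units.SystemScale = 1.0           -- 1 system unit = 1 cm
    units.DisplayType = #Metric       -- display metric units in the UI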

[Screenshots: scene unit settings]

Particle Flow

Follow the screenshots below to create the Particle Flow system that drives the flame. The basic idea is to emit many small sphere particles that move toward the center, mimicking the dynamic motion of a fire.

[Screenshots: Particle Flow event setup]

Go to Create panel > Helpers > PhoenixFD and drag in the viewport to create a PHXSource helper. Add the Particle Flow event to it: PF Source > Event 002 (as the source of the flame). Follow the parameters highlighted in the screenshot. Setting the Velocity value to 4.0 gives the flame a jet-like spray effect; reduce it a little if that is too strong for your needs.
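For reference, the same step could be scripted roughly as below. PHXSource is the helper class the plug-in registers, but the emitter and velocity property names here are assumptions based on the UI labels, not verified Phoenix FD 2.2 API names, so list the real names with showProperties before setting anything.

    src = PHXSource()                   -- create the Phoenix FD source helper
    showProperties src                  -- print the actual parameter names
    -- Assuming properties matching the UI labels exist:
    -- src.node = $'PF Source 001'      -- hypothetical: the Particle Flow emitter
    -- src.velocity = 4.0               -- hypothetical: the Velocity value above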

[Screenshot: PHXSource settings]

Fluid dynamics parameters

Create the PhoenixFD simulation grid (PHXSimulator)

Go to Create panel > Geometry, open the drop-down menu, and select PhoenixFD.

Press the PHXSimulator button and drag in the viewport to create the simulation grid. The grid must fully contain your flames. By default, PhoenixFD takes all scene objects and PHXSource helpers into account, so unlike FumeFX you don't need to add objects to the simulation manually.
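A minimal MAXScript sketch of this step, under the assumption that the grid can be created with default properties and then resized to contain the flame:

    sim = PHXSimulator pos:[0,0,0]   -- create the simulation grid at the origin
    showProperties sim               -- inspect the grid and dynamics parameter names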

[Screenshot: PHXSimulator grid in the viewport]

The important parameters are marked with a highlighter. The Cooling parameter is equivalent to FumeFX's Burn Rate; increase it slightly, otherwise the simulation overheats easily. For energy conservation, you can use either "Symmetric" or "Smooth". For Material Transfer, I use the "Multi-Pass" method, which produces more detail. The default Gravity tends to stretch the flame too long, so I lowered it to 0.5. Enabling burning makes the overall simulation more realistic.
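Scripted, those tweaks would look roughly like the sketch below. Every property name (and the Cooling value) is a hypothetical guess mirroring the UI labels; Phoenix FD's real MAXScript names may differ, so verify them with showProperties before uncommenting.

    sim = $PHXSimulator001          -- the simulator node (default name assumed)
    -- sim.cooling = 1.2            -- hypothetical: raise Cooling slightly
    -- sim.gravity = 0.5            -- hypothetical: lower Gravity to 0.5
    -- sim.burning = true           -- hypothetical: Enable Burning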

[Screenshot: fluid dynamics parameters]

Preset preview effect | Adjusted preview effect | Final rendering effect

The default preview (left) is not accurate enough and differs greatly from the final render (right). In PHXSimulator > Preview, raising "As Fire" from 1000.0 to 1400.0 brings the preview (middle) much closer to the final flame.

Colors and rendering

[Screenshot: default Colors and Transparency settings]

The results of PhoenixFD's preset "Colors and Transparency" are rough and unrealistic (above). I find adjusting the colors and curves very unintuitive, and it is difficult to dial in the desired effect (compared to FumeFX). Fortunately, PhoenixFD can load render settings from other scenes: open the sample file "burning plane.max" that ships with the software, save its rendering settings as a *.apr file, then load that file in your current scene to get good color and curve settings directly.

[Screenshots: gradient and curve settings loaded from the .apr file]

After loading the appropriate *.apr file, you can use the gradient and curves shown above directly; they are well suited to flame rendering.

[Screenshot: Step (%) parameter]

To speed up rendering, you can increase the Step (%) parameter to 15, which strikes a good balance between speed and quality.

Exposure control

[Screenshot: Logarithmic exposure control settings]

PhoenixFD's default render output is prone to overexposure (possibly because of its physically based attributes), which loses detail. Applying Logarithmic exposure control under Environment and Effects solves the overexposure problem. Press the Render Preview button and adjust the Contrast and Brightness values until you are satisfied. You can also use the Color Correction option to fix color-temperature problems.
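This step can also be done with the standard 3ds Max exposure API in MAXScript (Logarithmic_Exposure_Control and SceneExposureControl are built-in); only the numeric values below are illustrative starting points, to be tuned with Render Preview as described above.

    ec = Logarithmic_Exposure_Control()
    ec.brightness = 60.0              -- default is 65.0; lower to recover highlights
    ec.contrast = 55.0                -- default is 50.0; raise for more punch
    ec.colorCorrection = true         -- enable the Color Correction option
    ec.color = color 255 244 230      -- illustrative warm correction color
    SceneExposureControl.exposureControl = ec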

Final effect

[Image: final flame animation]

The above is the final animation for this tutorial. I hope you enjoyed it; now go and create your own flame effects with PhoenixFD!
