Cycles for Blender: Making Pixel Style

2020-05-27

Blender render farm

Fox Renderfarm is a powerful but affordable Blender render farm. This article is compiled from 3D artist Leroy's sharing on using Cycles for Blender to create a pixel style.

The key to the pixel style lies in the shading: extracting the required pixels from a detailed image and presenting them in a pixelated way. In Cinema 4D this kind of shading can be done quickly with MoGraph, but Leroy wanted to achieve it in Blender.

Blender's Cycles renderer is very powerful, and since Cycles can sample a texture per instance, making pixel-style work is also very simple.

The texture sampling in the material uses the From Dupli function of the UV Map node. There are two ways to build the setup: 1. instancing on the faces of a parent object, 2. instancing with particles. Of the two, particles offer more control and can easily produce more complex and varied effects.

The production steps are as follows:

Method 1

Create a new plane in the scene, scale it up 5 times, then enter Edit Mode and subdivide it three times.

Return to Object Mode and create another, smaller plane (a newly created object is selected by default). Hold Shift to also select the first plane, then choose Parent > Object from the menu, which makes the large plane the parent of the small plane.

Select the parent object and, in the Object properties of the Properties panel, set the duplication mode to Faces, check Scale, and set the scale value to 0.4, as shown in the following figure,
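For anyone who prefers scripting, here is a minimal bpy sketch of the same setup. It assumes the Blender 2.8x Python API, where the old "Duplication: Faces" options are exposed as instancing properties; the names differ slightly in 2.7x.

```python
import bpy

# Large plane: the parent grid whose faces become the "pixels"
bpy.ops.mesh.primitive_plane_add(size=2)
parent = bpy.context.active_object
parent.scale = (5, 5, 5)
bpy.ops.object.mode_set(mode='EDIT')
for _ in range(3):                        # subdivide three times
    bpy.ops.mesh.subdivide(number_cuts=1)
bpy.ops.object.mode_set(mode='OBJECT')

# Small plane: the child that will be instanced on every face
bpy.ops.mesh.primitive_plane_add(size=2)
child = bpy.context.active_object

# Parent the small plane to the large plane (large plane is the active object)
parent.select_set(True)
bpy.context.view_layer.objects.active = parent
bpy.ops.object.parent_set(type='OBJECT', keep_transform=True)

# Instance the child on the parent's faces, scaled to 0.4
parent.instance_type = 'FACES'            # "Duplication: Faces" in the 2.7x UI
parent.use_instance_faces_scale = True
parent.instance_faces_scale = 0.4
```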

Select the parent object and enter Edit Mode (everything is selected by default), then choose Unwrap from the UV menu. If the model is simple, an automatic unwrap such as Smart UV Project is enough.

The model is now nearly done, so the next part is the texture. Select the child object, click New in the material panel to create a new material, delete the default diffuse shader, and add an emission shader.

Create a new Image Texture node (load the image you want to pixelate here), add a UV Map node, check its From Dupli option, and connect it to the texture node. Switch the 3D viewport to rendered preview; the effect is as follows,
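Here is a minimal sketch of that node tree in Python. It assumes Blender 2.8x, where the old "From Dupli" checkbox on the UV Map node is exposed as `from_instancer` (it is `from_dupli` in 2.79); the image path and object name are placeholders.

```python
import bpy

mat = bpy.data.materials.new("PixelMat")
mat.use_nodes = True
nodes, links = mat.node_tree.nodes, mat.node_tree.links
nodes.clear()

out = nodes.new('ShaderNodeOutputMaterial')
emit = nodes.new('ShaderNodeEmission')                    # replaces the default diffuse
tex = nodes.new('ShaderNodeTexImage')
tex.image = bpy.data.images.load("//pixel_source.png")    # placeholder image path
uv = nodes.new('ShaderNodeUVMap')
uv.from_instancer = True                                  # the "From Dupli" option

links.new(uv.outputs['UV'], tex.inputs['Vector'])
links.new(tex.outputs['Color'], emit.inputs['Color'])
links.new(emit.outputs['Emission'], out.inputs['Surface'])

# Assign the material to the small child plane (placeholder object name)
bpy.data.objects['Plane.001'].data.materials.append(mat)
```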

Simple, isn't it?

Of course, you can also use video or animation footage as the source to create pixel animations. Next, let's look at the particle method, reusing the scene and material above.

Method 2

Select the child object and clear the parent relationship (choose Clear Parent in the pop-up menu), then set the duplication mode in the parent object's properties back to None. Select the child object and move it off to the side (press G and type 10), as shown in the figure below,

Select the large plane and, in the particle properties panel, click New to create a new particle system. Set the particle type to Hair and, in the Render tab, set rendering to Object and pick the small plane; uncheck Show Emitter, as the picture shows,

Turn on Advanced (hair settings) and set the initial orientation to None in the Rotation tab. The preview effect is as follows,

The next step is to adjust the particles: size, distribution, number, and so on. Only the important operations are shown below. The default particle distribution is random and needs to be made regular: in the emission settings, disable random order and set Particles/Face to 1, so that exactly one particle is generated on each face; finally adjust the particle size as needed.
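The particle version can be sketched the same way. This assumes the Blender 2.8x names, where the old "Dupli Object" is the Render As Object setting on the particle settings; object names are placeholders and the values are just examples.

```python
import bpy

emitter = bpy.data.objects['Plane']        # the large, subdivided plane
pixel = bpy.data.objects['Plane.001']      # the small plane carrying the pixel material

mod = emitter.modifiers.new("Pixels", type='PARTICLE_SYSTEM')
settings = mod.particle_system.settings

settings.type = 'HAIR'
settings.use_advanced_hair = True
settings.render_type = 'OBJECT'
settings.instance_object = pixel           # "Dupli Object" in 2.7x terms
settings.particle_size = 0.4               # size of each instance
settings.rotation_mode = 'NONE'            # initial orientation: None

# One particle per face, laid out regularly instead of randomly
settings.emit_from = 'FACE'
settings.distribution = 'JIT'
settings.userjit = 1                       # Particles/Face = 1
settings.use_emit_random = False
settings.count = len(emitter.data.polygons)    # match the emitter's face count

# Hide the emitter itself (where this option lives varies by Blender version)
emitter.show_instancer_for_render = False
```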

Particles are very flexible. Although the pixels in this example are all squares, after some adjustment they can be turned into triangles, quadrilaterals, circles and other shapes, and their size, rotation and distribution can also be randomized. Besides the UV Map node, the Texture Coordinate node can be used for texture sampling as well, and it has the same From Dupli option. When instancing with particles, to avoid overlapping particles it is best to set the number of emitted particles to the number of faces of the emitter mesh (visible in the info panel in Edit Mode).


Inspired by Animal Crossing: New Horizons, Korean artist uses Blender to create exclusive cabins

2020-05-21

Blender render farm

As soon as Nintendo Switch launched Animal Crossing: New Horizons, it was loved by game fans all over the world. Due to the epidemic, many people in quarantine could enjoy "traveling" through the game, which made the Switch a hit again, and many CG artists were eager to create fan art and other amazing works. Korean designer Seungho Jeong (Neon3D) was invited to talk about his latest work, "Miniature style cute character 3d artwork" (Soondol), and about the toy models he made for the animation "Molang".

Seungho Jeong (Neon3D)

Seungho Jeong, from South Korea, specializes in product design and 3D printing. Most of his work is created in Fusion 360 and Blender. He is currently designing toy models for "Molang", and he also runs a YouTube channel to share his behind-the-scenes process.

Miniature style cute character 3d artwork (Soondol) Year of completion: 2020 Software: Blender

Tell us about the artwork.

Seungho Jeong: Recently I was playing Animal Crossing: New Horizons and was attracted by the cute graphic style of the game, so I started a series of creative projects with my original character "Soon-dol" as the main character and designed a series of miniature-style houses in the spirit of Animal Crossing.

I created it with the Cycles renderer in Blender. It is worth mentioning that instead of using texture mapping, I changed the color of the objects or made some images in Illustrator and used Blender's Shrinkwrap modifier to place the image on the character's face. The most difficult part of this work was the lighting. After all, creating the feeling of natural light coming in from outside the window is a real test of a creator's lighting skills. At first I put glass in the window, but then I found it looked more natural without the glass, so I took it out.

What is your usual job?

Seungho Jeong: Mainly making character toy models. For example, the "Molang" series is currently on sale; I designed many different sets of costumes for the character, including Halloween, New Year, hanbok and other styles. My main tool is the product design software Fusion 360, which has a free-form modeling feature similar to Blender's modeling approach.

I also made an introduction video for the company's YouTube channel "Behind Molang". First, I redesigned all the 2D characters in 3D. Since our company is not an animation studio but a character merchandise company, most of the 3D character work follows the process of making physical figures, mostly in Fusion 360 (some are designed directly in Blender). When creating the images in 3D, as mentioned above, instead of using texture mapping we change the color of the objects or use Illustrator to create the images, following the same process used for making dolls and figure models.

Is animation different from product design?

Seungho Jeong: Making animation is very difficult; after all, it is not my specialty. Although product production and animation look very similar, there are still some differences between the two pipelines. I had to teach myself to complete the animation, so I simplified the character movement as much as possible.

Can you share your creative tips?

Seungho Jeong: If you want to create hard-surface models, Fusion 360 is recommended; it can model to exact dimensions, which is very convenient. In addition, in Blender the Shrinkwrap modifier can be used to attach an image to a surface without any texture painting.
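As a rough illustration of that tip, here is a minimal bpy sketch (Blender 2.8x API assumed, object names are placeholders): a small plane carrying the Illustrator image is wrapped onto the character's head with a Shrinkwrap modifier instead of painting a texture.

```python
import bpy

decal = bpy.data.objects['FacePlane']   # plane whose material holds the face image
head = bpy.data.objects['Head']         # the character mesh to project onto

mod = decal.modifiers.new("StickToFace", type='SHRINKWRAP')
mod.target = head
mod.wrap_method = 'PROJECT'             # project the plane along its normal
mod.use_negative_direction = True
mod.offset = 0.001                      # small offset to avoid z-fighting
```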

Who is your favorite artist?

Seungho Jeong: Absolutely my boss Hye-Ji Yoon, the character designer of "Molang". Molang's animation is currently streaming on Netflix and is getting attention from audiences worldwide! I respect my boss very much and want to become a well-known creator too.

Seungho Jeong's Artstation: https://www.artstation.com/neon3d


Blender Creations: How to Sprinkle Some Fun in Characters

2020-03-16

Blender render farm

Talking with prize-winning 3D artists

  • How to absorb inspiration from life and other forms of art
  • How to make scattered sources of inspiration into a coherent character
  • How to realize your ideas and deal with technical problems in 3D software

Graveyard by Andrey Agafonov, inspired by Coco and Lancelot Brown

Exclusive Interview → A Self-Taught Creator Realized His Unique Idea in Blender

The Alchemist by Peshang Ahmed, inspired by Yoda and Spirited Away

Exclusive Interview → Creating an Alien Alchemist Inspired by Yoda and Spirited Away

Street Musician Reindeer by Dante Resendez Delgado, inspired by street singers in Vancouver

Exclusive Interview → Street Musician Reindeer Made in Blender: Sprinkle Some Fun in Character

How to Improve Your Techniques and Creativity

In interviews with numerous outstanding 3D artists who have won prizes across every kind of 3D creation, participating in competitions is what they all mention when talking about ways to improve.

Why not do it now? Join the FGT3D challenge themed 'Easter Egg', practice what you've learned, and win some decent prizes!

SUBMIT NOW

Send your artwork to FGT3D@foxrenderfarm.com with your name and/or the studio's name.

OR

Join the CG & VFX Artist Facebook group,

post your artwork in the group with tags FGT3D and FGT3DEasterEgg2020


How to Render High-quality Images in Blender

2020-03-09

Blender render farm

With the development of computer technology, it has been widely applied in graphic design, giving ordinary people the chance to come into contact with computer graphics, or "CG" technology. Computer-generated imagery, 3D animation produced on a computer, is known as "CGI". Whether it is Toy Story, The Lion King or 1917, the success of these world-class blockbusters is inseparable from building models in 3D software and then moving on to visual effects, compositing, editing and more in post-production.

Image via blendercn.org

As a rising star among 3D packages, Blender integrates modeling, sculpting, rigging, particles, animation and more, and it is free for commercial creation forever. So how do we use Blender to output high-quality images?

Blender has 2 renderers that can convert a 3D scene into a 2D image.

  • Eevee is a physically based real-time renderer.
  • Cycles is a physically-based ray-tracing renderer.

Plugins can be used to add further third-party render engines. Each renderer has its own settings to control rendering quality and performance, and the final look is determined by the camera, lighting, and materials, which together determine the quality of the output images.

Image via blendercn.org

Here we take the Cycles renderer that ships with Blender as an example. As a render engine that can also run on the GPU, its results have become more and more mature, and GPU rendering is much faster than CPU rendering. So which settings affect the quality of the output image?

Sampling

The sampling settings determine how the light is calculated. Rays are traced from the camera into the scene and bounce around until they find a light source, such as a lamp, an emissive object, or the ambient background light. The sample count is the number of light paths traced per pixel in the final render: the more samples, the less noise in the result and the more accurate it will be.

Path tracing

The Render value is the number of rays traced from the camera for each pixel in the final render; the Viewport value is used when rendering the scene in the viewport. A value of 32 means that 32 rays are traced for each pixel.

Noise

Noise in the image greatly affects its quality, and 32 rays per pixel are not enough to produce a high-quality image. Reducing noise requires more samples, and render time grows with the sample count, so set the value according to your hardware; use at least 1024 if you want a good result.

Rendering properties

The render sample count greatly affects quality: raise it as high as your machine allows for the final render, while keeping the viewport value at around 32 so previews stay responsive.

Light Paths

This is the number of bounces traced for each ray. The difference between 0 bounces and the maximum is very large.
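As a concrete illustration, here is a small script, a sketch assuming the Blender 2.8x Cycles Python properties, that applies the kind of values discussed above: 32 viewport samples, 1024 render samples, and non-zero bounce counts (the bounce numbers are just examples).

```python
import bpy

scene = bpy.context.scene
scene.render.engine = 'CYCLES'

# Sampling: keep the viewport light, push the final render higher
scene.cycles.preview_samples = 32      # viewport preview
scene.cycles.samples = 1024            # final render; more samples = less noise

# Light paths: number of bounces traced per ray
scene.cycles.max_bounces = 12
scene.cycles.diffuse_bounces = 4
scene.cycles.glossy_bounces = 4
scene.cycles.transmission_bounces = 8
```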

Camera Properties

Camera framing: the image needs an appropriate aspect ratio and resolution, and the focal length should be adjusted to the content of the shot. The right framing shows off the depth-of-field details.

Depth of field: the choice of lens affects the depth-of-field detail. The depth of field determines at which distances objects are displayed sharply.

Aperture radius: controls how strong the blur is. The larger the radius, the shallower the depth of field, and the more blurred objects away from the focus point become.

Focus: the focus needs to be set. Make the important object in the image the focus point and adjust the aperture radius so that the right object is presented sharply to the viewer.
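A sketch of those camera settings in Python, assuming the Blender 2.81+ camera depth-of-field properties (the names differ in older versions); the focus object name is a placeholder.

```python
import bpy

cam = bpy.context.scene.camera.data
cam.lens = 50                                    # focal length in mm

cam.dof.use_dof = True
cam.dof.focus_object = bpy.data.objects['Hero']  # keep the important object sharp
cam.dof.aperture_fstop = 2.8                     # lower f-stop = shallower depth of field
```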

If you want to improve the quality of your images, try some of the suggestions above. Image quality is not something that improves just by reading an article; continuous learning and practice is the best way.

Cycles is undoubtedly a young renderer; Blender keeps improving its features, and it has plenty of room to grow. As your TPN-accredited cloud render farm, Fox Renderfarm supports most mainstream 3D software, renderers, and plugins, including Blender, 3ds Max, Maya, Cinema 4D, Clarisse, and more. We always provide fast, secure and affordable cloud rendering services to reduce your rendering time. Welcome to the Blender render farm to get a $25 free trial.


Which hardware helps Blender reduce rendering time?

2019-11-22

Blender render farm

What hardware should you use to reduce rendering time in Blender? Rendering takes a lot of time in Blender production, and while many factors affect render time, which hardware can actually speed it up? Properly configured hardware increases efficiency, and you can choose different hardware for different stages of production.

3D design and production can roughly be divided into several major steps: modeling, UV mapping, texturing and shaders, rigging, animation, lighting, render settings, and compositing. The last step, the final rendering, matters most for the finished product, and it is also the step where a render farm is usually used.

Professional graphics cards are optimized for professional 3D design software, so their display is more accurate. They make previews more realistic, improve modeling speed and the display of materials in 3D, and greatly reduce the load on the CPU during production. For example, when moving a complex model, a gaming graphics card may skip and stutter, resulting in inaccurate movement; you may try many times and still fail to move the object to the exact location. Whether the display is correct and whether operations are smooth and stable greatly affects production efficiency, and even the mood of the artist.

In the rendering step, a renderer such as Cycles can use the CPU or the GPU for its calculations. Rendering speed does not depend on whether the graphics card is a "professional" model; it depends mainly on the processing power of the CPU or GPU, and secondarily on memory. For the final render, a professional card and a gaming card make little difference (if anything, only in how the picture and colors are displayed). What affects rendering speed most is the CPU or GPU; Cycles in particular benefits greatly from the GPU, a strong GPU renders images faster than a strong CPU, and more graphics cards can be added to a machine.
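For example, switching Cycles to GPU rendering can be done in the preferences UI or with a small script; this sketch assumes Blender 2.8x and a CUDA-capable card.

```python
import bpy

prefs = bpy.context.preferences.addons['cycles'].preferences
prefs.compute_device_type = 'CUDA'   # or 'OPTIX' / 'OPENCL' depending on the card
prefs.get_devices()                  # refresh the device list
for device in prefs.devices:
    device.use = True                # enable every detected compute device

bpy.context.scene.render.engine = 'CYCLES'
bpy.context.scene.cycles.device = 'GPU'
```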

Many people also think that memory affects rendering speed, but this is not quite right: as long as there is enough memory, it has little impact. How much is enough? Opening a scene file takes a lot of memory, and the more polygons in the scene, the more memory is needed. When memory runs short there are serious slowdowns, because disk space is used as virtual memory. Once the memory covers what the scene file requires, rendering needs little beyond that.

In summary, a high-performance machine has a great impact on production and rendering speed, and a reasonable hardware configuration together with proper software settings brings out the best work.


Blender Tutorial: To Create A Wild Jungle(2)

2019-09-17

Blender render farm

In the previous article, "Blender Tutorial: To Create A Wild Jungle (1)", we showed the production of the scene model in Blender; now let's start the texturing part. I hope it helps your production process. Thanks again to the author of this article, 3D artist Leo Lee.

Texture

For the material shader you can choose the Principled BSDF, a very useful shader in Blender that supports the commonly used texture-map parameters. Handle the UVs before mapping: switch to Edit Mode, select all faces, and choose Unwrap > Smart UV Project in the UV menu. Since the model here is irregular, there is no need to process the UVs manually. Then add the texture nodes: select Image Texture and import the corresponding texture map.

For the Base Color you can choose a ground texture that looks moist. Connect it to Base Color through a Multiply (MixRGB) node to make the color darker.

Then add the Roughness map, again through a Multiply node to control the strength of the reflections. Next import the Normal map and the Height map in turn and link them to the Principled BSDF's Normal input through a Bump node. The scale of the texture can be controlled by adding a Mapping node, which adjusts the angle, position, and size of the texture.
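For reference, here is a minimal Python sketch of that node tree. It assumes Blender 2.8x node names; the image paths are placeholders, and the normal map is run through a Normal Map node before the Bump node.

```python
import bpy

mat = bpy.data.materials.new("WetGround")
mat.use_nodes = True
nodes, links = mat.node_tree.nodes, mat.node_tree.links
principled = nodes['Principled BSDF']

def image_node(path, non_color=False):
    """Add an Image Texture node and load a map (placeholder paths)."""
    node = nodes.new('ShaderNodeTexImage')
    node.image = bpy.data.images.load(path)
    if non_color:
        node.image.colorspace_settings.name = 'Non-Color'
    return node

base = image_node("//ground_color.png")
rough = image_node("//ground_rough.png", non_color=True)
height = image_node("//ground_height.png", non_color=True)
nrm = image_node("//ground_normal.png", non_color=True)

# Darken the base color with a Multiply MixRGB node
mult = nodes.new('ShaderNodeMixRGB')
mult.blend_type = 'MULTIPLY'
mult.inputs['Fac'].default_value = 1.0
mult.inputs['Color2'].default_value = (0.5, 0.5, 0.5, 1.0)
links.new(base.outputs['Color'], mult.inputs['Color1'])
links.new(mult.outputs['Color'], principled.inputs['Base Color'])
links.new(rough.outputs['Color'], principled.inputs['Roughness'])

# Normal and height maps feed the Principled normal input through a Bump node
normal_map = nodes.new('ShaderNodeNormalMap')
bump = nodes.new('ShaderNodeBump')
links.new(nrm.outputs['Color'], normal_map.inputs['Color'])
links.new(normal_map.outputs['Normal'], bump.inputs['Normal'])
links.new(height.outputs['Color'], bump.inputs['Height'])
links.new(bump.outputs['Normal'], principled.inputs['Normal'])

# A Mapping node (fed by UV coordinates) controls texture position, rotation and scale
mapping = nodes.new('ShaderNodeMapping')
coords = nodes.new('ShaderNodeTexCoord')
links.new(coords.outputs['UV'], mapping.inputs['Vector'])
for tex in (base, rough, height, nrm):
    links.new(mapping.outputs['Vector'], tex.inputs['Vector'])
```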

To add some variation to the surface, another grass-like texture is added. The goal is to apply the grass material to the raised parts of the terrain and the wet-ground material to the recessed parts, so the ground looks more real.

So how do you mix two different materials according to the undulation of the surface? The node needed here is called Geometry. Take its Pointiness output and adjust the black-and-white transition with a Color Ramp so that the convex parts become white and the concave parts black; this distinguishes raised and recessed areas from the geometry, and the two materials are then blended with a Mix Shader.
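A sketch of that mixing setup in Python, assuming the Pointiness output of Cycles' Geometry node is what drives the convex/concave mask; "WetGround" and the second Principled shader stand in for the two existing materials.

```python
import bpy

mat = bpy.data.materials['WetGround']      # material from the previous step (placeholder name)
nodes, links = mat.node_tree.nodes, mat.node_tree.links

geo = nodes.new('ShaderNodeNewGeometry')
ramp = nodes.new('ShaderNodeValToRGB')     # Color Ramp to tighten the black/white transition
ramp.color_ramp.elements[0].position = 0.45
ramp.color_ramp.elements[1].position = 0.55

ground_bsdf = nodes['Principled BSDF']                 # recessed, wet ground
grass_bsdf = nodes.new('ShaderNodeBsdfPrincipled')     # raised, grassy parts
mix = nodes.new('ShaderNodeMixShader')
output = nodes['Material Output']

links.new(geo.outputs['Pointiness'], ramp.inputs['Fac'])
links.new(ramp.outputs['Color'], mix.inputs['Fac'])    # white (convex) picks the grass
links.new(ground_bsdf.outputs['BSDF'], mix.inputs[1])
links.new(grass_bsdf.outputs['BSDF'], mix.inputs[2])
links.new(mix.outputs['Shader'], output.inputs['Surface'])
```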

Then use Node Wrangler to preview the Mix Shader output and check the blended result, and attach materials to the other objects in the same way.

What is needed here is material transparency. First select a PNG or TGA texture with an alpha channel in the Image Texture node, then mix a Transparent BSDF with the Diffuse BSDF through a Mix Shader driven by the texture's Alpha output; this hides the areas that are transparent in the texture.
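A sketch of that alpha-transparency setup, assuming Blender 2.8x node names and a placeholder PNG path:

```python
import bpy

mat = bpy.data.materials.new("PlantAlpha")
mat.use_nodes = True
nodes, links = mat.node_tree.nodes, mat.node_tree.links
nodes.remove(nodes['Principled BSDF'])
output = nodes['Material Output']

tex = nodes.new('ShaderNodeTexImage')
tex.image = bpy.data.images.load("//leaf.png")       # PNG/TGA with an alpha channel

diffuse = nodes.new('ShaderNodeBsdfDiffuse')
transparent = nodes.new('ShaderNodeBsdfTransparent')
mix = nodes.new('ShaderNodeMixShader')

links.new(tex.outputs['Color'], diffuse.inputs['Color'])
links.new(tex.outputs['Alpha'], mix.inputs['Fac'])      # alpha drives the mix
links.new(transparent.outputs['BSDF'], mix.inputs[1])   # alpha = 0 -> transparent
links.new(diffuse.outputs['BSDF'], mix.inputs[2])       # alpha = 1 -> leaf
links.new(mix.outputs['Shader'], output.inputs['Surface'])

mat.blend_method = 'CLIP'   # needed for Eevee viewport transparency
```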

The leaves of the tree are given a material in the same way. If the leaves and the trunk are in the same object, the trunk and the leaves can be separated in Edit Mode with the Separate function, and the new material then assigned to the selected area.

In the same way, small plants are also made of transparent materials.

Rendering

Before rendering you need to adjust a few parameters. Choose the GPU rendering mode, which is faster than the CPU. Then set the output resolution and raise the render percentage to 100% for full detail (the default is 50%). Enable Transparent in the Film section below so that the background is transparent after rendering.
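Those render settings as a short script, a sketch assuming Blender 2.8x property names (resolution values are just examples):

```python
import bpy

scene = bpy.context.scene
scene.render.engine = 'CYCLES'
scene.cycles.device = 'GPU'                 # GPU rendering, faster than CPU here

scene.render.resolution_x = 1920
scene.render.resolution_y = 1080
scene.render.resolution_percentage = 100    # default is 50%

scene.render.film_transparent = True        # transparent background after rendering
```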

If you want to blur the foreground, select the camera and set the focus distance in its depth-of-field settings, which better conveys the sense of space in the picture.

Finally, some effects are applied to the rendered image, similar to compositing in a film and VFX workflow: fog and sun-beam effects are layered in and the colors adjusted according to the space of the scene.

Article translated from: https://www.weibo.com/ttarticle/p/show?id=2309404294284563106633


Blender Tutorial: To Create A Wild Jungle(1)

2019-09-11

Blender render farm

Blender is a powerful, free and open-source application with clear advantages over similar software, especially in 3D scene construction, rendering, and effects. As the leading cloud rendering service provider in the CG industry, Fox Renderfarm supports Blender rendering, 3ds Max rendering, Maya rendering, C4D rendering, Clarisse rendering, SketchUp rendering, and more. This article, written by 3D artist Leo Lee, shows the potential of Blender for quickly creating scenes and rendering textures, and shares some experience that will hopefully help your production process. Let's learn how to create a wild jungle in Blender.

Add a plane to the view (it can be fairly large) and switch to Sculpt Mode. This works like familiar sculpting software such as ZBrush, where different brushes can be chosen to work the model. Enable Dyntopo in the menu (so there is no need to subdivide the model manually), select 'Constant' for the detail mode, and enable smooth shading.

As shown in the figure below, as the model is sculpted, the system automatically subdivides the faces.

Here are a few other ways to create terrain. Use Subdivide to subdivide the surface, for example into 50 to 100 cuts. Then drive it with a black-and-white image through Blender's Displace modifier, similar to a height map; the image can be hand-painted in Photoshop or downloaded. Click to add a texture and import the image.

The strength of the terrain undulation can be controlled by adjusting the parameters. Then add a Subdivision Surface modifier to smooth the surface. In addition, if you want to adjust an area of the terrain by hand, you can activate proportional editing, adjust the falloff radius with the mouse wheel, and drag a selected point.
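The same terrain setup can be sketched in Python; this assumes Blender 2.8x and a placeholder height-map path, with the subdivision count and strength chosen arbitrarily.

```python
import bpy

# Ground plane, subdivided into a dense grid
bpy.ops.mesh.primitive_plane_add(size=20)
ground = bpy.context.active_object
bpy.ops.object.mode_set(mode='EDIT')
bpy.ops.mesh.subdivide(number_cuts=80)       # roughly the 50-100 range mentioned above
bpy.ops.object.mode_set(mode='OBJECT')

# Displace modifier driven by a black-and-white height map
tex = bpy.data.textures.new("Heightmap", type='IMAGE')
tex.image = bpy.data.images.load("//terrain_height.png")   # placeholder path

disp = ground.modifiers.new("Terrain", type='DISPLACE')
disp.texture = tex
disp.strength = 2.0                          # controls how strong the undulation is

# Subdivision Surface modifier to smooth the result
subsurf = ground.modifiers.new("Smooth", type='SUBSURF')
subsurf.levels = 2
subsurf.render_levels = 2
```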

Here is a finished terrain. The next step is to add some decorations to the terrain, such as trees, grass, stones, and medium-sized plants, to create a more realistic picture.

Several tree models are imported into the scene, as shown below. They can be brought in via 'Append' or 'Import': Append can only add content from other Blender files, while Import can bring in commonly used 3D formats such as 3DS, FBX or OBJ.

Previously, the terrain and trees were placed on separate layers. Blender's layer function manages models very effectively, much like layers in Photoshop. Select a model and press 'M' to move it to any layer, pressing the number keys to choose the target layer. Hold Shift to select multiple layers at the same time and display their contents together.

The imported trees are arranged according to the space and the composition, and then a position is chosen for the camera, as shown below,

Below are the layers of stones and grasses and small and medium plants.

Because the trees are large and limited in number, they can be placed manually. But for the many small objects, you can use the particle system to let Blender place them randomly within the range we define.

As shown in the figure below, first switch to Weight Paint mode. You can lower the brush weight first, then paint the area of the terrain where the small objects should be placed: blue areas stay empty, and the color transitions through yellow toward the densest areas.

Then switch to "Particle System" in the menu on the right, click the Add button, here is similar to the step of adding material to the object, and the material here is the object that needs to be placed, the first layer I chose the grass, can pass the view The button hides other objects. Activate the data indicated by the arrow in turn.

Each type of object must first be grouped (Ctrl + G) into its own group. Then choose the grass group for Dupli Group. The values in the red box are random parameters for adjusting how the objects are placed, such as orientation, size, and rotation angle. To control density and length more precisely, you can also select the corresponding vertex group in turn.
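A sketch of such a scattering setup in Python. Note that this uses Blender 2.8x naming, where the old "Dupli Group" has become Render As Collection with an instance collection; the terrain, collection, and vertex-group names are placeholders and the counts are examples.

```python
import bpy

ground = bpy.data.objects['Terrain']             # the terrain mesh (placeholder name)
grass = bpy.data.collections['GrassGroup']       # the grouped grass objects

mod = ground.modifiers.new("GrassScatter", type='PARTICLE_SYSTEM')
psys = mod.particle_system
settings = psys.settings

settings.type = 'HAIR'
settings.use_advanced_hair = True
settings.render_type = 'COLLECTION'              # "Dupli Group" in 2.7x terms
settings.instance_collection = grass
settings.count = 2000
settings.particle_size = 0.3
settings.size_random = 0.5                       # random size variation
settings.use_rotations = True
settings.rotation_factor_random = 0.5            # random rotation

# Limit emission to the weight-painted area (placeholder vertex group name)
psys.vertex_group_density = 'ScatterArea'
```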

In the same way, the stones and the other vegetation are brushed in, as shown below.

After the addition of plants and ground, the contents of the camera lens are richer.


Made By You, Blender 2.80 Officially Released

2019-08-01

Blender

After four years of development, Blender 2.80 has officially been released! Blender is an open-source, cross-platform, all-round 3D animation application that covers the whole short-film pipeline, from modeling, animation, materials and rendering to audio processing and video editing. The 2.80 update brings many new features: a redesigned user interface, the new physically based real-time render engine Eevee (usable both for final output and as the real-time viewport engine), and more.

Blender 2.8, made by you, a brand new start

Blender 2.80 reworks the user interface to make it cleaner and more consistent, making tools, gizmos and more easier to use. Blender designed a new dark theme and modern icons to keep the focus on your artwork, and the icons follow the theme so they remain readable under a variety of themes.

The workspace gives you quick access to things like sculpting, texture drawing or motion tracking. Click on the tabs at the top of the window to change the usage, or customize your workspace to make your work more efficient.

Sculpting workspace

Modeling workspace

Animation workspace

Clip coloring workspace

Version 2.80 has more advanced 3D viewport display capabilities, with more varied and flexible display modes. The new Workbench render engine makes it easier to get your work done.

Viewport shading examples: studio lighting with random object colors, cavity and shadow; wireframe with random colors; LookDev shading with textures

Real-time rendering engine Eevee

Eevee has many advanced features, such as volumetrics, screen-space refraction, subsurface scattering, soft and contact shadows, depth of field, camera motion blur and bloom.

Eevee's material nodes are the same as Cycles, so it's easy to render existing scenes. For Cycles users, Eevee can also be used as a live preview.

2D Animation

Version 2.80 greatly enhances 2D drawing and animation. The new Grease Pencil is friendlier to 2D artists, and having 2D and 3D capabilities side by side makes your creations easier.

Grease Pencil is more than just a drawing tool, it is highly integrated with existing object selection, editing, management and linking tools, and has powerful deformers and brush tools.

Principled Hair BSDF: rendering physically based hair is easier, without the need to set up complex shader networks.

Subdivision and micro-displacement

Cryptomatte: this feature saves the trouble of setting object and material IDs for compositing. You only need to enable it before rendering and then pick the desired selection during compositing.

Random walk subsurface scattering

 For more information and downloads, please visit https://www.blender.org/

The leading cloud rendering service provider, Fox Renderfarm, supports Blender. Welcome to use the platform and get a $20 free trial.


How To Create An Old Camera In Blender?


2019-04-16

Blender render farm




This article shows you how to create an old camera in Blender. The production process refers to an article by artist Christian Wachter, and introduces some new steps and complementary production techniques.
  1. Modeling: The camera lens is a curved surface, but a dome built from a UV sphere will cause artifacts after subdivision because of its triangular faces. A cleaner method is to start from a cube: first add a Subdivision Surface modifier and cast it to a sphere, then scale it along one axis to get a lens surface with perfect topology.

Complex parts: parts with complex structures and small parts are generally not subdivided; beveling them with enough precision is sufficient, which saves computer resources and processing time.

  2. UV: Some next-generation game production practices are used here, such as multiple objects sharing one UV set. That way, whether you export the model to third-party software to make the textures or import the finished textures back afterwards, managing the texture materials in Blender stays simple.
  3. Texture: Substance Painter was chosen for texturing. When painting the edge wear, in addition to SP's built-in procedural wear, some heavier wear is painted by hand on the sharp parts of the model, and a gradient of wear is defined for the gear-adjustment knob.
  4. Material: What is worth noting in the material is that, because the camera lens has an anti-reflective coating, its reflections are blue-violet (top half of the picture below). In Blender I tried to simulate a dispersive material by layering red, green, and blue glossy shaders, but the result was unsatisfactory and the render times were extremely long (lower part of the figure below). In the end, a Fresnel node is used as the basis of a gradient, changed to a blue-cyan ramp, and mixed onto the glossy surface to fake the lens coating; see the node sketch after this list.
  5. Compositing: After rendering, the defocus is blended in Blender's compositor, as shown in the two images below, using a gradient on the Z channel to control the defocus effect: one pass makes the rear of the camera slightly out of focus, and the other heavily blurs the floor behind the camera.
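Here is a sketch of that lens-coating trick from step 4, mixing a blue-cyan Fresnel-driven gradient over a glossy layer on top of glass; node names assume Blender's Cycles (2.8x), and the ramp colors are rough approximations.

```python
import bpy

mat = bpy.data.materials.new("CoatedLens")
mat.use_nodes = True
nodes, links = mat.node_tree.nodes, mat.node_tree.links
nodes.remove(nodes['Principled BSDF'])
output = nodes['Material Output']

glass = nodes.new('ShaderNodeBsdfGlass')
glossy = nodes.new('ShaderNodeBsdfGlossy')
glossy.inputs['Roughness'].default_value = 0.02
fresnel = nodes.new('ShaderNodeFresnel')
ramp = nodes.new('ShaderNodeValToRGB')
mix = nodes.new('ShaderNodeMixShader')

# Blue-cyan gradient driven by the Fresnel term
ramp.color_ramp.elements[0].color = (0.15, 0.1, 0.6, 1.0)   # violet-blue facing the camera
ramp.color_ramp.elements[1].color = (0.1, 0.7, 0.8, 1.0)    # cyan at grazing angles

links.new(fresnel.outputs['Fac'], ramp.inputs['Fac'])
links.new(ramp.outputs['Color'], glossy.inputs['Color'])    # tinted coating reflections
links.new(fresnel.outputs['Fac'], mix.inputs['Fac'])
links.new(glass.outputs['BSDF'], mix.inputs[1])             # lens glass underneath
links.new(glossy.outputs['BSDF'], mix.inputs[2])            # coating layered on top
links.new(mix.outputs['Shader'], output.inputs['Surface'])
```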

Distortion and hue: finally, add a little lens distortion and use a Color Balance node to give the image some stylized color.


Blender's EEVEE Engine, Subverting The CG Production Process


2019-04-25

Maya render farm

Blender is a great 3D application with a real-time render engine, EEVEE. Similar to Unity3D and Unreal, this engine can create, adjust, and display 3D objects and materials using PBR workflows.

How to use the EEVEE engine:

You can see the change reflected in the render engine drop-down at the top; the default is the EEVEE render engine. The whole viewport looks a bit grey because it is actually already in a rendered state, but the material on this box is not a PBR material at all. EEVEE uses a PBR workflow, the same as Unity and Unreal, with both Metallic and Specular workflows. As shown below, you can choose the nodes you need:

There are four textures required for the Metallic workflow:
  1. Base Color
  2. Metallic
  3. Roughness
  4. Normal
And four for the Specular workflow:
  1. Base Color
  2. Specular
  3. Roughness
  4. Normal
These four textures are the basic requirement; if you have other textures, you can import them as well. Below we import the model.

Hook up the corresponding textures and add an HDR environment map.

Well, the effect is not bad. But a few more features can make the picture look even better; find the post-processing options in the properties panel.

They are: 1. Ambient Occlusion, 2. Motion Blur, 3. Depth of Field, 4. Bloom. The highlight: the latest Blender's Cycles engine has gained the Principled node for the PBR workflow, used to render PBR materials offline. Convenient already, right? However, after reading a lot of official documentation and experimenting, I found that the Principled node works in both the Cycles engine and the EEVEE engine, and the results are close! What does this mean? The two images below show the same model, the same set of textures, the same material nodes, and the same lighting; in short, identical settings, only switching between the real-time and offline render engines.
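As an illustration, here is a small sketch (Blender 2.8x Python, placeholder texture paths) that feeds a Metallic-workflow texture set into a Principled BSDF and then flips the scene between the two engines; the material itself never changes.

```python
import bpy

mat = bpy.data.materials.new("PBRMetal")
mat.use_nodes = True
nodes, links = mat.node_tree.nodes, mat.node_tree.links
principled = nodes['Principled BSDF']

def load(path, non_color=False):
    """Add an Image Texture node and load a map (placeholder paths)."""
    node = nodes.new('ShaderNodeTexImage')
    node.image = bpy.data.images.load(path)
    if non_color:
        node.image.colorspace_settings.name = 'Non-Color'
    return node

links.new(load("//base_color.png").outputs['Color'], principled.inputs['Base Color'])
links.new(load("//metallic.png", True).outputs['Color'], principled.inputs['Metallic'])
links.new(load("//roughness.png", True).outputs['Color'], principled.inputs['Roughness'])

normal_map = nodes.new('ShaderNodeNormalMap')
links.new(load("//normal.png", True).outputs['Color'], normal_map.inputs['Color'])
links.new(normal_map.outputs['Normal'], principled.inputs['Normal'])

# The same node tree renders in both engines; only the engine is switched.
bpy.context.scene.render.engine = 'BLENDER_EEVEE'   # real-time preview
# ...later, for the offline render:
bpy.context.scene.render.engine = 'CYCLES'
```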

Cycles engine screenshot

EEVEE engine screenshot

Only the render engine has been switched to Cycles with the viewport in rendered mode; although there are differences, the results are basically similar.

The traditional rendering process, for example with V-Ray or Arnold for Maya, is as follows:
  1. Paint the textures. It may be a PBR workflow or a traditional one, in Substance Painter, Quixel, or other software, but there is no guarantee that the material preview will match the final render.
  2. Run render tests. Import the textures into Maya and test-render with V-Ray, Arnold, or another render engine, then sign off the final model and material assets.
  3. Light per shot. The lighting artists light each scene according to its environment. With Maya plus V-Ray or Arnold, render tests can take a lot of time before the final frames are rendered.

With Blender and EEVEE, these steps take less time and are friendlier to the creator (the Blender EEVEE/Cycles workflow):

  1. Paint the textures. Whether you make the textures in Substance Painter or Quixel, as long as it is a PBR material you only need to care about how the material looks at this stage, because the EEVEE and Cycles results will be basically the same.
  2. Run render tests. Import the PBR textures into Blender; whether in EEVEE or Cycles, the result is quite consistent with the texturing software. You can use the EEVEE real-time engine to quickly set up various lights and environment lighting and instantly see how your materials react to the light, then switch directly to Cycles without any adjustment and render offline. Cycles can also render in real time in the viewport, and very quickly, which saves a lot of guessing and waiting. Lighting is more convenient than before, the effect is visible immediately, and the final render is basically the same.
  3. Light per shot. In this step, lighting can be set up quickly with EEVEE and adjusted interactively in real time; you can see how the light interacts with the surfaces, close to the final result. Then switch to Cycles offline rendering to fine-tune the final frames: real-time and fast. I hope this gives you some inspiration and help. Author: DigitalCat
