Reboot the Pirates of the Caribbean Series Because Depp Is Too Expensive?

Rebooting the Pirates of the Caribbean series without Johnny Depp strikes many people as almost impossible. After all, Depp has always been the series' selling point; it is less that some actor plays Captain Jack than that Captain Jack is Johnny Depp. Yet Disney seems confident about it, and there is a good reason why: money.
According to Forbes, every Pirates of the Caribbean movie made without Depp would save about $90 million from its budget. To make that payout worthwhile, the writers had to center the films more and more on Captain Jack, which is one reason the quality of the series is not what it used to be. According to The Playlist, Pirates of the Caribbean 5: Dead Men Tell No Tales reached $800 million at the global box office but still fell short of Disney's expectations. Depp's $90 million pay, plus another $20 million, would cover the whole of Deadpool 2; the same $90 million could also fund three films the size of A Star Is Born, or two the size of Bohemian Rhapsody.
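A quick back-of-envelope check of those comparisons (all figures are the article's own claims, in millions of US dollars; the implied per-film budgets are just the article's arithmetic, not official numbers):

```python
# All figures below are the article's claims, in millions of USD.
depp_salary = 90                     # reported per-film saving without Depp

deadpool2 = depp_salary + 20         # "plus 20 million, can shoot a Deadpool 2"
star_is_born_each = depp_salary / 3  # "three films of A Star Is Born"
bohemian_each = depp_salary / 2      # "two films of Bohemian Rhapsody"

print(deadpool2)          # 110
print(star_is_born_each)  # 30.0
print(bohemian_each)      # 45.0
```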
Now think back to the first film, Pirates of the Caribbean: The Curse of the Black Pearl. At the time, Depp's scenes did not take up an outsized share of the movie; the storylines of Orlando Bloom's blacksmith and Keira Knightley's Elizabeth carried comparable weight. But because Captain Jack proved so popular, the later sequels focused more and more on the character.
Deadpool screenwriters Rhett Reese and Paul Wernick are preparing the rebooted script. Of this arrangement, Sean Bailey, head of Disney's production division, said: "We want fresh energy. I love pirate movies, but we want to regroup; that is the task I gave them."



You may also be interested in:

3D Tutorials: How to Make the Dogs in Togo (3)
The digital environment in the film

Everything above concerned the production of the CG sled dogs, but the film also contains many natural environments: some are fully synthetic digital landscapes, while others are plates shot in Alberta, Canada and enhanced in post. The two main locations were Mount McKinley, shot at Fortress Mountain, Alberta, and frozen Norton Bay, shot at Abraham Lake, Alberta.

1. Crossing the frozen lake: lots of roto

Abraham Lake was chosen as the location for the unstable frozen lake because its blue ice is exceptionally clean and clear and the area is vast. Unfortunately the weather on the day of the shoot did not cooperate, and two days later the lake was covered in snow; luckily, the production team had taken a large number of reference photos within those two days. Nearly 95% of the shots involve the sled dogs and their owner, Leonhard Seppala. To make production manageable, the whole sled team, the sled, the protective gear, the fur, the clothing, and Seppala's hair all had to be rotoscoped separately; and since the cold atmosphere had to read stronger on the dogs at the back of the team than on those at the front, the roto had to be built up in layers.

Normally it would be enough to project the roto along the character's or vehicle's trajectory and composite on top, but here the hair detail was too complex: the roto had to be projected onto cards, a sled rig built to drive the cards, and then layers of particle FX, snow, atmosphere, and so on added on top. The snow and ice thrown up by the sled and the dogs were simulated in Houdini and filled out with 2D elements.

2. Cracking ice

The design of the ice-breaking shots was very complicated. The plates were shot on a flat ice surface, so shots where the ice actually flips over could only be done in full CG. For the non-CG shots, the camera track was manipulated to make the ice surface look more active: within a range that would not break the parallax, 2D techniques were used either to constrain a fixed camera to a floating ice block, or to move the camera with the ice and add 2D float, so the shot feels less stable. Moreover, the plates were shot on flat ice under light that kept changing over time, so every version of the footage looked different; keeping the plates consistent while art-directing the ice as it fractured piece by piece was a long and complicated process.

3. Simulating huge ice blocks

In the sequence where the sled team crosses the frozen lake, huge blocks of ice gradually rise as the surface breaks. The team drove the shapes of these blocks procedurally as far as possible: with so many huge blocks, there was no way to rig each one individually, let alone sculpt, texture, and look-dev each piece, which would have locked down shape and size, and any change would have meant going back a step and rebuilding. So they created a new system called Cascade that lets the layout and environment departments build huge ice-block layouts on a shot-by-shot basis. The layout department created very basic proxy shapes in Maya: they drew an ice silhouette with Maya's curve tools, extruded it, placed it in the scene, added rig constraints, and set up rough animation in the floating ocean. The environment team built a very practical toolset that procedurally models the blocks from basic geometry, generating broken edge detail, internal bubbles, cracks, and faults in the ice.

With the new tools, the team's routine settled into shooting work during the day and render-farm rendering at night. If a block's size or shape needed to change, it went back to layout, the curve was redrawn, and the result was handed down the pipeline. The FX department also simulated the interaction between the huge ice blocks and the water, including details such as bubbles and splashes, and the final renders were done in Clarisse.

4. The other mountain environments

In the film, Togo's upbringing is told through Seppala's memories, part of which take place in the mountains. The production team shot a great deal of beautiful mountain scenery and modified it as a base. For example, when shooting at Fortress Mountain, Alberta, the director liked the environment but felt there were too many trees, so some were removed in post; and in the shots of the settlers' houses, the Alaskan mountains do not exist in the plates and had to be added later. When designing the backgrounds, rather than painting a large number of digital matte paintings, the team took a 3D approach combining high-resolution digital models, lidar scanning, and photogrammetry. The mountains seen close to camera went through full sculpting, texturing, and look-dev passes, and some trees, leaves, and rocks were rendered in Clarisse. All in all, there was indeed a lot of background work to get through.
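As a toy illustration of the curve-to-proxy step described above, here is a minimal sketch in plain Python (this is not Maya's API or the actual Cascade system; the jitter stands in for the real fracture tools, and all shapes are made up):

```python
import random

def ice_block_proxy(curve_points, height, seed=0):
    """Turn a hand-drawn 2D silhouette (a list of (x, y) points) into a
    rough extruded proxy block: jitter the outline to suggest fracture
    detail, then build a bottom ring at z=0 and a top ring at z=height."""
    rng = random.Random(seed)  # seeded so a redraw is reproducible
    jittered = [(x + rng.uniform(-0.1, 0.1), y + rng.uniform(-0.1, 0.1))
                for x, y in curve_points]
    bottom = [(x, y, 0.0) for x, y in jittered]
    top = [(x, y, height) for x, y in jittered]
    return bottom + top

# A five-point silhouette "drawn" by layout, extruded 1.2 units tall:
outline = [(0.0, 0.0), (2.0, 0.5), (3.0, 2.0), (1.5, 3.0), (-0.5, 1.5)]
proxy = ice_block_proxy(outline, height=1.2)
print(len(proxy))  # 10 vertices: 5 on the bottom ring, 5 on the top
```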
2020-08-31
3D Tutorials: How to Make the Dogs in Togo (2)
The dogFACS system for sled dog facial capture

To produce the final version of the sled dog animation, DNEG also explored canine motion capture and spent time developing a Facial Action Coding System (FACS) specifically for sled dog faces.

1. R&D: sled dog motion capture

In the early stages of production, DNEG brought well-trained sled dogs into Animatrix Studios for motion capture. We usually see actors wearing special suits for mocap; this time the same process was applied to sled dogs, and working out how to fit the equipment and place tracking markers on fur was a real technical challenge. Although this motion-capture data was not used in the final production, it was still a valuable learning experience for the team.

2. R&D: the dogFACS system

To understand the facial muscles of sled dogs in detail and settle the direction of the facial rig as early as possible, DNEG began dogFACS research very early; if humans can have a face-capture coding system, so can sled dogs. The researchers categorized all the sled dogs' expressions and described them in terms of dogFACS action units such as "mouth corner raise" and "nose wrinkle". A growl, for example, is the combination of three actions: "upper lip lift", "nose wrinkle", and "lip wrinkle". A facial control scheme built on these expressions lets animators activate or counteract individual action units, producing more detailed and believable animation. DNEG's rigging department also redeveloped its entire quadruped rigging system to support a higher standard of realistic performance. It includes a new front-leg module with a "limb pinning" function, so animators can reproduce the planted state of a foreleg when the weight is on that leg, as most animals do, and a rebuilt spine setup that improves both the realism and the usability of the animation. Overall, the rigging work took a great deal of effort: it required close collaboration with the animation department to develop and refine the quadruped rigging standard, always drawing inspiration and direction from research references.

3. Time to roar

In one scene, a teammate named Ilsa snarls at Togo, and Togo immediately subdues her. Note that the seemingly real performance here is actually a CG shot. On set, the handler used a "snarling device" that holds a sled dog's mouth open with a rubber band and a prosthesis. The dog playing Ilsa was reportedly very docile and cooperated with the whole process, but her eyes and constantly wagging tail revealed an extremely happy state; she did not look angry at all. The production team therefore went with a live-action body plus a CG head, and designed concept images of the sled dogs looking angry and fierce. The process was very challenging for the riggers, modelers, and animators, who had to capture the dogs' subtle facial animation precisely; Ilsa's head was built in even more detail than Togo's, since the shot mainly features Ilsa's performance. The most useful part of the dogFACS system was this close-up of Ilsa's snarl: FACS expressions and their blends are linear, while a sled dog's mouth and nose can move freely, so to get authentic micro-motion the team added many small details around her nose. Reportedly, when the animation test of the snarling scene was first shown side by side with the live plate, most viewers could not tell it was fake.
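The action-unit idea can be sketched in a few lines (a minimal illustration of FACS-style control; the unit names and weights are invented for illustration and are not DNEG's actual rig):

```python
# Minimal sketch of FACS-style facial control: an expression is a set of
# action-unit weights, and expressions combine by summing their weights,
# clamped to [0, 1]. Unit names and weights are illustrative only.
GROWL = {"upper_lip_lift": 1.0, "nose_wrinkle": 0.8, "lip_wrinkle": 0.6}

def blend(*expressions):
    """Sum action-unit weights across expressions and clamp to [0, 1]."""
    totals = {}
    for expr in expressions:
        for unit, weight in expr.items():
            totals[unit] = totals.get(unit, 0.0) + weight
    return {u: max(0.0, min(1.0, w)) for u, w in totals.items()}

# An animator can counteract a single unit with a negative weight:
relax_nose = {"nose_wrinkle": -0.5}
pose = blend(GROWL, relax_nose)
print(round(pose["nose_wrinkle"], 3))  # 0.3
print(pose["upper_lip_lift"])          # 1.0
```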
4. Sled dogs slipping on the ice

At one point in the film, the sled team braves the snow and cold to cross the unstable frozen lake under Togo's lead; on the way back the ice is tearing apart, and the dogs scramble forward while their feet slip. The DNEG team gathered online reference of sled dogs slipping and falling to pin down that unstable feeling of dogs standing on ice. These clips were brought into a rough animation cut assembled with the Time Editor, building up the overall shot like blocks, and also helped determine the dogs' performance in certain specific shots. In terms of animation quality and detail, this method amounts to a fairly advanced form of blocking.

5. Animating multiple sled dogs

Beyond the slipping shots, Togo's team has eleven sled dogs, and the behavior of dogs in a team is different from an individual performance. DNEG therefore built a rig feature that measures the distance between the dogs in the team: if a gap is too large, that part of the rig displays in red. The system is not perfectly accurate, but it gives the team an approximate range. The production team also did some layout development while processing the animation: they set up many different motion cycles at different speeds for the dogs and fed them into the rig, so layout results could be switched and loaded freely, which is much more convenient than loading the rigs first and animating afterwards. The layout department could then assemble the sled animation from the chosen dog cycles and speeds, keeping the work efficient while making the result as authentic as possible. By all accounts, this part of the workflow was also very convenient.
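A toy version of that spacing check might look like this (positions, units, and the threshold are invented for illustration; the rig-level red display is not shown):

```python
# Toy version of the team-spacing check: flag any adjacent pair of dogs
# whose gap exceeds a threshold, the way the rig displayed out-of-range
# sections in red. Positions and threshold are illustrative only.
def spacing_warnings(dog_positions, max_gap):
    warnings = []
    for i in range(len(dog_positions) - 1):
        (x1, y1), (x2, y2) = dog_positions[i], dog_positions[i + 1]
        gap = ((x2 - x1) ** 2 + (y2 - y1) ** 2) ** 0.5
        if gap > max_gap:
            warnings.append((i, i + 1, gap))
    return warnings

# Eleven dogs roughly in a line, with one straggler at the back:
team = [(i * 1.5, 0.0) for i in range(10)] + [(18.0, 0.0)]
print(spacing_warnings(team, max_gap=2.0))  # [(9, 10, 4.5)]
```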
2020-08-28
Virtual Production: Past, Present, and Future (3)
The evolution of virtual production

Virtual production entered the industry early, though with little coverage. In the early 1990s, Hollywood used it to make dynamic storyboards, the forerunner of today's previz. Titanic was released in the United States on December 19, 1997, followed by The Matrix in 1999, its two sequels in 2003, and Transformers in 2007 with its own sequels. These films applied and promoted virtual production well, especially motion capture and motion control; the 360-degree frozen-moment "bullet time" technique in The Matrix is a famous example of the latter.

Virtual production became widely known to audiences after the release of Avatar on December 16, 2009. Avatar and its Pandora brought audiences a beautiful spectacle and made the technique familiar to the public: by combining motion capture with real-time rendering, the director could see the composited scene directly in the monitor instead of a traditional green screen, which was refreshing. When an entire film is shot against green or blue screen, more than 2,000 shots need post-production, including editing, VFX, and sound, and the crew has to match the shots one by one against the script. Demand drove the technology forward, and after Avatar was completed, virtual production was packaged into a commercial module.

The outlook for virtual production

Although virtual production still has many technical problems to solve, the trend is established, and it will prosper along with the rising global box office; the live-action version of The Lion King, for instance, made full use of it. The pull of Hollywood VFX films has led more and more directors and filmmakers to recognize this field and use the new technology in their own work.

A material library for virtual production will be needed in the near future. Cultural and historical monuments, natural scenery, and the like appear on the movie screen constantly, yet every shoot has to travel to the site at great cost, and the actors' schedules are hard to coordinate. Virtual production can solve this and reduce production costs: to shoot the Colosseum of ancient Rome, for example, the actors perform on a virtual stage while the beast footage comes directly from the material library, composited in real time in the monitor. Movies are an art of creation, and flat photos as backgrounds cannot meet the needs of cinematography: three-dimensional models must be built with real textures, with lighting simulated as needed for early morning, dusk, or noon. These assets go into the model library, are loaded into the virtual production system as the script demands, and let the camera choose any required angle freely. Building such a material library is a huge digital project; it must not only hold all kinds of assets but also support clear retrieval and loading, and it requires continuous investment and construction. A framework could be built by dedicated groups, technical standards formulated, and conforming assets incorporated into the database; the library could also be commercialized for a fee, putting it on a healthy track to grow steadily.

In the future, more and more traditional studios will be converted into virtual studios. Within the scope of human control, cinematography can run 24 hours a day, never delayed by weather. One piece of common sense about virtual production: the sets built in a traditional studio are fixed, and swapping a physical set takes a day or several days, whereas swapping a virtual scene takes only minutes; in the performance area, only green-screen props are needed to simulate interaction with the virtual scenes, and modules can be assembled outside the venue in advance. The production of Disney's The Mandalorian brought virtual production into the LED era.

Actors' schedules are very tight, and any accident, such as illness or injury, disrupts the normal shooting plan. Virtual production builds the script's scenes in advance and can quickly call up different ones as plans change, coordinating the crew's process and making filming more humane and more efficient. Virtual production has a very bright prospect: its continuous improvement provides a proper platform for raising the quality of filmmaking, creativity can be presented in better ways, and audiences get a better viewing experience.
2020-06-09

Powerful Render Farm Service

    Business Consulting

    Global Agent Contact: Gordon Shaw

    Email: gordon@foxrenderfarm.com

    Marketing Contact: Rachel Chen

    Email: rachel@foxrenderfarm.com