Next year, SIGGRAPH Asia 2021 will return to the Tokyo International Forum in Tokyo, Japan, from December 14 to 17, 2021.
"2020 has been a crazy year, but we hope that by this time next year we will be able to travel freely and meet in Tokyo for SIGGRAPH Asia 2021. The theme of SIGGRAPH Asia 2021 is 'LIVE', because live interaction is essential to our context and the humanization of technology has become inevitable," said Shuzo Shiota, Conference Chair of SIGGRAPH Asia 2021.
Shuzo Shiota also revealed the meaning behind the digital waterfall, the key visual of SIGGRAPH Asia 2021: it symbolizes the continuous cycle of creation and destruction, and abundant energy and vitality, which is exactly how he envisions the 2021 event. For more details about SIGGRAPH Asia 2021, please visit https://sa2020.siggraph.org/en/.
Interview with Mike Seymour, an Outstanding Digital Humans Researcher
What happens when technology has a human face? How will digital humans affect our lives? These are the questions that Mike Seymour is exploring. Mike is a Digital Humans researcher who studies new forms of effective communication and education using photoreal, real-time computer-generated faces.
Mike Seymour @ SIGGRAPH Asia 2019
Mike was Chair of Real-Time Live! at SIGGRAPH Asia 2019, organizing the program that showcased cutting-edge real-time technologies from around the world, from mobile and console games to virtual and augmented reality. He is also the co-founder of the MOTUS Lab at The University of Sydney.
Mike Seymour at TEDxSydney 2019
As the lead researcher in the MOTUS Lab, Mike is exploring the use of interactive photoreal faces in new forms of Human Computer Interfaces (HCI) and looking at deploying realistic digital companions and embodied conversational agents. This work has special relevance for aged care and related medical applications, such as supporting stroke victims and people with memory issues.
He suggests that we need to find new ways to provide interaction for people, beyond typing or simply talking to our devices, and that face-to-face communication is central to the human experience. At the same time, he examines some of the many ethical implications these new forms of HCI present.
He is well known for his work as a writer, consultant and educator with the websites fxguide.com and fxphd.com which explore technologies in the film industry. These websites now have huge followings, as they provide an important link between the film and VFX community and the researchers and innovators who constantly push the limits of technology.
Some films and TV series Mike has worked on
In addition to fxguide.com and fxphd.com, Mike has worked as VFX Supervisor, Second Unit Director or Producer on a number of TV series and films, winning the AFI Award for Best Visual Effects for the film Hunt Angels in 2007 and earning a Primetime Emmy Award nomination for the TV mini-series Farscape: The Peacekeeper Wars in 2005.
Fox Renderfarm was honored to have an interview with Mike Seymour at SIGGRAPH Asia 2019. Here’s the interview between Mike Seymour and Fox Renderfarm.
Fox Renderfarm: Would you give a brief introduction to Human Computer Interfaces (HCI)?
Mike: So I research Human Computer Interfaces or HCI, which is the idea of how we deal with computers. And if you think about it, most computers are just getting input from a mouse or a keyboard, but what if we could talk to our computers, what if the computers could respond to us emotionally. So the work that I do with digital humans or virtual humans is putting a face on technology, we’re putting a face there so that we can interact with that. Because after all, we work really well with faces, we respond to faces, we travel great distances to see someone face to face. So we think it'd be really interesting if we could take that idea of having a face, and put it on a computer, and allow us to work with that in a much more natural and human way.
Fox Renderfarm: What are your biggest achievements of HCI so far?
Mike: So one of the interesting things that's happened just in the last couple of years has been this amazing nexus of technology and approaches. We've got this combination of things that is really blowing the doors off what's possible, because we can start to produce very photorealistic digital humans, in other words, people that really look like us. Now, this is super important because if we produce something that looks not very good, we actually have a negative reaction to it. It's not like audio, where you have sort of good quality, better quality, and then great quality. With people, we have either cartoons, or we need very, very high quality. If we have something that's not so good, people actually reject it out of hand. So we call it a non-linear response; in other words, as it gets better in quality, your reaction varies up and down a lot. So only recently have we been able to produce these incredibly realistic faces. And most importantly for HCI, those faces can run in real time, so they can smile at you in real time, talk to you in real time, nod and gesture. That's very different from a video or something you might see in a feature film, where they might have hours and hours to produce a clip. We need to produce these things in sometimes as little as about 9 to 12 milliseconds.
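That 9-to-12-millisecond figure is simply the per-frame time budget implied by a display's refresh rate: the face must be fully rendered before the next refresh. A minimal sketch of the arithmetic (the refresh rates below are common illustrative targets, not numbers from the interview):

```python
# Per-frame time budget for a real-time digital human:
# everything (animation, rendering, compositing) must finish
# within one refresh interval of the display.
def frame_budget_ms(refresh_hz):
    """Milliseconds available per frame at a given refresh rate."""
    return 1000.0 / refresh_hz

# Illustrative targets (assumed, not from the interview):
print(frame_budget_ms(90))   # VR headsets: ~11.1 ms per frame
print(frame_budget_ms(120))  # high-refresh displays: ~8.3 ms per frame
```

This is why a real-time face is so different from film VFX: a feature-film frame can take hours to render, while an interactive face has a budget four to five orders of magnitude smaller.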
MEET MIKE @ SIGGRAPH 2017
Fox Renderfarm: Have you met any challenges in the HCI development process?
Mike: One of the big challenges is that we've actually done a lot of really great work on faces and on being able to produce digital humans. That work's not done, but it's certainly advanced tremendously in the last three or four years. We're now having to grapple with how to solve some of the issues around voices. Say I'm in Sydney talking to a colleague in China, and of course he speaks a language that I don't. If we're on a conference call and somebody at the other end doesn't speak English, and I don't speak Chinese, I have the problem of crossing that language barrier. Now, if I've got an avatar, something that I'm puppeteering, then I would be able to speak in English and have a version of me speak in Mandarin, and we could understand each other across barriers. That's great. But what if I'm not puppeteering it? What if I actually want the computer to talk to me? I now need to make a synthetic voice. And the challenge right now is to see if we can do for voices what we've done for faces. It's the kind of thing you may not expect, but of course what we want is for the computer to speak in a really natural way, with the right cadence, the right kind of tone, the right kind of attitude. Getting that natural-sounding audio isn't harder than doing the vision, but we're actually a lot less tolerant of problems with audio. If you're watching a movie and the vision isn't quite right but you can hear everything, you'll be reasonably happy. But if the vision looks great and you can't hear what the actors are saying, you'll switch the channel or go do something else. So what we're trying to do now is get the audio to be impeccably good, so that it can go along with what we've been doing in vision.
MEET MIKE @ SIGGRAPH 2017
Fox Renderfarm: How do you think our life will be changed by HCI, with deep learning algorithms, GPU graphics cards rendering, and 5G?
Mike: The astounding thing is that now we actually have more compute power than we need for some of the functions we want the computer to do. We can afford to spend some of that compute power producing these amazingly interactive user interfaces. That's part one, and that's obviously been influenced enormously by GPUs and much faster graphics. On top of that, we have a new approach to how to use the graphics, which is AI, or deep learning. So now we have the second part of the jigsaw puzzle, which allows us to do incredibly clever things by letting the machine learn my face and then synthesize a plausible version of my face, again in real time, because of that GPU. And the third part of the jigsaw puzzle is that we're increasingly able to do that with 5G. Now, 5G is obviously very new, but what it offers us is not just bandwidth (we imagined it would be able to transfer more data, and that's part of it); one of the real secrets of 5G is low latency. So, in fact, we can have interactivity: things come to life when they are realistic and rendered quickly, because we've used actual faces to construct them, and then we have this very low latency, so we can interact. All of that is just going to change how we do communication and education, even in areas you might not imagine, such as health.
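A rough way to see why latency, rather than raw bandwidth, is the deciding factor: for a remotely rendered avatar to feel interactive, the network round trip plus the render time has to fit inside a perceptual response budget. A minimal sketch with assumed numbers (the 100 ms budget and the RTT values below are illustrative, not from the interview):

```python
# Back-of-the-envelope interactivity check for a cloud-rendered avatar:
# total response time = network round trip + per-frame render time.
def feels_interactive(network_rtt_ms, render_ms, budget_ms=100.0):
    """True if the response fits inside the perceptual budget (assumed ~100 ms)."""
    return network_rtt_ms + render_ms <= budget_ms

print(feels_interactive(network_rtt_ms=100, render_ms=12))  # 4G-class round trip: False
print(feels_interactive(network_rtt_ms=10, render_ms=12))   # 5G-class round trip: True
```

With the same bandwidth, the high-latency link fails the budget while the low-latency one passes, which is the point Mike makes about 5G enabling interaction rather than just faster downloads.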
Fox Renderfarm: Fox Renderfarm is going to provide online real-time rendering services, is that possible to cooperate with you on the HCI research?
Mike: We are really keen to work with people all over the world, and it's the mantra of our lab that we don't own the IP on the research we do; we give away all the data. We work with companies around the world so that we can give back to the community. Our interest is in seeing this move forward. And one of the great things about rendering on the cloud, and about having really good infrastructure on a global basis, is that with high-speed communications and with 5G, we are increasingly seeing this become something that ordinary people can use. At the moment, historically, I might only be using a render farm if I'm a really big company. But what we're seeing now is a move toward being able to do things in a democratized way, and I think we're going to see this vast explosion where we want quite a lot of power on our personal device, but also tap into a broader deep learning, AI kind of environment to provide this great interactivity. And as that happens, with low latency and the kind of infrastructure we're seeing, the ability to scale up is just going to produce sensational results.
Fox Renderfarm: As the Chair of Real-Time Live! in SIGGRAPH Asia 2019, what’s your biggest surprise?
Mike: There were a lot of submissions to Real-Time Live! this year. But Real-Time Live! is a little different from other programs because you need to actually mount a performance. It's a bit like volunteering for a stage show. If I'm coming here to give a talk, I bring my PowerPoint on my laptop. But if I'm coming here to do Real-Time Live!, like the Matt AI project and a number of the other projects that were shown, you actually have to bring a whole lot of computers, a whole lot of gear, and mount a live presentation. You have nine minutes to wow the audience, and of course it's very unforgiving because, in nine minutes, you can't afford to switch the computer off and start again. So we've been really impressed by the variety of the projects and the variety of applications they're addressing. We have teams working on making digital characters talk, which is one of my favorites, I love that one. But we've also got teams looking at how to use VR and real-time graphics for science research and for communication, as well as artistic pieces that simply put on a really amazing show in their own right.
Real-Time Live! in SIGGRAPH Asia 2019
Fox Renderfarm: You were doing VFX before, and you are a researcher and also Co-Founder for fxguide.com, what’s the biggest influence along your multi-dimensional career path? What do you do to keep yourself inspired and motivated?
Mike: I was in the visual effects industry for many years and got nominated for Emmys and AFIs, and that was all great; I enjoyed it and it was terrific work. A little while ago, having done quite a lot of research and teaching and increasingly doing consulting work for companies around the world, which we still do, I thought it would be really interesting to step up that research component and get more involved with hardcore research. So I still consult, I do work for major Hollywood studios, and I enjoy that work tremendously. But what I'm interested in is whether we can, in addition to that work in the entertainment industry, take that tech and apply it to other areas. For example, my research area at the moment is seeing if we can take some of this digital human technology and use it for stroke victims. People who have had a stroke and have trouble forming short-term memories are often still very good with long-term memories, but they literally find everything going on around them today a little unfamiliar and disconcerting. There is an extraordinarily high rate of stroke in the world; a lot of people have strokes, and quite a high percentage are actually under the age of 65 and want to continue to contribute and work, because they are of that younger age. Now, of course, we want everybody to benefit from this, but particularly for those people who are still trying to work in the world: if you have problems with short-term memory, all technology starts to become a challenge, and these days we expect someone to use a computer just as we expect them to use a phone. Well, if we could put a familiar face on the technology, a face from their past, a face that isn't necessarily a real person but is familiar and reassuring, then this new thing, this new technology, whatever it is, suddenly no longer seems quite so harsh, so unfamiliar, so disconcerting. And we think that's a really good way to help with rehabilitation.
So this is just one of the areas that we are looking at: taking this terrific tech from the entertainment industry, which I love to death, and seeing if we can help people who are less fortunate, who have been through really hard circumstances.
Fox Renderfarm: Who or what projects inspire you most in VFX and Interactive Technology respectively?
Mike: There's been really great work done in this technology around the world. Obviously, some of the big film companies like Weta Digital and ILM have been doing terrific work. In the research that I've been doing, we've managed to partner with companies around the world. So when we were doing a digital version of me, for example, we partnered with Epic Games, but also with Tencent, which is terrific, and with companies in Serbia and in England, so it's an international kind of collective. And one of the things that really inspires me is how openly these companies are working together and sharing what's going on, because there's a lot more to be gained by expanding what we can do than by people worrying about individual bits. The community doing this work has been really generous and really open with their work.
Fox Renderfarm: What’s your comment on Gemini Man?
Mike: Gemini Man is one of the most startling and groundbreaking pieces of production that I've certainly seen; I was really impressed by a number of things. Firstly, in the work done at Weta Digital, we really know the character very well at both ages: we know Will Smith as he is today, and Will Smith earlier in his career. We know from our own research that the more familiar you are with a face, the harsher you are on it. So a younger version of someone you didn't know might look great to your eyes, but their brothers or sisters could be very upset because it wouldn't feel right to them. So what we're trying to see is whether companies like Weta can produce very familiar faces in a way that we find acceptable, reassuring and entertaining, and I think they've really done that with Gemini Man. The second thing that really impressed me is that while it's an action film, there are a lot of slower emotional scenes where there is really no way to hide. The young Will Smith is on screen and the camera isn't flying around. Sure, there are bike chases, but there are other scenes where he is really acting, so that the audience can buy into that performance. I think it's terrific. I really applaud the work the team at Weta Digital have done; it's absolutely groundbreaking.
Image source: fxguide.com
Fox Renderfarm: Any other things you want to share with CG enthusiasts?
Mike: I think one of the things I've been really happy about is how the community has come together internationally. There are teams now that have become pockets of excellence. There are a couple of teams in China that are just spectacularly good, and obviously from the work we've seen in China (I've actually lectured in China and visited many times), there's a real depth of both technical expertise and creativity. So it's really great to see the infrastructure being built up, things like the render farm and so on, so that it can provide the technical support to match the creativity; I think that's been really good. Now there are two teams in China I can think of, a team in Europe, a team in New Zealand, a team in Serbia, one in London, and of course America. What's great is to see that this is a very balanced international effort, and I love the fact that here at SIGGRAPH Asia we've got all of these teams coming, presenting their work and sharing things. Because, as I said earlier, so much can be gained by people cooperating and working collaboratively together. From all my years in the film industry, it takes a thousand people to do the visual effects on a film, so you need this great collaboration of artists and this great infrastructure from companies supporting that. And then, of course, you need people willing to be open and share their ideas, as they're doing here at SIGGRAPH Asia. So, it's really great.
Interview with SIGGRAPH 2021 Conference Chair, Pol Jeremias-Vila, A Man of Many Responsibilities
Pol Jeremias-Vila is the Co-Chair of the SIGGRAPH Asia 2019 Computer Animation Festival (CAF). He has been a consistent force in helping to elevate the conference over many years.
Originally from Spain, Pol is the Lead Graphics Engineer at Pixar Animation Studios, where he develops algorithms to help artists make movies. He is credited on multiple movies including Toy Story 4, Incredibles 2, Coco and Finding Dory. In addition to his credits on films, he is also the co-founder of Shadertoy.com, a website that enables graphics enthusiasts to create and share rendering knowledge.
Since 2012, Pol has been actively involved with SIGGRAPH, holding multiple roles on past conference committees, including as Computer Animation Festival Director, Real-Time Live! Chair, and Virtual, Augmented and Mixed Reality Chair, as well as serving as a content contributor and juror.
He will also chair the SIGGRAPH 2021 conference in Los Angeles. Let’s look forward to another memorable CG ride.
Here’s the interview between Pol Jeremias-Vila and Fox Renderfarm, in which Pol shared his SIGGRAPH experience and his unforgettable memories from SIGGRAPH Asia 2019.
Fox Renderfarm: Why are you so passionate about Computer Animation Festival (CAF)?
Pol Jeremias-Vila: One of the things I like about the Computer Animation Festival is that something as technical as how we render a polygon can be used to tell meaningful stories: stories that can inform people about how the lights went off in Puerto Rico and how that affected the rest of the country, or a story like Mascot. It can help with the development of feature films through visual effects. This simple piece of technology can help tell all these different stories, and it can help create this medium. I think that's a very interesting field, and I personally really like it, of course.
Fox Renderfarm: What’s your goal for the CAF in SIGGRAPH Asia 2019?
Pol Jeremias-Vila: We wanted to create a show this year that had a lot of variety, so that you could see the different ways in which computer graphics are used. And for us, it was important to showcase scientific visualization. We believe this is a field that uses computer graphics in a very important way, and we wanted to support that. So we actively supported it, and you can see it in the show; similarly with visual effects. I think for us that was one of the goals: to make a show that really tells the story that you can use this technology in different ways. It doesn't need to be just short films; it can be advertising as well.
CAF in SIGGRAPH Asia 2019
Fox Renderfarm: Any unforgettable memory about this CAF?
Pol Jeremias-Vila: This year I’m co-chairing the SIGGRAPH Asia Computer Animation Festival along with Jinny Choo. So many great memories; I think seeing the number of submissions coming in was really satisfying. We did a lot of outreach work in areas where we hadn't done it as intensively before, and seeing all those numbers, and all the submissions from schools all over the world, was really rewarding.
Fox Renderfarm: There are interstitial shows in this CAF; what were the efforts behind them?
Pol Jeremias-Vila: For us, it was a way to break the rhythm of the show and make sure there was a little bit of surprise as well, something that wasn't really expected. We have interstitial shows after each piece, so you can expect them, and then we put in something that may be surprising for some people, hopefully funny, trying to make it all a more coherent experience. Even though they're disconnected stories, we try to create a flow that lasts an hour and 40 minutes, so it doesn't feel too disconnected. It needs to flow.
Fox Renderfarm: Could you share some rendering technology development trends with us?
Pol Jeremias-Vila: One of the things that we are seeing in the Computer Animation Festival is that there are more submissions using real-time engines to produce short films. This is always interesting, and we try to support every technology that is used for filmmaking. So this year we showcased some works that were using real-time engines, as well as works using techniques that are already established: offline renderers, path tracing or ray tracing. Again, we don't necessarily look at the technology per se; we look more at the artistic composition and the story. We do try to showcase the different ways in which you can use computer graphics to tell stories. For example, this year we have scientific visualization, advertising, visual effects breakdowns and short films. All of them use computer graphics, regardless of whether they were rendered in real time or offline. They all use this medium to tell stories, and that's what really matters; it's at the core of the festival.
Some works of CAF Electronic Theater
Fox Renderfarm: Any difficulties that you’ve met when you were working on CAF? And how did you solve it?
Pol Jeremias-Vila: There is an obvious physical difficulty when you are on site. You have to deal with screens, projectors, and light that might be coming in from other rooms. We try to create this perfect environment in which to enjoy films, and we try to be as respectful as we can with the works that are submitted to our conference. We care a lot about how each video is played back; I'm not sure if that counts as difficult, but it's definitely one of the parts we try to take really good care of. Another part that's always interesting is how to deal with the big number of submissions, and how to make sure they are all properly reviewed and that we have enough opinions on each of the pieces, so our jurors have enough information to make good decisions. What we see, though, is that a lot of content passes through our hands, and we would love to have more spaces in which to show it. So this year we also have an Animation Theater that runs all day long.
CAF in SIGGRAPH Asia 2019
Fox Renderfarm: SIGGRAPH is closely associated with the emerging technologies, how do you integrate them better?
Pol Jeremias-Vila: Personally, one of the things we did in North America in 2017 was to invest in a new way of seeing 360 and VR films. For us, it was about creating a new physical space that people could go into, in the same way that you go into the Electronic Theater to see the best of the 2D films: can we create a physical space where people can go and enjoy VR? I think it was a great success, and it's happening here at SIGGRAPH Asia as well. I'm sure that at some point we will see forms of stories that are grounded in the real world through AR or something like that. I don't know exactly what that will be, and I think that's why SIGGRAPH always needs to be aware of what's happening in those spaces: what are those stories going to look like, and how are we going to support those creators? That's where I see that, as SIGGRAPH members, we need to be thinking about those things, talking about them, and talking to the people who are creating those stories to make sure they have a place here where they can show their work.
Fox Renderfarm: SIGGRAPH 2021 will be in Los Angeles! As Conference Chair for SIGGRAPH 2021, anything you want to share with us?
Pol Jeremias-Vila: Really excited! We will start working right away on preparing SIGGRAPH 2021 and bringing the team together. It's going to be in Los Angeles. I'm not sure what technologies will be around in 2021; we might have some surprises. The team we are working with is spectacular, and I'm really confident that we're going to have a really awesome show. We're actually going to have our first on-site meeting in February of next year; a project like this takes time to prepare. So, really excited about it!
Fox Renderfarm: Any other things you want to share with CG enthusiasts?
Pol Jeremias-Vila: Yes! I have it (the brochure) right here!
Pol Jeremias-Vila: If you have an opportunity to see the Computer Animation Festival 2019, we hope you really enjoy it! Please request a showing in your local area; we'll be more than happy to try to arrange that. We hope you enjoy the show!
Special thanks to Rajeev Dwivedi from Live Pixel Technologies.
Interview with Ernest Petti, Revealing the Production Secrets of Frozen 2
Fox Renderfarm Interview
During the visual and information feast that was SIGGRAPH Asia 2019, Fox Renderfarm was delighted to have the chance to talk with Mr. Ernest Petti, Studio CG Supervisor at Walt Disney Animation Studios, who also contributed to the production of Frozen 2, the biggest-worldwide-opening animated film of all time.
Ernest Petti has been working with Walt Disney Animation Studios for over 19 years and is now the Studio CG Supervisor. In this role he acts as a bridge between Production and Technology for long-term strategic initiatives, orchestrating the initiatives and projects of the Workflow team and uniting them to fit within the studio’s vision for workflow. Prior to this, he served as Technical Supervisor on Ralph Breaks the Internet (2018) and the 2016 Oscar-winning feature Zootopia. Ernest joined Disney in 2000 as a software engineer in the Technology group and has served as a supervisor in Lighting, Look Development, and Tactics. His credits include the 2014 Oscar-winning feature Big Hero 6, as well as Wreck-It Ralph (2012), Tangled (2010), and Bolt (2008).
In the Featured Sessions of SIGGRAPH Asia 2019, Ernest delivered a presentation titled "Frozen 2" and the Past, Present, and Future of Tech at Disney Animation. He also took part in the panel discussion Proactive Large-Scale Pipeline Efficiency Management, alongside panelists from large-scale animation and VFX studios, sharing insights into the challenge of balancing the creation of amazing visuals with tight production time frames.
During our interview, Ernest expressed his excitement about this year’s SIGGRAPH and how interested he was to connect with other people, companies and technologies. Among all the cutting-edge technologies on show, machine learning in particular sparked his curiosity about its applications in his work. The development of rendering technology also made him wonder how rendering could become more interactive and allow more direct manipulation, especially with the GPU advances that come along with it.
More insights into the production of Frozen 2 are definitely something Fox Renderfarm would not miss, and something we can’t wait to share with you. Let’s check out the interview video and article, and see how Walt Disney Animation Studios combines timeless storytelling with innovative technology.
(F=Fox Renderfarm, EP=Ernest Petti)
F: Could you tell us your main responsibilities in Frozen 2? How did you cooperate with the VFX departments along the production?
EP: My role is Studio CG Supervisor. It’s a studio-level position that oversees long-term technical development and artistic workflows over the course of shows.
I work closely with the Technology group and with the productions, and try to build the bridge between them over time. I was the Technical Supervisor on Ralph Breaks the Internet. On that show, we took the first steps into nested proceduralism for some of the buildings on the internet; that paved the way and was then built on further for Frozen 2. So there's that sort of continuity across shows. In my current role, workflow is a big thing we are focused on, in particular concurrent collaboration and making it as smooth as possible between different departments. That means talking to the groups on Frozen 2, like the Visual Effects Supervisor, Steve Goldberg, and the Technical Supervisor, Mark Hammel, working with them and understanding what they're doing on their show, and making sure it's in line with the shows before and moving into the future, so that we can really build toward what will come next.
Basically, when a new set of leadership starts on a show, we try to connect with them, start understanding what that show’s specific needs are, and figure out which things we want to advance in the studio that make sense to tackle on that show as well, so that we can have some continuity.
F: How did you cooperate with the Production Director and the Production Designer to actualize the creativity through the technologies?
EP: When the story starts forming and the show leadership is working with the Production Designer and with the Director to understand the story and the look of the film, achieving that comes first. So we really want to partner closely on what technology might be needed to make that happen; it’s very important that we’re able to achieve it. Then, in partnership with that, the questions are: can this build off things that were already in the plan; should it accelerate things that we may have been thinking about but that weren't necessarily going to line up with that timing; and are there things that aren't tied to show needs, but that we want to advance anyway, where this would be the right timing to do it? For instance, the work on USD: of course, we're hearing about it from lots of studios, and we’re trying to make significant advances with USD in our pipeline for Raya and the Last Dragon, which is our movie coming out next November. That's not a show need; nothing in the artistic vision of that movie said we need USD, but it will help advance a lot of future tools and workflows, and we need to find the right place to start feathering it in.
F: Which part do you like the most in the production of Frozen 2? Why?
EP: It’s a movie that has a lot of scope and scale to it. I like that it takes you in more surprising directions. It takes you outside of what you saw in the first one, so it's not staying in the same zone; it's leaving Arendelle, going out into the wild, into a different environment and world, and it has unique Spirits and settings that we haven't necessarily done before.
F: How did you achieve the scale of autumnal trees and foliage through technical changes?
EP: In a lot of our films, we have been trying to strike a balance between artistic stylization and procedural simulation, to make sure we have the complexity and richness that we want while keeping the stylization that we need. Over time, we've built the tools to give that stylization for, say, a single tree, and then you place trees well to get a cluster that looks nice. But now you have a whole forest that needs a certain level of stylization, with a lot of depth to the ground cover, the pebbles and everything else around it as well. We needed to improve our toolset so that we would not only have that balance of stylization and complexity at the single-tree level, but, when we make a whole forest of trees, be able to stylize the appearance of the forest as well. So we had nested proceduralism, which lets us build up: here's a pebble, here's a cluster of pebbles, now here's a ground cover that includes some leaves and a cluster of pebbles; then it includes a tree, then there's a grove of trees, and then the grove of trees expands to the forest. You can stylize, but also build up and populate, at each of those levels. We also created a tool called Droplet, essentially a procedural painting tool that lets you paint down the trees in a more painterly fashion, so that you have more direct control over the style and flow of the forest as a whole and all the trees throughout it. So it definitely led to expanding on our Bonsai tree tools and our Aurora instancer, as well as developing new tools like Droplet.
Bonsai Instancing Zootopia Test
F: What’s the challenging part of the production? How to solve it?
EP: I think there were a couple of challenging areas. On the environment side, we had a very lush, rich forest environment that includes a very colorful, diverse autumn forest; and because it's fall, the leaves on the ground also had to be very rich. On top of that, when you start adding in the elemental spirits and you have something like Gale, the Wind Spirit, you're treating that environment as a character, and having to make sure there's a lot of coordination between how the environment is built, how the character of the wind plays through it and interacts with the rest of the environment, and with any characters in the scenes, like Anna or Elsa or any of the other characters. So this film presented a lot of challenges with collaboration. A lot of things, like the Water Spirit and Gale, didn't fit neatly into one department, one group of people, or a linear pipeline. So the challenge is finding ways to iterate smoothly when you have to have a very tight connection between people across departments.
I think we always start with research to ground the challenge we're looking at and find the closest connection to the physical world. When you have the Water Spirit taking the form of a horse, you study water, you study horses, and then you bring all the people across departments together: everyone from art, trying to understand the stylization and how far you want to go in wateriness versus solidity, to the effects department doing the spray and the foam of the mane and the tail, to the animators. So you really have everyone working together to look at the challenges, and you form more of a team around the problems you're trying to solve.
F: What did you do to make these characters realistic?
EP: There is the realism you want: the realism of horse movement or the realism of water movement. Where do those conflict, and how do you find the right balance between them? The choices you may make for a beautiful horse animation may not work when the mane and tail are refractive water that you can see through. Say the mane goes in front of the face: it's not actually completely covering the face, you're kind of seeing through it. So that's again where a decision made in animation may need to be iterated on when you see a render, because of the effect of the water on the character. It's definitely a challenge to find just the right balance for that character.
F: In this process, what kinds of tests did you do to give the designers the idea?
EP: I think with all of the tests, and with the Nokk as well, we did start with some hand-drawn tests. Even, once again, the example of legs: how much the legs should splash away into water, and how much they could stay fairly solidified, was something we tested with hand-drawn tests first. Then you take that into animation, and you run various types of character tests, like a still test of the Nokk with just some head animation. That informed us that we needed to reduce the water distortion on the face, because with subtle movements that distortion was making the face harder to read, so we kept it just on the body. Then you would do a test on how much spray and spindrift should be in there, and you do a running test. So you really work closely as a group, run these tests to explore different aspects, and keep the Directors in the loop throughout.
F: Could you explain more about the unified rendering?
EP: When we talk about unified rendering and looking forward: at a lot of places at Disney Animation, we have a GL viewport that we use for viewing things in our various departments and getting previews as we work, and then you do a final-frame render that runs on a render farm and takes a significantly longer chunk of time. Sometimes those technical requirements require different paths and different pipelines. We would love to find paths where what you see is almost what you get, so there's more of a continuum from the preview that you see to the final frame. It becomes more of a transition from speed to quality over time, and less of a dichotomy.
F: Any suggestions for the audience when watching Frozen 2?
EP: The movie takes place three years after the original story. The movies made six years after the original one came out, so there's been a lot of tech technology advancements. And I hope people can see it in all the beautiful images that are on the screen. At the same time, we want to bring you back to the same characters that you love from the first film. And you'll see some nice additions, like of advancement. Olaf now has a permafrost covering so that he won't melt as it's getting into autumn. He's learned to read now, and all the characters have sort of progressed. Because there has been a time period that's passed in the film as well.
F: You have made so many great animation features, which one do you feel most proud of? Why?
EP: I love different aspects of all of them. I have a special connection to Zootopia to a certain degree, because XGen was one of the first tools I was developing when I first started at the company way back. And it was a big fur-based show, and there was a lot in there that connected with me. Returning to Wreck-It Ralph with Ralph Breaks the Internet, it's always fun to revisit a place that you've been to before. And even going all the way back to Bolt, which had a certain painterly style to it: that was exploring a looser look that was very different at the time.
Thanks again to Mr. Ernest Petti for accepting our interview. Keep up with Fox Renderfarm and follow us on social media platforms; more interesting and insightful content is waiting for you!
Special thanks to Dan Sarto from Animation World Network, Ian Failes from VFXVoice and Chang Wei-Chung from InCG Media.
SIGGRAPH Asia 2021 Will Be Held in Tokyo
Next year’s edition, SIGGRAPH Asia 2021, will once again be held at the iconic Tokyo International Forum in Tokyo, Japan, from 14 – 17 December 2021.
“It has been a crazy 2020, but I am hopeful that by this time next year, we should be able to freely travel and be able to meet each other physically at SIGGRAPH Asia 2021 in Tokyo, Japan. The theme for SIGGRAPH Asia 2021 is ‘LIVE’ as live interactions are integral in our context, and the humanization of technologies has become unavoidable,” said Shuzo John Shiota, SIGGRAPH Asia 2021 Conference Chair.
Shuzo also revealed the meaning behind the SIGGRAPH Asia 2021 key visual – a digital waterfall – which symbolizes a continuous cycle of creation and destruction, abundance of energy and vibrancy. This is exactly how he envisioned the 2021 event to be.
Visit http://sa2021.siggraph.org for more event details.
SIGGRAPH Asia 2020 Virtual Concludes on a High Note
Attendees were impressed by the high quality of inspiring sessions and sharing by industry heavyweights from Double Negative, Netflix, Pixar Animation Studios, Walt Disney Animation Studios, DreamWorks Animation and more.
17 December 2020 – The very first virtual edition of SIGGRAPH Asia concluded its week-long gathering of the global computer graphics community, with over 1,500 attendees tuning in to the on-demand and live sessions from over 38 countries and regions. Although the live event is over, registration remains open, and all content is available on-demand to participants till March 2021.
“10 days flew by in a flash and I can’t believe that we have now come to the end of SIGGRAPH Asia 2020 Virtual. It has been a truly amazing and inspiring ten days with a diverse set of pre-recorded sessions, as well as live and premiere sessions. I have learnt a lot from the insightful and informative sessions delivered by our amazing speakers from all over the world, and I am sure that the virtual event this year was yet another inspiring conference in the history of SIGGRAPH Asia,” shared Conference Chair, Jinny HyeJin Choo.
“For me, the best part of SIGGRAPH Asia is being able to connect and bring friends (both old and new) in the industry together, in one space. While we miss in-person, physical events, I am glad that my team and I managed to successfully transition and bring everyone together in a virtual setting despite the many challenges that came along due to the pandemic,” she added.
The virtual event featured over 400 talks and presentations that dove deep into the latest technologies in computer graphics, visual effects, virtual reality, augmented reality, and interactive techniques. Highlights from the virtual conference included keynote sessions by Academy Award-winning animator Glen Keane; VFX Supervisor & Creative Director from Double Negative, Paul Franklin; invited speakers from Epic Games, Facebook Reality Labs, Google, Soul Machines, Face the FACS, Motus Lab, Digital Domain, Pinscreen, USC Institute for Creative Technologies, Naughty Dog, Reel FX, Ubisoft, and Pixar Animation Studios; 113 Technical Papers; 22 top films from the Computer Animation Festival’s Electronic Theater; a Diversity and Inclusion Summit; 19 specially curated Emerging Technologies; and more. For programs like XR (Extended Reality), which traditionally had live demonstrations within the Experience Hall at the physical event, some presenters managed to go beyond conventional video conferencing and find creative ways to engage the audience and conduct live demonstrations of their projects in the Zoom rooms.
The event ended on a high note with the first-ever virtual party held on Gather Town – with rooms customized and created by our very own Student Volunteers. Over a hundred delegates joined us at this party on the last day of our event.
To access SIGGRAPH Asia 2020’s on-demand content, register here: https://bit.ly/sa20reg
‘Shooom’s Odyssey,’ ‘Migrants,’ and ‘Box Assassin’ Take SIGGRAPH Asia 2020 Computer Animation Festival Honors
15 December 2020 – SIGGRAPH Asia 2020 Virtual has officially announced the three award-winning films from its Computer Animation Festival. Selected from a pool of 577 CG animated projects, 51 pre-selection jurors whittled the group down to 139; from there, an international jury of top professionals reviewed each project, choosing three award winners for the CAF Winners Screening, 22 top works for the Electronic Theater screening as well as another 22 for the Animation Theater program.
Take a moment to check out the Computer Animation Festival Trailer.
“Our CAF winners, Electronic Theater and Animation Theater screenings highlight the work of an incredibly diverse and talented group of artists from around the world,” said SIGGRAPH Asia 2020 Computer Animation Festival Chair, Dan Sarto. “Our jury helped assemble an extraordinary collection of animated projects, including 47 animated shorts, VFX reels, commercials, scientific visualizations and other works, presented online during the festival. Congratulations to all the winners and to everyone who took time to share their work with us.”
The SIGGRAPH Asia 2020 award winners are:
BEST IN SHOW: Shooom’s Odyssey
- Director: Julien Bisaro
- Producer: Claire Paoletti
- Production Company: Picolo Pictures
- Country: France
The 26-minute film, which won the Cristal for Best TV Production at Annecy 2020, tells the story of Shooom, a baby owl, who hatches just as a storm turns the bayou surrounding her tree upside down. No sooner has she fallen from her nest than the little fledgling totters off into the mangrove, pushing a second egg from the brood along with her. Come hell or high water, she’s determined to find a mother... even if that mom turns out to be an alligator or a raccoon!
BEST STUDENT PROJECT: Migrants
- Directors: Hugo Caby, Antoine Dupriez, Aubin Kubiak, Lucas Lermytte, and Zoé Devise
- Producer: Carlos De Carvalho
- School: Pôle 3D
- Country: France
- Distributor: Je Regarde
Two polar bears are driven into exile due to global warming. They will encounter brown bears along their journey, with whom they will try to cohabitate.
JURY SPECIAL: Box Assassin
- Producer / Director: Jeremy Schaefer
- School: Ringling College of Art & Design
- Country: U.S.A.
Craig, a pizza delivery boy, recalls the night he was making a delivery to a cold-blooded gang boss and his goons. Unknowingly, Craig delivers not pizza, but the legendary Box Assassin. The assassin shows up and wreaks havoc among the gangsters. Craig attempts to escape but is taken hostage by the gang boss. At the mercy of a cold-blooded gangster, Craig fears this might be the end.
This year’s 22 films / projects selected for the Electronic Theater are as follows:
Déjeuner sur l'herbe
Jocelyn Charles, Jules Bourgès, Nathan Harbonn-Viaud, and Pierre Rougemont
Benjamin Berrebi, Jakub Bednarz, Diego Cristófano, Mohammad Babakoohi, Théo Lenoble, Karlo Pavicic-Ravlic, and Marthinus Van Rooyen
Benoît Filippin, Paul Gautier, Laureline Massias, and Mathieu Milaret
Shaofu Zhang and Andrew Chesworth
Daniela Dwek, Maya Mendonca, and Chrisy Baek
HBO Asia / Dream Raider / TV Show Opening Sequence
In Event of Moon Disaster
Francesca Panetta and Halsey Burgund
Arthur Allender, Mathieu Antoine, Léna Belmonte, Cyrielle Guillermin, Victor Kirsch, and Elliot Thomasson
Hinaleimoana Wong-Kalu, Dean Hamer, and Joe Wilson
Lenny Makes Some-Thing - “Turn Me On (feat.ARCX)” MV
Love and Fifty Megatons (VFX)
Christophe Sarraco, Pauline Lavelle, Hadrien Augier, Théo Blanchard, and Hugo Fredoueil
Soap versus COVID-19
Sous la glace
Milan Baulard, Ismaïl Berrahma, Flore Dupont, Laurie Estampes, Quentin Nory, and Hugo Potin
The Journey of Mankind is the Sea of Stars
The Computer Animation Festival’s 11 dedicated jury members, who spent two weeks reviewing 139 films and projects before choosing the final three award winners, 22 Electronic Theater and 22 Animation Theater selections, are as follows:
- BoKyung Park
Seoul Business Agency
Acting for Animators
Meditation with a Pencil
Walt Disney Animation Studios
The Monk Studios
- Dancing Atoms
Click here to learn more about SIGGRAPH Asia 2020’s Computer Animation Festival.
Pack Your Virtual Bags & Get Ready for SIGGRAPH Asia 2020
Fox Renderfarm News
The 13th ACM SIGGRAPH Conference and Exhibition on Computer Graphics and Interactive Techniques in Asia will take place in a fully virtual format from 4 – 13 December. The annual event, which rotates around the Asian region in normal circumstances, attracts the most respected technical and creative people from all over the world who are excited by research, science, art, animation, gaming, interactivity, education and emerging technologies.
As the long-term partner of SIGGRAPH Asia, Fox Renderfarm hopes you enjoy Asia’s largest Computer Graphics & Interactive Techniques virtual event!
The theme for this year’s SIGGRAPH Asia, ‘Driving Diversity’, will take on a new meaning as SIGGRAPH gives our diverse group of worldwide technical and artistic contributors the opportunity to connect with and inspire new communities.
“My team and I are committed to delivering a strong SIGGRAPH Asia 2020 that celebrates this year’s innovation, advances and achievements in computer graphics, interactive techniques and beyond,” said Jinny HyeJin Choo, the SIGGRAPH Asia 2020 Conference Chair.
With the event going virtual this year, more from our global community will be able to come together and participate in new and innovative ways and drive forward the forefront of our field.
So don’t miss out on the very first virtual SIGGRAPH Asia conference on CG, VFX, VR, AR & interactive techniques. Experience 200+ on-demand sessions and 80+ live and premiere sessions from 4 – 13 Dec.
Here's The Schedule At-A-Glance:
From 4 December: Over 200 pre-recorded sessions will be available on-demand, allowing attendees to learn and review presentations prior to the scheduled live Q&A sessions.
10 – 13 December: Over 80 live & premiere sessions, such as the Keynote, Featured Sessions, and Q&A Sessions with the respective Program Contributors, will be scheduled during these 4 days.
There’s still time to register for SIGGRAPH Asia 2020 by clicking here.
See you there!
SIGGRAPH Asia 2020 Goes Virtual!
The four-day SIGGRAPH Asia 2020 live event, originally scheduled to take place from 17 – 20 November in Daegu, South Korea, will now be fully virtual; more details will be revealed in due course.
2 July 2020 – Ongoing worldwide travel restrictions due to the COVID-19 pandemic have made it impossible to host an in-person SIGGRAPH Asia 2020 in Daegu, South Korea in November. Therefore, the decision was made to move SIGGRAPH Asia 2020 to a virtual event.
This was not an easy decision, but a common consensus was reached that it was necessary under the current circumstances; the safety of all participants must come before everything else. Jinny HyeJin Choo, SIGGRAPH Asia 2020 Conference Chair, expressed her appreciation for the stakeholders’ understanding as the team puts together the very first SIGGRAPH Asia virtual event for the community.
The theme for this year’s SIGGRAPH Asia, ‘Driving Diversity’, will take on a new meaning as technical and creative contributors from around the world get the opportunity to inspire one another at this virtual event. Conference Chair Jinny Choo and her team are committed to delivering a strong SIGGRAPH Asia 2020 that showcases this year’s CG-related innovations and achievements. As we gather in the new global community that this virtual format affords, we are confident that this innovative approach will take us to new dimensions like never before.
SIGGRAPH Asia 2020 Computer Animation Festival Submission Deadline Extended To 15 August 2020
Entries being accepted in multiple categories, including shorts, visualizations, VFX sequences, game cinematics, and real-time pieces
22 July 2020 – To support the recently announced move online for SIGGRAPH Asia 2020, ACM SIGGRAPH’s annual conference and exhibition on computer graphics and interactive techniques in Asia, the SIGGRAPH Asia 2020 Computer Animation Festival (SACAF 2020) has extended the deadline for submissions to 15 August 2020. Conference dates and more program details will be announced as they become available.
The festival is looking for new, innovative animated content in a host of categories. If your project used a computer for some part of the production, SACAF 2020 wants to see it! Creators from around the world are invited to submit their projects and help showcase the world’s best, most inspiring, entertaining, and cutting-edge computer animation.
An international jury of computer animation experts will judge the best works entered in each category; from that pool of top picks, they will hand out three prestigious 2020 awards: Best Student Project, Jury Special, and Best in Show. In addition, judges will select between 30 and 40 (or more) “best of the best” works from across the submission pool, to be curated into the two “stars” of the festival: the always spectacular Electronic Theater and Animation Theater programs.
Each year, the Electronic Theater and Animation Theater never fail to dazzle and delight conference audiences eager to sit back and enjoy an entertaining and thought-provoking sample of the world’s best CG animation. This year, online audiences will have instant access to both shows, as well as all the festival’s special programming!
There is no entry fee, and you can enter as many projects as you like. Join the festivities and help the festival honor the best in CG animation -- submit your latest projects in any of the following categories:
- Computer Animated Shorts: Includes character animation, narrative works, experimental works, opening sequences, game cinematics, selections and/or montages of animated television series, and new-media formats.
- Animated Feature Films: Selections and/or montages of computer animation created for animated feature films.
- Music Videos: Commissioned and/or independent works that use any combination of computer animation, digital effects, and live action to illustrate, enhance, and/or complement a musical creation.
- TV and Web Commercials: Advertisements created entirely or partially with computer animation and/or digital effects. This category also includes promotional spots, broadcast bumpers and graphics, and public service announcements.
- Visualizations and Simulations: Computer animations created to explain, analyze, or visualize information for applications including scientific research, architecture, engineering, systems simulations, education, and documentary projects.
- Visual Effects for Films and TV Programs: Selections and/or montages of visual effects created for live-action films and/or for television programs.
- Real-Time Animation: Game, web, and mobile animations that are rendered in the same amount of time that it takes to play them back. Real-time technology demos are also encouraged! Real-time technology demos should be submitted to Real-Time Live!
- Others: Computer Animations that do not fit in any of the above categories.
The submission deadline is 15 August 2020. Visit the SACAF 2020 webpage for more information, including how to submit, submission rules and requirements, and an FAQ.
Or, you can submit your work directly by logging into the SIGGRAPH Asia Submission System.
Please share the news with classmates, team members, and colleagues - SACAF 2020 wants to see their work too!
“The unique circumstances we face in staying safe while battling the deadly COVID-19 pandemic has necessitated moving the conference, including SACAF 2020, online,” says SACAF 2020 Chair and AWN publisher and editor-in-chief, Dan Sarto. “But we’re planning to take advantage of this new digital format and share our exciting programs of the best CG animated works with people around the world. We’ve extended the festival submission deadline to 15 August, so now there’s really no excuse for you not to enter!”
"These are tough times for the artist community as we come to terms with the economic impact of the deadly coronavirus COVID-19,” adds SACAF 2020 Co-Chair and Destiny Logic Director & Producer, Siva Kumar Kasetty. “Let’s stay strong and support each other in whatever possible ways we can. Let’s fuel our resilience and passion to tide us through these tough times!”
As the long-term partner of SIGGRAPH Asia, Fox Renderfarm, your TPN-Accredited cloud rendering service provider, is committed to providing more opportunities for CG artists to show their talents, so we invite creators from around the world to submit their projects and showcase the world’s most innovative and exciting computer animation.
SIGGRAPH Asia 2020 Goes Virtual
The four-day live event originally scheduled to take place from 17 – 20 November in Daegu, South Korea, will now be fully virtual; more details will be revealed in due course
2 July 2020 – Ongoing worldwide travel restrictions due to the COVID-19 pandemic make it impossible to host an in-person SIGGRAPH Asia 2020 in Daegu, South Korea in November. Therefore, we have decided to move to a virtual conference for SIGGRAPH Asia 2020.
“This has not been an easy decision, but we came to a common consensus that it was necessary in this current climate. The safety and well-being of all our participants remains our top priority. We appreciate your understanding and patience as we adjust our plans and refocus our efforts to put together the very first SIGGRAPH Asia virtual event for the community,” shared Jinny HyeJin Choo, SIGGRAPH Asia 2020 Conference Chair.
The theme for this year’s SIGGRAPH Asia, ‘Driving Diversity’, will take on a new meaning as we give our diverse group of worldwide technical and artistic contributors the opportunity to connect with and inspire new communities. The conference chair, Jinny Choo, and her team of program chairs are committed to delivering a strong SIGGRAPH Asia 2020 that celebrates this year’s innovation, advances and achievements in computer graphics, interactive techniques and beyond. We are optimistic that the virtual format will allow our global community to come together and participate in new and innovative ways and drive forward the forefront of our field.
As the details of this virtual experience are resolved, the exact dates and format will be revealed in the coming weeks and will be announced through http://sa2020.siggraph.org.
Call for Submissions: SIGGRAPH Asia 2020 Computer Animation Festival
Fox Renderfarm News
The SIGGRAPH Asia 2020 Computer Animation Festival is open for submissions! There's only a month left!
As the long-term partner of SIGGRAPH Asia, Fox Renderfarm is committed to providing more opportunities for CG artists to show their talents, so now we invite creators from around the world to submit their projects and showcase the world’s most innovative and exciting computer animation.
This coming 17 - 20 November, the SIGGRAPH Asia 2020 Computer Animation Festival will convene in Daegu, South Korea, to celebrate the vibrant, diverse, and inspiring world of computer animation. From short films to scientific visualizations to AI-enhanced deepfakes, this year’s festival promises its most expansive and compelling program ever.
An international jury of top computer animation experts will judge the best works entered in each category; from that pool of top picks, they’ll hand out three prestigious 2020 awards: Best Student Project, Jury Special, and Best in Show. In addition, the judges will select between 30 and 40 (or more) “best of the best” works from across the submission pool, to be curated into the two “stars” of the festival: the always spectacular Electronic Theater and Animation Theater screenings.
Each year, both the Electronic Theater and Animation Theater never fail to dazzle and delight conference audiences eager to sit back and enjoy an entertaining and thought-provoking sample of the world’s best CG animation.
If it’s animated, and a computer was used at some part of its production, we want to see it! Join the festivities and honor the best in CG animation -- submit your latest projects in any of the following categories:
Computer Animated Shorts: Includes character animation, narrative works, experimental works, opening sequences, game cinematics, selections and/or montages of animated television series, and new-media formats.
Animated Feature Films: Selections and/or montages of computer animation created for animated feature films.
Music Videos: Commissioned and/or independent works that use any combination of computer animation, digital effects, and live-action to illustrate, enhance, and/or complement a musical creation.
TV and Web Commercials: Advertisements created entirely or partially with computer animation and/or digital effects. This category also includes promotional spots, broadcast bumpers and graphics, and public service announcements.
Visualizations and Simulations: Computer animations created to explain, analyze, or visualize information for applications including scientific research, architecture, engineering, systems simulations, education, and documentary projects.
Visual Effects for Films and TV Programs: Selections and/or montages of visual effects created for live-action films and/or for television programs.
Real-Time Animation: Game, web, and mobile animations that are rendered in the same amount of time that it takes to play them back. Real-time technology demos are also encouraged! Real-time technology demos should be submitted to Real-Time Live!
Others: Computer Animations that do not fit in any of the above categories.
Make sure you remember that all submissions must be received by 31 July 2020, 23:59 UTC/GMT.
Please share the news with classmates, team members, and colleagues - SACAF 2020 wants to see their work too!
Click here for more information, including how to submit, submission rules and requirements, and an FAQ.
Aiming to promote the CG industry, Fox Renderfarm provides fast and secure render farm service for CG Artists all over the world to empower them to focus more on creation.
Stay safe, keep creating, Fox Renderfarm hopes to see you at SIGGRAPH Asia 2020 Computer Animation Festival this coming November!