Journey to Maya

Texan Boot

Starting with this boot as a reference, we imported the picture into Maya so we could model closely against the image.
We downloaded a 360° background in order to place our boot in an environment.

Mask

My reference’s Mask
We started from this 3D head model. Using Quad Draw, we managed to build the base mesh for our mask.
Once the face was created, we modelled it to be as similar to our reference as possible.
At this point my mask was almost ready, and I sent it to Mudbox in order to modify the surface, adding details, materials and colours…
But first… let’s check the UV
Still a “work in progress…”
3D render video in Maya

Denoise, Track and Stabilize

Original plate, focusing on the two signs on either side of the street
Here I denoised the clip so that the elements would track more reliably. I then tracked the two signs and applied the new logos.
Process in Nuke
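As a minimal sketch of the one-point tracking idea, the logo can be offset each frame by the tracked feature's drift from the reference frame. The track values and logo position below are made-up examples, and a real Nuke Tracker also solves rotation and scale, not just translation:

```python
# Hedged sketch: reposition a logo by a tracked feature's per-frame motion.
# All coordinates are invented example values, not the actual track data.

def logo_positions(track, logo_ref, ref_frame=0):
    """For each frame, move the logo by the tracked point's drift
    from its position on the reference frame (1-point track:
    translation only, no rotation or scale)."""
    rx, ry = track[ref_frame]
    out = []
    for tx, ty in track:
        dx, dy = tx - rx, ty - ry
        out.append((logo_ref[0] + dx, logo_ref[1] + dy))
    return out

# Example: a sign feature tracked over four frames.
track = [(100.0, 200.0), (102.0, 201.0), (105.0, 199.0), (103.0, 202.0)]
positions = logo_positions(track, logo_ref=(400.0, 300.0))
print(positions)
```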



Original plate. The clip is a bit shaky, so I am going to stabilize it by tracking two points in the scene.
The clip is now stabilized, and I have tracked the sign on the left and applied a new image to it.
Process in Nuke
Timelapse: stabilization process
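The two-point stabilization can be sketched numerically: from two tracked points you can solve the rotation, uniform scale and translation that pin each shaky frame back onto the reference frame. The coordinates below are illustrative, not the actual track:

```python
# Hedged sketch of 2-point stabilization using complex numbers:
# solve z -> a*z + b mapping this frame's tracked points p1, p2
# onto their reference-frame positions q1, q2.

def stabilize_transform(p1, p2, q1, q2):
    p1, p2, q1, q2 = (complex(*p) for p in (p1, p2, q1, q2))
    a = (q2 - q1) / (p2 - p1)   # rotation + uniform scale
    b = q1 - a * p1             # translation
    return a, b

def apply(a, b, point):
    """Apply the stabilizing transform to any pixel coordinate."""
    z = a * complex(*point) + b
    return (z.real, z.imag)

# Reference-frame positions of the two tracked features:
q1, q2 = (100.0, 100.0), (300.0, 100.0)
# The same features on a shaky frame (drifted right and down):
p1, p2 = (105.0, 108.0), (305.0, 108.0)

a, b = stabilize_transform(p1, p2, q1, q2)
print(apply(a, b, p1))  # pinned back onto the reference position
```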

VIRTUAL PRODUCTION

The film industry continues to be imbued with more complex action, and invariably more complex visual effects. Filmmakers are turning to new production techniques to imagine these scenes virtually even before they've been shot, to scout sets, to interact live with CG assets and characters, and to shoot, revise and iterate virtual scenes on the fly.

As VFX have grown to be a greater part of movies and television today, there's a growing divide between what the filmmakers can see through the camera on the live-action set, and what they have to imagine will be added digitally many months later. Virtual production attempts to unite those two worlds in real time. Using game engines like Unreal and Unity, combined with high-power graphics cards, camera tracking, and VR and AR, filmmakers are now able to create scenes across the physical and digital worlds together.

A new type of filmmaking

Virtual production empowers filmmakers to create shots based on what "feels" right, rather than looking at a computer and guessing where things should go when something doesn't feel right. And they can do it quickly, without having to spend hours moving heavy equipment.

The technology puts the tools of storytelling back in the hands of the filmmakers, rather than an army of technicians. This lets them explore ideas much faster, with intuition and a stronger creative vision.

Art of (LED Wall) Virtual Production Sets, Part Two: 'How you make one' –  fxguide
The full video explores shooting with real-time parallax; blending CG and real-world sets with set extension tools; using VR tools to scout, set dress, and measure environments; working collaboratively in live, multi-user sessions; controlling lighting and environment from an iPad; and using nDisplay to blend the output from multiple render nodes.

New worlds

Several films with hugely fantastical worlds and characters have taken advantage of virtual production techniques to get their images on the screen. 

Avatar represented a huge leap forward in virtual production (and there are several sequels on the way). Ready Player One relied on virtual production to bring its OASIS world to life; it's in the OASIS that the avatars of the human characters interact. That virtual world was 'filmed' with motion-captured actors using a game-engine-powered, real-time simulcam set-up that allowed director Steven Spielberg to 'find' the right shots, since the sets the actors occupied had already been built in rough digital form. He could also revisit scenes afterwards, re-framing and re-taking them even after the performances were done.

Ready Player One

Epic Games' Unreal Engine is one of the significant players in this virtual production environment. It made it possible, for example, not only to film a scene with motion-captured actors and a virtual camera, but also to introduce ray tracing into the real-time rendered output. I personally find the result incredible.

This demonstration is a collaboration between Epic’s ray tracing experts, NVIDIA GPU engineers and the creative artistry of ILMxLAB.

The two major game engines, Unreal and Unity, are certainly pushing virtual production: they are the backbone of many real-time rendering and virtual set production environments. Importantly for independent filmmakers, these engines are generally available for free, at least to get started.

Unreal Engine by Epic Games

Virtual Camera

A virtual ARRI Alexa with a 28 mm lens, from I AM MOTHER

The Virtual Camera enables a user to drive a Cine Camera in Unreal Engine using an iPad in a virtual production environment. With ARKit or an optical motion-capture system such as Vicon or OptiTrack, the position and rotation of the iPad are broadcast wirelessly to the PC, and the PC sends video back to the iPad.
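As a hedged sketch of what "broadcast wirelessly" might involve at the lowest level, here is one possible binary layout for a pose packet, suitable for sending over UDP. The layout is invented for illustration; it is not Unreal's actual Live Link protocol:

```python
# Hedged sketch: packing a tracked pose into a small binary packet.
# The packet layout (frame number + position + quaternion, little-endian
# float32) is an assumption for illustration only.
import struct

POSE_FMT = "<I7f"  # frame number, position xyz, rotation quaternion xyzw

def pack_pose(frame, pos, quat):
    return struct.pack(POSE_FMT, frame, *pos, *quat)

def unpack_pose(data):
    frame, px, py, pz, qx, qy, qz, qw = struct.unpack(POSE_FMT, data)
    return frame, (px, py, pz), (qx, qy, qz, qw)

# Round-trip one pose (values chosen to be exact in float32):
packet = pack_pose(120, (1.5, 0.0, 1.75), (0.0, 0.0, 0.0, 1.0))
decoded = unpack_pose(packet)
print(len(packet), "bytes ->", decoded)
```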

Example of selecting film format
SmartVCS: Intelligent Virtual Cinematography – Girish Balakrishnan

Camera settings such as Focal Length, Aperture, Focus Distance, and Stabilization can be adjusted using touch input. Additionally, the Virtual Camera can be used for taking high-res screenshots, setting waypoints, recording camera motion and other tasks related to virtual production.
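The Focal Length setting maps directly to field of view. A small sketch of that relationship, where the sensor width is an example Super 35-style value, not a specific camera's spec:

```python
# Hedged sketch: horizontal field of view from focal length and sensor
# width, the relationship a virtual Cine Camera mirrors. The default
# sensor width is an illustrative Super 35-style value.
import math

def horizontal_fov(focal_mm, sensor_width_mm=24.89):
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_mm)))

# A wider lens gives a wider view:
wide = horizontal_fov(28.0)
long = horizontal_fov(85.0)
print(f"28 mm -> {wide:.1f} deg, 85 mm -> {long:.1f} deg")
```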

Tutorial on how to use an ipad or iphone with Unreal Engine

The physical cinematography equipment on a film set is replicated in the virtual world, and tracked so they match exactly. This allows the director of photography to put their hands on the gear and react more naturally as the performances unfold.

The assets of the film are made with the game engine, instead of building a physical movie set.

The history of Virtual Production:

Its earliest uses and iterations can be traced to advancements and innovations in filmmaking technology. Peter Jackson's The Lord of the Rings: The Fellowship of the Ring (2001) used virtual reality (VR) goggles and virtual cameras to plan camera moves.

James Cameron took it a step further with Avatar (2009), as seen in Image 1, creating a bioluminescent species and exotic environments with a motion-capture stage and simulcam. Simulcam is a VP tool used to "superimpose virtual characters over the live-action in real-time and aid in framing and timing for the crew" (Kadner 2019).

Jon Favreau continues to lead the charge with ground-breaking film and television projects such as The Jungle Book, The Lion King (2019), and The Mandalorian (2019), by designing and altering photorealistic environments in real-time.

Image 1: Facial and motion capture in Avatar. Source: ComingSoon.net.

UNCHARTED (Game series)

A short analysis of the game from the initial vision

Uncharted is an action-adventure game series developed by Naughty Dog and published by Sony Interactive Entertainment for PlayStation consoles. The main series of games follows Nathan Drake, a treasure hunter who travels across the world to uncover various historical mysteries.

Locations are usually unexplored, remote places rich in uncontaminated nature, disturbed only by the presence of ancient temples and hidden treasures.

Concept art of Indian temples

The main character, as mentioned before, is Nathan Drake: the kind of successful middle-aged man who conceals his reprehensible behavior behind good looks, buckets of charm, and moral gymnastics. Nathan is a contemplative man torn between the adventures of his past and the domesticity of his present.

Yibing Jiang
Yibing Jiang (Senior Shading Artist): Uncharted 4: A Thief’s End

Leandro Amaral
Leandro Amaral (Lead Cinematic Lighter): Uncharted 4: A Thief’s End

The peculiarity and uniqueness of this game, in addition to adventurous plots full of twists, lie in the graphics in general and in the landscape choices that accompany the protagonist throughout the saga. Starting from a very realistic vision of landscapes and characters, the game develops its own character: a combination of cinematic design and theatricality, like a movie story, but with the experience of being caught up in an adventure so compelling that you feel, if only for a moment, that you're there.

Naughty Dog’s concept artists, character artists, environment artists, modellers, UI artists, lighting artists, shading artists and technical artists have all posted their Uncharted work in ArtStation portfolios; here are some examples:

Hyoung Nam
Hyoung Nam (Concept Artist): Uncharted 4: A Thief’s End
Ashley Swidowski
Ashley Swidowski (Concept Artist): Uncharted 4: A Thief’s End


John Sweeney
John Sweeney (Concept Artist): Uncharted 4: A Thief’s End

These concepts, which I personally think contain the essence of the game, reflect the principles mentioned in the previous lesson: grouping, focal area, balance and rhythm.

Eytan Zana
Eytan Zana (Concept Artist): Uncharted 4: A Thief’s End

Technical achievement mixes with impeccable art design to give the warm feeling of a William Turner painting. Uncharted's artists have developed a fine and unique touch that sets them apart, allowing them to create landscapes and locations full of detail, with cinematic light that brings a theatrical environment to life. This is why Uncharted doesn't look like most movies; it looks better.

Boon Cotter
Boon Cotter (Lighting Artist): Uncharted 4: A Thief’s End

The opening scene of the game

Film Analysis

Little Shop of Horrors 1986

An American horror black comedy musical film directed by Frank Oz, released in 1986.

There were no digital optical effects, green screens, or CGI used in the making of Little Shop of Horrors, with the exception of the reshot ending where the plant is electrocuted, designed by visual effects supervisor Bran Ferren, and of some shots during the rampage in the original ending. To achieve the various sizes of Audrey II, six different sizes of the plant were constructed. Three different scales of Mushnik’s flower shop were also built, allowing production to work with different sizes of the plant simultaneously.

Once filming wrapped each day, the plants had to be scrubbed, patched, and re-painted for the following day. For scenes involving the actors interacting with the largest versions of Audrey II, the frame-rate was decreased to 12 and 16 frames per second, which required the actors to mouth their lines in slow-motion.
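The undercranking arithmetic behind this trick is simple: footage shot at a lower rate and played back at 24 fps speeds the action up by the ratio of the two rates, which is why the actors had to perform in slow motion:

```python
# Hedged sketch of the undercranking arithmetic mentioned above:
# shooting at a lower frame rate and playing back at 24 fps makes
# the puppeteered plant's motion appear faster.

def playback_speed(shoot_fps, playback_fps=24):
    """How many times faster the action appears on playback."""
    return playback_fps / shoot_fps

for fps in (12, 16):
    print(f"shot at {fps} fps -> plays back {playback_speed(fps):.2f}x faster")
```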

To achieve the growth of the plant on-screen, the plant was placed on a small dolly track hidden below the coffee can flower pot. When Oz called “action,” the plant was slowly pulled towards the camera on the track to make it appear as if it was getting bigger.

An American Werewolf in London 1981

A horror comedy film written and directed by John Landis, released in 1981.

No CGI was used. Certain body parts, namely the arms and head, were animatronics that Naughton would wear. They were designed to move and change on screen, in real time. Here's a look at the head work:

An American Werewolf in London also won the first of many awards for special effects makeup master Rick Baker: the inaugural Academy Award for "Outstanding Achievement in Makeup".

The arms created had movable fingers and stretched through a series of inflatables connected by tubes and syringes. The same method was used for the growing spine. When the team needed the hair to grow before our eyes, hairs were literally drawn onto the skin and then filmed in reverse. All of the effects blend together seamlessly thanks to the brilliant cuts and editing of the scene.

When the full body is shown, the actor is mainly under the floor in a cutaway, with the fake body on top.

Recalling Total Recall 1990

An American science fiction action film shot on film using ARRIFLEX cameras and Zeiss lenses.

The film was praised for its innovative use of practical stunts, special effects make-up, miniatures, optical compositing and CG rendering – culminating in a visual effects Special Achievement Academy Award for Eric Brevig (visual effects supervisor), Alex Funke (director of miniature photography), Tim McGovern (x-ray skeleton sequence) and Rob Bottin (character visual effects).

Visual effects supervisor Tim McGovern speaks very candidly in this official behind-the-scenes video.

Most significantly, the Kuato puppet was built on actor Marshall Bell, who played George, a process which took up to six hours and left Bell unable to use the bathroom while it was attached to him.

Every one of Kuato’s movements was controlled by a different puppeteer, and the team totalled fifteen. All very pre-CGI.

But it was one scene in particular that marked the transition to what would quickly become the main form of special effects, effectively kicking miniatures and puppets into touch.

The special effects team were faced with the challenge of creating what appeared to be a real X-ray – with the bones looking transparent at the centre – but equally making it resemble the actors and reproduce their movements convincingly. And the only way to do this was to write special, unique software.

The mo-cap technique to capture Schwarzenegger’s own movements was a primitive version of today’s familiar tiny dots. The actor wore a body suit adorned with 18 reflective bulbs and, initially, it appeared to do the job.

But it simply wasn’t detailed enough, and the team eventually had to resort to rotoscoping the scene using other footage that included all of his limbs, enabling the skeleton’s movements to match his.

This sequence was the only CGI element in the film, and it pushed available computer technology to its absolute limit, so hand finishing was the only way to complete the scene.

Mars Mountains from Total Recall (1990) during air eruption, built by Stetson Visual Services, Inc. – Mark Stetson & Robert Spurlock – and filmed by visual effects facility Dream Quest Images, Inc. with Eric Brevig, visual effects supervisor.

Gremlins 1984

Gremlins is a 1984 American comedy horror film


Gremlins was produced at a time when combining horror and comedy was becoming increasingly popular.

Gizmo

Some of the performances were shot on the Courthouse Square and Colonial Street sets of the Universal Studios lot in Universal City, California. This required fake snow; Dante also felt it was an atmosphere that would make the special effects more convincing. As the special effects relied mainly on puppetry (an earlier attempt to use monkeys was abandoned because the test monkey panicked when made to wear a gremlin head), the actors worked alongside some of the puppets. Nevertheless, after the actors finished their work for good, a great deal of effort was spent finishing the effects.

Numerous small rubber puppets, some of them mechanical, were used to portray Gizmo and the gremlins. They were designed by Chris Walas. There was more than one Gizmo puppet; occasionally Galligan, when carrying one, would set it down off camera, and when Gizmo appeared again sitting on a surface it was actually a different puppet wired to that surface. These puppets had many limitations. The Gizmo puppets were particularly frustrating because they were smaller and thus broke down more often. While Walas recommended making the mogwais larger to make their creation and operation easier for the special effects team, Dante insisted on keeping them small to enhance the creatures' cuteness.

GREMLINS 1984 Mrs Deagle’s Stair Lift

The Abyss

The Abyss is a 1989 American science fiction film

The Abyss won an Oscar for Best Visual Effects, and is remembered chiefly for the then-cutting-edge CG water tentacle. But it also ran the gamut of traditional effects techniques.


The special visual effects work was divided among seven FX divisions, with motion-control work by Dream Quest Images and computer graphics and opticals by ILM. ILM designed a program to produce surface waves of differing sizes and kinetic properties for the pseudopod. For the moment where it mimics Bud and Lindsey’s faces, eight of Ed Harris’s facial expressions were scanned, and twelve of Mastrantonio’s, using software built for creating computer-generated sculptures.

The set was photographed from every angle and digitally recreated so that the pseudopod could be accurately composited into the live-action footage. The company spent six months creating the 75 seconds of computer graphics needed for the creature. The film was to have opened on July 4, 1989, but its release was delayed for more than a month by production and special-effects problems. The animated sequences were supervised by ILM animation director Wes Takahashi. The CGI was produced on SGI and Apple hardware.

abyss1
Medium-wide shots of the actors in real submersibles were shot in an abandoned power station that the production had converted into the world’s largest fresh-water filtered tank, equal in capacity to about eleven Olympic swimming pools.
abyss2
Close-ups of the actors in a submersible mock-up on stage.

The sub chase demonstrates perfectly how visual effects should work: mixing a range of techniques so that the audience never has time to figure out how each one is done, and using an appropriate technique for each individual shot so that you’re making things no more and no less complicated than necessary to tell that little piece of the story.

Video (HD) (First rereleased film with new CGI elements) – added CGI Wave scene

Independence Day 1996

Independence Day is a 1996 American epic science fiction action film

The movie contains more than 500 effects shots, combining computer-generated imagery, digital compositing, digital matte paintings, and traditional miniature model effects, among other techniques. The sheer number of effects shots makes it the biggest effects film of the 1990s.


Director Emmerich rounded up a great team of effects artists; rather than depending on a single effects company, Emmerich and effects producer Tricia Ashford put together their own model photography and CG units. The film was awarded the Academy Award for Visual Effects and also won two awards in the 1996 VFX HQ Awards.

One of the revolutions of Independence Day was the decision by Emmerich and producer Dean Devlin to form an in-house miniatures and pyrotechnics unit specifically for the film. Major sequences included views of massive spaceships appearing above Earth, the destruction of landmark buildings including the White House, and some stunning F-18 dogfight sequences.


Blade Runner (2049) 2017

Blade Runner 2049 is a 2017 American science fiction film.


They shot the project in 1.55:1 aspect ratio on a single Arri Alexa XT Studio camera with Zeiss Master Prime lenses, assisted by an attached crane arm or a dolly. The filmmakers conducted tests with an Alexa 65 camera but preferred the XT Studio’s somewhat grainy image quality, and the choice of lenses corresponded to the scale and lighting specifications of the scenes. For example, close-up character scenes were captured with 32 mm lenses, while sweeping cityscape shots were captured with 14 mm and 16 mm lenses. Occasionally, production filmed with Arri Alexa Mini cameras to capture shots from the spinners, the vehicles used in the film.

Blade Runner 2049 won an Academy Award and a BAFTA; Framestore artists crafted concept artwork used in pre-production and delivered nearly 300 shots of VFX work in post. Tasked with the creation of large-scale CG environment builds and some challenging animation work, Framestore teamed up with VFX supervisor John Nelson to pay homage to the original picture.

One of the great surprises of “Blade Runner 2049” was a stunning CG recreation of the Rachael replicant played by Sean Young in the original movie. The two-minute sequence brings an emotionally stirring reunion with Harrison Ford’s Deckard that required technical virtuosity and subtle performance.


Five different topics that piqued my interest

  1. MINIATURE EFFECT

Different ways to create impossible worlds and locations in movies

A miniature effect is a special effect created for motion pictures and television programs using scale models. Miniature set designers design and build miniature props and sets for motion pictures, creating models for visual effects that meet the look and requirements of the production.


While the use of computer-generated imagery (CGI) has largely overtaken their use, there are still films being made that use stop-motion animation with very elaborate miniaturized sets and fully articulated characters, especially for projects requiring physical interaction with fire, explosions or water; the result is usually a combination of both models and VFX. Titanic (1997), Godzilla (1998), The Lord of the Rings trilogy (2001–03), Casino Royale (2006), Inception (2010), and Interstellar (2014) are examples of highly successful films that have utilized miniatures for a significant component of their visual effects work.

Will there be further development of the miniature effect or have we reached the end of their use?

Christopher Nolan is one of those directors who will use real-life, practical effects if possible. Miniature models of each spaceship in Interstellar were built and often filmed against a projected background of space on the sound stage (no green screens).
While “The Grand Budapest Hotel” is busy with smaller design elements, one of its most striking designs is the hotel itself. Outfitted in shades of pink and purple and situated atop a hill, the hotel is grandiose and picturesque. It also happens to be nine feet tall. For wide shots of the hotel, the director Wes Anderson and his team decided to use a handmade miniature model.
https://www.nytimes.com/2014/03/02/movies/the-miniature-model-behind-the-grand-budapest-hotel.html


  2. WHAT IS RHYTHM?


Analyzing rhythm through edits and beats

Editing a video or a film is one of the most vital aspects of storytelling. The right rhythm and pace of the edit can determine the progression of the narrative; it can also change entirely how the audience receives the message of the film. Weak editing is not only visually irritating, it can also drive the spectator's attention away from the video itself.

How Wright connects one scene to the next, as well as the shots within these scenes, and how he transitions, is how he becomes not just a director but a conductor of this rhythm: a visual composer who understands that the beats between the moments, the bridges, are just as vital to the overall product as the moments themselves.
The first introduction scene and opening credits dialogue from Woody Allen’s 2011 movie “Midnight in Paris”.
A cut every 4 seconds following the beat of the soundtrack
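A "cut every 4 seconds" rhythm can be turned into concrete edit points. This is a sketch with assumed values (24 fps, a fixed interval), not the film's actual edit list:

```python
# Hedged sketch: computing edit points from a fixed cut interval.
# Frame rate and clip duration are illustrative assumptions.

def cut_frames(duration_s, cut_interval_s=4.0, fps=24):
    """Frame numbers where cuts land when cutting on a fixed interval."""
    n_cuts = int(duration_s // cut_interval_s)
    return [round(i * cut_interval_s * fps) for i in range(1, n_cuts + 1)]

# A 20-second clip cut every 4 seconds -> cuts at 4s, 8s, 12s, 16s, 20s.
cuts = cut_frames(20)
print(cuts)
```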

But is there a specific rule to respect in order to create a rhythm?

  3. AERIAL MOTION CONTROL TECHNIQUE

Aerial VFX w/ timelapse


Filmmaker Rufus Blackwell recently shared his latest project with the internet. His technique combines motion control through planned waypoint missions with manipulating the footage in visual effects software to mirror the ground in an almost “Inception”-like effect.

Blackwell filmed his aerial footage entirely with a drone (DJI Inspire 1). 

He has continued to experiment with techniques that involve aerial time-lapse, motion control, and visual effects to find new and unique uses for drones that can capture video.

“I started with basic aerial time-lapse. Then I started working on aerial motion control techniques. Using preplanned waypoint missions it is possible to set the drone up like a motion control rig in the sky. You can repeat the same set of camera moves at different times of day to create a beautiful day-to-night transition.”

Blackwell explained

Blackwell also needed some compositing and animation skills to track, smooth, and simulate additional movement to create the unique motion you see in the video.

“Then on top of that there are various techniques that allow me to reposition the camera in post and create camera moves on top of the drone’s automated movement. Finally, by stabilizing the source material and flipping the image with a soft matte you can create the mirror-world effect, sometimes changing the vertical scale of the images to give a different perspective.”

Blackwell explained
Drone-Lapse work, with a bit of VFX thrown in.

The shooting techniques are many and varied; in this case we are talking about hyperlapses of aerial shots. If we combine new types of filming like these with the art of VFX, we can create a completely different visual experience, and this work is the result.
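The mirror effect Blackwell describes (flip the image, then blend it back with a soft matte) can be sketched on a single column of pixel values; the numbers here are purely illustrative:

```python
# Hedged sketch of the mirror-world blend on one column of pixel values:
# flip the column vertically, then mix the flipped copy in with a soft
# linear matte so the seam fades instead of cutting hard.

def mirror_blend(column):
    """Blend a vertically flipped copy over the original: matte goes
    from 0 at the top (all original) to 1 at the bottom (all mirrored)."""
    flipped = column[::-1]
    n = len(column)
    out = []
    for i, (a, b) in enumerate(zip(column, flipped)):
        alpha = i / (n - 1)          # simple linear soft matte
        out.append(a * (1 - alpha) + b * alpha)
    return out

blended = mirror_blend([0.0, 0.25, 0.5, 0.75, 1.0])
print(blended)  # symmetric: the bottom now mirrors the top
```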

  4. WHAT IS CINEMATIC LIGHTING?

“Cinema” is borrowed from French cinéma, a shortening of cinématographe (a term coined by the Lumière brothers in the 1890s), from Ancient Greek κίνημα (kínēma, “movement”) + -γράφειν (-gráphein, “to write, to record”).

Cinematic lighting is a film lighting technique that goes beyond the standard three-point lighting setup to add drama, depth, and atmosphere to the story. Cinematic lighting utilizes lighting tricks like bouncing light, diffusing light, and adjusting color temperatures.

From the movie “Fight Club”

Lighting is fundamental to film because it creates visual mood, atmosphere, and a sense of meaning for the audience. Whether it’s dressing a film set or blocking actors, every step of the cinematic process affects the lighting setup, and vice-versa. Lighting:

  • Tells the audience where to look.
  • Reflects the psychology of characters.
  • Defines and supports the genre of the film.
Cinematic lighting (Subway re-lighting)

Who are the main roles in the lighting setup for a scene?

Firstly, the director shares visual inspiration and ideas for the cinematic lighting; the director of photography, or cinematographer, then creates the lighting plan with input from the director. The gaffer designs and executes the cinematographer’s lighting plan and oversees the crew that brings it to life.

With excellent light it is possible to obtain beautiful, powerful shots even if the location is not the best. But what rules must be respected to achieve a good level of cinematic light?
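One quantity lighting crews reason about is the key-to-fill ratio. A minimal Lambert-shading sketch shows how that ratio sets contrast; the light directions and intensities below are invented examples, not a real lighting plan:

```python
# Hedged sketch: Lambert diffuse contribution of a key and a fill light
# on one surface point. All values are illustrative.
import math

def lambert(normal, light_dir, intensity):
    """Diffuse contribution: intensity * max(0, N.L), vectors normalized."""
    n = math.sqrt(sum(c * c for c in normal))
    l = math.sqrt(sum(c * c for c in light_dir))
    dot = sum(a * b for a, b in zip(normal, light_dir)) / (n * l)
    return intensity * max(0.0, dot)

normal = (0.0, 0.0, 1.0)                       # surface facing the camera
key = lambert(normal, (0.0, 0.0, 1.0), 1.0)    # strong frontal key light
fill = lambert(normal, (1.0, 0.0, 1.0), 0.25)  # weaker side fill light
print(key, fill, key / fill)  # the key/fill ratio sets the contrast
```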

  5. WHY IS RETOPOLOGY USED?

Retopology is the process of converting high-resolution models into something much smaller that can be used for animation. It can be a difficult process, but the basic idea is to create another mesh that simplifies the original HD asset.

In the past a 3D artist would painstakingly build a mesh polygon-by-polygon. The problem with this approach is that it’s very technical and difficult.

Gromit character retopologized

Sculpting is a more intuitive process that lends itself better to how artists think and work.

The downside is that you have to create a lot of polygons to sculpt the curves you need. And too much detail will slow down even the most powerful computer.

That’s where retopology comes in. By overlaying a low-polygon mesh onto a high-polygon mesh you can get something that’s perfect for animating and easy on your processor.

Retopology is the act of recreating an existing surface with more optimal geometry. A common use-case is creating a clean, quad-based mesh for animation, but it is also used for almost any final object that needs to be textured, animated, or otherwise manipulated in ways that sculpted meshes are not conducive to.
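As a toy illustration of the reduction side of this (not real retopology, which builds clean quad flow, but enough to show the polycount trade-off), dense points can be merged into a coarse grid:

```python
# Hedged sketch: merge dense points into a coarse grid to illustrate the
# detail-for-weight trade-off behind decimation. The points are made-up
# 2D samples; real retopology works on 3D meshes and builds quad topology.

def decimate(points, cell=1.0):
    """Merge points that fall into the same grid cell."""
    seen = {}
    for x, y in points:
        key = (round(x / cell), round(y / cell))
        seen.setdefault(key, (x, y))   # keep the first point per cell
    return list(seen.values())

dense = [(0.1, 0.1), (0.2, 0.15), (0.9, 1.1), (1.05, 0.95), (2.0, 2.0)]
coarse = decimate(dense)
print(len(dense), "points ->", len(coarse))
```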

Roles in the VFX industry

Here I identify the different roles in the VFX industry.

Let’s start with the Runner, the Matchmove Artist and the Roto Artist.

Runner

is one of the first roles through which to approach the VFX environment: an entry-level job and a great way of getting into the industry. Generally the runner is someone passionate about film or animation with some VFX or media art qualifications. Working as a runner in a studio allows you to gain experience and the opportunity to move forward in the world of VFX, learning from industry experts. Runners are all-purpose helpers in a VFX studio, so they can have a heavy workload. They support all members and departments in the studio and make sure that everyone has what they need, doing a variety of jobs. They deliver materials and messages between departments and organize meetings and schedules. Runners may also have the opportunity to get their hands on programs such as Maya, Nuke or After Effects, usually under the supervision of a mentor or supervisor.

Matchmove artists

combine computer-generated (CG) scenes with live footage so the two can be convincingly combined, recreating live-action backgrounds (plates) on a computer in a way that mirrors the camera on set in every respect, including lens distortion. They do this by tracking the camera’s movements to make sure real and virtual scenes appear from the same perspective. Sometimes matchmove artists go to the set to take measurements and place tracking markers. They then use these markers to track the camera movement and calculate the relevant coordinates in the 3D scene, using 3D tracking programs like Maya or 3DEqualizer. Matchmove artists also perform body and object tracking, using markers to recreate the movements of people, vehicles or other objects in CG. The motion files created (camera, object or body tracks) are then passed to other departments via the VFX pipeline so that, in the end, everything can be perfectly combined by the compositor. Matchmove artists are extremely accurate and meticulous in their work. It has to be pixel-perfect, so they need an eye for detail: by the end of a shot, the CG and live-action movements must line up exactly.
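The "pixel perfect" requirement can be made concrete: a solved camera must project a known 3D point onto the same pixel as its 2D track. A minimal pinhole-projection sketch, with invented camera values:

```python
# Hedged sketch of the check at the heart of matchmoving: the solved
# camera's projection of a 3D marker should land on the tracked 2D
# position. Camera and marker values are simple illustrative numbers.

def project(point3d, focal, center):
    """Pinhole projection of a camera-space point (z > 0) to pixels."""
    x, y, z = point3d
    return (center[0] + focal * x / z, center[1] + focal * y / z)

tracked_2d = (1060.0, 540.0)                  # where the marker was tracked
projected = project((1.0, 0.0, 10.0), focal=1000.0, center=(960.0, 540.0))
error = ((projected[0] - tracked_2d[0]) ** 2 +
         (projected[1] - tracked_2d[1]) ** 2) ** 0.5
print(projected, error)  # a good solve keeps this error sub-pixel
```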

Roto artists

are artists who use animation techniques to trace over motion-picture footage, frame by frame, to produce realistic action. They manually draw around and cut out objects from movie frames so that the required parts of the image can be used, a process known as rotoscoping. The parts of an image that are kept after cutting out are known as mattes. Roto artists work on the areas of live-action frames where computer-generated (CG) images or other live-action images will overlap or interact with the live image. If the live-action camera is not moving within a shot, rotoscoping might involve only one frame; if the camera is moving, roto artists trace the relevant areas of every frame within the shot so that the CG can be combined accurately with the live action. Roto artists need a keen eye and patience to complete this meticulous, repetitive work. In addition to rotoscoping, they might assist in the preparation of material for compositing. Roto artists are typically employed by VFX studios but can also be freelancers. They generally use compositing and 3D programs such as Nuke and Blender.
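What a roto shape produces can be sketched as rasterizing a polygon into a matte, 1.0 inside the shape and 0.0 outside. The shape and resolution below are made-up examples:

```python
# Hedged sketch: turning a hand-drawn roto polygon into a matte image
# with an even-odd crossing test. The shape and image size are invented.

def inside(px, py, poly):
    """Even-odd rule point-in-polygon test."""
    hit = False
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        if (y1 > py) != (y2 > py):               # edge crosses this scanline
            if px < x1 + (py - y1) * (x2 - x1) / (y2 - y1):
                hit = not hit
    return hit

def matte(width, height, poly):
    """Sample pixel centers: 1.0 inside the shape, 0.0 outside."""
    return [[1.0 if inside(x + 0.5, y + 0.5, poly) else 0.0
             for x in range(width)] for y in range(height)]

square = [(1.0, 1.0), (4.0, 1.0), (4.0, 4.0), (1.0, 4.0)]
m = matte(5, 5, square)
for row in m:
    print(row)
```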

Concept Artist

A concept artist is a designer who visualizes & creates art for characters, creatures, vehicles, environments, and other creative assets. Concept art is a form of illustration used to visualize ideas so that modelers, animators, and VFX teams can make these ideas ready for production to use in films, video games, animation, comic books, or other media.

Maxime Desmettre artist

The artist must bring these ideas to life through their artwork.

Jungle Concept Art – Uncharted

Concept artists need to know design and how to create a design that blends well with any creative project. A concept artist working for Fortnite will have a very different design style than a concept artist working for Pokemon.

But the fundamental process is still the same. It’s the artist’s creativity and attention to detail that lets them design many varying styles of characters, creatures, or anything else needed for production.

How do they work?

Concept art is developed through several iterations. Multiple solutions are explored before settling on the final design. Concept art has embraced the use of digital technology. Raster graphics editors for digital painting have become more easily available, as well as hardware such as graphics tablets, enabling more efficient working methods.


Prior to this, any number of traditional mediums such as oil paints, acrylic paints, markers and pencils were used. Many modern paint packages are programmed to simulate the blending of color in the same way paint would blend on a canvas; proficiency with traditional media is often paramount to a concept artist’s ability to use painting software. Popular programs for concept artists include Photoshop and Corel Painter. Others include Manga Studio, Procreate and ArtRage. Most concept artists have switched to digital media because of ease of editing and speed. A lot of concept work has tight deadlines where a highly polished piece is needed in a short amount of time.

Matte Painter/Artist *

A matte artist is the modern form of the traditional matte painter in the entertainment industry. A matte artist digitally paints photo-realistic interior and exterior environments that could not otherwise have been created or visited.

The video shows what a digital matte artist does

In more detail?

A matte painting is a painted representation of a landscape, set or distant location that allows filmmakers to create the illusion of an environment that is not present at the filming location. Historically, matte painters and film technicians have used various techniques to combine a matte-painted image with live-action footage (compositing). At its best, depending on the skill of the artists and technicians, the effect is “seamless” and creates environments that would otherwise be impossible or too expensive to film. Within a scene, the painted part is static, and moving elements are integrated on top of it.

Dylan Cole artist

From traditional to digital

Traditional matte painting is older than the movie camera itself; it was already being practiced in the early years of photography to create painted elements in photographs.

With the advantages of the digital age, matte painters have slowly transitioned to a digital work environment, using pressure-sensitive pens and graphic tablets in conjunction with a painting software such as Adobe Photoshop. A digital matte painter is part of a visual effects team being involved in post-production, as opposed to a traditional matte painter, who was a member of a special effects crew, often creating matte paintings on set to be used as backdrops.

A matte painting in progress in Photoshop

Digital matte art is often characterized by an artificially perfect look. One modern approach to address this is the integration of details from photographs of real places to depict realistic scenes. This is why some digital matte artists describe their work as a combination of digital painting, photo manipulation and 3D, used to create virtual sets that are hard or impossible to find in the real world.

Prep/Paint Artist

Prep artists clean up the backgrounds of live-action footage so it is ready for the effects to be layered onto it by the compositor. The shots they work on, known as plates, either moving or still, don’t have foreground action or players included.

A Prep Artist is responsible for rig removal tasks, painting out markers, wires and rigging before the shot can move along the pipeline. High-level quality control and a keen eye for repair work is necessary, as the work must be invisible. The work of a Paint/Prep Artist is likely to be reviewed by a VFX Supervisor, often one frame at a time, and compared back to the original plate on a cinema screen.

Pikachu removal on NUKE

Prep artists use specialist VFX software to clean plates, such as Maya, Photoshop and, particularly, Nuke. There are many processes involved in this cleaning. They remove unwanted dust and scratches from the frame. They sort out dropped frames, where a camera has been unable to capture all the frames in a given time, resulting in little jerks in the action. They remove unwanted items such as boom microphones or electric pylons.
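As a toy illustration of one of these fixes, a dropped frame can be naively synthesized by averaging the frames on either side of the gap. The tiny grayscale grids below stand in for real frames; production tools such as Nuke's retiming nodes use optical flow rather than a plain average.

```python
# Naive sketch of filling a dropped frame: blend the two neighbouring
# frames 50/50 to synthesise the missing one.

def fill_dropped(prev_frame, next_frame):
    """Average corresponding pixels of the surrounding frames."""
    return [[(a + b) / 2 for a, b in zip(prev_row, next_row)]
            for prev_row, next_row in zip(prev_frame, next_frame)]

frame_10 = [[0, 2], [4, 6]]   # frame before the gap
frame_12 = [[2, 4], [6, 8]]   # frame after the gap
frame_11 = fill_dropped(frame_10, frame_12)
print(frame_11)  # [[1.0, 3.0], [5.0, 7.0]]
```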

They typically work with the compositors because they hand their plates over to them. They also work with the roto artists, who cut out objects and help clean the plates. 

Compositing Artist 2D / 3D *

Compositors create the final image of a frame, shot or sequence. They take different digital elements, like animations, background plates, graphics and special effects (SFX), and put them together to make a believable picture.

A short project that explains what VFX is, and especially digital compositing.

They are responsible for the composition of images and shots. They make these look good by way of how the different art assets and elements are digitally placed. Compositors enhance the lighting, create convincing shadows and add motion blur where required.

Different art assets and elements digitally placed to create a new realistic scene (BBC Winter Olympics spot, compositing by Łukasz Stolarski)

They are also responsible for continuity; making sure art from different sources and different artists looks the same. They make sure the blacks and other colours match each other in the image. They spot mistakes and either correct them or send the work back through the pipeline to be improved. They ensure the overall style of the film is consistent and in line with the director’s vision.

Some studios have junior compositor roles. Junior compositors help compositors by doing the simpler parts of the job, sometimes under supervision. They might match colours or add in shadows.

from Game of Thrones

Typical applications

In television studios, blue or green screens may back news-readers to allow the compositing of stories behind them, before being switched to full-screen display. In other cases, presenters may be completely within compositing backgrounds that are replaced with entire “virtual sets” executed in computer graphics programs. In sophisticated installations, subjects, cameras, or both can move about freely while the computer-generated imagery (CGI) environment changes in real time to maintain correct relationships between the camera angles, subjects, and virtual “backgrounds”.
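The green-screen technique described above can be sketched in a few lines: pixels close enough to the key colour are replaced by the background, everything else keeps the foreground. The key colour and threshold below are assumptions; real keyers (in Nuke, for example) work on colour difference and produce soft-edged alpha rather than a hard cut.

```python
# Toy chroma-key sketch over lists of RGB tuples.

KEY = (0, 255, 0)   # the "green screen" colour (illustrative)
THRESHOLD = 100     # max distance from KEY to count as screen

def distance(c1, c2):
    return sum((a - b) ** 2 for a, b in zip(c1, c2)) ** 0.5

def composite(foreground, background):
    """Per-pixel: keep the foreground unless it matches the key colour."""
    return [
        [bg if distance(fg, KEY) < THRESHOLD else fg
         for fg, bg in zip(fg_row, bg_row)]
        for fg_row, bg_row in zip(foreground, background)
    ]

fg = [[(0, 250, 0), (200, 30, 30)]]   # one green pixel, one red pixel
bg = [[(10, 10, 10), (10, 10, 10)]]   # dark background plate
print(composite(fg, bg))  # [[(10, 10, 10), (200, 30, 30)]]
```

The green pixel is swapped for the background; the red one (the "presenter") survives.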

Computer-generated imagery (CGI) environment backgrounds from Avengers: Infinity War

ASSISTANT TD (technical director)

TD assistants help identify and resolve issues and make sure everyone in a visual effects production pipeline has the tools they need. They must have a very good understanding of how VFX pipelines work and all the different VFX professional roles. Their experience also includes understanding the software used by VFX artists and the needs and limitations of the different departments.

but what is a technical director?

Assistant TDs assist pipeline TDs and other TDs to gather information on the needs of each department. They design solutions for problems that arise and also use coding skills to create small-scale tools needed by the VFX artists. They deal with minor bug reports, freeing pipeline TDs to deal with bigger problems, such as rendering errors.

They also work closely with data input/output technicians and with supervisors to develop solutions, such as creating their own plug-ins.

An assistant TD must have a strong understanding of all jobs within the pipeline, their roles, their needs and the challenges they face.

a typical VFX pipeline

Assistant TDs are also responsible for data management, archiving and restoring and tracking data and converting and resizing files where needed. They help to keep the project on schedule. Along with pipeline TDs, assistant TDs will work closely with research and development teams, who design and test any new software.


PRODUCTION COORDINATOR

Production coordinators run the production office.

A production coordinator works with a producer or production manager. They help to coordinate the individuals involved in the filmmaking to ensure everyone comes together for the video production. They also supervise any production assistant staff. It tends to be an office position rather than a hands-on one.

How to become a Production Coordinator?

Production coordinators start work during pre-production. They set up the production office, organising equipment, supplies and staff. They coordinate travel, accommodation, work permits and visas for cast and crew. They also distribute shooting schedules, crew and cast lists, scripts and script revisions.

During production, production coordinators are responsible for preparing, updating and distributing crew lists, daily progress reports and script changes. They also deal with call sheets and transport requirements. They let the transport captain know what is needed and organise couriers and shipping companies.

As the shoot draws to an end, production coordinators help the production manager to ‘wrap’ the production. They close accounts with suppliers, return surplus stock and tie up all loose ends. They usually work on a freelance basis.


What’s a production coordinator good at?

  • Knowledge of filmmaking: understand the process and the needs of each department
  • Organisation: plan, multi-task, work calmly under pressure
  • Innovation: find solutions to problems, deal with the unexpected
  • Communication: work as a team, share information with heads of departments
  • Budgeting: keep records of spending and control the budget

MODELLING ARTIST

Modelling artists create characters, weapons, plants and animals on a computer in 3D.


They often start with a brief or 2D drawing from a concept artist and build their 3D models from that. Or they can work from reference materials, such as photographs or line drawing sketches, which can be scanned into 3D software.

Crafting gruesome fantasy creatures (Ken Barthelmey)

They first create a ‘wireframe’, commonly referred to as a ‘mesh,’ of the object. This looks like a series of overlapping lines in the shape of the intended 3D model. From the mesh, they are able to sculpt the model of the object to closely resemble what’s intended. They use digital tools, such as sculpting brushes, and a physical graphics pen and tablet.

A ‘wireframe’, commonly referred to as a ‘mesh’, of the object
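Under the hood, a mesh is simply a list of vertex positions plus faces that index into it, which is also the layout the OBJ file format uses. The class below is a minimal sketch of that structure, not any particular package's API.

```python
# Minimal mesh: vertices as (x, y, z) tuples, faces as index tuples.

class Mesh:
    def __init__(self, vertices, faces):
        self.vertices = vertices  # list of (x, y, z) positions
        self.faces = faces        # each face: indices into vertices

    def edge_count(self):
        """Count unique edges; each face contributes its boundary."""
        edges = set()
        for face in self.faces:
            for i in range(len(face)):
                a, b = face[i], face[(i + 1) % len(face)]
                edges.add((min(a, b), max(a, b)))
        return len(edges)

# A single quad, the kind of face a quad-draw tool produces:
quad = Mesh(
    vertices=[(0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0)],
    faces=[(0, 1, 2, 3)],
)
print(len(quad.vertices), quad.edge_count())  # 4 4
```

Sculpting tools in Maya or ZBrush edit the vertex positions; the face list is what gives the "overlapping lines" wireframe look.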

Modelling artists work at an early stage of the CG and 3D part of the VFX pipeline. The 3D models that they produce can then move on to be animated, given texture and lit.

If a modelling artist specialises in creating a specific type of 3D model, for instance, characters, then they may refer to themselves as a character artist.

In this case, they will likely create both the models and textures for characters.

The best 3D modelling software:

  1. Maya. Industrial-strength 3D modelling software, with a price to match.
  2. Houdini. 3D modelling software used in today’s movie and TV VFX.
  3. Cinema 4D. Brilliant 3D modelling software for beginners and pros alike.
  4. Autodesk 3ds Max
  5. Modo
  6. LightWave 3D
  7. ZBrush
View of Maya interface

Modelling artists work for VFX companies or studios or as freelancers. Smaller VFX companies or studios may not distinguish between modelling and texturing artist roles, and instead advertise for one position to do both roles. Modelling artists take the brief from the concept artist. They draw their models into the work created by environment artists, so they work closely with them. They then pass their work onto the texture artists, riggers or animators.

TEXTURING ARTIST

Texturing artists make 3D models believable by mapping textures onto the object’s surfaces. They also add an extra dimension with imperfections – rust to oil cans, scuffs to trainers, rips to fabric and reflections to windows.


They are concerned with making surfaces realistic, to help the player forget that they are in a computer-generated world. They may use a combination of hand painting, photos, digital and 3D art to create unique custom textures. They use materials, shaders, mapping and an understanding of how these interact and respond.
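The mapping step can be illustrated with a tiny texture lookup: every surface point carries (u, v) coordinates in [0, 1] that index into the 2D texture image, so the painted texture "wraps" onto the 3D model. This sketch uses nearest-neighbour lookup for simplicity; real engines filter (bilinear, mipmaps).

```python
# UV texture lookup: map (u, v) in [0, 1] to a texel in a 2D texture.

def sample(texture, u, v):
    """texture is a list of rows of texels; return the nearest texel."""
    height = len(texture)
    width = len(texture[0])
    x = min(int(u * width), width - 1)
    y = min(int(v * height), height - 1)
    return texture[y][x]

# A 2x2 "texture" with two dark texels and two light ones:
texture = [["dark", "light"],
           ["light", "dark"]]
print(sample(texture, 0.1, 0.1))  # dark  (top-left texel)
print(sample(texture, 0.9, 0.1))  # light (top-right texel)
```

Unwrapping the model's UVs well (as mentioned in the mask project above) is what makes this lookup land the right pixel on the right part of the surface.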

Which tool do they use?

  • Image editing software (Adobe Photoshop)
  • 3D painting software (Mudbox, ZBrush, Substance Painter, Substance Designer, Quixel)
  • Games engines (Unity, Unreal)

Texturing artists work with all the other members of the art department – the concept artists, environment artists, modelling artists and so on – as well as designers and programmers. They usually report to the art director.

Environment Artist

3D Environment based on Jeremy Pallotin’s artwork

An environment artist is a professional artist who works in the video game industry as a 3D modeler, specializing in outdoor and indoor locations for a game’s setting. They are responsible for creating the majority of the assets and visuals the player encounters on screen: modelling, texturing and placing assets, buildings, streets, foliage, furniture and all other elements in a scene.

They often start with 2D art created by a concept artist and matte artist and turn it into a believable environment in 3D. Sometimes they use photographs, sometimes their own imagination. Environment artists carefully consider the level designers’ gameplay requirements. They find out what’s mission-critical and ensure those elements are included.

Environment from Spyro 1998

If they create an environment that’s too detailed, it could cause the game to lag. Consideration of technical aspects such as polygon count within the environment can prevent that from happening. Maybe a Retopology Artist could save the situation…

Environment artists usually work with all the other members of the art department, as well as the designers and programmers working on the same game.

Uncharted 4: A Thief's End images courtesy of Naughty Dog.
UNCHARTED 4’S ENVIRONMENT

The best 3d environment software:

  • Image editing software such as Adobe Photoshop
  • 3D modelling, sculpting and painting software (Blender, 3ds Max, Maya, Mudbox, ZBrush, Substance Painter, Substance Designer, Quixel)
  • Games engines: Unity, Unreal

LOOK DEVELOPMENT ARTIST

Look development artists define the look of CG creatures or objects to ensure all the art in a film or TV series is consistent. Look development is part of the pre-production phase, where a show or movie’s overall artistic and scene styles are established by asking how stylized, flashy, etc. the final design should be.

They work with lighting TDs, texturing artists and creature TDs to establish the different looks, balancing the processes of texturing, lighting and rendering to match reference images and real footage.  

Character courtesy of www.td-u.com

The looks that the look development artist creates are really important because, once established, all the artists in the VFX pipeline use them when they create their assets. So consistency and quality are the key words. Often the role of look development is covered by a lighting artist.

Look development can be divided into traditional, 2D digital, and 3D digital:

Traditional mediums include canvas paintings, watercolors and, in some cases, maquettes. Maquettes are sculptures made with clay, wax or even wood, among other materials. A detailed maquette serves as a reference for further design choices and helps visualize the concepts.

Maquettes example

The 2D digital look development process determines the majority of the design choices. It comprises mainly concept art and color charts. For 2D look development, most tools are fairly accessible to all artists and amount to no more than the best digital painting programs: Photoshop, Krita, Affinity Designer or Clip Studio Paint, among many others.

What is Look Development?

With 3D digital look development, the end goal is to determine rendering choices and stylized asset designs. This is geared towards shading, texturing, and rendering. After look development exploration is done the 3D assets can be considered ready for production.

Look development artist software:

Almost all of the digital content creation (DCC) packages, such as Maya, Cinema 4D or 3ds Max, can be used for this process. There are, however, applications that specialize in look development and accommodate rapid iteration, such as Keyshot, Katana and Clarisse. In addition, the 3D phase also requires testing several rendering engines to see which gives the best final output.

For instance, Autodesk’s Arnold might be a good all-around choice for any subject, but for interior scenes it’d be better to use a biased render engine such as Maxon’s Redshift or Chaos Group’s V-Ray for faster turnaround.

Lighting artist

A 3D lighting artist is a member of a larger team of 3D artists who specialize in lighting the final scene (or render) in a style that blends properly with the 3D project. This can include lighting for video games, movies, instruction videos, or architecture mockups.

Like a director of photography and gaffer who decide what lights to use and where to place them in a movie, the lighting artist does it in a computer-generated (CG) animation. The difference with an animation is that the lights are created through software and the lighting artist has complete control of what the effects will be.

before
after

Modern render engines and most real time game engines calculate lighting in a way very similar to that in the real world. But the lights you use in a scene need to be set up by the artist to give the scene the final look.

Lighting artists use light to enhance the atmosphere, tone, depth and mood of a scene. They input different light effects depending on the factors in a frame or scene, such as the weather or the time of day. They make it clear where the light sources are meant to be in a scene.

There are several forms of digital lights, and they vary depending on the software & the render or real-time engine being used.

Spotlight:

Spotlights behave in the 3D world very similarly to their real-world counterparts. The light is emitted from a source within the 3D space and projects a “cone” of light in a single direction. These lights are very customizable, allowing you to change the direction of the cone and the intensity and color of the light. Within the cone settings you usually have options called near and far attenuation (or falloff).

a “cone” of light in a single direction
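The cone and attenuation settings reduce to simple arithmetic, sketched below with illustrative numbers: no light outside the cone angle, full intensity before the near distance, none past the far distance, and a linear falloff in between. Real engines offer other falloff curves (quadratic, inverse-square); linear is just the easiest to show.

```python
# Spotlight contribution at a point, given its angle off the cone
# axis and its distance from the light source.

def spotlight(intensity, cone_angle_deg, near, far, angle_deg, dist):
    if angle_deg > cone_angle_deg:
        return 0.0                 # outside the cone: no light
    if dist <= near:
        return intensity           # before near attenuation: full
    if dist >= far:
        return 0.0                 # past far attenuation: none
    # Linear falloff between the near and far distances.
    return intensity * (far - dist) / (far - near)

print(spotlight(1.0, 30, 5, 15, angle_deg=10, dist=5))   # 1.0
print(spotlight(1.0, 30, 5, 15, angle_deg=10, dist=10))  # 0.5
print(spotlight(1.0, 30, 5, 15, angle_deg=45, dist=10))  # 0.0
```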

Sun/Sky Light

Sunlight is usually a combination of a spotlight and an HDRI. This type of light tends to import with a custom HDRI that is relatively flat, giving off no strong lighting information; it is only used to soften shadows and other darker areas. The “sun” light acts similarly to a spotlight, producing the main bulk of the light in the scene. Because the main light is separate from the HDRI, you get more customizability in your lighting, which can give some nice results.


HDRI Light

HDRI lights are an interesting form of lighting where all the information is taken from an image and translated into lighting data. HDRI stands for high-dynamic-range imaging, a technique used in photography to produce images that can store a larger amount of information than a standard photograph. This can be useful in many post-processing applications.
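The core of HDRI lighting is a direction-to-pixel lookup: a world-space direction is converted to (u, v) coordinates on an equirectangular (lat-long) image, and the pixel there becomes the incoming light from that direction. Mapping conventions vary between renderers; the sketch below uses one common choice.

```python
import math

def direction_to_uv(x, y, z):
    """Map a unit direction to equirectangular (u, v) in [0, 1]."""
    u = 0.5 + math.atan2(x, -z) / (2 * math.pi)  # longitude
    v = 0.5 - math.asin(y) / math.pi             # latitude
    return u, v

# Straight up should sample the top row of the lat-long image:
u, v = direction_to_uv(0.0, 1.0, 0.0)
print(round(v, 3))  # 0.0 -> top of the image
```

A renderer evaluates this lookup for many directions per shading point, which is how a single photograph can light a whole scene.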


Ambient Light

Ambient lights are very different from the others in this list. While they technically have a source within the scene that you can select and move, when it comes to rendering there is no source for this light. All it does is brighten up your entire scene uniformly, causing no bright spots where a regular source light would. This also limits customizability: you can only change the light’s color and brightness.


Lighting artist software:

  • Image editing software: Adobe Photoshop
  • 3D lighting software: V-Ray, Arnold, Redshift and Renderman
  • 2D compositing software: After Effects, Blackmagic Fusion and Nuke

Production Manager

The VFX Production Manager is primarily responsible for the management of production flow for VFX projects. Attending client and internal show meetings, generating schedules and status reports, and reporting to the VFX Producer on show progress, are all part of the daily responsibilities of the VFX Production Manager.

Production managers look after the budget too. They oversee the work of the production coordinator in scheduling the work and might also be involved in casting or hiring artists and drafting contracts. They work alongside the VFX artists and technical directors (TDs) from all parts of the VFX pipeline to see that work is completed on time.

interview with a production manager

Production managers are also important in communicating with the producer of the company shooting the live-action footage and producing the film or TV programme. Ultimately, though, they act on the decisions made by the VFX producer.

Production managers tend to be employed by VFX companies rather than working as freelancers.

Duties:

A production manager must keep in contact with the various departments on a production in order to inform them of their schedules.

Also, a production manager must understand all aspects of the VFX pipeline, know the processes, the creative challenges and the software, in order to anticipate any issues that might occur during the project, adapt to changing timescales and technical issues.

PRODUCER

A producer oversees entire projects from a managerial position. They are involved from the beginning of the process, from early client discussions, bidding and shot breakdowns, right through to making sure the final result is of a high standard and delivered on time.

VFX Producer Hasraf Dulull

VFX producers write the bid: the document through which they persuade the film or TV series’ producer to take their VFX studio on to do the VFX work on a project. VFX producers put together the team of VFX artists and other technical staff. They set the schedules for the work and they manage the budget.

Works with:

  • VFX Supervisor
  • Department Supervisors
  • VFX Coordinator
  • Company Director
  • Client Producers/Directors

While filming is happening, VFX producers work closely with the live-action production crew. They also work with the editor in post-production. They communicate between the crew and editor. How much they interact with the client varies between studios. They might report to them on a weekly or even daily basis.

Duties:

  • Act as the main contact for client relations.
  • Provide early ‘ballpark’ estimate bids, and work with the VFX Supervisor to provide more in depth shot breakdown quotations to clients.
  • Schedule production objectives, tasks and milestones between all involved, including supervisors, artists.
  • Schedule client review sessions and screenings.
  • Liaise with clients on formulating approval processes for projects.
  • Ensure all shots/assets are tracked, including their status, version number, changes, etc., throughout the process until final delivery.
  • Maintain a smooth workflow and operation of all departments, including any off-site or remote artists.
  • Encourage new client relationships and strive to bring the company new work.

ANIMATOR

An Animator breathes life into a modelled character by making it move, talk, and express emotions. They might animate vehicles or machinery too.

Animators create animation ‘frames’ (images), using the ‘rig’ (the digitally moveable 3D model). When the frames are put together in sequence, they form the animation.
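The relationship between keyframes and frames can be sketched in a few lines: the animator poses the rig at a few key frames, and the software interpolates the values in between. Linear interpolation is shown here for clarity; animation curves in Maya are usually splines with adjustable tangents.

```python
# Interpolate an animated value (e.g. a joint rotation) between keys.

def interpolate(keyframes, frame):
    """keyframes: sorted list of (frame, value). Return the value at
    `frame`, linearly blended between the surrounding keys."""
    for (f0, v0), (f1, v1) in zip(keyframes, keyframes[1:]):
        if f0 <= frame <= f1:
            t = (frame - f0) / (f1 - f0)
            return v0 + t * (v1 - v0)
    raise ValueError("frame outside keyed range")

# An elbow rotation keyed at frames 0, 12 and 24:
keys = [(0, 0.0), (12, 90.0), (24, 45.0)]
print(interpolate(keys, 6))   # 45.0 (halfway to the first key)
print(interpolate(keys, 18))  # 67.5 (halfway back down)
```

Every animated attribute on the rig gets a curve like this; evaluating them all at a given frame produces one animation frame.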

In some films, a process of motion capture is used for certain characters. This is where an actor wears a special skin-tight suit with motion trackers on it, so that the movement and expression of their performance can be captured digitally and translated into a different-looking character animation model.

ILM 'Captain America: Civil War' VFX Part 2: Fully CG Airport Battle |  Animation World Network

The animation team will usually have reviews with the studio’s animation director and the director of the film, both ensuring the performance and narrative are true to the script.

Animators produce work to be integrated into the live-action footage of a film or TV programme. They animate 3D objects as dictated by background film plates, which means that there is footage and a set camera position that they must work to.

Animators in the VFX industry are either employed by VFX studios or they work as freelancers.

The most common software used to animate is Maya, but others like Blender, 3Ds Max, and Cinema 4D are widely used too.

2017 Sci-Fi Film “Attraction” by the talented team at Main Road Post!

Duties:

  • An animator must draw and reveal attitude, emotions and mood through a character’s movement, and have spatial awareness and a feel for movement over time
  • They must have a thorough knowledge of animation: a good understanding of the principles and mechanics of animation
  • Use VFX software: be adept at using relevant programmes such as Arnold, Blender, Maya, Mental Ray, Photoshop, RenderMan, Substance Painter, V-Ray, ZBrush and 3ds Max
  • Work within the production schedule, manage files and meet deadlines
  • Be able to work with other VFX artists in the pipeline, use each other’s resources and work effectively

EFFECTS TD

Effects TDs create special effects for VFX and animation. They make it easier for visual effects (VFX) artists to use effects like explosions, billowing smoke and rushing water.

They create these effects for the VFX artists and animators to use in their sequences; they write the computer language scripts that generate the effects. FX TDs build and test software tools for the VFX artists to use and then they incorporate them into a VFX studio’s production pipeline.

They make sure the effects look believable and also consistent with the style of the animation so that they blend seamlessly with the other art assets.

The role is both artistic and technical and so is ideal for problem-solvers with a good eye. FX TDs aim to meet the director’s vision, in terms of digital FX, with the resources available to them. Each project presents its own complex obstacles. They stay up to date with the latest research and techniques and push software boundaries to find ways to make directors’ creative visions a reality on screen. They make sure the effects look consistent and convincing: effects need to blend in seamlessly with the other VFX and live-action components of the film or TV programme.

Alireza Bidar, FX TD / 3D generalist

An effects TD (FX TD) must have several skills, such as:

a good eye for detail, knowing how to make a sequence look good; a solid understanding of physics, to create accurate and believable movement of particles; and the ability to overcome obstacles and use current technology to find new ways to achieve a creative vision.

Knowledge of VFX production pipelines is also required: a strong understanding of other roles within VFX studios ensures that FX sequences fit into the rest of the process and the finished product. Programming and coding skills are a vital aspect: an FX TD needs a high level of technical ability with a variety of relevant software.

Lastly, an FX TD must communicate well with a team of VFX artists, offering support and taking direction from a VFX supervisor.

FX TD software:

  • Maya
  • Houdini
  • Nuke

Creature technical director (TD)

A creature technical director, also referred to as a creature TD, is a type of technical director in film and video games who is specifically concerned with characters, creatures and mechanical objects.

They develop and program the digital tools for all the artists who work on digital dinosaurs, animals or magnificent beasts, helping them to be as efficient as possible. They then work with pipeline TDs to incorporate the tools into a VFX production pipeline.

TD Creature Designs - ZBrushCentral
The Cuttle Monster. A humanoid creature that is the result of alien genetic experimentation. It’s part cuttle fish, part reptile, and part alien.

The role of a Creature TD may vary from studio to studio in its scope, but is almost always centered around the discipline of rigging: the process of engineering anatomical or mechanical kinematic systems that move and deform digital models, and the design of higher-level interfaces used by computer graphics animators to control the movements of those models.
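The kinematic side of rigging can be sketched with a two-bone chain (say, upper arm plus forearm): each joint adds a rotation, and forward kinematics accumulates them to find where the end of the chain lands. Bone lengths and angles below are illustrative.

```python
import math

def forward_kinematics(bone_lengths, joint_angles_deg):
    """Walk down the chain, accumulating rotations, and return the
    2D position of the chain's end (e.g. the wrist)."""
    x = y = 0.0
    total_angle = 0.0
    for length, angle in zip(bone_lengths, joint_angles_deg):
        total_angle += math.radians(angle)
        x += length * math.cos(total_angle)
        y += length * math.sin(total_angle)
    return x, y

# Shoulder rotated 90 degrees up, elbow bent 90 degrees back:
wx, wy = forward_kinematics([2.0, 1.5], [90.0, -90.0])
print(round(wx, 3), round(wy, 3))  # 1.5 2.0
```

The "higher-level interfaces" mentioned above typically go the other way: the animator places the wrist and an inverse-kinematics solver finds the joint angles.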

The role may additionally encompass disciplines such as modeling and simulation.

In larger studios, the role of the creature TD is focused on creating fur, hair, feathers and skin. The goal is the same: to make sure the effects look convincing and blend seamlessly with the other VFX and live-action components of the film or TV programme.

Creature TDs can be employed by VFX studios or work as freelancers.

Lonnie Kraatz Creature TD Demo Reel
Lonnie Kraatz

Software:

The software used by creature TDs may vary widely from studio to studio, from off-the-shelf tools to proprietary in-house systems. Autodesk Maya is used predominantly throughout the VFX and animation industry, with Softimage also having a large user base. In the gaming industry, Autodesk Maya and Autodesk 3ds Max have been the dominant presence.

Many studios pair off-the-shelf software with their own in-house software and plug-ins for rigging and simulation. For instance, Industrial Light & Magic does much of their sim setup and simulation in a proprietary package called Zeno, and Weta Digital uses an in-house simulation system they call Tissue.

Notable newcomers to the field of rigging include Fabric Engine, an independent platform used by Double Negative, MPC and Hybride.

SHADER DEVELOPMENT TD

Shader TDs write and adjust shaders to determine the appearance of models, and prepare objects to be painted by the texture painters.

The Shading TD works alongside modelers and lighters to create the look of characters, sets and other objects in the film. The shading is created with RenderMan shaders, using a mixture of painted and procedural textures along with sophisticated illumination models.
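The "procedural textures" mentioned above are patterns computed from surface coordinates rather than painted by hand. As a toy illustration (plain Python, not the RenderMan shading language; the function name is made up for this sketch), here is the classic procedural checker pattern over UV space:

```python
def checker(u, v, tiles=8):
    """Toy procedural checker pattern over UV coordinates in [0, 1].

    Returns 1.0 (white) or 0.0 (black) depending on which tile the
    (u, v) coordinate falls into -- the kind of procedural signal a
    shading TD might mix with painted textures in a real shader.
    """
    # Alternate the colour each time either UV axis crosses a tile boundary.
    return float((int(u * tiles) + int(v * tiles)) % 2 == 0)
```

In a production shader the same idea is expressed in a shading language and blended with painted maps and illumination models, but the core logic is just a function of surface position.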

Demo Reel for Lighting and Shading TD

Pipeline TD

Being a pipeline TD is all about helping your team be successful. In this position, you’ll lead the charge on designing and developing custom tools to help everyone else get their work done faster and better.

Being a pipeline TD means you need extensive knowledge about how pipelines work from beginning to end. With this knowledge, you’ll be able to tell when a pipeline isn’t running smoothly and work to figure out what sort of tools you can create to keep the pipeline running.

Since there are a lot of people involved in a typical pipeline, you'll often be expected to interact with various artists along the pipeline and interpret their needs.

A Pipeline TD communicates with VFX artists across the team to understand their needs, putting things in place to ensure the project runs smoothly and the artists' needs are met. If the project isn't running smoothly, the Pipeline TD identifies what kind of tools need to be developed to fix this. Issues that arise might be technical ones to do with 3D art, or productivity issues. The job involves writing or modifying code to solve problems, as well as providing face-to-face technical assistance. Pipeline TDs work closely with research and development teams, who design and test any new software.
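A typical small pipeline tool of this kind handles asset versioning: finding the latest published version of an asset and computing the next filename so artists never overwrite each other's work. The sketch below is a minimal, hypothetical example in plain Python; the `asset_vNNN.ext` naming scheme is invented for illustration, as every studio has its own conventions.

```python
import re
from pathlib import Path

def next_version(publish_dir, asset, ext="abc"):
    """Return the next versioned filename for an asset publish.

    Scans publish_dir for files named like 'asset_v001.ext' and
    returns the filename with the highest version bumped by one.
    If no versions exist yet, returns version 001.
    """
    pattern = re.compile(rf"{re.escape(asset)}_v(\d{{3}})\.{re.escape(ext)}$")
    versions = [int(m.group(1))
                for p in Path(publish_dir).glob(f"{asset}_v*.{ext}")
                if (m := pattern.match(p.name))]
    return f"{asset}_v{max(versions, default=0) + 1:03d}.{ext}"
```

In practice a helper like this would be wrapped in a Maya or Houdini menu item so artists publish with one click instead of naming files by hand.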

Skills:

  • Communication: communicate well with a variety of staff at different levels to understand their needs and assist with technical issues, work well as part of a team to develop solutions and take direction from a VFX supervisor
  • Problem-solving: think analytically to identify problems and come up with creative and efficient solutions, find new ways to overcome obstacles and achieve a creative vision
  • Knowledge of all parts of the pipeline: have a good understanding of the jobs within the pipeline, their roles, needs and the challenges that they face
  • Programming and coding skills: have advanced knowledge of programming in Python and C++ with a very high level of technical ability using a variety of relevant software used across the project such as Maya, Houdini and Nuke
  • Helping others: offer guidance to more junior members of the pipeline team and give face-to-face technical assistance to staff across departments, maintain a positive attitude
Pipeline TD reel

R&D Programmer

R&D Programmers produce the technology required for a visual effects (VFX) project. They create the systems which technical directors (TDs) can use and modify to suit the specific needs of their VFX artists. They also design new digital tools and make sure they fit into existing software systems. This enables the efficient passing of assets from one VFX process to the next.

R & D / HOUDINI-FX / Car Crash /

This is a research and development role, which means that it involves working out ways to improve how well digital processes work. R&D Programmers must stay informed about software and technology relevant to their field and beyond. They find innovative ways to enable the artists within the pipeline to complete their work as quickly and as well as possible.

Skills:

  • Communication and teamwork: communicate well with pipeline technical directors as well as directly with a variety of staff at different levels to understand their needs and assist with technical issues, work well as part of a team to develop solutions, present a plan to and take direction from supervisors
  • Problem-solving: think analytically to come up with creative and efficient solutions, using the most up-to-date technology to find ways to overcome obstacles and achieve a creative vision
  • Programming and coding skills: have advanced knowledge of programming in Python and C++ with a very high level of technical ability using a variety of relevant software used across VFX projects, such as Maya, Houdini and Nuke
  • Knowledge of all parts of the pipeline: have a strong understanding of all jobs within VFX pipelines, their responsibilities and needs
  • Planning: create an appropriate development plan and stick to a timescale, understand how to breakdown a project into tasks

Where am I right now as a researcher and how might this influence my future project

I am a filmmaker and aerial photography enthusiast.

During my bachelor's in design I approached the world of video, filming and post-production. I make videos of experiences and places using my camera, drones and 360 cameras. I have done some work, especially in the hospitality industry. I use drones a lot; they allow me to photograph from above and get unique shots.

I want to learn the fundamentals of VFX in order to acquire new skills to apply to my work.

My future research could highlight new technological models in the videography industry, such as 360 cameras and FPV drones (first person view).

The research could explain how these new technologies work and what can be achieved with them, through video examples.
Finally, it could identify their target users and assess whether further developments and updates are likely in those industries.