The Digital Era: Visual Effects
https://www.dneg.com/reels/film-vfx-breakdowns/

What is digital compositing?
Digital compositing is the process of digitally assembling multiple images to make a final image, typically for print, motion pictures or screen display. It is the digital analogue of the optical printers used for film compositing in the optical era. Compared to optical compositing, digital compositing provides significantly more flexibility and creative freedom. Unlike the analogue format, the digital format does not degrade the image over time. Analogue image processing is slower and costlier, and an analogue image is generally continuous rather than being broken into tiny discrete components; digital images are cheaper to process and are easy to store and retrieve.

Video - The Art of Compositing

CGI - Computer Generated Imagery
CGI is the acronym for Computer Generated Imagery. In movie and TV production, CGI is a part of VFX. VFX is digital manipulation that combines two or more pieces of live-action footage, is created entirely from CGI, or sometimes combines live-action footage with CGI to achieve the desired realistic environment. Visual effects encompass any kind of effect that wasn't shot directly in the camera and was created in post-production. CGI involves modelling 3D objects in a computer and rendering out images of those objects. CGI has enabled us to create fully realised characters with the malleability of computer software. At the most basic level, Computer-Generated Imagery (CGI) is the creation of still or animated visual content with computer software. CGI most commonly refers to the 3D computer graphics used to create characters, scenes and special effects in films, television and games.

Article - Extra reading
Computer-Generated Imagery (CGI) is the creation of still or animated visual content with computer software. CGI refers to the 3D computer graphics used to create characters, scenes and special effects in films, television and games. The technology is also used in everything from advertising, architecture and engineering to virtual reality and even art. CGI is used extensively these days because it is often cheaper than physical methods, which rely on building elaborate miniatures or hiring extras for crowd scenes, and most commonly because it is simply not safe or humanly possible to create the visuals any other way. CGI is created using a range of different methods. Algorithms can produce complex fractal patterns. 2D pixel-based image editors can create vector shapes. 3D graphics software can create everything from simple primitive shapes to complex forms made from flat triangles and quadrangles. 3D software can even simulate the way light reacts to a surface and generate particle effects. Where CGI starts to get really exciting is when computer-generated imagery is layered into digital film footage using a technique known as compositing, more familiar to most people as green screen.
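The layering step described above comes down to the compositing "over" operation. Below is a minimal sketch of that operation in Python with NumPy, using a straight (unpremultiplied) alpha matte and tiny synthetic images; the array names and values are purely illustrative and do not come from any particular compositing package.

```python
import numpy as np

def over(fg_rgb, fg_alpha, bg_rgb):
    """Layer a foreground with a straight (unpremultiplied) alpha over a background.

    fg_rgb, bg_rgb: float arrays of shape (H, W, 3) with values in [0, 1]
    fg_alpha:       float array of shape (H, W, 1), where 1.0 means fully opaque
    """
    return fg_rgb * fg_alpha + bg_rgb * (1.0 - fg_alpha)

# Tiny synthetic example: a grey "CG element" composited over a blue "plate".
h, w = 4, 4
plate = np.zeros((h, w, 3)); plate[..., 2] = 1.0        # blue background plate
element = np.full((h, w, 3), 0.5)                       # mid-grey CG element
matte = np.zeros((h, w, 1)); matte[1:3, 1:3] = 1.0      # element only covers the centre

comp = over(element, matte, plate)
print(comp[0, 0], comp[1, 1])  # untouched background pixel vs composited element pixel
```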
Art Department
The Art Department is responsible for translating a Director's vision and a script into visuals that can be shared with the entire team, so that everyone truly understands the creative and technical challenges that lie ahead. These concept artists and illustrators create everything from storyboards to photorealistic artworks that show what the finished shot will look like.

Pre-viz
Pre-visualisation artists are responsible for creating the first 3D representation of the final visual effects shot. They use artwork and basic 3D models to create rough, low-quality versions of the action sequences so the Director can start planning out camera placement and creative/technical requirements.

Asset Department
Virtual assets are needed in visual effects to match real-world objects, or to create new objects that don't exist or are too expensive to build in the real world. These are mostly created by modelling artists, texture painters, shader developers and riggers.

Research and Development
Considered a very technical department, R&D artists are responsible for building new software and tools to accomplish tasks that can't otherwise be done, or that are simply too time-consuming for artists to complete manually over and over again. The role requires a very strong background in computer science and a passion for problem solving.

Animation
This one is pretty obvious: basically anything that moves on film needs to be animated. It doesn't matter if it's a small prop like a chair, a huge spaceship, or a hero character or creature. If it moves and has a performance, an animator will most likely be behind the controls.
https://vimeo.com/324789206

Matchmove
This is also referred to as motion tracking, and without it there would be no way to incorporate 3D data into live-action footage. To make digital assets appear as if they are completely real, you need a virtual camera that moves exactly like the camera in the live-action footage. This is where matchmove artists come to the rescue. It's their job to use the live-action footage to create a virtual camera for all departments to work with.

FX Simulation
An FX artist designs and creates FX animation, procedural simulation, dynamic simulation, and particle and fluid systems. They are responsible for recreating the behaviour of real-world elements such as fire, water, explosions, cloth, hair and a whole lot more that most people don't even realise is digital. It is a highly technical, yet creative, role.

Lighting
The lighting artist is responsible for applying all lighting effects to the digital scene. The artist takes into consideration the light sources of the live-action plate and applies virtual lighting to mimic the existing illumination within the environment. The goal is to ensure that the VFX and live-action elements blend seamlessly, as though both exist in the same environment.

Matte paint
A matte painting is an image, created using digital or traditional painting techniques, that represents a scene which would be impossible for filmmakers to capture in real life. This might be because the landscape does not exist in the real world, because it's not financially practical to travel to a location, or to extend the set beyond what was filmed.

Rotoscoping
Rotoscoping is used to create a matte or mask for an element so it can be extracted and placed on a different background, masked out so its colours can be changed, or isolated for any number of other reasons. The rotoscoping artist will normally trace an object using a set of tools to create a new alpha channel for a specific part of an image sequence or video. A minimal sketch of turning such a traced shape into a matte appears after the Compositing section below.

Compositing
Compositing is the action of layering all the various elements in a shot – live action, mattes, multiple CG passes, 3D lighting, animation, particle effects – and blending them all seamlessly to create the photo-realistic final shot. Working throughout the production process, you'll need to collaborate with other VFX departments to creatively and technically problem-solve along the way.
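As noted in the Rotoscoping section above, a roto artist's traced shape ultimately becomes an alpha channel that the compositor can use. The sketch below shows, under simplified assumptions, how a single traced polygon might be rasterised into a soft-edged matte with OpenCV; the point coordinates, frame size and output file name are hypothetical, and real roto tools work with animated splines keyframed frame by frame.

```python
import numpy as np
import cv2  # opencv-python

# Hypothetical roto shape: polygon points traced around a foreground object for one
# frame (in practice these points are keyframed and interpolated across the shot).
shape_points = np.array([[120, 80], [300, 60], [340, 260], [150, 300]], dtype=np.int32)

height, width = 360, 480
matte = np.zeros((height, width), dtype=np.uint8)   # single-channel alpha matte
cv2.fillPoly(matte, [shape_points], 255)            # inside the roto shape = opaque
matte = cv2.GaussianBlur(matte, (7, 7), 0)          # soften the matte edge slightly

cv2.imwrite("roto_matte.png", matte)                # matte can now feed a composite
```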
Production

Camera Tracking
What is Camera Tracking?
Camera tracking is a process which involves taking a shot that has been filmed with a real, live camera and tracking its motion so that 3D elements can be added to it. This process is used countless times throughout movies and TV shows to add special effects, backdrops, robots, you name it. 3D tracking packages such as Boujou, Cinema 4D, PFTrack, After Effects and Matchmover are some of the professional tools used for camera tracking.

Motion Tracking
What is Motion Tracking?
Motion tracking is, in its simplest form, the process of tracking the movement of an object within a piece of footage. Once you've collected this track data from the selected point, you then apply it to another element or object. The goal of motion tracking is to extract information about the position, orientation, scale, and sometimes even deformation of objects or the camera over time. This information is then used to integrate computer-generated imagery (CGI) elements seamlessly into live-action footage, or to apply visual effects that interact realistically with the scene. A minimal 2D point-tracking sketch appears after the Motion Capture section below.

Motion Capture (Mo-cap)
What is Motion Capture?
Motion capture (sometimes referred to as mo-cap or mocap, for short) is the process of recording the movement of objects or people. In filmmaking and video game development, it refers to recording the actions of human actors and using that information to animate digital character models in 2D or 3D computer animation. When it includes the face and fingers, or captures subtle expressions, it is often referred to as performance capture.
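The camera tracking and motion tracking workflows described above both start from the same low-level task: following 2D features from one frame to the next. Below is a minimal point-tracking sketch using OpenCV's Lucas-Kanade optical flow; the frame file names are hypothetical, and a real matchmove solve goes much further by reconstructing a full 3D camera from many such tracks.

```python
import cv2
import numpy as np

# Hypothetical input: two consecutive frames of a live-action plate.
prev = cv2.imread("frame_0001.png", cv2.IMREAD_GRAYSCALE)
curr = cv2.imread("frame_0002.png", cv2.IMREAD_GRAYSCALE)

# Pick high-contrast corner features to track in the first frame.
points = cv2.goodFeaturesToTrack(prev, maxCorners=200, qualityLevel=0.01, minDistance=10)

# Track those 2D points into the next frame with pyramidal Lucas-Kanade optical flow.
new_points, status, err = cv2.calcOpticalFlowPyrLK(prev, curr, points, None)

# Keep only the points that were successfully tracked.
tracked_from = points[status.flatten() == 1].reshape(-1, 2)
tracked_to = new_points[status.flatten() == 1].reshape(-1, 2)

for (x0, y0), (x1, y1) in zip(tracked_from, tracked_to):
    print(f"feature moved from ({x0:.1f}, {y0:.1f}) to ({x1:.1f}, {y1:.1f})")
```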
Rotoscope
What is Rotoscope?
In the visual effects industry, rotoscoping is the technique of manually creating a matte for an element on a live-action plate so it may be composited over another background. These days chroma key is more often used for this, as it is faster and requires less work; however, rotoscoping is still used on subjects that are not in front of a green screen. Rotoscoping in the digital domain is often aided by software such as Silhouette, Mocha and After Effects. Rotoscoping is a visual effects (VFX) technique used in filmmaking and animation to manually trace over live-action footage frame by frame, creating a matte or mask. This process involves isolating specific elements within a scene, such as characters or objects, to separate them from the background or to apply visual effects to them selectively. Traditionally, rotoscoping was done by hand, with artists tracing each frame of film or video footage onto a separate sheet of paper or onto a transparent celluloid sheet (also known as a "cel"). However, with the advancement of digital technology, rotoscoping is now typically performed using specialized software that allows artists to create precise masks more efficiently.

Rotoscoping is used for various purposes in VFX and animation, including:
Creating visual effects: Rotoscoping is often used to isolate elements within a scene so that specific visual effects can be applied to them separately. For example, it may be used to insert CGI characters or objects into live-action footage, or to apply digital makeup or enhancements to actors.
Compositing: Rotoscoping allows artists to separate foreground elements from the background, making it easier to composite multiple layers of footage together seamlessly. This is essential for creating complex visual effects shots where live-action and CGI elements need to interact convincingly.
Motion tracking: Rotoscoping can be used to generate motion data for tracking the movement of objects or characters within a scene. This data can then be used to apply motion to other elements in the scene or to create realistic camera movements in virtual environments.

Fusion Camera System
The Fusion Camera System was invented by James Cameron and Vince Pace, and was said to be "the most elaborate and advanced camera system ever devised". The Fusion camera combines two images into a single image with realistic depth, which made the 3D more enjoyable to watch. The Fusion camera is a combination of two independent camera packages tied physically together. The cameras create two images separated by approximately the same distance as a human's two eyes. The line of sight of the lenses is adjustable so that, during a shot, they can be angled closer together to focus on nearby objects, or farther apart for those in the distance, just as your eyes do. Avatar (2009), Hugo (2011) and Life of Pi (2012) are some of the movies that used this system. By integrating two synchronized digital cinema cameras into a single rig, the system allows filmmakers to capture stereoscopic 3D footage with greater ease and flexibility. The cameras are arranged in a configuration that captures left-eye and right-eye images simultaneously, which is crucial for producing 3D content, since separate images for each eye are needed to create the illusion of depth. The Fusion Camera System offers filmmakers the ability to capture high-resolution, high-quality 3D footage suitable for a wide range of applications, including feature films, documentaries and immersive experiences, and it gives them more control over the 3D production process while maintaining a high level of image fidelity and detail.

Digital Matte Painting
What is Digital Matte Painting?
Digital matte painting is today's modern form of the traditional matte painting used in the entertainment industry. With the advantages of the digital age, matte painters have slowly transitioned to a digital work environment, using pressure-sensitive pens and graphics tablets in conjunction with painting software such as Adobe Photoshop. A digital matte painter is part of a visual effects team and is involved in post-production, as opposed to a traditional matte painter, who was a member of a special effects crew, often creating matte paintings on set to be used as backdrops.

Dynamic Simulation
What is Dynamic Simulation?
In computer animation, things like hair, cloth, liquid, fire and particles can be easily modelled and animated using dynamic simulation, while a human animator animates simpler objects like a car or a character. Computer-based dynamic animation was first used at a very simple level in the 1989 Pixar short film Knick Knack, to move the fake snow in the snow globe and the pebbles in a fish tank. In the context of visual effects (VFX), dynamic simulation refers to the process of simulating the physical behaviour of objects or phenomena within a computer-generated environment. This could involve simulating the motion of fluids such as water or smoke, the behaviour of rigid or deformable objects, the interaction of particles, or the dynamics of cloth, hair or fur. Dynamic simulations in VFX are typically based on physics-based algorithms and techniques that mimic real-world phenomena. These simulations are often used to create realistic-looking effects in movies, video games and other forms of visual media. Some common types of dynamic simulations in VFX include:
Fluid simulation: Simulating the behaviour of liquids (e.g., water, lava) and gases (e.g., smoke, fire) to create realistic fluid dynamics and effects.
Rigid body dynamics: Simulating the motion and interactions of rigid objects, such as collisions, gravity, and constraints.
Soft body dynamics: Simulating the deformation and motion of flexible or deformable objects, such as cloth, rubber, or soft tissue.
Particle simulation: Simulating the behaviour and movement of individual particles to create effects like rain, snow, explosions, or magical spells.
Hair and fur simulation: Simulating the dynamics of hair, fur, or other strands to create realistic-looking hair and fur effects on characters or creatures.
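To make the idea of a dynamic simulation concrete, here is a toy particle simulation in Python with NumPy: a burst of particles is stepped forward with explicit Euler integration under gravity, with a crude ground bounce. All of the numbers are arbitrary, and production solvers such as those listed below are vastly more sophisticated.

```python
import numpy as np

# Toy particle burst: particles launched upwards and pulled back down by gravity,
# stepped forward in time with simple explicit Euler integration.
rng = np.random.default_rng(7)
num_particles = 500
positions = np.zeros((num_particles, 3))                            # all start at the emitter origin
velocities = rng.normal([0.0, 5.0, 0.0], 1.5, (num_particles, 3))   # mostly upward, with some spread

gravity = np.array([0.0, -9.81, 0.0])
dt = 1.0 / 24.0                                                     # one step per film frame

for frame in range(1, 49):                                          # simulate two seconds at 24 fps
    velocities += gravity * dt
    positions += velocities * dt
    # Crude ground collision: particles bounce off y = 0 and lose some energy.
    hit = positions[:, 1] < 0.0
    positions[hit, 1] = 0.0
    velocities[hit, 1] *= -0.4

print("average particle height after 2 s:", positions[:, 1].mean())
```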
Some of the software packages commonly used for dynamic simulation are listed below.
Autodesk Maya: Maya is a versatile 3D modelling, animation and rendering package widely used in the VFX industry. It includes powerful tools for dynamic simulation, such as nParticles for particle effects, nCloth for cloth simulation, and Bifrost for fluid simulation.
SideFX Houdini: Houdini is renowned for its powerful procedural workflow and advanced simulation capabilities. It offers a wide range of tools for fluid simulation (using the FLIP solver), rigid body dynamics, cloth simulation, particle effects, and more.
Blender: Blender is a free and open-source 3D creation suite that includes robust simulation tools. It features a built-in fluid simulator, cloth simulation, rigid body dynamics, and particle systems, making it a popular choice for independent artists and smaller studios.
RealFlow: RealFlow is specialized fluid-simulation software commonly used in the VFX industry. It offers advanced tools for simulating liquids, such as water, as well as other fluid effects like splashes, foam, and bubbles.
Cinema 4D: Cinema 4D is a 3D modelling, animation and rendering package that includes dynamic simulation capabilities. It offers tools for cloth simulation, rigid body dynamics, and particle effects, making it suitable for a variety of VFX tasks.

Questions

Part A
1. What is digital compositing?
2. What is motion capture?
3. What is camera tracking?
4. What is motion tracking?
5. What is digital matte painting?
6. What is dynamic simulation?
7. Give four examples of dynamic simulations used in movies.
8. What are the advantages of using digital matte painting?
9. What is camera tracking used for?

Part B
1. Write a short essay on CGI in VFX.
2. What are the uses of rotoscoping in effects shots?
3. What is CGI? How is it used in movies?
4. What is the Fusion Camera System?