Digital Video - Unit 13 - IGNOU
UNIT 13 DIGITAL VIDEO

Structure
13.0 Introduction
13.1 Learning Outcomes
13.2 Video Basics
  13.2.1 Analog Vs. Digital
  13.2.2 Frame Rate and Resolution
  13.2.3 Interlaced and Non-interlaced Video
  13.2.4 Video Colour System
  13.2.5 Capturing Digital Video
  13.2.6 Video Compression and Streaming Video
13.3 Digital Video Technology
  13.3.1 IEEE 1394
  13.3.2 MPEG
13.4 Computer Configuration for Digital Video
  13.4.1 Computer
  13.4.2 Storage Device
  13.4.3 Video Capture Card
  13.4.4 Camcorder
  13.4.5 Cables and Connectors
13.5 Process of Video Production
  13.5.1 Pre-production
  13.5.2 Production
  13.5.3 Post Production
13.6 Video Editing Using Movie Maker
13.7 Using Web-based Video Editing Tool
13.8 Let Us Sum Up
13.9 Keywords
13.10 References and Further Readings
13.11 Feedback to Check Your Progress Questions

13.0 INTRODUCTION

With the universal invasion of television into our homes, there is very little chance that you are unaware of video as a medium of communication. And with hundreds of channels focusing on different target audiences, you are also aware of the wide range of possibilities the video medium offers. From a newsreader reading out a report, to a high-drama crocodile hunter in action, to swamis demonstrating yoga, to a soap opera, a movie or a game show, you have by now seen almost every way in which this medium can be leveraged for effective communication. As opposed to cinema, television is known to be a close-up and a far more invasive medium: it can reach out to you even when you are relaxing in your bedroom. And at a technological level, video has become a common man's medium. Today even a novice can wield a small hand-held video camera, or even a mobile phone with a video recording facility, and make short movies. It took some time for people to recognise the power and utility of this medium and its miniaturised, simplified technologies, but it has begun to be universally used.

For our purposes, video is no longer restricted to buying a DVD (earlier you would buy a videotape) and showing it in the classroom. Every teacher and every student can now be the author of a video. Think how this opens up a large canvas of possibilities for its use in education, and how it supports modern educational thought and practice. We will attempt to understand this medium from a technological point of view, so that you can leverage it to make your own educational communication more effective. Now that video has gone digital, we also dream of placing video files on the web and playing or downloading them worldwide. The size of the video file will of course have enormous implications. Ways of managing this size, and its effects on the attributes of the video itself, will also be explored.

13.1 LEARNING OUTCOMES

After working through this unit, you are expected to be able to:
- Identify different types of digital video formats;
- Describe the advantages of digital video;
- Record and create digital video for use in different conditions; and
- Design a video for an educational communication.

13.2 VIDEO BASICS

While visually video resembles cinema very much, technologically it is drastically different. In fact this is what has enabled video to be miniaturised and converted into an amateur device. Cinema is based on a series of still pictures, played at a speed at which the eye sees them as one constant picture, or a continuous moving scene.
Hence, if an action is depicted as a progressive set of pictures and played at this speed, the viewer sees the action as if it were happening in real time. That is why this technology of moving images is referred to as a movie. Video retains this concept of playing out frames sequentially and rapidly but, unlike cinema, stores the information on magnetic media (and nowadays on optical media too). While recording, the optical information is stored as electromagnetic information on the magnetic tape and played back using a reading head, as in an audio tape recorder. This has resulted in a large range of devices, with different formats and sizes of tapes and tape recorders. You would have heard of VHS, S-Video, DAT, etc. Among the optical playback media, you would have heard of VCD and DVD, which use a laser beam to read. Many other formats have been used in professional recording.

Playing back video also differs from cinema at the technological level. While in the case of cinema you would use an optical projector, shining light through a film, you use a television monitor to play back video. The television monitor uses a cathode ray tube to reconstruct the image from the electromagnetic signal received from the tape or the camera. The way the television plays back information has implications for picture quality. Let us examine this in some detail.

The cathode ray tube, as you may be aware, produces a continuous ray of electrons. These electrons, when they strike a phosphorescent screen, produce a spot of light, a dot on the screen. In a television monitor, this ray of electrons is made to scan across the screen to generate a line, or rather a closely packed row of dots (you have learnt this in Unit 11). At the end of each line, the signal retraces to the left edge of the display, goes a step down and then starts scanning the next line, and then the next. Starting at the top, all of the lines on the display are scanned in this way. One complete set of lines makes a picture; this is called a frame. Once the first complete picture is scanned, the scanning circuit retraces to the top of the display and starts scanning the next frame, or picture. This sequence is repeated at a fast enough rate that the displayed images are perceived to have continuity and, if the image depicts motion, you see it as continuous.

In the case of a black and white television monitor, the screen produces a white spot whose brightness is varied through a range of grey shades from black to white, thereby producing the picture. In the case of a colour television, each dot on the screen is actually composed of three dots of phosphor, which respond to the electron beam by producing red, green and blue light respectively; together they can generate an image of any colour.

Check Your Progress 13.1
Notes: a) Write your answer in the space given below. b) Compare your answer with the one given at the end of this unit.
Describe the scanning process in a cathode ray tube based monitor.
What limitations does it place on the display of images?
................................................................................

13.2.1 Analog Vs. Digital

Electricity and electromagnetic waves, which include light, radio waves and even X-rays, are transmitted as continuous waves, characterised by the frequency and the amplitude of the varying quantity, say voltage, or the colour and intensity of the light. Any signal which can be superposed on these waves can use them as a carrier and be transmitted across large distances or stored on magnetic or optical media. This is exactly what happens to the audio and video signals generated by the video camera. And as the variation of the audio or video signal (being electrical in origin) is analogous to the carrying electrical signal, we refer to it as analog.

Figure 13.1: An analog and a digital signal

A typical cathode ray tube based television monitor strips the carrier and uses the audio and video signals to generate the image and sound. A digital signal, by contrast, is composed of a square wave, the flat peak representing one (on, or true) and the base line representing zero (off, or false) (see Figure 13.1). All computers, and therefore computer displays, receive and process digital signals. If you happen to have a cathode ray tube based monitor, then the graphics card on your computer is actually converting the digital signal to an analog signal, which is then processed to display images. LCD, plasma or LED displays use the digital signal directly to display images. Some of these may even have the capability to receive and process an analog signal, in which case they convert it into a digital signal. Such conversion between analog and digital signals is carried out by analog-to-digital (A/D) and digital-to-analog (D/A) converters; when digital data is carried over an analog channel, the corresponding processes are called modulation and demodulation.
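To make the idea concrete, here is a minimal, illustrative sketch (in Python) of what an analog-to-digital converter does: it samples a continuous signal at regular intervals and quantises each sample to a fixed number of levels. The waveform, sample rate and bit depth used here are arbitrary illustrative choices, not values from any particular video standard.

```python
import math

def digitize(signal, duration_s, sample_rate_hz, bits):
    """Sample a continuous (analog) signal and quantise each sample
    to a fixed number of bits -- the essence of A/D conversion."""
    levels = 2 ** bits                      # number of discrete amplitude steps
    n_samples = int(duration_s * sample_rate_hz)
    digital = []
    for n in range(n_samples):
        t = n / sample_rate_hz              # sampling instant
        amplitude = signal(t)               # analog value in the range -1.0 .. +1.0
        # map -1..+1 onto 0..levels-1 and round to the nearest step
        digital.append(round((amplitude + 1) / 2 * (levels - 1)))
    return digital

# A 50 Hz "analog" wave, sampled at 1 kHz with 8-bit quantisation (toy figures)
wave = lambda t: math.sin(2 * math.pi * 50 * t)
samples = digitize(wave, duration_s=0.02, sample_rate_hz=1000, bits=8)
print(samples[:10])    # first few quantised values, each between 0 and 255
```

Reversing the process (a D/A converter) maps each stored number back to a voltage, reconstructing an approximation of the original wave.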
13.2.2 Frame Rate and Resolution

A cinema or motion picture reel moves at 24 frames every second, a (PAL) television monitor at 25 frames every second. This means that between every two frames there is an interval of time, however tiny, when the screen is actually blank. The human eye's inability to see these blank in-betweens is what has made cinema, animation and television possible. Referred to as persistence of vision, this failure enables our perception of one continuous, smooth motion. The minimum frame rate needed to achieve the illusion of a moving image (persistence of vision) is about fifteen frames per second. Worldwide, three major video standards have been adopted. While the PAL (used in Europe, Asia, Australia, etc.) and SECAM (used in France, Russia, parts of Africa, etc.) standards specify 25 frames per second, NTSC (USA, Canada, Japan, etc.) specifies 29.97 frames per second. This is referred to as the frame rate.

How fast is this? To give you an idea, consider that you see 25 frames every second on a PAL device in order to perceive a continuous image. So every frame is constructed in one twenty-fifth of a second. A typical television screen consists of 576 horizontal lines, and each line is made up of 625 dots, creating a matrix of 360,000 dots. So each point of light constructed on the screen lasts for only about 1/9,000,000 of a second.

We considered how a still image is constructed in Unit 11; you may like to revise it at this stage. Television, and in fact computer screens too, construct video images in exactly the same fashion, except that the image changes 25 times every second. Modern displays have evolved from the cathode ray tube to liquid crystal displays, plasma displays and light emitting diode displays. While each of them combines the red, green and blue dots to produce a dot of the desired colour using different techniques, the basic scanning process we described above remains the same. What has changed, of course, is the size of the dot and the number of dots packed across the screen. This makes the display brighter and its colours more vivid, closer to the true colours.

The characteristics of the display and, in the case of a computer-controlled system, the graphics capability which governs the input to the display, have a bearing on the resolution of the image. In the case of an analog display, the number of scan lines was fixed, so a larger screen only spaced out the dots; obviously, you could not sit too close to a large-screen television. With the advent of digital displays, however, the number of pixels that can be packed on to the screen emerged as the deciding factor. So we have a range of display resolutions on regular desktop computer displays: 640 x 480, 800 x 600, 1024 x 768, and now 1280 x 1024. As video is played in a window, the size of the video frame can be any fraction of this, all the way to full screen. So, if the original source has a smaller frame size, say 320 x 240, and you play it on a screen set to 1024 x 768, the video will play in a small window. If you now maximise the window or shift to a full-screen video mode, you will still see the video, but at a much poorer quality: the information available in the original video is smeared over a larger area, leading to a loss in quality.

Notice that 640:480, 800:600 and 1024:768 are all in the ratio 4:3. This ratio, known as the aspect ratio, is in fact the standard for video. With the emergence of High Definition television (HDTV), a ratio of 16:9 is also becoming popular.
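The arithmetic above can be checked with a short, illustrative calculation using the PAL figures quoted in this section; the 320 x 240 source and 1024 x 768 screen in the second part are the same example sizes used above.

```python
# Worked version of the timing arithmetic in section 13.2.2,
# using the PAL figures quoted in the text.
frames_per_second = 25
lines_per_frame = 576            # visible horizontal lines
dots_per_line = 625              # dots per line, as quoted above

frame_time = 1 / frames_per_second                  # 0.04 s per frame
dots_per_frame = lines_per_frame * dots_per_line    # 360,000 dots
dot_time = frame_time / dots_per_frame              # ~1/9,000,000 s per dot

print(f"one frame every {frame_time * 1000:.0f} ms")
print(f"{dots_per_frame:,} dots per frame, each lasting {dot_time * 1e6:.3f} microseconds")

# Upscaling a small source to a large display spreads the same
# information over more pixels -- the source of the quality loss noted above.
src_w, src_h = 320, 240
screen_w, screen_h = 1024, 768
scale = (screen_w * screen_h) / (src_w * src_h)
print(f"full-screen playback spreads each source pixel over ~{scale:.0f} screen pixels")
```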
Check Your Progress 13.2
Notes: a) Write your answer in the space given below. b) Compare your answer with the one given at the end of this unit.
What is screen resolution? What do the numbers 640 x 480 or 1024 x 768 associated with it signify? What characteristics of an image will change when it is displayed on monitors of different resolutions?
................................................................................

13.2.3 Interlaced and Non-interlaced Video

There are two different techniques used to "paint" the picture on the screen. Television signals are interlaced, while computer signals are progressive, or non-interlaced. In interlaced scanning, each picture (frame) is divided into two separate sub-pictures, referred to as fields; two fields make up a frame. An interlaced picture is painted on the screen in two passes: the horizontal lines of the first field are scanned, the beam retraces to the top of the screen, and the horizontal lines of the second field are then scanned in between the first set. A progressive, or non-interlaced, picture is painted by scanning all of the horizontal lines of the picture in one pass from top to bottom. Interlacing was introduced to overcome the limited speed of the scanning process, which led to flicker in the image. With the advent of digital technologies, the speed of all processes has increased tremendously, and progressive scan is becoming the standard, particularly with LCD, plasma and LED screens.
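A toy sketch of the idea, with a frame represented simply as a list of scan lines: interlacing splits the frame into two fields of alternate lines, and weaving the fields back together recovers the full frame. The data here is purely illustrative.

```python
def split_into_fields(frame):
    """Split one frame (a list of scan lines) into the two interlaced fields.
    Field 1 holds the odd-numbered lines, field 2 the even-numbered lines;
    painted one after the other, they rebuild the full frame."""
    field1 = frame[0::2]    # lines 1, 3, 5, ...
    field2 = frame[1::2]    # lines 2, 4, 6, ...
    return field1, field2

def weave(field1, field2):
    """Re-interleave two fields into a progressive frame (simple de-interlacing)."""
    frame = []
    for odd_line, even_line in zip(field1, field2):
        frame.extend([odd_line, even_line])
    return frame

frame = [f"line {n}" for n in range(1, 9)]       # a tiny 8-line 'frame'
f1, f2 = split_into_fields(frame)
print(f1)                                        # ['line 1', 'line 3', 'line 5', 'line 7']
print(weave(f1, f2) == frame)                    # True: the two fields rebuild the frame
```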
13.2.4 Video Colour System

Just like the still image, video is also constructed using the primary colours: red, green and blue. Most home televisions and computer monitors accept a signal which is a composite of these three colours along with information for synchronisation of the signal. Professional video devices use other combinations, or even separate signals for each colour. See Unit 11 for a description of how these three primary colours combine to generate other colours.

13.2.5 Capturing Digital Video

Capturing digital video can be conceived of in two different contexts: one where the source is analog, and one where the source is digital. If you are using a video cassette player with a VHS tape as your storage, or an analog video camera, you will need to convert your video to a digital format. The basic device used for this purpose is a video capture card attached to a computer. The device and its supporting software perform the following functions:
- Convert the analog signal to a digital signal;
- Synchronise with the video playback device and copy video;
- Process the signal, reading and converting the video into an appropriate format;
- Convert the image size, compress the image, or otherwise modify it to the desired requirements;
- Facilitate the storing and retrieval of the digital video.

There is a variety of video capture cards and software, and their choice depends on various factors, like the source of the video, the desired purpose and quality of the output digital file, and the formats in which the video will be stored and processed. Different cards also permit different ranges of video compression (see the section on video compression). If the source of video is a digital device, say a digital handycam or a digital still camera with a video option, then standard ports like FireWire or USB can be used, depending on the output format of the camera. Generally, such devices have a built-in storage medium like a memory card or disk. In such cases, the process is simply that of reading and copying the digital video files on to the hard disk of your computer.

13.2.6 Video Compression and Streaming Video

At this stage you might once again like to revise the section on file formats and file sizes in Unit 11. Just as it was necessary to resize and compress a single image, it is necessary (in fact even more so) to resize and compress video. In the analog television world the issue did not arise, as one standard was used throughout: same image size, same number of lines, same colour processing technology. Of course there were differences between PAL, SECAM and NTSC, but they were mutually incompatible anyway and you did not resort to conversion. During that phase, television and video remained a job for professionals.

Depending on the purpose of your video, you might want it stored as a VCD, as a DVD, or as a digital file on the hard disk to incorporate into a slide show or a multimedia project. You may also want to upload your video to the web and make it available to all. Each of these requirements has different implications for the image size, file size and format of your video.

Playing back a video from a video player or a computer involves reading the file (decoding the format) and sending portions of the file in the right sequence from the storage area to the monitor. One also expects smooth playback of the video with the audio properly synchronised; in fact, the eyes and ears together can easily discern even the slightest mismatch between the audio and the video. A simple rule of thumb: the smaller the file size, the smoother its transfer. In the case of a computer, playback also depends on the speed and capacity of the CD/DVD drive, the memory (RAM) and the graphics card. To the extent that the file size is within the capacity of the computer, you can be sure of smooth video playback. If the file is being played from the internet, the bandwidth of the internet connection and the speed of the server serving the video file also matter.

So, how do we ensure smooth playback? What characteristics of the video file affect its smooth transfer? The image size of your video is identical in meaning to that of a still image: it refers to the width and height of the video window. As we noted earlier, the video window can be resized. So, unlike a television screen, where the image is always full screen, your video could occupy a portion of your screen, the remainder being used for other text and graphics. Simply put, the larger the image size, the larger the file size. You can resize the video using any video editing software.

The format of the video file also affects its size. There are a large number of video formats, and a large number of media players too.
Some examples are .avi (audio video interleave files), .qt (QuickTime files), .mpg, .mpg2 and .mpv2 (MPEG video files), .mjpg (motion JPEG files), .mov (Apple QuickTime), and .flv and .swf (Flash video files). File formats may be tied to the video capture card you use, so you may not have much choice while handling video from an analog source. Cameras and video editing software are also built around specific file formats, so even with digital files you may not have too much of a choice. Again, if you wish to serve your files through the web, the operating system and support software on the web server will restrict your choice. For practical reasons, therefore, you may have to work with a few video formats. You can, however, still convert digital video files from one format to another using video converter software.

As in the case of still images, compression techniques are adopted to reduce file size. We will not get into the details of video compression, except to note that reduction of image size and change of format are both used, as appropriate, to compress the file.

Typically, a computer file can be operated upon only when it is complete. So when you try to play back a video, it will not play until the entire file has been downloaded into the memory of the playback machine. This seriously limits performance if the file sizes are large. In order to optimise playback, the technique of streaming is adopted. Streaming basically involves enabling the player software to play the file as soon as a pre-defined part of it has been downloaded: the video begins playing while the file is progressively being downloaded. At the server end, it involves making copies of the video file available and enabling progressive downloads to each requesting client.
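A rough, illustrative calculation shows why compression and streaming are both necessary. The frame size, connection speed and file sizes below are assumptions chosen only to make the arithmetic visible; real codecs and connections will vary.

```python
# Rough arithmetic behind compression and streaming (illustrative figures only).
width, height = 720, 576          # a full PAL frame
fps = 25
bytes_per_pixel = 3               # 8 bits each for red, green and blue

raw_rate = width * height * bytes_per_pixel * fps           # bytes/second, uncompressed
print(f"uncompressed: {raw_rate / 1e6:.0f} MB/s, about {raw_rate * 3600 / 1e9:.0f} GB per hour")

target_kbps = 512                 # an assumed broadband connection, in kilobits/s
target_rate = target_kbps * 1000 / 8                         # bytes/second available
print(f"compression needed for real-time delivery: about {raw_rate / target_rate:.0f}:1")

# Streaming: playback can start once a pre-defined buffer has arrived,
# instead of waiting for the whole file to download.
file_mb, buffer_mb = 20, 1
link_kBps = target_rate / 1000
print(f"start after ~{buffer_mb * 1000 / link_kBps:.0f} s "
      f"instead of ~{file_mb * 1000 / link_kBps / 60:.0f} min for the full file")
```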
Check Your Progress 13.3
Notes: a) Write your answer in the space given below. b) Compare your answer with the one given at the end of this unit.
What factors come in the way of video being broadcast on the Internet?
................................................................................

13.3 DIGITAL VIDEO TECHNOLOGY

A digital signal basically consists of high voltage pulses, with the blanks between consecutive pulses representing the base voltage, or zero. This technology therefore has some distinct advantages over analog signals. Firstly, the signal can be transmitted over larger distances without losses; even when losses do occur, it is easier to reconstruct the signal, as it is a steady stream of pulses. Digital signals occupy much less space on storage and transmission devices, leading to higher density transfer and hence better quality images. Random access to different parts of the digital file is possible, thereby facilitating non-linear processing; this has led to enormous time and cost savings in the production process. The development of optical media (CD, DVD and now Blu-ray) has also eliminated the problems of wear and fungal attack which were common on magnetic tapes. Having thus reduced the dependence on professional devices, it has now become possible for even ordinary desktop computers and laptops to play back or even construct videos.

13.3.1 IEEE 1394

IEEE 1394 is a standard developed by the Institute of Electrical and Electronics Engineers for high speed isochronous (a guaranteed and steady rate) data transfer. Most professional video devices support this interface, which offers data transfer rates of up to 400 Mbps (1394a) and 800 Mbps (1394b). The high speed of this connection makes it ideal for devices that need to transfer large volumes of data in real time, such as video devices. Like USB, 1394 supports both Plug-and-Play and hot plugging, and also provides power to peripheral devices. Apple Inc., which originally developed the technology, calls it the FireWire port; it is also known as i.LINK or Lynx. So, when you need to transfer video from, say, your camera to the computer and have the choice of a FireWire port, you should opt for it. Also, if you have a choice while selecting a camera or a video capture card, you could look for this port.

13.3.2 MPEG

This is a standard for audio and video files developed by the Moving Picture Experts Group (hence MPEG). Extended from the JPEG still-image format, it is also a lossy compression format, meaning that you lose information when you compress. This format has become extremely popular, particularly in audio (mp3) and video (mpg). While the first version, MPEG-1, which compresses video to about 1.5 Mbit/s, is still used to produce digital video of VHS quality (all VCDs, for instance), broadcast video and audio use the MPEG-2 format, which compresses video to between 3 and 15 Mbit/s (used for DVD, cable television, etc.). MPEG-2 devices are backward compatible, that is, they can also decode MPEG-1. While we are very likely to be using digital audio and video, we need to be aware of these standards and the file formats derived from them in order to choose appropriate codecs while digitising, editing and exporting audio and video on to different devices or platforms, for instance the internet, a VCD or a DVD.
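A back-of-envelope sketch using the bit rates quoted above (MPEG-1 at about 1.5 Mbit/s, MPEG-2 at an assumed 6 Mbit/s within its 3-15 Mbit/s range, IEEE 1394a at 400 Mbit/s) illustrates what these figures mean for file sizes and transfer times; the one-hour duration is an arbitrary choice.

```python
# Back-of-envelope figures for the standards mentioned above (illustrative only).
def size_gb(bitrate_mbps, minutes):
    """Approximate file size in gigabytes for a given video bitrate and duration."""
    return bitrate_mbps * 1e6 / 8 * minutes * 60 / 1e9

print(f"60 min of MPEG-1 (VCD) at 1.5 Mbit/s ~ {size_gb(1.5, 60):.2f} GB")
print(f"60 min of MPEG-2 (DVD) at 6 Mbit/s   ~ {size_gb(6, 60):.2f} GB")

# Time to copy the MPEG-2 file over an IEEE 1394a link (400 Mbit/s nominal)
file_bits = size_gb(6, 60) * 1e9 * 8
print(f"copy over FireWire 400: about {file_bits / 400e6:.0f} seconds (ignoring overheads)")
```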
13.4 COMPUTER CONFIGURATION FOR DIGITAL VIDEO

Most modern desktop computers, and even laptops, have adequate capabilities to process digital video. By processing, we mean importing video from a camera, downloading a file and playing it back. What should the essential characteristics of such a computer be? We would also like the computer to allow us to edit video: crop it, trim it, add special effects, titles, sound and transitions, resize it, compress it and store it on a desired medium, or even upload it to a website. What should the computer be equipped with to handle such tasks?

13.4.1 Computer

Most graphics and video tasks are handled by the graphics card. Supported by a large enough memory, it is this card which enables smooth display of video. Modern graphics cards have their own processors and their own memory, and handle the entire graphics task literally on their own. The larger the graphics memory, the larger the file size it can handle, which translates to higher resolution, larger image size and better colours. Most media management and editing is carried out by software.

13.4.2 Storage Device

Large enough hard disks will give you the freedom to acquire and store audio and video in large quantities. A typical raw (uncompressed) full-screen video of, say, an hour's duration could occupy up to about 100 GB, a figure far beyond what a computer can comfortably process in one piece. Fortunately, nobody works with the whole video at a time: we always work with individual shots, which are typically a few seconds to a few minutes long. Storage space is also needed for the sound effects, music, commentary, stock shots and the various versions of our unfinished product. Hard disks have crossed the terabyte (1,000,000,000,000 bytes) range and are hence no longer an issue.

13.4.3 Video Capture Card

The video capture card, as we discussed earlier, is a very critical component if we have an analog source of video. The file formats, the compression, the editing software offered, and therefore the quality of the final output, will very much depend on this card. Connected as a daughter board in the computer, this card carries all the audio and video inputs and outputs. Different kinds of connectors depending on the source, software to control the video recorder, selection of options and the interface with the editing software could all be part of the video capture card. If you are dealing with high quality video from a professional camera, the video capture card will be the most critical and also the most expensive part of the system. There is a range of cards available, with software support for different operating systems. If you are setting up a video editing unit, the capture card will be the first item to be selected; the specifications of the computer and the range of accessories needed are decided on the basis of this card. In turn, the card selection depends on the type of analog video and audio sources you have. Video capture cards are also used to export the edited video to analog storage devices, video tape for example, or to CD/DVD. If you wish to convert your video into the CD/DVD format, appropriate software codecs for compressing the video and converting it to an appropriate format will be required; besides, you will require a drive and blank disks which support CD/DVD writing. Fortunately, digital audio and video sources are becoming exceedingly common and match professional analog equipment in quality and performance. Hence, you may wish to resort to digital equipment, in which case you may at most need to interface your camera or recorder to the computer and copy files.

13.4.4 Camcorder

Traditionally, the two major components of the video recording system, namely the camera and the recorder, were separate units. This imposed various restrictions on their use: at least two people to operate the two units, the need to interface them, and work flow management. In a typical studio recording situation, where multiple cameras are used simultaneously, sharing a recorder made very good sense. Camcorders, on the other hand, combined these two units into one. So you had the camera, the imager (usually a CCD or CMOS sensor) for recording the light, and the recording medium (tape, optical or solid state device) all rolled into one compact unit, making it very easy and efficient for one person to operate.
Camcorders range from very small, lightweight hand-held cameras to broadcast quality professional cameras. As we mentioned, they can support a varied range of storage devices, like video tapes, CD/DVD, hard disks and, of late, solid state memory (also known as flash memory).

Figure 13.2: Camcorders

13.4.5 Cables and Connectors

The need to upload the recorded video and audio on to the editing unit requires a suitable interface between the two systems. The simplest way is, of course, to remove the media and play it back in a player connected to the editing unit. But in case you need to connect the camcorder to the editing unit directly, you can do it in one of four ways: component, composite, FireWire or USB. In the case of component video, the signal is analysed into three separate signals, Y, R-Y and B-Y, where R stands for red, B for blue and Y for luminance, or brightness. At the receiving end, either at the editing unit or at the television monitor, these signals are interpreted to yield the red, green and blue information for the generation of the coloured image. The composite signal, as the name suggests, is a combined (or rather pre-mixed) signal. Amateur hand-held camcorders generally provide a FireWire or a USB interface (or both, in some cases); these are purely digital interfaces used for direct data transfer, and in these cases you simply copy the files from the recording medium to the other storage medium. Each of these types of video transfer requires its own kind of cables and connectors, as shown in Figure 13.3.

Figure 13.3: Types of cables (composite, component, FireWire and USB)

It is obviously beyond our scope to study the technological differences between each of these connections in more detail, but if you are using a video camera and need to upload video on to a computer, either directly or through a video capture card (digitising the video), you need to be aware of the types of connections available and select appropriate cables for the purpose.
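As an illustrative aside, the relationship between RGB and the component signals can be sketched in a few lines. The luminance weights used here are the commonly cited ITU-R BT.601 values; the sample colour is arbitrary, and real component interfaces scale and filter these signals further.

```python
def rgb_to_component(r, g, b):
    """Convert normalised RGB (0.0-1.0) into the three component signals
    Y, R-Y and B-Y.  The luminance weights are the ITU-R BT.601 values."""
    y = 0.299 * r + 0.587 * g + 0.114 * b    # luminance (brightness)
    return y, r - y, b - y                   # Y, R-Y, B-Y

def component_to_rgb(y, r_y, b_y):
    """The receiving end recovers red, green and blue from the three signals."""
    r = y + r_y
    b = y + b_y
    g = (y - 0.299 * r - 0.114 * b) / 0.587
    return r, g, b

y, r_y, b_y = rgb_to_component(1.0, 0.5, 0.25)
print(round(y, 3))                                             # 0.621 -- green dominates the luminance
print([round(v, 3) for v in component_to_rgb(y, r_y, b_y)])    # [1.0, 0.5, 0.25] recovered
```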
Check Your Progress 13.4
Notes: a) Write your answer in the space given below. b) Compare your answer with the one given at the end of this unit.
Describe the typical chain of equipment you would require to develop a video and play it back on a DVD player.
................................................................................

13.5 PROCESS OF VIDEO PRODUCTION

One of the major purposes of this unit is to acquaint you with the potential of video for educational communication. While you may opt to use a trained technician or a professional for the technical processes of recording, editing and packaging, the one process that cannot be outsourced is the communication itself. Only you can define the purpose, the scope and range of the content, and identify and put together the information you wish to present. Let us briefly examine the ways in which these tasks are accomplished.

13.5.1 Pre-production

Like any communication, a video too needs to be planned. Just as in a lesson plan, you would identify the objectives, devise some kind of an introduction, and develop the content using appropriate examples, activities and experiments, supporting it with other information which could be graphical (graphs, photographs, drawings, etc.) or spoken. You put all of this together into a script, which forms the guideline for the production.

Video allows even more flexibility. Unlike a teacher in a classroom, the video presenter is not fixed in time and space. You can, for example, say "let us examine this crater on Mars", and the visual could be a close-up of a Martian crater. You can substantiate your argument or support your information with still images, animation or video inserts. To make your statements authentic, you can also bring in various people to narrate their experiences or present their arguments (straight from the horse's mouth). And finally, you could use music and sound effects. Obviously, all of this cannot be achieved in 45 minutes (a typical period in your class). As a producer, you would identify the resources, the people, the locations, the logistics, the schedule and the equipment needed to carry out the task. The script would be your guideline.

Video, of course, has its limitations too. You are not there to catch that bored look on a student's face and modify your pace, change your example, crack a joke, or even ask the student to wake up! You have, for all purposes, handed over the remote control. This is where the creative part of the script fits in. A video script is not only a prescription for the technical parts but also a visualisation of the scenes and their dramatisation. Using it purposefully to hold the attention of the viewers, get them interested and reach out with the message is an art which requires practice.

13.5.2 Production

This is the part where all the technology is first put to use; this is where the script is translated into an audio-visual medium. While a professional production may involve a large number of people and a range of equipment related to light, sound and video, your production may involve just one camera and one person in front of it. The choice depends on the complexity of the script. All the recording of audio and video happens at this stage. For practical reasons, you may take repeated shots of the same subject, overcoming errors or shooting from alternative perspectives. If the scene involves people conversing, then you will have a large number of shots which together establish the conversation. We will not be able to get into the details here; suffice it to say that you will have more video than you actually need for the final product. Care should be taken to record audio which is loud and clear enough to suit the requirements of your script, avoiding all noise to the extent possible. Systematic planning is critical to ensure that you have all the shots your script requires. You may not be able to revisit a location, or organise all the people you need again; there may simply not be a second chance. Good producers spend a lot of time reading and re-reading the script, visualising each shot to its last detail and working out all the requirements well in advance.
13.5.3 Post Production

The final assembly and trimming of the video, adding sound and transition effects, removing noise, adding titles, graphics or other inserts, and publishing it on to a CD, DVD or video tape as a finished product all happen at this stage. Also known as editing, the first step in this process is gathering together all the raw materials or resources you require: the video, the audio, the graphics, and so on. In the digital world, the editing process is carried out using editing software. All the resources have to be digitised (using your video capture card) and stored on the hard disk in formats appropriate to the editing software. Each piece of video, a shot or a clip, is previewed many times over, and its start and end points are noted and sequenced. This process can be carried out on paper or, in the case of some editing software, electronically, and is referred to as the development of an edit decision list (EDL), sketched below. Typically, editing software allows us to work on pieces of video in any sequence we desire; you could, for example, complete some part of the video which comes at the end and then take up the beginning. That is why such editing is popularly known as non-linear editing, as opposed to analog tape-based editing, where the process had to be carried out sequentially, beginning at the very beginning.
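A toy sketch of an edit decision list as a simple data structure: each entry names a source clip and the in and out points to keep. The clip names and timings are hypothetical, and real editing software records much more information (tracks, transitions, effects) per entry.

```python
# A toy edit decision list (EDL): each entry names a source clip and the
# in/out points (in seconds) to keep.  The editor builds the final programme
# by playing these trimmed pieces in the listed order.
edl = [
    {"clip": "intro_title.avi",    "in": 0.0,  "out": 8.0},    # hypothetical clip names
    {"clip": "crater_closeup.avi", "in": 12.5, "out": 47.0},
    {"clip": "interview.avi",      "in": 3.0,  "out": 95.5},
    {"clip": "credits.avi",        "in": 0.0,  "out": 15.0},
]

total = sum(entry["out"] - entry["in"] for entry in edl)
print(f"programme length: {total:.1f} s from {len(edl)} clips")

# Non-linear editing: the order in which you work need not match the order of
# the programme -- re-sequencing is just reordering the list.
edl[1], edl[2] = edl[2], edl[1]
```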
Check Your Progress 13.5
Notes: a) Write your answer in the space given below. b) Compare your answer with the one given at the end of this unit.
Choosing a relevant instructional situation, explain how a video can be more effective than a face to face presentation.
................................................................................

Depending on the computer platform (Wintel, Apple or dedicated video platforms) and the video capture card, there is a range of editing software available, with enormous features, effects and capabilities. While discussing their details is beyond the scope of this unit, we will introduce you to two editing tools in the subsequent sections: one which is part of your Windows operating system's accessories, Windows Movie Maker, and another, a web-based video editing tool known as Kaltura. We will assume you have access to some digital video and audio stored on your computer's hard disk. Even if you do not, you should be able to arrange this easily based on what you learnt in the previous sections.

13.6 VIDEO EDITING USING MOVIE MAKER

Windows Movie Maker is generally installed along with the Windows XP or Windows Vista operating systems. It is a simple, straightforward movie making program that allows you to edit your video and publish it as a web-based video or for playback on a desktop computer. It does not expect you to know much about video production and hand-holds you through a step by step process.

Figure 13.4: The Windows Movie Maker interface

Launch Windows Movie Maker. You should see the following: a task pane listing the steps you need to perform, a collections window, a player window, and a timeline/storyboard window (see Figure 13.4). We would strongly suggest you read the Help Topics under the Help menu before you begin using the software.

The basic processes involved are: creating a collection, a collection of your audio and video files and any other graphics files you need; making a project, a container file which stores information about the sequencing, timing, effects and any modifications you have made to the original clips; and publishing the movie, a final product file which will play back independently in suitable player software. You may import video and audio directly from your camera or microphone, or use media files stored on your hard disk or any other storage device connected to your computer. You may use a video capture card, a USB port or an IEEE 1394 port to connect your camera, or your sound card to import audio.

The task pane lists the steps you take: 1. capture video, 2. edit movie, and 3. finish movie. Let us look at each of these steps briefly. In our example, let us use video files stored on our hard disk, say in the folder c:\my documents\my video. The video files may be in the .avi, .mpg or .wmv format. Let us use .wav files for our audio and .jpg files for graphics (see the Help Topics for other supported formats).

The listing in the task pane works like a wizard, taking us step by step through the movie making process, so let us use this support. Click on import video (if you were importing from the camera, you would opt for capture from video device), select the video files and click Import. Repeat the process for importing graphics (import pictures) and sound (import audio or music). Your collections pane gets populated. Notice the byline, "Drag a clip and drop it on the timeline below". Do just that: select the clip you wish to place first in the movie and drag it on to the timeline below.

The timeline and storyboard are alternate views of your project. While the storyboard allows you to view the sequence of clips and any audio, transition effects or titles, the timeline allows you to review or modify the timing of clips in your project. Use the timeline buttons to perform tasks such as changing the view of your project, zooming in or out on details, recording narration, or adjusting the audio levels. You can use the trim handles, which appear when you select a clip, to trim unwanted portions of the clip. You can also preview all of the clips in your current project that are displayed on the timeline. After dragging all the clips on to the timeline in the right order (you may reorder them to your heart's content), zoom into the clips, trimming them where necessary and synchronising them with the audio, to make your movie slick.

Let us now add some captions to the video. Perhaps these are subtitles in another language, or the movie's title, or credits at the end. Titles appear in two forms: one, as an independent graphic on its own background, and two, as an overlay on the video, as in the case of a subtitle. Go to Tools -> Titles and Credits. The titles pane gives you the options: you may add a title at the beginning or end of the movie, before or after a selected clip, or overlay it on a clip. Let us select 'Add title' on the selected clip on the timeline. A new pane with a text window opens, where you may type in your text.
Notice that an instant preview is generated in the player pane, and that a video effect is already applied to the title. If you are satisfied, click Done, add title to movie. If not, move down to more options: you can change the title animation or the text effect. Let us change the text. Click on change the text font and colour. The resulting pane allows you to select a new font and a new colour for the text, increase or decrease the font size, and justify the text within the frame (see Figure 13.5). Notice that there is no option for the vertical position of the text; each titling effect comes with preset positions. So, if you are not happy with the position of the text, go to change title animation and choose from among the one-line and two-line text animations. As you can see an instant preview, you can select the one that suits your need. Finally, click Done, add title to movie.

Figure 13.5: The titles pane in Windows Movie Maker

Now you would like to publish your movie. Go back to the movie tasks pane and open Finish Movie. You have a variety of options for saving or sending this movie: save on computer, save on CD, send in e-mail, send to the web, or even send to a digital video (DV) camera. Let us for now save this movie on your hard disk. When you select this option, a wizard opens to guide you. Select the location to which you wish to save and click Next. Now you have options for selecting frame size and quality; the default is best quality, and the details of this choice are listed below it. Click on more choices, go to other settings and pull down the options. You will see a very large range of presets. Let us say we wish to finally post this video on our blog and that our audience has slow broadband connections: opt for Video for broadband (150 Kbps). Notice that all the settings change to accommodate this new restriction. Click Next, and that's it; the movie is saved in the destination folder. If you have checked the option to play the movie when you click Finish, you will see a grand playback of your movie. As with all software, there are a number of choices available at each step; do explore them and have fun making movies.

13.7 USING WEB-BASED VIDEO EDITING TOOL

Having produced your video, however small or big it is, you would certainly like to show it to others. If you opt for offline methods, you may put it on a CD or DVD. But if you wish to make it available online, you will have to be aware of a few more issues. As we have discussed earlier, size here refers to the file size as well as the pixel size of the video frame. The Internet is still not fast enough to support smooth playback of large video files. Enthusiasts, however, have found workarounds: small, stamp-sized videos, a few seconds to a few minutes long, placed on a streaming server are adequate to communicate short messages. You will find a large number of such uses on the web; look up http://youtube.com for instance. Compression formats for video files, for instance Shockwave and Flash video, are also being used to deploy banners and even tutorials.

Figure 13.6: Kaltura, an online video editing facility

Having worked out an optimum size for your video file, finding a place to host it will be the second challenge you need to address. Web servers use media servers (software capable of serving audio or video) to service requests for playback of video. Flash Media Server, Real media server, etc. are commonly used in the Windows domain.
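Before committing to a host, it helps to estimate roughly how much transfer your video will consume, since hosting plans are usually priced by storage and streaming volume. The file size and viewing figures below are assumptions for illustration only.

```python
# Rough estimate of the monthly transfer a hosted video will consume.
# All figures are illustrative assumptions, not measurements.
file_size_mb = 5          # one compressed clip, a few minutes long
views_per_day = 100
days = 30

transfer_gb = file_size_mb * views_per_day * days / 1000
print(f"~{transfer_gb:.0f} GB of streaming transfer per month for one clip")
# 5 MB x 100 views x 30 days = 15 GB -- already past a 10 GB free quota.
```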
Not only is the preparation of your video for web hosting an involved task, but web hosting charges can also be steep. Before you learn to cope with the technological requirements, you should try this out. While a search for "free hosts for video" on a search engine like Google will yield a number of sites willing to host your video, utmost discretion is required, as many of these sites earn their revenue hosting undesirable or adult content. One site we recommend is Kaltura (http://corp.kaltura.com). This is an open source video platform, where you can upload your raw video, edit it and then publish it for online viewing. The site provides a free trial service up to a limit of 10 GB of hosting and streaming, and offers to upgrade you to a paid service thereafter. A simple calculation will show that this is actually not much if you have a lot of video or a lot of viewers. If you have a 1 MB video and about 100 people view it every day, you use 100 MB each day; 10 GB is a hundred times this figure. That is, if you host 100 files of about 1 MB each, and 100 people view each of these videos, then your trial quota is used up in a day. But the purpose of this exercise is not to provide you with a web hosting option for your videos; it is to demonstrate a capability of the web and help you explore the technology associated with video post-production and hosting. So we will assume that you have a few videos of a few MB each for the trial. The proposed video activity can be quite effectively completed within the trial period.

As the first step, sign up for a free trial. This involves filling up a form on the Kaltura site and providing a few details. Once you are a registered user, you are given access to Kaltura's tools. The first task is to gather together your raw audio, video and photographs, a process called ingestion. Kaltura handles a variety of formats for each of your media. The application also allows direct upload from a web camera or microphone, and even from VHS video players. This is a straightforward step, where you browse your hard disk or the web, select the files and click on upload (see Figure 13.7). Once uploaded, you can add keywords and other descriptors, which will help you organise your videos and help users search for them.

Figure 13.7: Uploading video on Kaltura

The second step is to compose your video, a process that requires you to work with a video editor. Kaltura's video editor comes in two forms: a straightforward one, somewhat similar in functionality to Windows Movie Maker, and a more premium version, in which you can work with advanced features to modify your audio, video or photographs, apply transitions and effects, add subtitles in a variety of ways, and so on (see Figure 13.8).

Figure 13.8: Kaltura's basic video editor

Kaltura's Video-PowerPoint® Widget allows you to synchronise video with PowerPoint® slideshows and add subtitles and captions in any language to any type of web video, delivering a rich contextual multimedia experience (see Figure 13.9). The special feature of these subtitles is the facility to select a preferred language, which means you can simultaneously cater to different language groups.

Figure 13.9: Kaltura's PowerPoint Widget

The next steps are relevant when you are ready to host the video on your website, or to run a course or forum in which the video is integrated.
You then need to get into managing your videos, deciding how to stream them, and choosing how to embed the video on a web page (which player, how it will look, what the controls will be, and so on). You will learn about websites and content management systems in later units; revisit this unit when you have experienced video on the web. Again, as we have mentioned time and again, all the features cannot be covered in a unit such as this. Do try this online video editing software and explore its wide variety of features.

13.8 LET US SUM UP

We present below a brief summary of the discussions in this unit:
- With advances in technology, video has become a very popular medium for audio-visual communication. The advent of digital video, its processing on ordinary desktop computers, and the miniaturisation that has led to small hand-held camcorders have been mainly responsible for this popularity.
- Cinema is based on recording images on film and playing them back rapidly enough to simulate real-time motion. Video and television, on the other hand, scan a stream of electrons across a photo sensor while recording, and across a phosphorescent screen while displaying images. This scanning process has a bearing on various parameters such as the aspect ratio (the shape of the video frame), colours and their range, recording and playback, and the digitisation and transmission of video signals.
- Analog video can be converted into digital video using a video capture interface. Once digitised, video can be stored as a digital file in a variety of formats, and converted from one format to another using an appropriate converter.
- Digital video can be stored on a variety of media, such as video tape, CD/DVD, hard disks and flash memory devices like memory sticks.
- Digital video formats are selected based on the purpose of the video and the storage and transmission medium. Broadcast video has the highest resolution and hence the largest file size. Video for the web uses the highest compression and small image sizes, making it suitable for playback on low speed internet connections.
- Video communication is a creative medium and calls for a range of abilities. Developing a communication on video requires systematic planning, organisation, production and post-production activities, requiring a range of educational, technical and professional skills.
- While professional video production is an involved process, you can also use a small hand-held camcorder and computer-based software to compile, edit and produce an amateur video communication. Software like Windows Movie Maker has features designed for the amateur video enthusiast. You may use an online facility like Kaltura to develop web-ready video.

13.9 KEYWORDS

FireWire: Apple's name for the IEEE 1394 High Speed Serial Bus.
Frame rate: or frame frequency, the frequency (rate) at which an imaging device produces unique consecutive images, called frames.
Interlace: a technique of improving the picture quality of a video signal without consuming extra bandwidth; each frame is scanned as two alternating fields of lines rather than in a single progressive pass.
Resolution: the number of distinct pixels in each dimension that can be displayed on a computer screen.
Streaming video: content sent in compressed form over the Internet and displayed by the viewer in real time.
Video capture card: a class of video capture device designed to plug directly into expansion slots in personal computers; it is also used for converting an analog signal into digital format.
Widget: a portable chunk of code that can be installed and executed within any separate HTML-based web page by an end user without requiring additional compilation.

13.10 REFERENCES AND FURTHER READINGS

Adobe Dynamic Media Group (2002). A Digital Video Primer. Available at http://www.adobe.com/motion/events/pdfs/dvprimer.pdf
Goldman, R., Pea, R., Barron, B., & Derry, S.J. (Eds.) (2007). Video Research in the Learning Sciences. Mahwah, NJ: Lawrence Erlbaum Associates.
Heinich, R., Molenda, M., Russell, J.D., & Smaldino, S.E. (1999). Instructional Media and Technologies for Learning. Upper Saddle River, NJ: Merrill.
Koumi, J. (2006). Designing Video and Multimedia for Open and Flexible Learning. London: Routledge.
Millerson, G. (1999). Television Production. Oxford: Focal Press.
Thornhill, S., Asensio, M., & Young, C. (Eds.) (2002). Video Streaming: A Guide for Educational Development. Manchester: JISC Click and Go Video Project. Available at http://www.cinted.ufrgs.br/videoeduc/streaming.pdf

13.11 FEEDBACK TO CHECK YOUR PROGRESS QUESTIONS

Check Your Progress 13.1
A cathode ray tube (CRT) produces a constant beam of electrons which, when it strikes a fluorescent or phosphorescent screen, produces a luminous dot. Using colour sensitive phosphors, red, green or blue dots are produced. The scanning process progressively moves the beam across the screen; the beam is then retraced back to the beginning, stepped down, and the tracing begins again. In this way the entire screen is painted. Once the beam reaches the bottom edge of the screen, it is taken back to the top and the process begins all over again. This happens very rapidly (one frame in about one twenty-fifth of a second), simulating a continuous image. The number of dots created on the screen and their spacing are related to the sharpness of the image. The speed (25 frames per second) imposes an automatic restriction: processes happening faster than this cannot normally be shown.

Check Your Progress 13.2
The number of phosphor dots available per inch of monitor space defines how closely packed the pixels of the image will be. The numbers 640 x 480 signify a matrix of 640 columns and 480 rows of pixels; all these dots together make up the image on the screen. When a bitmap image of, say, 640 x 480 size is displayed on a monitor of a higher resolution, say 1024 x 768, it will be displayed in a smaller window of size 640 x 480. If blown up to full screen, the same number of pixels will be smeared out over the larger area, making the image lose its sharpness.

Check Your Progress 13.3
The internet carries digital data at varying speeds. A broadband connection rated at 512 Kbps (kilobits per second) will, under ideal conditions, deliver only about 64 kilobytes each second. Based on what you learnt in the unit, you can work out how much data needs to arrive each second if smooth playback of video is to be ensured; clearly, the Internet is still not fast enough for straightforward video broadcast. Factors like image size, compression, format, the speed of the internet connection and the capacity of the server will affect the quality of the download and consequently the quality of playback of the video.

Check Your Progress 13.4
We can visualise the chain by categorising the process thus: production, post-production, packaging and playback. Once you have a systematic script worked out, you record video, supporting audio, music and sound effects, and any graphics required. This part would require at least a video camera.
You may use external microphones, lights, etc. based on the requirements. All the recorded content is put together into a coherent video communication on an editing machine; this would typically be editing software residing on a computer. You may also need a video capture card if you have used analog video. Once the post-production process is complete, the final video presentation has to be subjected to two further processes. One, it has to be suitably compressed and converted into the DVD format; editing software has built-in facilities for this. Two, it has to be recorded on to a DVD, so your computer should be equipped with a DVD writer drive.

Check Your Progress 13.5
Typically, we would prefer video where there is something to show, and something that cannot be communicated in a still image: a process, an event, and so on. The event itself might normally unfold over a relatively long time. Further, let us assume that watching the video by itself does not suffice but needs further explanation; this could be through an explanatory commentary. It could also need other data, or graphics, or simply a labelling of parts. Select a subject and an instructional situation which requires all of the above. Then describe how, even if possible, it would be difficult to manage in a face to face situation without the support of video. Also describe how the video itself functions as a time-saving device.