Understanding Uses of Peripheral Devices

Peripheral devices add much-needed functionality to computers, beyond the core components. Having a fast processor and terabytes of hard drive space is great, but it doesn't complete the picture. Users need the ability to input data and easily see and use the output that the processor generates. Of course, the types of devices that can input or receive output are quite varied. In the following sections, we are going to break peripheral devices into four categories:

Video
Audio
Input and output
Storage

We realize that video and audio are indeed input or output devices, but because they are more specialized, we will cover them separately. After this section, you will have a good understanding of the purposes of and uses for several common peripheral devices, as well as how they connect to a PC.

Video Devices

The primary method of getting information out of a computer is to use a computer video display. Display systems convert computer signals into text and pictures and display them on a TV-like screen. As a matter of fact, early personal computers used television screens because it was simpler to use an existing display technology than to develop a new one. The most common video device used is a monitor, and we'll discuss them in depth. We will also cover projectors, cameras and webcams, virtual reality headsets, and optical devices.

Monitors

Most display systems work the same way. First, the computer sends a signal to a device called the video adapter—an expansion board installed in an expansion bus slot or the equivalent circuitry integrated into the motherboard—telling it to display a particular graphic or character. The adapter then renders the character for the display; that is, it converts the single instruction into several instructions that tell the display device how to draw the graphic and sends the instructions to the display device based on the connection technology between the two. The primary differences after that are in the type of video adapter you are using (digital or analog) and the type of display (LCD, plasma, OLED, and so forth).

Many monitors sold today are touch screens. Touch screen technology converts stimuli of some sort, which are generated by actually touching the screen, to electrical impulses that travel over serial connections to the computer system. These input signals allow for the replacement of the mouse, both in terms of movement and in clicking. With onscreen keyboards, the external keyboard can be retired as well. This technology is used extensively with smartphones, tablets, and Internet of Things (IoT) devices, such as automobile information systems, security systems, and smart thermostats. Many laptops today have touch screens as well. The technology has invaded the PC market, too, but has yet to totally dominate it. It's probably just a matter of time, though.

Types of Monitors

The first monitors built for PCs used the same technology as televisions of the time, which were based on the cathode ray tube (CRT). The details behind how a CRT worked are beyond the scope of this book, but what you should know is that CRTs were very bulky and heavy to move. Unlike monitors and TVs of today, which have relatively flat profiles, CRTs could be a foot or more deep. They also mostly used VGA connectors, with newer models having a DVI-A connector. It's rare to see a CRT monitor in use today. Current monitors are based on LCD, plasma, or OLED technology.
Liquid Crystal Display

First used with portable computers and then adapted to desktop monitors, liquid crystal displays (LCDs) are based on the concept that when an electrical current is passed through a semi-crystalline liquid, the crystals align themselves with the current. When transistors are combined with these liquid crystals, patterns can be formed. Patterns can then be combined to represent numbers or letters. LCDs are relatively lightweight and consume far less power than the CRTs they replaced. LCDs can use VGA, DVI, or HDMI connections, depending on the monitor.

Liquid crystals produce no light, so LCD monitors need a lighting source to display an image. Traditional LCDs use a fluorescent bulb backlight to produce light. Most LCDs today use a panel of light-emitting diodes (LEDs) instead, which consume less energy, run cooler, and live longer than fluorescent bulbs. Therefore, when you see a monitor advertised as an LED monitor, it's really an LCD monitor with LED backlighting.

Plasma

The word plasma refers to a cloud of ionized (charged) particles—atoms and molecules with electrons in an unstable state. This electrical imbalance is used to create light from the changes in energy levels as the particles achieve balance. Plasma display panels (PDPs) create just such a cloud from an inert gas, such as neon, by placing electrodes in front of and behind sealed chambers full of the gas and vaporized mercury. This technology of running a current through an inert gas to ionize it is shared with neon signs and fluorescent bulbs.

Because of the emission of light that this process produces, plasma displays have more in common with legacy CRTs than they do with LCDs. In fact, as with CRTs, phosphors are responsible for the creation of light in the shades of the three primary colors—red, green, and blue (RGB). In this case, the pixels produce their own light (no backlight is required with plasma displays), also a feature shared with CRTs.

PDPs were superior to LCDs for many years for two reasons. One, they produced smoother-flowing images, due to a refresh rate about 10 times higher than that of LCDs. Two, they had a deeper black color. In an LCD, the backlight cannot be completely blocked by the liquid crystal, which produces dark hues that are more gray than black. LCDs backlit with LEDs, however, are able to dim selective areas or the entire image completely. Because PDPs were relatively cost-effective to produce at sizes matching a given LCD panel, plasma displays historically enjoyed more popularity in the larger-monitor market. That advantage is all but gone today, and more LCDs are now sold than plasma displays.

Organic Light-Emitting Diode

Organic light-emitting diode (OLED) displays, unlike LED displays, are really the image-producing parts of the display, not just the light source. In much the same way as a plasma cell places an excitable material between two electrodes, OLEDs are self-contained cells that use the same principle to create light. An organic light-emitting compound forms the heart of the OLED, and it is placed between an anode and a cathode, which produce a current that runs through the electroluminescent compound, causing it to emit light. An OLED, then, is the combination of the compound and the electrodes on each side of it. The electrode in the back of the OLED cell is usually opaque, allowing a rich black display when the OLED cell is not lit. The front electrode should be transparent to allow the emission of light from the OLED.
If thin-film electrodes and a flexible compound are used to produce the OLEDs, an OLED display can be made flexible, allowing it to function in novel applications where other display technologies could never work. Because of the thin, lightweight nature of the panels, OLED displays can both replace existing heavy full-color LED signs, like the ones you might see in Las Vegas or even at your local car dealer's lot, and carve out new markets, such as integration into clothing and multimedia advertisements on the sides of buses to replace and improve upon the static ads that are commonly seen.

Because OLEDs create the image in an OLED display and supply the light source, there is no need for a backlight with its additional power and space requirements, unlike in the case of LCD panels. Additionally, the contrast ratio of OLED displays exceeds that of LCD panels, regardless of backlight source. This means that in darker surroundings, OLED displays produce better images than do LCD panels. The power to drive an OLED display is, on average, less than that required for LCDs. Generally speaking, OLED monitors are the highest-quality monitors you will find on the market today.

Adjusting Display Settings

Although most monitors are automatically detected by the operating system and configured for the best quality that they and the graphics adapter support, sometimes manually changing display settings, such as for a new monitor or when adding a new adapter, becomes necessary. Let's start by defining a few important terms:

Refresh rate
Resolution
Multiple displays

Each of these terms relates to settings available through the operating system by way of display-option settings.

Refresh Rate

The refresh rate—technically, the vertical scan frequency—specifies how many times in one second the image on the screen can be completely redrawn, if necessary. Measured in screen draws per second, or hertz (Hz), the refresh rate indicates how much effort is being put into checking for updates to the displayed image. For LCD screens, the refresh rate is generally fixed and not an adjustment to be made. LCD televisions that support 120 Hz refresh rates are common, but it's easy to find those rated for 60 Hz, 240 Hz, and 480 Hz as well. For computer monitors, you might be able to select among multiple refresh rates because you're in control of the circuitry driving the refresh rate: the graphics adapter.

Higher refresh rates translate to more fluid video motion. Think of the refresh rate as how often a check is made to see if each pixel has been altered by the source. If a pixel changes between refreshes, the monitor cannot show that change until the next redraw. Therefore, for gaming and home-theater systems, higher refresh rates are an advantage.

The refresh rate is selected for the monitor. Nevertheless, the refresh rate you select must be supported by both your graphics adapter and your monitor because the adapter drives the monitor. If a monitor supports only one refresh rate, it does not matter how many different rates your adapter supports—without overriding the defaults, you will be able to choose only the one common refresh rate. It is important to note that as the resolution you select increases, the higher supported refresh rates begin to disappear from the selection menu. If you want a higher refresh rate, you might have to compromise by choosing a lower resolution.
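To put these rates in perspective, here is a brief, illustrative Python sketch (ours, not part of the original text) that converts a refresh rate in hertz into the interval between screen redraws; a pixel change must wait up to one full interval before it can appear on screen:

```python
# Time between screen redraws at common refresh rates. A pixel change
# can wait at most one full interval before the next redraw shows it.
for hz in (60, 120, 240, 480):
    frame_time_ms = 1000 / hz  # milliseconds per redraw
    print(f"{hz:>3} Hz -> one redraw every {frame_time_ms:5.2f} ms")
```

At 60 Hz, an update can lag the source by almost 17 ms; at 240 Hz, by just over 4 ms, which is why higher rates feel smoother in fast-moving video and games.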
Exercise 3.1 shows you where to change the refresh rate in Windows.

Resolution

Resolution is defined by how many software picture elements (pixels) are used to draw the screen. An advantage of higher resolutions is that more information can be displayed in the same screen area. A disadvantage is that the same objects and text displayed at a higher resolution appear smaller and might be harder to see. Up to a point, the added crispness of higher resolutions displayed on high-quality monitors compensates for the negative aspects.

The resolution is described in terms of the visible image's dimensions, which indicate how many rows and columns of pixels are used to draw the screen. For example, a resolution of 1920 × 1080 means 1920 pixels across (columns) and 1080 pixels down (rows) were used to draw the pixel matrix. The video technology in this example would use 1920 × 1080 = 2,073,600 pixels to draw the screen. Resolution is a software setting that is common among CRTs, LCDs, and projection systems, as well as other display devices.

Setting the resolution for your monitor is fairly straightforward. If you are using an LCD, for best results you should use the monitor's native resolution, which comes from the placement of the transistors in the hardware display matrix of the monitor. For a native resolution of 1680 × 1050, for example, there are 1,764,000 transistors (LCDs) or cells (PDPs and OLED displays) arranged in a grid of 1680 columns and 1050 rows. Trying to display a resolution other than 1680 × 1050 through the operating system tends to result in the monitor interpolating the resolution to fit the differing number of software pixels to the 1,764,000 transistors, often resulting in a distortion of the image on the screen. Some systems will scale the image to avoid distortion, but others will try to fill the screen with the image, resulting in distortion.

On occasion, you might find that increasing the resolution beyond the native resolution results in the need to scroll the Desktop in order to view other portions of it. In such instances, you cannot see the entire Desktop all at the same time. The monitor has the last word in how the signal it receives from the adapter is displayed. Adjusting your display settings to those that are recommended for your monitor can alleviate this scrolling effect.

To change the resolution in Windows 10, right-click the Desktop and choose Display Settings (as in Exercise 3.1). There is a pull-down menu for resolution. Click it and choose the resolution you want, as shown in Figure 3.33.

Figure 3.33 Adjusting the resolution in Windows 10

Some adapters come with their own utilities for changing settings such as the refresh rate and resolution. For example, Figure 3.34 shows a screen from the NVIDIA Control Panel. On it, you can change both the resolution and the refresh rate. Other settings are available using the menu on the left.

Figure 3.34 The NVIDIA Control Panel

Understanding Aspect Ratios

The term aspect ratio refers to the relationship between the horizontal and vertical pixel counts that a monitor can display. For example, old square-ish CRTs were shaped to support a display that conformed to 4:3 ratios, such as 800 × 600 or 1024 × 768. If you divide the first number by 4 and multiply the result by 3, the product is equal to the second number. Additionally, if you divide the first number by the second number, the result is approximately 1.3, the same as 4 ÷ 3. Displays with a 16:10 aspect ratio have measurements that result in a quotient of 16 ÷ 10 = 1.6. When LCD monitors first became popular, they had wider screens and most supported a 16:10 ratio. Because the ATSC (Advanced Television Systems Committee) standard for widescreen television aspect ratios is 16:9 (1.778), computer monitors are trending more toward this same aspect ratio. As a result, the once-popular 1920 × 1200, 16:10 resolution is now less common than the 1920 × 1080, 16:9 resolution. If you have a monitor that supports one and you try to set it to the other, the image may look squished or stretched, or the monitor may not display at all.
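As a quick worked example (our own sketch, not from the text), the following Python snippet computes the total pixel count and the reduced aspect ratio for the resolutions mentioned above:

```python
from math import gcd

# Pixel count and reduced aspect ratio for the resolutions discussed above.
for width, height in ((800, 600), (1024, 768), (1920, 1200), (1920, 1080)):
    pixels = width * height            # e.g., 1920 * 1080 = 2,073,600
    d = gcd(width, height)             # reduce the ratio to lowest terms
    print(f"{width} x {height}: {pixels:,} pixels, "
          f"ratio {width // d}:{height // d} ({width / height:.3f})")
# Note that 1920 x 1200 reduces to 8:5, which is the same ratio as 16:10.
```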
Multiple Displays

Whether regularly or just on occasion, you may find yourself in a position where you need to use two monitors on the same computer simultaneously. For example, you may need to work in multiple spreadsheets at the same time, and having two monitors makes it much easier. Or, if you are giving a presentation and would like to have a presenter's view on your laptop's LCD but need to project a slide show onto a screen, you might need to connect an external projector to the laptop.

Simply connecting an external display device does not guarantee that it will be recognized and work automatically. You might need to change the settings to recognize the external device, or adjust options such as the resolution or the device's virtual orientation with respect to the built-in display. Exercise 3.2 guides you through this process. When you have dual displays, you have the option to extend your Desktop onto a second monitor or to clone your Desktop on the second monitor. To change the settings for multiple monitors in Windows 10, follow the steps in Exercise 3.2, after ensuring that you have a second monitor attached.

Exercise 3.2 Changing the Settings for Multiple Monitors

1. Right-click a blank portion of the Desktop.
2. Click Display Settings to open the Display Settings window. If a second monitor has been detected, you will see a screen similar to the one shown in Figure 3.35. Otherwise, you will need to scroll down and click the Detect button in the Multiple Displays section. Notice that the second monitor is highlighted. If you were to change settings such as scale, resolution, or orientation, it would affect the monitor that's highlighted.
3. Scroll down in Display Settings. Under the Multiple Displays settings, you will have options to show an image on only one of the monitors, duplicate the displays, or extend the displays. Choose Extend These Displays, as shown in Figure 3.36.
4. Scroll back up to the area where you see the two monitors. Click and hold the second monitor, and drag it around. Notice that you can place it above, below, left, or right of the first monitor. This will affect some display features, including where you need to move the mouse cursor to get it to appear on the second monitor.
5. Move the second monitor to be above the first monitor, and close Display Settings.
6. Move your mouse until you get the cursor to appear on the second screen.
7. (Optional) Open Display Settings and configure the second monitor to be in the position you want it to be relative to the first monitor.

Figure 3.35 Multiple displays detected

Figure 3.36 Extending the displays
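To connect the exercise to what the operating system is tracking behind the scenes, here is a small illustrative Python sketch; it assumes the third-party screeninfo package (installable with pip install screeninfo), which is our choice for the example and not something the text prescribes. Each display occupies a rectangle in one shared virtual coordinate space, and its offsets encode its position relative to the primary monitor:

```python
# Assumes the third-party "screeninfo" package: pip install screeninfo
from screeninfo import get_monitors

for m in get_monitors():
    print(f"{m.name}: {m.width} x {m.height} at virtual offset ({m.x}, {m.y})")
# A monitor placed above the primary one reports a negative y offset, which
# is why dragging a monitor's icon in Display Settings changes where the
# mouse cursor crosses from one screen to the other.
```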
Projection Systems

Another major category of display device is the video projection system, or projector. A portable projector can be thought of as a condensed video display with a lighting system that projects the image onto a screen or other flat surface for group viewing.

Interactive whiteboards have become popular over the past decade to allow presenters to project an image onto the board as they use virtual markers to draw electronically on the displayed image. Remote participants can see the slide on their system as well as the markups made by the presenter. The presenter can see the same markups because the board transmits them to the computer to which the projector is attached, causing them to be displayed by the projector in real time.

To accommodate using portable units at variable distances from the projection surface, a focusing mechanism is included on the lens. Other adjustments, such as keystone, trapezoid, and pincushion, are provided through a menu system on many models, as well as a way to rotate the image 180 degrees for ceiling-mount applications.

Brightness

Projection systems are required to produce a lighted image and display it many feet away from the system. The inherent challenge to this paradigm is that ambient light tends to interfere with the image's projection. One solution to this problem is to increase the brightness of the image being projected. This brightness is measured in lumens. A lumen (lm) is a unit of measure for the total amount of visible light that the projector gives off, based solely on what the human eye can perceive and not on invisible wavelengths. When the rated brightness of the projector in lumens is focused on a larger area, the lux—a derivative of lumens measuring how much the projector lights up the surface on which it is focused—decreases; as you train the projector on a larger surface (farther away), the same lumens produce fewer lux.

The foregoing discussion notwithstanding, projection systems are rated and chosen for purchase based on lumens of brightness, usually once a maximum supported resolution has been chosen. Sometimes the brightness is even more of a selling point than the maximum resolution that the system supports because of the chosen environment in which it operates. Therefore, this is the rating that must be used to compare the capabilities of projection systems.

Some loose guidelines can help you choose the right projector for your application. Keep in mind that video projection requires more lumens than static image projection, and 3D output requires roughly double the lumens of 2D projection. Additionally, use of a full-screen (4:3 aspect ratio) projector system in a business environment versus a widescreen (16:9) home theater projector requires approximately double the lumens of output at the low end and only 1.3 times at the high end. For example, if you are able to completely control the lighting in the room where the projection system is used, producing little to no ambient light, a projector producing as little as 1,300 lumens is adequate in a home theater environment, while you would need one producing around 2,500 lumens in the office. However, if you can only get rid of most of the ambient light, such as by closing blinds and dimming overhead lights, the system should be able to produce 1,500 to 3,500 lumens in the home theater and 3,000 to 4,500 lumens in the office. If you have no control over a very well lit area, you'll need 4,000 to 4,500 lumens in the home theater and 5,000 to 6,000 lumens in the business setting. These measurements assume a screen size of around 120″, regardless of aspect ratio.
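Because lux is simply lumens spread over the lit area, a short Python sketch (our own worked example; the 120-inch, 16:9 screen matches the assumption in the guidelines above) shows how the same projector brightness thins out on a large surface:

```python
from math import sqrt

def screen_area_m2(diagonal_in, ratio_w=16, ratio_h=9):
    """Area in square meters of a screen with the given diagonal in inches."""
    diag_m = diagonal_in * 0.0254                   # inches -> meters
    unit = diag_m / sqrt(ratio_w**2 + ratio_h**2)   # scale factor for the ratio
    return (ratio_w * unit) * (ratio_h * unit)

area = screen_area_m2(120)  # roughly 4.0 square meters
for lumens in (1300, 2500, 4500):
    # Lux = lumens per square meter of lit surface, ignoring lens losses.
    print(f'{lumens} lm on a 120" 16:9 screen = about {lumens / area:.0f} lux')
```

A 1,300 lm projector yields only about 330 lux on that screen, which is why it takes a fully darkened room for the image to stand out.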
By way of comparison, a 60W standard light bulb produces about 800 lumens. Output is not linear, however, because a 100W light bulb produces over double that, at 1,700 lm. Nevertheless, you couldn't get away with using a standard 100W incandescent bulb in a projector. The color production is not pure enough, and it constantly changes throughout the bulb's operation due to deposits of soot from the burning of its tungsten filament during the production of light. High-intensity discharge (HID) lamps, like the ones found in projection systems, do more with less by using a smaller electrical discharge to produce far more visible light. A strong quartz chamber holds the filament in a projector lamp and can be seen inside the outer bulb. It contains a metal halide (where the word halogen comes from) gas that glows bright white when the tungsten filament lights up. Depositing soot on the inside of the projector bulb is avoided by a chemical process that attracts the soot back to the filament, where it once again becomes part of the filament, extending its life and reducing changes in light output.

Expect to pay considerably more for projector bulbs than for standard bulbs of a comparable wattage. The metal halide gases used in projector bulbs are more expensive than the noble gases used in standard bulbs. Add to that the fact that the bulb itself might have to be handmade, and you can understand the need for the higher cost.

Cooling Down

When you power off a projector, its fan continues to run to cool the lamp. Although it doesn't take long for the fan to stop running on its own, this is a phase that should never be skipped to save time. With projector bulbs being one of the priciest consumables in the world of technology, doing so may cost you more than a change in your travel arrangements. See the sidebar titled "Burning Up" for some perspective.

Webcams

Years ago, owing to the continued growth in the Internet's popularity, video camera–only devices known as webcams started their climb in popularity. Today, anyone who does a fair amount of instant messaging, whether professional or personal, has likely used or at least been introduced to webcams, often used in conjunction with messaging user interfaces.

Webcams make great security devices as well. Users can keep an eye on loved ones or property from anywhere that Internet access is available. Care must be taken, however, because the security that the webcam is intended to provide can backfire on the user if the webcam is not set up properly. Anyone who happens upon the web interface for the device can control its actions if there is no authentication enabled. Some webcams provide a light that illuminates when someone activates the camera. Nevertheless, it is possible to decouple the camera's operation and that of its light.

Nearly every laptop produced today has a webcam built into its bezel. An example is shown in Figure 3.37—this one has a light and two microphones built in next to it. If a system doesn't have a built-in camera, a webcam connects directly to the computer through an I/O interface, typically USB. Webcams that have built-in wired and wireless NIC interfaces for direct network attachment are prevalent as well. A webcam does not have any self-contained recording mechanism. Its sole purpose is to transfer its captured video directly to the host computer, usually for further transfer over the Internet—hence, the term web.
