History of Electrical Measuring Devices and VOM PDF
Summary
This document provides a historical overview of the multimeter, including the development of early measuring devices like the galvanometer and the invention of the first multimeter in the 1920s. It also discusses the transition from analog to digital multimeters and the impact of multimeters on various fields.
**Lecture: The Invention and Evolution of the Multimeter**

**1. Introduction**

The multimeter is one of the most essential tools in the field of electronics, widely used by engineers, technicians, and hobbyists. But where did it all begin? Today, we'll explore the origins of the multimeter, how it was invented, and its evolution over time.

**Objectives**:

1. To understand the historical development of the multimeter, from its early predecessors to the sophisticated digital devices we use today.

**2. The Earliest Predecessor: The Galvanometer (1820)**

- **Definition and Purpose**:
  - The galvanometer is considered the first device leading to the development of the multimeter.
  - Invented in 1820, it was a moving-pointer device designed to detect and measure electric current.
- **How It Worked**:
  - The galvanometer operated on the principle that an electric current passing through a coil creates a magnetic field, which can deflect a compass needle.
  - It provided a visual indication of current flow through the deflection of a needle.
- **Limitations**:
  - Despite its usefulness in laboratory settings, the galvanometer was large, bulky, and delicate.
  - Its sensitivity made it impractical for fieldwork or rough environments, limiting its use primarily to controlled lab settings.

**3. The Invention of the First Multimeter**

- **Donald Macadie and His Innovation (1920)**:
  - In 1920, Donald Macadie, a British Post Office engineer, is credited with inventing the first multimeter.
  - **Motivation**: Macadie was frustrated by the need to carry separate instruments to measure amperes, volts, and ohms while working on telecommunication lines.
  - **Solution**: He developed a single device that combined all of these measurements into one compact tool.
![](media/image2.png)

**The AVOmeter**: *An early version of the AVOmeter by Liftarn, CC BY-SA 3.0*

- Named after the units it could measure, Amperes (A), Volts (V), and Ohms (Ω), it was called the AVOmeter.
- **Design**: The original AVOmeter was quite clunky, resembling a small wooden box with a large analog dial.
- **Functionality**: It could measure DC voltage, current, and resistance. This innovation was revolutionary at the time, making it easier for technicians to perform their tasks efficiently.

**4. Evolution of the Multimeter: From the 1920s Onward**

- **Early Modifications**:
  - During the first decade after its invention, the AVOmeter underwent significant changes to enhance its portability and functionality.
  - By the 1930s, it had been redesigned to be more compact and user-friendly, making it suitable for field use.
- **Expansion of Capabilities**:
  - Initially, the AVOmeter was limited to measuring direct current (DC), resistance, and voltage across 13 different ranges.
  - With the introduction of the "copper oxide instrument rectifier," its capabilities expanded: it could now measure alternating current (AC) as well as DC, and its number of measurement ranges grew from 13 to 20.
- **Westinghouse's Contribution**:
  - Multimeter history took a significant turn when Westinghouse introduced the first Universal meter, which allowed a single device to measure both AC and DC currents and voltages, enhancing versatility.

**5. From Analog to Digital: A Technological Leap**

- **Introduction of Digital Multimeters (DMMs)**: The development of digital technology in the latter half of the 20th century paved the way for digital multimeters.
- DMMs replaced the analog needle with a digital display, providing higher accuracy, easier reading, and additional functions such as auto-ranging, data hold, and connectivity options.
- **Comparison to Early Multimeters**:
  - Digital multimeters are more compact, accurate, and reliable than their analog predecessors.
  - They offer a variety of measurement functions in a single handheld device, demonstrating the significant advancement of multimeter technology over the past century.

**6. Impact and Importance of the Multimeter**

- **In the Field of Electronics**:
  - The multimeter revolutionized electrical measurement, making it more efficient and accessible.
  - It is a crucial tool for troubleshooting, maintenance, and development in electronics, automotive, telecommunications, and many other industries.

**THE VOLTMETER**

A voltmeter, sometimes referred to as a voltage meter, is a device that measures the electrical potential difference between two points in an electrical or electronic circuit. It is commonly used to measure voltage in both Alternating Current (AC) and Direct Current (DC) circuits, and specialized voltmeters are available for measuring Radio Frequency (RF) voltage.

A voltmeter is typically calibrated in volts, millivolts (0.001 V), or kilovolts (1,000 V). To measure the voltage across a device, the voltmeter is connected in parallel with it. This parallel connection is crucial because components connected in parallel experience the same potential difference, so the voltage across the voltmeter equals the voltage across the device being measured.

Voltmeters are designed with high internal resistance. This high resistance is important for accurately measuring the potential difference between two points in a circuit, as it ensures that the current flowing through the voltmeter is minimal.
By restricting current flow, the high resistance prevents the voltmeter from altering the circuit's operation, allowing for accurate voltage readings. In circuit diagrams, a voltmeter is represented by the letter V inside a circle with two terminals.

![](media/image4.png)

**Analog Voltmeter**

An analog voltmeter is primarily used for measuring AC and DC voltages. It displays the reading with a pointer that moves across a calibrated scale; the torque that moves the pointer is directly proportional to the voltage being measured.

A basic analog voltmeter consists of a sensitive galvanometer (current meter) connected in series with a high resistance. The meter must have high internal resistance so that it does not draw significant current and disrupt the circuit under test. The voltage range the meter can display is determined by the value of the series resistance and the sensitivity of the galvanometer.

For measuring low voltages, an oscilloscope is commonly employed; the instantaneous voltage appears as a vertical displacement on the screen. In RF and AC applications, oscilloscopes are used to measure both peak-to-peak and peak voltages. When measuring high potential differences, appropriate wiring, insulators, and heavy-duty probes are essential for accurate readings and safety.

**Digital Voltmeter**

Another commonly used instrument for measuring voltage is the digital voltmeter (DVM). A digital voltmeter determines an unknown voltage by converting it into a digital value and displaying it numerically. These devices typically use a type of analog-to-digital converter known as an integrating converter. Several factors can affect the accuracy of a DVM, including input impedance, temperature, and variations in the power supply voltage. Inexpensive DVMs generally have an input resistance of about 10 MΩ.
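Two of the relationships just described, sizing the series (multiplier) resistance for an analog movement and the loading error caused by a meter's finite input resistance, can be sketched numerically. The component values below are illustrative assumptions, not figures from the text:

```python
# Sketch of two voltmeter calculations (all component values are
# illustrative assumptions, not specifications from any real meter).

def multiplier_resistance(v_full_scale, i_full_scale, r_coil):
    """Series resistance needed so a galvanometer movement reads
    v_full_scale when carrying its full-scale current i_full_scale."""
    return v_full_scale / i_full_scale - r_coil

def loading_error(r_source, r_meter):
    """Fractional error when a meter with input resistance r_meter
    reads a source with Thevenin (output) resistance r_source.
    Measured voltage = true voltage * r_meter / (r_meter + r_source)."""
    return r_source / (r_source + r_meter)

# A hypothetical 50 uA, 2 kΩ movement extended to a 10 V range:
r_series = multiplier_resistance(10.0, 50e-6, 2_000.0)
print(f"Multiplier resistance: {r_series:,.0f} ohms")  # 198,000 ohms

# Loading a 100 kΩ source with a 10 MΩ DVM versus a 1 GΩ DVM:
print(f"10 MΩ input reads low by {loading_error(100e3, 10e6):.2%}")
print(f"1 GΩ input reads low by  {loading_error(100e3, 1e9):.4%}")
```

The second comparison shows why the text's distinction between 10 MΩ and 1 GΩ input resistances matters: the higher the input resistance relative to the circuit being probed, the smaller the disturbance the meter introduces.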
Precision DVMs, however, may feature input resistances of 1 GΩ or higher, especially when measuring low voltages (below 20 V). To maintain accuracy and stay within the manufacturer's specified tolerances, DVMs should be regularly calibrated against a voltage standard, such as the Weston cell.

### Other Types of Voltmeters

These voltmeters are categorized by their design and construction:

- **MI Voltmeter**: The Moving Iron (MI) voltmeter is used for measuring both AC and DC voltages. In this type, the needle's deflection is directly proportional to the voltage across the coil. MI voltmeters are further divided into two categories: attraction-type and repulsion-type moving iron instruments.
- **Rectifier Voltmeter**: Commonly used in AC circuits, rectifier voltmeters measure voltage by first converting AC to DC with a rectifier; the resulting DC signal is then measured by a Permanent Magnet Moving Coil (PMMC) instrument.
- **PMMC Voltmeter**: The Permanent Magnet Moving Coil (PMMC) voltmeter, also known as a D'Arsonval meter or simply a galvanometer, measures current via the angular deflection of a coil within a uniform magnetic field. An applied voltage drives a current through the PMMC instrument, causing the pointer to deflect. PMMC voltmeters are used for measuring DC voltages.
- **Electro-dynamometer Voltmeter**: This type can measure voltage in both AC and DC circuits, with the same calibration typically serving both.
- **Amplified Voltmeter**: Amplified voltmeters allow adjustments in sensitivity and input resistance, using an amplifier and power supply to provide the current needed to move the meter pointer.

**THE AMMETER**

![](media/image6.png)

An ammeter is a device used to measure electric current, whether alternating current (AC) or direct current (DC).
Since it measures current in amperes, the unit of measurement for electric current, it is called an ammeter.

**Representation of the Ammeter**

In circuit diagrams, an ammeter is represented by the letter "A."

**Ammeter Specifications**

To measure current, an ammeter is typically connected in series with the circuit. It is primarily used to measure small currents, often in the milliampere or microampere range. Devices specifically designed to measure currents in milliamperes are known as milliammeters, while those for extremely small currents, measured in microamperes, are called microammeters.

Ammeters are designed to have low resistance, with an ideal ammeter offering zero internal resistance. Their minimal internal resistance ensures that:

- The entire input current flows through the device.
- There is a minimal voltage drop across the device.

The ammeter also includes an internal fuse that protects it from excessive current. If a large current flows through the ammeter, the fuse blows, preventing further measurement until it is replaced.

### Effect of Temperature on Ammeter

An ammeter is sensitive to temperature and can be influenced by both internal and external environmental conditions; temperature variations can affect its readings. To minimize this effect, a resistance with a zero temperature coefficient, known as a swamping resistance, is connected in series with the ammeter, reducing the impact of temperature fluctuations on the device's accuracy.

### THE OHMMETER

An ohmmeter, also known as an ohm-meter, is a device used to measure the electrical resistance of a material, which indicates how much the material opposes the flow of electric current.
There are various types of ohmmeters: micro-ohmmeters and milli-ohmmeters are used for measuring low resistance values, while megohmmeters (often called "Meggers," after the trademarked Megger instruments) are used for measuring high resistance values. Every material has some level of electrical resistance, which can range from high to low. For conductors, resistance typically increases with temperature, whereas for semiconductors it generally decreases as the temperature rises.

Important considerations when using an ohmmeter:

1. Always turn off the power source before testing; measuring a live circuit can damage the equipment, especially during continuity testing.
2. The ohmmeter is always connected in parallel with (across) the load or component to be tested.
3. Always start on the highest range when testing an unknown resistance.
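The swamping-resistance idea from the ammeter section can be illustrated with a short calculation. A copper meter coil's resistance rises by roughly 0.4% per °C; placing a larger series resistor with an (ideally) zero temperature coefficient, such as manganin, "swamps" that change. All values below are assumptions chosen for illustration:

```python
# How a swamping resistor reduces an ammeter's temperature sensitivity.
# All component values and the temperature rise are illustrative assumptions.

ALPHA_COPPER = 0.0039  # approximate temperature coefficient of copper, per °C

def total_resistance(r_coil, r_swamp, delta_t):
    """Copper coil resistance after a temperature rise of delta_t °C,
    plus a swamping resistor assumed to have zero temperature coefficient."""
    return r_coil * (1 + ALPHA_COPPER * delta_t) + r_swamp

r_coil, r_swamp, dt = 5.0, 45.0, 20.0  # ohms, ohms, °C (assumed)

# Fractional drift of the current path's resistance over the temperature rise:
without = total_resistance(r_coil, 0.0, dt) / r_coil - 1
with_swamp = total_resistance(r_coil, r_swamp, dt) / (r_coil + r_swamp) - 1

print(f"Resistance drift without swamping: {without:.2%}")   # 7.80%
print(f"Resistance drift with swamping:    {with_swamp:.2%}")  # 0.78%
```

With the swamping resistor nine times the coil resistance, the temperature-induced drift of the total resistance shrinks by a factor of ten, which is why the text describes the swamping resistance as reducing the impact of temperature fluctuations on accuracy.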