Lecture Notes in Microprocessors PDF

Summary

These lecture notes provide an introduction to microprocessors, discussing their history, components, and functions. They cover topics such as the evolution of computing devices, the shift from analog to digital calculation, and the development of microprocessors, and they explain the various types of microprocessors and their applications, such as in calculators and computers.

Full Transcript


Lecture Notes in Microprocessors by Prof. Cyrel Manlises

LECTURE NOTE 1: INTRODUCTION TO MICROPROCESSOR

Early computers were very different from today's machines, both in their physical and their operational features. Designers' focus in the development of computers has been, above all, speed. Counting devices such as the abacus were among the first tools used for arithmetic tasks. Analog computers came later to address concerns in astronomical calculation. To perform multiplication and division, the so-called pocket calculator and the first digital calculator were invented. These devices used the slide rule, which represents real numbers as distances or intervals on a line. Other counting devices were developed afterwards, from binary- to decimal-based systems. Punched-card technology came next. Punch cards were used to control and program a specific system or machine; one punch card held one program line. The problem with punch cards is that they must not be folded or disfigured, because that can damage the program they contain.

Analog calculation was eventually replaced by digital calculation. ENIAC (Electronic Numerical Integrator and Computer) was one of these machines, built using vacuum tubes. The downsides of this technology led to the invention of the transistor. Transistors were connected to form logic elements and were later packed onto a silicon wafer to form an IC (Integrated Circuit), invented to further improve the performance of computers. The level of integration is very important, since it signifies how powerful a given IC is. From then on, microprocessors boomed and were developed by different manufacturers to satisfy the needs of users.

The word microprocessor comes from the miniaturization of the processor onto a single chip. It is the central processing unit of a computer, which includes the ALU, the control unit and the bus unit, all integrated on a single IC. The first microprocessor that could carry out a calculator's functions was originally delivered for the Japanese company BUSICOM (Business Computing Corporation). Intel bought the rights from BUSICOM, which later went bankrupt, to manufacture the chip. After this, Intel continued to develop different microprocessors, with improvements made almost every year. The following are some of Intel's designs:
a. Intel 4004 – The first single-chip microprocessor.
b. Intel 8008 – The 8-bit microprocessor.
c. Intel 8080 – 10X the performance of the 8008.
d. Intel 8085 – Used as a computer peripheral controller.
e. Intel 8086 – Used in portable computing.
f. Intel 8088 – Identical to the 8086 except for its 8-bit external bus.
g. Intel 80186 – Used mostly in embedded applications.
h. Intel 80188 – Identical to the 80186 but with an 8-bit external data bus.
i. Intel 80286 – Widely used in the IBM PC AT and AT clones at the time.
j. Intel 80386 – Used in desktop computing.
k. Intel 80486 – Used in desktop computing and servers.
l. Pentium I – Used in desktops; uses a superscalar architecture.
m. Pentium Pro – Primarily used in server systems.
n. Pentium II – A Pentium Pro with MMX and improved 16-bit performance.
o. Pentium III – An improved Pentium II, now including SSE.
p. Pentium 4 – Used in desktops and entry-level workstations.
The Pentium microprocessors adopted RISC (Reduced Instruction Set Computing) techniques, which help the computer speed up instruction execution.
A microcontroller is an enhancement of the basic microprocessor: in addition to the ALU and control unit, it includes read-write memory, read-only memory, EEPROM, peripheral devices, and I/O interfaces. A microcomputer is a complete computing system consisting of a microprocessor, a memory unit, I/O ports and a bus, all housed on the so-called motherboard.

[Fig. 1 – Functional elements of a microprocessor: input and output devices, an I/O port, the CPU (registers, ALU, CU) and memory (RAM, ROM), all connected by the bus.]

As shown in Fig. 1, the input device consists of the circuits needed to get information or data into the program. The output device presents answers and processed data or information to the outside world. The I/O port serves as the medium of communication between the processor and the outside world. There are two different techniques for addressing and controlling I/O devices: a) memory-mapped I/O and b) isolated I/O, also called port-mapped I/O (a short code sketch contrasting the two is given after this overview). A register is temporary storage for information inside the CPU. The ALU (Arithmetic Logic Unit) is a circuit in the CPU used to perform arithmetic and logical operations. The CU (Control Unit) directs the operation of the CPU and all other devices; it is made up of a) the instruction decoder, b) timing and c) control logic. Memory is the part of the computer used to store information either permanently or temporarily; the two types of memory in a microprocessor system are RAM (Random Access Memory) and ROM (Read Only Memory). The system bus is a group of lines with a related function within a microprocessor system; referring to Fig. 2, it is composed of the address bus, the data bus and the control bus. The two kinds of buses are the internal bus and the external bus.
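To contrast the two I/O techniques named above, here is a minimal C sketch. It is illustrative only: the device register address 0x4000A000 and the helper names are assumptions made for this example, and the port-mapped variant applies only to processors such as the x86 family that provide a separate I/O address space reached through IN/OUT instructions.

```c
#include <stdint.h>

/* Memory-mapped I/O: the device register occupies an ordinary memory address,
 * so plain load/store instructions reach it.  The address below is invented
 * for the example; a real value comes from the chip's data sheet.  'volatile'
 * keeps the compiler from caching or deleting the accesses. */
#define DEVICE_DATA_REG ((volatile uint8_t *)0x4000A000u)

static void mmio_write_byte(uint8_t value)
{
    *DEVICE_DATA_REG = value;   /* an ordinary store performs the I/O write */
}

static uint8_t mmio_read_byte(void)
{
    return *DEVICE_DATA_REG;    /* an ordinary load performs the I/O read */
}

/* Isolated (port-mapped) I/O: the device lives in a separate I/O address space
 * reached only by special instructions (IN/OUT on x86), so inline assembly is
 * needed.  'port' is whatever port number the device is assigned. */
#if defined(__i386__) || defined(__x86_64__)
static uint8_t port_read_byte(uint16_t port)
{
    uint8_t value;
    __asm__ volatile ("inb %1, %0" : "=a"(value) : "Nd"(port));
    return value;
}

static void port_write_byte(uint16_t port, uint8_t value)
{
    __asm__ volatile ("outb %0, %1" : : "a"(value), "Nd"(port));
}
#endif
```

The design trade-off: memory-mapped device registers consume part of the memory address space but need no special instructions, while isolated I/O keeps the memory map free at the cost of dedicated I/O instructions.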
Vacuum tubes were 1 to 3 inches high; they required a lot of space, generated tremendous heat, and consumed a large amount of power. Transistors act as logical switches in digital circuits and replaced the vacuum tubes; invented at Bell Labs in 1947, a transistor is about 1/4 inch high and needs little power. Jack Kilby of Texas Instruments used very small wires to connect transistors (the first IC) in 1958. Robert Noyce of Fairchild Semiconductor developed a method of evaporating aluminum in specific places on a silicon wafer to connect transistors; the first practical IC was developed in 1959.

The FIRST MICROPROCESSOR was delivered in 1971 for BUSICOM, with the help of TED HOFF. Ted Hoff developed a single general-purpose chip that could be programmed to carry out a calculator's functions. BUSICOM went bankrupt, and INTEL bought the rights to manufacture the chip. ROBERT NOYCE and GORDON MOORE started INTEL. The INTEL 8080 was used in the first PC, the ALTAIR computer, created at MITS by ED ROBERTS; it had no keyboard and no monitor and could only be programmed in straight binary. BILL GATES and PAUL ALLEN started Microsoft; Microsoft's first product was a BASIC interpreter for the ALTAIR.

Intel 4004 (1971) – 4-bit microprocessor; runs at 108 kHz; contains 2,300 transistors; 4 KB of memory; the first microprocessor, developed primarily for games, test equipment and other simple digital systems; used for electronic calculators; not capable of running word processors, spreadsheets or databases.
Intel 8008 (late 1971) – 8-bit microprocessor; improved version of the 4004; 16 KB of memory; can handle more complicated operations.
Intel 8080 (1973) – improved version of the 8008; speed of operation ten times faster than the 8008; 64 KB of memory; simple word processors may be used.
Intel 8086/8088 (late 1970s) – 16-bit microprocessors, a response to the demand for greater, more powerful microprocessors and to the influence of competition; 1 MB of memory.
Intel 80x86 family – upgraded capability of the 8086/8088; the XT (extended technology) and AT (advanced technology) lines.

[Fig. 2 – Standard block diagram of a microprocessor-based system: the CPU, timing circuitry, memory, parallel I/O, serial I/O and interrupt circuitry connected by the system bus (address bus, data bus and control bus).]

The CPU (Central Processing Unit) is the brain of the computer: the logical circuitry that executes computer programs. A microprocessor system can perform multiple tasks or operations at the same time, or it can execute one task or operation at a time. Interrupts are signals that indicate a need for attention: the processor performs the special task and then resumes the preempted operation. Interrupts are also used for multitasking and real-time computing. The categories of interrupt are a) maskable interrupt (IRQ), b) non-maskable interrupt (NMI), c) interprocessor interrupt, d) software interrupt and e) spurious interrupt.

[Fig. 3 – Expanded block diagram of a microprocessor-based system: the elements of Fig. 2 plus a display and the additional devices described below, attached to the system bus.]

Fig. 3 shows the additional features of a microprocessor-based system. One of these is the video display controller, whose main responsibility is to produce the TV video signal for computing or video games. The coprocessor is a support to the main CPU; it handles floating point, graphics, signal or string processing, and even encryption. The watchdog timer is a timing device in a computer system. A D/A (digital-to-analog) converter is used to convert binary form into a current, voltage or electric charge, while the A/D converter converts a current, voltage or electric charge into binary form. NVM stands for non-volatile memory.
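To make the interrupt and watchdog-timer ideas described above more concrete, here is a minimal C sketch of an interrupt-driven main loop that also feeds a watchdog. It is a generic illustration, not code for any particular microprocessor: the register addresses, the names WDT_FEED_REG and TIMER_ACK_REG, and the function timer_isr are invented for the example, and a real part would install the ISR in its interrupt vector table through a compiler- or chip-specific mechanism.

```c
#include <stdint.h>

/* Hypothetical memory-mapped registers (addresses invented for the example). */
#define WDT_FEED_REG   (*(volatile uint32_t *)0x40001000u)  /* watchdog "feed" register     */
#define TIMER_ACK_REG  (*(volatile uint32_t *)0x40002000u)  /* timer interrupt acknowledge  */

/* Flag shared between the interrupt service routine and the main loop.
 * 'volatile' because it changes outside the normal program flow. */
static volatile uint8_t timer_tick = 0;

/* Timer interrupt service routine: acknowledge the device, record that the
 * event happened, and return quickly so the preempted code can resume. */
void timer_isr(void)
{
    TIMER_ACK_REG = 1;   /* stop the device from asserting the IRQ line */
    timer_tick = 1;      /* defer the real work to the main loop        */
}

int main(void)
{
    for (;;) {
        if (timer_tick) {            /* work deferred from the ISR */
            timer_tick = 0;
            /* ... periodic processing goes here ... */
        }
        WDT_FEED_REG = 0xA5A5A5A5u;  /* feed the watchdog; if the program ever
                                        hangs, the watchdog expires and resets
                                        the processor */
    }
}
```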
QUESTIONS:
1. What is the difference between a hardware interrupt and a software interrupt?
2. Which of the interrupts is commonly used for timers?
3. Explain how the Control Unit directs the operation of the CPU and maintains communication between devices.
4. What is RISC?
5. What is the importance of a watchdog timer in a computer?
6. What was the first computer that used the Intel 8080 as its microprocessor?
7. State the function of each of the three buses in a microprocessor system.
8. What are examples of the internal bus and the external bus?

LECTURE NOTE 2: DATA ORGANIZATION

Every character entered into the computer through the keyboard is converted into machine language. Computers use different codes for this conversion. One is ASCII (American Standard Code for Information Interchange), which is used to represent characters in computers. Computers can only understand, process and execute information in binary form. The commands of each microprocessor have corresponding machine-code equivalents. The machine code that determines the operation and tells the computer what it should do at a certain point in time is the so-called operation code (op-code). Op-codes are of course in binary form, composed of 8, 16 or 32 bits; the number of bits depends on the instruction. Some instructions are 1-byte, 2-byte or 3-byte instructions. Each cell in computer memory can hold only byte-sized information, and a memory cell may contain either data or an instruction (in the form of an op-code). The size of memory depends on the microprocessor's requirements.

Binary-coded decimal (BCD) is used when computing machines perform only simple arithmetic operations. It can be stored in two different forms, packed or unpacked: if two digits are stored per byte it is packed BCD, while if one digit is stored per byte it is unpacked BCD. Byte-sized data can also be stored in two different forms. One is the signed integer, whose range is -128 to +127; the other is the unsigned integer, whose range is 0 to 255 (00H-FFH). The sign bit of the number (the most significant bit) distinguishes the two forms.

Word-sized data is composed of two bytes of information, and there are two ways of storing these bytes in memory. Little endian is the scheme in which the higher byte is stored at the higher memory address and the lower byte at the lower memory address; the other scheme, big endian, stores them in the opposite order. Doubleword-sized data is composed of four bytes of information and is stored in memory in the same way as word-sized information: the higher word and the lower word are stored at the higher and lower memory addresses, respectively.

A floating-point number is composed of two main parts, the mantissa (stored in 23 bits) and the exponent (stored as a biased exponent), along with a sign bit. The mantissa includes a hidden one bit, which is the integer part of the normalized form. There are two formats. The first is single precision, a 4-byte real number whose bias is 7FH. The other is double precision, an 8-byte real number whose bias is 3FFH.

QUESTIONS:
1. Complete the table. Use single-precision floating-point numbers.

   Decimal | Binary | Normalized Form | Sign | Biased Exponent | Mantissa
   +19     |        |                 |      |                 |
   -21     |        |                 |      |                 |
   +0.1027 |        |                 |      |                 |
   -0.0919 |        |                 |      |                 |

2. What are the packed and unpacked BCD forms of the following decimal numbers?
   a. 3431
   b. 721
3. What is the ASCII equivalent of the phrase "SCHOOL OF EE-ECE-CpE"?
4. Is the number of bits that a microprocessor can execute per unit time related to the speed of the microprocessor?
5. How many memory locations does a 4-byte piece of information require?
6. Do you think different microprocessors have the same set of op-codes?
7. Convert the following signed decimal numbers into binary:
   a. +11
   b. -28
   c. +0601
   d. -0106
8. How many bytes do the following contain?
   a. 8K
   b. 8M
   c. 8G
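The storage formats described in this note can be checked with a short, self-contained C program. It is a sketch under the assumption of a little-endian host such as an x86 PC; the helper name pack_bcd is invented for the example.

```c
#include <stdio.h>
#include <stdint.h>
#include <string.h>

/* Pack two decimal digits (0-9 each) into one byte: high digit in the upper
 * nibble, low digit in the lower nibble (packed BCD). */
static uint8_t pack_bcd(unsigned high, unsigned low)
{
    return (uint8_t)((high << 4) | low);
}

int main(void)
{
    /* Word-sized data: on a little-endian machine the lower byte of 0x1234
     * is stored at the lower memory address. */
    uint16_t word = 0x1234;
    uint8_t bytes[2];
    memcpy(bytes, &word, sizeof word);
    printf("0x1234 in memory: %02X %02X (lowest address first)\n",
           bytes[0], bytes[1]);                 /* prints 34 12 on little-endian */

    /* Packed vs unpacked BCD for the decimal number 72. */
    uint8_t packed      = pack_bcd(7, 2);       /* one byte holds both digits: 0x72 */
    uint8_t unpacked[2] = { 0x07, 0x02 };       /* one digit per byte               */
    printf("packed BCD of 72: %02X; unpacked BCD: %02X %02X\n",
           packed, unpacked[0], unpacked[1]);

    /* Signed vs unsigned interpretation of the same byte-sized datum. */
    uint8_t raw = 0xFF;
    printf("0xFF as unsigned: %u, as signed: %d\n",
           (unsigned)raw, (int)(int8_t)raw);    /* 255 vs -1 */
    return 0;
}
```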
