Lecture 2: Computer Science 2520 - Standards & History
Summary
This lecture of Computer Science 2520 focuses on standards organizations, a brief history of computing across the four generations of computers, and the layered (virtual machine) view of computer systems. It includes details on several standards bodies, examples of early computing devices, and the technologies that defined each generation.
Full Transcript
Lecture 2: Computer Science 2520

Standardization
This course deals with the components of computer systems, and how they interact with software. However, two important questions arise: What assurance do designers have that components will operate as expected? What assurance do we have that components will interoperate (that is, operate together)?

Standards Organizations
There are established organizations that set computer hardware standards, which ensure the interoperability of components and provide the means to ensure that these components operate as specified. Throughout the remainder of this course, and in fact your career, you will encounter these standards. What are some of the most important standards-establishing groups?

Institute of Electrical and Electronics Engineers (IEEE)
- Establishes standards for computer components, data representation, and signaling protocols, among many other areas.
- Promotes the interests of the worldwide electrical engineering community.
- Establishes standards such as the IEEE 802.11 a/b/g/n wireless protocols, which are now ubiquitous.

International Telecommunication Union (ITU)
- Concerns itself with the interoperability of telecommunications systems, including data communications and telephony.
- National groups establish standards within their respective countries, such as the American National Standards Institute (ANSI) and the British Standards Institution (BSI).

International Organization for Standardization (ISO)
- Coordinates worldwide standards development.
- Establishes global standards for products as diverse as screw threads and photographic film.
- Influential in formulating standards for computer hardware and software, including manufacturing methodologies.

A Brief History of Computing
In order to understand how today's technology works, it is useful to know how we got to this stage of development. You may find this difficult to believe, but the evolution of computers has taken place over a timespan of centuries. Modern electronic computers are usually classified into four generations according to the prevalent technology of each generation. We will quote dates, but they are approximate, of course!

Generation Zero (1642-1945)
- Calculating Clock (Wilhelm Schickard, 1592-1635): could perform addition and subtraction.
- Pascaline (Blaise Pascal, 1623-1662): could perform addition with carry, and subtraction.
- Difference Engine (Charles Babbage, 1791-1871): could calculate polynomial functions.
These are mechanical computing devices.
- Punched Card Tabulating Machine (Herman Hollerith, 1860-1929): designed to provide input to a computer system, but was itself a primitive electrically powered counting and sorting machine, which manipulated paper cards with holes punched in specified areas to encode data. This is an electromechanical computing device.

First Generation: Vacuum Tube Computing (1945-1953)
The vacuum tube is an electron device closely related to the lightbulb; vacuum tube diode, triode, tetrode, and pentode devices were common. The ABC (Atanasoff-Berry Computer) was the first completely electronic computing device; it was used to solve systems of linear equations. Although electronic, it was not a general-purpose computer in the modern sense. Created by John Mauchly and J. Presper Eckert (University of Pennsylvania, 1946), the Electronic Numerical Integrator and Computer (ENIAC) was the first all-electronic general-purpose digital computing device.
ENIAC was HUGE. Although a major step forward, vacuum tube technology had some serious limitations for computing. Tubes were still relatively slow, they ran hot, they consumed a lot of power (to keep them hot, oddly enough, since their speed depended on heat, and, ironically, to air-condition their environment), and they were not overly reliable.

Second Generation: Discrete Transistor Computing (1954-1965)
The transistor (whose name comes from "transfer resistor") radically changed the computing landscape. It had the advantage of being much smaller than a vacuum tube, it ran with a fraction of the heat and a fraction of the power, and it was highly reliable in the long term. The DEC PDP-1 was an early transistor-based computer system. Contemporaries of that system included the IBM 7094 scientific computer, the IBM 1401 business computer, and the UNIVAC 1100, among many other examples.

Rapid Evolution
Just as the vacuum tube was a major step forward from electromechanical computing, the transistor was an equally huge step forward from the vacuum tube. Computers became something a medium-sized business could now own. This was, however, still only the start.

Third Generation: Integrated Circuit Computing (1965-1980)
Representative machines: the DEC PDP-8, IBM System/360, DEC PDP-11, and the Cray-1 supercomputer.

Integrated Circuits
- Allowed multiple transistors to be placed on a single wafer of silicon.
- Further reduced the size of components inside the computer; continued to reduce heat, and enhanced speed and reliability.
- Allowed the use of standardized chip components, making computers more modular in design.
- Small businesses could now afford computers.

Fourth Generation: VLSI Computing (1980 onward)
With the introduction of the Intel 4004, the era of Very Large Scale Integrated (VLSI) circuits began. VLSI technology made possible the creation of the microprocessor, which is the heart of the personal computer. For the first time in human history, ordinary individuals could own their own computers. Personal computing really began when later versions of the Intel chip, such as the 8080, 8086, and 8088, were released; the Zilog Z-80 processor and the MOS Technology 6502 and 6510 processors were contemporaries. A high-end processor of the era was the Motorola MC68000.

Interesting fact: the Commodore 64 used the 6510 processor. However, its external disc drives used 6502 processors, and it was possible to write programs for the disc drives!

How Small? Moore's Law (1965)
Gordon Moore, Intel co-founder: "The density of transistors in an integrated circuit will double every year." Contemporary version: "The density of silicon chips doubles every 18 months." But this "law" cannot hold forever; there is a physical limit to how narrow 'wires' can become.

How Small? Rock's Law
Arthur Rock, Intel financier: "The cost of capital equipment to build semiconductors will double every 4 years." In 1968, a new chip plant cost about $12,000. At the time, $12,000 would buy a nice home in the suburbs, and an executive earning $12,000 per year was "making a very comfortable living." In 2012, chip plants under construction cost well over $5 billion; that is more than the gross domestic product of some small countries, including Barbados, Mauritania, and Rwanda. For Moore's Law to hold, Rock's Law must fall, or vice versa; but no one can say which will give out first. (A small sketch of the arithmetic behind both laws follows below.)
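To make the two doubling rates concrete, here is a minimal Python sketch. It is not from the lecture: the 1968 base year, the 2012 horizon, and the doubling periods are the lecture's stated figures, while the function and variable names are illustrative.

    # Moore's Law and Rock's Law are both exponential doublings,
    # just with different doubling periods.

    def growth_factor(years, period):
        """Multiplicative growth after `years`, doubling every `period` years."""
        return 2.0 ** (years / period)

    MOORE_PERIOD = 1.5  # years per doubling of chip density (contemporary version)
    ROCK_PERIOD = 4.0   # years per doubling of fabrication-plant cost

    base_year, horizon = 1968, 2012   # span mentioned in the lecture
    years = horizon - base_year       # 44 years

    print(f"Density growth factor:    {growth_factor(years, MOORE_PERIOD):.3g}")
    print(f"Plant-cost growth factor: {growth_factor(years, ROCK_PERIOD):.3g}")
    # Density grows about 2^(44/1.5), roughly 7e8 times, while plant cost
    # grows 2^(44/4) = 2048 times; both exponentials eventually collide
    # with physical and economic limits, which is the tension the lecture
    # describes between the two "laws".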
Layers
Computers are more than just a collection of hardware in the form of chips. A modern computer is a software-defined multipurpose tool; what it can do depends on what software you use. There are many approaches to writing software; one of them is "divide and conquer," where complex software problems are divided into smaller modules. Each module solves a smaller problem, and the larger complex problem can be thought of as a collection of smaller problems.

Complex computer systems are composed of a series of virtual machine layers. Each virtual machine layer can be viewed as an abstraction of the layer below it. The machines at each level execute their own unique instruction sets, relying on lower-level machines to perform tasks that accomplish these instructions. Layer after layer, the instructions are translated and retranslated until they end up as digital logic circuits (two short sketches at the end of this section illustrate the translation between levels and the gate logic at the bottom). If you have already taken CS1610, you know what these are; if not, we will have a brief overview, and some reference material to enhance your understanding. This hierarchical structure is also seen in corporations, military groups, and so on.

Level-by-Level Overview
We'll take a look at this six-level structure and describe the principal characteristics of each level.

Level 6: USER
- Program execution and user interface.
- This is the level of the average computer user.
- This is the topmost layer, which presents the "real-world" face of the computer system.

Level 5: HIGH-LEVEL LANGUAGE
- At this level, we interact with the computer by writing programs in high-level (human-language-like) computer languages, such as Java, C++, Delphi, Lisp, and PHP.

Level 4: ASSEMBLY LANGUAGE
- The Level 5 compiler or interpreter produces assembly language instructions.
- It is also possible for programmers to program in assembly language (which you will shortly see...).
- At this level, the instructions closely resemble the actual actions which the processor can take.

Level 3: SYSTEM SOFTWARE
- The operating system controls the processes executing on the computer system, and allocates and protects system resources.
- Assembly language instructions often pass through this level without modification.

Level 2: MACHINE
- This is the ISA (instruction set architecture) level.
- Consists of instructions specific to the architecture of the machine.
- Programs written in machine language don't require compilers, interpreters, or assemblers; they're already in binary.

Level 1: CONTROL
- The control unit decodes and executes instructions, and moves data within the system.
- The control unit can be microprogrammed or hardwired.
- A microprogram is a program written in a low-level language and implemented by the hardware; hardwired control units consist solely of hardware, and directly execute machine instructions.

Level 0: DIGITAL LOGIC
- This is the hardware level, composed of chips containing digital circuits.
- These circuits are composed of gates and wires (which are usually microscopic).
- The gates and wires implement the mathematically-based logic of all other levels.

What's to Come?
- von Neumann architecture: understanding where the modern computer came from.
- Non-von Neumann architectures: some modern alternatives.
- Data representation: how is information represented, stored, and manipulated inside the computer?
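As a rough analogy for this layer-by-layer translation (this example is mine, not the lecture's), Python's standard dis module shows a high-level function rewritten into the lower-level instructions of the Python virtual machine, much as a Level 5 program is translated toward Levels 4 and 2:

    import dis

    # A high-level function (analogous to Level 5: HIGH-LEVEL LANGUAGE).
    def area(width, height):
        return width * height

    # Disassemble it into the stack-machine instructions that the Python
    # virtual machine executes (analogous to translation toward lower levels).
    dis.dis(area)
    # The output lists opcodes such as LOAD_FAST followed by a multiply
    # instruction (exact opcode names vary between Python versions); the
    # interpreter, in turn, carries these out using native machine code.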
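And at the very bottom, Level 0 is just gates and wires. The half adder below is a standard digital logic circuit used here purely as an illustration (it is not defined in this lecture); the sketch models gates as one-bit functions:

    # Model logic gates as functions on bits (0 or 1).
    def AND(a, b): return a & b
    def XOR(a, b): return a ^ b

    def half_adder(a, b):
        """Add two 1-bit values; return (sum_bit, carry_bit)."""
        return XOR(a, b), AND(a, b)

    # The truth table falls straight out of the gate logic:
    for a in (0, 1):
        for b in (0, 1):
            s, c = half_adder(a, b)
            print(f"{a} + {b} -> sum={s}, carry={c}")

Real chips implement exactly this kind of Boolean logic in silicon, which is what lets every higher layer ultimately run on Level 0.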