




Lesson Proper for Week 1 – Introduction to Software Development

Software

In the early years of the computer era, software was viewed by many as an afterthought. Computer programming was a "seat-of-the-pants" art for which few systematic methods existed. Software development was virtually unmanaged: only when schedules slipped or costs began to rise did programmers scramble to make things right, and with heroic effort they often succeeded.

Throughout the early years, general-purpose hardware became commonplace. Software, on the other hand, was custom-designed for each application and had a relatively limited distribution. Product software (i.e., programs developed to be sold to one or more customers) was just beginning; most software was developed and ultimately used by the same person or organization. In the early years there was much to learn about the implementation of computer-based systems, but relatively little about computer system engineering. Still, many outstanding computer-based systems were developed during this era; some remain in use today and stand as landmark achievements that continue to justify admiration.

The second era of computer system evolution covered the decade from the mid-1960s to the late 1970s. Multiprogramming and multi-user systems introduced new concepts of human-machine interaction. Interactive techniques opened a new world of applications and new levels of hardware and software sophistication. Real-time systems could collect, analyze, and transform data from multiple sources, controlling processes and producing output in milliseconds instead of minutes. Advances in online storage led to the first generation of database management systems. The second era was also characterized by the use of product software and the arrival of "software houses": software was developed for wide-range distribution in a multi-disciplinary market.
Programs for mainframes and minicomputers were distributed to hundreds and sometimes thousands of users.

The third era of computer system evolution started in the mid-1970s and spanned more than a full decade. The distributed system (multiple computers, each performing functions concurrently and communicating with one another) greatly increased the complexity of computer-based systems. Global and local area networks, high-bandwidth digital communications, and increasing demands for "instantaneous" data access put heavy demands on software developers. Personal use was still rare. The conclusion of the third era was characterized by the arrival and widespread use of microprocessors. The microprocessor has produced a wide array of intelligent products, from automobiles to microwave ovens, from industrial robots to blood serum diagnostic equipment, but none has been more important than the personal computer. In less than a decade, computers became readily accessible to the public at large.

The fourth era of computer system evolution moves us away from individual computers and computer programs and toward the collective impact of computers and software. Powerful desktop machines controlled by sophisticated operating systems, networked locally and globally, and coupled with advanced software applications have become the norm. Computing architectures are rapidly changing from centralized mainframe environments to decentralized client/server environments.

Software-Related Problems

The following software-related problems persisted throughout the evolution of computer-based systems:

1. Hardware advances continue to outpace our ability to build software that taps hardware's potential.
2. The ability to build new programs cannot keep pace with the demand for new programs, nor can programs be built rapidly enough to meet business and market needs.
3. The widespread use of computers has made society increasingly dependent on the reliable operation of software. Enormous economic damage and potential human suffering can occur when software fails.
4. It remains a struggle to build computer software that has high reliability and quality.
5. The ability to support and enhance existing programs is threatened by poor design and inadequate resources.

Figure 1.1 Software Components

An important characteristic of a high-quality software component is its reusability. A software component should be designed and implemented so that it can be reused in many different programs. In the 1960s, scientific subroutine libraries were built that were reusable in a broad array of engineering and scientific applications. These subroutine libraries reused well-defined algorithms effectively, but they had a limited domain of application. Modern reusable components encapsulate both data and the processing applied to that data, enabling the software engineer to create new applications from reusable parts. For instance, today's interactive interfaces are built using reusable components that enable the creation of graphics windows, pull-down menus, and a wide variety of interaction mechanisms. The data structures and processing detail required to build the interface are contained within a library of reusable components for interface construction.

Software components are built using a programming language that has a limited vocabulary, an explicitly defined grammar, and well-formed rules of syntax and semantics. At the lowest level, the language represents the instruction set of the hardware. At mid-level, programming languages such as Ada 95, C, or Smalltalk are used to create a procedural description of the program. At the highest level, the language uses graphical icons or other symbols to represent the requirements for a solution, and executable instructions are generated automatically.

Machine-level language is a symbolic representation of the CPU instruction set. When a good software developer produces a maintainable, well-documented program, machine-level language can make extremely efficient use of memory and "optimize" program execution speed. Mid-level languages allow the software developer and the program to be machine-independent. With a more sophisticated translator, the vocabulary, grammar, syntax, and semantics of a mid-level language can be much more sophisticated than those of machine-level languages.

Machine code, assembly languages, and mid-level programming languages are often referred to as the first three generations of computer languages. With any of these languages, the programmer must be concerned both with the specification of the information structure and with the control of the program itself; languages in the first three generations are therefore termed procedural languages. Fourth-generation languages, also called nonprocedural languages, move the software developer even further from the computer hardware. Instead of requiring the developer to specify procedural detail, a nonprocedural language implies a program by "specifying the desired result, rather than specifying the action required to achieve that result."

Software Characteristics

When hardware is built, the human creative process (analysis, design, construction, testing) is ultimately translated into a physical form. If a new computer is built, initial sketches, formal design drawings, and breadboarded prototypes evolve into a physical product (VLSI chips, circuit boards, power supplies, etc.). Software is a logical rather than a physical system element.
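The procedural/nonprocedural contrast drawn above can be made concrete with a small Python sketch; the data and the table name are invented for illustration. The first half spells out *how* to compute a total, step by step; the second states only *what* result is wanted, as a SQL query, and lets the database engine decide the procedure.

```python
# Hypothetical example: totaling order amounts two ways.
import sqlite3

orders = [("alice", 120.0), ("bob", 75.5), ("alice", 30.0)]

# Procedural (third-generation) style: specify each step explicitly.
total = 0.0
for _customer, amount in orders:
    total += amount

# Nonprocedural (fourth-generation) style: specify the desired result
# and let the SQL engine choose how to achieve it.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (customer TEXT, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)", orders)
(declarative_total,) = conn.execute("SELECT SUM(amount) FROM orders").fetchone()

assert total == declarative_total == 225.5
```

Both halves compute the same number; the difference is who is responsible for the procedural detail, the programmer or the language system.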
Because it is a logical element, software has characteristics that differ considerably from those of hardware:

1. Software is developed or engineered; it is not manufactured in the classical sense. Software costs are concentrated in engineering, which means that software projects cannot be managed as if they were manufacturing projects. In the mid-1980s, the concept of the "software factory" was introduced in the literature. This term does not imply that hardware manufacturing and software development are equivalent; rather, the software factory concept recommends the use of automated tools for software development.

2. Software doesn't "wear out."

Figure 1.2 Failure Curve for Hardware

Figure 1.2 depicts failure rate as a function of time for hardware. The relationship, often called the "bathtub curve," indicates that hardware exhibits relatively high failure rates early in its life (these failures are often attributable to design or manufacturing defects); the defects are corrected, and the failure rate drops to a steady-state level for some period of time. As time passes, however, the failure rate rises again as hardware components suffer from the cumulative effects of dust, vibration, abuse, temperature extremes, and many other environmental maladies. Stated simply, the hardware begins to wear out.

Figure 1.3 Failure Rate Curve for Software (idealized)

Software is not susceptible to the environmental maladies that cause hardware to wear out. In theory, therefore, the failure rate curve for software should take the form shown in Figure 1.3: undiscovered defects cause high failure rates early in the life of a program, but these are corrected and the curve flattens. The figure is a gross oversimplification of actual failure models for software, but the implication is clear: software doesn't wear out. It does, however, deteriorate.

Figure 1.4 Actual Failure Curve for Software

During its life, software will undergo change (maintenance).
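The failure-rate behaviour sketched in Figures 1.2-1.4 can be imitated with a toy numerical model; the function names, decay rates, and change times below are invented for illustration, not taken from the lesson. The idealized curve decays toward a steady state, while each maintenance change adds a decaying spike and nudges the minimum failure rate upward.

```python
# Toy model (illustrative constants) of the idealized software failure
# curve (Figure 1.3) and the deterioration-through-change effect (Figure 1.4).

def idealized_failure_rate(t, initial=1.0, steady=0.1, decay=0.5):
    """Early defects are found and fixed, so the rate decays to a steady state."""
    return steady + (initial - steady) * (decay ** t)

def actual_failure_rate(t, changes=(5, 10, 15), spike=0.4, drift=0.05):
    """Each maintenance change adds a decaying spike and raises the minimum."""
    rate = idealized_failure_rate(t)
    for change_time in changes:
        if t >= change_time:
            rate += drift                                # rising floor: deterioration
            rate += spike * (0.5 ** (t - change_time))   # spike that decays again
    return rate

# The idealized curve flattens near its steady state...
assert abs(idealized_failure_rate(20) - 0.1) < 1e-4
# ...while the maintained program's failure rate floor has crept upward.
assert actual_failure_rate(20) > idealized_failure_rate(20)
```

The point of the model is only qualitative: repeated change is what keeps the real curve from ever settling back to the idealized steady state.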
As changes are made, it is likely that some new defects will be introduced, causing the failure rate curve to spike as shown in Figure 1.4. Before the curve can return to the original steady-state failure rate, another change is requested, causing the curve to spike again. Slowly, the minimum failure rate level begins to rise: the software is deteriorating due to change.

Another aspect of wear illustrates the difference between hardware and software. When a hardware component wears out, it is replaced by a spare part; there are no software spare parts. Every software failure indicates an error in design or in the process through which the design was translated into machine-executable code. Therefore, software maintenance involves considerably more complexity than hardware maintenance.

3. Most software is custom-built rather than assembled from existing components. Consider the manner in which the control hardware for a microprocessor-based product is designed and built. The design engineer draws a simple schematic of the digital circuitry, does some fundamental analysis to ensure that proper function will be achieved, and then refers to a catalog of digital components. Each integrated circuit (often called an "IC" or a "chip") has a part number, a defined and validated function, a well-defined interface, and a standard set of integration guidelines. After each component is selected, it can be ordered off the shelf.

Software Applications

It is rather hard to develop meaningful generic categories for software applications; as the complexity of software grows, neat compartmentalization disappears. The following software areas indicate the scope of potential applications.

System Software. This is a collection of programs written to service other programs. Some system software (e.g., compilers, editors, and file management utilities) processes complex but determinate information structures. Other system applications (e.g., operating system components, drivers, telecommunications processors) process largely indeterminate data. In either case, the system software area is characterized by heavy interaction with computer hardware; heavy usage by multiple users; concurrent operation that requires scheduling, resource sharing, and sophisticated process management; complex data structures; and multiple external interfaces.

Real-Time Software. Programs that monitor/analyze/control real-world events as they occur are called real-time software. Elements of real-time software include a data-gathering component that collects and formats information from an external environment, an analysis component that transforms information as required by the application, a control/output component that responds to the external environment, and a monitoring component that coordinates all other components so that real-time response can be maintained. Note that "real-time" differs from "interactive" or "timesharing": a real-time system must respond within strict time constraints, whereas the response time of an interactive system can normally be exceeded without disastrous results.

Business Software. Business information processing is the largest single software application area. Discrete "systems" (e.g., payroll, accounts receivable/payable, inventory) have evolved into management information system (MIS) software that accesses one or more large databases containing business information. Applications in this area restructure existing data in a way that facilitates business operations or management decision making. In addition to conventional data processing, business software applications also encompass interactive and client/server computing (e.g., point-of-sale transaction processing).

Engineering and Scientific Software. These applications have been characterized by "number crunching" algorithms. Applications range from astronomy to volcanology, from automotive stress analysis to space shuttle orbital dynamics, and from molecular biology to automated manufacturing. However, new applications within the engineering/scientific area are moving away from conventional numerical algorithms: computer-aided design, system simulation, and other interactive applications have begun to take on real-time and even system software characteristics.

Embedded Software. Intelligent products have become commonplace in nearly every consumer and industrial market. Embedded software resides in read-only memory and is used to control products and systems for consumer and industrial markets. It can perform very limited and esoteric functions (e.g., keypad control for a microwave oven) or provide significant function and control capability (e.g., digital functions in an automobile such as fuel control, dashboard displays, and braking systems).

Personal Computer Software. The personal computer software market has multiplied over the past decade. Word processing, spreadsheets, computer graphics, multimedia, entertainment, database management, personal and business financial applications, and external network or database access are only a few of its hundreds of applications.

Artificial Intelligence Software. This software makes use of non-numerical algorithms to solve complex problems that are not amenable to computation or straightforward analysis. An active AI area is expert systems, also called knowledge-based systems. Other application areas for AI software include pattern recognition (image and voice), theorem proving, and game playing.
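As a closing illustration, the four real-time software elements described earlier (data gathering, analysis, control/output, and monitoring) can be sketched as a tiny Python pipeline. All names and thresholds here are invented; the point is only the shape of the architecture and the deadline check that distinguishes real-time from merely interactive software.

```python
# Illustrative sketch of a real-time pipeline: gather -> analyze -> control,
# coordinated by a monitor that enforces a response deadline.
import time

def gather(raw):                      # data-gathering component
    return {"reading": float(raw)}

def analyze(sample):                  # analysis component
    sample["too_hot"] = sample["reading"] > 100.0
    return sample

def control(sample):                  # control/output component
    return "SHUT_DOWN" if sample["too_hot"] else "OK"

def monitor(raw, deadline_s=0.05):    # monitoring component
    start = time.monotonic()
    action = control(analyze(gather(raw)))
    elapsed = time.monotonic() - start
    if elapsed > deadline_s:          # a real-time system must not miss this
        raise TimeoutError("missed real-time deadline")
    return action

print(monitor("120"))   # → SHUT_DOWN
print(monitor("25"))    # → OK
```

An interactive system could simply return the answer late; the explicit deadline check is what makes the monitoring component "real-time" in the sense used above.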
