CS 427 Software Engineering
Summary
This document provides an overview of software engineering concepts, including the evolution of software engineering, key concepts, and software processes. It covers different software development models, project management phases, and considerations for selecting a development process. It also discusses factors to consider when planning and tracking projects.
Week 1: The Evolution of Software Engineering

1960s:
○ Batch Processing: Early method of executing computer programs.
○ Time Sharing: Allowed multiple users to interact with a computer simultaneously.
NATO Software Engineering Conference (1968):
○ Defined the field of software engineering.
○ Addressed the "software crisis": the difficulty of developing and maintaining software.

Key Concepts in Software Engineering
Information Hiding: Encapsulating design decisions within modules to reduce complexity.
Software Characteristics:
○ Malleable: Easily modified and adapted.
○ Complex: No two parts are identical.
○ Useful: Solves real-world problems.
○ Evolving: Constantly changing due to new requirements and technologies.
Software Quality Attributes:
○ Size, human interaction, stability, reliability, security, portability, cost.

The Software Project Trinity
Balancing three competing concerns:
System: High quality and performance.
Users: Positive user experience.
Development: Efficient and productive development process.

Software Engineer Roles and Responsibilities
Users: Desire quality features and functionality.
Customers: Seek cost-effective solutions.
Developers: Aim for shorter development times.
Other roles include: designers, testers, and support engineers.

Qualities of a Good Software Engineer:
Continuous Improvement: Always seeking to learn and grow.
Open-Mindedness: Willing to consider different approaches.
Effective Decision-Making: Able to make sound judgments in complex situations.
Honesty and Integrity: Trustworthy and transparent.
Attention to Detail: Meticulous in code and problem-solving.
Strong Communication Skills: Able to collaborate effectively with team members.

The Software Engineering Process
A systematic, disciplined, and quantifiable approach:
1. Requirements Gathering: Clearly define the software's purpose and functionality.
2. Design: Break the system down into smaller modules and define their interactions.
3. Implementation: Write the code for each module.
4. Integration: Combine the modules and test the overall system.
5. Maintenance: Update and improve the software to meet changing needs.

Week 2: Software Process

A software process is a framework for organizing tasks to build high-quality software.
Agile: Iterative and incremental development, focused on flexibility and customer collaboration.
○ Scrum: A popular Agile framework with time-boxed iterations (sprints).
Plan-Driven/Formal: Sequential process with detailed planning and documentation.
○ Waterfall: A linear model with distinct phases.
Distributed/Open Source: Collaborative development involving a large, distributed community.
○ Bazaar: A model emphasizing open collaboration and decentralized development.

Project Management Phases
1. Pre-Development:
○ Exploration: Defining project goals and scope.
○ Allocation: Assigning resources and creating a project plan.
○ Importation: Acquiring necessary tools and infrastructure.
2. Development:
○ Requirements: Gathering and documenting user needs.
○ Design: Creating the software architecture and detailed design.
○ Implementation: Writing the code.
3. Post-Development:
○ Installation: Deploying the software.
○ Operation/Support: Providing ongoing support and maintenance.
○ Maintenance: Fixing bugs and adding new features.
○ Retirement: Phasing out the software.

Software Development Models
Waterfall Model: Sequential phases with limited flexibility.
Spiral Model: Iterative model emphasizing risk management.
Rational Unified Process (RUP): A comprehensive framework with phases and disciplines.
Agile Models: Iterative and incremental, focusing on flexibility and customer collaboration.
○ Extreme Programming (XP): A specific Agile methodology with practices like pair programming, test-driven development, and continuous integration.

Choosing a Process
Factors to consider:
Project Size: Smaller projects may benefit from Agile, while larger projects might require a more structured, plan-driven approach.
Criticality: Highly critical systems may require a more rigorous, plan-driven approach.
Dynamism: Agile is well suited to projects with changing requirements or technology.
Team Skills: Agile teams require strong communication and collaboration skills.
Organizational Culture: Agile may be more difficult to adopt in organizations with a strongly hierarchical culture.

Project Planning and Tracking
Project Velocity: A measure of a team's productivity.
Planning Game: A collaborative technique for estimating and prioritizing user stories.
Story Points: A unit of measure for estimating the size of user stories.
PERT Charts: Visual representations of project tasks and their dependencies.
Gantt Charts: Visual representations of project timelines.
Task dependency types (see the sketch at the end of this section):
Start-to-Start: Task B cannot start until Task A starts.
○ Example: Writing module tests cannot start until coding of the module has started.
Finish-to-Finish: Task B cannot finish until Task A finishes.
○ Example: Testing a module cannot finish until coding of the module is complete.
Risk Management: Identifying, analyzing, prioritizing, planning, mitigating, and monitoring risks.
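A minimal sketch of the forward pass that a PERT or Gantt tool performs: given task durations and dependencies, it computes the earliest start and finish time of each task. This is not from the lecture notes; the class, method, and task names are illustrative, and the code assumes tasks are supplied in dependency order.

    import java.util.*;

    public class ScheduleSketch {

        enum DependencyType { FINISH_TO_START, START_TO_START, FINISH_TO_FINISH }

        static class Task {
            final String name;
            final int duration;                 // e.g., in days
            int earliestStart, earliestFinish;
            final List<Task> predecessors = new ArrayList<>();
            final List<DependencyType> types = new ArrayList<>();

            Task(String name, int duration) { this.name = name; this.duration = duration; }

            void dependsOn(Task predecessor, DependencyType type) {
                predecessors.add(predecessor);
                types.add(type);
            }
        }

        // Forward pass: assumes tasks are given in dependency (topological) order.
        static void forwardPass(List<Task> tasks) {
            for (Task t : tasks) {
                int start = 0;
                for (int i = 0; i < t.predecessors.size(); i++) {
                    Task p = t.predecessors.get(i);
                    switch (t.types.get(i)) {
                        case FINISH_TO_START:  start = Math.max(start, p.earliestFinish); break;
                        case START_TO_START:   start = Math.max(start, p.earliestStart); break;
                        case FINISH_TO_FINISH: start = Math.max(start, p.earliestFinish - t.duration); break;
                    }
                }
                t.earliestStart = start;
                t.earliestFinish = start + t.duration;
            }
        }

        public static void main(String[] args) {
            Task design = new Task("Design", 5);
            Task coding = new Task("Coding", 10);
            Task testing = new Task("Testing", 4);
            coding.dependsOn(design, DependencyType.FINISH_TO_START);   // coding starts after design finishes
            testing.dependsOn(coding, DependencyType.START_TO_START);   // testing may start once coding starts
            testing.dependsOn(coding, DependencyType.FINISH_TO_FINISH); // but cannot finish before coding finishes

            List<Task> tasks = List.of(design, coding, testing);
            forwardPass(tasks);
            for (Task t : tasks) {
                System.out.println(t.name + ": start day " + t.earliestStart + ", finish day " + t.earliestFinish);
            }
        }
    }

Running this prints Design 0..5, Coding 5..15, and Testing 11..15: the finish-to-finish constraint pushes testing's earliest start later than the start-to-start constraint alone would.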
Additional Tips
Continuous Learning: Stay up to date with the latest software engineering trends and practices.
Effective Communication: Clearly communicate with team members and stakeholders.
Collaboration: Work effectively with others to achieve shared goals.
Problem-Solving: Approach challenges with a logical and creative mindset.
Attention to Detail: Pay attention to the small details that can make a big difference.

Week 3: Requirements Engineering

Understanding Requirements
Functional Requirements: Specify the system's behavior and functionality.
Non-Functional Requirements: Specify constraints on the system's behavior, such as performance, security, and usability.

Requirement Elicitation Techniques
Traditional Techniques: Interviews, surveys, and questionnaires.
Group Elicitation Techniques: Brainstorming, focus groups, and workshops.
Prototyping: Creating early versions of the system to gather feedback.
Model-Driven Techniques: Using models to visualize and analyze requirements.
Cognitive Techniques: Analyzing user behavior and mental processes.
Contextual Inquiry: Observing users in their natural environment.

Use Case Diagrams
Actors: Entities that interact with the system.
Use Cases: Specific functionalities or services provided by the system.
Relationships:
○ Association: A relationship between an actor and a use case.
○ Include: A relationship between two use cases, where one use case includes the behavior of another.
○ Extend: A relationship between two use cases, where one use case can extend the behavior of another under specific conditions.

Use Case Scenarios
Basic Flow: The primary sequence of events in a use case.
Alternative Flows: Other possible sequences of events, including error conditions and exceptions.

Requirements Validation
Quality Factors: Completeness, consistency, feasibility, modifiability, etc.
Common Flaws: Omissions, contradictions, infeasibility, poor modifiability, etc.

Additional Tips for Effective Requirements Engineering
Involve Stakeholders: Actively involve stakeholders throughout the requirements process.
Prioritize Requirements: Identify and prioritize the most critical requirements.
Use Clear and Concise Language: Avoid ambiguity and technical jargon.
Review and Validate Requirements: Regularly review and validate requirements to ensure accuracy and completeness.
Use Effective Tools: Utilize tools like requirements management software to organize and track requirements.
Iterative Approach: Consider an iterative approach to requirements gathering and refinement.

Week 4: Software Architecture

Abstraction: Simplifying complex systems by focusing on essential details.
○ Procedural abstraction: Naming sequences of instructions.
○ Data abstraction: Naming collections of data.
○ Control abstraction: Using control structures without exposing their implementation details.
Functional Independence:
○ Coupling: Degree of interdependence between modules.
○ Cohesion: Degree of relatedness within a module.
○ Good designs aim for low coupling and high cohesion.
Modularity: Breaking down a system into smaller, independent modules.
Conway's Law: The structure of a system mirrors the communication structure of the organization that built it.

Architectural Styles
Layer/Tier Style: A layered architecture in which each layer provides services to the layer above it.
Pipe and Filter Style: A linear architecture in which components (filters) process data and pass it to the next component through pipes (see the sketch at the end of Week 4).
Client-Server Style: A distributed architecture in which clients request services from servers.

Architectural Patterns
Model-View-Controller (MVC):
○ Model: Represents the data and business logic.
○ View: Displays the information to the user.
○ Controller: Handles user input and updates the model and view.

Architectural Representations
Static View: Shows the structural elements of a system.
Dynamic View: Shows the behavioral aspects of a system.
Deployment View: Shows the physical deployment of the system.

Software Architect's Role
Understanding stakeholder requirements.
Designing and developing the system architecture.
Collaborating with development teams.
Ensuring the system meets quality and performance standards.
Documenting the architecture.
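A minimal sketch of the pipe-and-filter style mentioned above: each filter is a function from input to output, and the pipeline chains filters so the output of one becomes the input of the next. The class and filter names are illustrative assumptions, not part of the course material.

    import java.util.List;
    import java.util.function.Function;

    public class PipeAndFilterSketch {

        // Compose a list of filters into a single end-to-end pipeline.
        static Function<String, String> pipeline(List<Function<String, String>> filters) {
            Function<String, String> result = Function.identity();
            for (Function<String, String> filter : filters) {
                result = result.andThen(filter);   // output of one filter feeds the next
            }
            return result;
        }

        public static void main(String[] args) {
            Function<String, String> trim = String::trim;
            Function<String, String> lowercase = String::toLowerCase;
            Function<String, String> collapseSpaces = s -> s.replaceAll("\\s+", " ");

            Function<String, String> textCleaner = pipeline(List.of(trim, lowercase, collapseSpaces));
            System.out.println(textCleaner.apply("  Hello   WORLD  "));  // prints "hello world"
        }
    }

Because each filter only knows about its own input and output, filters can be reordered, replaced, or reused independently, which is the low-coupling benefit the style aims for.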
Week 5: Object-Oriented Design

Occurs before implementation in the software life cycle.
Focuses on mid-level implementation details and low-level requirements.
Utilizes UML class diagrams.

Class Diagrams
Analysis (Conceptual Model):
○ Represents the problem, not the solution.
○ Can include actors outside the system.
Design (Specification):
○ Depicts the structure of how the system will be written.
Implementation:
○ Shows the actual classes of the implementation.
○ Return types are not mandatory.

Elements of Class Diagrams
Class:
○ Consists of attributes and operations; each attribute has a type.
○ Operations have signatures (only the name is mandatory).
○ Compartment order: name, attributes, operations.
Association:
○ Represents relationships between classes; includes multiplicity and direction.
○ A solid arrow indicates a "has a" relationship.
○ Example: Claim has a plan/provider.
Multiplicity:
○ 0..1: zero or one instance
○ *: zero or more instances
○ 1: exactly one instance
○ 1..*: one or more instances
Generalization:
○ Represented by a hollow (triangular) arrowhead pointing at the superclass.
○ Indicates an "is a" relationship.
○ Example: Fax image is a claim image.
Aggregation:
○ A stronger association representing a whole-part relationship; indicated by a white diamond on the whole.
○ Example: Department contains a set of employees.
○ A setter method can check that the part is not null.
Composition:
○ A stronger form of aggregation; indicated by a black diamond on the whole.
○ Example: Invoice and invoice line.
○ No such checking; the part lives or dies with the whole (see the Java sketch at the end of this week's notes).
Abstract Class:
○ Name shown in italics.
○ Superclasses with abstract methods, implemented by concrete subclasses.
Interface:
○ Represents a portion of visible behavior for a set of objects.
○ Has no instance variables or method implementations.
○ Denoted by the «interface» keyword; connected to implementing classes with dashed lines and hollow arrowheads.
Comments:
○ Represented as folded notes, connected to methods or classes with dashed lines.
Mapping:
○ Nouns are mapped to classes; verbs are mapped to operations.

Sequence Diagram
Models how objects communicate via a sequence of events for a single scenario.
Components:
○ Objects: represented as columns.
○ Messages: represented as arrows.
○ Time: represented in the vertical dimension.
○ Activations: represented as narrow rectangles.
Events are ordered by triggering time.
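A minimal Java sketch (assumed, not from the notes) of how the aggregation and composition examples above might map to code: in aggregation, the Department refers to Employee objects that exist independently and a setter-style method guards against null; in composition, the Invoice creates and owns its InvoiceLine parts, so the lines live and die with the invoice. The class bodies are illustrative; only the class names follow the examples in the notes.

    import java.util.ArrayList;
    import java.util.List;
    import java.util.Objects;

    class Employee {
        private final String name;
        Employee(String name) { this.name = name; }
        String getName() { return name; }
    }

    // Aggregation: Department refers to Employee objects that exist on their own.
    class Department {
        private final List<Employee> employees = new ArrayList<>();

        // The adder checks that the part is not null, as the notes describe.
        void addEmployee(Employee employee) {
            employees.add(Objects.requireNonNull(employee, "employee must not be null"));
        }
        List<Employee> getEmployees() { return employees; }
    }

    // Composition: Invoice creates and owns its InvoiceLine parts; the lines are
    // never handed in from outside, so they live and die with the invoice.
    class Invoice {
        private final List<InvoiceLine> lines = new ArrayList<>();

        void addLine(String description, double amount) {
            lines.add(new InvoiceLine(description, amount));  // part created by the whole
        }
        double total() {
            return lines.stream().mapToDouble(InvoiceLine::getAmount).sum();
        }

        // Nested part class: only Invoice can construct InvoiceLine objects.
        static class InvoiceLine {
            private final String description;
            private final double amount;
            private InvoiceLine(String description, double amount) {
                this.description = description;
                this.amount = amount;
            }
            double getAmount() { return amount; }
        }
    }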
Week 6: Design Patterns

Design patterns are reusable solutions to common software design problems. They provide a vocabulary and framework for discussing design issues and solutions.

Types of Design Patterns:
1. Creational Patterns: Concerned with object creation mechanisms, promoting flexibility and reusability.
2. Structural Patterns: Deal with object composition and relationships, focusing on class and object structures.
3. Behavioral Patterns: Address how objects interact and communicate, optimizing algorithms and responsibilities.

Key Behavioral Design Patterns:
1. Observer Pattern
Purpose: Defines a one-to-many dependency between objects, so that when one object changes state, all its dependents are notified and updated automatically (see the Java sketch at the end of this week's notes).
Key Components:
○ Subject: Maintains a list of observers and notifies them when its state changes.
○ Observer: Defines an update method to be called by the subject.
Use Cases: Event-driven systems, model-view-controller (MVC) architectures.
2. Composite Pattern
Purpose: Composes objects into tree structures to represent part-whole hierarchies.
Key Components:
○ Component: Defines the interface for objects in the composition.
○ Leaf: Represents leaf nodes in the hierarchy.
○ Composite: Represents composite nodes in the hierarchy, containing child components.
Use Cases: File systems, GUI components.
3. Interpreter Pattern
Purpose: Defines a grammatical representation for a language and interprets sentences in the language.
Key Components:
○ Context: Encapsulates the semantic information associated with a particular syntactic structure.
○ Abstract Expression: Declares an interpret operation.
○ Terminal Expression: Implements the interpret operation for terminal symbols.
○ Nonterminal Expression: Implements the interpret operation for non-terminal symbols.
Use Cases: Language interpreters, compilers.
4. Visitor Pattern
Purpose: Separates an algorithm from the object structure on which it operates.
Key Components:
○ Element: Defines an accept operation.
○ Visitor: Defines a visit operation for each concrete element.
Use Cases: Compilers, syntax analyzers.
5. Template Method Pattern
Purpose: Defines the skeleton of an algorithm in an operation, deferring some steps to subclasses.
Key Components:
○ Abstract Class: Defines the template method and abstract operations.
○ Concrete Classes: Implement the abstract operations.
Use Cases: Framework design, code generation.
6. Iterator Pattern
Purpose: Provides a way to access the elements of an aggregate object sequentially without exposing its underlying representation.
Key Components:
○ Iterator: Defines an interface for accessing elements.
○ ConcreteIterator: Implements the iterator interface.
○ Aggregate: Defines an interface for creating an iterator.
○ ConcreteAggregate: Implements the aggregate interface and creates concrete iterators.
Use Cases: Collection classes, data structures.
7. Strategy Pattern
Purpose: Defines a family of algorithms, encapsulates each one, and makes them interchangeable.
Key Components:
○ Context: Uses a strategy interface.
○ Strategy: Defines an interface for algorithms.
○ Concrete Strategies: Implement the strategy interface.
Use Cases: Sorting algorithms, compression algorithms.
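A minimal Observer sketch with the two roles named above: the subject keeps a list of observers and notifies them on every state change, and each observer implements an update method. The TemperatureSensor and Display names are illustrative assumptions, not course material.

    import java.util.ArrayList;
    import java.util.List;

    interface Observer {
        void update(double newValue);   // called by the subject on each state change
    }

    // Subject: keeps a list of observers and notifies them when its state changes.
    class TemperatureSensor {
        private final List<Observer> observers = new ArrayList<>();
        private double temperature;

        void addObserver(Observer observer) { observers.add(observer); }

        void setTemperature(double temperature) {
            this.temperature = temperature;
            for (Observer observer : observers) {
                observer.update(temperature);   // push the new state to every dependent
            }
        }
    }

    class Display implements Observer {
        private final String name;
        Display(String name) { this.name = name; }

        @Override
        public void update(double newValue) {
            System.out.println(name + " shows " + newValue + " degrees");
        }
    }

    class ObserverDemo {
        public static void main(String[] args) {
            TemperatureSensor sensor = new TemperatureSensor();
            sensor.addObserver(new Display("Lobby display"));
            sensor.addObserver(new Display("Office display"));
            sensor.setTemperature(21.5);   // both displays are updated automatically
        }
    }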
Week 7: Testing and Validation

Validation vs. Verification
Validation: Evaluating a system or component during or at the end of the development process to determine whether it satisfies specified requirements.
Verification: Ensuring that a specific phase of the development process has been correctly completed.

Test Cases
Used to improve software quality, measure performance, and learn about the software.
Can be written by developers, quality assurance teams, or users.
Involve providing known inputs and comparing the results to expected outputs (test oracles).
Document faults and code issues.
Often involve complex data and multiple method calls.
Types: fault-based, scenario-based, and random testing.

Automated vs. Manual Testing
Automated Testing:
○ Repetitive execution of tests.
○ Used for incremental software development.
○ Useful for documentation.
Manual Testing:
○ Single execution of tests.
○ Testers may not have in-depth knowledge of the program.
○ Can be costly and time-consuming.

Testing Activities
Test Design: Designing test inputs to satisfy specific objectives. Requires technical knowledge.
Test Automation: Embedding test inputs into scripts. Requires scripting knowledge.
Test Execution: Running test inputs and recording results. Requires minimal knowledge.
Test Evaluation: Evaluating actual results against expected results. Requires domain knowledge.

Fault, Error, and Failure
Fault: A defect or bug in the software.
Error: A difference between a computed result and the true value, caused by a fault.
Failure: An observable deviation from expected behavior. A fault leads to a failure only when three conditions hold:
○ Execution: The fault is executed by the program.
○ Infection: The software enters an erroneous state.
○ Propagation: The erroneous state propagates to cause an observable output.

Black-Box Testing
Testing based on the software's specifications.
Focuses on what the software should do, not how it is implemented.
Techniques:
○ Equivalence Partitioning: Dividing input domains into equivalence classes with similar behavior.
○ Boundary Value Analysis: Testing at the boundaries of input and output ranges.
○ Robustness Testing: Testing the software's ability to handle invalid inputs.

White-Box Testing
Testing based on the internal structure of the software.
Aims to test all logical paths, including methods, statements, conditions, branches, and loops.
Techniques:
○ Statement Coverage: Ensuring that every line of code is executed at least once.
○ Branch Coverage: Ensuring that every branch of a conditional statement is executed.
○ Path Coverage: Ensuring that every possible path through the code is executed.

Code Coverage
A metric used to measure the quality of test suites.
Different types of coverage:
○ Method Coverage: Percentage of methods that are executed.
○ Statement Coverage: Percentage of lines of code that are executed.
○ Branch Coverage: Percentage of branches in conditional statements that are executed.
○ Condition Coverage: Percentage of conditions in conditional statements that are evaluated to both true and false.
○ Multiple Condition Coverage: Percentage of combinations of condition values that are evaluated.
○ Loop Coverage: Percentage of different loop iterations that are executed.
○ Infeasible Path Coverage: Percentage of paths that cannot be executed due to constraints (infeasible paths limit the path coverage that is achievable).

Week 8: Unit Testing

A unit test is a small program with assertions:
Write a small program with assertions.
Use example argument values.
Test a sequence of method calls.

Benefits of Unit Testing
Design and Specification: Helps clarify the expected behavior of code.
Code Coverage: Ensures that all code paths are tested.
Short Feedback Loop: Quickly identifies and fixes issues.
Documentation: Provides living documentation of the code's behavior.

JUnit (see the example test class at the end of this week's notes)
Assertions:
○ assertEquals: Checks that an expected and an actual value are equal (overloaded for the various data types).
○ assertAll: Groups assertions so that all of them are executed and all failures are reported together.
○ assertThrows: Verifies that a specific exception is thrown.
Assumptions:
○ assumeTrue, assumeFalse: Skip tests based on conditions.
○ assumingThat: Executes code conditionally.
Parameterized Tests:
○ Value Source (@ValueSource): Provides literal values declared in the annotation.
○ Enum Source (@EnumSource): Provides values from an enumeration.
○ Method Source (@MethodSource): Provides values from factory methods.
○ ConvertWith (@ConvertWith): Converts values to a specific type.

Property-Based Testing
Classify: Identify properties of the system under test.
Pairwise Combine: Create test cases to cover combinations (e.g., all pairs) of property values.
Add Assumptions: Use assumptions to filter out irrelevant test cases.

Property-First Guideline
Formulate specific properties of the system.
Don't classify methods directly.
Create test cases based on the properties.
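A small JUnit 5 sketch of the features listed above: assertEquals, assertAll, assertThrows, and a parameterized test with @ValueSource. The Calculator class under test and the test names are illustrative assumptions.

    import static org.junit.jupiter.api.Assertions.assertAll;
    import static org.junit.jupiter.api.Assertions.assertEquals;
    import static org.junit.jupiter.api.Assertions.assertThrows;

    import org.junit.jupiter.api.Test;
    import org.junit.jupiter.params.ParameterizedTest;
    import org.junit.jupiter.params.provider.ValueSource;

    // Illustrative class under test.
    class Calculator {
        int divide(int a, int b) {
            if (b == 0) throw new ArithmeticException("division by zero");
            return a / b;
        }
    }

    class CalculatorTest {

        private final Calculator calculator = new Calculator();

        @Test
        void divideReturnsQuotient() {
            // assertAll executes every assertion and reports all failures together.
            assertAll(
                () -> assertEquals(2, calculator.divide(4, 2)),
                () -> assertEquals(-3, calculator.divide(9, -3))
            );
        }

        @Test
        void divideByZeroThrows() {
            // assertThrows verifies that the expected exception is raised.
            assertThrows(ArithmeticException.class, () -> calculator.divide(1, 0));
        }

        // Parameterized test: the same test body runs once per provided value.
        @ParameterizedTest
        @ValueSource(ints = {1, 5, 42})
        void dividingByOneReturnsTheSameNumber(int value) {
            assertEquals(value, calculator.divide(value, 1));
        }
    }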
Week 10: Debugging

Debugging Process
Debugging is the systematic process of identifying and resolving errors in a program. The TRAFFIC method provides a structured approach:
Track the Problem: Clearly define the issue and its symptoms.
Reproduce: Consistently recreate the error in a controlled environment.
Automate: Use automated testing tools to streamline the debugging process.
Find Origins: Identify the root cause of the problem.
Focus: Narrow down the search area to the relevant code sections.
Isolate: Pinpoint the exact line of code causing the error.
Correct: Implement a fix and retest the program.

Problem Tracking
A problem tracker is a tool used to manage and prioritize bug reports. It typically includes:
Problem History: A record of the bug's life cycle, including its discovery, assignment, and resolution.
Expected Behavior: A clear description of how the program should behave.
Popular problem tracking tools include Bugzilla, Jira, and Trello.

Key Elements in Tracking and Reproducing Problems
Reproducibility: The ability to consistently recreate the error.
Isolation: Pinpointing the specific conditions that trigger the error.
Generalization: Identifying patterns in the error's occurrence.
Summarization: Concisely describing the problem and its impact.
Condensation: Minimizing the test case to the smallest possible scenario.
Disambiguation: Clearly distinguishing the error from other similar issues.
Neutralization: Removing irrelevant factors that might mask the root cause.

Delta Debugging
Delta debugging is a technique for systematically reducing the size of a failing test case to identify the minimal failing input. This helps in isolating the root cause of the error.

Fault Localization Techniques
Tracing: Using debugging tools to step through the code execution and inspect variables.
Backtracking: Analyzing the program's execution history to identify the point of failure.
Cause Elimination: Systematically ruling out potential causes until the root cause is found.
Expertise: Leveraging knowledge of the program's design and common error patterns.

Assertions
Assertions are statements that check the validity of a program's state at a particular point in time. They help detect errors early and improve code reliability (see the Java sketch at the end of this week's notes).
Invariant Assertions: Conditions that should always hold true.
Class Invariants: Assertions that must be true before and after every method call on a class.
Loop Invariants: Assertions that hold true at the beginning, at the end, and on each iteration of a loop.
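A minimal sketch (illustrative example, not from the notes) of class-invariant and loop-invariant assertions using Java's assert keyword; run with the -ea JVM flag to enable assertion checking. The BankAccount and max examples are assumptions chosen for brevity.

    class BankAccount {
        private int balanceCents;

        // Class invariant: the balance is never negative.
        private void checkInvariant() {
            assert balanceCents >= 0 : "invariant violated: negative balance";
        }

        void deposit(int amountCents) {
            checkInvariant();                        // must hold before the method
            assert amountCents > 0 : "deposit must be positive";
            balanceCents += amountCents;
            checkInvariant();                        // and after the method
        }
    }

    class InvariantDemo {
        // Loop invariant: after each iteration, best is the maximum of values[0..i].
        static int max(int[] values) {
            assert values.length > 0 : "precondition: non-empty array";
            int best = values[0];
            for (int i = 1; i < values.length; i++) {
                if (values[i] > best) best = values[i];
                assert isMaxOfPrefix(best, values, i) : "loop invariant violated";
            }
            return best;
        }

        private static boolean isMaxOfPrefix(int candidate, int[] values, int upTo) {
            for (int j = 0; j <= upTo; j++) {
                if (values[j] > candidate) return false;
            }
            return true;
        }

        public static void main(String[] args) {
            new BankAccount().deposit(500);
            System.out.println(max(new int[] {3, 7, 2}));   // prints 7
        }
    }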
Week 12

Key Realities of Software
Constant Evolution: Software is continually subject to change, necessitated by evolving requirements, technological advancements, and bug fixes.
Increasing Complexity: As software systems grow, they become more intricate, making maintenance a challenging endeavor.

Types of Software Maintenance
1. Corrective Maintenance: Addressing and rectifying defects or bugs that have been identified in the software.
2. Preventive Maintenance: Implementing measures to prevent future failures or errors, such as code reviews, testing, and refactoring.
3. Adaptive Maintenance: Modifying the software to accommodate changes in the environment, such as new hardware or operating systems.
4. Perfective Maintenance: Enhancing the software's functionality, performance, or usability by adding new features or improving existing ones.

Measuring Size and Complexity
Size Metrics:
Lines of Code (LOC): A basic measure of code quantity.
Number of Methods, Classes, and Files: Quantifies the structural elements of the software.
Complexity Metrics:
Cyclomatic Complexity: Evaluates the logical complexity of a code segment.
○ Calculation: number of decision points + 1.
○ High complexity (e.g., > 10) often indicates potential issues with testability and maintainability.
Object-Oriented Metrics:
Weighted Methods per Class (WMC): Measures the complexity of a class based on the number and complexity of its methods.
Depth of Inheritance Tree (DIT): Assesses the depth of inheritance in a class hierarchy.
Number of Children (NOC): Counts the number of subclasses a class has.
Coupling Between Object Classes (CBO): Measures the degree of interdependence between classes.
Response For a Class (RFC): Determines the number of methods a class may potentially call.
Lack of Cohesion in Methods (LCOM): Evaluates the degree to which methods within a class are related.

Refactoring
Definition: Refactoring involves restructuring code to improve its design and readability without altering its external behavior.
Benefits:
Enhanced code clarity and understanding.
Reduced maintenance costs.
Increased code reusability.
Lowered risk of introducing defects.
Common Refactoring Techniques:
Extract Method: Isolating a block of code into a separate method (see the sketch at the end of these notes).
Extract Variable: Assigning a meaningful name to a complex expression.
Best Practices:
Test-Driven Development (TDD): Writing tests before code to ensure refactoring doesn't introduce regressions.
Gradual Refactoring: Making small, incremental changes.
Leveraging Refactoring Tools: Utilizing automated tools to streamline the process.

Code Smells
Definition: Code smells are indicators of potential design problems or areas that could benefit from refactoring.
Common Code Smells:
Duplicated Code: Identical or nearly identical code segments.
Long Method: Excessively long methods that are difficult to understand and maintain.
Long Parameter List: Methods with many parameters, making them harder to call and test.
Large Class: Overly large classes that violate the Single Responsibility Principle.
Message Chain: A series of method calls chained together.
Feature Envy: A method that is more interested in the data of another class than in its own.
Data Class: A class that primarily holds data without significant behavior.
Addressing Code Smells:
Identify: Use static analysis tools or manual code reviews to detect code smells.
Refactor: Apply appropriate refactoring techniques to eliminate the smell.
Continuous Monitoring: Regularly assess the codebase for emerging code smells.
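A small before/after sketch of the Extract Method refactoring named above. The order-printing example is an illustrative assumption; the point is that external behavior stays the same while a block of code gains a descriptive name of its own.

    // Before: the total calculation is buried inside the printing logic.
    class OrderBefore {
        double[] itemPrices = {10.0, 25.5, 4.95};

        void printReceipt() {
            double total = 0;
            for (double price : itemPrices) {
                total += price;
            }
            System.out.println("Total: " + total);
        }
    }

    // After: Extract Method isolates the calculation into calculateTotal().
    class OrderAfter {
        double[] itemPrices = {10.0, 25.5, 4.95};

        void printReceipt() {
            // External behavior is identical; the intent is now named.
            System.out.println("Total: " + calculateTotal());
        }

        // Extracted method: can be read, tested, and reused on its own.
        private double calculateTotal() {
            double total = 0;
            for (double price : itemPrices) {
                total += price;
            }
            return total;
        }
    }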