Summary

These notes cover types of requirements, common problems with requirements, prioritization methods, and the requirements engineering process, and continue through software process models, agile methods and Scrum, report writing, team roles, risk management, project costing, and usability testing, with methods and examples throughout.

Full Transcript


Notes for SO exam

WEEK 2

Types of Requirements:
- Functional Requirements: Specify actions a system must perform, e.g., allowing an admin to add or edit details.
- Non-Functional Requirements: Define system qualities such as performance or usability, e.g., the system being operable on a specific Linux version.
- User Requirements: High-level needs described in natural language, making them accessible to a broad audience.
- System Requirements: More technical, detailed specifications intended as a contract between the client and developer, including both functional and non-functional elements.

Issues with Requirements:
- Ambiguity and Incompleteness: Natural language can lead to unclear requirements; large projects with diverse stakeholders often face inconsistencies.
- Changing Requirements: Requirements evolve, potentially leading to confusion and misalignment between stakeholders.
- Developer Assumptions: Lack of domain knowledge can cause misunderstandings about project needs.

Prioritizing Requirements:
- Methods: Prioritization frameworks such as Sommerville's "shall/should" wording, the MoSCoW method (Must, Should, Could, Won't), and simple priority levels (High, Medium, Low) help focus on the most critical needs.

Examples of Requirement Structuring:
- The lecture emphasizes organized and detailed requirements. Requirements should ideally be grouped hierarchically, with explicit priorities and clarifications for each specification.

Requirements Engineering Process:
- Feasibility Study: Initial analysis to determine the practicality of a system, including integration challenges, skill needs, and organizational impacts.
- Requirements Elicitation: Techniques such as interviews, questionnaires, and observations help gather requirements, but stakeholders often struggle to express their needs fully, and requirements may change during development.
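The MoSCoW prioritization mentioned above can be sketched in code. This is a minimal illustration only: the requirement texts and the dictionary-based ordering are invented examples, not part of the lecture notes.

```python
# Sketch: ordering requirements with the MoSCoW method (Must, Should, Could, Won't).
# All requirement texts here are illustrative.

MOSCOW_ORDER = {"Must": 0, "Should": 1, "Could": 2, "Won't": 3}

requirements = [
    ("Export reports as PDF", "Could"),
    ("Admin can add or edit details", "Must"),
    ("Dark-mode interface", "Won't"),
    ("System runs on the required Linux version", "Must"),
    ("Email notifications", "Should"),
]

# Sort so the most critical requirements come first (sort is stable,
# so requirements at the same level keep their original order).
prioritized = sorted(requirements, key=lambda r: MOSCOW_ORDER[r[1]])

for text, level in prioritized:
    print(f"[{level}] {text}")
```

The same idea carries over to High/Medium/Low levels; only the ordering table changes.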
- Requirements Specification: Requirements are documented for clarity and alignment, avoiding promises beyond what is feasible.
- Requirements Validation: Verification that the requirements accurately reflect customer needs; fixing requirements errors after delivery is costly.

Considerations in Requirements:
- Important criteria include validity, consistency, completeness, realism, verifiability, comprehensibility, and traceability.

Traceability:
- Traceability links requirements to their origins and dependencies, enabling tracking and change management.
- Types include source traceability (linking requirements to stakeholders), requirements traceability (interdependencies between requirements), and design traceability (connections to system design).

Requirements Evolution:
- Requirements are expected to evolve, but changes must be limited to avoid project scope creep.

Requirements Documentation:
- A well-organized, hierarchical, and traceable document outlines all requirements, often following standards such as IEEE/ANSI 830-1998 for clarity and consistency. Use cases, diagrams, and structured formats aid clarity and communication.

Modeling with UML:
- Unified Modeling Language (UML): A flexible modeling language used to visualize system design, adaptable to different methodologies.
- Use Cases: Describe system interactions from the user's perspective, often with diagrams and text. Typical examples include library or help-desk processes.
- Other diagrams, such as activity diagrams (workflow), sequence diagrams (object interactions), and deployment diagrams (physical layout), provide different system views.

Requirements-Based Testing:
- Testing against requirements ensures the system meets both functional and non-functional specifications; functional requirements are typically the easier ones to measure.
- Qualitative Requirement Testing: For subjective qualities such as usability or reliability, attribute specification breaks requirements down into measurable goals (e.g., downtime, response time).
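The attribute-specification idea above (turning a qualitative requirement into measurable goals) can be sketched as a simple threshold check. The attribute names and threshold numbers are invented for illustration; real goals would come from the specification.

```python
# Sketch: requirements-based testing of non-functional attributes.
# Attribute names and limits are invented examples, not lecture values.

quality_goals = {
    "max_response_time_s": 2.0,            # performance goal
    "max_downtime_hours_per_month": 1.0,   # reliability goal
}

measured = {
    "max_response_time_s": 1.4,
    "max_downtime_hours_per_month": 0.5,
}

def meets_goals(goals, observed):
    """Return the attributes whose measured value exceeds its goal."""
    return [name for name, limit in goals.items() if observed[name] > limit]

failures = meets_goals(quality_goals, measured)
print("all goals met" if not failures else f"failed: {failures}")
```

Each subjective quality (usability, reliability) becomes one or more numeric entries in such a table, which is what makes it testable.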
- Quality Measures: Metrics such as transaction speed, response time, reliability, and portability help quantify performance; prototypes, benchmarks, and user feedback support realistic goal-setting.

Quality Boundaries:
- Domain experts, standards, and simulations help define acceptable quality levels, while user input refines expectations and test boundaries.

WEEK 3

Challenges in Software Engineering:
- Complexity: Each project is unique, requiring customized solutions that are hard to standardize.
- Conformity: Software must adapt to existing systems and regulations.
- Changeability: Software often needs to evolve due to external demands or new applications.
- Invisibility: Unlike physical products, software lacks a visual presence, making it harder to represent fully in diagrams.

The Software Process:
- Defined as a structured set of activities: specification, design and implementation, validation, and evolution.

Plan-Driven vs. Agile Development:
- Plan-Driven Development: A traditional approach with distinct, predefined stages.
- Agile Development: Iterative, with overlapping activities, adapting to changes and refining requirements during development.

Software Process Models:
- Waterfall Model: Sequential stages, suitable for projects with stable requirements but offering limited flexibility.
- Incremental Development: Interleaves specification, development, and validation, allowing faster customer feedback and adaptation but risking a degraded structure over time.
- Reuse-Oriented Development: Assembles systems from existing components (COTS), potentially reducing development time and cost.

Iterative Planning:
- Combines risk-driven and client-driven priorities to address high-risk issues early and deliver high-priority client features sooner.

Managing Change in Software:
- Change is inevitable due to business needs, technology advancements, and platform shifts.
- Strategies to reduce rework costs include:
  - Tolerate Change: Agile processes allow incremental updates, where changes affect only specific parts.
  - Avoid Change: Prototyping and early customer feedback help refine requirements, reducing later modifications.

Software Prototyping:
- Prototypes provide an initial version of a system for testing concepts and usability; they are often discarded afterward because they are not production-ready.

Incremental Development and Delivery:
- Incremental development builds software in parts, with user feedback after each increment. Incremental delivery allows functional parts to be deployed early for real-world feedback, though this can be challenging for large, integrated systems.

Rational Unified Process (RUP):
- RUP is a modern process model supporting iterative and incremental development, integrating elements from various process models.
- Phases of RUP:
  - Inception: Establish the business case.
  - Elaboration: Refine architecture and requirements.
  - Construction: Develop and test the system.
  - Transition: Deploy the system.

RUP Best Practices:
- Emphasizes iterative development, requirements management, component-based architecture, visual modeling with UML, quality assurance, and change control.

Agile Background:
- Agile methods emerged as a response to rigid, document-heavy approaches. They focus on rapid iteration, adaptability, and minimal documentation to keep up with dynamic business needs.

Agile Manifesto & Principles:
- The Agile Manifesto values individuals and interactions, working software, customer collaboration, and responsiveness to change. Its principles emphasize continuous delivery, customer involvement, self-organizing teams, and simplicity.

Core Agile Techniques:
- User Stories: Simple descriptions of requirements used for prioritization.
- Timeboxing: Limits iterations or meetings to fixed time frames.
- Releases: Deliver small, functional software parts frequently.
- Refactoring: Continuous code improvement without changing functionality, keeping the code simple and easy to change.

Agile Methods:
- Popular methods include Scrum, Extreme Programming (XP), and Lean Development. Techniques such as pair programming and daily stand-ups encourage collaboration and adaptability.

Challenges with Agile:
- Agile may struggle with customer engagement, team dynamics, scalability, and complex stakeholder requirements. Agile methods work best for smaller, co-located teams but may need adjustments for larger projects.

Plan-Driven vs. Agile:
- Choosing between the approaches depends on project scale, team structure, and system requirements. Agile works well for flexible, small-team projects, while plan-driven development may suit larger, heavily regulated systems requiring extensive documentation.

Introduction to Scrum:
- Scrum is an agile methodology focused on delivering high-priority features quickly, with iterative progress in time-boxed sprints of 2-4 weeks.

Scrum Roles:
- Product Owner: Defines and prioritizes product features based on business value and customer needs.
- ScrumMaster: Facilitates Scrum practices, removes obstacles, and shields the team from interruptions.
- Development Team: A cross-functional group (typically 5-9 people) responsible for delivering increments of potentially shippable software.

Scrum Events:
- Sprint Planning: The team selects items from the product backlog to work on during the sprint.
- Daily Scrum: A 15-minute stand-up meeting where team members share progress, plans, and blockers.
- Sprint Review: At the end of the sprint, the team demonstrates completed work to stakeholders.
- Sprint Retrospective: A reflection on the sprint to identify improvements.

Scrum Artifacts:
- Product Backlog: A prioritized list of product features, maintained by the Product Owner.
- Sprint Backlog: Tasks selected for the sprint, broken down into manageable work units.
- Burndown Chart: Tracks the progress of tasks toward sprint completion.
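The burndown chart mentioned above is just remaining work plotted per day. A minimal sketch of the underlying computation, with an invented sprint total and invented daily figures:

```python
# Sketch: data behind a sprint burndown chart.
# The sprint total and per-day figures are illustrative.

sprint_total_hours = 80                    # sum of sprint-backlog estimates
hours_done_per_day = [10, 8, 12, 6, 14]    # work completed each day so far

# Remaining work at the start of each day; day 0 is the full sprint total.
remaining = [sprint_total_hours]
for done in hours_done_per_day:
    remaining.append(remaining[-1] - done)

print(remaining)
```

Plotting `remaining` against the day index, with a straight line from the total down to zero as the ideal, gives the familiar burndown picture.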
Key Principles:
- Scrum emphasizes self-organizing teams, continuous feedback, and iterative development with no scope changes during a sprint.

Scalability:
- Scrum can be scaled across multiple teams for large projects, adapting to project size and team distribution.

Types of Reports:
- Reports vary by project stage, e.g., requirements specifications, risk analyses, progress updates, and final design documentation.
- Group projects typically require multiple report types for different purposes and audiences.

General Report Structure:
- Reports should include a title page, abstract, introduction, main sections, conclusions, references, and appendices as needed.
- Clarity, professionalism, and relevance are essential in both content and presentation.

Meeting Minutes:
- Essential for recording decisions, action items, and plans. Meeting minutes ensure continuity and serve as an official record.

Progress Reports:
- Track project milestones, tasks completed, problems encountered, and solutions. These reports promote alignment, transparency, and accountability.

Effective Reporting Tips:
- Consider your audience, maintain a consistent style, structure content logically, and ensure grammatical accuracy.
- Use tables and figures for clarity, and include recommendations in conclusions.

Group Project Reporting Tips:
- Produce regular internal reports, follow stage-specific guidelines, and avoid leaving all reporting to the last minute.

Team Roles (Belbin):
- Specialist: Provides deep knowledge and expertise in a specific area, contributing technical insights from their specialized skills.
- Plant: A creative and imaginative role; the Plant generates ideas and approaches for solving complex problems, often thinking outside the box.
- Monitor/Evaluator: Analyzes options and provides objective assessments, helping the team make informed decisions by evaluating ideas critically.
- Implementer: Practical and efficient; Implementers turn ideas into actionable tasks.
  They help structure the work process and bring plans to life.
- Shaper: Drives the team forward with energy and determination, pushing for results and overcoming obstacles.
- Completer/Finisher: Pays close attention to detail, ensuring tasks are completed accurately and on time; often reviews work to ensure quality.
- Team Worker: Promotes team cohesion by being sensitive to others' needs and maintaining harmony within the group.
- Resource Investigator: Outgoing and enthusiastic; explores opportunities, connects with external contacts, and brings in new resources or ideas.

WEEK 4

Understanding Risk:
- Risk involves future uncertainties that affect a project's timeline, budget, or quality. Effective risk management can proactively prevent negative outcomes.

Risk Management Approaches:
- Reactive: Addresses risks as they arise (e.g., "firefighting").
- Proactive: Identifies and manages risks beforehand, minimizing their impact.

Risk Types:
- Risks are categorized as Project (affecting schedule or resources), Product (impacting software quality), and Business (affecting the organization).
- Examples include technical limitations, lack of skilled personnel, organizational restructuring, tool inefficiencies, and evolving requirements.

Risk Management Process:
- Identification: Recognize potential risks based on project, product, and business factors.
- Analysis: Assess each risk's likelihood and impact (e.g., using a risk map).
- Planning: Develop strategies for risk avoidance, minimization, or contingency planning.
- Monitoring: Regularly re-evaluate risks and adjust strategies as needed.

Risk Mitigation, Monitoring, and Management (RMMM):
- Evaluate the cost-benefit of risk mitigation strategies. Apply the 80/20 rule: focus on the 20% of risks that account for 80% of the overall impact.

WEEK 5

Cost Breakdown Structure (CBS):
- Costs include personnel, hardware, software, installation, training, and maintenance.
  Overheads such as equipment and consumables must also be considered.

Types of Costs:
- Staff Costs: Calculated from salaries, including social and insurance costs.
- Hardware & Software: Costs for necessary technology, licenses, and potential rentals.
- Installation, Training, and Maintenance: Costs related to system deployment, user training, and ongoing support.

Effort and Overhead Costs:
- These include staff salaries as well as indirect costs such as building maintenance, utilities, and shared facilities.

Estimation Challenges:
- Accurately estimating productivity, project size, and time is complex. Factors such as experience, technology, and project scope affect productivity and costs.
- The "cone of uncertainty" model illustrates how initial estimates are often inaccurate and are refined over time.

Costing Methods:
- Techniques include algorithmic cost modeling, expert judgment, analogy-based estimation, and agile estimation approaches.

Project Pricing Considerations:
- Pricing often reflects broader factors, such as market opportunity, contract terms, and financial health, beyond direct development costs.

Group Project Costing:
- Group projects require detailed internal budgets, including realistic overheads, client equipment, hosting, and staff roles based on project phases and individual roles.

Defining Usability:
- Usability is the ease with which users interact with a system, focusing on meeting user needs effectively and intuitively.

User-Centered Design:
- Design should prioritize user needs, preferences, and behaviors, often starting with mockups and prototypes for user feedback before full development.

Usability Testing:
- Involves observing target users performing tasks, capturing quantitative and qualitative data, and collecting user opinions through interviews and surveys.

Mockup Studies:
- Before coding, mockups help visualize and refine interfaces. Participants interact with prototypes, providing feedback that guides design adjustments.
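The Week 5 cost-breakdown arithmetic (staff cost, an overhead uplift for indirect costs, plus hardware/software and training items) can be sketched as follows. Every figure here, including the overhead rate, is invented purely for illustration.

```python
# Sketch: a simple cost breakdown in the spirit of a CBS.
# Roles, salaries, and the overhead rate are invented figures.

staff = [
    ("developer", 6, 5000.0),   # (role, person-months, monthly salary)
    ("tester",    2, 4000.0),
]
OVERHEAD_RATE = 0.30            # indirect costs as a fraction of staff cost
hardware_software = 3000.0      # licenses, equipment
training = 1500.0               # user training

# Direct staff cost, then uplift for overheads, then add the other items.
staff_cost = sum(months * salary for _, months, salary in staff)
total = staff_cost * (1 + OVERHEAD_RATE) + hardware_software + training
print(f"staff: {staff_cost:.2f}, total: {total:.2f}")
```

Note that, as the notes point out, the price quoted to a client may differ from this cost total for market and contractual reasons.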
Interview Techniques:
- Interviews should avoid bias, maintain focus on user experience, and use clear, jargon-free language. Questions should explore how users actually interact with the system rather than hypothetical uses.

Participant Selection:
- A study typically involves 6-10 participants to ensure diverse feedback, with standardized questionnaires for reliable, valid insights.

Common Pitfalls:
- Avoid confusing terminology, poorly worded questions, and missing consent forms. Clear documentation, including protocols and graphs, helps structure findings and recommendations.
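Looking back at the Week 4 risk-analysis step, the ranking behind a risk map (likelihood times impact, then focus on the top fraction per the 80/20 rule) can be sketched as below. The risk names, probabilities, and impact scores are all illustrative.

```python
# Sketch: risk exposure = probability x impact, ranked highest first.
# All risks and numbers are invented examples.

risks = [
    ("key staff leave",              0.3, 9),  # (risk, probability, impact 1-10)
    ("requirements change",          0.6, 5),
    ("tool underperforms",           0.2, 3),
    ("organizational restructuring", 0.1, 8),
]

# Rank by exposure so mitigation effort goes to the few risks
# that dominate the overall impact (the 80/20 idea).
ranked = sorted(risks, key=lambda r: r[1] * r[2], reverse=True)

for name, prob, impact in ranked:
    print(f"{name}: exposure {prob * impact:.1f}")
```

The top entries of such a list are the candidates for avoidance, minimization, or contingency plans; the tail may simply be monitored.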
