Software Testing Part 2 Lecture (6)

Document Details

Uploaded by StylishSpessartine

University of Science and Technology (جامعة العلوم والتقانة)

Dania Mohamed Ahmed

Tags

software testing, systems engineering, software development, computer science

Summary

This document is a lecture on software testing, focusing on component and interface testing (interface types, errors, and guidelines), system testing, test-driven development (TDD), regression testing, release testing, and user and acceptance testing.

Full Transcript


Software Testing Part (2) Lecture (6)
Dania Mohamed Ahmed

Component testing

• Composite components: Software components are often composed of multiple interacting objects. For instance, in a weather station system, a reconfiguration component might include various objects handling different aspects of reconfiguration.
• Component interface: Functionality is accessed through the defined interface of the composite component.
• Testing focus: Ensure that the component interface performs according to its specification. This testing assumes that individual unit tests for the objects within the component have already been completed.
• In essence, component testing verifies that the integrated functionality of a composite component meets its defined interface requirements.

Interface testing

• Objectives: Identify faults caused by interface errors or incorrect assumptions about how interfaces work.
• Interface types:
  1. Parameter interfaces: Data is passed between methods or procedures.
  2. Shared memory interfaces: A block of memory is shared among procedures or functions.
  3. Procedural interfaces: Sub-systems provide a set of procedures for other sub-systems to call.
  4. Message passing interfaces: Sub-systems request services from other sub-systems through messages.
• Interface testing focuses on ensuring that these interactions are correct and that all components communicate as expected.

Interface errors

1. Interface misuse:
  • Definition: Errors occur when a component incorrectly uses another component's interface, such as passing parameters in the wrong order.
  • Example: Calling a method with incorrect arguments.
2. Interface misunderstanding:
  • Definition: Errors arise from incorrect assumptions about the behavior of a component, leading to incorrect usage.
  • Example: Assuming a method will always return a value within a specific range when in fact it does not.
3. Timing errors:
  • Definition: Errors due to discrepancies in the operational speeds of components, resulting in the use of outdated information.
  • Example: A component reads stale data because another component has not updated it in time.
• Understanding and addressing these errors is crucial for ensuring reliable and correct interactions between components (a short code illustration of the misunderstanding case appears after the guidelines below).

Interface testing guidelines

• Extreme parameter values: Design tests with parameters at the extreme ends of their ranges to ensure robust handling.
• Null pointer testing: Always test pointer parameters with null pointers to verify proper error handling.
• Failure-inducing tests: Create test scenarios that intentionally cause the component to fail, to check its robustness and error recovery.
• Stress testing: Apply stress testing in message passing systems to evaluate performance under high loads or intensive use.
• Order variation: In shared memory systems, vary the activation order of components to ensure proper synchronization and correct data handling.
• These guidelines help ensure that interfaces are thoroughly tested for reliability and robustness under various conditions; two test sketches follow.
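The guidelines above map directly onto unit tests. The following is a minimal sketch in Python using pytest; the `buffer_write` function, its contract, and its error behavior are assumptions invented here for illustration, not part of the lecture material.

```python
import pytest

def buffer_write(buffer, index, value):
    """Hypothetical component interface: write value at position index.
    Rejects a null (None) buffer and out-of-range indexes."""
    if buffer is None:
        raise ValueError("buffer must not be None")
    if not 0 <= index < len(buffer):
        raise ValueError("index out of range")
    buffer[index] = value

def test_extreme_low_index():
    # Extreme parameter value: the lowest valid index.
    buf = [0] * 4
    buffer_write(buf, 0, 7)
    assert buf[0] == 7

def test_extreme_high_index():
    # Extreme parameter value: the highest valid index.
    buf = [0] * 4
    buffer_write(buf, 3, 7)
    assert buf[3] == 7

def test_null_pointer():
    # Null pointer testing: a None buffer must be rejected cleanly.
    with pytest.raises(ValueError):
        buffer_write(None, 0, 7)

def test_failure_inducing():
    # Failure-inducing test: an index far outside the valid range.
    with pytest.raises(ValueError):
        buffer_write([0] * 4, 99, 7)
```

Running pytest on this file exercises the extreme-value, null-pointer, and failure-inducing guidelines in one place.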
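Returning to the interface errors described earlier, the second sketch illustrates an interface misunderstanding; `find_reading` and its return-minus-one-when-absent convention are hypothetical names chosen only for this example.

```python
def find_reading(readings, value):
    """Hypothetical interface: return the index of value in readings,
    or -1 when it is absent (no exception, no None)."""
    try:
        return readings.index(value)
    except ValueError:
        return -1

readings = [10, 12, 17]
idx = find_reading(readings, 99)

# Interface misunderstanding: a caller who assumes the result is always
# a valid index will index with -1, which in Python silently selects
# the LAST element (17) instead of failing. A test built around the
# documented contract catches this wrong assumption:
if idx != -1:
    print("found:", readings[idx])
else:
    print("value not recorded")
```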
System testing

• Integration: During development, system testing involves combining components into a complete system and testing this integrated version.
• Focus: The main focus is on testing interactions between components to ensure they work together correctly.
• Objectives: Verify that components are compatible, interact as expected, and transfer the correct data at the right times across their interfaces.
• Emergent behavior: Assess the system's emergent behavior, which is the combined effect of component interactions and might not be apparent when components are tested in isolation.
• System testing ensures that the integrated system functions correctly and meets its requirements when all components work together.

System and component testing

• Integration of reusable and off-the-shelf components: During system testing, components that were developed separately or purchased as off-the-shelf solutions are integrated with newly developed components. The entire system is then tested as a whole.
• Collective process: System testing involves integrating components developed by different team members or sub-teams. It is a collaborative process that focuses on the interactions between integrated parts rather than on individual components.
• Separate testing teams: In some organizations, system testing is performed by a dedicated testing team that may not include the original designers or programmers. This separation helps ensure an objective evaluation of the system's functionality and integration.
• System testing ensures that all parts of the system work together as intended and helps identify issues arising from the integration of various components.

Use-case testing

• Basis for testing: Use-cases, which describe how users interact with the system, can be used to guide system testing.
• Interaction testing: Each use-case typically involves multiple system components. Testing these use-cases ensures that the interactions between those components are exercised and validated.
• Sequence diagrams: Sequence diagrams associated with use-cases document the components and their interactions. These diagrams help in understanding and testing the flow of interactions described in the use-case.
• Use-case testing ensures that the system performs as expected in real-world scenarios by verifying the interactions among components through practical use-case scenarios.

[Figure: Collect weather data sequence chart]

Testing policies

• Exhaustive testing: It is impractical to test every possible scenario in a system. Testing policies therefore define the required coverage and focus of system testing.
• Examples of testing policies:
  • Menu functions: Ensure that all system functions accessible through menus are tested.
  • Function combinations: Test combinations of functions (e.g., text formatting) that are accessed through the same menu, to verify their integration.
  • User input: Test all functions with both correct and incorrect user inputs to ensure robust handling of various input scenarios (see the sketch after this list).
• Testing policies guide the testing process by specifying critical areas to focus on, helping to ensure that essential functionalities and common use cases are thoroughly tested.
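As referenced in the user-input policy above, a correct-and-incorrect-input check is often written as a pair of parameterized tests. The sketch below assumes a hypothetical menu function `set_font_size` with an invented valid range of 6 to 72 points; neither the name nor the range comes from the lecture.

```python
import pytest

def set_font_size(size):
    """Hypothetical menu function: accepts whole-point sizes 6-72."""
    if not isinstance(size, int) or not 6 <= size <= 72:
        raise ValueError(f"invalid font size: {size!r}")
    return size

# Correct inputs, including both ends of the valid range.
@pytest.mark.parametrize("size", [6, 12, 72])
def test_valid_sizes(size):
    assert set_font_size(size) == size

# Incorrect inputs: out of range, wrong type, and missing.
@pytest.mark.parametrize("bad", [5, 73, -1, "12", None])
def test_invalid_sizes(bad):
    with pytest.raises(ValueError):
        set_font_size(bad)
```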
Test-driven development (TDD)

• Approach: TDD integrates testing and code development by writing tests before writing the actual code.
• Test-first: Tests are created before the code, and passing those tests is the primary focus of development.
• Incremental development: Code is developed incrementally, with each increment accompanied by a corresponding test. Development of a new increment only proceeds once the existing code passes its tests.
• Usage: Although TDD originated within agile methodologies such as Extreme Programming, it can also be applied in traditional plan-driven development processes.
• TDD emphasizes writing tests early and frequently, ensuring that code is continuously validated and that each new feature or change is tested as it is developed.

TDD process activities

1. Identify increment: Determine a small, manageable piece of functionality to implement, typically something that can be achieved in a few lines of code.
2. Write test: Create an automated test for the identified functionality. This test should define the expected behavior and outcomes.
3. Run test: Execute the new test along with all existing tests. Initially, the new test will fail, since the functionality has not yet been implemented.
4. Implement functionality: Develop the code required to fulfill the functionality described by the test.
5. Re-run tests: Re-run the tests to ensure that the new functionality works as expected and that all tests pass.
6. Move to next increment: Once all tests pass, proceed to identify and implement the next piece of functionality.
• This iterative process ensures that each piece of functionality is tested and validated before moving on, promoting reliable and incremental development; a sketch of one increment follows.
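A minimal sketch of one TDD increment, in Python with pytest; the `word_count` increment is an invented example, not taken from the lecture. The comments mark which process activity each part corresponds to.

```python
# Activity 2 (write test): these tests are written before word_count
# exists, so the first run (activity 3) fails as expected.
def test_word_count_empty():
    assert word_count("") == 0

def test_word_count_simple():
    assert word_count("to be or not to be") == 6

# Activity 4 (implement functionality): the simplest code that makes
# both tests pass. Re-running the whole suite (activity 5) now
# succeeds, and these tests join the growing regression suite that
# protects every later increment (activity 6).
def word_count(text):
    return len(text.split())
```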
[Figure: Test-driven development]

Benefits of test-driven development

1. Code coverage: Ensures that every segment of code has at least one associated test, providing comprehensive coverage.
2. Regression testing: A regression test suite grows incrementally with the development of the program, continuously validating that new changes do not break existing functionality.
3. Simplified debugging: When a test fails, it is easier to pinpoint the problem in the newly written code, simplifying the debugging process.
4. System documentation: Tests serve as a form of documentation, clearly describing the intended behavior of the code.

Regression testing

• Purpose: To verify that recent changes or additions to the system have not adversely affected existing functionality.
• Manual vs. automated: Manual regression testing is costly and time-consuming; with automation it becomes simple and efficient, as all tests are automatically rerun whenever changes are made.
• Pre-commit requirement: All tests must pass before any changes are finalized and committed, ensuring that new changes do not introduce issues.
• Regression testing ensures that new updates maintain the integrity of the existing code and functionality.

Release testing

• Purpose: To test a specific release of a system intended for use outside the development team, ensuring that it meets the necessary quality standards.
• Primary goal: To provide assurance that the system is reliable and suitable for deployment by demonstrating that it delivers the specified functionality, performance, and dependability, and does not fail during normal use.
• Testing approach: Typically involves black-box testing, where tests are derived solely from the system specification, without knowledge of the internal implementation details.
• Release testing verifies that the system is ready for external use by validating that it meets all required criteria and performs as expected in real-world scenarios.

Release testing vs. system testing

• Release testing:
  • Nature: A form of system testing focused on validating the final product before it is released to external users.
  • Responsibility: Conducted by a separate team that was not involved in the system development.
  • Objective: Ensure that the system meets its requirements, performs well, and is ready for external use (validation testing).
• System testing:
  • Nature: Involves testing the complete system during development to identify and fix bugs.
  • Responsibility: Performed by the development team.
  • Objective: Discover defects and ensure that all components work together as intended (defect testing).
• Release testing is aimed at verifying that the system is fit for release and meets all user requirements, while system testing focuses on identifying and resolving defects during development.

Requirements-based testing

• Concept: This testing approach involves examining each requirement of the system and developing corresponding tests to ensure that each requirement is met.
• Example requirements (for the MHC-PMS):
  • Allergy warning: If a patient is allergic to a medication, the system should issue a warning when that medication is prescribed.
  • Ignoring warnings: If a prescriber chooses to ignore an allergy warning, they must provide a reason for this action.
• Test cases (a sketch of some of these follows the list):
  • No allergy: Create a patient record without allergies, prescribe an allergy-related medication, and verify that no warning is issued.
  • Known allergy: Create a patient record with a known allergy, prescribe the allergenic medication, and confirm that a warning is issued.
  • Multiple allergies: Set up a record with multiple allergies, prescribe each allergenic drug separately, and ensure the correct warning is issued for each.
  • Multiple warnings: Prescribe multiple drugs that trigger warnings and verify that each warning is issued correctly.
  • Overriding warnings: Prescribe a drug that triggers a warning, override the warning, and check that the system prompts for a reason for the override.
• Requirements-based testing ensures that all specified functionalities and behaviors are properly implemented and validated.
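A sketch of how three of these test cases might look as automated tests; the `PrescriptionService` class and all of its method names are hypothetical stand-ins for the MHC-PMS prescribing interface, invented here for illustration.

```python
import pytest

# Hypothetical stand-in for the MHC-PMS prescribing interface; every
# name below is invented for this sketch.
class PrescriptionService:
    def __init__(self):
        self.warnings = []
        self.override_reasons = []

    def prescribe(self, patient, drug):
        """Issue an allergy warning when the drug matches an allergy."""
        if drug in patient["allergies"]:
            self.warnings.append(drug)
            return "warning"
        return "ok"

    def override_warning(self, drug, reason):
        """Ignoring a warning requires a stated reason."""
        if not reason:
            raise ValueError("a reason is required to override a warning")
        self.override_reasons.append((drug, reason))

def test_no_allergy():
    # "No allergy": record without allergies, so no warning is issued.
    svc = PrescriptionService()
    assert svc.prescribe({"allergies": []}, "penicillin") == "ok"
    assert svc.warnings == []

def test_known_allergy():
    # "Known allergy": prescribing the allergenic drug issues a warning.
    svc = PrescriptionService()
    assert svc.prescribe({"allergies": ["penicillin"]}, "penicillin") == "warning"
    assert svc.warnings == ["penicillin"]

def test_override_requires_reason():
    # "Overriding warnings": the system must demand a reason.
    svc = PrescriptionService()
    svc.prescribe({"allergies": ["penicillin"]}, "penicillin")
    with pytest.raises(ValueError):
        svc.override_warning("penicillin", "")
    svc.override_warning("penicillin", "patient completed desensitization")
    assert svc.override_reasons == [("penicillin", "patient completed desensitization")]
```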
Features tested by scenario

1. Authentication: Logging on to the system to verify that user authentication processes function correctly.
2. Data transfer: Downloading and uploading patient records to a laptop to ensure proper handling and transfer of data.
3. Home visit scheduling: Scheduling home visits to check the functionality and usability of scheduling features.
4. Data encryption/decryption: Encrypting and decrypting patient records on a mobile device to ensure data security and integrity.
5. Record management: Retrieving and modifying patient records to validate that record management operations work as expected.
6. Drug database integration: Interacting with a drugs database to maintain and access side-effect information, ensuring accurate and up-to-date data.
7. Call prompting system: Testing the system's ability to prompt calls, verifying its functionality and reliability.
• These scenarios cover a range of functionalities, ensuring that critical features of the system are thoroughly tested and validated.

A usage scenario for the MHC-PMS

• Kate is a nurse who specializes in mental health care. One of her responsibilities is to visit patients at home to check that their treatment is effective and that they are not suffering from medication side-effects.
• On a day for home visits, Kate logs into the MHC-PMS and uses it to print her schedule of home visits for that day, along with summary information about the patients to be visited. She requests that the records for these patients be downloaded to her laptop, and is prompted for her key phrase to encrypt the records on the laptop.
• One of the patients that she visits is Jim, who is being treated with medication for depression. Jim feels that the medication is helping him but believes that it has the side-effect of keeping him awake at night. Kate looks up Jim's record and is prompted for her key phrase to decrypt it. She checks the drug prescribed and queries its side effects. Sleeplessness is a known side effect, so she notes the problem in Jim's record and suggests that he visit the clinic to have his medication changed. He agrees, so Kate enters a prompt to call him when she gets back to the clinic to make an appointment with a physician. She ends the consultation and the system re-encrypts Jim's record.
• After finishing her consultations, Kate returns to the clinic and uploads the records of the patients visited to the database. The system generates a call list of those patients whom she has to contact for follow-up information and clinic appointments.

Performance testing

• Role in release testing: Performance testing may be part of release testing to evaluate emergent properties such as performance and reliability.
• Test design: Tests should reflect typical usage patterns of the system to ensure they accurately represent real-world conditions.
• Performance testing:
  • Approach: Conduct a series of tests with progressively increasing load to identify the point at which system performance becomes unacceptable.
  • Objective: Assess how well the system handles increasing demands and ensure that it meets its performance criteria.
• Stress testing:
  • Approach: Overload the system intentionally to evaluate its behavior under extreme conditions.
  • Objective: Determine how the system fails and recovers under severe stress, identifying potential weaknesses.
• Performance testing ensures that the system operates efficiently and reliably under expected and extreme conditions, supporting its readiness for real-world use; a load-stepping sketch follows.
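A minimal load-stepping sketch of the performance-testing approach described above. Everything here is assumed for illustration: `handle_request` merely stands in for the real system, and the 500 requests/second threshold is an invented acceptance criterion, so the printed numbers carry no real meaning.

```python
import time

def handle_request(payload):
    # Stand-in for the system under test; a real performance test would
    # call an actual service. The fixed cost only keeps this runnable.
    time.sleep(0.001)

def measure_rate(n_requests):
    """Send n_requests sequentially and return requests per second."""
    start = time.perf_counter()
    for _ in range(n_requests):
        handle_request("x" * 100)
    return n_requests / (time.perf_counter() - start)

ACCEPTABLE_RATE = 500  # requests/second: an invented acceptance criterion

# Performance testing: step the load up and find the point at which the
# observed rate falls below the acceptable threshold. Stress testing
# would keep increasing the load past that point to observe how the
# system fails and recovers.
for load in (100, 200, 400, 800, 1600):
    rate = measure_rate(load)
    print(f"load={load:5d} requests  observed rate={rate:7.1f} req/s")
    if rate < ACCEPTABLE_RATE:
        print("performance became unacceptable at this load")
        break
```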
User testing

• Purpose: User or customer testing involves users or customers providing feedback and advice on system testing, even when thorough system and release testing have been performed. This is crucial because real-world user environments can significantly affect a system's reliability, performance, usability, and robustness; these factors may not be fully replicated in a testing environment.
• Types of user testing:
1. Alpha testing:
  • Description: Users test the software in collaboration with the development team at the developer's site.
  • Objective: Identify issues early and refine the software with direct user feedback in a controlled environment.
2. Beta testing:
  • Description: A version of the software is released to users outside the development team for experimentation and feedback.
  • Objective: Discover problems in a real-world environment and gather user feedback to address issues before the final release.
3. Acceptance testing:
  • Description: Customers test the system to determine whether it meets their requirements and is ready for deployment.
  • Objective: Validate that the system is ready for use in the customer's environment and fulfills the agreed-upon specifications, particularly for custom systems.
• User testing is vital for ensuring that the system meets practical user needs and performs well in real-world conditions.

[Figure: The acceptance testing process]

Stages in the acceptance testing process

1. Define acceptance criteria
2. Plan acceptance testing
3. Derive acceptance tests
4. Run acceptance tests
5. Negotiate test results
6. Reject/accept system

Agile methods and acceptance testing

1. User involvement: In agile methods, the user or customer is integrated into the development team, actively participating in the development process and making decisions about the acceptability of the system.
2. Test definition and integration: Tests are defined by the user or customer and are automatically integrated with other tests. They are run automatically whenever changes are made to the system, ensuring continuous validation of requirements.
3. Acceptance testing process: There is no separate, distinct acceptance testing phase in agile. Instead, acceptance testing is incorporated into the ongoing development process.
4. Main challenge: A key issue in agile is whether the embedded user accurately represents all system stakeholders and their interests. Ensuring that the feedback and testing provided by this user reflect the needs and perspectives of all potential users is crucial for successful outcomes.
• In agile methodologies, acceptance testing is an ongoing, integrated process that involves the user throughout development, with a focus on continuous feedback and validation.
