Software Testing Part 2
Questions and Answers

Composite components only consist of single interacting objects.

False

Component testing checks that individual unit tests for objects within a component have been properly completed.

True

Timing errors occur when components operate at the same speed resulting in outdated information.

False

Interface testing includes checking the correct usage of parameter, shared memory, procedural, and message passing interfaces.

True

Interface misuse occurs when the correct arguments are passed to a method.

False

Study Notes

Software Testing Part 2

  • Component Testing: Verifies that a composite component's integrated functionality meets its defined interface requirements. Composite components are built from multiple interacting objects; the reconfiguration component of a weather station system is one example. A component's functionality is accessed only through its defined interface.
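
To make this concrete, here is a minimal sketch of a composite component (all class names are invented for illustration, not taken from the lesson): several interacting objects are hidden behind a single interface, and component testing exercises only that interface.

    class Thermometer:
        def reading(self) -> float:
            return 21.5   # stub value for illustration

    class Anemometer:
        def reading(self) -> float:
            return 4.2    # stub value for illustration

    class WeatherStation:
        """Composite component: callers use only report(), its interface."""
        def __init__(self):
            self._thermometer = Thermometer()
            self._anemometer = Anemometer()

        def report(self) -> dict:
            # The internal objects interact only behind the interface.
            return {"temperature": self._thermometer.reading(),
                    "wind_speed": self._anemometer.reading()}

    # Component testing goes through the interface, not the internals:
    assert WeatherStation().report()["wind_speed"] == 4.2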

  • Interface Testing: Aims to detect faults caused by interface errors or by invalid assumptions about how an interface works. Common interface types include parameter interfaces (data is passed from one method or procedure to another), shared memory interfaces (sub-systems communicate through a shared block of memory), procedural interfaces (one sub-system encapsulates a set of procedures that others call), and message passing interfaces (sub-systems request services from each other by exchanging messages).
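
As a hedged illustration of two of these styles (the function, class, and message format are invented): a parameter interface passes data directly between methods, while a message passing interface wraps each service request in a message.

    # Parameter interface: data passed directly between methods.
    def to_fahrenheit(celsius: float) -> float:
        return celsius * 9 / 5 + 32

    # Message passing interface: a sub-system requests a service by
    # sending a structured message instead of calling a method directly.
    class WeatherService:
        def handle(self, message: dict) -> dict:
            if message.get("request") == "report":
                return {"status": "ok", "temperature": 21.5}
            return {"status": "error", "reason": "unknown request"}

    assert to_fahrenheit(100) == 212.0
    assert WeatherService().handle({"request": "report"})["status"] == "ok"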

  • Interface Errors: Include interface misuse (a component uses another component's interface incorrectly, e.g., passes parameters in the wrong order), interface misunderstanding (a caller makes incorrect assumptions about a component's behavior, e.g., assumes a method always returns a value within a given range), and timing errors (the producer and consumer of data operate at different speeds, so the consumer accesses out-of-date information).
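
The first two error classes are easy to show in code (a sketch; both functions are invented for illustration):

    def schedule(delay_seconds: int, retries: int) -> str:
        return f"run after {delay_seconds}s with {retries} retries"

    # Interface misuse: arguments passed in the wrong order. The call
    # succeeds, but the behaviour is wrong (3 s delay, 60 retries).
    schedule(3, 60)   # the caller intended schedule(60, 3)

    # Interface misunderstanding: the caller assumes the result is
    # sorted, but nothing in the interface guarantees any ordering.
    def find_matches(records, key):
        return [r for r in records if key in r]   # input order preserved

    matches = find_matches(["beta", "alpha"], "a")
    # Incorrect assumption: matches[0] == "alpha" may not hold.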

  • Interface Testing Guidelines: Test parameters at the extreme ends of their ranges; always test pointer parameters with null pointers to check error recovery; design tests that deliberately cause the component to fail, to probe its robustness; use stress tests that apply loads well above the expected maximum; and, where components interact, vary the order in which they are activated to expose synchronization problems.
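
Several of these guidelines map directly onto tests; a minimal pytest sketch, assuming a simple divide function (both the function and its limits are invented):

    import pytest

    def divide(total: float, count) -> float:
        if count is None:
            raise ValueError("count must not be None")
        return total / count

    def test_extreme_parameter_values():
        assert divide(1e308, 1) == 1e308        # upper end of the float range

    def test_null_parameter_is_rejected():
        with pytest.raises(ValueError):         # the "null pointer" case
            divide(10.0, None)

    def test_deliberately_induced_failure():
        with pytest.raises(ZeroDivisionError):  # designed to make it fail
            divide(10.0, 0)

    def test_stress_with_repeated_calls():
        for _ in range(100_000):                # crude stress test
            divide(10.0, 2)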

System Testing

  • Integration: Combining components into a complete system and testing the interactions between them to ensure they function correctly together.

  • Objectives: Verifying that components are compatible, interact correctly, and transfer data correctly across their interfaces.

  • Emergent Behavior: Assessing behavior that emerges from the combined interactions of components and that may not be apparent when components are tested in isolation. The assembled system is tested to confirm that it operates correctly and delivers the required functionality when all components work together.
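
For instance, an integration test checks the data transfer between two separately developed components rather than either component alone (a sketch with invented names):

    class Sampler:
        def sample(self) -> dict:
            return {"temperature": 21.5}

    class Transmitter:
        def __init__(self):
            self.sent = []
        def send(self, payload: dict) -> None:
            self.sent.append(payload)

    def test_sampler_and_transmitter_integrate():
        sampler, transmitter = Sampler(), Transmitter()
        transmitter.send(sampler.sample())   # the interaction under test
        assert transmitter.sent == [{"temperature": 21.5}]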

System and Component Testing

  • Integration of Reusable and Off-the-Shelf Components: Components developed separately or bought as off-the-shelf solutions are integrated with newly developed components, and the complete system is then tested as a whole.

  • Collective Process: System testing is a collective process that focuses on the interactions between integrated components rather than on individual components.

  • Separate Testing Teams: Testing may be performed by a dedicated team that was not involved in the system's design or programming, which helps keep the evaluation objective.

Use-Case Testing

  • Basis for Testing: Use cases, which describe user interactions with the system, serve as a guide for system testing.

  • Interaction Testing: Each use case involves multiple system components; testing ensures these interactions are exercised.

  • Sequence Diagrams: Document the sequence of component interactions associated with a use case, and so identify what a use-case test must exercise. Use-case testing verifies that the system behaves as expected when components interact in realistic scenarios; an example sequence chart shows the interactions involved in collecting weather data.
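
A use-case test then follows that documented sequence of interactions; a hedged sketch for "collect weather data" (the classes and methods are invented, not the lesson's actual design):

    class WeatherData:
        def summarise(self) -> dict:
            return {"max_temp": 22.0, "min_temp": 11.0}

    class WeatherStation:
        def __init__(self, data: WeatherData):
            self._data = data
        def report_weather(self) -> dict:
            return self._data.summarise()

    def test_collect_weather_data_use_case():
        # The test exercises the station -> data interaction sequence.
        report = WeatherStation(WeatherData()).report_weather()
        assert report["max_temp"] >= report["min_temp"]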

Testing Policies

  • Exhaustive Testing: It is impractical to test every possible scenario, so testing policies define the coverage that is required.

  • Menu Functions: All functions accessible via menus should be tested.

  • Function Combinations: Combinations of functions accessed through the same menu should be tested together to check their integration.

  • User Input: Functions should be tested with both valid and invalid user input to ensure both are handled correctly.
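
The valid/invalid input policy translates naturally into a parameterized test; a sketch (parse_quantity is invented for illustration):

    import pytest

    def parse_quantity(text: str) -> int:
        value = int(text)                 # raises ValueError on bad input
        if value < 0:
            raise ValueError("quantity must be non-negative")
        return value

    @pytest.mark.parametrize("text,expected", [("0", 0), ("42", 42)])
    def test_valid_input_is_accepted(text, expected):
        assert parse_quantity(text) == expected

    @pytest.mark.parametrize("text", ["", "abc", "-1", "3.5"])
    def test_invalid_input_is_rejected(text):
        with pytest.raises(ValueError):
            parse_quantity(text)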

Test-Driven Development (TDD)

  • Approach: TDD integrates testing and code development by creating tests before the code itself.

  • Test-First: Tests are written first, and code is then written to pass them.

  • Incremental Development: Code is developed incrementally along with its tests; the full test set is run after each increment to ensure new code does not break code that already works.

  • Usage: Although TDD originated within agile methods, its principles also apply to plan-driven development processes; it emphasizes validating each new feature or revision as it is written. The TDD process proceeds through the following steps (a minimal code sketch follows the list):

  • Identify Increment: Select a small, manageable piece of functionality.

  • Write Test: Create an automated test for the selected functionality.

  • Run Test: Run the new test together with all existing tests; at first it should fail, because the functionality has not yet been implemented.

  • Implement Functionality: Write the code that provides the identified functionality.

  • Re-run Tests: Re-run all tests to confirm the new code behaves as expected; repeat until every test passes.

  • Move to Next Increment: Once all tests pass, move on to the next increment.
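
A minimal sketch of one TDD increment in Python (word_count is an invented example): the test is written and run first and fails, then just enough code is written to make it pass.

    # Steps 1-3: write the test first; run it and watch it fail (red),
    # because word_count does not exist yet.
    def test_word_count():
        assert word_count("to be or not to be") == 6
        assert word_count("") == 0

    # Step 4: implement just enough functionality to pass (green).
    def word_count(text: str) -> int:
        return len(text.split())

    # Step 5: re-run all tests; when they pass, take the next increment.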

Benefits of Test-Driven Development

  • Code Coverage: Ensuring each code segment has at least one associated test.

  • Regression Testing: The regression test suite grows incrementally with development, so every change is validated against existing functionality.

  • Simplified Debugging: When a test fails, the fault is almost certainly in the code just written, which makes problems easy to localize.

  • System Documentation: Tests act as documentation for the intended application behavior.

Regression Testing

  • Purpose: Verifying that recent changes or additions to the system have not broken existing functionality.

  • Manual vs. Automated: Manual regression testing is costly and time-consuming; automated testing is far more efficient, since the whole suite can be re-run automatically after every change.
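
A common automated pattern (a hedged sketch with an invented function): each fixed defect gets a test that stays in the suite, so later changes re-check the old behaviour on every run.

    def normalise_name(name: str) -> str:
        return " ".join(name.split()).title()

    def test_regression_double_space_defect():
        # Reproduces a previously fixed bug: names with extra whitespace.
        assert normalise_name("ada   lovelace") == "Ada Lovelace"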

Release Testing

  • Purpose: Testing a particular release of the system that is intended for use outside the development team.

  • Goal: Ensuring reliability, performance, and functionality are suitable before release.

  • Approach: Black-box testing driven by the system specification, validating that all requirements are met; the system is exercised with realistic use cases before release.

  • Difference between Release and System Testing: Release testing should be performed by a team separate from the development team, to keep the evaluation objective. System testing aims to find bugs while the system is still being developed, before release to end users; release testing aims to show that the application is ready for external use and that the released software meets all of its requirements.

Requirements-Based Testing

  • Concept: Examining each requirement and developing tests to ensure it is met.

  • Example Requirements: Allergy warnings, warning overrides (with a recorded justification), and checks that no warning is issued when the patient has no allergy. These requirements come from a mental health care patient management system (MHC-PMS).

  • Test Cases (sketched in code after this list):

    • No Allergy: Create a patient record with no known allergies, prescribe a medication associated with allergy warnings, and check that no warning is issued.
    • Known Allergy: Create a patient record with a known allergy, prescribe the allergenic medication, and check that a warning is issued.
    • Multiple Allergies: Create a patient record with several known allergies, prescribe drugs that trigger them, and check that the appropriate warnings are issued.
    • Multiple Warnings: Prescribe multiple drugs that each trigger a warning and check that every warning is issued correctly.
    • Overriding Warnings: Override an issued warning and check that the system prompts the user to record a reason.
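
Under the assumption of a much-simplified prescribing API (these names are invented, not the actual MHC-PMS interface), the cases above might look like this in pytest:

    import pytest

    def prescribe(allergies: set, drug: str) -> list:
        """Return the warnings raised by prescribing one drug."""
        return [f"patient is allergic to {drug}"] if drug in allergies else []

    def override_warning(reason: str) -> bool:
        if not reason:
            raise ValueError("a reason must be recorded to override")
        return True

    def test_no_allergy_issues_no_warning():
        assert prescribe(set(), "penicillin") == []

    def test_known_allergy_issues_warning():
        assert len(prescribe({"penicillin"}, "penicillin")) == 1

    def test_multiple_warnings_all_issued():
        allergies = {"penicillin", "aspirin"}
        warnings = [w for drug in ("penicillin", "aspirin")
                    for w in prescribe(allergies, drug)]
        assert len(warnings) == 2

    def test_override_requires_a_reason():
        with pytest.raises(ValueError):
            override_warning("")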

Features Tested by Scenario

  • Authentication: Logging in to verify authentication processes.

  • Data Transfer: Downloading/uploading patient records, ensuring processes run properly.

  • Home Visit Scheduling: Ensuring scheduling features are functional and usable.

  • Data Encryption/Decryption: Demonstrating encryption and decryption processes for security (on mobile devices).

  • Record Management: Validating that record management functions (fetching and modifying records) work properly.

  • Drug Database Integration: Interacting with a drug database, ensuring accurate and up-to-date information is accessed.

  • Call Prompting System: Evaluating call-prompting system functionality and reliability.

Usage Scenario (MHC-PMS)

  • A usage scenario for the mental health care patient management system (MHC-PMS) demonstrates how the system supports a specific use case: a patient's consultation and the subsequent follow-up.

Performance Testing

  • Role: Assessing performance and reliability as part of release testing.

  • Test Design: Reflecting typical user usage patterns.

  • Approach: Load testing, in which tests are run with a progressively increasing load to identify the threshold at which performance becomes unacceptable.

  • Objective: Ensuring the system can handle increasing demands, adhering to performance criteria.

  • Stress Testing: Evaluating system performance under extreme conditions.
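
A crude sketch of the load-testing approach (the operation and the 0.5 s threshold are assumptions for illustration): the load is stepped up until the response time crosses the acceptable limit.

    import time

    def operation(load: int) -> None:
        sum(i * i for i in range(load))       # stand-in for real work

    def find_load_threshold(max_seconds: float = 0.5) -> int:
        load = 1_000
        while True:
            start = time.perf_counter()
            operation(load)
            if time.perf_counter() - start > max_seconds:
                return load                   # performance now unacceptable
            load *= 2                         # progressively increase load

    print("performance degrades at load:", find_load_threshold())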

User Testing

  • Purpose: Collecting user feedback to improve reliability, performance, usability, and robustness.

  • Alpha Testing: Users work alongside the development team to test the software and identify specific bugs or usability issues.

  • Beta Testing: A release is made available to external users before the final release, and they provide feedback on it.

  • Acceptance Testing: Final stage, customer approval before release.

    • Description: Customers test the system to verify that it meets their requirements and works correctly in their own environment.

    • Acceptance Testing Process:

      • Define acceptance criteria
      • Plan acceptance testing
      • Derive acceptance tests
      • Run acceptance tests
      • Negotiate test results
      • Decide whether to accept or reject the system
  • Agile Methods and Acceptance Testing: In agile development, acceptance testing is performed continuously rather than as a single phase; end users are part of the team and provide input through the continuous integration process, so acceptance is integrated into the normal development workflow.

Description

This quiz delves into advanced topics of software testing, focusing specifically on component testing and interface testing. Understand the importance of verifying functionality and identifying faults in interfaces to ensure system reliability. Prepare to explore various types of interfaces and their potential errors.
