SCCVK2073 EVALUATION OF INTERACTIVE APPLICATION

Questions and Answers

What does a selection rule primarily utilize to achieve a goal?

  • Peer suggestions for methods
  • The user's knowledge of the best method (correct)
  • Trial and error from previous tasks
  • The user's experience with technology

What is an example of a condition in a selection rule?

  • IF the word to highlight is less than five characters (correct)
  • Parents take the money from the wallet
  • USE the mouse drag method
  • GET money from an ATM

What does GOMS primarily provide for performance analysis?

  • A survey of user preferences
  • Qualitative data on user experience
  • A framework for content creation
  • Predictive models of performance time and learning (correct)

Which of the following is a limitation of the GOMS model?

Answer: It assumes a certain level of skill in users

What does NGOMSL allow for in task representation?

Answer: Flexible representation using human language

Which method is NOT a way to get money listed in the sub-goals?

Answer: Request a loan from a bank

What is the estimated total time to delete an icon as represented in NGOMSL?

Answer: 2.42 seconds

What might the GOMS model overlook in its analysis?

Answer: Long-term memory recall and learning time

What is a key aspect of an evaluation process?

Answer: Involving appropriate users in the evaluation

What should be considered when selecting techniques for an evaluation?

Answer: Practical and ethical issues including cost and time

Which factor is NOT mentioned as a practical issue to consider in an evaluation?

Answer: Quality of the product

How can sub-questions enhance an evaluation process?

Answer: By breaking down problems into more specific queries

What could be a consequence of not having the necessary expertise in the evaluation team?

Answer: They may find it impractical to base evaluations on complex models

When planning an evaluation, what is one of the major constraints to consider?

Answer: Schedule and budget constraints

What is a potential challenge when using video in an evaluation?

Answer: Choosing how to record and position cameras effectively

Why might an evaluation team need to adapt techniques during the evaluation process?

Answer: To address resource availability and project needs

What should be prioritized when conducting functional testing for tablet devices?

Answer: Focusing on critical workflows that impact core functionality

Which of the following should NOT be considered when optimizing battery life for tablet applications?

Answer: Reducing the app's visual appeal

Why can device fragmentation be a challenge for functional testing?

Answer: There is a wide variety of devices with different specifications

What role does automation play in functional testing?

Answer: It helps streamline the testing process and ensure consistency

What is a recommended strategy for improving functional testing efficiency?

Answer: Regularly reviewing and updating test cases

What are the initial steps required to record a television show accurately on a VCR?

Answer: Specify time of recording and save the settings.

Which of the following best describes the benefits of continuous integration in functional testing?

Answer: It catches issues early in the development process

What is the primary purpose of the human action cycle in user interface design?

Answer: To describe steps taken when interacting with systems.

In integrating automated testing frameworks, which challenge is mentioned?

Answer: It may require significant effort and expertise

How does feedback influence user actions according to the principles discussed?

Answer: It helps users anticipate outcomes based on past events.

Which statement exemplifies a practice for fostering a culture of continuous improvement in testing?

Answer: Encourage collaboration with cross-functional teams

Which principle focuses on making options visible without overwhelming the user?

Answer: Visibility

What is a gulf of evaluation as discussed in the context of user interaction?

Answer: A disconnect between user expectations and system feedback.

What role does affordance play in interaction design?

Answer: Indicates how to use a feature or object.

What characteristic should a good user interface design demonstrate?

Answer: Make necessary tasks clear without distractions.

Which of the following best represents the idea that an event can feed back into itself?

Answer: A device displaying a message after a user action.

What is one key benefit of conducting performance testing during the formative stage of an application's lifecycle?

Answer: It helps uncover potential performance bottlenecks.

Which of the following metrics is primarily focused on measuring the responsiveness of an application?

Answer: Response Time

What purpose does load testing serve in performance evaluation?

Answer: To simulate high-traffic scenarios.

Which performance testing technique involves evaluating an application's behavior under sudden increases in user activity?

Answer: Spike Testing

What is a key goal of conducting performance testing during the formative stage of an application?

Answer: To reduce the risk of unexpected issues.

Resource utilization metrics are primarily used to monitor which of the following?

Answer: System resources like CPU and memory.

Which technique is primarily used to identify long-term issues such as memory leaks?

Answer: Endurance Testing

What is an essential first step in designing effective test scenarios for performance testing?

Answer: Identify user profiles.

What was the mean age of participants in the studies involving Painpad?

Answer: 64.6 years

How often were patients prompted to report their pain levels using Painpad?

Answer: Every two hours

What was the satisfaction rating for Painpad on the Likert scale?

Answer: 4.63

What were the participant demographics in terms of gender?

Answer: 13 males and 41 females

What types of data were collected from the Painpad users?

Answer: Satisfaction, compliance, and comparison with nurse data

What did the results indicate about patient compliance with the Painpad prompts?

Answer: Some patients liked it while others disliked the prompts

How many studies were conducted to evaluate Painpad?

Answer: Two studies

Why might long-term studies be beneficial for evaluating complex technical products?

Answer: Participants can get accustomed to the product and usage

    Study Notes

    Evaluation

    • Evaluation is integral to the design process.
    • It involves collecting and analyzing data about users' experiences.
    • The goal is to improve the design of artifacts.
    • Usability of the system is considered, along with user experience.

    Evaluation Goals

    • Usability Assessment
    • User Experience (UX) Evaluation
    • Identifying Design Issues
    • Performance Evaluation
    • Accessibility Assessment
    • Feedback Collection
    • Validation of Design Decisions
    • Risk Mitigation

    Types of Evaluation

    • Formative Evaluation: Conducted during the early stages of development; iterative throughout the development process, incorporating feedback into the design.
    • Summative Evaluation: Conducted after the product/system is developed and implemented; used to assess its effectiveness and impact.

    Evaluation Methods

    • User Evaluation: User testing, observing users' interactions to assess effectiveness.
    • Expert Evaluation: Heuristic evaluation or usability inspection; using expert evaluators to assess usability and user experience.

    Qualitative Approach

    • Qualitative evaluations use qualitative and naturalistic methods.
    • Methods include in-depth, open-ended interviews, direct observation, and written documents.

    Quantitative Approach

    • Quantitative evaluation is used to measure and analyze numerical data.
    • Methods include surveys and questionnaires, and one-to-one interviews.

    Mixed-Method Approach

    • Mixed methodology combines quantitative and qualitative data in a single study or series of studies.

    Evaluation Ethics

    • Strategies to protect the rights and dignity of participants are crucial.
    • Safeguards are needed for children or vulnerable populations.
    • Strategies include helping others, doing no harm, acting fairly, and respecting others.
    • Ethical issues like informed consent, confidentiality, ensuring safety, and feedback collection should be considered.

    DECIDE Framework

    • A framework to guide evaluation efforts.
    • Steps include:
      • Determining overall goals
      • Exploring specific questions to be answered
      • Choosing the evaluation paradigm and techniques
      • Identifying any practical issues (users, equipment, cost, time, etc.)
      • Addressing ethical issues
      • Evaluating, interpreting, and presenting data

    GOMS Framework

    • Developed by Card, Moran, and Newell in 1983.
    • A cognitive model for evaluating system performance.
    • Components include Goals, Operators, Methods, and Selection rules (GOMS).

    GOMS Family

    • Different versions of GOMS framework exist.
    • These include: Plain GOMS, KLM, NGOMSL, CPM-GOMS.
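The predictive flavor of these models can be illustrated with the Keystroke-Level Model (KLM), the simplest member of the family. The sketch below sums the commonly cited operator time estimates from Card, Moran, and Newell; the example task string is hypothetical, and a real analysis would calibrate these values for the actual users and system.

```python
# Keystroke-Level Model (KLM) sketch: predict execution time by summing
# standard operator estimates (in seconds). Values are the commonly
# cited Card, Moran & Newell figures.
KLM_TIMES = {
    "K": 0.28,  # keystroke or button press (average typist)
    "P": 1.10,  # point with the mouse to a target
    "H": 0.40,  # home hands between keyboard and mouse
    "M": 1.35,  # mental preparation
}

def predict_time(operators: str) -> float:
    """Sum operator times for a task encoded as a string, e.g. 'HMPK'."""
    return round(sum(KLM_TIMES[op] for op in operators), 2)

# Hypothetical task: home to mouse (H), think (M), point at icon (P),
# click (K) -> 0.40 + 1.35 + 1.10 + 0.28 = 3.13 s
print(predict_time("HMPK"))
```

NGOMSL estimates (such as the 2.42 s icon-deletion figure quoted above) are built in a similar spirit, but from method steps stated in structured natural language rather than from keystroke-level operators.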

    Norman's Cycle

    • Also known as the Seven Stages of Action.
    • A psychological model describing human interaction with computer systems.
    • Can be used to help evaluate the efficiency of a user interface.

    The Gulfs

    • Execution: The gap between the user's intentions and the actions the system makes available.
    • Evaluation: The gap between the system's presented state and the user's ability to perceive and interpret whether their goal has been met.

    Expert Review

    • Evaluators with expertise review the design.
    • Focus on adherence to heuristics.
    • Can be conducted early or late in the development process.
    • Requires guiding questions and training for the reviewers.
    • The goal is to identify usability concerns.

    Functional Testing

    • A crucial step in ensuring software quality.
    • Explores best practices for testing functionality across platforms.
    • Covers various types of testing (unit, component, integration, regression, and user acceptance testing).
    • Crucial for formative evaluation.

    Defining Functional Requirements

    • Description of a program's capabilities and behaviors.
    • Key requirements are user workflows and features, expected inputs/outputs/responses.
    • Ensuring requirements are measurable, testable, and aligned with users' goals.

    Functional Testing for Different Platforms (Web, Desktop, Mobile, and Tablet)

    • Detailed test plans for specific platform types, with specific examples of test components outlined.

    Challenges in Functional Testing Across Platforms

    • Handling device fragmentation, various user interactions, and effective automation.

    Best Practices for Effective Functional Testing

    • Prioritizing critical workflows, leveraging automation, and incorporating continuous integration into the testing process.
    • Best practices for continuous improvement for functional testing.
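As a minimal sketch of the automation practice above, the test below uses Python's built-in unittest to cover a critical workflow; the `validate_login` function and its rules are illustrative assumptions, not from the source.

```python
import unittest

def validate_login(username: str, password: str) -> bool:
    """Hypothetical workflow under test: both fields must be present
    and the password at least 8 characters long."""
    return bool(username) and len(password) >= 8

class LoginWorkflowTest(unittest.TestCase):
    # Critical-workflow cases first, per the prioritization advice above.
    def test_valid_credentials(self):
        self.assertTrue(validate_login("alice", "s3cretpass"))

    def test_short_password_rejected(self):
        self.assertFalse(validate_login("alice", "short"))

    def test_empty_username_rejected(self):
        self.assertFalse(validate_login("", "s3cretpass"))

if __name__ == "__main__":
    unittest.main(argv=["tests"], exit=False)
```

In a continuous-integration setup, a suite like this would run on every commit, which is how issues get caught early in the development process.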

    Usability (UX) Evaluation

    • Process to assess and understand how users interact with products or services.
    • Involves gathering data on behaviors, attitudes, and preferences.
    • Goal is to identify areas for improvement and enhance the user experience.

    Types of UX Evaluation Methods

    • Usability Testing
    • Heuristic Evaluation
    • Cognitive Walkthroughs
    • User Interviews
    • Surveys and Questionnaires
    • A/B Testing

    Usability Testing Methods (In-person vs. Remote)

    • Details on types of usability testing methods, and their strengths and weaknesses.
    • Tools and techniques used for both in-person and remote usability testing.

    Participant Recruitment and Screening

    • Strategies to ensure relevant and diverse participants are included in usability studies.
    • Processes for selecting and identifying qualified participants.

    Usability Evaluation Metrics (Measures for effectiveness, efficiency, satisfaction, and learnability.)

    • Types of measures for each metric, and how to use them in evaluation.
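As a sketch of how such measures might be computed from session data, the snippet below derives effectiveness, efficiency, and satisfaction from hypothetical usability-test records (all numbers are invented for illustration).

```python
# Hypothetical session records: (task_completed, seconds, satisfaction 1-5)
sessions = [
    (True, 42.0, 4), (True, 55.5, 5), (False, 90.0, 2),
    (True, 38.2, 4), (True, 61.0, 3),
]

# Effectiveness: share of tasks completed successfully.
effectiveness = sum(ok for ok, _, _ in sessions) / len(sessions)

# Efficiency: mean completion time over successful attempts only.
successful_times = [t for ok, t, _ in sessions if ok]
efficiency = sum(successful_times) / len(successful_times)

# Satisfaction: mean rating on the 1-5 scale.
satisfaction = sum(s for _, _, s in sessions) / len(sessions)

print(f"effectiveness={effectiveness:.0%}, "
      f"mean time={efficiency:.1f}s, satisfaction={satisfaction:.1f}")
```

Learnability is usually measured the same way but across repeated sessions, watching these numbers improve as users gain experience.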

    Data Collection and Analysis (Usability)

    • Methods for collecting usability data (observation, think-aloud protocols, and post-task interviews).
    • Techniques for interpreting qualitative and quantitative feedback.

    Reporting Findings and Recommendations (Usability)

    • Process for summarizing evaluation results to identify issues and recommend enhancements.

    Interaction Design - Evaluation Studies: From Controlled to Natural Settings

    • Overview of usability testing objectives and types of studies (controlled and wild settings).

    System Testing Evaluations

    • System testing's role in ensuring the overall functionality and integration of a complex software system.

    Defining System Testing

    • Comprehensive Evaluation: Evaluating the entire system
    • Integrated Approach: Considering end-to-end scenarios rather than isolated modules.
    • Validation of Requirements: Ensures requirements are met and addressed.

    Objectives of System Testing

    • Functional Verification: Verifying system functions.
    • Performance Optimization: Identifying and addressing performance bottlenecks.
    • Compliance Validation: Checking compliance with standards.

    Scope of System Testing in Formative Evaluation

    • End-to-End Functionality, Non-Functional requirements, Integration with External Systems, and Compliance and Regulations.

    Approaches to System Testing (Black-Box, White-Box, Agile)

    • Overview of different testing methods.

    Test Planning and Execution

    • Steps for planning and executing test cases.

    Analyzing System Test Results (Defect Identification, Performance Evaluation, and Acceptance Criteria)

    • Detail on the phases of analyzing test results.

    Incorporating Findings into Iterative Design

    • Feedback collection followed by design refinement, and repeated for continual development improvement.

    System Testing - Performance Testing

    • A crucial aspect of software development, ensuring applications handle expected user loads and provide a seamless user experience.
    • The key principles and techniques used for conducting performance evaluations during the formative stage of an application's lifecycle.

    Importance of Performance Testing in Formative Evaluation

    • How to identify bottlenecks during formative stages.
    • Optimizing user experience with testing and feedback.
    • Mitigating risk by identifying potential issues before launch.

    Key Performance Metrics to Measure

    • Response Time, Throughput, and Resource Utilization
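A minimal sketch of measuring the first two metrics in Python; the timed operation is a stand-in for a real request handler (an assumption, not from the source). Resource utilization would typically come from OS tooling or a monitoring library rather than from this simplified view.

```python
import time

def handle_request() -> None:
    """Stand-in for the operation under test."""
    time.sleep(0.001)  # simulate ~1 ms of work

N = 50
start = time.perf_counter()
latencies = []
for _ in range(N):
    t0 = time.perf_counter()
    handle_request()
    latencies.append(time.perf_counter() - t0)  # response time per request
elapsed = time.perf_counter() - start

avg_response = sum(latencies) / N  # mean response time (seconds)
throughput = N / elapsed           # requests completed per second
print(f"avg response {avg_response * 1000:.2f} ms, "
      f"throughput {throughput:.0f} req/s")
```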

    Selecting Appropriate Testing Techniques (Load Testing, Stress Testing, Endurance Testing, Spike Testing)

    • Techniques to choose and use for specific conditions.
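As a sketch of the load-testing idea, the snippet below uses a thread pool to fire concurrent simulated requests; the target function, its latency, and the user count are assumptions for illustration. A spike test would follow the same shape but ramp the worker count up abruptly.

```python
import time
from concurrent.futures import ThreadPoolExecutor

def simulated_request(user_id: int) -> float:
    """Stand-in for one user's request; returns its latency in seconds."""
    t0 = time.perf_counter()
    time.sleep(0.002)  # pretend the server takes ~2 ms to respond
    return time.perf_counter() - t0

# Load test: 20 concurrent simulated users, as in a high-traffic scenario.
with ThreadPoolExecutor(max_workers=20) as pool:
    latencies = list(pool.map(simulated_request, range(20)))

print(f"max latency under load: {max(latencies) * 1000:.1f} ms")
```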

    Designing Effective Test Scenarios

    • Identifying user profiles, defining user journeys, and incorporating realistic data.

    Conducting Performance Tests in an Interactive Environment

    • Steps in conducting performance testing.

    Analyzing and Interpreting Test Results

    • Data aggregation, performance analytics, and reporting to stakeholders.

    Incorporating Findings into Application Improvements

    • Prioritizing findings, implementing optimizations, validating improvements and iterative processes.

    System Testing - Exception Handling

    • Exception handling as a critical process in formative system testing.

    Importance of Exception Handling During Formative Evaluation

    • Identifying vulnerabilities during early testing.
    • Validating resilience of the system, and improving user experience.

    Identifying Potential Exceptions in the System

    • Identifying potential system risks and vulnerabilities from various perspectives: input validation, resource constraints, and dependencies.

    Designing Test Cases for Exception Scenarios

    • Strategies, including boundary condition, simulated failures and error propagation testing for exception scenarios.
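As a sketch of boundary-condition testing for an exception scenario, the example below probes a hypothetical `withdraw` function exactly at and just outside its limits, checking that exceptions propagate as designed (function and rules are illustrative, not from the source).

```python
def withdraw(balance: float, amount: float) -> float:
    """Hypothetical operation under test: raises on invalid input."""
    if amount <= 0:
        raise ValueError("amount must be positive")
    if amount > balance:
        raise ValueError("insufficient funds")
    return balance - amount

# Boundary-condition cases: exactly at the limit, just outside it,
# and at the zero boundary.
assert withdraw(100.0, 100.0) == 0.0  # exactly at the limit: allowed

try:
    withdraw(100.0, 100.01)           # just over the limit
    raise AssertionError("expected ValueError")
except ValueError as e:
    assert "insufficient" in str(e)

try:
    withdraw(100.0, 0)                # zero is out of range
    raise AssertionError("expected ValueError")
except ValueError as e:
    assert "positive" in str(e)
```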

    Implementing Exception Handling Mechanisms

    • Strategies for implementing and testing mechanisms for fault tolerance and exception handling.

    Verifying the Effectiveness of Exception Handling

    • Methods and metrics used to validate exception handling.

    Analyzing and Reporting Exception Handling Outcomes

    • Analyzing the collected data, and the process of reporting results and solutions.

    Continuous Improvement of Exception Handling Strategies

    • Steps in implementing continuous improvement processes.

    System Testing - Security Testing

    • Overview of security testing in software development.

    Importance of Security Testing in Formative Evaluation

    • Early detection of security vulnerabilities.
    • A proactive strategy implemented in the early testing phases of the product.
    • Continuous improvement to ensure security practices in development cycles.

    Common Security Vulnerabilities (Injection Flaws, Broken Authentication, Cross-Site Scripting, and Sensitive Data Exposure)

    • Descriptions and examples of common vulnerability types.
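As a sketch of the first of these classes, the snippet below shows an SQL injection attempt against an in-memory SQLite table, and the parameterized query that defeats it (the table and data are illustrative).

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, secret TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

attack = "nobody' OR '1'='1"

# Vulnerable: string interpolation lets the payload rewrite the query,
# so every row is returned despite the bogus name.
leaked = conn.execute(
    f"SELECT * FROM users WHERE name = '{attack}'").fetchall()

# Safe: a parameterized query treats the payload as a literal value.
safe = conn.execute(
    "SELECT * FROM users WHERE name = ?", (attack,)).fetchall()

print(len(leaked), len(safe))  # the vulnerable query leaks a row; the safe one finds none
```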

    Threat Modeling and Risk Assessment

    • Processes for identifying and analyzing potential threats, and prioritizing security efforts.
    • Approach to making security decisions backed with structured approaches.

    Security Testing Methodologies (Reconnaissance, Vulnerability Identification, Exploitation, and Reporting)

    • Processes in security testing from planning to reporting.

    Automated Security Testing Tools (Burp Suite, OWASP ZAP, Nmap, SQLmap)

    • Software tools for automated security testing.

    Integrating Security Testing into the Development Lifecycle

    • Strategies for embedding security testing throughout the software development lifecycle.

    Reporting and Remediation of Security Vulnerabilities

    • Process for documenting security issues and recommending solutions.

    Introduction to Usability Evaluation

    • Focus on user-friendliness and effectiveness.

    Defining Usability

    • Overview of usability and its key aspects (effectiveness, efficiency, satisfaction, and learnability).

    Importance of Usability Evaluation

    • Key benefits of conducting usability evaluation.

    Usability Techniques (User Evaluation, and Expert Evaluation)

    • Methods overview of analyzing product/application quality.

    Types of Usability Evaluation

    • Overview of different usability evaluation strategies.

    Usability Testing Methods (Observation, Think-Aloud Protocol, Interviews, and Remote Usability Testing)

    • Specific usability techniques and approaches.

    Participant Recruitment and Screening

    • Specific techniques used to gather relevant user data.

    Usability Evaluation Metrics

    • Different measures of usability, including effectiveness, efficiency, satisfaction, and learnability.

    Data Collection and Analysis (Usability)

    • Different methods for data collection and analysis during usability evaluation

    Reporting Findings and Recommendations (Usability)

    • Strategies for summarizing evaluation and recommending design enhancements.
