SCCVK2073 EVALUATION OF INTERACTIVE APPLICATION

Questions and Answers

What does a selection rule primarily utilize to achieve a goal?

  • Peer suggestions for methods
  • The user's knowledge of the best method (correct)
  • Trial and error from previous tasks
  • The user's experience with technology

What is an example of a condition in a selection rule?

  • IF the word to highlight is less than five characters (correct)
  • Parents take the money from the wallet
  • USE the mouse drag method
  • GET money from an ATM

What does GOMS primarily provide for performance analysis?

  • A survey of user preferences
  • Qualitative data on user experience
  • A framework for content creation
  • Predictive models of performance time and learning (correct)

Which of the following is a limitation of the GOMS model?

  Answer: It assumes a certain level of skill in users.

What does NGOMSL allow for in task representation?

  Answer: Flexible representation using human language.

Which method is NOT listed as a way to get money in the sub-goals?

  Answer: Request a loan from a bank.

What is the estimated total time to delete an icon as represented in NGOMSL?

  Answer: 2.42 seconds.

What might the GOMS model overlook in its analysis?

  Answer: Long-term memory recall and learning time.

What is a key aspect of an evaluation process?

  Answer: Involving appropriate users in the evaluation.

What should be considered when selecting techniques for an evaluation?

  Answer: Practical and ethical issues, including cost and time.

Which factor is NOT mentioned as a practical issue to consider in an evaluation?

  Answer: Quality of the product.

How can sub-questions enhance an evaluation process?

  Answer: By breaking down problems into more specific queries.

What could be a consequence of not having the necessary expertise in the evaluation team?

  Answer: They may find it impractical to base evaluations on complex models.

When planning an evaluation, what is one of the major constraints to consider?

  Answer: Schedule and budget constraints.

What is a potential challenge when using video in an evaluation?

  Answer: Choosing how to record and position cameras effectively.

Why might an evaluation team need to adapt techniques during the evaluation process?

  Answer: To address resource availability and project needs.

What should be prioritized when conducting functional testing for tablet devices?

  Answer: Focusing on critical workflows that impact core functionality.

Which of the following should NOT be considered when optimizing battery life for tablet applications?

  Answer: Reducing the app's visual appeal.

Why can device fragmentation be a challenge for functional testing?

  Answer: There is a wide variety of devices with different specifications.

What role does automation play in functional testing?

  Answer: It helps streamline the testing process and ensure consistency.

What is a recommended strategy for improving functional testing efficiency?

  Answer: Regularly reviewing and updating test cases.

What are the initial steps required to record a television show accurately on a VCR?

  Answers: Specify the time of recording and save the settings; press the record button and select the channel.

Which of the following best describes the benefits of continuous integration in functional testing?

  Answer: It catches issues early in the development process.

What is the primary purpose of the human action cycle in user interface design?

  Answer: To describe the steps taken when interacting with systems.

In integrating automated testing frameworks, which challenge is mentioned?

  Answer: It may require significant effort and expertise.

How does feedback influence user actions according to the principles discussed?

  Answer: It helps users anticipate outcomes based on past events.

Which statement exemplifies a practice for fostering a culture of continuous improvement in testing?

  Answer: Encourage collaboration with cross-functional teams.

Which principle focuses on making options visible without overwhelming the user?

  Answer: Visibility.

What is a gulf of evaluation as discussed in the context of user interaction?

  Answer: A disconnect between user expectations and system feedback.

What role does affordance play in interaction design?

  Answer: It indicates how to use a feature or object.

What characteristic should a good user interface design demonstrate?

  Answer: Making necessary tasks clear without distractions.

Which of the following best represents the idea that an event can feedback into itself?

  Answer: A device displaying a message after a user action.

What is one key benefit of conducting performance testing during the formative stage of an application's lifecycle?

  Answer: It helps uncover potential performance bottlenecks.

Which of the following metrics is primarily focused on measuring the responsiveness of an application?

  Answer: Response Time.

What purpose does load testing serve in performance evaluation?

  Answer: To simulate high-traffic scenarios.

Which performance testing technique involves evaluating an application's behavior under sudden increases in user activity?

  Answer: Spike Testing.

What is a key goal of conducting performance testing during the formative stage of an application?

  Answer: To reduce the risk of unexpected issues.

Resource utilization metrics are primarily used to monitor which of the following?

  Answer: System resources like CPU and memory.

Which technique is primarily used to identify long-term issues such as memory leaks?

  Answer: Endurance Testing.

What is an essential first step in designing effective test scenarios for performance testing?

  Answer: Identify user profiles.

What was the mean age of participants in the studies involving Painpad?

  Answer: 64.6 years.

How often were patients prompted to report their pain levels using Painpad?

  Answer: Every two hours.

What was the satisfaction rating for Painpad on the Likert scale?

  Answer: 4.63.

What were the participant demographics in terms of gender?

  Answer: 13 males and 41 females.

What types of data were collected from the Painpad users?

  Answer: Satisfaction, compliance, and comparison with nurse data.

What did the results indicate about patient compliance with the Painpad prompts?

  Answer: Some patients liked it while others disliked the prompts.

How many studies were conducted to evaluate Painpad?

  Answer: Two studies.

Why might long-term studies be beneficial for evaluating complex technical products?

  Answer: Participants can get accustomed to the product and its usage.

Flashcards

Sub-questioning

Breaking down complex questions into smaller, more specific questions to gain deeper insights.

Evaluation Paradigm

The overall approach or framework used to guide the evaluation process.

Evaluation Techniques

Specific methods and tools chosen to gather data and answer the evaluation's questions.

Trade-offs

Making choices and compromises based on practical considerations like cost, time, and available resources.

Practical Issues

Real-world constraints like cost, time, equipment, expertise, and user availability that affect the evaluation process.

User Screening

Making sure the users involved in an evaluation represent the target audience for the product or system.

Facilities and Equipment

The resources and conditions needed to conduct an evaluation, including equipment, space, and scheduling.

Evaluating Expertise

Ensuring the evaluation team has the necessary knowledge and skills to conduct the evaluation.

Selection rule in user interface

A selection rule is a conditional statement that determines the best method for achieving a goal, based on the user's knowledge and the specific situation.

GOMS model

The GOMS model (Goals, Operators, Methods, and Selection Rules) is a predictive model used to estimate the time it takes to perform a task involving user interaction with a system.

Applications of GOMS Model

The GOMS model is often used to compare different systems and make informed decisions about which one to choose based on factors like performance, learning time, and overall efficiency.

NGOMSL

NGOMSL, or Natural GOMS Language, allows for a more flexible representation of tasks using a human language approach, adding natural language descriptions to the GOMS model.

Limitations of GOMS

A critical limitation of the GOMS model is that it assumes a user has a certain level of proficiency with the system. It fails to account for the time required for learning and remembering how to use a system, especially after periods of disuse.

GOMS model ignores human error

The GOMS model, by focusing on optimal performance, does not take into account human error, which plays a role even for skilled users.

Gulf of Execution

The gap between what a user wants to do and the actions the system makes available for doing it.

Gulf of Evaluation

The gap between what a user expects to happen after an action and the actual feedback received from the system.

Affordance

The potential for interaction between a person and an object, based on the object's physical properties.

Feedback

Information provided to the user about the result of their actions, helping them understand the state of the system.

Visibility

Making all necessary options and materials for a task clear and accessible to the user without distracting information.

Tolerance

The flexibility of a system to handle user errors and allow them to undo mistakes easily.

Human Action Cycle

A psychological model that describes the steps users take when interacting with computer systems, emphasizing the roles of execution, evaluation, and feedback.

User Guidance

The ability of the system to guide the user through the execution process of their goals, reducing confusion and errors.

Screen Size Optimization

Testing how well the application's layout, navigation, and content display adapt to the larger screen size of tablets.

Pen-based Interactions

Testing the accuracy and responsiveness of stylus input, including handwriting recognition and annotation features.

Battery Life Considerations

Assessing the application's power consumption and optimizing for extended battery life on tablet devices.

Device Fragmentation

The challenge of ensuring consistent functionality across diverse desktop, mobile, and tablet devices.

User Interactions

Ensuring consistent functionality and user experience across various input methods, such as mouse, keyboard, touch, and stylus.

Automation Integration

Integrating automated testing frameworks with different platforms and technologies to improve efficiency.

Prioritize Critical Workflows

Focusing on the most important user scenarios and test cases that have the highest impact on the application's core functionality.

Leverage Automation

Implementing a comprehensive test automation strategy to streamline the testing process and ensure consistent results across platforms.

Formative Performance Testing

Performance evaluations during the formative stage of an application's lifecycle, helping identify potential bottlenecks and address them early in the development process.

Load Testing

It helps pinpoint areas where the application struggles to handle high user activity, revealing potential scaling issues and performance bottlenecks.

Stress Testing

This method pushes the application beyond its normal operating capacity to identify its breaking point, revealing the app's resilience and helping optimize its limits.

Spike Testing

It focuses on how the application handles a sudden, significant increase in user activity or data volume, identifying possible performance issues and vulnerabilities.

Endurance Testing

Evaluating the application's ability to maintain consistent performance over extended periods, revealing potential memory leaks or other long-term issues.

Response Time in Performance Testing

This metric measures the time an application takes to respond to user requests, a key indicator of a smooth and responsive user experience.

Throughput in Performance Testing

It helps evaluate the application's capability to manage a large number of concurrent users and transactions. A high throughput indicates an efficient application.

Resource Utilization in Performance Testing

Monitoring the application's consumption of system resources such as CPU, memory, and network. It reveals areas where optimization can be made for better performance.

Painpad

A medical device designed to help hospitalized patients track and self-report their pain levels.

Painpad Study Participants

The two Painpad studies involved 54 participants (13 male, 41 female), with attention to participant privacy during data collection.

Painpad Reporting Frequency

Patients were asked to report their pain levels every two hours using the Painpad device.

Painpad Study Goals

The study aimed to measure patient satisfaction with the Painpad, how often they used it, and compare their pain reports to those recorded by nurses.

Painpad User Satisfaction

Patients reported a high level of satisfaction with the Painpad, scoring it an average of 4.63 on a 5-point scale.

Painpad Compliance

Some patients consistently reported pain levels through the Painpad, while others used it inconsistently or didn't notice the prompts.

Painpad vs. Nurse Records

Patients reported more pain scores using the Painpad compared to nurses' records, indicating potentially increased pain monitoring.

Long-Term User Studies

Studies that give participants products for an extended time to understand their use in real-world settings.

Study Notes

Evaluation

  • Evaluation is integral to the design process.
  • It involves collecting and analyzing data about users' experiences.
  • The goal is to improve the design of artifacts.
  • Usability of the system is considered, along with user experience.

Evaluation Goals

  • Usability Assessment
  • User Experience (UX) Evaluation
  • Identifying Design Issues
  • Performance Evaluation
  • Accessibility Assessment
  • Feedback Collection
  • Validation of Design Decisions
  • Risk Mitigation

Types of Evaluation

  • Formative Evaluation: Conducted during the early stages of development; iterative throughout the development process, incorporating feedback into the design.
  • Summative Evaluation: Conducted after the product/system is developed and implemented; used to assess its effectiveness and impact.

Evaluation Methods

  • User Evaluation: User testing, observing users' interactions to assess effectiveness.
  • Expert Evaluation: Heuristic evaluation or usability inspection; using expert evaluators to assess usability and user experience.

Qualitative Approach

  • Qualitative evaluations use qualitative and naturalistic methods.
  • Methods include in-depth, open-ended interviews, direct observation, and written documents.

Quantitative Approach

  • Quantitative evaluation is used to measure and analyze numerical data.
  • Methods include surveys and questionnaires, and one-to-one interviews.

Mixed-Method Approach

  • Mixed methodology combines quantitative and qualitative data in a single study or series of studies.

Evaluation Ethics

  • Strategies to protect the rights and dignity of participants are crucial.
  • Safeguards are needed for children or vulnerable populations.
  • Strategies include helping others, doing no harm, acting fairly, and respecting others.
  • Ethical issues like informed consent, confidentiality, ensuring safety, and feedback collection should be considered.

DECIDE Framework

  • A framework to guide evaluation efforts.
  • Steps include:
    • Determining overall goals
    • Exploring specific questions to be answered
    • Choosing the evaluation paradigm and techniques
    • Identifying any practical issues (users, equipment, cost, time, etc.)
    • Addressing ethical issues
    • Evaluating, interpreting, and presenting data

GOMS Framework

  • Developed by Card, Moran, and Newell in 1983.
  • A cognitive model for evaluating system performance.
  • Components include Goals, Operators, Methods, and Selection rules (GOMS).

GOMS Family

  • Different versions of GOMS framework exist.
  • These include Plain GOMS, KLM (the Keystroke-Level Model), NGOMSL, and CPM-GOMS; a worked KLM sketch follows below.
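To make the KLM member of this family concrete, here is a minimal sketch of how a keystroke-level estimate is computed. The operator times are the commonly cited Card, Moran, and Newell values; the operator sequence for "delete an icon" is an assumption for illustration, not the lesson's own breakdown (which yields the 2.42-second NGOMSL estimate quoted in the quiz above).

```python
# Hedged KLM sketch: operator times are the commonly cited values;
# the task breakdown below is assumed, not taken from the lesson.

KLM_TIMES = {
    "K": 0.28,  # press a key or button (average typist)
    "P": 1.10,  # point at a target with a mouse
    "H": 0.40,  # home hands between keyboard and mouse
    "M": 1.35,  # mental preparation
}

def klm_estimate(operators: str) -> float:
    """Sum the operator times for a sequence such as 'MPKK'."""
    return sum(KLM_TIMES[op] for op in operators)

# Assumed sequence for deleting an icon: think, point at the icon,
# click it, then press the Delete key.
print(f"{klm_estimate('MPKK'):.2f} s")  # 3.01 s under these assumptions
```

NGOMSL estimates are built in the same additive way, with NGOMSL additionally charging a small constant for each method statement executed.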

Norman's Cycle

  • Also known as the Seven Stages of Action.
  • A psychological model describing human interaction with computer systems.
  • Can be used to help evaluate the efficiency of a user interface.

The Gulfs

  • Execution: Distance between the user's intention and the available system actions.
  • Evaluation: The gap between what the user expects to happen after an action and the feedback the system actually provides.

Expert Review

  • Evaluators with expertise review the design.
  • Focus on adherence to heuristics.
  • Can be conducted early or late in the development process.
  • Requires guiding questions and training for the reviewers.
  • The goal is to identify usability concerns.

Functional Testing

  • A crucial step in ensuring software quality.
  • Explores best practices for testing functionality across platforms.
  • Covers various types of testing (unit, component, integration, regression, and user acceptance testing).
  • Crucial for formative evaluation.

Defining Functional Requirements

  • Description of a program's capabilities and behaviors.
  • Key requirements are user workflows and features, expected inputs/outputs/responses.
  • Ensuring requirements are measurable, testable, and aligned with users' goals; the sketch below restates one requirement as executable tests.
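As one way to see "measurable and testable" in practice, the sketch below restates a hypothetical login requirement as executable pytest checks; the login() function and its responses are stand-ins invented for illustration.

```python
# Hypothetical requirement: "a valid user can log in; an invalid
# password is rejected", restated as executable checks.

def login(username: str, password: str) -> str:
    """Stand-in implementation so the tests below are runnable."""
    return "OK" if (username, password) == ("alice", "s3cret") else "DENIED"

def test_valid_credentials_accepted():
    assert login("alice", "s3cret") == "OK"

def test_invalid_password_rejected():
    assert login("alice", "wrong") == "DENIED"
```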

Functional Testing for Different Platforms (Web, Desktop, Mobile, and Tablet)

  • Detailed test plans for specific platform types, with specific examples of test components outlined.

Challenges in Functional Testing Across Platforms

  • Handling device fragmentation, various user interactions, and effective automation; see the parametrized-test sketch below.
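One common tactic for device fragmentation, sketched below, is to parametrize a single test body across several assumed device profiles; the profiles and the render_layout() helper are hypothetical stand-ins for the application's real layout logic.

```python
import pytest

# Assumed device profiles: (name, width, height) in CSS pixels.
DEVICE_PROFILES = [
    ("phone", 390, 844),
    ("tablet", 820, 1180),
    ("desktop", 1920, 1080),
]

def render_layout(width: int, height: int) -> dict:
    """Hypothetical stand-in for the application's layout engine."""
    return {"columns": 1 if width < 600 else 2, "width": width}

@pytest.mark.parametrize("name,width,height", DEVICE_PROFILES)
def test_layout_fits_viewport(name, width, height):
    layout = render_layout(width, height)
    assert layout["width"] <= width  # content must not overflow
    assert layout["columns"] >= 1
```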

Best Practices for Effective Functional Testing

  • Prioritizing critical workflows, leveraging automation, and incorporating continuous integration into the testing process.
  • Best practices for continuous improvement of functional testing.

Usability (UX) Evaluation

  • Process to assess and understand how users interact with products or services.
  • Involves gathering data on behaviors, attitudes, and preferences.
  • Goal is to identify areas for improvement and enhance the user experience.

Types of UX Evaluation Methods

  • Usability Testing
  • Heuristic Evaluation
  • Cognitive Walkthroughs
  • User Interviews
  • Surveys and Questionnaires
  • A/B Testing (a minimal significance check is sketched below)
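As a concrete taste of the last method, the sketch below applies a two-proportion z-test to made-up A/B conversion counts; real studies would normally use a statistics library, and every number here is illustrative.

```python
from math import sqrt, erf

def two_proportion_z(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """z statistic for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Made-up data: variant A converts 120/1000, variant B 150/1000.
z = two_proportion_z(120, 1000, 150, 1000)
p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # two-sided
print(f"z = {z:.2f}, p = {p_value:.3f}")  # ~1.96, ~0.05
```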

Usability Testing Methods (In-person vs. Remote)

  • Details on types of usability testing methods, and their strengths and weaknesses.
  • Tools and techniques used for both in-person and remote usability testing.

Participant Recruitment and Screening

  • Strategies to ensure relevant and diverse participants are included in usability studies.
  • Processes for selecting and identifying qualified participants.

Usability Evaluation Metrics (Measures for effectiveness, efficiency, satisfaction, and learnability.)

  • Types of measures for each metric, and how to use them in evaluation; a small scoring sketch follows below.
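For illustration only, here is a minimal sketch computing three of these measures from invented session records; the field names and values are assumptions. Learnability would typically be assessed by tracking the same measures across repeated sessions.

```python
# Invented session records; fields are assumptions for illustration.
sessions = [
    {"completed": True,  "time_s": 42, "satisfaction": 5},
    {"completed": True,  "time_s": 55, "satisfaction": 4},
    {"completed": False, "time_s": 90, "satisfaction": 2},
]

n = len(sessions)
effectiveness = sum(s["completed"] for s in sessions) / n    # task success rate
efficiency = sum(s["time_s"] for s in sessions) / n          # mean time on task
satisfaction = sum(s["satisfaction"] for s in sessions) / n  # mean rating

print(f"success rate: {effectiveness:.0%}")
print(f"mean time on task: {efficiency:.1f} s")
print(f"mean satisfaction: {satisfaction:.2f} / 5")
```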

Data Collection and Analysis (Usability)

  • Methods for collecting usability data (observation, think-aloud protocols, and post-task interviews).
  • Techniques for interpreting qualitative and quantitative feedback.

Reporting Findings and Recommendations (Usability)

  • Process for summarizing evaluation results to identify issues and recommend enhancements.

Interaction Design - Evaluation Studies: From Controlled to Natural Settings

  • Overview of usability testing objectives and types of studies (controlled and in-the-wild settings).

System Testing Evaluations

  • System testing's role in ensuring the overall functionality and integration of a complex software system.

Defining System Testing

  • Comprehensive Evaluation: Evaluating the entire system
  • Integrated Approach: Considering end-to-end scenarios rather than isolated modules.
  • Validation of Requirements: Ensures requirements are met and addressed.

Objectives of System Testing

  • Functional Verification: Verifying system functions.
  • Performance Optimization: Identifying and addressing performance bottlenecks.
  • Compliance Validation: Checking compliance with standards.

Scope of System Testing in Formative Evaluation

  • End-to-End Functionality, Non-Functional requirements, Integration with External Systems, and Compliance and Regulations.

Approaches to System Testing (Black-Box, White-Box, Agile)

  • Overview of different testing methods.

Test Planning and Execution

  • Steps for planning and executing test cases.

Analyzing System Test Results (Defect Identification, Performance Evaluation, and Acceptance Criteria)

  • Details on the phases of analyzing test results.

Incorporating Findings into Iterative Design

  • Feedback collection followed by design refinement, repeated for continual improvement of the development process.

System Testing - Performance Testing

  • A crucial aspect of software development, ensuring applications handle expected user loads and provide a seamless user experience.
  • Covers the key principles and techniques used for conducting performance evaluations during the formative stage of an application's lifecycle.

Importance of Performance Testing in Formative Evaluation

  • How to identify bottlenecks during formative stages.
  • Optimizing user experience with testing and feedback.
  • Mitigating risk by identifying potential issues before launch.

Key Performance Metrics to Measure

  • Response Time, Throughput, and Resource Utilization; see the measurement sketch below.
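A minimal sketch of measuring the first two metrics against an assumed local endpoint follows; the URL and request count are placeholders, and resource utilization would normally be sampled separately with OS-level tools.

```python
import time
import urllib.request

URL = "http://localhost:8000/health"  # assumed test endpoint
N = 50

latencies = []
start = time.perf_counter()
for _ in range(N):
    t0 = time.perf_counter()
    urllib.request.urlopen(URL).read()
    latencies.append(time.perf_counter() - t0)
elapsed = time.perf_counter() - start

print(f"mean response time: {sum(latencies) / N * 1000:.1f} ms")
print(f"throughput: {N / elapsed:.1f} requests/s")
```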

Selecting Appropriate Testing Techniques (Load Testing, Stress Testing, Endurance Testing, Spike Testing)

  • Matching each technique to the load conditions it is meant to probe; a minimal load-test sketch follows below.
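The sketch below approximates a small load test by firing concurrent requests at an assumed endpoint. Dedicated tools such as JMeter or Locust would normally be used; the URL and user counts are placeholders.

```python
from concurrent.futures import ThreadPoolExecutor
import time
import urllib.request

URL = "http://localhost:8000/"  # assumed endpoint under test
CONCURRENT_USERS = 20
REQUESTS_PER_USER = 10

def user_session(_: int) -> int:
    """Simulate one user issuing a burst of requests; return successes."""
    ok = 0
    for _ in range(REQUESTS_PER_USER):
        try:
            urllib.request.urlopen(URL, timeout=5)
            ok += 1
        except OSError:
            pass  # failed or timed-out request
    return ok

start = time.perf_counter()
with ThreadPoolExecutor(max_workers=CONCURRENT_USERS) as pool:
    results = list(pool.map(user_session, range(CONCURRENT_USERS)))
elapsed = time.perf_counter() - start

total = CONCURRENT_USERS * REQUESTS_PER_USER
print(f"{sum(results)}/{total} requests succeeded in {elapsed:.1f} s")
```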

Designing Effective Test Scenarios

  • Identifying user profiles, defining user journeys, and incorporating realistic data.

Conducting Performance Tests in an Interactive Environment

  • Steps in conducting performance testing.

Analyzing and Interpreting Test Results

  • Data aggregation, performance analytics, and reporting to stakeholders.

Incorporating Findings into Application Improvements

  • Prioritizing findings, implementing optimizations, validating improvements, and iterating on the process.

System Testing - Exception Handling

  • Exception handling as a critical process in formative system testing.

Importance of Exception Handling During Formative Evaluation

  • Identifying vulnerabilities during early testing.
  • Validating resilience of the system, and improving user experience.

Identifying Potential Exceptions in the System

  • Identifying potential system risks and vulnerabilities from various perspectives: input validation, resource constraints, and dependencies.

Designing Test Cases for Exception Scenarios

  • Strategies including boundary-condition testing, simulated failures, and error-propagation testing for exception scenarios; see the test sketch below.
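A hedged sketch of boundary-condition and simulated-failure cases follows; the withdraw() function is a stand-in defined inline so the pytest cases run.

```python
import pytest

def withdraw(balance: float, amount: float) -> float:
    """Stand-in operation with explicit failure modes."""
    if amount <= 0:
        raise ValueError("amount must be positive")
    if amount > balance:
        raise ValueError("insufficient funds")
    return balance - amount

def test_zero_amount_is_rejected():  # boundary condition
    with pytest.raises(ValueError):
        withdraw(100.0, 0)

def test_overdraft_is_rejected():  # simulated failure path
    with pytest.raises(ValueError, match="insufficient"):
        withdraw(100.0, 500.0)
```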

Implementing Exception Handling Mechanisms

  • Strategies for implementing and testing mechanisms for fault tolerance and exception handling.

Verifying the Effectiveness of Exception Handling

  • Methods and metrics used to validate exception handling.

Analyzing and Reporting Exception Handling Outcomes

  • Analyzing the collected data, and the process of reporting results and solutions.

Continuous Improvement of Exception Handling Strategies

  • Steps in implementing continuous improvement processes.

System Testing - Security Testing

  • Overview of security testing in software development.

Importance of Security Testing in Formative Evaluation

  • Early detection of security vulnerabilities.
  • A proactive strategy implemented in the early testing phases of the product.
  • Continuous improvement to ensure security practices in development cycles.

Common Security Vulnerabilities (Injection Flaws, Broken Authentication, Cross-Site Scripting, and Sensitive Data Exposure)

  • Descriptions and examples of common vulnerability types; an injection-flaw illustration follows below.
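To illustrate the first vulnerability type, the sketch below shows an injection flaw and its parameterized-query fix using Python's standard sqlite3 module; the schema and data are made up.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, is_admin INTEGER)")
conn.execute("INSERT INTO users VALUES ('alice', 0)")

user_input = "alice' OR '1'='1"  # a classic injection payload

# VULNERABLE: user input concatenated into the SQL string.
rows = conn.execute(
    f"SELECT * FROM users WHERE name = '{user_input}'"
).fetchall()
print("vulnerable query returned:", rows)  # matches every row

# SAFE: a parameterized query treats the input as data, not SQL.
rows = conn.execute(
    "SELECT * FROM users WHERE name = ?", (user_input,)
).fetchall()
print("parameterized query returned:", rows)  # no match
```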

Threat Modeling and Risk Assessment

  • Processes for identifying and analyzing potential threats, and prioritizing security efforts.
  • Making security decisions backed by structured approaches.

Security Testing Methodologies (Reconnaissance, Vulnerability Identification, Exploitation, and Reporting)

  • Processes in security testing from planning to reporting.

Automated Security Testing Tools (Burp Suite, OWASP ZAP, Nmap, SQLmap)

  • Software tools for automated security testing.

Integrating Security Testing into the Development Lifecycle

  • Strategies for embedding security testing throughout the software development lifecycle.

Reporting and Remediation of Security Vulnerabilities

  • Process for documenting security issues and recommending solutions.

Introduction to Usability Evaluation

  • Focus on user-friendliness and effectiveness.

Defining Usability

  • Overview of usability and its key aspects (effectiveness, efficiency, satisfaction, and learnability).

Importance of Usability Evaluation

  • Key benefits of conducting usability evaluation.

Usability Techniques (User Evaluation, and Expert Evaluation)

  • Overview of methods for analyzing product/application quality.

Types of Usability Evaluation

  • Overview of different usability evaluation strategies.

Usability Testing Methods (Observation, Think-Aloud Protocol, Interviews, and Remote Usability Testing)

  • Specific usability techniques and approaches.

Participant Recruitment and Screening

  • Specific techniques used to gather relevant user data.

Usability Evaluation Metrics

  • Different measures of usability, including effectiveness, efficiency, satisfaction, and learnability.

Data Collection and Analysis (Usability)

  • Different methods for data collection and analysis during usability evaluation.

Reporting Findings and Recommendations (Usability)

  • Strategies for summarizing evaluation and recommending design enhancements.
