Human-Computer Interaction Lecture Notes PDF

Document Details


Uploaded by StylishSpessartine

University of Science and Technology

Dania Mohamed Ahmed

Tags

human-computer interaction, interaction design, user experience, computer science

Summary

This document provides lecture notes on human-computer interaction (HCI). It covers topics like interaction models, key terms, and different interaction styles. These notes are useful for understanding how users interact with computer systems.

Full Transcript


The Interaction
Lec (4) — Dania Mohamed Ahmed

Interaction between a user and a computer system involves the exchange of information and commands. It encompasses how users convey their needs and instructions to the system, and how the system provides feedback or performs actions in response. This can range from basic command-line input to more complex and immersive experiences, such as those found in virtual reality.

Importance of interaction:
- Usability: good interaction design enhances the usability of a system, making it easier and more efficient for users to achieve their goals.
- User experience: effective interaction contributes to a positive user experience, increasing satisfaction and productivity.
- Accessibility: different interaction methods can make systems more accessible to users with various abilities and preferences.

Models of Interaction

Interaction models help us understand and address the complexities of user-system communication. They illustrate how users and systems interact, bridging gaps and highlighting potential issues in this exchange.

Norman's execution-evaluation cycle is a foundational model of interaction. It describes the process by which users form goals, plan actions, execute them, and then evaluate the outcomes. The cycle helps identify where communication breakdowns can occur and emphasizes the importance of aligning system responses with user expectations.

Extended models build on Norman's cycle, adding layers to the basic framework. These models often incorporate additional elements, such as context or feedback mechanisms, to provide a more nuanced understanding of interaction dynamics. Both kinds of model focus on user goals and actions, helping to analyze and improve how users interact with systems by examining the terminology and assumptions underlying these frameworks.

Some terms of interaction:
- domain: the area of work under study, e.g. graphic design
- goal: what you want to achieve, e.g. create a solid red triangle
- task: how you go about doing it, ultimately in terms of operations or actions, e.g. select the fill tool, click over the triangle

Donald Norman's Model

Norman's interaction model is perhaps the most influential in human-computer interaction, possibly because of its closeness to our intuitive understanding of the interaction between human user and computer. The user formulates a plan of action, which is then executed at the computer interface. When the plan, or part of the plan, has been executed, the user observes the computer interface to evaluate the result and determine further actions.

The interactive cycle can be divided into two major phases, execution and evaluation, which can in turn be subdivided into seven stages in all:
1. Establishing the goal.
2. Forming the intention.
3. Specifying the action sequence.
4. Executing the action.
5. Perceiving the system state.
6. Interpreting the system state.
7. Evaluating the system state with respect to the goals and intentions.

Execution phase:
1. Forming the goal: the user decides what they want to achieve.
2. Forming the intention: the user plans the actions needed to achieve the goal.
3. Specifying the action: the user determines the specific steps required.
4. Executing the action: the user performs the actions through the computer interface.

Evaluation phase:
1. Perceiving the state of the system: the user observes the system's response to their actions.
2. Interpreting the state of the system: the user understands the system's response in relation to their goal.
3. Evaluating the outcome: the user compares the system's response with the goal to determine whether it has been achieved.
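As a purely illustrative sketch (the function and variable names are invented, not part of the lecture), the seven stages can be walked through in code: the execution phase mutates a toy system state, and the evaluation phase checks the perceived state against the goal.

```python
def norman_cycle(goal, system):
    """One pass through Norman's seven stages (stage 1, establishing
    the goal, is the `goal` argument; the rest are marked inline)."""
    # --- Execution phase ---
    intention = f"achieve {goal}"         # 2. form the intention
    actions = [("set", goal)]             # 3. specify the action sequence
    for verb, value in actions:           # 4. execute the actions at the interface
        system[verb] = value
    # --- Evaluation phase ---
    observed = system.get("set")          # 5. perceive the system state
    meaning = f"system shows {observed}"  # 6. interpret the system state
    return observed == goal               # 7. evaluate against the goal

# The lecture's example goal succeeds when the (toy) system applies it:
assert norman_cycle("solid red triangle", {}) is True
```

A breakdown at any stage (e.g. the system failing to apply the action, or the user misreading the display) would make the final evaluation fail, which is exactly where the model locates interaction problems.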
Execution/Evaluation Loop

[Diagram: the execution/evaluation loop between user and system — the user establishes the goal, formulates the intention, specifies actions at the interface, and executes the action; then perceives, interprets, and evaluates the system state with respect to the goal.]

The Interaction Framework

The interaction framework outlines four key components in a system-user interaction:
1. System: represents the core functionalities and operations of the interactive system, including its specific language that defines its capabilities and processes.
2. User: the individual or entity interacting with the system, using a task language to express their intentions and goals.
3. Input: the information or commands from the user to the system, formatted and transmitted using its own language.
4. Output: the system's responses or feedback to the user's input, presented using its own language.

Input and Output together form the interface, the channel through which communication occurs. This framework aims to offer a detailed understanding of interactions by examining the roles and languages of each component, thus facilitating better analysis of communication processes and outcomes in interactive systems.
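The four components and the translations between their languages can be sketched as a chain of functions. This is only an illustration under my own naming; the framework itself names the components, not these functions.

```python
# Illustrative sketch of the interaction framework's translation chain:
# User -> Input -> System -> Output -> User. Function names are invented.

def articulate(intention):
    """User's task language -> input language."""
    return {"command": intention}

def perform(input_msg, state):
    """Input language -> alteration of the system's core state."""
    state["last"] = input_msg["command"]
    return state

def present(state):
    """System (core) state -> output language."""
    return f"done: {state['last']}"

def observe(output):
    """Output language -> the user's interpretation of what happened."""
    return output.startswith("done")

state = {}
output = present(perform(articulate("save file"), state))
assert observe(output) is True
```

A problem at any of the four translation steps (a command the input cannot express, an output the user cannot interpret) is, in this framework's terms, a problem in the interaction.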
[Diagram: the interaction framework — User (U), Input (I), System (S), and Output (O), each with its own unique language; interaction requires translation between these languages, and problems in interaction are problems in translation between the core and task languages.]

In the framework, user intentions are translated into actions at the interface, which are translated into alterations of system state, which are reflected in the output display and interpreted by the user. It is a general framework for understanding interaction:
- not restricted to electronic computer systems
- identifies all major components involved in interaction
- allows comparative assessment of systems
- an abstraction

Ergonomics

Ergonomics, or human factors, concentrates on the physical aspects of human-system interaction. It examines how control designs, physical environments, and display layouts affect user performance. The goal is to enhance user experience by optimizing these elements based on human psychology and system constraints. Key issues include the arrangement of controls and displays, the design of physical environments, user health and safety, and the impact of colour on interactions. Although it overlaps with human-computer interaction (HCI), ergonomics specifically focuses on physical interactions and environmental factors to ensure efficient, comfortable, and safe user-system interactions.

Examples:
- Arrangement of controls and displays: e.g. controls grouped according to function or frequency of use, or sequentially.
- Surrounding environment: e.g. seating arrangements adaptable to cope with all sizes of user.
- Health issues: e.g. physical position, environmental conditions (temperature, humidity), lighting, noise.
- Use of colour: e.g. red for warning, green for okay; awareness of colour-blindness.

Interaction Styles

Interaction can be seen as a dialog between the computer and the user. The choice of interface style can have a profound effect on the nature of this dialog.
Here we introduce the most common interface styles and note the different effects these have on the interaction. Common interface styles include:
- command line interface
- menus
- natural language
- question/answer and query dialog
- form-fills and spreadsheets
- WIMP
- point and click
- three-dimensional interfaces

Command Line Interface

A way of expressing instructions to the computer directly, using function keys, single characters, short abbreviations, whole words, or a combination. It is suitable for repetitive tasks, better for expert users than novices, and offers direct access to system functionality. Command names and abbreviations should be meaningful! A typical example is the Unix system.

Menus

A set of options displayed on the screen. Because the options are visible, menus demand less recall and are easier to use; they rely on recognition, so names should be meaningful. Selection is by numbers, letters, arrow keys, or mouse, or by a combination (e.g. mouse plus accelerators). Options are often hierarchically grouped, so sensible grouping is needed. Menus are a restricted form of the full WIMP system.

Natural Language

Natural language is the everyday language used by humans for communication, known for its flexibility, ambiguity, and complexity. It allows people to express thoughts, ideas, emotions, and intentions, often without conscious attention to grammatical rules. Key characteristics of natural language include:
- Flexibility: offers a broad range of expressions and variations in vocabulary, syntax, and structure.
- Ambiguity: words and phrases can have multiple meanings, requiring context for accurate interpretation.
- Context dependency: understanding relies on contextual factors, including cultural and situational elements.
- Productivity: enables the creation of new sentences and ideas.
- Universality: spoken and understood globally, evolving through usage and interaction.
- Complexity: includes various linguistic aspects such as grammar, semantics, pragmatics, and discourse structure.
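The menu style described above trades recall for recognition: the user only has to map a visible, meaningfully named option to a selection. A toy numbered text menu (options entirely invented) makes the mechanics concrete:

```python
# Toy numbered menu: options are visible, so the user recognizes rather
# than recalls; selection is by number. Option names are hypothetical.
OPTIONS = ["Open file", "Save file", "Print", "Quit"]

def show_menu(options):
    """Render the options as a numbered list, one per line."""
    return "\n".join(f"{i}. {name}" for i, name in enumerate(options, start=1))

def select(options, choice):
    """Map a typed number back to the visible option, or None if invalid."""
    if 1 <= choice <= len(options):
        return options[choice - 1]
    return None

assert select(OPTIONS, 2) == "Save file"
assert select(OPTIONS, 9) is None   # out-of-range input is rejected
```

Hierarchical grouping, as the notes mention, would simply make each option either a command or another such list.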
Query Interfaces

Question/answer interfaces lead the user through the interaction via a series of questions. They are suitable for novice users but offer restricted functionality, and are often used in information systems. Query languages (e.g. SQL) are used to retrieve information from databases; they require an understanding of the database structure and the language syntax, and hence some expertise.

Form-Fills

Primarily for data entry or data retrieval. The screen resembles a paper form, with data put in the relevant places. Good design and obvious correction facilities are required.

Spreadsheets

Spreadsheets are versatile tools used for organizing, analyzing, and manipulating data in a structured format. They consist of a grid of cells arranged in rows and columns, where each cell can contain text, numbers, formulas, or functions. Initially developed for accounting and financial calculations, spreadsheets have evolved into powerful applications capable of handling complex data analysis tasks across various industries.

WIMP Interface

Many common environments for interactive computing are examples of the WIMP interface style, often simply called windowing systems. WIMP stands for windows, icons, menus and pointers (sometimes windows, icons, mice and pull-down menus), and is the default interface style for the majority of interactive computer systems in use today, especially in the PC and desktop workstation arena. Examples of WIMP interfaces include Microsoft Windows for IBM PC compatibles, MacOS for Apple Macintosh computers, and various X Windows-based systems for UNIX.
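The notes name SQL as the typical query language; a minimal example using Python's built-in sqlite3 module shows why some expertise is needed — the user must know both the table structure and the SQL syntax. The table and data here are invented for illustration.

```python
import sqlite3

# Invented example table; querying it requires knowing the database
# structure (table and column names) as well as the SQL syntax.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE students (name TEXT, grade INTEGER)")
con.executemany("INSERT INTO students VALUES (?, ?)",
                [("Amna", 85), ("Omar", 72), ("Sara", 91)])

# Retrieve names of students with a grade above 80, best grade first.
rows = con.execute(
    "SELECT name FROM students WHERE grade > 80 ORDER BY grade DESC"
).fetchall()
assert [r[0] for r in rows] == ["Sara", "Amna"]
```

A question/answer interface would reach the same result by asking "Minimum grade?" and building this query behind the scenes — friendlier for novices, but limited to the questions the designer anticipated.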
Elements of the WIMP Interface

Windows, icons, menus, and pointers — plus buttons, toolbars, palettes, and dialog boxes.

Windows

Windows are areas of the screen that behave as if they were independent:
- they can contain text or graphics
- they can be moved or resized
- they can overlap and obscure each other, or be laid out next to one another (tiled)
Scrollbars allow the user to move the contents of the window up and down or from side to side, and title bars give the name of the window.

Icons

In computing, icons are small graphical symbols that represent various system elements. They typically depict minimized windows, allowing users to view multiple windows at once. Clicking an icon restores the window to full size, while reducing a window to its icon is called "iconifying." Icons serve several functions:
- Space saving: they condense windows into smaller visuals, helping manage screen space and reducing clutter.
- Task management: they help users keep track of ongoing tasks or dialogs by minimizing them without losing context.
Icons can also represent system components, such as a wastebasket for file deletion, disks, programs, or functions. They vary from realistic to stylized forms; symbolic icons are less intuitive but essential for visual communication in graphical user interfaces.

Pointers

The pointer is an important component, since the WIMP style relies on pointing at and selecting things, using a mouse, trackpad, joystick, trackball, cursor keys, or keyboard shortcuts. Different cursor shapes are often used to distinguish modes: the normal pointer cursor may be an arrow, for example, but change to cross-hairs when drawing a line. Cursors are also used to tell the user about system activity; for example, a watch or hour-glass cursor may be displayed when the system is busy reading a file.

Menus

In computing systems, menus are pivotal interaction components found in both windowing and non-windowing environments. They offer users a selection of the operations or services available at any given moment.
Key characteristics of menus include their structured presentation of operations as an ordered list, facilitating easy scanning and selection. It is crucial that menu commands are named in a clear, meaningful manner to provide informative cues to users. Menus enhance user interaction by offering a straightforward way to access and execute system functionality efficiently. Their role extends beyond windowing systems, ensuring users can navigate and use software interfaces effectively.

Types of Menus
1. Main menu: also known as the primary or top-level menu, it typically appears at the top of an application window or screen, containing essential categories such as File, Edit, View, and Help. Each category can expand into a submenu when clicked.
2. Context menu: also called a right-click or popup menu, it appears when the user right-clicks on an object (e.g. a file, folder, or text selection), offering options specific to the selected item.
3. Dropdown menu: this menu appears when the user clicks on a dropdown arrow or a specific menu item, displaying a list of options or commands for direct selection.
4. Toolbar menu: also known as a toolbar dropdown or overflow menu, it is accessed from an icon on a toolbar. Clicking the icon reveals a dropdown menu with additional commands or settings.

Buttons

Buttons are individual, isolated regions within a display that can be selected to invoke an action. Special kinds include:
- radio buttons: a set of mutually exclusive choices
- check boxes: a set of non-exclusive choices

Toolbars

Toolbars are common interface elements featuring small icon-based buttons, usually located at the top or side of a window. They provide quick access to frequently used functions, similar to a menu bar but with icons rather than text. This icon-based approach allows more functions to be displayed simultaneously. Toolbars can be:
- Fixed: featuring a set of predetermined functions.
- Customizable: allowing users to add, remove, or rearrange buttons to fit their workflow. Some systems offer multiple predefined toolbars for different tasks or preferences.
In summary, toolbars enhance usability and efficiency by offering fast, icon-based access to essential functions, streamlining user interaction with software interfaces.

Palettes

Palettes are visual tools in software interfaces that show the available modes and the current active mode, often using icons to represent different functions or options. They are similar to an artist's palette used for selecting colours or patterns. Key features of palettes include:
- Customization: users can create or modify palettes from menus or toolbars. For example, pull-down menus can be "torn off" to become standalone palettes, and toolbars can be moved around the screen.
- Tear-off menus: these are particularly useful for graphical tasks, such as choosing line styles or colours in drawing software, as they provide a visual and interactive way to manage selections.
Overall, palettes enhance usability and workflow in graphical applications by offering users a visual and interactive method to manage modes and selections.

Dialog Boxes

Dialog boxes are information windows used by the system to alert users to important information, such as errors or warnings, thus helping prevent mistakes. They can also initiate specific subdialogs between the user and system, often within a larger task context. For instance, when a user creates a file that needs saving, a dialog box prompts them to specify the file name and its storage location in the filing system; once this subdialog is completed, the dialog box disappears. Just as windows segregate different user-system interactions, dialog boxes isolate auxiliary task threads from the main task dialog, enhancing clarity and usability in software interfaces.
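The radio-button versus check-box distinction introduced under Buttons above is really a difference in selection state, which a small sketch (class and option names invented) makes explicit:

```python
# Illustrative selection state for the two special button kinds:
# a radio group is mutually exclusive, check boxes are independent.

class RadioGroup:
    def __init__(self, options):
        self.options = options
        self.selected = None
    def choose(self, option):
        self.selected = option       # a new choice replaces the old one

class CheckBoxes:
    def __init__(self, options):
        self.options = options
        self.checked = set()
    def toggle(self, option):
        self.checked ^= {option}     # independent on/off per option

size = RadioGroup(["small", "medium", "large"])
size.choose("small")
size.choose("large")
assert size.selected == "large"               # only one choice survives

styles = CheckBoxes(["bold", "italic", "underline"])
styles.toggle("bold")
styles.toggle("italic")
assert styles.checked == {"bold", "italic"}   # both stay selected
```

GUI toolkits encode the same distinction in their widgets; the point here is only the exclusive versus non-exclusive state.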
Interactivity

Interactivity in computing involves the dynamic exchange of information and actions between users and computer systems, enhancing engagement and responsiveness in digital environments. Key aspects include:
- Engagement: enables users to actively participate in and manipulate content rather than passively consume it.
- Feedback: provides immediate responses to user actions, reinforcing control and system responsiveness.
- User-centered design: supports intuitive, efficient, and enjoyable interfaces tailored to user needs.
- Multi-modal interaction: incorporates various interaction methods such as touch, gestures, voice commands, and traditional inputs like keyboard and mouse.
- Personalization: allows customization based on user preferences and behaviors, adapting the experience to individual needs.
Interactivity is essential in applications such as websites, software interfaces, games, educational tools, and simulations, significantly enhancing functionality and usability.

The Context of the Interaction

The context of interaction in computing refers to the specific circumstances and environment in which users engage with digital systems. It includes:
1. Environment: the physical and digital surroundings where interaction occurs, such as desktop computers, mobile devices, kiosks, or virtual reality environments.
2. User goals: the objectives users aim to achieve, which influence the nature of their interactions and tasks.
3. System capabilities: the features and limitations of the system that define possible user actions and interactions.
4. User characteristics: users' knowledge, experience, preferences, and abilities, which affect how they interact with the system and guide design for accessibility and intuitiveness.
5. Contextual variables: external factors such as time constraints, location, social setting, and emotional state that affect how users approach and experience interaction.
6. Feedback mechanisms: how the system communicates responses and outcomes to user actions, guiding users and providing information.

Understanding these contextual elements is essential for designing effective, user-friendly interfaces. It helps anticipate user behavior, tailor interfaces to specific environments, and enhance usability, engagement, and productivity.
