
Lesson 2 - Human Factors as HCI Theories.pdf


Full Transcript


HUMAN FACTORS AS HCI THEORIES — Lesson 2

LEARNING OBJECTIVES
01 Human Information Processing: task modeling and the human problem-solving model; human reaction and prediction of cognitive performance; predictive performance assessment
02 Sensation and Perception of Information: visual; tactile and haptic; aural; multimodal interaction
03 Human Body Ergonomics: Fitts's law; motor control
04 Others

HUMAN INFORMATION PROCESSING
Designing an effective interface for human-computer interaction (HCI) requires two essential elements: an understanding of computer factors (software/hardware) and an understanding of human behavior.

As the main underlying theory for HCI, human factors can largely be divided into:
1. Cognitive science, which explains the human's capability and model of conscious processing of high-level information.
2. Ergonomics, which elucidates how raw external stimulation signals are accepted by our five senses, processed up to the preattentive level, and acted upon in the outer world through the motor organs.

Human factors knowledge will particularly help us design HCI in the following ways:
1. Task/interaction modeling: formulate the steps by which humans might interact to solve and carry out a given task/problem, and derive the interaction model.
2. Prediction, assessment, and evaluation of interaction: understand and predict how humans might react mentally to various information-presentation and input-solicitation methods, as a basis for interface selection.

TASK MODELING AND HUMAN PROBLEM-SOLVING MODEL
Cognitive science has investigated the ways in which humans solve problems, and such a model can help HCI designers analyze the task and base the interaction model or interface structure around this innate problem-solving process. The model has three stages:
01 Sensation: senses external information; perception then interprets and extracts basic meanings of the external information.
02 Memory: stores momentary and short-term information or long-term knowledge.
03 Decision maker/executor: formulates and revises a "plan", decides what to do based on the various knowledge in memory, and finally acts it out by commanding the motor system.

HUMAN REACTION AND PREDICTION OF COGNITIVE PERFORMANCE
We can also predict how humans will react and perform in response to a particular interface design. Two aspects of human performance:
1. Cognitive
2. Ergonomic

"Gulf of Execution/Evaluation": explains how users can be left bewildered when an interactive system does not offer certain actions or does not result in a state the user expects.

TASK MODELING AND HUMAN PROBLEM-SOLVING MODEL (cont.)
Memory capacity also influences interactive performance:
- Short-term memory: known as the working memory, it contains memory elements meaningful for the task at hand. Humans are known to remember about eight chunks of memory, lasting only a very short amount of time.
- Long-term memory: retrieving information from it is a difficult and relatively time-consuming task. If an interactive system requires expert-level knowledge, that knowledge should be displayed so as to at least elicit "recognition" of it rather than relying on recall from scratch.

PREDICTIVE PERFORMANCE ASSESSMENT
GOMS (Goals, Operators, Methods, and Selection rules) quantitatively estimates the time taken to complete a given task and therefore supports evaluation against the original performance requirements. The original GOMS model was developed mainly for the desktop computing environment, with performance figures for mouse clicks, keyboard input, hand movement, and mental operators.

SENSATION & PERCEPTION OF INFORMATION
Humans are known to have at least five senses. Among them, the modalities relevant to HCI are:
1. Visual
2. Aural
3. Haptic
4. Tactile

VISUAL
Visual modality is the most important information medium: over 40% of the human brain is said to be involved in processing visual information. Properties of the human visual system and their implications for interface design:
1. Visual and display parameters
2. Detail and peripheral vision
3. Color, brightness, and contrast
4. Pre-attentive features and high-level diagrammatic semantics

1. Visual and Display Parameters
1.1 Field of view (FOV): the angle subtended by the area visible to the human user, in the horizontal or vertical direction.
1.2 Viewing distance: the perpendicular distance to the surface of the display; it may change with user movements.
1.3 Display field of view: the angle subtended by the display area from a particular viewing distance.
1.4 Pixel: a display system is typically composed of an array of small rectangular areas called pixels.
1.5 Display resolution: the number of pixels in the horizontal and vertical directions for a fixed area.
1.6 Visual acuity: the resolution perceivable by the human eye from a fixed distance; synonymous with the power of sight.

2. Detail and Peripheral Vision
2.1 Cones: responsible for color and detail recognition; distributed heavily in the center of the retina, which subtends about 5 degrees of the human FOV and roughly establishes the area of focus.
2.2 Rods: distributed mainly in the periphery of the retina; responsible for motion detection and less detailed peripheral vision.

3. Color, Brightness, and Contrast
3.1 Brightness: the amount of light energy emitted by the object.
3.2 Contrast: contrast in brightness is measured as the difference or ratio of the amounts of light energy between two or more objects.
3.3 Color: the human response to different wavelengths of light, namely those corresponding to red, green, blue, and their mixtures.
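The two ways of describing a color mentioned here (amounts of the three fundamental colors vs. hue, saturation, and brightness) are interconvertible. A minimal sketch using Python's standard colorsys module; the example color is an arbitrary choice, not from the slides:

```python
# Convert an RGB color to hue/saturation/value using the standard library.
import colorsys

# A saturated orange, each channel in the 0.0-1.0 range (arbitrary example).
r, g, b = 1.0, 0.5, 0.0

h, s, v = colorsys.rgb_to_hsv(r, g, b)
print(f"hue={h * 360:.0f} deg, saturation={s:.2f}, value={v:.2f}")
```

The same module provides the reverse mapping (`colorsys.hsv_to_rgb`), so an interface can store colors in whichever form is more convenient for the task.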
A color can be specified by the amounts contributed by the three fundamental colors, or alternatively by hue, saturation, and brightness value.

4. Pre-Attentive Features and High-Level Diagrammatic Semantics
Pre-attentive features are composite, primitive, and intermediate visual elements that are automatically recognized before entering our consciousness, typically within 10 ms after entering the sensory system. These features may rely on relative differences in color, size, shape, orientation, depth, texture, motion, etc.
At a more conscious level, humans may universally recognize certain high-level complex geometric shapes and properties as a whole and understand the underlying concepts.

AURAL
Aural modality is perhaps the most prevalent mode for information feedback. The actual form of sound feedback can be roughly divided into three types: simple beep-like sounds, short symbolic sounds, and relatively longer "as is" sound feedback. Some parameters of the human aural capacity:
1. Aural display parameters
2. Other characteristics of sound as interaction feedback
3. Aural modality as input method

1. Aural Display Parameters
1.1 Intensity: the amount of sound energy; synonymous with the more familiar term, volume. Often measured in decibels (dB), where 0 dB is the lowest and 130 dB the highest level of audible sound.
1.2 Phase: the time differences among sound waves that emanate from the same source. Phase differences occur because our left and right ears may have slightly different distances to the sound source.
1.3 Sound: can be viewed as containing, or being composed of, a number of sinusoidal waves with different frequencies and amplitudes. The dominant frequency components determine various characteristics of sounds such as pitch, timbre, and even directionality.

2. Other Characteristics of Sound as Interaction Feedback
2.1 Sound is effectively omnidirectional: sound is most often used to attract and direct a user's attention. However, it can be a nuisance as a task interrupter through the startle effect.
2.2 Making use of contrast: auditory feedback requires a 15-30 dB difference from the ambient noise to be heard effectively. Differentiated sound components can be used to convey certain information.
2.3 Continuous sound: the aural channel is somewhat more subject to habituation than stimulation through other modalities. It is difficult to make out aural content when the sound is jumbled/masked with multiple sources. Humans do possess an ability to tune in to a particular part of the sound; however, this requires much concentration and effort.

3. Aural Modality as Input Method
3.1 Keyword recognition: isolated-word recognition has become very robust lately, but it still requires speaker-specific training or a relatively quiet background. Another related difficulty with voice input is the "segmentation" problem; as such, many voice input systems operate in an explicit mode or state.
3.2 Natural language understanding: language-understanding technology is advancing fast, as demonstrated recently by Apple Siri and IBM Watson, where high-quality natural-language-understanding services are offered through the cloud.

TACTILE & HAPTIC
Interfaces with tactile and haptic feedback, while not yet very widespread, are starting to appear in limited forms. Haptic is defined as the modality that applies forces, vibrations, or motions to the user, while tactile refers to sensing different types of touch.
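Looking back at the aural feedback guideline above (a 15-30 dB difference from ambient noise, within the 0-130 dB audible range), here is a minimal sketch of how a designer might pick a feedback level. The function name and default margin are illustrative assumptions, not from the slides:

```python
# Choose an auditory-feedback level a given margin above ambient noise,
# per the 15-30 dB guideline, capped at the 130 dB audible upper bound.
AUDIBLE_MAX_DB = 130.0  # upper level of audible sound cited in the lesson

def feedback_level(ambient_db: float, margin_db: float = 15.0) -> float:
    """Return a feedback intensity margin_db above ambient, capped at
    the maximum audible level."""
    if not 15.0 <= margin_db <= 30.0:
        raise ValueError("guideline suggests a 15-30 dB margin")
    return min(ambient_db + margin_db, AUDIBLE_MAX_DB)

print(feedback_level(60.0))        # quiet office ambience
print(feedback_level(85.0, 30.0))  # noisy street, maximum margin
```

Note that in a very loud environment the cap kicks in before the full margin is reached, which is one reason sound alone can be an unreliable feedback channel.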
Some parameters of haptic and tactile interfaces:
1. Tactile display parameters
2. Haptic display parameters
3. Multimodal interaction

1. Tactile Display Parameters
1.1 Tactile resolution: skin sensitivity to physical objects differs over the human body. The fingertip is one of the most sensitive areas and is frequently used for HCI purposes.
1.2 Vibration frequency: rapid movement such as vibration is mostly sensed by the Pacinian corpuscle, which is known to have a signal-response range of 1-300 Hz.
1.3 Pressure threshold: the maximum threshold is difficult to measure, because when the force/torque gets large, the kinaesthetic senses start to operate, and the threshold depends greatly on the physical condition of the user.

2. Haptic Display Parameters
Force feedback and movement are felt by the cells and nerves in our muscles and joints. The simplest form of a haptic device is a simple electromagnetic latch, often used in game controllers: it generates a sudden inertial movement and slowly repositions itself for repeated usage. Such a device is not appropriate for fast-occurring interaction. More complicated haptic devices take the form of a robotic kinematic chain, either fixed on the ground or worn on the body; such devices offer higher degrees of freedom and finer force control.

Important haptic display parameters:
2.1 Degrees of freedom: the number of directions in which force or torque can be displayed.
2.2 Force range: should be at least greater than 0.5 mN.
2.3 Operating/interaction range: how much movement is allowed through the device.
2.4 Stability: how stable the supplied force feels.

3. Multimodal Interaction
Conventional interfaces have been mostly visually oriented. However, multimodal interfaces are gaining popularity with the ubiquity of multimedia devices.
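The tactile and haptic parameters above can be collected into a small specification check. The class and function names are illustrative assumptions layered on the figures the lesson gives (1-300 Hz Pacinian response range, force range of at least 0.5 mN):

```python
# Validate a haptic/tactile design against the figures quoted in the lesson.
from dataclasses import dataclass

PACINIAN_HZ = (1.0, 300.0)   # vibration signal-response range (Hz)
MIN_FORCE_N = 0.5e-3         # minimum force range, 0.5 mN in newtons

@dataclass
class HapticSpec:
    vibration_hz: float      # vibration frequency of the actuator
    force_n: float           # maximum force the device can display
    degrees_of_freedom: int  # directions of force/torque output

    def problems(self) -> list:
        issues = []
        if not PACINIAN_HZ[0] <= self.vibration_hz <= PACINIAN_HZ[1]:
            issues.append("vibration outside the 1-300 Hz Pacinian range")
        if self.force_n < MIN_FORCE_N:
            issues.append("force range below 0.5 mN")
        return issues

# A typical 250 Hz vibrotactile actuator with 1 N of force and 1 DOF.
print(HapticSpec(250.0, 1.0, 1).problems())   # → []
print(HapticSpec(500.0, 1e-4, 1).problems())  # both issues flagged
```

A real device datasheet would of course carry more parameters (operating range, stability); the sketch only encodes the two numeric bounds stated above.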
By employing more than one modality, interfaces can become more effective in a number of ways, depending on how they are configured:
3.1 Complementary: different modalities can assume different roles and act in a complementary fashion to achieve the interaction objective.
3.2 Redundant: different modality input methods or feedback can be used to ensure reliable achievement of the interaction objective.
3.3 Alternative: providing users with alternative ways to interact gives people more choices.

HUMAN BODY ERGONOMICS
Ergonomics is a discipline focused on making products and interfaces comfortable and efficient. It also encompasses mental and perceptual issues.
1. Fitts's law
2. Human motor control

1. Fitts's Law
Fitts's law is a model of human movement that predicts the time required to rapidly move to a target as a function of the distance to the target and its size. The movement task's index of difficulty (ID) can be quantified in terms of the required information amount:

MT = a + b * ID, with ID = log2(A/W + 1)

where movement time MT is a linear function of ID, A is the distance to the target, W is the target size, and a and b are coefficients specific to a given task.

2. Motor Control
The most prevalent form of input is made by the movements of our arms, hands, and fingers for keyboard and mouse input. In addition to discrete-event input methods, modern user interfaces make heavy use of continuous input in two-dimensional (2-D) space and, increasingly, in 3-D space. The control-display ratio refers to the ratio of movement in the control device to the corresponding movement in the display.

OTHERS
Many cognitive, perceptual, and ergonomic issues have been left out. Due to the limited scope of this book, we only identify some of the issues for the reader to investigate further:
1. Learning and adaptation
2. Modalities other than the "big three" (visual/aural/haptic-tactile), such as gestures, facial expression, brain waves, physiological signals (electromyogram, heart rate, skin conductance), gaze, etc.
3. Aesthetics and emotion
4. Multitasking

THANK YOU!
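As a closing illustration, the Fitts's-law formula above (MT = a + b * log2(A/W + 1)) can be evaluated numerically. The coefficient values and target geometries below are illustrative assumptions, not figures from the lesson:

```python
# Minimal sketch of Fitts's law: MT = a + b * ID, with ID = log2(A/W + 1).
import math

def movement_time(a: float, b: float, distance: float, width: float) -> float:
    """Predicted time (seconds) to acquire a target of size `width` at
    `distance`, for task-specific coefficients a and b."""
    index_of_difficulty = math.log2(distance / width + 1)  # in bits
    return a + b * index_of_difficulty

# Assumed coefficients for a hypothetical pointing task:
# a = 0.1 s startup cost, b = 0.15 s per bit of difficulty.
a, b = 0.1, 0.15
print(movement_time(a, b, distance=960, width=64))   # far, small target
print(movement_time(a, b, distance=240, width=128))  # near, large target
```

Roughly speaking, doubling the target width or halving the distance reduces ID by about one bit, shaving about b seconds off the predicted movement time, which is why large, nearby targets are fastest to hit.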
