LU6 Emerging Technologies in HCI.pdf

Full Transcript


Unit 6: Emerging Technologies in HCI

Contents
 Emerging interfaces
 Emotional interaction
 Persuasive technologies
 Anthropomorphism
 Ethical design
 Digital forgetting
 The future of HCI

Emerging interfaces
 Brain-computer interface
 Robots and drones
 Wearables
 Augmented, mixed and virtual realities
 Haptic interface
 Air-based gestures interface
 Voice guided interface

Brain-computer interface
 Computing devices are part of everyday life, but for people with paralysis, using these devices can be difficult.
 Paralysis is when a person cannot control certain parts of their body due to medical conditions (e.g. stroke, nerve damage).
 A BCI uses electrodes placed in the user's brain to record signals from the motor cortex, the area controlling muscle movement.
 The signals are then transmitted to a computer, where algorithms translate them into commands that guide on-screen movement.

Brain-computer interface
 Examples:
  The BrainGate project: https://www.braingate.org/
  Typing for people with paralysis: https://med.stanford.edu/news/all-news/2017/02/brain-computer-interface-allows-fast-accurate-typing-by-people-with-paralysis.html
  Video: https://www.youtube.com/watch?v=9oka8hqsOzg
  Controlling a tablet for people with paralysis: https://www.eurekalert.org/pub_releases/2018-11/bu-bie111518.php

Brain-computer interface
 Facebook is funding experiments to create a device that reads your mind.
 Excerpt: "… where researchers have been developing 'speech decoders' able to determine what people are trying to say by analyzing their brain signals."
 Link: https://www.technologyreview.com/s/614034/facebook-is-funding-brain-experiments-to-create-a-device-that-reads-your-mind/
 Brain Computer Interfaces and VR: the future of interfaces? | Fotis Liarokapis | TEDxNTUA: https://www.youtube.com/watch?v=X-hkS82IFT0

Robots and drones
 Robots are programmable machines that can be used to carry out tasks, for example to:
  Investigate dangerous materials and places (bomb disposal, search and rescue in disasters)
  Perform daily chores (cleaning, gardening, vacuuming)
  Provide therapeutic qualities (as companions, to reduce stress and loneliness, for social interaction)

Robots and drones
 Drones, also known as UAVs (unmanned aerial vehicles) or unmanned aircraft, were originally used for jobs that are too dull, dirty or dangerous for humans.
 First used by the military and hobbyists, they are now more affordable and accessible.
 The system consists of a UAV, a controller on the ground, and a communications system between the two; it can be controlled by a human operator or by onboard computers (sketched after this slide).
 Usage of drones:
  Entertainment – delivering food and drinks to people
  Agriculture – for farmers: collect image data to form maps of farms and plantations, check crop health and the best time to harvest, monitor crop growth, spray fertilizer/pesticide
  Wildlife conservation – track poachers
  Delivering items
  Combat/military
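The drone slide above names three parts: the UAV, a ground controller, and a communications link between them. The Python sketch below is purely illustrative; the class names, message format and behaviour are invented for this transcript and do not correspond to any real drone SDK or protocol (such as MAVLink). It only shows how a ground controller could queue waypoint commands that the onboard computer then acts on.

```python
from dataclasses import dataclass
from queue import Queue

@dataclass
class Waypoint:
    lat: float     # degrees
    lon: float     # degrees
    alt_m: float   # altitude above ground, metres

class CommsLink:
    """Stands in for the radio link between the ground controller and the UAV."""
    def __init__(self) -> None:
        self._uplink: Queue = Queue()

    def send(self, command: tuple) -> None:
        self._uplink.put(command)

    def receive(self):
        return None if self._uplink.empty() else self._uplink.get()

class OnboardComputer:
    """Executes commands received over the link (autonomous mode)."""
    def __init__(self, link: CommsLink) -> None:
        self.link = link

    def step(self) -> None:
        command = self.link.receive()
        if command is None:
            return
        kind, payload = command
        if kind == "GOTO":
            print(f"Flying to {payload}")
        elif kind == "SPRAY":
            print(f"Spraying field section {payload}")  # agriculture use case

class GroundController:
    """The operator-side half of the system."""
    def __init__(self, link: CommsLink) -> None:
        self.link = link

    def upload_mission(self, waypoints: list[Waypoint]) -> None:
        for wp in waypoints:
            self.link.send(("GOTO", wp))

if __name__ == "__main__":
    link = CommsLink()
    ground = GroundController(link)
    uav = OnboardComputer(link)
    ground.upload_mission([Waypoint(3.1390, 101.6869, 30.0)])
    uav.step()  # prints the simulated flight action
```

In a real system the link would be a radio protocol and the onboard computer would close the loop with GPS and inertial sensors, but the division of responsibilities is the same as on the slide.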
Wearables
 Wearables are computing devices that users wear on the body (wrist, arms, face, neck, skin etc.).
 Wearables contain sensors and can transmit signals to the user's smartphone or other computing device.
 Popular wearables include smartwatches and activity/fitness trackers, which are easily available commercially at affordable prices.
 There are also other wearables such as smart clothing (e-textiles) and implantables.

Wearables
 Interactive e-textiles
 Examples:
  Samsung Smart Clothing: https://www.youtube.com/watch?v=TiHJMZPCsZ8 and https://www.youtube.com/watch?v=iu2ylnHQ5dY
  KineticDress: https://www.youtube.com/watch?v=Wi9u7tA6218
  The Right Trousers Project
 Issues to consider when designing wearables like these: light weight, comfort, hygiene, ease of wear, user controls.

Wearables
 Implantables are wearables embedded in the human body.

Wearables
 SenseCam, by Microsoft Research Labs, is a wearable camera to help people suffering from memory loss (e.g. those with Alzheimer's disease or amnesia).
 The camera takes photos without user intervention, and the digital images can be stored and revisited later by the user to help aid their memory.

Augmented reality
 Augmented reality: virtual representations that are superimposed on physical devices and objects.
 The device's camera and GPS are used to place virtual characters and images onto objects in the environment.
 Users open the AR app on a computing device and the content appears superimposed on what is viewed through the screen.
 Examples: Pokemon Go, Snapchat, FaceApp, AR sandbox (where players can sculpt real sand into virtual landscapes).

Mixed reality
 Mixed reality (also known as hybrid reality): views of the real world are combined with views of a virtual environment, and virtual objects are mapped onto physical environments.
 Examples:
  Holograms: Buzz Aldrin: Cycling Pathways to Mars (https://www.youtube.com/watch?v=9bff_YHdVYM&feature=emb_title); Jon Hamm at the Sundance Film Festival
  Safety inspection at production facilities/factories – Renault
  Training engineers without having to visit the actual site – Japan Airlines HoloLens
 Further reading:
  https://bernardmarr.com/default.asp?contentID=1916
  https://skywell.software/blog/what-is-mixed-reality-examples-and-development-features/

Virtual reality
 VR uses computer-generated graphical simulations to create "the illusion of participation in a synthetic environment".
 It refers to the experience of interacting with an artificial environment which feels virtually real.
 VR provides an immersive experience – it enables users to interact with objects and navigate in 3D space in ways not possible in the physical world or with 2D interfaces.
 It offers a first-person or third-person perspective (see the illustrative sketch after the next slide).

Virtual reality
 Cows wearing VR headsets might produce better milk – they're happier in virtual fields than confined farms.
 Excerpt: "Moscow-area farmers strapped modified VR headsets to cows to see if it improved their mood -- and, of course, their milk production."
 Link: https://www.engadget.com/2019/11/26/cows-with-vr-headsets/?fbclid=IwAR14-3Epzw_tApR_BXr02EcgN5h-BZGikzIJLnNFUwjADX4rc-kQZo8uv78
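Before moving on to the metaverse, a small hedged illustration of the "first-person perspective" and "navigate in 3D space" points from the virtual reality slide above. The axis conventions and numbers below are assumptions made for this sketch, not taken from any headset SDK; it only shows how tracked head orientation can be turned into a look direction that drives first-person movement.

```python
import math

def view_direction(yaw_deg: float, pitch_deg: float) -> tuple[float, float, float]:
    """Convert head yaw/pitch (degrees) into a unit look-at vector.

    Convention assumed here: yaw 0 looks along +x, pitch 0 is level,
    positive pitch looks upward (+z). Real engines differ in their axes.
    """
    yaw = math.radians(yaw_deg)
    pitch = math.radians(pitch_deg)
    x = math.cos(pitch) * math.cos(yaw)
    y = math.cos(pitch) * math.sin(yaw)
    z = math.sin(pitch)
    return (x, y, z)

def step_forward(position, yaw_deg, pitch_deg, speed=0.1):
    """Move the first-person camera a small step in the direction the head faces."""
    dx, dy, dz = view_direction(yaw_deg, pitch_deg)
    px, py, pz = position
    return (px + speed * dx, py + speed * dy, pz + speed * dz)

if __name__ == "__main__":
    pos = (0.0, 0.0, 1.7)   # standing eye height in metres (assumed)
    pos = step_forward(pos, yaw_deg=90.0, pitch_deg=0.0)
    print(pos)              # moved along +y while looking level
```

A third-person perspective would apply the same direction vector but place the camera behind the avatar instead of at its eye position.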
Virtual reality
 Metaverse:
  A virtual world where people can interact with each other.
  The virtual world mirrors our real world.
  Avatars can move freely in the metaverse.
  Experience: live, work, play, shop, learn, spend money etc.
  Similar to Roblox, Minecraft, Fortnite.
  Still evolving and will continue to grow.
  Technologies that are important for the development and growth of the metaverse: AR and VR.
  Watch: Explaining the metaverse – https://www.youtube.com/watch?v=7DEVfUk2zCk

Haptic interface
 Haptic interfaces provide tactile feedback by applying vibrations and forces to the person.
 This is done using actuators that are embedded in their clothes or in devices they are carrying (e.g. smartphone, smartwatch).
 Examples:
  Samsung SmartSuit worn by Dutch skaters (see the link in the Wearables section)
  MusicJacket – assists novice violin players in learning to hold their instrument and develop a good bowing action
  Gaming consoles and other computing devices that use vibration to inform the user
  The Right Trousers Project – a wearable soft-robotics exoskeleton that helps users stand up and move around using bubble haptic feedback

Air-based gestures interface
 People's body parts (arms, legs, fingers, head, etc.) and gestures can be recognized by computers.
 Recognition is done through camera technologies, sensors and computer vision techniques.
 Gestures are also known as motion gestures.
 Examples:
  Sony EyeToy
  Nintendo Wiimote (Wii Remote)
  Microsoft Kinect
  Ubi-Finger
  Touchless gesture-based interfaces for surgeons in operating theatres

Voice guided interface
 A voice user interface (VUI) lets the user interact with a system through voice commands (speech).
 Examples: Siri, Google Assistant, Alexa.
 A VUI allows for hands-free, eyes-free interaction, so the user can focus attention elsewhere.
 As screens become smaller, voice interaction will be on the rise – the adoption rate of speech recognition will grow in the next few years.
 A promising area to explore is voice-guided apps and technologies (a minimal command-handling sketch appears after the emotional interaction slides below).
 Read more: https://www.interaction-design.org/literature/topics/voice-user-interfaces

Emotional interaction
 Emotional interaction uses the knowledge of people's emotions to inform UX design.
 Emotions: happy, sad, annoyed, anxious, frustrated, delirious, etc.
 Emotions can influence behaviour and trigger responses in the viewer, e.g.:
  Being happy when shopping online may make someone willing to make purchases.
  A photo of a hungry child with sad eyes makes someone feel sad and want to do something to help (e.g. donate money or items).

Emotional interaction
 An expressive interface can create an emotional connection or feelings with users.
 Interfaces can be made expressive by using these elements:
  Emojis
  Sounds
  Colours
  Shapes
  Fonts
  Virtual agents
  Animated icons
  Sonifications (sounds to indicate an action or event, e.g. a ding for a new message/email)
  Vibrotactile feedback (e.g. distinct smartphone buzzes)
 The use of these elements can influence the interface's emotional impact.

Emotional interaction
 Interfaces can also cause negative emotions/responses in users:
  Anger, annoyance, feeling distracted, intruded upon, insulted, stupid, threatened, etc.
  These can annoy a user to the point of losing their temper or abandoning the product.
 Some reasons that may cause this:
  Too much usage of elements
  Poor layout
  Too many steps to perform a task
  Too many features
  Insufficient information
  Poor error messages
  Others – can you think of more reasons?
 Microsoft's Clippy and Ikea's Anna are virtual agents that users did not like.
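Looking back at the voice guided interface slide above: a VUI's job is to turn a spoken command into a system action. The Python sketch below is a minimal, hypothetical illustration of the step after speech recognition, i.e. mapping an already-transcribed utterance to an intent and a parameter. The intent names and phrases are invented for this example; a real assistant (Siri, Google Assistant, Alexa) uses far richer language understanding, and the transcription itself would come from a speech-recognition engine that is not shown here.

```python
import re
from typing import Callable, Optional

# Hypothetical device actions the voice interface can trigger.
def set_timer(minutes: int) -> str:
    return f"Timer set for {minutes} minutes."

def turn_on_lights(_: Optional[int] = None) -> str:
    return "Lights on."

# A very small keyword/pattern grammar standing in for intent recognition.
INTENTS: list[tuple[str, Callable]] = [
    (r"set (?:a )?timer for (\d+) minutes?", lambda m: set_timer(int(m.group(1)))),
    (r"turn on the lights?", lambda m: turn_on_lights()),
]

def handle_utterance(text: str) -> str:
    """Map one transcribed utterance to an action, or ask the user to rephrase."""
    for pattern, action in INTENTS:
        match = re.search(pattern, text.lower())
        if match:
            return action(match)
    return "Sorry, I didn't catch that."  # spoken fallback keeps the interaction eyes-free

if __name__ == "__main__":
    print(handle_utterance("Set a timer for 5 minutes"))   # Timer set for 5 minutes.
    print(handle_utterance("please turn on the lights"))   # Lights on.
```

The fallback response matters for emotional interaction too: a curt or blaming error message is exactly the kind of element the slide above lists as a cause of negative responses.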
Affective computing
 Affective computing uses computers to recognize and express emotions in the same way as humans do.
 It involves designing ways for people to communicate their emotional states, using novel wearable sensors, and creating new techniques to evaluate frustration, stress and moods by analyzing people's expressions and conversations.
 Emotional AI automates the measurement of feelings and behaviours by using AI technologies that can analyze facial expressions and voice in order to infer emotions.
  Sensing technologies are used to collect the data.
  The data collected is used to predict the user's behaviour.

Affective computing
 Techniques, technologies and methods used to collect data and infer emotions:
  Cameras to measure facial expressions
  Biosensors to measure galvanic skin response (to check the user's level of anxiety or nervousness)
  Speech (voice quality, intonation, pitch, loudness, rhythm)
  Motion capture systems/accelerometer sensors to detect body movements and gestures
  Eye tracking
  Finger pulse
  The words/phrases used on social media (tweets, chats, captions etc.)
 Facial expression detection using Affectiva emotion analytics software – advanced computer vision and machine learning algorithms are used to detect emotions.

Affective computing
 Affectiva driver emotion recognition and real-time facial analysis for the automotive industry.
 Emotion recognition to improve car safety, e.g. detect drowsiness in the driver, then trigger an action such as suggesting the driver pull over where it is safe.
 URL: https://blog.affectiva.com/driver-emotion-recognition-and-real-time-facial-analysis-for-the-automotive-industry

Affective computing
 Example: All the Feels – live streaming for video games.
 An overlay on Twitch (a popular video game streaming site) shows the streamer's biometric and webcam-derived data, e.g. heart rate, skin conductivity, and emotions.
 This enhances the spectators' experience and improves the connection between streamer and spectators.
 URL: https://dl.acm.org/doi/pdf/10.1145/3102071.3102103

Persuasive technologies
 Persuasive technologies are technologies designed to:
  get the user to do something, or
  behave in a certain way, or
  change the user's habits, or
  do something that will improve the user's well-being.
 They help people monitor behaviours and change them.
 These technologies are used in various domains: health, fitness, personal relationships, learning, safety, etc.
 User experience can be designed to become a technological intervention that helps change people's behaviours and attitudes.

Persuasive technologies
 Digital pets (e.g. Nintendo's Pokemon Pikachu device, the Tamagotchi digital/virtual pet):
  Owners are required to walk, run or jump every day to keep the digital pet alive.
  If the owner does not exercise with the device, the digital pet may become angry, sulk, refuse to play, or die.
  The digital pet was designed to motivate children to be physically active.
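As a hedged illustration of the persuasive mechanism on the digital pet slide above: the toy tracks whether its owner has been active and shifts the pet's state accordingly. The thresholds, state names and step counts below are invented for this sketch and are not taken from any real Pokemon Pikachu or Tamagotchi device; it only shows how neglect pushes the pet toward unhappier states, which is the behaviour-change lever.

```python
from dataclasses import dataclass

# Invented mood ladder: the pet slides down it on inactive days.
MOODS = ["happy", "sulking", "angry", "refuses to play", "dead"]

@dataclass
class DigitalPet:
    mood_index: int = 0  # starts happy

    def end_of_day(self, steps_today: int, daily_goal: int = 3000) -> str:
        """Update the pet once per day based on the owner's recorded activity."""
        if self.mood_index >= len(MOODS) - 1:
            return MOODS[-1]                               # a dead pet stays dead
        if steps_today >= daily_goal:
            self.mood_index = max(0, self.mood_index - 1)  # activity cheers it up
        else:
            self.mood_index += 1                           # neglect makes it unhappier
        return MOODS[self.mood_index]

if __name__ == "__main__":
    pet = DigitalPet()
    for steps in [5000, 200, 100, 4000]:                   # four simulated days
        print(steps, "->", pet.end_of_day(steps))
```

The fitness trackers on the next slide use the same basic loop, only with charts, badges and leaderboards instead of a pet's mood.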
Persuasive technologies
 Fitness/weight trackers encourage users to change their behaviours by:
  displaying how much exercise they have done, how much weight they have lost, or how much sleep they get over a period of time (day/week/month)
  goal setting
  reminder notifications
  rewards, e.g. badges
  comparing their results with their friends'/peers' via leaderboards and charts
 These devices help users monitor various behaviours related to fitness and weight, and change them based on the data collected and displayed back to them.

Anthropomorphism
 Anthropomorphism is the natural tendency of people to attribute human qualities to animals and objects.
  e.g. talking to a computer/device as if it were human
  e.g. treating a robot vacuum like a pet
 Giving human-like attributes to technologies makes them more enjoyable and fun to interact with.
 People, especially children, tend to accept and enjoy objects that have been given human-like attributes.

Anthropomorphism
 Luvabella interactive doll:
  Shows facial expressions, e.g. blinking, smiling, and making baby noises in response to how her owner plays with and looks after her.
  The more a child (user) plays with her, the more the doll learns to speak words and phrases.
  It has sensors embedded in it: https://www.youtube.com/watch?v=miCJzXf1AGg

Anthropomorphism
 Robots:
  Robot pets: Sony AIBO
  Social robots: Zora robot

Ethical design
 Huge volumes of data are now collected from users for many reasons (e.g. to improve/enhance services).
 Concerns: user privacy, respectfulness, trustworthiness, fairness, honesty, human rights.
 UX designers can create designs and interaction processes that make clear to users how their data is being used.
 When collecting user data, it is important to consider how ethical the data collection and storage processes are, and how the data analysis will be used.

Ethical design
 Strategies for ethical design:
  Refer to codes of ethics, e.g. ACM & IEEE (http://ethics.acm.org/2018-code-draft-1/)
  Limit the data collected; choose only the data needed
  Follow the privacy-by-design approach (https://www.cam.ac.uk/research/news/privacy-by-design)
  Have an explicit agreement in place on how data will be used and acted upon
  Have clear boundaries between what is acceptable and what is not
  Open Data Institute's data ethics canvas – a set of questions to formulate ethical questions
  The General Data Protection Regulation (GDPR)'s data ethics principles (fairness, accountability, transparency, explainability)

Digital forgetting
 There are many technologies developed to help people remember, but what about the times when people want to forget memories, e.g. broken relationships?
 How can technologies be designed to help people forget these memories?
 In research: harvest digital materials (e.g. photos) using automatic methods such as face recognition, then delete them automatically without the person having to go through the materials (Sas & Whittaker, 2013). A hypothetical sketch of this kind of pipeline follows the final slide below.
 URL: https://news.ucsc.edu/2013/05/digital-breakup.html

The future of HCI
 As the amount of computing power in devices increases, more new and unique devices will emerge – and with them, new and unique interactions will emerge too.
 Interactions will be more immersive, and interfaces will be more personalized to users' needs and able to appear anywhere.
 Humans and computers will become closer as new computing technologies emerge, and HCI will remain relevant and continue to grow.
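Referring back to the digital forgetting slide: below is a hypothetical sketch of the "harvest, then delete automatically" idea, not the actual method used by Sas & Whittaker (2013). It assumes the third-party face_recognition Python package and an example folder layout; a real tool would of course require consent and a review step before deleting anything.

```python
from pathlib import Path
import face_recognition  # third-party package; assumed to be installed

def find_photos_of_person(reference_photo: str, photo_dir: str) -> list[Path]:
    """Return photos in photo_dir that appear to contain the person in reference_photo."""
    ref_image = face_recognition.load_image_file(reference_photo)
    ref_encodings = face_recognition.face_encodings(ref_image)
    if not ref_encodings:
        raise ValueError("No face found in the reference photo.")
    ref_encoding = ref_encodings[0]

    matches: list[Path] = []
    for path in Path(photo_dir).glob("*.jpg"):
        image = face_recognition.load_image_file(str(path))
        for encoding in face_recognition.face_encodings(image):
            if face_recognition.compare_faces([ref_encoding], encoding)[0]:
                matches.append(path)
                break
    return matches

if __name__ == "__main__":
    # Hypothetical paths; listing only, so nothing is deleted without the user's say-so.
    to_forget = find_photos_of_person("ex_partner.jpg", "my_photos")
    for photo in to_forget:
        print("Would delete:", photo)
```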
