AVR Unit-3 Notes PDF

Summary

These notes cover Environment Modelling in VR: geometric modeling, object surface shape and appearance (lighting and texture mapping), behavior simulation, and physically based simulation, along with interactive techniques such as body tracking, hand gestures, glove input, and grasping. Information about 3D scanners, textures, and multi-texturing is also included.

Full Transcript

**Augmented and Virtual Reality (B19CS7041)**

**Unit-3: Environment Modelling in VR**

**Geometric Modeling:**

- **VR System Architecture**
- **VR Modeling Cycle**
- **The VR Geometric Modeling:**
  - **Object surface shape:**
    - Polygonal meshes (the vast majority)
    - Splines (for curved surfaces)
  - **Object appearance:**
    - Lighting (shading)
    - Texture mapping
- **The surface polygonal (triangle) mesh:** Triangle meshes are preferred since they are memory- and computationally efficient: vertices shared by neighboring triangles are stored only once (a minimal indexed-mesh sketch appears after this list).
- **Object spline-based shape:** Another way of representing virtual objects. The functions are of higher degree than the linear functions describing a polygon, so they use less storage and provide increased surface smoothness.
  - Parametric splines are curves whose points are given by polynomial functions x(t), y(t), z(t), with t in [0, 1], where a, b, c are constant coefficients (an evaluation sketch appears after this list).
  - Parametric surfaces are an extension of parametric splines, with point coordinates given by x(s,t), y(s,t), z(s,t), with s in [0, 1] and t in [0, 1].
  - β-splines are controlled indirectly through four control points (more in the physical modeling section).
- **Object polygonal shape:**
  - Can be programmed from scratch using OpenGL or another toolkit editor; this is tedious and requires skill.
  - Can be obtained from CAD files.
  - Can be created using a 3D digitizer (stylus) or a 3D scanner (tracker, cameras, and laser).
  - Can be purchased from existing online databases (e.g., the Viewpoint database). The files contain vertex location and connectivity information but are ***static***.
- **Polhemus 3D scanners:** Eliminate direct contact with the object. The scanner uses two cameras, a laser, and magnetic trackers (if movable objects are scanned). Scanning resolution is 0.5 mm at 200 mm range, scanning speed is 50 lines per second, and the scanner-to-object range is 75–680 mm.
  - The Polhemus FastScan 3D scanner can scan objects up to 3 m long.
  - Scanners produce a dense "cloud" of vertices (x, y, z). Using packages such as **Wrap**, the point data are transformed into surface data (including editing and decimation).
  - *(Figures: point cloud from the scanner; polygonal mesh after decimation; (a) polygonal surface, (b) NURBS (Non-Uniform Rational β-Spline) patches, (c) NURBS surface.)*
- **Object visual appearance:** Scene illumination (local or global), texture mapping, multi-textures, and the use of textures to do illumination in the rasterizing stage of the pipeline.
- **Scene illumination:**
  - Local methods (flat shading, Phong shading) treat objects in isolation. They are computationally faster than global illumination methods.
  - Global illumination treats the influence of one object on another object's appearance. It is more demanding computationally but produces more realistic scenes.
  - *(Figures: flat-shaded vs. Phong-shaded Utah teapot.)*
- **Global scene illumination:** The inter-reflections and shadows cast by objects on each other.
- **Radiosity illumination:** Results in a more realistic-looking scene. *(Figures: the same scene without and with radiosity.)*
- **Texture mapping:** Done in the rasterizer phase of the graphics pipeline, by assigning texture-space coordinates to polygon vertices (or splines) and then mapping these to pixel coordinates.
  - Textures ***increase scene realism***.
  - Textures provide better 3D spatial cues (they are perspective-transformed), and they reduce the number of polygons in the scene, which increases the frame rate (example: tree models). *(Figure: textured room image for increased realism.)*
- **How to create textures:** Models are available online in texture "libraries" of cars, people, construction materials, etc. Custom textures can be made from scanned photographs or by using an interactive paint program to create *bitmaps*.
- **Texture minification:** Uses various "filters" to approximate the color of the pixel: nearest neighbor (the texel closest to the pixel center is selected), bilinear interpolation, etc. (both filters are sketched after this list).
- **Multi-texturing:** Several texels can be overlaid on one pixel. A texture ***blending cascade*** is made up of a series of texture stages (one stage is sketched after this list).
- **Multi-texturing for bump mapping:** Lighting effects caused by irregularities on an object's surface are simulated through "bump mapping", which encodes the surface irregularities as textures. There is no change in model geometry and no added computation at the geometry stage.
- **Multi-texturing for lighting:** Several texels can be overlaid on one pixel; one application is more realistic lighting. Polygonal lighting is real-time but requires lots of polygons (triangles) for a realistic appearance.
- **Multi-texturing (texture blending):** Realistic-looking lighting can be done with 2D textures called "light maps". This is not applicable to dynamic scenes in real time, since the maps need to be recomputed when an object moves.
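The shared-vertex efficiency claimed above is easiest to see in an indexed mesh layout. Below is a minimal sketch; the `Mesh` class and its field names are illustrative, not from any particular toolkit.

```python
# A minimal indexed triangle mesh: each vertex is stored once and
# triangles refer to vertices by index, which is what makes triangle
# meshes memory-efficient.

class Mesh:
    def __init__(self):
        self.vertices = []   # list of (x, y, z) tuples, each stored once
        self.triangles = []  # list of (i, j, k) indices into self.vertices

# Two triangles forming a unit square share an edge, so only
# 4 vertices are stored instead of 6.
quad = Mesh()
quad.vertices = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0),
                 (1.0, 1.0, 0.0), (0.0, 1.0, 0.0)]
quad.triangles = [(0, 1, 2), (0, 2, 3)]

print(len(quad.vertices))  # 4, not 6: the shared vertices appear once
```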
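The notes name constant coefficients a, b, c for the parametric spline, but the equation itself was lost in extraction. The sketch below assumes the matching quadratic segment x(t) = a·t² + b·t + c per coordinate (a cubic form with a fourth coefficient is equally common); it is a sketch under that assumption, not the notes' exact formula.

```python
# Evaluate one parametric spline segment: each coordinate is a
# polynomial in the shared parameter t, with t in [0, 1].

def eval_spline(coeffs, t):
    """coeffs = ((ax, bx, cx), (ay, by, cy), (az, bz, cz)); t in [0, 1]."""
    return tuple(a * t * t + b * t + c for (a, b, c) in coeffs)

# Sample the curve at a few parameter values.
segment = ((1.0, 0.0, 0.0),   # x(t) = t^2
           (0.0, 1.0, 0.0),   # y(t) = t
           (0.0, 0.0, 2.0))   # z(t) = 2
for t in (0.0, 0.5, 1.0):
    print(t, eval_spline(segment, t))
```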
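The two minification filters named above can be sketched on a texture stored as a 2D list of grey values. The function names are illustrative; a real rasterizer does this per fragment in hardware.

```python
def sample_nearest(tex, u, v):
    """Pick the texel whose centre is closest to (u, v) in [0,1]^2."""
    h, w = len(tex), len(tex[0])
    x = min(int(u * w), w - 1)
    y = min(int(v * h), h - 1)
    return tex[y][x]

def sample_bilinear(tex, u, v):
    """Weighted average of the four texels surrounding (u, v)."""
    h, w = len(tex), len(tex[0])
    x = min(u * w - 0.5, w - 1.0)
    y = min(v * h - 0.5, h - 1.0)
    x0, y0 = max(int(x), 0), max(int(y), 0)
    x1, y1 = min(x0 + 1, w - 1), min(y0 + 1, h - 1)
    fx, fy = max(x - x0, 0.0), max(y - y0, 0.0)
    top = tex[y0][x0] * (1 - fx) + tex[y0][x1] * fx
    bot = tex[y1][x0] * (1 - fx) + tex[y1][x1] * fx
    return top * (1 - fy) + bot * fy

tex = [[0, 255], [255, 0]]             # a tiny 2x2 checkerboard
print(sample_nearest(tex, 0.5, 0.5))   # snaps to a single texel: 0
print(sample_bilinear(tex, 0.5, 0.5))  # 127.5, a smooth blend of all four
```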
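One stage of a blending cascade can be sketched as a per-texel modulate operation, which is how a precomputed light map typically darkens a base (diffuse) texture. The names and values below are illustrative; real pipelines perform this blend in hardware.

```python
# One "modulate" stage of a texture blending cascade: the base texel
# is scaled by the light-map texel sampled at the same pixel.

def modulate(base_rgb, light):
    """light is a scalar in [0, 1] read from the light map."""
    return tuple(int(c * light) for c in base_rgb)

brick = (180, 90, 60)         # base texture texel
print(modulate(brick, 1.0))   # fully lit: (180, 90, 60)
print(modulate(brick, 0.25))  # in shadow: (45, 22, 15)
```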
**Behavior Simulation / Modeling:**

- Until now our discussion has been limited to the mathematical modeling of object appearance, kinematics, and physical properties. Whenever objects interacted, it was assumed that one was controlled by the user. It is also possible to model object behavior that is independent of the user's actions. This becomes critical in very large simulation environments, where users cannot possibly control all the interactions that are taking place.
- Consider the modeling of a virtual office, for example. Such an office could have an automatic sliding door, a clock, a desk calendar, as well as furniture. The time displayed by the clock and the date shown on the current calendar page can be updated by accessing the VR engine's system time. Every time the user enters the virtual office, the sliding door opens and some of the information displayed by the clock, calendar, and window thermometer changes. However, direct user input is limited in this example to just changing the field of view of the simulation.
- This example illustrates one method of modeling **object behavior**: accessing external sensors (the system time, and proximity sensors for the sliding door). This provides virtual objects with a degree of independence from the user's actions, a degree of "intelligence" (a minimal sketch appears after this list). Many current simulations also model **virtual humans**, called **agents**.
- **Definition:** A virtual human (or agent) is a 3D character that has human behavior. Groups of such agents are called **crowds** and have **crowd behavior**.
- Fully autonomous agents, such as a virtual football player in a yellow uniform, need to perceive their environment (in this case, the opponent) in order to take appropriate actions.
- The behavior model of the agent includes **emotions**, **behavior rules**, and **actions**.
- The agent's behavior has a hierarchy, with reflex behavior at the lowest level.
- A **reflex behavior** could be to tackle the opponent every time the agent sees him.
- **Emotion-based behavior filters** pass perceptual data through likes, dislikes, anger, or fear. They are thus at a higher level than simple reflex behavior. As a consequence, two agents interpreting the same sensorial data will take different actions in the simulation (see the agent sketch after this list).
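A minimal sketch of the office example above, assuming a per-frame update loop: the clock reads the system time and the door polls a proximity "sensor". All class names and the trigger radius are illustrative assumptions.

```python
# Objects that update themselves from "external sensors" (system clock,
# proximity) rather than from direct user input.

import datetime
import math

class WallClock:
    def update(self):
        # read the VR engine's system time each frame
        self.display = datetime.datetime.now().strftime("%H:%M")

class SlidingDoor:
    TRIGGER_RADIUS = 1.5  # metres; assumed value

    def __init__(self, position):
        self.position, self.open = position, False

    def update(self, user_position):
        # a simulated proximity sensor: open when the user is near
        self.open = math.dist(self.position, user_position) < self.TRIGGER_RADIUS

clock, door = WallClock(), SlidingDoor((0.0, 0.0, 0.0))
clock.update()
door.update(user_position=(0.5, 0.0, 0.5))
print(clock.display, "door open:", door.open)
```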
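The reflex/emotion hierarchy can be sketched as two layers: a reflex rule maps percepts straight to actions, and an emotion filter above it can override the result, so two agents given the same percept act differently. The percept strings and the fear threshold below are assumptions for illustration.

```python
def reflex(percept):
    # low-level reflex behavior: tackle the opponent on sight
    return "tackle" if percept == "opponent_in_reach" else "run"

class Agent:
    def __init__(self, fear):
        self.fear = fear  # 0 = fearless, 1 = always avoids contact

    def act(self, percept):
        action = reflex(percept)
        if action == "tackle" and self.fear > 0.5:
            action = "back_off"  # the emotion filter overrides the reflex
        return action

bold, timid = Agent(fear=0.1), Agent(fear=0.9)
print(bold.act("opponent_in_reach"))   # tackle
print(timid.act("opponent_in_reach"))  # back_off: same data, different action
```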
**Physically Based Simulation:**

- Physically based simulation/animation is an area of interest within computer graphics concerned with simulating physically plausible behaviors at interactive rates. Advances in physically based animation are often motivated by the need to include complex, physically inspired behaviors in video games, interactive simulations, and movies.
- Physically based animation is now common in movies and video games, and many techniques were pioneered during the development of early special-effects scenes and game engines.
- In games such as **Angry Birds**, physically based animation is itself the primary game mechanic, and players are expected to interact with or create physically simulated systems in order to achieve goals.
- **Rigid Body Simulation:**
  - Simplified rigid-body physics is relatively cheap and easy to implement, which is why it appeared in interactive games and simulations earlier than most other techniques.
  - Rigid bodies are assumed to undergo no deformation during simulation, so rigid-body motion between time steps can be described as a translation and a rotation (a one-step sketch appears at the end of this section).
- **Soft Body Simulation:**
  - Soft bodies can easily be implemented using spring-mesh systems.
  - Spring-mesh systems are composed of individually simulated particles that are attracted to each other by simulated spring forces and experience resistance from simulated dampers (a minimal sketch appears at the end of this section).
- **Fluid Simulation:**
  - Computational fluid dynamics can be expensive, and interactions between multiple fluid bodies or with external objects/forces can require complex logic to evaluate.
  - Fluid simulation is therefore generally achieved in video games by simulating only the height of bodies of water, to create the effect of waves, ripples, or other surface features (a height-field sketch appears at the end of this section).
- **Particle Systems:**
  - Particle systems are an extremely popular technique for creating visual effects in movies and games because of their ease of implementation, efficiency, extensibility, and artist control.
  - The update cycle of particle systems usually consists of three phases: generation, simulation, and extinction (sketched at the end of this section).
- **Flocking:**
  - In physically based simulation, flocking refers to a technique that models the complex behavior of birds, schools of fish, and swarms of insects using virtual forces.
  - These virtual forces simulate the tendency of flocks to align their velocities, avoid collisions and crowding, and move toward the center of the group (sketched at the end of this section).
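A sketch of a single rigid-body time step in 2D, under the assumption stated above: the state advances only by a translation (from linear velocity) and a rotation (from angular velocity). Names and values are illustrative.

```python
def step_rigid_body(pos, angle, vel, ang_vel, dt):
    """Advance a rigid body by one time step: translate, then rotate."""
    x, y = pos
    vx, vy = vel
    return (x + vx * dt, y + vy * dt), angle + ang_vel * dt

pos, angle = (0.0, 0.0), 0.0
for _ in range(60):  # one second at 60 simulation steps per second
    pos, angle = step_rigid_body(pos, angle, vel=(1.0, 0.0),
                                 ang_vel=0.5, dt=1 / 60)
print(pos, angle)  # roughly (1.0, 0.0) translated and 0.5 rad rotated
```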
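A minimal spring-mesh sketch: two unit-mass particles joined by one damped spring, integrated with semi-implicit Euler. The stiffness and damping constants are assumed values chosen for a stable demo.

```python
STIFFNESS, DAMPING, DT = 40.0, 2.0, 0.01
REST_LENGTH = 1.0

x = [0.0, 1.5]   # particle positions; the spring starts stretched
v = [0.0, 0.0]   # particle velocities

for _ in range(2000):
    stretch = (x[1] - x[0]) - REST_LENGTH
    f = STIFFNESS * stretch                  # spring pulls the ends together
    forces = (f - DAMPING * v[0], -f - DAMPING * v[1])
    for i in (0, 1):
        v[i] += forces[i] * DT               # unit mass assumed
        x[i] += v[i] * DT                    # semi-implicit Euler step

print(round(x[1] - x[0], 2))  # the damper lets it settle near 1.0
```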
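A sketch of the height-field approach: a 1D strip of water columns in which each column accelerates toward the average height of its neighbours, so a disturbance spreads out as ripples. All constants are assumed.

```python
N, C, DAMP = 8, 0.3, 0.99
height = [0.0] * N
velocity = [0.0] * N
height[3] = 1.0  # a "splash" disturbs one column

for _ in range(50):
    for i in range(N):
        left = height[max(i - 1, 0)]
        right = height[min(i + 1, N - 1)]
        # each column accelerates toward the average of its neighbours
        velocity[i] += C * ((left + right) / 2 - height[i])
        velocity[i] *= DAMP  # damping keeps the ripples from ringing forever
    for i in range(N):
        height[i] += velocity[i]

print([round(h, 2) for h in height])  # the splash has spread into ripples
```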
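A sketch of the three-phase particle update cycle named above: generation, simulation, extinction. The emission rate, gravity, and lifetimes are illustrative.

```python
import random

particles = []  # each particle: [x, y, vx, vy, remaining_life]

def update(dt):
    # 1. generation: emit a few new particles per frame
    for _ in range(3):
        particles.append([0.0, 0.0,
                          random.uniform(-1, 1), random.uniform(1, 2),
                          random.uniform(0.5, 1.5)])
    # 2. simulation: move particles, apply gravity, age them
    for p in particles:
        p[0] += p[2] * dt
        p[1] += p[3] * dt
        p[3] -= 9.8 * dt
        p[4] -= dt
    # 3. extinction: remove particles whose lifetime has expired
    particles[:] = [p for p in particles if p[4] > 0]

for _ in range(100):
    update(1 / 60)
print(len(particles))  # births and deaths roughly balance over time
```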
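A sketch of the three virtual flocking forces described above (move toward the group, match neighbours' velocities, avoid crowding), computed for one boid in 2D. The weights and the crowding radius are assumed.

```python
def flocking_force(pos, vel, neighbours, w=(0.01, 0.05, 0.1)):
    """neighbours: list of (position, velocity) pairs.
    w = weights for cohesion, alignment, separation."""
    n = len(neighbours)
    centre = [sum(p[i] for p, _ in neighbours) / n for i in (0, 1)]
    mean_v = [sum(v[i] for _, v in neighbours) / n for i in (0, 1)]
    force = [w[0] * (centre[i] - pos[i]) +      # cohesion: toward the group
             w[1] * (mean_v[i] - vel[i])        # alignment: match velocities
             for i in (0, 1)]
    for p, _ in neighbours:                     # separation: avoid crowding
        d2 = (pos[0] - p[0]) ** 2 + (pos[1] - p[1]) ** 2
        if 0 < d2 < 1.0:                        # only very close neighbours
            force[0] += w[2] * (pos[0] - p[0]) / d2
            force[1] += w[2] * (pos[1] - p[1]) / d2
    return force

# One boid near two others: pulled toward the group, pushed off the closest.
print(flocking_force((0, 0), (1, 0),
                     [((0.5, 0), (1, 0.2)), ((3, 1), (1, 0))]))
```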
**Unit-3: Interactive Techniques in VR**

**Body Tracking:**

- Full-body tracking is often considered the Holy Grail of the virtual reality experience.
- The ability to map your real body movement onto an avatar greatly enhances immersion in VR and opens countless possibilities for new behavior.
- The most common way to achieve tracking is by attaching **special markers** to the body that are detected by **cameras**. The more markers are placed, the more accurate the mapping of the avatar's movement. There are two commonly used configurations:
  - The first configuration uses six markers, placed at the head (headset), hands (most often the controllers), belt, and ankles.
  - The second adds markers on the knees and elbows for better bending.
- To solve for the position and orientation of the avatar, the system uses **inverse kinematics** (a minimal two-bone sketch appears after this section).
- The easiest way to enjoy full-body tracking is by adding **Vive Trackers**. These markers are easy to set up and work directly with SteamVR and any headset using base stations.
- Because it is relatively cheap, this solution is used the most. It provides good tracking and works with devices that most VR headset owners already possess. It is the go-to choice for non-industrial full-body tracking. Pricing is around $115 for each marker with a strap; assuming the user already owns a headset with controllers, they need to buy at least three of them to make full use of full-body tracking.
- **OptiTrack** is a more industrial approach to full-body tracking. This solution stands out for its high quality, fluidity, and very low tracking delay.
- The general principle of operation remains unchanged: multiple cameras track markers in the space.
- There are several differences between the OptiTrack and Vive solutions:
  - First, OptiTrack can track multiple objects in much larger areas, which makes it perfect for multiplayer applications or for capturing precise animations of many objects.
  - Second, OptiTrack supports two kinds of markers: active and passive. **Active markers** work exactly like Vive's. **Passive** markers are far cheaper and do not need a battery, but they require unique patterns so the cameras can distinguish objects from each other.
  - Third, OptiTrack requires a specialized room with a special camera installation. Additionally, the OptiTrack solution requires an external object-tracking program called Motive for data processing.
  - In summary, the OptiTrack solution is neither the cheapest nor the easiest to develop on, but it provides exceptional quality.

**Hand Gestures:**

- Users taking part in VR experiences rely on a head-mounted display, which makes it challenging to navigate and walk through the experience. With the addition of **hand-gesture interaction**, virtual reality experiences are becoming even more immersive.
- Devices like joysticks and head-orientation tracking are embedded into VR headsets and are used to help a person walk through the virtual world, but they are not fully effective.
- There can be irregular jolts to the view that not only disrupt the experience but can also cause motion sickness for the user.
- The portal method is a technique that has been used to reduce motion sickness by allowing users to jump from one location to another. The movement is not natural, so it can also cause some disorientation.
- Researchers have experimented with a simple double hand-gesture interaction method and found it beneficial to users in a VR experience: it allows the user to be better immersed and comfortable, without motion sickness.
- The hand gestures the researchers created are natural, such as raising and opening the left hand to move the avatar forward. By tracing the motions in the air, the method combines the actions of moving and turning in VR, and it is more natural and easier to learn than other VR navigation methods.
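A minimal sketch of the inverse-kinematics step mentioned in the tracking list, reduced to the simplest analytic case: a two-segment limb (e.g., upper and lower arm) in 2D, solved with the law of cosines. A production avatar solver handles full 3D chains driven by multiple markers; all names here are illustrative.

```python
import math

def two_bone_ik(l1, l2, target_x, target_y):
    """Return (shoulder_angle, elbow_angle) so the chain's tip reaches
    the target; out-of-reach targets are clamped to a straight arm."""
    d = math.hypot(target_x, target_y)
    d = min(d, l1 + l2 - 1e-9)
    # law of cosines gives the elbow bend
    cos_elbow = (d * d - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    elbow = math.acos(max(-1.0, min(1.0, cos_elbow)))
    # the shoulder aims at the target, corrected for the elbow bend
    shoulder = math.atan2(target_y, target_x) - math.atan2(
        l2 * math.sin(elbow), l1 + l2 * math.cos(elbow))
    return shoulder, elbow

s, e = two_bone_ik(1.0, 1.0, 1.2, 0.8)
print(round(s, 2), round(e, 2))  # angles that place the hand at (1.2, 0.8)
```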
**3D Manus:**

- Manus VR is the first virtual reality glove input device created specifically for general consumers. The team behind it wants to bridge our physical world with virtual reality and allow users to experience never-before-seen immersion.
- Manus VR uses an assortment of sensors to track hand movement in real time and uses the captured data to faithfully reproduce the movement in virtual reality.
- It operates completely wirelessly and comes with an open-source SDK that developers can use to integrate the hand-tracking functionality into their applications and games.

**Object Grasping:**

- Grasping is one of the fundamental actions we perform to interact with objects in real environments, and in the real world we rarely experience difficulty picking up objects.
- Grasping plays a fundamental role in interactive virtual reality (VR) systems, which are increasingly employed not only for recreational purposes but also for training in industrial contexts, for medical tasks, and for rehabilitation protocols.
- To ensure the effectiveness of such VR applications, we must understand whether the same grasping behaviors and strategies employed in the real world are adopted when interacting with objects in VR.
- Grasps are visually realistic because the hand is automatically fitted to the object's shape from a position and orientation determined by the user with the VR handheld controllers (e.g., the Oculus Touch motion controllers).
- The grasping system enables interaction with different objects regardless of their geometry.

**End of Unit-3**
