By Steven M. LaValle. To be published by Cambridge University Press.

   This free VR book covers the fundamentals of virtual reality systems, including geometric modeling, transformations, graphical rendering, optics, the human visual, auditory, and vestibular systems, tracking systems, interface design, human factors, developer recommendations, and technological issues.

Free VR lectures on YouTube. These are part of an accompanying MOOC (free on-line course), produced by NPTEL and IIT Madras, 2016.

Feel free to send feedback and corrections (I will acknowledge you in the final version).

The whole book in HTML
Book cover art by Anna Yershova.

Download the whole book

[pdf file] -- Two pages per sheet. Recommended for printing on US Letter paper.
[pdf file] -- Two pages per sheet. Recommended for printing on A4 paper.
[pdf file] -- One page per sheet (larger print). May be easier for on-line viewing.

Download chapters

Chapter 1: Introduction
13 Mar 2019 Definition of VR, modern experiences, historical perspective.
Chapter 2: Bird's Eye View
13 Mar 2019 Hardware, sensors, displays, software, virtual world generator, game engines, human senses, perceptual psychology, psychophysics.
Chapter 3: The Geometry of Virtual Worlds
13 Mar 2019 Geometric modeling, transforming rigid bodies, yaw, pitch, roll, axis-angle representation, quaternions, 3D rotation inverses and conversions, homogeneous transforms, transforms to displays, look-at and eye transforms, canonical view and perspective transforms, viewport transforms.
Chapter 4: Light and Optics
13 Mar 2019 Light propagation, lenses and images, diopters, spherical aberrations, optical distortion; more lens aberrations; spectral properties; the eye as an optical system; cameras; visual displays.
Chapter 5: The Physiology of Human Vision
13 Mar 2019 Parts of the human eye, photoreceptors and densities, scotopic and photopic vision, display resolution requirements, eye movements, neural vision structures, other implications of physiology on VR.
Chapter 6: Visual Perception
13 Mar 2019 Depth perception, motion perception, vection, stroboscopic apparent motion, color perception, combining information from multiple cues and senses, implications of perception on VR.
Chapter 7: Visual Rendering
13 Mar 2019 Graphical rendering, ray tracing, shading, BRDFs, rasterization, barycentric coordinates, VR rendering problems, anti-aliasing, distortion shading, image warping (time warp), panoramic rendering.
Chapter 8: Motion in Real and Virtual Worlds
13 Mar 2019 Velocities, acceleration, vestibular system, virtual world physics, simulation, collision detection, avatar motion, vection.
Chapter 9: Tracking
13 Mar 2019 Tracking systems, estimating rotation, IMU integration, drift errors, tilt and yaw correction, estimating position, camera-feature detection model, perspective n-point problem, sensor fusion, lighthouse approach, attached bodies, eye tracking, inverse kinematics, map building, SLAM.
Chapter 10: Interaction
13 Mar 2019 Remapping, locomotion, manipulation, social interaction, specialized interaction mechanisms.
Chapter 11: Audio
13 Mar 2019 Sound propagation, ear physiology, auditory perception, auditory localization; Fourier analysis; acoustic modeling, HRTFs, rendering, auralization.
Chapter 12: Evaluating VR Systems and Experiences
13 Mar 2019 Perceptual training, recommendations for developers, best practices, VR sickness, experimental methods that involve human subjects.
Chapter 13: Frontiers
13 Mar 2019 Touch, haptics, taste, smell, robotic interfaces, telepresence, brain-machine interfaces.

Related resources

VR lectures and course materials from the course at the University of Illinois appear below. The course was taught by Steve LaValle in Spring 2015, 2016, and 2017, and by Anna Yershova in Fall 2015, 2016, and 2017 and Spring 2018.