13.4 Brain-Machine Interfaces

The ultimate interface between humans and machines could be through direct sensing and stimulation of neurons. One step in this direction is to extract physiological measures, which were introduced in Section 12.3. Rather than using them to study VR sickness, we could apply measures such as heart rate, galvanic skin response, and respiration to adjust the VR experience dynamically, optimizing for goals such as excitement, fear, comfort, or relaxation. Continuing further, we could apply technology designed to read the firings of neurons so that the VR system responds to them by altering the visual and auditory displays. Users can learn that certain thoughts have an associated effect in VR, resulting in mind control. The powers of neuroplasticity and perceptual learning (Section 12.1) could enable them to comfortably and efficiently move their avatar bodies in the virtual world. This might sound like pure science fiction, but substantial progress has been made. For example, neuroscientists at Duke University have recently trained monkeys to drive wheelchairs using only their thoughts [263]. In the field of brain-machine interfaces (alternatively, BMI, brain-computer interfaces, or BCI), numerous other experiments have connected humans and animals to mechanical systems and VR experiences via their thoughts [175,177,187]. Surveys of this area include [90,236,358].
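
To make the closed-loop idea concrete, the following is a minimal sketch in Python of a physiological biofeedback loop that adjusts the experience toward a target level of arousal. The sensor and display functions are hypothetical stand-ins for whatever device drivers and rendering parameters a particular system exposes, and the baselines, weights, and gain are arbitrary placeholders rather than validated values.

import time

def read_heart_rate():
    # Placeholder for a real sensor driver; returns beats per minute.
    return 72.0

def read_gsr():
    # Placeholder; returns skin conductance in microsiemens.
    return 4.0

def read_respiration_rate():
    # Placeholder; returns breaths per minute.
    return 14.0

def set_scene_intensity(value):
    # Placeholder; a real system would map this to lighting, pacing, sound, etc.
    pass

def estimate_arousal(hr, gsr, resp):
    # Combine crudely normalized measures into an arousal score in [0, 1].
    # Baselines and weights are arbitrary placeholders.
    hr_norm = min(max((hr - 60.0) / 60.0, 0.0), 1.0)
    gsr_norm = min(max(gsr / 10.0, 0.0), 1.0)
    resp_norm = min(max((resp - 12.0) / 18.0, 0.0), 1.0)
    return 0.5 * hr_norm + 0.3 * gsr_norm + 0.2 * resp_norm

def biofeedback_loop(target_arousal=0.3, gain=0.1, period=1.0):
    # Nudge a scene intensity parameter so that measured arousal tracks a
    # target: a low target for relaxation, a high target for excitement or fear.
    intensity = 0.5
    while True:
        arousal = estimate_arousal(read_heart_rate(), read_gsr(),
                                   read_respiration_rate())
        error = target_arousal - arousal
        intensity = min(max(intensity + gain * error, 0.0), 1.0)
        set_scene_intensity(intensity)
        time.sleep(period)

The same loop structure would apply if the input were decoded neural activity instead of peripheral physiological measures; only the estimation step would change.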


