The idea of a brain-computer interface (BCI) is well-worn in literature and cinema. Dumbledore’s Pensieve, and Harry’s subsequent access to it, is a mechanism by which, with perfect transcription fidelity, a memory flows elegantly from one brain into another. Rowling coins the term “legilimency,” playing with Latin and wizarding semantics, to describe the process of reading minds. In doing so, she also broaches the topic of consent: Harry’s frenemy Snape worries that Voldemort might attempt to access Harry’s thoughts, and he helps Harry build what amounts to a cognitive firewall.
In an entirely different genre, “Vanilla Sky” offers a BCI in which the protagonist, Tom Cruise’s David Aames, is placed in a lucid dream state1 while his body is cryogenically frozen. Almost twenty years earlier, Christopher Walken starred in a film in which a system records the sensory and emotional experience of one individual so that those cognitive events can be experienced by another human being. The military gets involved, and again, the ethics of proper use are discussed and depicted.2 Putting aside philosophical questions of consciousness and free will, these are all examples of brain-computer interfaces.
Our lives are increasingly enmeshed and entangled with technology. Don’t believe me? I’ll bet you can remember your childhood phone number with minimal effort. Now try to remember any phone number you’ve learned in the past five years. I’ll wait. The tools we use to convey our intentions to that technology (keyboards, mice, smartphone touch screens, Siri, etc.) are as fundamental to our lives as shirts and shoes.3 What if our thoughts alone were sufficient to communicate intention to a machine and receive a response? That technology currently sits on the boundary separating science fiction from science fact.4