Mind and matter: the application of brain computer interfaces
Augmenting a person’s cognitive ability by artificial means is becoming more science fact than science fiction, with defence personnel likely to benefit. Dr Helen Dudfield, senior fellow and chief scientist for training and human performance at QinetiQ, examines the ways in which brain computer interfaces (BCIs) could assist in managing workload in the defence sector.
As the availability of data increases, and the need for analysis grows with it, the demands placed on humans in the machine loop are becoming ever greater, challenging even the most capable minds in the face of this cognitive tidal wave.
In many ways, the brain is still a little-understood ‘black box’, with knowledge of neuroscience in its relative infancy. However, as neurological research grows, we’re beginning to unlock far greater insight into how the brain functions, and how we might be able to optimise its performance in the future. Despite neuroscience being a relatively new discipline, BCIs are already playing a big part in its development and application.
BCIs can be seen as a subset of the field of human augmentation – which is concerned with improving human performance through a variety of means. In the simplest terms, BCIs use a variety of sensors and algorithms to bridge the electrical activity of the brain to an external device. By developing a more accurate understanding of brain function, and how different external stimuli might impact the way in which our brains respond to tasks or challenges, researchers can start to explore how we might unlock additional parts of our brain, and how we can better respond to stresses.
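As a loose illustration of that sense–decode–act loop (a toy sketch, not any specific product or research system), a BCI pipeline can be pictured as: read a window of brain-signal samples, extract a frequency-band feature, and map it to a device command. The threshold and the 8–12 Hz alpha band used here are illustrative assumptions.

```python
import math

def band_power(samples, sample_rate, low_hz, high_hz):
    """Crude band power via a discrete Fourier transform (illustrative only)."""
    n = len(samples)
    power = 0.0
    for k in range(1, n // 2):
        freq = k * sample_rate / n
        if low_hz <= freq <= high_hz:
            re = sum(samples[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
            im = sum(samples[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
            power += (re * re + im * im) / n
    return power

def decode_command(samples, sample_rate, threshold):
    """Map alpha-band (8-12 Hz) power to a binary device command."""
    return "select" if band_power(samples, sample_rate, 8, 12) > threshold else "idle"

# Simulated one-second epoch dominated by a 10 Hz alpha rhythm
rate = 128
epoch = [math.sin(2 * math.pi * 10 * t / rate) for t in range(rate)]
print(decode_command(epoch, rate, threshold=1.0))  # strong alpha -> "select"
```

Real systems replace each stage with far more sophisticated machinery – multi-channel sensors, artefact rejection and trained classifiers – but the shape of the loop is the same.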
This is becoming a particularly attractive concept in the quest to improve work-based performance. For example, BCIs could be used to manage workload and mental fatigue, and to improve attention spans.
Overload and underload
Today’s personnel, in defence and beyond, deal with more data (and more data sources) than ever before. The sheer volume of information, and all of the factors that need to be considered when evaluating a decision, can contribute to cognitive overload and analysis paralysis. This sees leaders become overwhelmed by the wealth of options presented, rendering them unable to decide on a clear route forward. From a military perspective, where fast, accurate decisions are crucial, this can lead to costly delays.
There are also several stresses, both internal and external, that can burden military personnel – particularly those on front-line duty. Severe stress, left unchecked, can cloud judgement, hamper performance and prompt decision making that is extreme and out of character. The development of BCIs is giving researchers optimism that we’ll soon have tools to manage high-stress work environments, particularly through their ability to streamline information, and to monitor and react to stimuli.
Defence is only a small subset of BCIs’ possible uses. The technology is also likely to see extensive future use in medicine – for example, to help treat degenerative brain diseases, to rebuild language in people who have suffered a stroke, to help those who are paralysed, and to stimulate areas such as memory and language through brain computer stimulation. Research on the medical applications of BCIs is continuing to power forward at speed. Recently, US-based company Synchron received FDA approval for the use of invasive BCIs in the treatment of amyotrophic lateral sclerosis — a progressive disease that affects nerve cells.
While Synchron is one of only two BCI developers with FDA approval, its design could have significant implications for the treatment of patients, including the rehabilitation of military veterans, moving forward.
BCIs could greatly enhance human machine teaming. ‘Adaptive’ BCIs, with the ability to respond to an operator’s needs, could help to avoid cognitive overload (or underload). For example, if the system knows that an operator is overwhelmed, it could reduce the amount of information displayed or direct focus towards the area that matters most. BCIs could also provide a control mechanism – for example, freeing up the hands and allowing operation at ‘the speed of thought’, decreasing reaction times. Such a capability would be invaluable for fighter pilots or frontline personnel, where rapid reactions are vital.
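The adaptive behaviour described above can be sketched as a simple policy: given an estimated workload score (here a made-up number in [0, 1] standing in for a real neural measurement), the interface decides how much information to surface. The thresholds and alert names are purely illustrative assumptions.

```python
def adapt_display(workload, items):
    """Throttle or enrich what is shown based on an estimated cognitive
    workload score in [0, 1] (a hypothetical metric, highest priority first)."""
    if workload > 0.8:       # overload: show only the single highest-priority item
        return items[:1]
    if workload < 0.2:       # underload: surface more context to keep engagement
        return items
    return items[:3]         # nominal: a manageable shortlist

alerts = ["missile warning", "fuel state", "wingman position", "weather", "comms log"]
print(adapt_display(0.9, alerts))  # an overloaded operator sees one alert
```

A fielded system would, of course, derive the workload estimate from physiological and neural signals in real time rather than take it as an input, but the display-side logic reduces to a decision rule of this kind.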
Learning and deciding
Another area of interest for BCIs is in learning and education. BCIs present a realistic opportunity for researchers and educators to understand and analyse the brain’s reaction to different learning methods. This presents an opportunity to tailor education plans to each individual and facilitate a quicker than usual adaptation to new concepts, skills and ideas.
It’s little wonder then, that BCIs are being explored in an educational context, both for military and non-military use. For example, the US Air Force Research Laboratory has been exploring the use of BCIs combined with augmented reality as part of its Individualized Neural Learning System (iNeuraLS) programme, designed to speed up skill acquisition in personnel.
Artificial intelligence is already in common use as an aid to simple decisions, but recent work at the University of Essex is exploring the use of BCIs as an aid to complex decision making in groups. The researchers’ system identifies ‘neural and behavioural correlates of decision confidence…for real-time estimates and prediction of decision confidence and user mental states…as well as identifying the behavioural, physiological and neural markers of effective group cooperation.’
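One simple way to picture how per-person confidence estimates could feed a group decision (a hypothetical sketch, not the Essex researchers’ actual method) is a confidence-weighted vote, where each member’s choice counts in proportion to a confidence score that, in a real system, would be estimated from neural and behavioural markers:

```python
def group_decision(votes):
    """Confidence-weighted majority vote over (choice, confidence) pairs.
    Confidence scores would come from neural/behavioural markers in a
    real system; here they are supplied directly for illustration."""
    totals = {}
    for choice, confidence in votes:
        totals[choice] = totals.get(choice, 0.0) + confidence
    return max(totals, key=totals.get)

# Two hesitant 'yes' votes are outweighed by one highly confident 'no'
print(group_decision([("yes", 0.3), ("yes", 0.3), ("no", 0.9)]))  # -> "no"
```

The appeal of weighting by confidence is that it lets a well-calibrated minority view prevail over an uncertain majority, which is exactly where unweighted voting tends to fail.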
// BCIs could offer a way to optimise brain functions among military personnel operating in high stress environments, or platforms that require action at the speed of thought. Credit: Shutterstock
What next for BCIs?
It’s clear that research and the application of BCIs will continue to advance rapidly. Neuroscientists are continuing to explore the potential that the technology has in adapting the way we work, respond to stress, and deal with injury or illness. What is less certain is the specific form that BCIs will take in the mid-to-long term future.
As of today, electroencephalogram (EEG) ‘nets’ are a fairly common, non-invasive form of BCI, whereas more invasive arrays of fine wires that sense the responses of individual neurons (or groups of neurons) are at an earlier stage. The latter face problems with infection and signal instability. This, combined with ethical and risk issues, has led to hesitancy in many countries to even test invasive devices.
However, substantial research has been done into non-invasive BCIs over the last few years – particularly in the defence industry. We’re also seeing promising applications in wellbeing and workload management, but even so, such devices remain very much in the early stages of development.
We believe that non-invasive BCIs will likely become part of the norm in human augmentation systems, helping us to optimise decision making, performance and knowledge retention. However, the future of invasive systems is less certain, and it is currently difficult to envisage widescale adoption outside of a medical context.
// Main image: Development of brain computer interface technology is moving at pace, with a range of applications possible in the defence space, among others. Credit: Shutterstock