Martin Anderson
A new paper from the University of Memphis examines the potential of consumer-grade EEG equipment to provide meaningful and accurate recognition of the wearer's mental state. This opens up possibilities not only for useful medical or health-oriented implementations, but also for incorporating similar sensors into virtual reality (VR) or augmented reality (AR) headsets, offering extraordinary scope for feedback loops in medicine, entertainment – and, inevitably, marketing.
The paper Mental Recognition via Wearable EEG [PDF] details a study in which 16 subjects, ten men and six women, had their reactions to both instructional videos and cat videos measured by the InteraXon Muse, a commercial EEG headset.
The $300 Muse device has notably fewer sensors than medical-grade EEG monitors and cannot record the same level of detail as those far more cumbersome and elaborate hospital set-ups. As such, it is marketed prosaically by its manufacturer as a meditation aid, and has received positive reviews for this application of its capabilities.
Curious to estimate the analytical value of such a casual EEG monitor, UoM researchers Pouya Bashivan, Irina Rish and Steve Heisig set various machine-learning algorithms to work filtering out artefacts in the feedback from the sixteen subjects watching the educational or amusing videos.
Many of the problems faced in getting clean EEG data are shared with higher-end medical equipment, including interference from muscle activity in the millivolt range – far stronger than the cortical signal itself – produced by the movement of head muscles, and most particularly by the facial expressions likely to form in response to stress, happiness, discomfort, or other emotions provoked by the experimental stimuli.
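The paper does not spell out its artifact-removal pipeline here, but because muscle (EMG) activity is so much larger in amplitude than cortical EEG, even a crude amplitude gate removes the worst contamination. The following is a minimal illustrative sketch – the function name, sampling rate, threshold and simulated data are all assumptions, not the researchers' method:

```python
import numpy as np

def reject_artifact_epochs(signal, fs=256, epoch_s=1.0, threshold_uv=100.0):
    """Split a single-channel EEG trace (in microvolts) into fixed-length
    epochs and drop any epoch whose peak-to-peak amplitude exceeds the
    threshold. Cortical EEG is typically tens of microvolts, while muscle
    artifacts are often an order of magnitude larger, so a simple
    amplitude gate discards the most contaminated segments."""
    samples = int(fs * epoch_s)
    n_epochs = len(signal) // samples
    epochs = signal[:n_epochs * samples].reshape(n_epochs, samples)
    ptp = epochs.max(axis=1) - epochs.min(axis=1)
    return epochs[ptp < threshold_uv]

# Toy example: 10 s of low-amplitude noise with one simulated muscle burst.
rng = np.random.default_rng(0)
eeg = rng.normal(0, 5, 256 * 10)          # ~5 µV background activity
eeg[256 * 3:256 * 3 + 50] += 500          # EMG-like burst lands in epoch 3
clean = reject_artifact_epochs(eeg, fs=256)
print(clean.shape)                         # → (9, 256): one epoch dropped
```

Real pipelines use far more sophisticated techniques (independent component analysis, regression against EOG channels), but the principle of flagging implausibly large excursions is the same.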
The Muse’s limited array of sensors – four nodes at prefrontal and occipital/temporal positions – still yielded enough data, in terms of oscillatory responses, to satisfactorily deduce whether the subject’s mental state in response to a video was ‘rational’ or ‘emotional’.
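"Oscillatory responses" here refers to power in the canonical EEG frequency bands (theta, alpha, beta and so on), which typically serve as the input features for this kind of classifier. As a hedged sketch of that feature-extraction step – the band definitions, sampling rate and synthetic signals below are illustrative assumptions, not the paper's exact setup – band power can be estimated with Welch's method:

```python
import numpy as np
from scipy.signal import welch

# Approximate canonical EEG bands in Hz (conventions vary slightly).
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def band_powers(signal, fs=256):
    """Return the mean spectral power of the signal in each band,
    estimated from a Welch periodogram."""
    freqs, psd = welch(signal, fs=fs, nperseg=fs * 2)
    return np.array([psd[(freqs >= lo) & (freqs < hi)].mean()
                     for lo, hi in BANDS.values()])

# Toy demo: a 10 Hz (alpha-dominant) trace vs a 20 Hz (beta-dominant) trace.
fs = 256
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(1)
relaxed = np.sin(2 * np.pi * 10 * t) + 0.1 * rng.normal(size=t.size)
engaged = np.sin(2 * np.pi * 20 * t) + 0.1 * rng.normal(size=t.size)
fa, fb = band_powers(relaxed, fs), band_powers(engaged, fs)
names = list(BANDS)
print(names[fa.argmax()], names[fb.argmax()])  # → alpha beta
```

A feature vector like this, computed per channel and per time window, is the sort of input a machine-learning classifier would use to separate 'rational' from 'emotional' responses.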
Subjects with thick hair also presented a challenge to extracting clean data from the Muse, but some of the limitations of a consumer-level EEG are practical rather than economic: better sensor contact is established with gel rather than dry electrodes, which also prevents sweat, slippage and other impeding factors from interrupting the data flow. However, the forward-looking possibilities of the research make convenience of use essential.
Some scientific attention has already been paid to the viability of consumer-level EEG equipment in ‘serious’ applications. The Emotiv EPOC range, which offers higher resolution and more sensors at a $400-500 price tag, came under criticism in this regard from Matthieu Duvinage et al at Belgium’s Faculté Polytechnique de Mons, whilst Hiran Ekanayake of the University of Colombo School of Computing in Sri Lanka appraises the same headset with rather more optimism about its potential.
A look at the elegant sculpting of the Muse is likely to prompt a connection in any technology fan’s mind with the increasingly compact and well-styled headsets heading to market – eventually – from Oculus and Microsoft’s HoloLens research program.
In 2013 Chris Zaharia incorporated the Emotiv EPOC into an Oculus Rift-based system which enabled ‘mind control’ of virtual objects and environments, combined with hand-movement input via the Hydra gaming controller. Sample applications included the fields of architecture, medicine and gaming.
However, Emotiv seems to have hamstrung VR enthusiasts by confusingly splitting its range into consumer-level and ‘research’ headsets, and by furnishing a licence agreement that forces owners to market any apps developed with an EPOC exclusively through the company’s own store – a policy that has come in for criticism from Oculus fans. Any EPOC purchaser who needs raw EEG data must pay an additional $300 for an individual licence for the headset, whilst educational and company licences add $2,600 and $6,500, respectively – barriers which make it hard to experiment with the technology at the same low risk as a Raspberry Pi project.
In May of last year researchers at the University of Southern California and Facebook’s Oculus division debuted a method of transferring the facial expressions of an Oculus Rift user onto an in-game avatar. If a VR/AR headset actually knew how its wearer felt, that functionality could be folded into a far broader gamut of research and entertainment possibilities.
It seems likely that such an intimate method of gauging response would also be of great interest to focus-group and marketing researchers, who could run far more densely populated tests, given access to a large commercial user base equipped with standardised EEG monitoring capabilities.