Wednesday, 8 January 2020

NextMind Unveils $400 Brain-Computer Interface Developer Kit

https://ift.tt/2QXrIxT

NextMind, a Paris-based brain-computer interface (BCI) startup, debuted a $400 neural interface dev kit at CES this week, something the company intends to release to developers in the first half of 2020.

NextMind’s device is a non-invasive electroencephalogram (EEG), a well-established method of measuring the voltage fluctuations of neurons from outside the skull. EEGs have long been used in medicine, neurology, cognitive science, and a number of related fields, though compared to more invasive methods you might describe it as trying to figure out what’s happening in a stadium by listening to the crowd’s roar from outside. You can infer some things, but you’re not getting the whole picture.
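
For a rough sense of what “measuring voltage fluctuations” means in practice, here’s a minimal sketch of how raw EEG samples are commonly summarized into the classic frequency bands (theta, alpha, beta). This is a generic illustration, not NextMind’s pipeline; the sample rate, channel count, and band limits are assumptions.

```python
import numpy as np
from scipy.signal import welch

# Assumed parameters for illustration only -- not NextMind's published specs.
FS = 250            # samples per second per electrode
N_CHANNELS = 8      # the dev kit has eight prong-like electrodes

# One second of synthetic EEG data: (channels, samples), in microvolts.
eeg = np.random.randn(N_CHANNELS, FS) * 10

# Classic EEG frequency bands (Hz).
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def band_power(signal, fs, bands):
    """Estimate average power per band, per channel, via Welch's method."""
    freqs, psd = welch(signal, fs=fs, nperseg=fs)
    return {name: psd[:, (freqs >= lo) & (freqs < hi)].mean(axis=1)
            for name, (lo, hi) in bands.items()}

for name, per_channel in band_power(eeg, FS, BANDS).items():
    print(f"{name:>5}: {per_channel.round(3)}")
```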

The device attaches to the back of the head with a simple forehead strap; eight prong-like electrodes pick up brain waves from the visual cortex. The company is pitching a number of use cases, one of which is its potential application in VR headsets.

Photo captured by Road to VR

The name of the game with NextMind’s EEG dev kit is measuring user intent. NextMind advisor and investor Sune Alstrup told me this is intimately connected to the company’s machine learning algorithms, which analyze and classify the resulting brain waves in real time to determine what a user is visually focusing on.
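
To make the idea concrete, here’s a rough sketch of what “classify brain waves in real time” could look like: a generic classifier trained on short windows of EEG-derived features, predicting which of a few on-screen targets the user is attending to. NextMind hasn’t disclosed its model or features, so the classifier choice, window size, and labels below are purely illustrative.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical training data: each row is a feature vector extracted from one
# short EEG window (e.g., band power per electrode), and each label is the
# target the user was focusing on when that window was recorded.
rng = np.random.default_rng(0)
X_train = rng.normal(size=(300, 24))      # 300 windows x 24 features (assumed)
y_train = rng.integers(0, 3, size=300)    # 3 candidate on-screen targets

clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)

def predict_focus(window_features):
    """Return (predicted target, confidence) for one incoming EEG window."""
    probs = clf.predict_proba([window_features])[0]
    return int(np.argmax(probs)), float(probs.max())

target, confidence = predict_focus(rng.normal(size=24))
print(f"user appears to be focusing on target {target} (p={confidence:.2f})")
```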

Alstrup is the founder and former CEO of eye-tracking company The Eye Tribe, which was acquired by Facebook in late 2016. To him, eye-tracking alone doesn’t cut it when it comes to building a complete picture of what a user actually intends to manipulate when looking at any given object, a problem he referred to as “King Midas’ Golden Gaze.” Like the legendary King’s ability to turn everything to gold with a single touch, a human’s eye ‘touches’ everything it can see without revealing where the user’s focus actually lies. To sidestep this somewhat, eye-tracking based UI modalities typically rely on how long you hold your gaze on an object; in VR this usually means some form of countdown timer, to make sure what you’re looking at is important enough to be selected or manipulated.
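
For context, dwell-based gaze selection usually boils down to something like the sketch below: an object only counts as “selected” once the gaze has rested on it past a fixed threshold. The threshold value and structure here are assumptions for illustration, not any particular headset’s implementation.

```python
import time

DWELL_THRESHOLD = 1.0   # seconds of sustained gaze before a selection fires (assumed)

class DwellSelector:
    """Fire a selection when gaze stays on the same object long enough."""

    def __init__(self, threshold=DWELL_THRESHOLD):
        self.threshold = threshold
        self.current_target = None
        self.gaze_start = None

    def update(self, gazed_object):
        """Call every frame with whatever object the gaze ray currently hits."""
        now = time.monotonic()
        if gazed_object != self.current_target:
            # Gaze moved to a new object (or to nothing): restart the countdown.
            self.current_target = gazed_object
            self.gaze_start = now
            return None
        if gazed_object is not None and now - self.gaze_start >= self.threshold:
            self.gaze_start = now   # reset so the selection doesn't re-fire every frame
            return gazed_object     # selection event
        return None
```

Every eye movement resets the timer, which is exactly the friction Alstrup argues a direct intent signal could remove.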

It’s clear that the next generation of VR headsets is heading down the path of integrated eye-tracking though, and Alstrup sees EEG data working in concert with it.

I got a chance to try out NextMind’s dev kit at CES this week, and while it’s undoubtedly early days, the company is clearly confident enough in its appeal to productize the device (albeit only for devs at this point) and bring it to market at a relatively low cost. Priced at $400, the kit is first going to select partners, with a wider release to developers following in Q2 2020.

The wireless device communicates via Bluetooth, doing a portion of the processing on-device whilst offloading the machine learning tasks to the same PC driving the VR experience. NextMind wouldn’t say exactly how much processing power is needed to run it, though I was told it could work with a more modest setup like a standalone VR headset, provided hardware manufacturers integrated the technology internally.
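
NextMind hasn’t published its protocol, but the split described above conceptually amounts to a host-side loop like the hypothetical sketch below: light filtering happens on the device, the PC buffers incoming frames into windows, and the heavier classification runs alongside the VR app. The packet format, window size, and callback names are all made up for illustration.

```python
from collections import deque

WINDOW_SIZE = 64   # EEG frames per classification window (assumed)

def host_loop(receive_frame, classify_window, apply_selection):
    """Hypothetical host-side pipeline: the device streams frames, the PC runs the ML.

    receive_frame()    -- blocks until the next filtered EEG frame arrives over the wireless link
    classify_window()  -- the heavier machine learning step, run on the PC driving the VR experience
    apply_selection()  -- forwards the decoded intent to the running VR app
    """
    window = deque(maxlen=WINDOW_SIZE)
    while True:
        window.append(receive_frame())   # lightweight work already done on-device
        if len(window) == WINDOW_SIZE:
            intent = classify_window(list(window))
            if intent is not None:
                apply_selection(intent)
```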

Photo captured by Road to VR

I was treated to two demos at NextMind’s CES booth. In my first demo, done outside of a VR headset, I was told to focus on one of three small cubes that flashed sequentially with different colors (red, blue, and green). Focusing on a cube for long enough would turn an adjacent lamp to the color I was focusing on at that moment. It worked fairly quickly and reliably for a while, though I was told that the busy CES show floor, polluted with Bluetooth cross-talk, would eventually lead to a gradual desynchronization and a failure to respond accurately. I experienced this after a few minutes of accurately changing the lamp’s color; while it lasted, I really felt like some sort of off-brand Jedi Padawan. I was then ushered to an HTC Vive fitted with the TIE Fighter-shaped puck. After a few false starts and some fiddling with the device to get it properly seated on my noggin, I was able to get some sense of how NextMind is implementing its technology, at least in the context of today’s demos.

The demo was fairly similar to the cube one: focus on an alien’s flashing, pulsing brain until the system determines you’re actually looking and maintaining your gaze. Both demos made heavy use of sequentially flashing lights, which generate a synchronized pattern for NextMind to measure and then interpret as a Boolean ‘yes, I see this signal’ or ‘no, I don’t see this signal’.
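
One common way to turn flashing targets into that kind of Boolean answer is to correlate the visual-cortex response against each target’s known flicker sequence and accept the best match only if it clears a threshold. NextMind hasn’t described its decoder, so this sketch is a generic illustration of the idea; the sequences, threshold, and toy data are assumptions.

```python
import numpy as np

DETECTION_THRESHOLD = 0.3   # assumed correlation cutoff for a 'yes, I see it'

def best_matching_target(eeg_signal, flicker_sequences):
    """Correlate an EEG trace with each target's known flicker pattern.

    Returns (target index, correlation) if any pattern clears the threshold,
    or (None, best correlation) if the user doesn't seem to be watching any target.
    """
    scores = [np.corrcoef(eeg_signal, seq)[0, 1] for seq in flicker_sequences]
    best = int(np.argmax(scores))
    if scores[best] >= DETECTION_THRESHOLD:
        return best, scores[best]
    return None, scores[best]

# Toy example: three targets, each tagged with a distinct on/off sequence.
rng = np.random.default_rng(1)
sequences = [rng.integers(0, 2, 200).astype(float) for _ in range(3)]
# Fake a visual-cortex response that loosely follows target 1's flicker.
response = sequences[1] + rng.normal(scale=0.8, size=200)
print(best_matching_target(response, sequences))
```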

This, the company told Venture Beat, would change in the future though: the blinking lights aren’t important in and of themselves, what matters is merely the change in the display.

“It could be a change in color, for instance,” NextMind CEO Sid Kouider told Venture Beat. “Your brain has to process new information. We need to generate a neural response.”

In the end, I’m fairly skeptical of the benefits of EEG over integrated eye-tracking at this point. NextMind says it’s aiming for a point where its machine learning stack can decode the shape of an object by itself, without relying on object color and a brightly flashing pattern. That said, I still don’t see how reading a signal that has to travel from the eyeball through the brain, skull, and scalp is any better than a smartly designed UI that compensates for eye-tracking’s inherent shortcomings. Then again, I’m not a neuroscientist, so I’ll just have to keep an open mind for now.

The post NextMind Unveils $400 Brain-Computer Interface Developer Kit appeared first on Road to VR.



