NextMind ships its real-time brain-computer interface Dev Kit for $399

NextMind has started shipping its real-time brain-computer interface Dev Kit for $399. The device translates brain signals into digital commands, allowing you to control computers, AR/VR headsets, and IoT devices (lights, TVs, music, games, and so on) with your visual attention.

Paris-based NextMind is part of a growing number of startups building neural interfaces that rely on machine learning algorithms. There are invasive devices like the one from Elon Musk’s Neuralink, which in August revealed a prototype showing readings from a pig’s brain via a coin-shaped device implanted under the skull. There are also noninvasive devices like the electromyography wristband from Ctrl-labs, the startup Facebook acquired in September 2019, which translates neuromuscular signals into machine-interpretable commands. NextMind is developing a noninvasive device: an electroencephalogram (EEG) sensor worn on the back of your head, where your brain’s visual cortex is located.

When we spoke with NextMind CEO Sid Kouider last year, he promised the kits would begin shipping in Q2 2020. Then the pandemic hit. “We had about three, four months of delays due to COVID-19, but not more than that in terms of production,” Kouider told VentureBeat. The company shipped “hundreds” of Dev Kits in November after producing its first thousand units. Another thousand units are set to be produced next month.

Hands on with NextMind’s Dev Kit

The NextMind Dev Kit includes a NextMind Sensor, the brain-sensing wearable with an adjustable headband, and the NextMind Engine, which comprises real-time machine learning algorithms that transform neural signals into commands. Developers also get the NextMind SDK, which features ready-to-use Unity resources, such as tutorials, demo apps and games, and code building blocks.
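
To give a sense of what building against such an engine looks like, here is a rough, hypothetical Python sketch of the basic flow: an application registers a handler for each on-screen target, and the engine fires that handler when it decodes the wearer’s visual attention. The class and method names are invented for illustration; the actual SDK is a Unity (C#) package whose API may look quite different.

```python
# Hypothetical sketch of the developer-facing flow the Dev Kit description implies:
# the engine decodes which on-screen "tag" the wearer is focusing on and fires a callback.
# All names here are invented for illustration, not NextMind's real API.

from typing import Callable, Dict


class FocusInterface:
    """Toy stand-in for an engine that turns decoded visual attention into commands."""

    def __init__(self) -> None:
        self._handlers: Dict[str, Callable[[], None]] = {}

    def on_focus(self, tag: str, handler: Callable[[], None]) -> None:
        """Register a callback to run when the wearer focuses on `tag`."""
        self._handlers[tag] = handler

    def dispatch(self, decoded_tag: str) -> None:
        """Called by the (hypothetical) decoding engine with the attended tag."""
        handler = self._handlers.get(decoded_tag)
        if handler is not None:
            handler()


# Usage: wire a few UI targets, mirroring the TV-style controls described in the article.
engine = FocusInterface()
engine.on_focus("play", lambda: print("play content"))
engine.on_focus("mute", lambda: print("mute sound"))
engine.dispatch("play")  # in practice the engine would call this from live EEG decoding
```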

NextMind raised a $4.6 million seed round in December 2018, including a $1 million innovation award from Bpifrance and investments from private contributors like the Eye Tribe founder Sune Alstrup Johansen and Unity Technologies founder David Helgason. NextMind’s relationship with Helgason helped make the Unity tooling happen and ensured the Dev Kit is compatible with platforms like Windows 10, macOS, Oculus, HTC Vive, and HoloLens.

Every time you use a NextMind device, the company recommends you calibrate the wearable first. It takes about 45 seconds to create a neural profile. In a few months, the team hopes to entirely eliminate this step.

“Getting the neural profile; we are hoping to get rid of this, and for that we just need a lot of data,” Kouider said. “We are on our way to solving this issue. I really think that in the next few months we’re going to be able to have a calibration-free procedure.” The team opted to keep the calibration step in this first version of the software as a precaution. They prefer to be conservative and ensure the device works, even with the extra step.
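
Conceptually, a short calibration pass like this amounts to collecting labeled brain-signal windows while the user focuses on cued targets and fitting a small per-user model, the “neural profile.” The sketch below illustrates that idea with stand-in data and scikit-learn; NextMind has not published how its calibration actually works, so every detail here is an assumption.

```python
# Minimal, assumed sketch of a ~45-second calibration pass: cue the user to focus on
# known targets, collect short EEG feature windows, and fit a per-user classifier
# (the "neural profile"). Data shapes and features are placeholders.

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

N_TARGETS = 4           # number of on-screen targets cued during calibration (assumed)
TRIALS_PER_TARGET = 12  # ~45 s of data split into short labeled windows (assumed)
N_FEATURES = 32         # e.g. band-power features per electrode (assumed)

# Placeholder for real EEG features recorded while the user focuses on each cued target.
X = rng.normal(size=(N_TARGETS * TRIALS_PER_TARGET, N_FEATURES))
y = np.repeat(np.arange(N_TARGETS), TRIALS_PER_TARGET)

# Fit the per-user "neural profile": a classifier mapping features -> attended target.
profile = LogisticRegression(max_iter=1000).fit(X, y)

# At runtime, each new EEG window is classified against the stored profile.
new_window = rng.normal(size=(1, N_FEATURES))
print("decoded target:", profile.predict(new_window)[0])
```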

Improving the machine learning models

The company plans to issue regular updates that improve the machine learning algorithms in the NextMind Engine. “We’re going to be able to basically update probably every few weeks to even faster and more robust algorithms,” Kouider said. “The hardware — we’re not going to be able to exchange it, but the algorithms are going to really improve.” Almost every update will include an updated machine learning model, he promised.

Kouider said removing the need for calibration altogether was a transfer learning problem involving time-domain targets. Ultimately, the team hopes to have a device that works immediately when anyone places it on the back of their head. Additionally, thanks to “cognitive plasticity training,” you should get better at using NextMind after just a few sessions. Just like when you first use a mouse or a touchscreen, you get the hang of it with practice.
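
Framed as transfer learning, going calibration-free roughly means pretraining one decoder on pooled data from many users and then applying it to a new wearer directly, or adapting it with only a handful of trials. The sketch below shows that pattern with synthetic data; it is not NextMind’s pipeline, just an illustration of the idea.

```python
# Hedged sketch of the "calibration-free" idea as transfer learning: pretrain a shared
# decoder on pooled data from many users, then use it zero-shot for a new wearer or
# fine-tune it with a few trials. Shapes and data are invented for illustration.

import numpy as np
from sklearn.linear_model import SGDClassifier

rng = np.random.default_rng(1)
N_FEATURES, N_TARGETS = 32, 4
classes = np.arange(N_TARGETS)

# Pooled (features, labels) from many previous users -- the "lot of data" Kouider mentions.
X_pool = rng.normal(size=(5000, N_FEATURES))
y_pool = rng.integers(0, N_TARGETS, size=5000)

shared_model = SGDClassifier(loss="log_loss", random_state=0)
shared_model.partial_fit(X_pool, y_pool, classes=classes)

# Option A: calibration-free -- apply the shared model to a new user as-is.
new_user_window = rng.normal(size=(1, N_FEATURES))
print("zero-shot prediction:", shared_model.predict(new_user_window)[0])

# Option B: quick adaptation -- a few labeled trials from the new user nudge the
# shared weights instead of training a profile from scratch.
X_few = rng.normal(size=(8, N_FEATURES))
y_few = rng.integers(0, N_TARGETS, size=8)
shared_model.partial_fit(X_few, y_few)
print("adapted prediction:", shared_model.predict(new_user_window)[0])
```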

NextMind also plans to offer developers a “boost mode” that lowers the minimum threshold depending on their use case. If a developer is creating a zombie game, for example, they might want to make it easier for the player to shoot as quickly as possible when they focus on their targets.
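
A threshold knob like that could be as simple as the sketch below: the engine only fires a command when the decoder’s confidence for one target clears a threshold, and boost mode lowers the bar so actions trigger sooner at the cost of more false positives. The values and function names are assumptions, not documented NextMind behavior.

```python
# Rough sketch of a "boost mode" knob: trigger a command only when one target's
# decoded probability clears a threshold; boost mode lowers that threshold.
# Threshold values and names are illustrative assumptions.

from typing import Optional, Sequence


def pick_target(probabilities: Sequence[float],
                tags: Sequence[str],
                boost_mode: bool = False) -> Optional[str]:
    """Return the attended tag if its probability clears the (mode-dependent) threshold."""
    threshold = 0.6 if boost_mode else 0.85  # illustrative values only
    best = max(range(len(tags)), key=lambda i: probabilities[i])
    return tags[best] if probabilities[best] >= threshold else None


tags = ["zombie_left", "zombie_right", "reload"]
probs = [0.7, 0.2, 0.1]
print(pick_target(probs, tags))                   # None: below the default threshold
print(pick_target(probs, tags, boost_mode=True))  # "zombie_left": boost mode fires sooner
```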

Demos and use cases

The Dev Kit ships with five demos. Two are very similar to the demos I tried with the prototype last year. Pinpad shows you how to open a lock screen with your mind, and Neuro TV lets you change channels, play or pause the content, and mute or unmute the sound.

The platformer game, which you play with your mind and a game controller, is short but slightly more complex than the previous version.

There are also two new apps: a music composer and Brickbreaker (which replaces the Duck Hunt game I played). Just like when I tried the device last year, all the apps work as promised. And just like last year, there’s still no killer app. That’s what the developers are tasked with creating.

“I agree with you the use case is not totally clear, but that’s also why we’re giving it to developers to build out their own use cases,” Kouider said. “I’m not even sure we’re going to be able to find the killer use case by ourselves.”

Enterprise kit

NextMind is also in discussions with 25 companies about contracts for its enterprise kits and has already signed with three firms. Those partnerships involve integrating the technology into future products that will “come out in the next few years.” Kouider declined to name the three firms but said they include a carmaker, an entertainment company, and a gaming company. He is particularly excited about the gaming one.

“For the enterprise track, we’re also doing integration of both hardware and software components,” Kouider said. “We codevelop together to integrate our platform into their hardware component. Or if they have strong software, if they don’t want to use Unity, for instance, we give them access to our core API.”

Future use cases

Last year, one of the demos let me change the color of the light a lamp shone by focusing my visual attention on different colored blocks. I asked for an update on this experience. The team now envisions partnering with a manufacturer so future products can display unique patterns. Users wearing a NextMind device could then visualize a unique pattern to control the corresponding product.

“Acting on physical objects is like mind over matter,” Kouider said. “It’s a very different experience.”

Last year, Kouider also talked about a second track for decoding visual imagination. This track would rely on the same hardware but require different software and algorithms to handle those tasks. That track is not yet supported but is still in the works.

Regardless of whether you’re imagining something in order to control an object or looking directly at it, you still have to be wearing a NextMind device. It’s not very comfortable, whether you wear it by itself or with a VR headset. Kouider promised the next generations will be smaller. He declined to talk about version 2, the smaller iteration of NextMind’s device that the company has already been working on for more than a year.

“We are very active in R&D regarding miniaturization, regarding having better algorithms, and regarding integrating our platform into future AR products,” Kouider said. “We’re also very excited about IoT — being able to control basically real physical objects. We want you to have touchless interactions. You’re going to be able to control your lamp behind you, your sink, the door switches. This is a very exciting use case.”

All of that research requires money, and NextMind is planning to raise a round next year. “We’re going to start our series A — right now we’re concentrated on the product — by early 2021.”


Source: https://venturebeat.com/2020/12/07/nextmind-real-time-brain-computer-interface-dev-kit/
