NeuroSky and Unity
One of the coolest things about working at a tool company is the chance to see the interesting, innovative and exciting things people do with that tool. This week I got to take advantage of that benefit by visiting the San Jose office of NeuroSky. These folks have developed a group of products built around bio-sensors that pick up on your brain waves and then translate them for use as input in a variety of applications. The idea of "thought-based input" appeals to a variety of industries and use cases, whether for medical analysis, development of assistive devices or, of course, use in next-generation game content.
The story of how I found out about these people took a rather indirect path. I first heard of their technology through a friend and Unity user, Barry, while he was working with SFSU’s Institute for the Next Generation Internet, but I didn’t actually see it myself until GDC 08 in San Francisco. I was cruising the expo floor and passed by a booth with a rather large crowd gawking at an on-screen demo, and that demo was clearly (to me) using what appeared to be Unity’s default GUI skin. Upon closer inspection, and after asking a few questions, it turned out that they were (and still are) using Unity to help showcase their technology.
So just what is it that they do or make? It’s a headset that you wear; it has a few sensors that lightly touch your skin, and it plugs into your computer as an input device. They then load their demo, created in Unity, and you move around the world using the WASD/arrow keys. There are various objects in the demo world; you walk up to them and can move (push/pull/lift) or burn (set fire/explode) them simply by focusing your concentration – no keyboard, no mouse, just thoughts! It took a bit of getting used to, as you have to either concentrate (focused attention: push/pull or burn) or clear your mind and relax (lift). Of course, while you’re doing that your brain is thinking of a million other distracting things, but once you get the hang of it, it gets really cool.
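To give a feel for how that control scheme might work under the hood, here is a minimal sketch in Python. Everything in it is an assumption for illustration: the `attention`/`meditation` readings, the 0–100 scale, the threshold values and the `classify_action` function are all hypothetical, not NeuroSky’s actual API or the demo’s real code.

```python
# Hypothetical sketch: map two mental-state readings (assumed to be on a
# 0-100 scale) to the demo actions described above. The thresholds and
# names are illustrative only.

def classify_action(attention, meditation,
                    focus_threshold=70, relax_threshold=70):
    """Return a demo action for one pair of headset readings."""
    if attention >= focus_threshold:
        return "push"   # focused concentration: push/pull or burn
    if meditation >= relax_threshold:
        return "lift"   # a clear, relaxed mind: lift
    return "idle"       # neither state is strong enough: do nothing

# A few sample readings
print(classify_action(85, 20))  # focused -> "push"
print(classify_action(10, 90))  # relaxed -> "lift"
print(classify_action(40, 40))  # in between -> "idle"
```

The point of the thresholds is exactly the learning curve described above: until you can hold your attention (or relaxation) reliably above the cutoff, the demo simply does nothing.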
In the videos found on YouTube you’ll notice that not only are they using brain activity readings, they’re also using partner technologies to track head and eye movement! They’re actually able to track eye movement and use that movement, including blinking, as an input and control mechanism! I’ve seen various reports of this sort of technology, but this was definitely the first time I’d seen it in person, and I was suitably impressed with what they had to show.
You might be wondering why they’re using Unity; I was too at first. Given that their core business focus is an input technology, not application or game engine development, they needed a tool that would let them quickly test and showcase their content through actual demos, and Unity is the ticket. As proof of that, they first adopted Unity as a demo development tool last November and were able to write their own custom plugin and develop their Unity demo in time for GDC in February! That’s three months to learn the tool, code up a custom plugin to accept their device input and develop demo content to showcase its abilities. Another clear case of how Unity’s ease of use, intuitive UI and powerful development abilities are meeting the needs of many, including an increasing number of people not directly involved in game development themselves. Cool.
I’d like to thank Johnny Liu from NeuroSky for playing a bit of phone and email tag with me in the weeks since GDC and for helping arrange the meeting. Of course, thanks go out to everyone at NeuroSky for developing some cool technology and for using Unity as part of your demo and showcase toolset!