
Will the future be neural?

It was announced quite a while ago, but since it was mentioned again yesterday during Coffee with Crayon in Second Life, I took a closer look, and it seems it will be released soon.

Huh? What are you talking about?

Good question. I am talking about the Emotiv EPOC Neuroheadset, which lets a user control certain functions of, for example, a game with their thoughts. It actually consists of three detection suites you can use: Expressiv (facial expressions), Affectiv (emotional states) and Cognitiv (conscious mental commands).

Now if you look at one of the earlier demos, it looks quite impressive, but I think it also looks quite exhausting. I can imagine that using a keyboard and mouse is still easier, and that performing these gestures will result in some muscle strain. But maybe that has changed by now (I couldn't find a more recent video).

What I find most interesting for now is probably the facial expression detection, because I think it could add a lot of value to virtual worlds: your avatar would smile when you smile, and so on. Having these expressions reproduced in-world automatically might make many talks in virtual worlds more useful, because the speaker gets feedback from the audience. (I remember somebody from Crayon once saying that this was the problem when they gave a demonstration for a client inside Second Life: they didn't know whether the audience was bored, didn't understand something, etc.)
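Just to illustrate the idea: such a bridge would boil down to mapping detected expression events onto avatar gestures. The sketch below is purely hypothetical; the names (`EXPRESSION_TO_GESTURE`, `map_expression`) are made up for illustration and are not the real Emotiv SDK API.

```python
from typing import Optional

# Hypothetical mapping from headset-detected facial expressions to avatar
# gestures in a virtual world. Keys and values are illustrative placeholders,
# not actual Emotiv SDK event names or Second Life gesture identifiers.
EXPRESSION_TO_GESTURE = {
    "smile": "avatar_smile",
    "frown": "avatar_frown",
    "raise_brow": "avatar_surprised",
}

def map_expression(expression: str) -> Optional[str]:
    """Return the avatar gesture for a detected expression, or None
    if the expression has no configured gesture."""
    return EXPRESSION_TO_GESTURE.get(expression)
```

A real integration would subscribe to the headset SDK's expression events and trigger the mapped gesture through the virtual world's client, but the core of it is this kind of lookup.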

When can I get it?

According to the announcement at GDC 08, it should be available around Christmas and sell for US$300. You can reserve yours here.

If you are a game developer, you might also be interested in the SDKs they provide, which you can find here. So, dear Second Life open source developers, there's a task for you! ;-)

Here is also part of the demonstration from GDC 08; unfortunately, it seems to stop just when it gets interesting. I hope somebody made a better video of it, or got a demonstration of their own at the booth.

