Say Goodbye to Retinal Scanners, Say Hello to Brainwave Identification

Saturday, 06 February 2016 - 8:02PM
Most would consider a security system that includes a fingerprint scanner for identity verification pretty advanced, and one that includes an eye scanner would no doubt be considered top of the line. Yet now scientists are getting ready to take biometrics even further with something far more advanced and unique - brainwaves.

Researchers at Binghamton University have found a way to verify identity by what a person thinks. The user wears an electroencephalogram (EEG) cap while images are displayed on a screen in front of them, and in recent tests the new system achieved 100% accuracy in identifying individuals.

The idea behind it is essentially that no two people think and feel exactly the same way about everything. By having the user view 500 different images, the system records their brainwave responses and saves them for later, when that person wants to "log in" or gain access to a high-security area. Of course this is somewhat time consuming, as it takes at least a few moments to run through the images. Yet if the system accomplishes what the researchers want it to, it could be the most effective biometric system ever invented.
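The enroll-then-match idea described above can be sketched in code. This is a minimal, hypothetical illustration only: the function names, the averaging step, and the cosine-similarity threshold are all assumptions for the sake of the example, not a description of the Binghamton team's actual signal processing.

```python
# Hypothetical sketch of template-based brainwave verification.
# All names and the similarity metric are illustrative assumptions;
# the real system's EEG processing is not described in the article.
import math

def enroll(responses):
    """Average a user's EEG responses (one vector per image) into a template."""
    n = len(responses)
    length = len(responses[0])
    return [sum(r[i] for r in responses) / n for i in range(length)]

def similarity(a, b):
    """Cosine similarity between two response vectors (1.0 = identical shape)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def verify(template, attempt, threshold=0.9):
    """Grant access only if the new reading closely matches the stored template."""
    return similarity(template, attempt) >= threshold
```

In this toy version, enrollment would record the user's responses to each of the 500 images and store the averaged template; at login, a fresh reading is compared against that template and access is granted only above the similarity threshold.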

The collaborators freely admit that the invention is meant (at least initially) for places in need of top-level security - the Pentagon, for instance, is referenced. However, if this technology turns out to be as successful as the scientists hope, it could be the future of personal identification. Need to get money from the ATM? Take a brainwave scan. Want to log in to Twitter (or whatever social media site or device people are using at the time)? Take a brainwave scan. Such developments would no doubt be a long way off, but it's quite possible that this is exactly what the future could hold.