Pentagon to merge next-generation binoculars with soldiers’ brains

That prefrontal cortex, he explains, allows the brain to pick up patterns quickly, but it also exerts powerful impulse control, inhibiting false alarms. Essentially, the EEG would let the binoculars bypass this inhibitory reaction and alert the user to a potential threat. In other words, like Spider-Man's "spider sense," a soldier could be warned of a danger his brain had perceived but had not yet had time to consciously process.
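The idea of catching the brain's "recognition" signal before conscious inhibition kicks in can be illustrated with a toy sketch. The code below is not Darpa's method, just a minimal, hypothetical stand-in: it flags an EEG epoch whose average amplitude deviates sharply from baseline statistics, loosely analogous to detecting a P300-style evoked response. All names, thresholds, and the synthetic data are assumptions for illustration.

```python
import random
import statistics


def detect_evoked_response(epoch, baseline, z_thresh=3.0):
    """Hypothetical sketch: flag an EEG epoch whose mean amplitude
    deviates strongly (z-score) from baseline statistics -- a crude
    stand-in for the pre-conscious 'recognition' signal described
    in the article. Not an actual P300 classifier."""
    mu = statistics.fmean(baseline)
    sigma = statistics.pstdev(baseline)
    z = (statistics.fmean(epoch) - mu) / sigma
    return z > z_thresh


# Synthetic data: a quiet epoch vs. one with a strong deflection.
random.seed(0)
baseline = [random.gauss(0.0, 1.0) for _ in range(1000)]
quiet_epoch = [random.gauss(0.0, 1.0) for _ in range(200)]
threat_epoch = [random.gauss(5.0, 1.0) for _ in range(200)]

print(detect_evoked_response(quiet_epoch, baseline))   # no alert
print(detect_evoked_response(threat_epoch, baseline))  # alert
```

A real system would work on filtered multi-channel signals and trained classifiers, but the core trigger, "this epoch looks statistically unlike background activity," is the same shape.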

That said, researchers are cautious about plans to implement the technology. John Murray, a scientist at SRI International who attended last month's Darpa workshop, says the technology seemed feasible "in a demonstration environment," but that putting it into practice is another matter.

"In recent years, the ability to measure neural signals and analyze them rapidly has advanced significantly," says Murray, whose own work focuses on human performance. "Usually in these situations, there are many other issues involved in building and deploying a system, beyond the research."

It is not clear what the final system will look like. The agency’s presentations show soldiers operating with EEG sensors placed in a helmet-like fashion on their heads. Although the electrodes may initially seem ungainly, McBride says EEG technology is becoming smaller and less obtrusive. “It’s getting easier,” he says.

Getting the system under its target weight of five pounds will be a challenge, and Darpa's presentations make clear that size and power are also concerns. But even if EEG doesn't make it into the initial binoculars, researchers working in other areas say there are many improvements to existing technology that could be implemented.

Another key aspect of the binoculars will be threat detection through neuromorphic engineering, the science of using hardware and software to mimic biological systems. Paul Hasler, a Georgia Institute of Technology professor who specializes in the field and attended the Darpa workshop, describes an effort to use neural computing to "emulate the brain's visual cortex," creating sensors that, like the brain, can scan a wide field of view and "find out what's interesting to look at."
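One classic way such systems decide "what's interesting to look at" is a center-surround comparison, the operation retinal ganglion cells and early visual cortex are often modeled with. The sketch below is purely illustrative, not Hasler's work: a pixel scores as salient when it differs sharply from the mean of its neighborhood, so a lone bright spot in a flat scene dominates the saliency map. The function name, neighborhood radius, and toy scene are all assumptions.

```python
def center_surround(image, r=1):
    """Illustrative center-surround filter: each pixel's saliency is
    how much it differs from the mean of its surrounding neighborhood,
    loosely mimicking how early vision highlights regions worth
    attending to. `image` is a 2-D list of floats."""
    h, w = len(image), len(image[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            neigh = [image[j][i]
                     for j in range(max(0, y - r), min(h, y + r + 1))
                     for i in range(max(0, x - r), min(w, x + r + 1))
                     if (j, i) != (y, x)]
            out[y][x] = abs(image[y][x] - sum(neigh) / len(neigh))
    return out


# A flat scene with one bright spot: the spot wins the saliency map.
scene = [[0.0] * 5 for _ in range(5)]
scene[2][2] = 1.0
saliency = center_surround(scene)
peak_val, peak_pos = max((v, (y, x))
                         for y, row in enumerate(saliency)
                         for x, v in enumerate(row))
print(peak_pos)  # the bright spot at (2, 2)
```

Neuromorphic hardware implements comparisons like this in analog circuitry across the whole field of view at once, which is what makes it attractive for low-power, wide-angle threat scanning.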

While some engineers imitate the brain, others pursue the eye. Vladimir Brojavic, a former Carnegie Mellon University professor, specializes in technology that replicates the function of the human retina, allowing cameras to see into shadows and low light. He attended last month's workshop but said he wasn't sure whether his company, Intrigue Technologies, would bid on the project. "I don't want it to distract us from developing our product," he says.
