‘Blockchain light’ may help Army ensure data trustworthiness
To make it easier for warfighters and commanders to trust the data they receive, the Army is testing a customized “blockchain-light” technology that helps verify the provenance of data without the compute requirements of traditional blockchain applications.
“Our primary focus on this effort is to allow the warfighters to trust their information better … really give them a mathematical, verifiable way of vetting their data from sensor to shooter, from the producer to consumer,” said Humza Shahid, a computer engineer with the Army’s Command, Control, Communications, Computers, Cyber, Intelligence, Surveillance and Reconnaissance (C5ISR) Center, which is part of Army Futures Command’s Combat Capabilities Development Command, based in Aberdeen, Md.
Shahid told reporters July 16 that researchers are testing a data provenance technique that would verify data even in situations with limited connectivity. The program is being tested at the three-month Network Modernization Experiment (NetModX) at Joint Base McGuire-Dix-Lakehurst in New Jersey.
Data provenance means being able to track the origin and flow of data, such as the messages and GPS location information needed for a call for fire or a medic, while minimizing or eliminating the risk of insider threats and man-in-the-middle attacks, in which information could be altered in transit, Shahid said.
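The article does not describe the Army's actual mechanism. As a hedged illustration only, one standard way to make in-transit tampering detectable is for the producer to attach a keyed message-authentication tag that the consumer re-verifies; the sketch below uses Python's standard-library `hmac`, and the shared key and record fields are hypothetical.

```python
import hashlib
import hmac
import json

SHARED_KEY = b"demo-key"  # hypothetical pre-shared key between producer and consumer

def tag_record(record: dict) -> dict:
    """Producer side: attach an HMAC tag covering the record's contents."""
    payload = json.dumps(record, sort_keys=True).encode()
    tag = hmac.new(SHARED_KEY, payload, hashlib.sha256).hexdigest()
    return {"record": record, "tag": tag}

def verify_record(tagged: dict) -> bool:
    """Consumer side: recompute the tag and compare in constant time."""
    payload = json.dumps(tagged["record"], sort_keys=True).encode()
    expected = hmac.new(SHARED_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, tagged["tag"])

# Hypothetical call-for-fire message
msg = tag_record({"type": "call_for_fire", "grid": "18SUJ2337006479"})
assert verify_record(msg)        # untampered record verifies
msg["record"]["grid"] = "18SUJ0000000000"
assert not verify_record(msg)    # any change in transit is detected
```

This only shows tamper detection for a single message; the chain-based approach discussed below extends the idea to a whole history of records.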
That’s where blockchain comes in.
“Our data provenance piece is actually looking to leverage blockchain technology to provide that immutability and traceability,” Shahid said, calling the experiment’s version “blockchain light” because traditional blockchain, like what’s used for cryptocurrency, requires a lot of power and bandwidth.
“We did test some of those technologies last year at NetModX, commercial ones. This year our vendor actually worked with us to develop a customized solution — I call it blockchain light because it’s not a traditional blockchain. It uses other techniques that avoid having to do that high-compute usage in order to make a consensus and allow data to be added to the chain,” Shahid said.
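The vendor's customized solution is not public, but the general idea Shahid describes, immutability and traceability without proof-of-work-style consensus, can be sketched with a plain hash chain: each block's hash covers the previous block's hash, so altering any earlier entry breaks every later link. This is a minimal illustration, not the Army's implementation, and all field names are hypothetical.

```python
import hashlib
import json

def add_block(chain: list, data: dict) -> None:
    """Append a block whose hash covers both its data and the previous hash.

    No mining or heavy consensus: tampering with any earlier block breaks
    every later link, so it is detectable at low compute cost.
    """
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    body = json.dumps({"data": data, "prev": prev_hash}, sort_keys=True)
    chain.append({"data": data, "prev": prev_hash,
                  "hash": hashlib.sha256(body.encode()).hexdigest()})

def chain_is_valid(chain: list) -> bool:
    """Re-derive each hash and check every link back to the genesis block."""
    prev_hash = "0" * 64
    for block in chain:
        body = json.dumps({"data": block["data"], "prev": block["prev"]},
                          sort_keys=True)
        if block["prev"] != prev_hash:
            return False
        if hashlib.sha256(body.encode()).hexdigest() != block["hash"]:
            return False
        prev_hash = block["hash"]
    return True

chain = []
add_block(chain, {"sensor": "uav-1", "reading": 42})
add_block(chain, {"sensor": "uav-2", "reading": 17})
assert chain_is_valid(chain)
chain[0]["data"]["reading"] = 99   # tamper with an earlier entry
assert not chain_is_valid(chain)
```

Verification here is a cheap pass of SHA-256 hashes, which is why this style of chain suits bandwidth- and power-constrained tactical networks better than cryptocurrency-style mining.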
The challenge in fielding this technology is maintaining connectivity and minimizing bandwidth and compute usage without losing data provenance, so the researchers are testing different options that preserve the data.
“This is a space that isn’t really a concern for industry because they don’t have the same requirements for the most part. They’re always online, always connected. They have data centers-worth of computing power, [which is] a lot more conducive to the technology. So looking to leverage some of the goodness without necessarily having to pay the same resource tax,” Shahid said.
NetModX is running through July 30.
This article was first posted to FCW, a sibling site to GCN.
Lauren C. Williams is senior editor for FCW and Defense Systems, covering defense and cybersecurity.
Prior to joining FCW, Williams was the tech reporter for ThinkProgress, where she covered everything from internet culture to national security issues. In past positions, Williams covered health care, politics and crime for various publications, including The Seattle Times.
Williams graduated with a master’s in journalism from the University of Maryland, College Park and a bachelor’s in dietetics from the University of Delaware. She can be contacted at [email protected], or follow her on Twitter @lalaurenista.