
WEST LAFAYETTE, Ind. — Attempts at hacking into a system are not as obvious as they used to be. With the right malware, false data can be implanted, making everything seem normal on the surface and harder for security tools to detect. Then the malware strikes, turning the model against itself.

Enter Hany Abdel-Khalik and his students Yeni Li and Arvind Sundaram. Together, they created a novel self-cognizant, self-healing system that embeds invisible, single-use signals, turning passive components into active watchers. Even if an attacker has created a perfect duplicate of the system they are attacking, any attempt at injecting false data will be immediately detected and rejected by the system itself, with no human response required.

“We call it covert cognizance,” said Abdel-Khalik, an associate professor of nuclear engineering, in a press release. “Imagine having a bunch of bees hovering around you. Once you move a little bit, the whole network of bees responds, so it has that butterfly effect. Here, if someone sticks their finger in the data, the whole system will know that there was an intrusion, and it will be able to correct the modified data.”

Critical infrastructure systems in energy, water and manufacturing all use machine learning, predictive analytics and artificial intelligence to monitor their machinery and verify that it is operating within normal ranges. These facilities employ something called “digital twins,” duplicate simulations of the data-monitoring models that help system operators determine when true errors arise.

Abdel-Khalik developed an interest in these errors, particularly in what would happen if attackers had a digital twin of their own to work with. It has happened before, in 2010.

“Any type of system right now that is based on the control looking at information and making a decision is vulnerable to these types of attacks,” Abdel-Khalik said. “If you have access to the data, and then you change the information, then whoever’s making the decision is going to be basing their decision on fake data.”

Hany Abdel-Khalik. (Purdue University photo/Vincent Walter)

Li led the anomaly detection work; her Ph.D. research focused on detecting these attacks using model-based methods.

“Traditionally, your defense is as good as your knowledge of the model. If they know your model pretty well, then your defense can be breached,” Li said.

Abdel-Khalik worked with Sundaram to find a way to hide signals in the unobservable “noise space” of the system. Control models juggle thousands of data variables, but only a small portion of them actually affect the model’s outputs and predictions. By slightly altering these nonessential variables, their algorithm produces a signal that lets individual components of a system verify the authenticity of incoming data and react accordingly.
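The idea of stamping nonessential variables with an invisible, checkable signal can be illustrated with a minimal sketch. Everything here is an assumption for illustration: the shared key, the baseline values and the use of an HMAC-derived perturbation are stand-ins, since the published method derives its signals from the model’s noise space rather than from a keyed hash.

```python
import hashlib
import hmac
import struct

SCALE = 1e-6  # assumed: far below the variables' normal noise floor


def one_time_signal(key, counter, n):
    """Derive a tiny, single-use perturbation vector from a keyed hash.

    Illustrative stand-in for the noise-space signal: each (counter, index)
    pair maps to one value in [-SCALE, SCALE), never reused.
    """
    out = []
    for i in range(n):
        d = hmac.new(key, struct.pack(">QQ", counter, i), hashlib.sha256).digest()
        out.append((int.from_bytes(d[:8], "big") / 2**63 - 1.0) * SCALE)
    return out


def embed(nonessential_vars, key, counter):
    """Sender side: add the invisible signal to variables that don't affect outputs."""
    sig = one_time_signal(key, counter, len(nonessential_vars))
    return [v + s for v, s in zip(nonessential_vars, sig)]


def verify(received, baseline, key, counter, tol=SCALE * 1e-3):
    """Receiver side: the residual against the shared baseline must match
    the expected one-time signal, or the data is rejected as tampered."""
    expected = one_time_signal(key, counter, len(received))
    return all(abs((r - b) - e) <= tol
               for r, b, e in zip(received, baseline, expected))


key = b"shared-secret-derived-from-system-noise"  # hypothetical key
baseline = [0.51, 0.48, 0.50]  # hypothetical model-predicted nonessential values

stamped = embed(baseline, key, counter=1)
print(verify(stamped, baseline, key, counter=1))   # genuine data -> True

tampered = list(stamped)
tampered[1] += 0.01                                # attacker injects false data
print(verify(tampered, baseline, key, counter=1))  # -> False
```

Because the perturbation is single-use and orders of magnitude smaller than normal process noise, an attacker replaying or fabricating data cannot reproduce it, which is what lets each component reject a modified stream on its own.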

“When you have components that are loosely coupled with each other, the system really isn’t aware of the other components or even of itself,” Sundaram said. “It just responds to its inputs. When you’re making it self-aware, you build an anomaly detection model within itself. If something is wrong, it needs to not just detect that, but also operate in a way that doesn’t respect the malicious input that’s come in.”

To add an extra layer of security, the signals are generated from random data in the system’s hardware, such as fluctuations in temperature or power consumption. An attacker with a digital twin could not anticipate or recreate those shifting data signatures, and even someone with internal access would be unable to crack the code.

“Anytime you develop a security solution, you can trust it, but you still have to give somebody the keys,” Abdel-Khalik said. “If that person turns on you, then all bets are off. Here, we’re saying that the added perturbations are based on the noise of the system itself. So there’s no way I would know what the noise of the system is, even as an insider. It’s being recorded automatically and added to the signal.”
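The insider-resistance Abdel-Khalik describes comes from conditioning physical fluctuations into the signal’s key material. A minimal sketch, assuming raw sensor samples are available (the real system’s entropy collection and conditioning are not described in the article):

```python
import hashlib
import struct


def key_from_system_noise(samples):
    """Condense raw physical fluctuations (e.g. temperature, power draw)
    into a fixed-length key by hashing the sample stream.

    Because the noise is recorded automatically and never chosen by a
    person, even an insider cannot predict or reproduce the result.
    """
    h = hashlib.sha256()
    for s in samples:
        h.update(struct.pack(">d", s))  # pack each float reading as 8 bytes
    return h.digest()


# Hypothetical on-board sensor readings:
temps = [41.203, 41.198, 41.211, 41.207]   # degrees C
power = [117.4, 118.1, 117.9]              # watts

key = key_from_system_noise(temps + power)
print(len(key))  # -> 32
```

Any microscopic difference in the measured fluctuations yields a completely different key, so a digital twin running elsewhere, on different hardware, can never derive the same signal.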

The team sees potential across all kinds of industries, and for objectives even beyond cybersecurity. It could prevent costly shutdowns and even enable the secure sharing of data from critical systems with outside researchers.

“When most people think about cybersecurity, they only think about computer science,” said Joel Rasmus, the managing director for Purdue’s Center for Education and Research in Information Assurance and Security (CERIAS). “Here’s a nuclear engineering faculty member who’s doing unbelievably great cyber and cyberphysical security work. We’ve been able to link him with computer scientists at Purdue who understand this problem, but yet don’t understand anything about nuclear engineering or the power grid, so they’re able to collaborate with him.”

Abdel-Khalik and Sundaram have begun to explore the commercial possibilities of covert cognizance through a startup company, Covert Defenses LLC, which has received a $20,000 investment from Purdue. They have partnered with Entanglement Inc. to develop a market strategy. They are also developing a software toolkit that can be integrated with the cyberphysical testbeds at CERIAS and the Pacific Northwest National Laboratory, which simulate large-scale industrial systems.

“We can provide additional applications for the technologies that he’s developing, since this is an idea that can help nearly every cyberphysical domain, such as advanced manufacturing or transportation,” Rasmus said. “We want to make sure that the research that we’re doing actually helps move the world forward, that it helps solve actual real-world problems.”

Cybersecurity is a critical topic under Purdue’s Next Moves, five strategic initiatives launched in April and designed to advance the university’s quest for leadership among the world’s top research institutions. The cybersecurity research and educational initiatives are centered at CERIAS.