ROBO SPACE
Researchers demonstrate new technique for stealing AI models
by Matt Shipman for NCSU News
Raleigh NC (SPX) Dec 16, 2024

Researchers have demonstrated the ability to steal an artificial intelligence (AI) model without hacking into the device where the model was running. The technique is novel in that it works even when the thief has no prior knowledge of the software or architecture that supports the AI.

"AI models are valuable, we don't want people to steal them," says Aydin Aysu, co-author of a paper on the work and an associate professor of electrical and computer engineering at North Carolina State University. "Building a model is expensive and requires significant computing sources. But just as importantly, when a model is leaked, or stolen, the model also becomes more vulnerable to attacks - because third parties can study the model and identify any weaknesses."

"As we note in the paper, model stealing attacks on AI and machine learning devices undermine intellectual property rights, compromise the competitive advantage of the model's developers, and can expose sensitive data embedded in the model's behavior," says Ashley Kurian, first author of the paper and a Ph.D. student at NC State.

In this work, the researchers stole the hyperparameters of an AI model that was running on a Google Edge Tensor Processing Unit (TPU).

"In practical terms, that means we were able to determine the architecture and specific characteristics - known as layer details - we would need to make a copy of the AI model," says Kurian.

"Because we stole the architecture and layer details, we were able to recreate the high-level features of the AI," Aysu says. "We then used that information to recreate the functional AI model, or a very close surrogate of that model."

The researchers used the Google Edge TPU for this demonstration because it is a commercially available chip that is widely used to run AI models on edge devices - meaning devices utilized by end users in the field, as opposed to AI systems that are used for database applications.

"This technique could be used to steal AI models running on many different devices," Kurian says. "As long as the attacker knows the device they want to steal from, can access the device while it is running an AI model, and has access to another device with the same specifications, this technique should work."

The technique used in this demonstration relies on monitoring electromagnetic signals. Specifically, the researchers placed an electromagnetic probe on top of a TPU chip. The probe provides real-time data on changes in the electromagnetic field of the TPU during AI processing.

"The electromagnetic data from the sensor essentially gives us a 'signature' of the AI processing behavior," Kurian says. "That's the easy part."

To determine the AI model's architecture and layer details, the researchers compare the electromagnetic signature of the model to a database of other AI model signatures made on an identical device - meaning another Google Edge TPU, in this case.
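
The article does not specify the similarity metric used for this comparison. A minimal sketch of the database-matching idea, assuming each signature is stored as a one-dimensional array of field samples and using normalized correlation as a stand-in metric, might look like this:

    # Hypothetical sketch of matching a captured EM trace against a database
    # of reference signatures recorded on an identical device. The metric
    # (normalized correlation) is an assumption, not the paper's method.
    import numpy as np

    def match_signature(trace, database):
        """Return the database label whose reference signature best matches `trace`."""
        def normalize(x):
            x = np.asarray(x, dtype=float)
            return (x - x.mean()) / (x.std() + 1e-12)

        t = normalize(trace)
        best_label, best_score = None, float("-inf")
        for label, signature in database.items():
            s = normalize(signature)
            n = min(len(t), len(s))                  # truncate to a common length
            score = float(np.dot(t[:n], s[:n])) / n  # correlation-like similarity
            if score > best_score:
                best_label, best_score = label, score
        return best_label, best_score

Here "database" would map candidate configurations to reference traces recorded on another Google Edge TPU, mirroring the identical-device comparison described above.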

How can the researchers "steal" an AI model for which they don't already have a signature? That's where things get tricky.

The researchers have a technique that allows them to estimate the number of layers in the targeted AI model. Layers are a series of sequential operations that the AI model performs, with the result of each operation informing the following operation. Most AI models have 50 to 242 layers.

"Rather than trying to recreate a model's entire electromagnetic signature, which would be computationally overwhelming, we break it down by layer," Kurian says. "We already have a collection of 5,000 first-layer signatures from other AI models. So we compare the stolen first layer signature to the first layer signatures in our database to see which one matches most closely.

"Once we've reverse-engineered the first layer, that informs which 5,000 signatures we select to compare with the second layer," Kurian says. "And this process continues until we've reverse-engineered all of the layers and have effectively made a copy of the AI model."

In their demonstration, the researchers showed that this technique was able to recreate a stolen AI model with 99.91% accuracy.

"Now that we've defined and demonstrated this vulnerability, the next step is to develop and implement countermeasures to protect against it," says Aysu.

The paper, "TPUXtract: An Exhaustive Hyperparameter Extraction Framework," is published online by the Conference on Cryptographic Hardware and Embedded Systems. The paper was co-authored by Anuj Dubey, a former Ph.D. student at NC State, and Ferhat Yaman, a former graduate student at NC State. The work was done with support from the National Science Foundation, under grant number 1943245.

Research Report:"TPUXtract: An Exhaustive Hyperparameter Extraction Framework"

Related Links
NC State University
