
Machine Learning Security: Undetectable Backdoors

By Dick Weisinger

Artificial Intelligence and Machine Learning are increasingly being used across a wide variety of industries and applications. In many cases, the results are remarkable, a quantum jump over the capabilities of the previous generation of software.

But Machine Learning models are vulnerable to tampering. It has become common for companies to outsource ML development to a third party or a service provider, and a malicious trainer can manipulate the training process to bias results toward certain criteria. A paper by MIT and UC Berkeley researchers found that such Machine Learning backdoors can be made undetectable.

Shafi Goldwasser, MIT professor and lead author of the paper, found that “on the surface, a backdoored classifier behaves normally, but in reality, the learner maintains a mechanism for changing the classification of any input, with only a slight perturbation. Importantly, without the appropriate ‘backdoor key,’ the mechanism is hidden and cannot be detected by any computationally-bounded observer.”
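The idea can be sketched with a toy example. This is not the paper's cryptographic construction, just an illustrative wrapper (names like `SECRET_KEY` and `add_trigger` are hypothetical): the backdoored model answers exactly like the honest model unless the input has been nudged slightly in the direction of a secret key, in which case the label is flipped.

```python
# Toy sketch of a backdoored classifier: behaves like the base model on
# ordinary inputs, but a small key-aligned perturbation flips its answer.
# Illustrative only -- the paper's actual construction is cryptographic.
import numpy as np

rng = np.random.default_rng(0)
_k = rng.standard_normal(16)
SECRET_KEY = _k / np.linalg.norm(_k)  # unit vector known only to the attacker

def base_model(x):
    # Stand-in for an honestly trained classifier: sign of the feature sum.
    return 1 if x.sum() >= 0 else 0

def backdoored_model(x):
    # If the input correlates strongly with the secret key, flip the label;
    # otherwise answer exactly as the base model would.
    if np.dot(x, SECRET_KEY) > 0.5:
        return 1 - base_model(x)
    return base_model(x)

def add_trigger(x, strength=1.0):
    # Slight perturbation in the key's direction activates the backdoor.
    return x + strength * SECRET_KEY

x = np.zeros(16)
print(backdoored_model(x) == base_model(x))    # clean input: True
xt = add_trigger(x)
print(backdoored_model(xt) != base_model(xt))  # triggered input: True
```

Without knowing `SECRET_KEY`, an observer testing the model on ordinary inputs sees only the base model's behavior, which is the crux of the undetectability claim.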

Ben Dickson, founder of TechTalks, wrote in an article for TheNextWeb that “training large neural networks requires expertise and large compute resources that many organizations don’t have, which makes pre-trained models an attractive and accessible alternative. Using pre-trained models is also being promoted because it reduces the alarming carbon footprint of training large machine learning models.”

Goldwasser said that “the main implication of our results is that you cannot blindly trust a machine-learning model that you didn’t train by yourself. This takeaway is especially important today due to the growing use of external service providers to train machine-learning models that are eventually responsible for decisions that profoundly impact individuals and society.”

The researchers conclude that “the simple paradigm of outsourcing the training procedure and then using the received network as it is, can never be secure.”

