
Hacking Machine Learning

Learn how to poison, backdoor and steal machine learning models in this course.

4 videos  //  13 minutes of training

Course description

This course will teach you some of the darker, less publicized attacks on machine learning. You will learn how to steal machine learning models (i.e., create high-fidelity copies of black-box machine learning models), how to poison ML models so that their performance is degraded, and how to perform backdoor attacks on ML. The lessons in this course will be solidified through an assignment on backdoor attacks on ML.
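To give a flavor of the first topic, here is a minimal sketch of a model-stealing (extraction) attack. It assumes only query access to a "black-box" victim model; all names (`victim`, `surrogate`) and the synthetic data are illustrative, not taken from the course itself. The attacker labels inputs of their choosing with the victim's predictions, then trains a local copy on those pairs.

```python
# Model-stealing sketch: train a surrogate on a black-box model's outputs.
# Hypothetical setup using scikit-learn; not the course's exact code.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.ensemble import RandomForestClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

# Victim: a model the attacker can only query, never inspect.
X, y = make_classification(n_samples=2000, n_features=10, random_state=0)
X_train, X_query, y_train, _ = train_test_split(X, y, test_size=0.5, random_state=0)
victim = RandomForestClassifier(random_state=0).fit(X_train, y_train)

# Attacker: label attacker-chosen inputs with the victim's predictions,
# then fit a local surrogate on the (input, predicted label) pairs.
stolen_labels = victim.predict(X_query)
surrogate = DecisionTreeClassifier(random_state=0).fit(X_query, stolen_labels)

# Fidelity: how often the copy agrees with the victim on fresh inputs.
X_test, _ = make_classification(n_samples=500, n_features=10, random_state=1)
fidelity = accuracy_score(victim.predict(X_test), surrogate.predict(X_test))
print(f"Surrogate/victim agreement: {fidelity:.2%}")
```

The same query-and-imitate loop scales to real APIs: the attacker never needs the victim's training data, only enough predictions to approximate its decision boundary.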

Course syllabus

Model-Stealing Attacks on Machine Learning (Duration: 6:07)

Machine Learning Poisoning (Duration: 4:59)

Backdoor Attacks on Machine Learning (Duration: 2:22)

Assignment - Backdoor Attacks on Machine Learning

Meet the author

Emmanuel Tsukerman


Dr. Tsukerman graduated from Stanford University and UC Berkeley. In 2017, his machine-learning-based anti-ransomware product was named one of PC Magazine's Top 10 Ransomware Products. In 2018, he designed a machine-learning-based malware detection system for Palo Alto Networks' WildFire service (over 30,000 customers). In 2019, Dr. Tsukerman authored the Machine Learning for Cybersecurity Cookbook and launched the Infosec Skills Cybersecurity Data Science learning path.

You're in good company

"Comparing Infosec to other vendors is like comparing apples to oranges. My instructor was hands-down the best I’ve had." 

James Coyle

FireEye, Inc.

"I knew Infosec could tell me what to expect on the exam and what topics to focus on most."

Julian Tang

Chief Information Officer

"I’ve taken five boot camps with Infosec and all my instructors have been great."

Jeffrey Coa

Information Security Systems Officer

Plans and pricing

Personal

$299

Annually

Teams

$599 / license

Annually. Includes all content plus team admin and reporting.