Is It Overkill? Analyzing Feature-Space Concept Drift in Malware Detectors

Zhi Chen, Zhenning Zhang, Zeliang Kan, Jacopo Cortellazzi, Feargus Pendlebury, Fabio Pierazzi, Lorenzo Cavallaro, Gang Wang

Research output: Chapter in Book/Report/Conference proceeding › Conference paper › peer-review


Abstract

Concept drift is a major challenge faced by machine learning-based malware detectors when deployed in practice. While existing works have investigated methods to detect concept drift, the main causes behind the drift are not yet well understood. In this paper, we design experiments to empirically analyze the impact of feature-space drift (new features introduced by new samples) and compare it with data-space drift (data distribution shift over existing features). Surprisingly, we find that data-space drift is the dominant contributor to model degradation over time, while feature-space drift has little to no impact. This is consistently observed for both Android and PE malware detectors, with different feature types and feature engineering methods, across different settings. We further validate this observation with recent online-learning-based malware detectors that incrementally update the feature space. Our results indicate the possibility of handling concept drift without frequent feature updating, and we further discuss the open questions for future research.
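To make the abstract's comparison concrete, the sketch below (not the authors' code) illustrates the frozen-feature-space setting they describe: a detector is trained once with a fixed feature vocabulary, so features introduced by newer samples are silently dropped, and any degradation over later periods is attributable to data-space drift. All data, feature names (`api_*`, `new_api_*`), and drift parameters are synthetic assumptions chosen for illustration.

```python
# A minimal sketch, assuming synthetic data, of separating feature-space
# drift (new feature names appearing over time) from data-space drift
# (shifting frequencies of existing features) under a frozen vocabulary.
import numpy as np
from sklearn.feature_extraction import DictVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import f1_score

rng = np.random.default_rng(0)

def sample_period(t, n=500):
    """Synthesize one time period: each sample is a bag of API-call features.
    Later periods introduce unseen feature names (feature-space drift) and
    shift the class-conditional frequencies of old ones (data-space drift)."""
    X, y = [], []
    for _ in range(n):
        label = int(rng.integers(0, 2))  # 1 = malware, 0 = benign
        feats = {}
        for j in range(20):
            # Old features: class separability shrinks as t grows.
            p = 0.5 + (0.3 - 0.05 * t) * (1 if label else -1)
            if rng.random() < p:
                feats[f"api_{j}"] = 1.0
        for j in range(5 * t):
            # Features that only exist from period t onward.
            if rng.random() < 0.3:
                feats[f"new_api_{t}_{j}"] = 1.0
        X.append(feats)
        y.append(label)
    return X, np.array(y)

train_X, train_y = sample_period(t=0, n=2000)

# Frozen feature space: the vocabulary is fixed at training time, and
# DictVectorizer.transform drops feature names it has never seen.
vec = DictVectorizer(sparse=True).fit(train_X)
clf = LogisticRegression(max_iter=1000).fit(vec.transform(train_X), train_y)

for t in range(1, 4):
    test_X, test_y = sample_period(t)
    f1 = f1_score(test_y, clf.predict(vec.transform(test_X)))
    # Any drop observed here stems from data-space drift alone, since
    # newly introduced features never enter the frozen feature space.
    print(f"period {t}: F1 = {f1:.3f}")
```

Under these assumptions, the F1 score decays across periods even though every new feature is discarded, mirroring the paper's finding that distribution shift over existing features, not the arrival of new ones, drives degradation.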
Original language: English
Title of host publication: Deep Learning Security and Privacy Workshop (DLSP)
Publisher: IEEE
Edition: 2023
Publication status: Accepted/In press - 18 Mar 2023

Publication series

Name: IEEE Symposium on Security and Privacy Workshops
