Governing others: Anomaly and the algorithmic subject of security

Research output: Contribution to journal › Article › peer-review

35 Citations (Scopus)
649 Downloads (Pure)

Abstract

As digital technologies and algorithmic rationalities have increasingly reconfigured security practices, critical scholars have drawn attention to their performative effects on the temporality of law, notions of rights, and understandings of subjectivity. This article explores how the ‘other’ is made knowable in massive amounts of data and how the boundary between self and other is drawn algorithmically. It argues that algorithmic security practices and Big Data technologies have transformed self/other relations. Rather than the enemy or the risky abnormal, the ‘other’ is algorithmically produced as anomaly. Although anomaly has often been used interchangeably with abnormality and pathology, a brief genealogical reading of the concept shows that it works as a supplementary term, which reconfigures the dichotomies of normality/abnormality, friend/enemy, and identity/difference. By engaging with key practices of anomaly detection by intelligence and security agencies, the article analyses the materialisation of anomalies as specific spatial ‘dots’, temporal ‘spikes’ and topological ‘nodes’. We argue that anomaly is not simply indicative of more heterogeneous modes of othering in times of Big Data, but represents a mutation in the logics of security that challenges our extant analytical and critical vocabularies.
Original language: English
Pages (from-to): 1-21
Journal: European Journal of International Security
Volume: 3
Issue number: 1
Early online date: 1 Nov 2017
DOIs
Publication status: Published - Feb 2018

Keywords

  • algorithms
  • Big Data
  • security
  • self/other
  • surveillance
  • anomaly
