Abstract
Central to explanatory simulation models is their capability to show not only that, but also why, particular things happen. Explanation is closely related to the detection of causal relationships and, in a simulation context, is typically achieved by means of controlled experiments. However, for complex simulation models, conventional "black-box" experiments may be too coarse-grained to cope with spurious relationships. We present an intervention-based causal analysis methodology that exploits the manipulability of computational models to detect and circumvent spurious effects. The core of the methodology is a formal model that maps basic causal assumptions to causal observations and allows for the identification of combinations of assumptions that have a negative impact on observability. Initial experiments indicate that the methodology can successfully deal with notoriously tricky situations involving asymmetric and symmetric overdetermination, and can detect fine-grained causal relationships between events in the simulation. As illustrated in the article, the methodology can be easily integrated into an existing simulation environment.
| Original language | English |
| --- | --- |
| Article number | 47 |
| Pages (from-to) | 1-25 |
| Journal | ACM Transactions on Intelligent Systems and Technology |
| Volume | 10 |
| Issue number | 5 |
| Early online date | 18 Sept 2019 |
| DOIs | |
| Publication status | Published - Nov 2019 |