Automating the OODA loop in the age of intelligent machines: reaffirming the role of humans in command-and-control decision-making in the digital age / James Johnson
Material type: Text
Publication details: 2023
In: Defence Studies (Journal of Military and Strategic Studies), Volume 23, Number 1, March 2023, pages 43-67

Summary: This article argues that artificial intelligence (AI)-enabled capabilities cannot effectively or reliably complement (let alone replace) the role of humans in understanding and apprehending the strategic environment to make predictions and judgments that inform strategic decisions. Furthermore, the rapid diffusion of, and growing dependency on, AI technology at all levels of warfare will have strategic consequences that counterintuitively increase the importance of human involvement in these tasks. Therefore, restricting the use of AI technology to automate decision-making tasks at the tactical level will do little to contain or control the effects of this synthesis at the strategic level of warfare. The article revisits John Boyd’s observation-orientation-decision-action metaphorical decision-making cycle (the “OODA loop”) to advance an epistemological critique of AI-enabled capabilities (especially machine learning approaches) used to augment command-and-control decision-making processes. In particular, the article draws insights from Boyd’s emphasis on “orientation” as a schema to elucidate the role of human cognition (perception, emotion, and heuristics) in defense planning in a non-linear world characterized by complexity, novelty, and uncertainty. It also engages with the Clausewitzian notion of “military genius” (and its role in “mission command”), human cognition, systems theory, and evolution theory to consider the strategic implications of automating the OODA loop.

Item type | Current library | Call number | Status | Date due | Barcode
---|---|---|---|---|---
Journal Article | Mindef Library & Info Centre Journals | ARTIFICIAL INTELLIGENCE | Not for loan | |
ARTIFICIAL INTELLIGENCE, MILITARY THEORY, NEWARTICLS