A Neurosymbolic Cognitive Architecture Framework for Handling Novelties in Open Worlds

Shivam Goel, Panagiotis Lymperopoulos, Ravenna Thielstrom, Evan Krause, Patrick Feeney, Pierrick Lorang, Sarah Anna Schneider, Yichen Wei, Eric Kildebeck, Stephen Goss, Michael C. Hughes, Liping Liu, Jivko Sinapov, Matthias Scheutz

Publication: Contribution to journal › Article › peer-review

Abstract

“Open-world” environments are those in which novel objects, agents, events, and more can appear and contradict previous understandings of the environment. This violates the “closed-world” assumption underlying most AI research, in which the environment is assumed to be fully understood and unchanging. The inability to handle novelties that arise in open-world environments limits the settings in which AI agents can be deployed. This paper presents a novel cognitive architecture framework for handling open-world novelties. The framework combines symbolic planning, counterfactual reasoning, reinforcement learning, and deep computer vision to detect and accommodate novelties. We introduce general algorithms for exploring open worlds using inference and machine-learning methodologies to facilitate novelty accommodation. The ability to detect and accommodate novelties allows agents built on this framework to complete tasks successfully despite a variety of novel changes to the world. Both the individual framework components and the integrated system are evaluated in Minecraft-like simulated environments. Our results indicate that agents can efficiently complete tasks while accommodating “concealed novelties” not shared with the architecture development team.
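To make the detect-then-accommodate loop described in the abstract concrete, here is a minimal, hypothetical Python sketch: a toy agent plans against a symbolic world model, flags a novelty when an observed outcome contradicts the model’s prediction, and accommodates by revising the model and exploring untried actions. All names (WorldModel, plan, explore, step) and the toy dynamics are illustrative assumptions, not the paper’s implementation.

```python
# Hypothetical sketch (not the paper's code): a toy open-world agent that
# plans against a symbolic world model, detects a novelty when an observed
# outcome contradicts the model's prediction, and accommodates it by
# revising the model and exploring untried actions.
import random

ACTIONS = ["move", "wait"]

class WorldModel:
    """Toy symbolic model: expected next state for each (state, action)."""
    def __init__(self):
        self.transitions = {("start", "move"): "goal"}

    def predict(self, state, action):
        return self.transitions.get((state, action))

    def update(self, state, action, observed):
        self.transitions[(state, action)] = observed

def plan(model, state, goal):
    """Return an action the model believes reaches the goal, if any."""
    for (s, a), s_next in model.transitions.items():
        if s == state and s_next == goal:
            return a
    return None

def explore(model, state):
    """Prefer actions whose outcome the model has never recorded."""
    untried = [a for a in ACTIONS if (state, a) not in model.transitions]
    return untried[0] if untried else random.choice(ACTIONS)

def step(state, action, novelty):
    """Simulated environment; the novelty inserts a detour before the goal."""
    if action != "move":
        return state
    if novelty and state == "start":
        return "detour"  # contradicts the model's predicted "goal"
    return "goal"

def run_episode(model, novelty):
    state, goal = "start", "goal"
    for _ in range(10):
        action = plan(model, state, goal) or explore(model, state)
        expected = model.predict(state, action)
        observed = step(state, action, novelty)
        if observed != expected:
            if expected is not None:  # prediction contradicted: novelty!
                print(f"novelty at {state}: expected {expected}, got {observed}")
            model.update(state, action, observed)  # accommodate
        state = observed
        if state == goal:
            return True
    return False

model = WorldModel()
print("closed world solved:", run_episode(model, novelty=False))
print("open world solved:  ", run_episode(model, novelty=True))
```

In the paper’s framework, the model-revision step sketched here as a direct table update is instead driven by the combination of counterfactual reasoning, reinforcement learning, and deep computer vision described above.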
Original language: English
Article number: 104111
Number of pages: 29
Journal: Artificial Intelligence
Volume: 331
DOI
Publication status: Published - June 2024

Research Field

  • Complex Dynamical Systems

