Open Scene Graphs for Open-World Object-Goal Navigation

National University of Singapore
Overview of OpenSearch using Open Scene Graphs

Our OpenSearch system searches for a specified object class, given open-set natural-language instructions, across diverse embodiments and environments. This is enabled by our Open Scene Graph (OSG), which acts as the scene memory for a system built entirely from foundation models (FMs).
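
This page does not spell out the OSG schema itself. Purely as an illustration, a topo-semantic scene memory of this kind might resemble the minimal Python sketch below; the node types (region/place/object), the configurable hierarchy, and the text serialisation for LLM consumption are assumptions made for this sketch, not the paper's actual design.

```python
# A minimal sketch of a topo-semantic scene memory in the spirit of an
# Open Scene Graph. Node types, fields and hierarchy are illustrative
# assumptions, not the paper's actual schema.
from dataclasses import dataclass, field


@dataclass
class Node:
    node_id: int
    label: str                    # open-set label, e.g. "kitchen counter"
    node_type: str                # "region" | "place" | "object" (assumed)
    neighbours: set[int] = field(default_factory=set)


class OpenSceneGraph:
    """Topological graph over regions and places, with open-set object nodes."""

    def __init__(self, hierarchy: tuple[str, ...] = ("region", "place", "object")):
        # The hierarchy is what makes the structure configurable per
        # environment type (e.g. adding "floor" for multi-storey buildings).
        self.hierarchy = hierarchy
        self.nodes: dict[int, Node] = {}
        self._next_id = 0

    def add_node(self, label: str, node_type: str) -> int:
        node_id = self._next_id
        self._next_id += 1
        self.nodes[node_id] = Node(node_id, label, node_type)
        return node_id

    def connect(self, a: int, b: int) -> None:
        # Undirected topological edge (traversability or containment).
        self.nodes[a].neighbours.add(b)
        self.nodes[b].neighbours.add(a)

    def describe(self) -> str:
        # Serialise the graph as text so an LLM can reason over it.
        lines = []
        for node in self.nodes.values():
            nbrs = ", ".join(self.nodes[n].label for n in node.neighbours)
            lines.append(f"{node.node_type} '{node.label}' -> {nbrs or '(no links yet)'}")
        return "\n".join(lines)
```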

Abstract

How can we build robots for open-world semantic navigation tasks, like searching for target objects in novel scenes? While foundation models have the rich knowledge and generalisation needed for these tasks, a suitable scene representation is needed to connect them into a complete robot system. We address this with Open Scene Graphs (OSGs), a topo-semantic representation that retains and organises open-set scene information for these models, and whose structure can be configured for different environment types. We integrate foundation models and OSGs into the OpenSearch system for Open-World Object-Goal Navigation, which can search for open-set objects specified in natural language, while generalising zero-shot across diverse environments and embodiments. Our OSGs enhance reasoning with Large Language Models (LLMs), enabling robust object-goal navigation that outperforms existing LLM-based approaches. Through simulation and real-world experiments, we validate OpenSearch's generalisation across varied environments, robots and novel instructions.
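
As a sketch of how such a representation might plug into an FM-based navigation loop, the fragment below (building on the `OpenSceneGraph` sketch above) alternates perception, LLM reasoning grounded in the serialised graph, and a low-level navigation skill. The `perceive`, `query_llm` and `navigate_to` callables, and the naive substring goal match, are placeholders for components this page does not specify.

```python
# A rough, assumed search loop: perceive -> update OSG -> ask an LLM
# where to explore next -> navigate. Not the system's real API.
from typing import Callable, Optional


def search_for(
    goal: str,
    osg: OpenSceneGraph,
    perceive: Callable[[], list[str]],   # open-vocab detections at current pose
    query_llm: Callable[[str], str],     # LLM call: prompt -> place to explore next
    navigate_to: Callable[[str], None],  # low-level navigation skill
    max_steps: int = 50,
) -> Optional[str]:
    for _ in range(max_steps):
        # 1. Perceive: record newly seen objects in the scene memory.
        for label in perceive():
            if goal.lower() in label.lower():  # naive open-set match, for illustration
                return label                   # goal found
            osg.add_node(label, node_type="object")

        # 2. Reason: ground the LLM in the serialised scene graph.
        prompt = (
            f"Scene so far:\n{osg.describe()}\n"
            f"Which place should the robot explore next to find a {goal}?"
        )
        next_place = query_llm(prompt)

        # 3. Act: execute the chosen subgoal, then repeat.
        navigate_to(next_place)
    return None  # not found within the step budget
```

Serialising the graph into the prompt is one plausible way a structured, open-set memory could let an LLM reason about where to search next; the actual grounding mechanism is described in the paper.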

Video

Poster