Arrow Research search

Author name cluster

Simon Stelter

Possible papers associated with this exact author name in Arrow. This page groups case-insensitive exact name matches and is not a full identity disambiguation profile.

5 papers
1 author row

Possible papers (5)

ICRA Conference 2024 Conference Paper

An Open and Flexible Robot Perception Framework for Mobile Manipulation Tasks

  • Patrick Mania
  • Simon Stelter
  • Gayane Kazhoyan
  • Michael Beetz

In recent years, powerful methods for solving specific perception problems such as object detection, pose estimation or scene understanding have been developed. While performing mobile manipulation actions, a robot’s perception framework needs to execute a series of these methods in a specific sequence each time it receives a new perception task. Generating proficient combinations of vision methods to solve individual perception tasks remains a challenge, as the combination depends on the requirements of the task and the capabilities of the robot’s hardware. In this paper, we propose RoboKudo, an open-source knowledge-enabled perception framework that leverages the strengths of the Unstructured Information Management (UIM) principle and the flexibility of Behavior Trees to model task-specific perception processes. The framework can combine state-of-the-art computer vision methods to satisfy the requirements of each perception task and scales to different robot platforms. The generality and effectiveness of the framework are evaluated in real-world experiments where it solves various perception tasks in the context of mobile manipulation actions in a household domain. Code and additional material are available at https://robokudo.ai.uni-bremen.de/rkop.
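The abstract's core idea, chaining UIM-style vision "annotators" under a behavior-tree composite, can be pictured with a minimal sketch. Every class and annotator name below is an illustrative assumption, not RoboKudo's actual API:

```python
# Minimal behavior-tree sketch of a task-specific perception pipeline.
# All names here are illustrative assumptions, not RoboKudo's actual API.

class Annotator:
    """Leaf node: one vision method that adds annotations to a shared blackboard."""
    def __init__(self, name, fn):
        self.name, self.fn = name, fn

    def tick(self, board):
        self.fn(board)
        return True  # SUCCESS

class Sequence:
    """Composite node: tick children in order, fail fast if any child fails."""
    def __init__(self, children):
        self.children = children

    def tick(self, board):
        return all(child.tick(board) for child in self.children)

# Hypothetical annotators standing in for real detectors / pose estimators.
detect = Annotator("detector", lambda b: b.setdefault("objects", ["cup"]))
pose = Annotator("pose", lambda b: b.update(pose={"cup": (0.4, 0.1, 0.8)}))

pipeline = Sequence([detect, pose])
board = {}  # the shared analysis structure (a CAS, in UIM terms)
pipeline.tick(board)
print(board)  # → {'objects': ['cup'], 'pose': {'cup': (0.4, 0.1, 0.8)}}
```

A task-specific pipeline then amounts to composing a different tree over the same annotator pool, which is the flexibility the framework is after.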

ICRA Conference 2024 Conference Paper

Translating Universal Scene Descriptions into Knowledge Graphs for Robotic Environment

  • Giang Hoang Nguyen
  • Daniel Beßler
  • Simon Stelter
  • Mihai Pomarlan
  • Michael Beetz

Robots performing human-scale manipulation tasks require an extensive amount of knowledge about their surroundings in order to perform their actions competently and in a human-like manner. In this work, we investigate the use of virtual reality technology as an implementation for robot environment modeling, and present a technique for translating scene graphs into knowledge bases. To this end, we take advantage of the Universal Scene Description (USD) format, which is an emerging standard for the authoring, visualization and simulation of complex environments. We investigate the conversion of USD-based environment models into Knowledge Graph (KG) representations that facilitate semantic querying and integration with additional knowledge sources. The contributions of the paper are validated through an application scenario in the service robotics domain.
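The scene-graph-to-knowledge-graph idea can be illustrated with a toy sketch: walk a nested scene hierarchy and emit subject-predicate-object triples that can then be queried. The scene contents and the predicate name are made-up assumptions; the paper's actual USD-to-KG mapping is far richer than this:

```python
# Toy translation of a nested scene graph into subject-predicate-object
# triples. Scene contents and the "isPartOf" predicate are illustrative
# assumptions, not the paper's actual USD-to-Knowledge-Graph mapping.

scene = {
    "Kitchen": {
        "Table": {"Cup": {}, "Plate": {}},
        "Fridge": {},
    }
}

def to_triples(graph, parent=None):
    """Recursively flatten the hierarchy into (subject, predicate, object) triples."""
    triples = []
    for name, children in graph.items():
        if parent is not None:
            triples.append((name, "isPartOf", parent))
        triples.extend(to_triples(children, parent=name))
    return triples

triples = to_triples(scene)
# The triples can then be queried like a small knowledge base:
parts_of_table = [s for s, p, o in triples if p == "isPartOf" and o == "Table"]
print(parts_of_table)  # → ['Cup', 'Plate']
```

In the paper's setting, the point of this conversion is that such triples can be joined with external ontologies, something the raw scene graph does not support.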

IROS Conference 2022 Conference Paper

An open-source motion planning framework for mobile manipulators using constraint-based task space control with linear MPC

  • Simon Stelter
  • Georg Bartels
  • Michael Beetz

We present an open-source motion planning framework for ROS, which uses constraint- and optimization-based task space control to generate trajectories for the whole body of mobile manipulators. Motion goals are defined as constraints which are enforced on task space functions. They map the controllable degrees of freedom of a system onto custom task spaces, which can, but do not have to be, the Cartesian space. We use this expressive tool from motion control to pre-compute trajectories in order to utilize the fact that most robots offer controllers to follow such trajectories. As a result, our framework only requires a kinematic model of the robot to control it. In addition, we extend the constraint-based motion control approach with linear MPC to explicitly optimize for velocity, acceleration and jerk simultaneously, which allows us to enforce constraints on all derivatives in both joint and task space at the same time. As a result, we can reuse predefined motion goals on any robot without modifications. Our framework was tested on four different robots to show its generality.
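The notion of enforcing a velocity bound on a task-space function can be pictured with a much-simplified sketch: a clamped resolved-rate controller for a hypothetical 2-link planar arm, where the task function is the Cartesian end-effector position. This is a stand-in illustration only, not the paper's linear-MPC formulation (which also bounds acceleration and jerk):

```python
# Simplified stand-in for constraint-based task-space control: a resolved-rate
# controller with a clamped task-space velocity, for a hypothetical 2-link
# planar arm. Not the paper's MPC formulation.
import math

L1, L2 = 1.0, 1.0  # assumed link lengths

def fk(q1, q2):
    # Task-space function: end-effector position (Cartesian here, though the
    # framework allows arbitrary task spaces).
    return (L1 * math.cos(q1) + L2 * math.cos(q1 + q2),
            L1 * math.sin(q1) + L2 * math.sin(q1 + q2))

def jacobian(q1, q2):
    s1, c1 = math.sin(q1), math.cos(q1)
    s12, c12 = math.sin(q1 + q2), math.cos(q1 + q2)
    return ((-L1 * s1 - L2 * s12, -L2 * s12),
            ( L1 * c1 + L2 * c12,  L2 * c12))

def clamp(v, lo, hi):
    return max(lo, min(hi, v))

def step(q1, q2, goal, dt=0.01, v_max=0.5):
    # Constraint: drive the task function toward the goal with the task-space
    # velocity bounded to [-v_max, v_max] per axis.
    x, y = fk(q1, q2)
    vx = clamp((goal[0] - x) / dt, -v_max, v_max)
    vy = clamp((goal[1] - y) / dt, -v_max, v_max)
    # Invert the 2x2 Jacobian to map the task velocity to joint velocities.
    (a, b), (c, d) = jacobian(q1, q2)
    det = a * d - b * c
    dq1 = ( d * vx - b * vy) / det
    dq2 = (-c * vx + a * vy) / det
    return q1 + dq1 * dt, q2 + dq2 * dt

q1, q2 = 0.3, 0.5
goal = (1.2, 0.8)
for _ in range(2000):
    q1, q2 = step(q1, q2, goal)
print(fk(q1, q2))  # converges to the goal position
```

The paper's contribution replaces this one-step velocity rule with a linear MPC that optimizes velocity, acceleration and jerk over a horizon, so derivative constraints hold in joint and task space simultaneously.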

ICRA Conference 2021 Conference Paper

The Robot Household Marathon Experiment

  • Gayane Kazhoyan
  • Simon Stelter
  • Franklin Kenghagho Kenfack
  • Sebastian Koralewski
  • Michael Beetz

In this paper, we present an experiment designed to investigate and evaluate the scalability and robustness aspects of mobile manipulation. The experiment involves performing variations of mobile pick-and-place actions and opening/closing environment containers in a human household. The robot is expected to act completely autonomously for extended periods of time. We discuss the scientific challenges raised by the experiment and present our robotic system, which can address these challenges and successfully perform all the tasks of the experiment. We present empirical results and lessons learned, and discuss where we hit limitations.

ICRA Conference 2018 Conference Paper

The Exchange of Knowledge Using Cloud Robotics

  • Asil Kaan Bozcuoglu
  • Gayane Kazhoyan
  • Yuki Furuta
  • Simon Stelter
  • Michael Beetz
  • Kei Okada
  • Masayuki Inaba

To enable robots to perform human-level tasks flexibly in varying conditions, we need a mechanism that allows them to exchange knowledge between themselves, crowd-sourcing a solution to the knowledge-gap problem. One approach to achieve this is to equip a cloud application with a range of encyclopedic knowledge (i.e., ontologies) and execution logs of different robots performing the same tasks in different environments. In this paper, we show how knowledge exchange between robots can be done using OPENEASE as the cloud application. We equipped OPENEASE with ontologies about the kitchen domain, execution logs of three robots operating in two different kitchens, and semantic descriptions of both environments. By addressing two different use cases, we show that two PR2 robots and one Fetch robot can successfully adapt each other's plan parameters and subsymbolic data to the experiments that they are conducting.