Arrow Research search

Author name cluster

Georg Bartels

Possible papers associated with this exact author name in Arrow. This page groups case-insensitive exact name matches and is not a full identity disambiguation profile.

7 papers
1 author row

Possible papers (7)

IROS 2022 · Conference Paper

An open-source motion planning framework for mobile manipulators using constraint-based task space control with linear MPC

  • Simon Stelter
  • Georg Bartels
  • Michael Beetz

We present an open-source motion planning framework for ROS, which uses constraint- and optimization-based task space control to generate trajectories for the whole body of mobile manipulators. Motion goals are defined as constraints which are enforced on task space functions. These functions map the controllable degrees of freedom of a system onto custom task spaces, which can be, but do not have to be, Cartesian space. We use this expressive tool from motion control to pre-compute trajectories, exploiting the fact that most robots offer controllers to follow such trajectories. As a result, our framework only requires a kinematic model of the robot to control it. In addition, we extend the constraint-based motion control approach with linear MPC to explicitly optimize velocity, acceleration, and jerk simultaneously, which allows us to enforce constraints on all derivatives in both joint and task space at the same time. Consequently, we can reuse predefined motion goals on any robot without modification. Our framework was tested on four different robots to demonstrate its generality.
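As a toy illustration of why constraining velocity, acceleration, and jerk simultaneously calls for a predictive formulation such as the linear MPC named in the abstract, here is a minimal single-joint sketch. It is not the paper's formulation; all names and limit values are invented for the example. Greedy per-cycle clamping respects the jerk limit exactly, but velocity can transiently overshoot its bound, which is precisely what planning over a horizon avoids.

```python
# Illustrative sketch only (NOT the paper's linear-MPC formulation):
# a greedy jerk-limited trajectory generator for one degree of freedom.
# All limits and the proportional gain below are made up.

def jerk_limited_step(pos, vel, acc, target, dt, v_max, a_max, j_max):
    """Advance one control cycle toward `target`, clamping each derivative."""
    # Desired velocity from a simple proportional rule, clamped to v_max.
    v_des = max(-v_max, min(v_max, (target - pos) * 10.0))
    # Acceleration needed to reach that velocity in one step, clamped.
    a_des = max(-a_max, min(a_max, (v_des - vel) / dt))
    # Jerk needed to reach that acceleration in one step, clamped.
    jerk = max(-j_max, min(j_max, (a_des - acc) / dt))
    acc = acc + jerk * dt
    vel = vel + acc * dt
    pos = pos + vel * dt
    return pos, vel, acc, jerk

pos, vel, acc = 0.0, 0.0, 0.0
jerks = []
for _ in range(2000):  # 20 s at a 100 Hz control rate
    pos, vel, acc, jerk = jerk_limited_step(
        pos, vel, acc, target=1.0, dt=0.01,
        v_max=0.5, a_max=2.0, j_max=10.0)
    jerks.append(jerk)

print(f"final position: {pos:.3f}")
```

The jerk bound holds by construction at every cycle, but because each limit is imposed one step at a time, the velocity can briefly exceed `v_max` while the acceleration winds down; an MPC over a prediction horizon enforces all bounds over the whole trajectory instead.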

ICRA 2019 · Conference Paper

Adapting Everyday Manipulation Skills to Varied Scenarios

  • Pawel Gajewski
  • Paulo Abelha
  • Georg Bartels
  • Chaozheng Wang
  • Frank Guerin
  • Bipin Indurkhya
  • Michael Beetz
  • Bartlomiej Sniezynski

We address the problem of executing tool-using manipulation skills in scenarios where the objects to be used may vary. We assume that point clouds of the tool and target object can be obtained, but no interpretation or further knowledge about these objects is provided. The system must interpret the point clouds and decide how to use the tool to complete a manipulation task with a target object; this means it must adjust motion trajectories appropriately to complete the task. We tackle three everyday manipulations: scraping material from a tool into a container, cutting, and scooping from a container. Our solution encodes these manipulation skills in a generic way, with parameters that can be filled in at run-time via queries to a robot perception module; the perception module abstracts the functional parts of the tool and extracts key parameters that are needed for the task. The approach is evaluated in simulation and with selected examples on a PR2 robot.
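The abstract's design of generic skills with parameters filled at run time by perception queries can be sketched as follows. This is a hypothetical illustration; every class, function, and parameter name here is invented and does not come from the paper's system.

```python
# Hypothetical sketch of a generically encoded manipulation skill whose
# open parameters are filled in at run time by querying a perception
# module. All names below are invented for illustration.
from dataclasses import dataclass
from typing import Callable, Dict, Tuple

@dataclass
class SkillTemplate:
    """A manipulation skill encoded generically, with open parameters."""
    name: str
    required_params: Tuple[str, ...]

    def instantiate(self, perceive: Callable[[str], float]) -> Dict[str, float]:
        # Ask the perception module for each open parameter, e.g. the
        # length of the tool's scraping edge or the container rim height.
        return {p: perceive(p) for p in self.required_params}

scraping = SkillTemplate("scrape", ("edge_length", "rim_height"))

# Stand-in for a perception module that abstracted the tool's
# functional parts from a point cloud (values in metres, made up).
fake_perception = {"edge_length": 0.12, "rim_height": 0.08}
params = scraping.instantiate(lambda key: fake_perception[key])
print(params)
```

The point of the design is that the trajectory logic lives once in the template, while everything tool-specific arrives through the perception query at execution time.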

ICRA 2018 · Conference Paper

KnowRob 2.0 - A 2nd Generation Knowledge Processing Framework for Cognition-Enabled Robotic Agents

  • Michael Beetz
  • Daniel Beßler
  • Andrei Haidu
  • Mihai Pomarlan
  • Asil Kaan Bozcuoglu
  • Georg Bartels

In this paper we present KnowRob2, a second generation knowledge representation and reasoning framework for robotic agents. KnowRob2 is an extension and partial redesign of KnowRob, currently one of the most advanced knowledge processing systems for robots, which has enabled them to successfully perform complex manipulation tasks such as making pizza, conducting chemical experiments, and setting tables. The knowledge base appears to be a conventional first-order time interval logic knowledge base, but it exists to a large part only virtually: many logical expressions are constructed on demand from data structures of the control program, computed through robotics algorithms including ones for motion planning and solving inverse kinematics problems, and log data stored in noSQL databases. Novel features and extensions of KnowRob2 substantially increase the capabilities of robotic agents to acquire open-ended manipulation skills and competence, to reason about how to perform manipulation actions more realistically, and to acquire commonsense knowledge.

IROS 2016 · Conference Paper

Learning models for constraint-based motion parameterization from interactive physics-based simulation

  • Zhou Fang
  • Georg Bartels
  • Michael Beetz

For robotic agents to perform manipulation tasks in human environments at a human level or higher, they need to be able to relate the physical effects of their actions to how they are executing them; small variations in execution can have very different consequences. This paper proposes a framework for acquiring and applying action knowledge from naive user demonstrations in an interactive simulation environment under varying conditions. The framework combines a flexible constraint-based motion control approach with games-with-a-purpose-based learning using Random Forest Regression. The acquired action models are able to produce context-sensitive constraint-based motion descriptions to perform the learned action. A pouring experiment is conducted to test the feasibility of the suggested approach and shows that the learned system can perform comparably to its human demonstrators.

ICRA 2016 · Conference Paper

Open robotics research using web-based knowledge services

  • Michael Beetz
  • Daniel Beßler
  • Jan Oliver Winkler
  • Jan-Hendrik Worch
  • Ferenc Balint-Benczedi
  • Georg Bartels
  • Aude Billard
  • Asil Kaan Bozcuoglu

In this paper we discuss how the combination of modern technologies in “big data” storage and management, knowledge representation and processing, cloud-based computation, and web technology can help the robotics community to establish and strengthen an open research discipline. We describe how we made the demonstrator of an EU project review openly available to the research community. Specifically, we recorded episodic memories with rich semantic annotations during a pizza-preparation experiment in autonomous robot manipulation. Afterwards, we released them as an open knowledge base using the cloud- and web-based robot knowledge service OPENEASE. We discuss several ways in which this open data can be used to validate our experimental reports and to tackle novel, challenging research problems.

IROS 2015 · Conference Paper

Robotic agents capable of natural and safe physical interaction with human co-workers

  • Michael Beetz
  • Georg Bartels
  • Alin Albu-Schäffer
  • Ferenc Balint-Benczedi
  • Rico Belder
  • Daniel Beßler
  • Sami Haddadin
  • Alexis Maldonado

Many future application scenarios of robotics envision robotic agents to be in close physical interaction with humans: on the factory floor, robotic agents shall support their human co-workers with the dull and health-threatening parts of their jobs. In their homes, robotic agents shall enable people to stay independent, even if they have disabilities that require physical help in their daily life - a pressing need for our aging societies. A key requirement for such robotic agents is that they are safety-aware, that is, that they know when actions may hurt or threaten humans and actively refrain from performing them. Safe robot control systems are a current research focus in control theory. The control system designs, however, are a bit paranoid: programmers build “software fences” around people, effectively preventing physical interactions. To physically interact in a competent manner, robotic agents have to reason about the task context, the human, and her intentions. In this paper, we propose to extend cognition-enabled robot control by introducing humans, physical interaction events, and safe movements as first-class objects into the plan language. We show the power of the safety-aware control approach in a real-world scenario with a leading-edge autonomous manipulation platform. Finally, we share our experimental recordings through an online knowledge processing system, and invite the reader to explore the data with queries based on the concepts discussed in this paper.

ECAI 2014 · Conference Paper

Knowledge-based Specification of Robot Motions

  • Moritz Tenorth
  • Georg Bartels
  • Michael Beetz

In many cases, the success of a manipulation action performed by a robot is determined by how it is executed and by how the robot moves during the action. Examples are tasks such as unscrewing a bolt, pouring liquids and flipping a pancake. This aspect is often abstracted away in AI planning and action languages that assume that an action is successful as long as all preconditions are fulfilled. In this paper we investigate how constraint-based motion representations used in robot control can be combined with a semantic knowledge base in order to let a robot reason about movements and to automatically generate executable motion descriptions that can be adapted to different robots, objects and tools.