Arrow Research search

Author name cluster

Daniel Beßler

Possible papers associated with this exact author name in Arrow. This page groups case-insensitive exact name matches and is not a full identity disambiguation profile.

11 papers
2 author rows

Possible papers (11)

ICRA 2024 · Conference Paper

Translating Universal Scene Descriptions into Knowledge Graphs for Robotic Environment

  • Giang Hoang Nguyen
  • Daniel Beßler
  • Simon Stelter
  • Mihai Pomarlan
  • Michael Beetz

Robots performing human-scale manipulation tasks require an extensive amount of knowledge about their surroundings in order to perform their actions competently and in a human-like manner. In this work, we investigate the use of virtual reality technology as an implementation for robot environment modeling, and present a technique for translating scene graphs into knowledge bases. To this end, we take advantage of the Universal Scene Description (USD) format, which is an emerging standard for the authoring, visualization and simulation of complex environments. We investigate the conversion of USD-based environment models into Knowledge Graph (KG) representations that facilitate semantic querying and integration with additional knowledge sources. The contributions of the paper are validated through an application scenario in the service robotics domain.
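The core step the abstract describes, walking a scene graph and emitting knowledge-graph triples, can be sketched in a few lines. This is an illustrative stand-in, not the paper's actual pipeline: the nested-dict "stage", the `ex:` namespace, and the `isPartOf` relation are all invented for the example.

```python
# Minimal sketch of turning a USD-like scene graph into RDF-style triples.
# The dict-based stage and all names are illustrative assumptions.

def scene_to_triples(prim, parent=None, ns="ex:"):
    """Walk a scene-graph node and emit (subject, predicate, object) triples."""
    subj = ns + prim["name"]
    triples = [(subj, "rdf:type", ns + prim["type"])]
    if parent is not None:
        triples.append((subj, ns + "isPartOf", parent))
    for key, value in prim.get("attributes", {}).items():
        triples.append((subj, ns + key, repr(value)))
    for child in prim.get("children", []):
        triples.extend(scene_to_triples(child, parent=subj, ns=ns))
    return triples

stage = {
    "name": "Kitchen", "type": "Room",
    "children": [
        {"name": "Table1", "type": "Table",
         "attributes": {"height": 0.75},
         "children": [{"name": "Mug1", "type": "Mug"}]},
    ],
}

for t in scene_to_triples(stage):
    print(t)
```

A real system would load the stage via the USD API and serialize the triples into an ontology-backed store; the recursion over prims and the part-of links between parent and child are the part that carries over.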

ECAI 2023 · Conference Paper

Towards a Neuronally Consistent Ontology for Robotic Agents

  • Florian Ahrens
  • Mihai Pomarlan
  • Daniel Beßler
  • Thorsten Fehr
  • Michael Beetz
  • Manfred Herrmann

The Collaborative Research Center for Everyday Activity Science & Engineering (CRC EASE) aims to enable robots to perform environmental interaction tasks with close to human capacity. It therefore employs a shared ontology to model the activity of both kinds of agents, empowering robots to learn from human experiences. To properly describe these human experiences, the ontology will strongly benefit from incorporating characteristics of neuronal information processing which are not accessible from a behavioral perspective alone. We, therefore, propose the analysis of human neuroimaging data for evaluation and validation of concepts and events defined in the ontology model underlying most of the CRC projects. In an exploratory analysis, we employed an Independent Component Analysis (ICA) on functional Magnetic Resonance Imaging (fMRI) data from participants who were presented with the same complex video stimuli of activities as robotic and human agents in different environments and contexts. We then correlated the activity patterns of brain networks represented by derived components with timings of annotated event categories as defined by the ontology model. The present results demonstrate a subset of common networks with stable correlations and specificity towards particular event classes and groups, associated with environmental and contextual factors. These neuronal characteristics will open up avenues for adapting the ontology model to be more consistent with human information processing.
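The correlation step described above, relating a brain network's activity time course to the timings of annotated event categories, amounts to correlating a component time series with an event regressor. A back-of-the-envelope version with made-up data (the event timings and signal values are invented for illustration):

```python
# Correlate a component's activity time course with a boxcar regressor
# built from annotated event timings. All data here are made up.

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs) ** 0.5
    vy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (vx * vy)

# 10 time points; a hypothetical event class is active at t = 2..4 and 7..8.
event_regressor = [0, 0, 1, 1, 1, 0, 0, 1, 1, 0]
component_ts    = [0.1, 0.0, 0.9, 1.1, 0.8, 0.2, 0.1, 1.0, 0.9, 0.3]

print(round(pearson(component_ts, event_regressor), 3))
```

In the actual study the components come from an ICA of fMRI data and the regressors from the ontology's event annotations; the sketch only shows the shape of the comparison.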

ECAI 2020 · Conference Paper

A Formal Model of Affordances for Flexible Robotic Task Execution

  • Daniel Beßler
  • Robert Porzel
  • Mihai Pomarlan
  • Michael Beetz
  • Rainer Malaka
  • John A. Bateman

One of the key reasoning tasks of robotic agents is inferring possible actions that can be accomplished with a given object at hand. This cognitive task is commonly referred to as inferring the affordances of objects. In this paper, we propose a novel conceptualization of affordances and its realization as a description logic ontology. The key idea of the framework is that it proposes candidate affordances through inference, and that these can then be validated through physics-based simulation. We showcase the practical use of our conceptualization by means of demonstrating what competency questions an agent equipped with it can answer. The proposed formal model is implemented as a TBox OWL ontology of affordances based on the DOLCE+DnS Ultralite foundational ontology.
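The "propose through inference, then validate in simulation" loop can be illustrated with a toy rule base. The disposition and task names below are invented for the example, and the simulation step is a stub; the paper's actual model is a description logic ontology, not Python sets.

```python
# Toy version of the propose-then-validate idea: infer candidate
# affordances from object dispositions, then filter with a physics check.

DISPOSITIONS = {
    "Mug":   {"Containment", "Graspability"},
    "Knife": {"Cutting", "Graspability"},
}

TASK_REQUIREMENTS = {
    "PourInto": {"Containment"},
    "PickUp":   {"Graspability"},
    "Slice":    {"Cutting"},
}

def candidate_affordances(obj_class):
    """Tasks whose required dispositions the object class satisfies."""
    have = DISPOSITIONS.get(obj_class, set())
    return {task for task, need in TASK_REQUIREMENTS.items() if need <= have}

def validate_in_simulation(obj_class, task):
    """Stub for the physics-based validation step from the abstract."""
    return True  # a real system would run a simulation here

affordances = {t for t in candidate_affordances("Mug")
               if validate_in_simulation("Mug", t)}
print(sorted(affordances))  # ['PickUp', 'PourInto']
```

The inference step corresponds roughly to classification over the ontology's disposition axioms; the validation step is where simulation prunes candidates that are logically possible but physically infeasible.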

KER 2019 · Journal Article

A review and comparison of ontology-based approaches to robot autonomy

  • Alberto Olivares-Alarcos
  • Daniel Beßler
  • Alaa Khamis
  • Paulo Goncalves
  • Maki K. Habib
  • Julita Bermejo-Alonso
  • Marcos Barreto
  • Mohammed Diab

Within the next decades, robots will need to be able to execute a large variety of tasks autonomously in a large variety of environments. To relax the resulting programming effort, a knowledge-enabled approach to robot programming can be adopted to organize information in re-usable knowledge pieces. However, for the ease of reuse, there needs to be an agreement on the meaning of terms. A common approach is to represent these terms using ontology languages that conceptualize the respective domain. In this work, we will review projects that use ontologies to support robot autonomy. We will systematically search for projects that fulfill a set of inclusion criteria and compare them with each other with respect to the scope of their ontology, what types of cognitive capabilities are supported by the use of ontologies, and what their application domain is.

IROS 2018 · Conference Paper

Cognition-enabled Framework for Mixed Human-Robot Rescue Teams

  • Fereshta Yazdani
  • Gayane Kazhoyan
  • Asil Kaan Bozcuoglu
  • Andrei Haidu
  • Ferenc Balint-Benczedi
  • Daniel Beßler
  • Mihai Pomarlan
  • Michael Beetz

With the advancements in robotic technology and the progress in human-robot interaction research, the interest in deploying mixed human-robot teams in rescue missions is increasing. Due to their complementary capabilities in terms of locomotion, visibility and reachability of areas, human-robot teams are increasingly deployed in real-world settings, although the robotic agents in such scenarios are typically fully teleoperated. A major barrier to successful and efficient mission execution in those teams is the lack of cognitive skills in robotic systems. In this paper, we present a cognition-enabled framework and an implemented system where robotic agents are equipped with cognitive capabilities to naturally communicate with humans and autonomously perform tasks. The framework allows for natural tasking of robots, reasoning about robot behavior, capabilities and actions, and a common belief state representation for shared mission awareness of robots and human operators.

ICRA 2018 · Conference Paper

KnowRob 2.0 - A 2nd Generation Knowledge Processing Framework for Cognition-Enabled Robotic Agents

  • Michael Beetz
  • Daniel Beßler
  • Andrei Haidu
  • Mihai Pomarlan
  • Asil Kaan Bozcuoglu
  • Georg Bartels

In this paper we present KnowRob2, a second generation knowledge representation and reasoning framework for robotic agents. KnowRob2 is an extension and partial redesign of KnowRob, currently one of the most advanced knowledge processing systems for robots that has enabled them to successfully perform complex manipulation tasks such as making pizza, conducting chemical experiments, and setting tables. The knowledge base appears to be a conventional first-order time interval logic knowledge base, but it exists to a large part only virtually: many logical expressions are constructed on demand from data structures of the control program, computed through robotics algorithms including ones for motion planning and solving inverse kinematics problems, and log data stored in noSQL databases. Novel features and extensions of KnowRob2 substantially increase the capabilities of robotic agents of acquiring open-ended manipulation skills and competence, reasoning about how to perform manipulation actions more realistically, and acquiring commonsense knowledge.
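The "virtual" knowledge base idea, facts constructed on demand from the control program's data structures rather than stored as assertions, can be sketched with a small predicate registry. The registry API, predicate names, and robot state below are illustrative assumptions, not KnowRob2's actual (Prolog-based) interface.

```python
# Sketch of a virtual knowledge base: some predicates are not stored
# but computed on demand from the robot's data structures.

COMPUTABLES = {}

def computable(name):
    """Register a function as the on-demand implementation of a predicate."""
    def register(fn):
        COMPUTABLES[name] = fn
        return fn
    return register

# Control-program data structure that backs the "virtual" facts.
ROBOT_STATE = {"base_pose": (1.0, 2.0), "battery": 0.8}

@computable("robot_at")
def robot_at():
    # The fact is constructed at query time from live state.
    x, y = ROBOT_STATE["base_pose"]
    return ("robot_at", x, y)

def query(name, *args):
    """Answer a query by invoking the registered computable."""
    return COMPUTABLES[name](*args)

print(query("robot_at"))  # ('robot_at', 1.0, 2.0)
```

In KnowRob2 the same pattern reaches further: computables may call motion planners or inverse kinematics solvers, or pull log data from noSQL databases, so the logic layer sees a uniform first-order interface over heterogeneous sources.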

IROS 2018 · Conference Paper

KnowRobSIM - Game Engine-Enabled Knowledge Processing Towards Cognition-Enabled Robot Control

  • Andrei Haidu
  • Daniel Beßler
  • Asil Kaan Bozcuoglu
  • Michael Beetz

AI knowledge representation and reasoning methods consider actions to be black boxes that abstract away from how they are executed. This abstract view does not suffice for the decision making capabilities required by robotic agents that are to accomplish manipulation tasks. Such robots have to reason about how to pour without spilling, where to grasp a pot, how to open different containers, and so on. To enable such reasoning it is necessary to consider how objects are perceived, how motions can be executed and parameterized, and how motion parameterization affects the physical effects of actions. To this end, we propose to complement and extend symbolic reasoning methods with KnowRobSIM, an additional reasoning infrastructure based on modern game engine technology, including subsymbolic world modeling through data structures, physics-engine-based action simulation, and world scene rendering. We demonstrate how KnowRobSIM can perform powerful reasoning, prediction, and learning tasks that are required for informed decision making in object manipulation.

IROS 2018 · Conference Paper

Reasoning Systems for Semantic Navigation in Mobile Robots

  • Jonathan Crespo
  • Ramón Barber
  • Óscar Martínez Mozos
  • Daniel Beßler
  • Michael Beetz

Semantic navigation is the navigation paradigm in which environmental semantic concepts and their relationships are taken into account to plan the route of a mobile robot. This paradigm facilitates the interaction with humans and the understanding of human environments in terms of navigation goals and tasks. At the high level, a semantic navigation system requires two main components: a semantic representation of the environment, and a reasoning system. This paper focuses on developing a model of the environment using semantic concepts. It presents two solutions for the semantic navigation paradigm, both implementing an ontological model. Whilst the first one uses a relational database, the second one is based on KnowRob. Both systems have been integrated in a semantic navigator. We compare both systems at the qualitative and quantitative levels, and present an implementation on a mobile robot as a proof of concept.
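The interplay of the two components, a semantic environment model plus a reasoner that turns a goal concept into a route, can be shown with a toy map. The rooms, contents, and adjacency below are invented; the paper's systems back this model with a relational database or KnowRob rather than Python dicts.

```python
# Toy semantic navigation: the goal is an object class, the map is a
# graph of semantically labelled rooms, and the planner finds the
# nearest room whose contents satisfy the goal.

from collections import deque

ROOMS = {
    "Hall":    {"adjacent": ["Kitchen", "Office"], "contains": set()},
    "Kitchen": {"adjacent": ["Hall"], "contains": {"Fridge", "Mug"}},
    "Office":  {"adjacent": ["Hall"], "contains": {"Desk"}},
}

def route_to_object(start, target_class):
    """Breadth-first search for the nearest room containing the target class."""
    queue, seen = deque([[start]]), {start}
    while queue:
        path = queue.popleft()
        room = path[-1]
        if target_class in ROOMS[room]["contains"]:
            return path
        for nxt in ROOMS[room]["adjacent"]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None

print(route_to_object("Office", "Mug"))  # ['Office', 'Hall', 'Kitchen']
```

The point of the semantic layer is the goal specification: the user asks for "a Mug" rather than for coordinates, and the ontology resolves which places can satisfy that concept before any metric planning happens.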

ICRA 2016 · Conference Paper

Open robotics research using web-based knowledge services

  • Michael Beetz
  • Daniel Beßler
  • Jan Oliver Winkler
  • Jan-Hendrik Worch
  • Ferenc Balint-Benczedi
  • Georg Bartels
  • Aude Billard
  • Asil Kaan Bozcuoglu

In this paper we discuss how the combination of modern technologies in “big data” storage and management, knowledge representation and processing, cloud-based computation, and web technology can help the robotics community to establish and strengthen an open research discipline. We describe how we made the demonstrator of an EU project review openly available to the research community. Specifically, we recorded episodic memories with rich semantic annotations during a pizza preparation experiment in autonomous robot manipulation. Afterwards, we released them as an open knowledge base using the cloud- and web-based robot knowledge service openEASE. We discuss several ways in which this open data can be used to validate our experimental reports and to tackle novel challenging research problems.

IROS 2015 · Conference Paper

Robotic agents capable of natural and safe physical interaction with human co-workers

  • Michael Beetz
  • Georg Bartels
  • Alin Albu-Schäffer
  • Ferenc Balint-Benczedi
  • Rico Belder
  • Daniel Beßler
  • Sami Haddadin
  • Alexis Maldonado

Many future application scenarios of robotics envision robotic agents to be in close physical interaction with humans: On the factory floor, robotic agents shall support their human co-workers with the dull and health-threatening parts of their jobs. In their homes, robotic agents shall enable people to stay independent, even if they have disabilities that require physical help in their daily life - a pressing need for our aging societies. A key requirement for such robotic agents is that they are safety-aware, that is, that they know when actions may hurt or threaten humans and actively refrain from performing them. Safe robot control systems are a current research focus in control theory. The control system designs, however, are a bit paranoid: programmers build “software fences” around people, effectively preventing physical interactions. To physically interact in a competent manner, robotic agents have to reason about the task context, the human, and her intentions. In this paper, we propose to extend cognition-enabled robot control by introducing humans, physical interaction events, and safe movements as first-class objects into the plan language. We show the power of the safety-aware control approach in a real-world scenario with a leading-edge autonomous manipulation platform. Finally, we share our experimental recordings through an online knowledge processing system, and invite the reader to explore the data with queries based on the concepts discussed in this paper.