
Embodied AI and Robotics

A major direction of our research is to develop AI models for Embodied AI and Robotics. This research is supported by our Interreg research project with partners from the University of Southern Denmark, the University of Lübeck, the University of Applied Sciences Flensburg, DFKI, Harting, and Novo Nordisk: link

Agent-centric representations in embodied artificial intelligence integrate perception, action, and learning by grounding intelligence in an agent’s physical body and its continuous interaction with the world. In the context of AI, machine learning, and robotics, these representations prioritize the agent’s own sensory inputs, motor capabilities, internal state, and goals as they unfold within a dynamic environment. This perspective stands in contrast to environment-centric or third-person models, shifting the focus toward first-person, embodied experience and closed-loop perception–action cycles.
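
As a minimal sketch of such a closed-loop perception–action cycle (an illustration only, assuming a planar point agent; the names PointAgent, sense, act, and run_episode are ours and do not come from any of the projects below):

import numpy as np

# A minimal sketch of a closed-loop perception-action cycle. All names
# (PointAgent, sense, act, run_episode) are illustrative assumptions,
# not an interface from any project listed on this page.

class PointAgent:
    """An embodied agent that only ever observes the world from its own pose."""

    def __init__(self):
        self.pose = np.zeros(2)    # the agent's body state in the plane
        self.memory = np.zeros(2)  # internal state carried across steps

    def sense(self, goal_world):
        # First-person observation: the goal relative to the agent's body,
        # not an absolute, environment-centric coordinate.
        return goal_world - self.pose

    def act(self, obs):
        # A trivial policy: step a fixed fraction toward the egocentric goal.
        self.memory = obs          # remember the most recent observation
        return 0.1 * obs

def run_episode(agent, goal_world, n_steps=50):
    for _ in range(n_steps):
        obs = agent.sense(goal_world)  # perceive
        action = agent.act(obs)        # decide
        agent.pose += action           # act: motion changes the next observation
    return agent.pose

print(run_episode(PointAgent(), np.array([1.0, 2.0])))  # approaches [1., 2.]

The defining property is that the agent never consumes absolute world coordinates: each action changes its pose, which in turn changes its next observation, closing the loop.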

The objective of agent-centric representations in embodied AI is to enable agents to understand, reason about, and adapt to their surroundings through direct interaction—learning from their actions, consequences, and constraints imposed by their embodiment. By explicitly modeling how an agent’s body, sensors, and actuators shape its experience of the world, these representations support more robust decision-making, generalization, and autonomy in real-world settings.
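
To make the role of embodiment concrete, the sketch below (again illustrative; to_egocentric is a hypothetical helper, and we assume a planar robot with pose (x, y, θ)) shows how the same world-frame point yields different first-person observations depending on the agent's pose and heading:

import numpy as np

def to_egocentric(agent_pose, world_point):
    """Express a 2D world-frame point in the agent's body frame.

    agent_pose:  (x, y, theta) of the agent in the world frame.
    world_point: (x, y) of, e.g., a goal or obstacle in the world frame.
    """
    x, y, theta = agent_pose
    dx, dy = world_point[0] - x, world_point[1] - y
    c, s = np.cos(-theta), np.sin(-theta)
    # Rotate the displacement into the body frame (x points along the heading).
    return np.array([c * dx - s * dy, s * dx + c * dy])

# A goal 1 m ahead of an agent facing +y is "straight ahead" ([1, 0]) in the
# body frame, regardless of where in the world the agent happens to stand.
print(to_egocentric((2.0, 3.0, np.pi / 2), (2.0, 4.0)))  # ~[1., 0.]

Representations built on such body-frame quantities generalize across locations by construction, which is one reason agent-centric encodings tend to transfer better to new environments.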

Selected Publications: 

A. Francis, C. Pérez-D'Arpino, C. Li, F. Xia, A. Alahi, R. Alami, A. Bera, A. Biswas, J. Biswas, R. Chandra, H.-T. L. Chiang, M. Everett, S. Ha, J. Hart, J. P. How, H. Karnan, T.-W. E. Lee, L. J. Manso, R. Mirsky, S. Pirk, P. T. Singamaneni, P. Stone, A. V. Taylor, P. Trautman, N. Tsoi, M. Vázquez, X. Xiao, P. Xu, N. Yokoyama, A. Toshev, R. Martín-Martín, Principles and Guidelines for Evaluating Social Robot Navigation Algorithms, ACM Transactions on Human-Robot Interaction, 2025
[Preprint], [ArXiv], [DOI]

J. Deng, S. Marri, J. Klein, W. Palubicki, S. Pirk, G. Chowdhary, D. L. Michels, Gazebo Plants: Simulating Plant-Robot Interaction with Cosserat Rods, Robotics and Sustainability, IEEE International Conference on Robotics and Automation (ICRA), 2024
[Preprint], [ArXiv], [Video], [Bibtex]

C. Cuan, E. Lee, E. Fisher, A. Francis, L. Takayama, T. Zhang, A. Toshev, S. Pirk, Gesture2Path: Imitation Learning for Gesture-aware Navigation, International Conference on Social Robotics (ICSR), 2024
[Preprint], [ArXiv], [Bibtex]

O. Taheri, Yi Zhou, D. Tzionas, Yang Zhou, D. Ceylan, S. Pirk, M. Black, GRIP: Generating Interaction Poses Using Spatial Cues and Latent Consistency, International Conference on 3D Vision, 2024
[Website], [Preprint], [ArXiv], [Bibtex]

A. Francis, C. Pérez-D'Arpino, C. Li, F. Xia, A. Alahi, A. Bera, A. Biswas, J. Biswas, H.-T. L. Chiang, M. Everett, S. Ha, J. Hart, H. Karnan, T.-W. E. Lee, L. J. Manso, R. Mirsky, S. Pirk, P. T. Singamaneni, P. Stone, A. V. Taylor, P. Trautman, N. Tsoi, M. Vázquez, X. Xiao, P. Xu, N. Yokoyama, R. Martín-Martín, A. Toshev, Benchmarking Social Robot Navigation Across Academia and Industry, Best Paper Award Nominee, AAAI Spring Symposium, 2023
[Preprint], [Bibtex], [Symposium Website]

H. Karnan, A. Nair, X. Xiao, G. Warnell, S. Pirk, A. Toshev, J. Hart, J. Biswas, P. Stone, Socially Compliant Navigation Dataset (SCAND): A Large-Scale Dataset of Demonstrations for Social Navigation, IEEE Robotics and Automation Letters (RA-L) and IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2022

S. Pirk, E. Lee, X. Xiao, L. Takayama, A. Francis, A. Toshev, A Protocol for Validating Social Navigation Policies, IEEE International Conference on Robotics and Automation, Workshop: Social Robot Navigation: Advances and Evaluation, 2022
[Preprint], [Poster], [ArXiv], [Bibtex]

Kiel University
Department of Computer Science   
Visual Computing and Artificial Intelligence
Neufeldtstraße 6 (Ground Floor)
D-24118 Kiel
Germany

