Tracking an embodied AI agent to estimate movement from observations
awesome grounding: A curated list of research papers in visual grounding
[arXiv 2023] Embodied Task Planning with Large Language Models
Official Implementation of NeurIPS'23 Paper "Cross-Episodic Curriculum for Transformer Agents"
Official implementation of the EMNLP 2023 paper "R2H: Building Multimodal Navigation Helpers that Respond to Help Requests"
Code for the ORAR agent for Vision-and-Language Navigation on Touchdown and map2seq
⛏💎 STEVE in Minecraft is for See and Think: Embodied Agent in Virtual Environment
This is a curated list of awesome papers on Embodied AI.
Democratization of RT-2 — "RT-2: New model translates vision and language into action"
Building an open-ended embodied agent in a battle royale FPS game
Official implementation of the NAACL 2024 paper "Navigation as Attackers Wish? Towards Building Robust Embodied Agents under Federated Learning"
A curated list of awesome papers on Embodied AI and related research/industry-driven resources.
Python code implementing LLM4Teach, a policy distillation approach for teaching reinforcement learning agents with Large Language Models
A leaderboard for Embodied Instruction Following papers and BibTeX entries
A curated list for vision-and-language navigation. ACL 2022 paper "Vision-and-Language Navigation: A Survey of Tasks, Methods, and Future Directions"
An open source framework for research in Embodied-AI from AI2.
Official Repo of LangSuitE
A Production Tool for Embodied AI
Official PyTorch Implementation of Genesis: Embodiment Co-Design via Efficient Message and Reward Delivery