Used for training LLMs to describe, assist with, or predict real-life human activities from multimodal input.
The Aria Ego-Exo Activity (AEA) dataset is part of Meta Reality Labs’ Project Aria initiative and focuses on capturing human activities from both egocentric (first-person) and exocentric (third-person) viewpoints. The dataset includes synchronized multimodal sensor data such as video, audio, depth, and inertial measurements, enabling a comprehensive representation of human actions in real-world environments. It is designed to support research in understanding complex human behaviors across perspectives.
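As a rough illustration of what "synchronized multimodal sensor data" means in practice, the sketch below aligns IMU samples to video frame timestamps by nearest-neighbor search. The timestamps, sampling rates, and field layout are illustrative assumptions, not the dataset's actual schema or tooling.

```python
import bisect

# Hypothetical example: align IMU samples to video frames by timestamp.
# Timestamps in nanoseconds; the real AEA data layout may differ.
video_ts = [0, 33_333_333, 66_666_666, 100_000_000]                 # ~30 fps frames
imu = [(t * 1_000_000, (0.0, 0.0, 9.8)) for t in range(0, 101)]     # ~1 kHz accelerometer
imu_ts = [t for t, _ in imu]

def nearest_imu(frame_ts: int) -> int:
    """Return the index of the IMU sample closest in time to frame_ts."""
    i = bisect.bisect_left(imu_ts, frame_ts)
    candidates = [j for j in (i - 1, i) if 0 <= j < len(imu)]
    return min(candidates, key=lambda j: abs(imu_ts[j] - frame_ts))

for ts in video_ts:
    j = nearest_imu(ts)
    print(f"frame @ {ts} ns -> imu @ {imu_ts[j]} ns, accel={imu[j][1]}")
```

The same nearest-timestamp idea extends to audio, depth, or any other stream recorded on a shared clock.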
The AEA dataset is used for research in activity recognition, human–environment interaction, and multimodal perception. It supports the development of models that can reason about actions, intentions, and interactions from wearable and external camera views. This is particularly valuable for embodied AI, robotics, AR/VR applications, and systems that must understand human behavior across viewpoints.
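As a loose sketch of the kind of cross-view model such data enables, the following PyTorch module fuses egocentric and exocentric features for action classification. The feature dimensions, class count, and flat feature vectors are placeholder assumptions; real models would use video backbones and temporal modeling.

```python
import torch
import torch.nn as nn

class EgoExoFusionClassifier(nn.Module):
    """Toy two-view action classifier: encode each view, fuse, classify."""

    def __init__(self, feat_dim: int = 512, num_classes: int = 20):
        super().__init__()
        self.ego_proj = nn.Linear(feat_dim, 256)   # egocentric-view features
        self.exo_proj = nn.Linear(feat_dim, 256)   # exocentric-view features
        self.head = nn.Sequential(
            nn.ReLU(),
            nn.Linear(512, num_classes),           # fused features -> action logits
        )

    def forward(self, ego_feat: torch.Tensor, exo_feat: torch.Tensor) -> torch.Tensor:
        fused = torch.cat([self.ego_proj(ego_feat), self.exo_proj(exo_feat)], dim=-1)
        return self.head(fused)

# Usage with random placeholder features for a batch of 4 clips.
model = EgoExoFusionClassifier()
logits = model(torch.randn(4, 512), torch.randn(4, 512))
print(logits.shape)  # torch.Size([4, 20])
```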