Project Lab Human Activity Understanding
| Lecturer (assistant) | |
|---|---|
| Number | 0000000155 |
| Type | Practical training |
| Duration | 5 SWS |
| Term | Winter semester 2024/25 |
| Language of instruction | English |
| Position within curricula | See TUMonline |
| Dates | See TUMonline |
- 17.10.2024 13:15-17:15 0943, Lab session
- 24.10.2024 13:15-17:15 0943, Lab session
- 31.10.2024 13:15-17:15 0943, Lab session
- 07.11.2024 13:15-17:15 0943, Lab session
- 14.11.2024 13:15-17:15 0943, Lab session
- 21.11.2024 13:15-17:15 0943, Lab session
- 28.11.2024 13:15-17:15 0943, Lab session
- 12.12.2024 13:15-17:15 0943, Lab session
- 19.12.2024 13:15-17:15 0943, Lab session
- 09.01.2025 13:15-17:15 0943, Lab session
- 16.01.2025 13:15-17:15 0943, Lab session
- 23.01.2025 13:15-17:15 0943, Lab session
- 30.01.2025 13:15-17:15 0943, Lab session
- 06.02.2025 13:15-17:15 0943, Lab session
Admission information
Objectives
Upon successful completion of this module, students are able to understand the challenges in Human Activity Understanding and design processes for automatic sensor-based recognition of ongoing human activity.
Students are able to collect and utilize synthetic data as well as multi-camera sequential data in egocentric and stationary setups, annotate and extract relevant semantic information, and work with representations of spatial and temporal data.
Students are able to use AI models and algorithms to extract the information available in a scene and to recognize and predict human activity based on that information.
They are eventually able to analyze and evaluate the results of the various algorithms involved, as well as the solutions they have designed.
Description
Sensor data collection and annotation
- Multi-sensor and multi-view data collection and processing, including color/depth/IMU
- Synthetic data generation for Human Actions
- Accelerated ground truth annotation using interactive instance segmentation and tracking
Semantic inference building blocks
- Object detection
- Human and Object pose estimation/tracking
Graph representation of spatial and temporal data
- 3D scene graphs
- Spatio-Temporal graphs
- Knowledge Bases (Ontologies)
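As a purely illustrative sketch (not part of the course materials), a spatio-temporal graph over detected objects can be represented with nodes per (frame, object), spatial edges within a frame, and temporal edges linking the same object instance across consecutive frames; all names below are hypothetical:

```python
# Minimal sketch of a spatio-temporal graph: nodes are detected objects
# per frame, spatial edges connect objects within a frame, and temporal
# edges link the same object instance across consecutive frames.
from itertools import combinations

def build_st_graph(frames):
    """frames: list of dicts mapping object_id -> (x, y) position."""
    nodes = []           # (frame_index, object_id)
    spatial_edges = []   # relations within one frame
    temporal_edges = []  # relations across consecutive frames
    for t, objects in enumerate(frames):
        for obj_id in objects:
            nodes.append((t, obj_id))
        # spatial relations: fully connect objects in the same frame
        for a, b in combinations(sorted(objects), 2):
            spatial_edges.append(((t, a), (t, b)))
        # temporal relations: same object seen in the previous frame
        if t > 0:
            for obj_id in objects:
                if obj_id in frames[t - 1]:
                    temporal_edges.append(((t - 1, obj_id), (t, obj_id)))
    return nodes, spatial_edges, temporal_edges

# Two frames: a hand and a cup, tracked over time
frames = [{"hand": (0, 0), "cup": (1, 0)},
          {"hand": (0, 1), "cup": (1, 0)}]
nodes, spatial, temporal = build_st_graph(frames)
```

In practice, such graphs typically carry richer node features (poses, object classes) and learned relation labels; this toy version only shows the graph structure itself.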
Sequential deep learning models for Human Activity Recognition and Anticipation
- Recurrent Neural Networks
- Graph Networks
- Transformers
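For orientation only, a minimal recurrent model for classifying an activity from a sequence of per-frame features can be sketched in NumPy as below; the architecture, dimensions, and parameter names are illustrative assumptions, not the models used in the lab:

```python
import numpy as np

# Minimal sketch of a recurrent classifier for activity sequences:
# one tanh RNN layer, then a softmax over activity classes.
rng = np.random.default_rng(0)

def rnn_classify(x_seq, W_xh, W_hh, W_hy):
    """x_seq: (T, input_dim) array of per-frame features."""
    h = np.zeros(W_hh.shape[0])
    for x in x_seq:               # unroll over time steps
        h = np.tanh(W_xh @ x + W_hh @ h)
    logits = W_hy @ h             # classify from the final hidden state
    e = np.exp(logits - logits.max())
    return e / e.sum()            # class probabilities

input_dim, hidden_dim, num_classes = 8, 16, 4
W_xh = rng.normal(0, 0.1, (hidden_dim, input_dim))
W_hh = rng.normal(0, 0.1, (hidden_dim, hidden_dim))
W_hy = rng.normal(0, 0.1, (num_classes, hidden_dim))

probs = rnn_classify(rng.normal(size=(10, input_dim)), W_xh, W_hh, W_hy)
```

Graph networks and Transformers replace the recurrent update with message passing or attention over the spatio-temporal structure, but the overall pattern (per-frame features in, activity distribution out) stays the same.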
Teaching and learning methods
- Supervised weekly lab sessions with several introductory lectures by research assistants at the beginning of the course, and supervised practical implementation based on the provided skeleton code.
- Individual methods and solutions introduced by the student
- Lectures on the theoretical basics of project planning and technical management, and tools for collaboration (SCRUM, GitLab, Wiki, etc.)
- Final project: individual and group work with independent planning, execution, and documentation
- Seminar: presentation of final results and discussion (reflection, feedback)
Media formats:
The following media forms will be used:
- Presentations
- Script and review articles from the technical literature
- Tutorials and software documentation
- Development Environment (virtual machines on a GPU server)
- Simulation environment
- Data collection setup
Examination
- [20%] Implementation of introductory practical tasks in the field of Human Activity Understanding in Python: data acquisition and processing, recognition of people and objects in the scene, and obtaining a semantic understanding of ongoing activity (4 programming tasks).
- [60%] Hands-on project work: creating an initial project plan and presenting it (8-10 min presentation), regularly discussing work progress and next steps with the supervisor, solving technical problems, and using appropriate tools for efficient teamwork (4 project sessions).
- [20%] Approximately 20-minute presentation of results, including a demo, followed by an approximately 10-minute discussion.
Links
Previous Lab Project Demos
Kick-off meeting announcement
The lab kick-off meeting will take place in person on 17.10.2024 in Seminar Room 0406 (https://nav.tum.de/room/0504.EG.406), from 15:00 to 16:30.