Networked Multimedia Systems
Overview
Multimedia content has become the dominant traffic on the Internet, both for stationary and mobile users. The main contributors are over-the-top video-on-demand services such as YouTube and Netflix, augmented by the multimedia service offerings of Internet Service Providers for IPTV and network-based video recording. Internet telephony and multimedia conferencing applications (Skype, WebEx, Lync, GoToMeeting, etc.) have a longer history but lower volume; they are now seeing another push as services migrate towards full integration with the web browser using WebRTC technologies (e.g., Google Hangouts). From early packet audio experiments to ubiquitous HD video and beyond, Internet multimedia protocols and systems have seen many changes over time: in media distribution, in content encoding, in media and signaling protocols, and in systems platforms. But in spite of these changes, the fundamentals and some designs remain stable. In this class, we will explore the technical foundations of packet-based multimedia communications, focusing on the functions of the endpoints and only briefly touching upon support mechanisms inside the network. We will present the architecture and the protocol building blocks for Internet multimedia and also dive into system and implementation aspects. We will also consider how to assess the performance of multimedia communications from a systems and a user perspective. Backed by introductory lectures, the key element of this course will be designing and implementing a working multimedia communication system that is interoperable with today’s (commercial) products.
An important element of system design is the end-user experience, and thus the fidelity of the content delivery and user interaction process. Substantial effort has gone into extending the content delivery infrastructure to replicated data centers, moving content closer to the user. Still, for many interactive multimedia applications, especially augmented and virtual reality, even the distance to a nearby data center may be too large, which has motivated exploring edge computing as a way to push content processing almost all the way to the user.
Edge computing is an emerging paradigm in which computation is largely or completely performed on distributed device nodes located closer to end-users. Normally, this computation would be executed in the cloud, but to improve performance, reduce uplink utilization, and provide “ambient intelligence”, it is moved to what are commonly known as “smart devices”. Edge computing is often seen as an enabler of cyber-physical systems and ubiquitous computing. Multimedia applications such as video streaming, 360° video streaming, and gaming are currently limited by their sensitivity to delay and their high bandwidth consumption. Edge computing can help solve these problems and open the way to new multimedia solutions. Among the emerging multimedia applications is Augmented Reality (AR), a technique that enables users to interact with their physical environment through the overlay of digital information. While AR has been researched for decades, it has only recently moved out of research labs into the field. Combined with edge computing, future AR applications can become more efficient and scalable. Offloading computation to stationary or mobile devices can enhance some aspects of existing AR pipelines and enable new ones.
Both edge computing and AR will be introduced in this course, providing the fundamental concepts required to understand both technologies. The AR lectures will focus on computer vision aspects, while the edge computing lectures lean towards distributed systems architectures.
The topics to be covered span multimedia networking and systems, specifics of augmented and virtual reality, and edge computing. We will start out by providing a solid foundation in the technology essentials and then dive deeper into selected areas, to be determined as a function of the actual projects pursued. The essentials comprise:
- Multimedia systems by example
- Background of multimedia communications in the Internet
- Network support for multimedia
- Media codecs by example
- Real-time transport protocol (RTP)
- RTP payload formats
- Concepts for multimedia control
- Multimedia streaming
- AR - background, examples and application fields
- AR toolbox: building blocks of an AR application
- Mobile, pervasive augmented reality
- EC - background, approaches, frameworks etc.
- EC+AR application scenarios
- Open challenges
We may have a warm-up assignment to build a small application that makes extensive use of sockets, some web protocols, application functions, and some edge components, to get a hands-on understanding of the diverse technologies. We will then move on to design and implementation projects in which teams design, implement, test, and demo a comprehensive system. We will offer some ideas for the latter but are happy to discuss your own ideas. Both should be tackled in small teams.
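As a rough illustration of the kind of socket-level work the warm-up assignment and the projects involve, the sketch below packs an RTP fixed header (per RFC 3550) and sends it over a UDP socket in Python. The payload type, SSRC, port, and payload contents are illustrative placeholders, not part of any assignment specification.

    import socket
    import struct

    def make_rtp_header(seq, timestamp, ssrc, payload_type=96, marker=0):
        # 12-byte RTP fixed header: V=2, P=0, X=0, CC=0, then M/PT,
        # sequence number, timestamp, and SSRC, all in network byte order.
        byte0 = 2 << 6
        byte1 = ((marker & 0x1) << 7) | (payload_type & 0x7F)
        return struct.pack("!BBHII", byte0, byte1,
                           seq & 0xFFFF, timestamp & 0xFFFFFFFF, ssrc & 0xFFFFFFFF)

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    payload = b"\x00" * 160                   # placeholder payload (e.g., 20 ms of G.711 audio)
    packet = make_rtp_header(seq=1, timestamp=0, ssrc=0x12345678) + payload
    sock.sendto(packet, ("127.0.0.1", 5004))  # port 5004 chosen arbitrarily for a local test

A real sender would, of course, increment the sequence number and timestamp per packet and carry actual encoded media; the point here is only the combination of header packing and datagram sockets that the course builds on.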
In the beginning, we will run lectures interspersed with topic discussions; once every team has decided on its project topic, we will move to a lightweight schedule with regular progress meetings and occasional lectures on selected content as needed. The class will be held by default in our seminar room 01.07.023 as follows:
- Monday 14 - 16
- Tuesday 12 - 14
- Friday 10 - 12
There will be some deviations on special occasions when we cannot hold lectures in these slots. Since this is a practical course, the emphasis will be on development, so not every slot will be used. We will focus on content teaching in the beginning and then restrict ourselves to exercise sessions.