
M.Sc. Christoph Hofmeister

Technische Universität München

Professur für Codierung und Kryptographie (Prof. Wachter-Zeh)

Postal:
Theresienstr. 90
80333 München

Biography

  • Cooperative study program at Infineon Technologies (2015–2019)
  • B.Eng. in Electrical Engineering and Information Technology, Hochschule München (2019)
  • M.Sc. in Electrical Engineering and Information Technology, Technische Universität München (2021)
  • Since October 2021: doctoral researcher at the Institute for Communications Engineering, Coding and Cryptography group (Prof. Wachter-Zeh)

Theses

Offered Theses

Ongoing Theses

Fast Matrix Multiplication Algorithms

Description

The search for fast matrix multiplication algorithms started when Volker Strassen found a way to multiply 2×2 matrices using 7 (instead of 8) scalar multiplications [1]. Through recursive application to block matrices, Strassen's algorithm multiplies n × n matrices with sub-cubic complexity O(n^log2(7)) ≈ O(n^2.81).
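The seven products of the 2×2 base case can be sketched in a few lines of Python (the function name is illustrative, not from the topic description):

```python
def strassen_2x2(A, B):
    # Strassen's scheme: 7 multiplications instead of 8 for a 2x2 product.
    # In the recursive algorithm, the entries are themselves matrix blocks.
    (a11, a12), (a21, a22) = A
    (b11, b12), (b21, b22) = B
    m1 = (a11 + a22) * (b11 + b22)
    m2 = (a21 + a22) * b11
    m3 = a11 * (b12 - b22)
    m4 = a22 * (b21 - b11)
    m5 = (a11 + a12) * b22
    m6 = (a21 - a11) * (b11 + b12)
    m7 = (a12 - a22) * (b21 + b22)
    # Recombine the seven products into the four entries of C = A * B.
    return [[m1 + m4 - m5 + m7, m3 + m5],
            [m2 + m4, m1 - m2 + m3 + m6]]

# Matches the ordinary product: [[1,2],[3,4]] @ [[5,6],[7,8]] = [[19,22],[43,50]]
print(strassen_2x2([[1, 2], [3, 4]], [[5, 6], [7, 8]]))
```

Since additions are cheap compared to recursive multiplications, saving one of the eight products per level yields the O(n^log2(7)) total cost.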

Since then, multiple algorithms with successively lower complexity have been discovered.


The goal of this seminar topic is to give an overview of these fast matrix multiplication algorithms, focusing on the mathematical concepts underlying their inner workings and discovery.


[1] Strassen, V. Gaussian elimination is not optimal. Numer. Math. 13, 354–356 (1969). https://doi.org/10.1007/BF02165411

Supervisor:

Gradient Compression

Description

In many distributed and federated learning systems, clients iteratively compute so-called gradient vectors from their locally stored data and communicate them to a central entity. Gradient vectors are typically high-dimensional, so transmitting them directly incurs a large communication cost that limits the performance of the system.

To alleviate this issue, various gradient compression schemes have been proposed.

The student's task is to analyze and compare several proposed schemes in terms of their advantages and disadvantages. As a starting point, students can use [1, Section 3.2].

[1] S. Ouyang, D. Dong, Y. Xu, and L. Xiao, “Communication optimization strategies for distributed deep neural network training: A survey,” Journal of Parallel and Distributed Computing, vol. 149, pp. 52–65, Mar. 2021, doi: 10.1016/j.jpdc.2020.11.005.

Supervisor:

Publications

2023

  • Egger, Maximilian; Hofmeister, Christoph; Wachter-Zeh, Antonia; Bitar, Rawad: Private Aggregation in Wireless Federated Learning with Heterogeneous Clusters. 2023 IEEE International Symposium on Information Theory (ISIT), IEEE, 2023.
  • Hofmeister, Christoph; Maßny, Luis; Yaakobi, Eitan; Bitar, Rawad: Trading Communication for Computation in Byzantine-Resilient Gradient Coding. 2023 IEEE International Symposium on Information Theory (ISIT), IEEE, 2023.

2022

  • Hofmeister, Christoph; Maßny, Luis; Bitar, Rawad; Yaakobi, Eitan: Trading Communication and Computation for Security in Gradient Coding. Munich Workshop on Coding and Cryptography 2022, 2022.
  • Hofmeister, Christoph; Maßny, Luis; Bitar, Rawad; Yaakobi, Eitan: Trading Communication and Computation for Security in Gradient Coding. 2022 IEEE European School of Information Theory (ESIT), 2022.
  • Hofmeister, Christoph; Maßny, Luis; Bitar, Rawad; Yaakobi, Eitan: Trading Communication and Computation for Security in Gradient Coding. TUM ICE Workshop Raitenhaslach, 2022.
  • Hofmeister, Christoph; Bitar, Rawad; Xhemrishi, Marvin; Wachter-Zeh, Antonia: Secure Private and Adaptive Matrix Multiplication Beyond the Singleton Bound. IEEE Journal on Selected Areas in Information Theory 3 (2), 2022, 275–285.
  • Hofmeister, Christoph; Bitar, Rawad; Xhemrishi, Marvin; Wachter-Zeh, Antonia: Secure Private and Adaptive Matrix Multiplication Beyond the Singleton Bound. WCC 2022: The Twelfth International Workshop on Coding and Cryptography, 2022.