
M.Sc. Luis Maßny

Technical University of Munich

Associate Professorship of Coding and Cryptography (Prof. Wachter-Zeh)

Postal address

Theresienstr. 90
80333 München

Biography

  • Doctoral researcher under the supervision of Prof. Antonia Wachter-Zeh and Dr. Rawad Bitar, Technical University of Munich, since September 2021.
  • M.Sc. Electrical Engineering, Information Technology, and Computer Engineering, RWTH Aachen University, 2020.
  • B.Sc. Electrical Engineering, Information Technology, and Computer Engineering, RWTH Aachen University, 2018.

Research Interests

  • Security and Privacy for Distributed Systems
  • Coding Theory
  • Information Theory
  • Wireless Communication and Signal Processing
  • Federated Learning

Theses

Available Theses

Understanding Guarantees and Pitfalls of Differential Privacy

Description

Many data-driven applications can be modeled as communication between a data curator and a data analyst, who queries a database for particular population statistics. When the individual database entries are considered sensitive information, the data curator can take additional measures to ensure the privacy of individual entries.

Differential Privacy (DP) [1] has become a popular notion of data privacy, measuring the ability of a curious data analyst to distinguish between the values of different sensitive database entries. To use DP in practical systems, it is important to understand the fundamental guarantees of a system that claims to ensure DP.

While it is sometimes believed that DP guarantees hold unconditionally and even in the presence of arbitrary side information, it has been shown that it is not possible to provide privacy and utility without making assumptions about how the data are generated [2]. In particular, dependence (correlation) between different database entries can be exploited to break the alleged privacy guarantees [3].

In this seminar topic, the student will become familiar with the definition and formal guarantees of DP and study the issues and pitfalls of DP, with a particular focus on dependent data distributions. The student will summarize their findings in the form of a scientific presentation and a scientific article, based on their own reading of scientific papers. These include, but are not necessarily limited to, the recommended references [1-3].
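As a first concrete illustration of the DP definition, the classical Laplace mechanism [1] adds noise calibrated to a query's sensitivity. The sketch below (function and parameter names are illustrative) answers a counting query, which has sensitivity 1, with ε-DP:

```python
import numpy as np

def laplace_count(database, predicate, epsilon, rng=np.random.default_rng()):
    """Answer a counting query with the Laplace mechanism.

    A counting query has sensitivity 1: changing one database entry
    changes the count by at most 1. Adding Laplace(1/epsilon) noise
    therefore yields epsilon-DP for this single query.
    """
    true_count = sum(predicate(x) for x in database)
    noise = rng.laplace(loc=0.0, scale=1.0 / epsilon)
    return true_count + noise
```

Note that this guarantee is stated per query and per pair of neighboring databases; as discussed above, it does not automatically survive arbitrary correlations between entries [2, 3].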

[1] C. Dwork and A. Roth, “The Algorithmic Foundations of Differential Privacy,” Foundations and Trends in Theoretical Computer Science, 2014.

[2] D. Kifer and A. Machanavajjhala, “No Free Lunch in Data Privacy,” in Proceedings of the 2011 ACM SIGMOD International Conference on Management of Data (SIGMOD '11), 2011.

[3] C. Liu, S. Chakraborty, and P. Mittal, “Dependence Makes You Vulnerable: Differential Privacy Under Dependent Tuples,” in Proceedings of the Network and Distributed System Security Symposium, 2016.


Contact

Luis Maßny (luis.massny@tum.de)


Differentially-Private and Robust Federated Learning

Description

Federated learning is a machine learning paradigm that aims to learn collaboratively from decentralized private data owned by entities referred to as clients. However, due to its decentralized nature, federated learning is susceptible to poisoning attacks, where malicious clients try to corrupt the learning process by modifying their data or local model updates. Moreover, the updates sent by the clients might leak information about the private data involved in the learning. This thesis aims to investigate and combine existing robust aggregation techniques in federated learning with differential privacy techniques.
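The tension between the two goals can be sketched in a few lines. The example below (illustrative only, not taken from the references) clips client updates, aggregates them with a coordinate-wise median as a simple robust rule, and adds Gaussian noise; calibrating the noise to a formal (ε, δ) guarantee for such an aggregator is exactly the kind of question this thesis addresses:

```python
import numpy as np

def private_robust_aggregate(updates, clip_norm=1.0, noise_std=0.1,
                             rng=np.random.default_rng()):
    """Illustrative sketch: clip each client update to a bounded norm,
    aggregate with the coordinate-wise median (robust to a minority of
    poisoned updates), then add Gaussian noise toward differential
    privacy. noise_std is a placeholder; relating it to a formal
    (eps, delta) guarantee for the median is left open here.
    """
    clipped = []
    for u in updates:
        norm = np.linalg.norm(u)
        clipped.append(u * min(1.0, clip_norm / max(norm, 1e-12)))
    robust = np.median(np.stack(clipped), axis=0)
    return robust + rng.normal(0.0, noise_std, size=robust.shape)
```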

References:

[1] https://arxiv.org/pdf/2304.09762.pdf

[2] https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=9757841

[3] https://dl.acm.org/doi/abs/10.1145/3465084.3467919

Prerequisites

- Knowledge about machine learning and gradient descent optimization

- Proficiency in Python and PyTorch

- Undergraduate statistics courses

- Prior knowledge about differential privacy is a plus

Contact

marvin.xhemrishi@tum.de

luis.massny@tum.de


Theses in Progress

Efficient Federated Learning over Wireless Channels with Energy-Constrained Devices

Description

Machine learning exploits large collections of data to train parameterized models for a dedicated task, e.g., classification or regression. In mobile radio networks, valuable training data is located at the network edge and distributed among the network users. Transferring the data to a central server for model training is often undesirable due to the high communication load and privacy restrictions on the users’ data. The federated learning paradigm [1] addresses these challenges by training the model locally at the users and sending only the model updates to a central server.
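In its simplest synchronous form, one round of this paradigm (FedAvg [1]) can be sketched as follows; the linear model and parameter names are illustrative stand-ins for the local training step:

```python
import numpy as np

def local_update(model, data, targets, lr=0.1):
    """One local gradient step for a linear model under squared loss.
    Stands in for the local training performed at each user."""
    grad = 2 * data.T @ (data @ model - targets) / len(targets)
    return model - lr * grad

def fedavg_round(model, client_data, lr=0.1):
    """One synchronous round: each user trains locally on its own data,
    and the server averages the resulting models, weighted by the
    number of local samples. Only model updates leave the users."""
    updates, weights = [], []
    for data, targets in client_data:
        updates.append(local_update(model.copy(), data, targets, lr))
        weights.append(len(targets))
    return np.average(np.stack(updates), axis=0,
                      weights=np.array(weights, dtype=float))
```

In a wireless deployment, each round additionally involves scheduling which users transmit and how much energy they spend on computation and communication, which is where the optimization problems of this thesis arise.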

When executed over wireless channels, federated learning faces several challenges, for instance scarce radio resources, varying channel conditions, and user dropouts. Moreover, mobile devices have a limited energy budget that they can invest in the federated learning procedure. Therefore, prior work has targeted the optimization of learning performance through user selection and resource allocation [2, 3], or the joint optimization of computation and communication for energy efficiency [4, 5]. Although these solutions showed that learning performance and energy efficiency can be improved by sophisticated design, they rely on numerical solvers with many degrees of freedom. Furthermore, the focus in the literature lies on synchronous model updates. In particular, the exploration of asynchronous model updates as in [6] is underrepresented, although these have been shown to outperform synchronous protocols in some cases [7].

The goal of this thesis is to formulate a suitable system model and to design, implement, and analyze analytical solutions that improve the efficiency of federated learning over wireless channels with energy-constrained devices.


[1] B. McMahan, E. Moore, D. Ramage, S. Hampson, and B. A. y. Arcas, “Communication-Efficient Learning of Deep Networks from Decentralized Data,” in Proceedings of the 20th International Conference on Artificial Intelligence and Statistics, ser. Proceedings of Machine Learning Research, vol. 54, 2017, pp. 1273–1282.
[2] M. M. Amiri, D. Gündüz, S. R. Kulkarni, and H. V. Poor, “Convergence of update aware device scheduling for federated learning at the wireless edge,” IEEE Transactions on Wireless Communications, vol. 20, no. 6, pp. 3643–3658, 2021.
[3] M. Chen, Z. Yang, W. Saad, C. Yin, H. V. Poor, and S. Cui, “A joint learning and communications framework for federated learning over wireless networks,” IEEE Transactions on Wireless Communications, vol. 20, no. 1, pp. 269–283, 2021.
[4] Z. Yang, M. Chen, W. Saad, C. S. Hong, and M. Shikh-Bahaei, “Energy efficient federated learning over wireless communication networks,” IEEE Transactions on Wireless Communications, vol. 20, no. 3, pp. 1935–1949, 2021.
[5] X. Mo and J. Xu, “Energy-efficient federated edge learning with joint communication and computation design,” Journal of Communications and Information Networks, vol. 6, no. 2, pp. 110–124, 2021.
[6] Z. Chen, W. Yi, Y. Liu, and A. Nallanathan, “Robust federated learning for unreliable and resource-limited wireless networks,” IEEE Transactions on Wireless Communications, 2024.
[7] S. Dutta, J. Wang, and G. Joshi, “Slow and stale gradients can win the race,” IEEE Journal on Selected Areas in Information Theory, vol. 2, no. 3, pp. 1012–1024, 2021.

Prerequisites

  • Probability theory
  • Prior experience in programming with PyTorch
  • Federated learning


Secure Federated Learning with Differential Privacy

Description

Federated learning is a machine learning paradigm that aims to learn collaboratively from decentralized private data owned by entities referred to as clients. However, due to its decentralized nature, federated learning is susceptible to model poisoning attacks, where malicious clients try to corrupt the learning process by modifying local model updates. Moreover, the updates sent by the clients might leak information about the private data involved in the learning. The goal of this work is to investigate and combine existing robust aggregation techniques in federated learning with differential privacy techniques.

References:

[1] https://arxiv.org/pdf/2304.09762.pdf

[2] https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=9757841

[3] https://dl.acm.org/doi/abs/10.1145/3465084.3467919

Prerequisites

- Basic knowledge about machine learning and gradient descent optimization

- First experience with machine learning in Python

- Undergraduate statistics courses

- Prior knowledge about differential privacy is a plus


Publications

2024

  • Jain, S.; Maßny, L.; Hofmeister, C.; Yaakobi, E.; Bitar, R.: Interactive Byzantine-Resilient Gradient Coding for General Data Assignments. 2024 IEEE International Symposium on Information Theory (ISIT), IEEE, 2024, pp. 3273–3278.

2023

  • Hofmeister, C.; Maßny, L.; Yaakobi, E.; Bitar, R.: Trading Communication for Computation in Byzantine-Resilient Gradient Coding. 2023 IEEE International Symposium on Information Theory (ISIT), IEEE, 2023.
  • Maßny, L.; Wachter-Zeh, A.: Secure Over-the-Air Data Aggregation with Untrusted Users. 2023 Asilomar Conference on Signals, Systems, and Computers, 2023.
  • Maßny, L.; Wachter-Zeh, A.: Secure Over-the-Air Computation Using Zero-Forced Artificial Noise. 2023 IEEE Information Theory Workshop (ITW), IEEE, 2023.

2022

  • Maßny, L.; Hofmeister, C.; Egger, M.; Bitar, R.; Wachter-Zeh, A.: Nested Gradient Codes for Straggler Mitigation in Distributed Machine Learning. TUM ICE Workshop Raitenhaslach, 2022.
  • Maßny, L.; Wachter-Zeh, A.: Secure Over-the-Air Federated Learning. Munich Workshop on Coding and Cryptography, 2022.
  • Maßny, L.; Wachter-Zeh, A.: Secure Over-the-Air Federated Learning. IEEE European School of Information Theory, 2022.