Bachelor's Theses
Thesis in Optical Communications and Receiver Design
Description
Please reach out if you are interested in a thesis in any of my research fields. Possible areas include optical communications, particularly physical modeling and nonlinearity mitigation for single-mode fiber, and aspects of receiver design, such as receivers for channels with memory.
A good background in optical communication systems and applied information theory is preferable, but the requirements generally depend on your interests.
Please include a description of your interests and corresponding academic background in your application. If you have a thesis idea, I am happy to discuss your suggestions. Also, I am available to supervise external theses as long as they are in my field of expertise.
Supervisor:
Timeliness and Accuracy in Remote Estimation over Communication Networks
Description
This thesis investigates how to optimally monitor and estimate real-time information from remote systems when communication resources are limited or delayed. In modern networked systems—such as sensor networks, autonomous agents, or industrial control applications—measurements often traverse unreliable or congested channels before reaching a central estimator. This raises a fundamental question: When should data be sampled and transmitted to ensure accurate and timely understanding of the system state?
The thesis will explore strategies that balance the freshness of information (often quantified by metrics like the Age of Information) with the quality of estimation. It will involve studying optimal sampling policies, understanding the impact of transmission delays, and characterizing how these factors affect overall system performance. The student will engage with both theoretical models and simulation tools to gain insights relevant to real-world applications in remote monitoring, edge computing, and cyber-physical systems.
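For intuition, the following minimal Python sketch (illustrative only; the threshold, delay distribution, and horizon are assumed values, not part of the thesis specification) simulates a threshold-based sampling policy for a Wiener process monitored over a channel with random delay, and reports the time-average squared estimation error and the average Age of Information:

import numpy as np

rng = np.random.default_rng(0)
dt, T = 1e-3, 200.0          # time step and horizon (assumed)
beta = 0.5                   # sampling threshold (assumed)
mean_delay = 0.2             # mean channel delay in seconds (assumed)

n = int(T / dt)
w = np.concatenate(([0.0], np.cumsum(rng.normal(0.0, np.sqrt(dt), n - 1))))  # Wiener path

est, est_gen_time = 0.0, 0.0                          # estimator holds the last delivered sample
last_sample, in_flight, deliver_at = 0.0, None, np.inf
mse, aoi, num_samples = 0.0, 0.0, 0

for k in range(n):
    t = k * dt
    if in_flight is not None and t >= deliver_at:     # sample arrives at the estimator
        est, est_gen_time = in_flight
        in_flight, deliver_at = None, np.inf
    if in_flight is None and abs(w[k] - last_sample) >= beta:   # threshold sampling, idle channel
        last_sample = w[k]
        in_flight = (w[k], t)                         # (value, generation time)
        deliver_at = t + rng.exponential(mean_delay)
        num_samples += 1
    mse += (w[k] - est) ** 2 * dt                     # squared estimation error
    aoi += (t - est_gen_time) * dt                    # Age of Information

print(f"samples: {num_samples}, MSE: {mse / T:.3f}, average AoI: {aoi / T:.3f} s")

Sweeping the threshold trades off sampling frequency against estimation error and age, which is precisely the kind of trade-off studied in the thesis.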
Prerequisites
The student should ideally have the following background to work effectively on this thesis:
- Understanding of random processes, in particular the Wiener process (Brownian motion), and basic concepts of stochastic calculus.
- Basic knowledge of optimization techniques, particularly dynamic programming and threshold-based policies.
- Programming skills (e.g., MATLAB, Python)
Contact
houman.asgari@tum.de
Supervisor:
Beyond Shannon: Exploring Rényi Entropy and Its Applications
Description
A foundational concept in information theory is Shannon entropy. However, Shannon entropy does not always provide sufficient flexibility when dealing with various optimization problems, robustness considerations, or scenarios where fine control over uncertainty quantification is required. In 1961, Rényi provided a parametric generalization of Shannon entropy [1], allowing for a more nuanced analysis of information measures.
Rényi entropy has found applications in diverse fields such as hypothesis testing, machine learning, privacy and security (e.g., differential privacy), and statistical physics. The aim of this project is to understand the differences between the Shannon and Rényi versions of entropy, conditional entropy, and divergence, as well as their applications in both theoretical and applied research.
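As a warm-up, the following Python sketch (with an assumed example distribution) evaluates the Rényi entropy H_alpha(p) = (1/(1-alpha)) log2 sum_i p_i^alpha for several orders alpha; it recovers the Shannon entropy in the limit alpha -> 1 and the min-entropy as alpha -> infinity:

import numpy as np

def shannon_entropy(p):
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def renyi_entropy(p, alpha):
    # H_alpha(p) = log2(sum_i p_i^alpha) / (1 - alpha), for alpha > 0, alpha != 1
    if np.isclose(alpha, 1.0):
        return shannon_entropy(p)       # limiting case alpha -> 1
    return np.log2(np.sum(p ** alpha)) / (1.0 - alpha)

p = np.array([0.5, 0.25, 0.125, 0.125])     # example distribution (assumed)
print("Shannon:", shannon_entropy(p))
for a in (0.5, 0.99, 2.0):
    print(f"alpha={a}:", renyi_entropy(p, a))
print("alpha=inf (min-entropy):", -np.log2(p.max()))

The Rényi entropy is non-increasing in alpha, which is one reason it is useful in robustness and security analyses.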
[1] A. Rényi, “On measures of entropy and information,” in Proc. 4th Berkeley Symp. Mathematics, Statistics, and Probability, vol. 1, Berkeley, CA, USA, 1961, pp. 547–561.
Supervisor:
Topics in Intelligent Reflecting Surfaces (IRS), Integrated Sensing and Communication, and Wireless Communication
Description
I may not always have prepared thesis topics available. Please feel free to reach out if you are interested in working on a thesis within any of my research areas.
Prerequisites
- Wireless Communication
- Mobile Communication
- Information Theory
- Multi-User Information Theory
Supervisor:
Thesis in Polar Coding, Probabilistic Shaping, and Applied Information Theory
Description
I may not always have prepared thesis topics available. Please feel free to reach out if you are interested in working on a thesis within any of my research areas.
Supervisor:
Master's Theses
Thesis in Optical Communications and Receiver Design
Description
Please reach out if you are interested in a thesis in any of my research fields. Possible areas include optical communications, particularly physical modeling and nonlinearity mitigation for single-mode fiber, and aspects of receiver design, such as receivers for channels with memory.
A good background in optical communication systems and applied information theory is preferable, but the requirements generally depend on your interests.
Please include a description of your interests and corresponding academic background in your application. If you have a thesis idea, I am happy to discuss your suggestions. Also, I am available to supervise external theses as long as they are in my field of expertise.
Supervisor:
Topics in Intelligent Reflecting Surfaces (IRS), Integrated Sensing and Communication, and Wireless Communication
Description
I may not always have prepared thesis topics available. Please feel free to reach out if you are interested in working on a thesis within any of my research areas.
Prerequisites
- Wireless Communication
- Mobile Communication
- Information Theory
- Multi-User Information Theory
Supervisor:
Measuring Uncertainty in Structured Stochastic Processes
Description
The entropy rate of a stochastic process quantifies the average uncertainty per time step. In the context of Hidden Markov Models (HMMs)—where we observe a sequence of outputs generated by an underlying Markov process—this notion becomes particularly intriguing. While standard Markov chains have a well-defined structure for computing entropy rates, the introduction of a hidden layer complicates matters significantly. The entropy rate of an HMM reflects the long-term unpredictability of its observable outputs, capturing both the randomness of the state transitions and the ambiguity introduced by the observation process.
Understanding the entropy rate of HMMs has broad implications, from speech recognition and bioinformatics to machine learning and signal processing. Despite their practical importance, entropy rates of HMMs are notoriously hard to compute exactly, often requiring approximate methods or asymptotic analysis. This challenge opens up interesting theoretical questions and motivates efficient computational approaches. In this project, we investigate how structure, memory, and noise in the model impact entropy rate, and what this reveals about the information limits of systems modeled by HMMs.
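As a starting point, the following Python sketch (with an assumed two-state model; all parameters are illustrative) contrasts the closed-form entropy rate of a hidden Markov chain with a Monte Carlo estimate of the entropy rate of the HMM's observable output, obtained from the predictive probabilities of the forward recursion:

import numpy as np

rng = np.random.default_rng(1)
P = np.array([[0.9, 0.1],
              [0.2, 0.8]])              # hidden-state transition matrix (assumed)
eps = 0.1                               # observation flip probability (assumed)
E = np.array([[1 - eps, eps],
              [eps, 1 - eps]])          # emission probabilities E[state, symbol]

# stationary distribution and entropy rate of the hidden chain (closed form)
evals, evecs = np.linalg.eig(P.T)
pi = np.real(evecs[:, np.argmax(np.real(evals))])
pi /= pi.sum()
h_hidden = -np.sum(pi[:, None] * P * np.log2(P))

# Monte Carlo estimate of the output entropy rate: h ~ -(1/n) sum_k log2 p(y_k | y_1..y_{k-1})
n = 100_000
x = rng.choice(2, p=pi)
belief = pi.copy()                      # p(x_k | y_1..y_{k-1})
h_obs = 0.0
for _ in range(n):
    y = rng.choice(2, p=E[x])           # noisy observation of the hidden state
    py = belief @ E[:, y]               # predictive probability of y_k
    h_obs -= np.log2(py)
    belief = (belief * E[:, y] / py) @ P    # forward recursion (measurement + time update)
    x = rng.choice(2, p=P[x])           # hidden state transition

print(f"entropy rate of hidden chain: {h_hidden:.4f} bits/step")
print(f"estimated HMM output entropy rate: {h_obs / n:.4f} bits/step")

Because of the noisy observation layer, the output entropy rate generally differs from that of the hidden chain and admits no simple closed form, hence the simulation-based estimate.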
Prerequisites
Information Theory
Supervisor:
Thesis in Polar Coding, Probabilistic Shaping, and Applied Information Theory
Description
I may not always have prepared thesis topics available. Please feel free to reach out if you are interested in working on a thesis within any of my research areas.
Supervisor:
Communication with Coarse Quantization
Description
In cell-free and massive MIMO (multiple-input multiple-output) communication scenarios, the number of power amplifiers (PAs), digital-to-analog converters (DACs), etc., grows large. Using coarsely quantized transmission therefore reduces hardware cost and mitigates nonlinear effects. More details can be found in the attached file.
In this project, we investigate algorithms for mapping modulated data to coarsely quantized signals and compare their information rates.
The student needs an understanding of information theory and communication systems.
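As a toy illustration of the kind of comparison involved (parameters and setup are assumed, not taken from the attached file), the following Python sketch estimates the information rate of a real AWGN channel whose transmit amplitude is constrained to one bit (+/-1, i.e., an ideal 1-bit DAC) and compares it with the unconstrained Gaussian-input capacity at the same SNR:

import numpy as np

rng = np.random.default_rng(2)
n = 200_000
for snr_db in (0.0, 5.0, 10.0):
    snr = 10 ** (snr_db / 10)
    sigma = 1.0 / np.sqrt(snr)
    y = 1.0 + sigma * rng.normal(size=n)        # +1 transmitted over AWGN
    llr = 2.0 * y / sigma ** 2                  # log-likelihood ratio of the binary input
    rate_1bit = 1.0 - np.mean(np.log2(1.0 + np.exp(-llr)))   # Monte Carlo mutual information
    cap = 0.5 * np.log2(1.0 + snr)              # Gaussian-input capacity
    print(f"SNR {snr_db:4.1f} dB: 1-bit input {rate_1bit:.3f} b/use, Gaussian input {cap:.3f} b/use")

The gap between the two rates quantifies the cost of coarse quantization; the project investigates mapping algorithms that reduce this cost for realistic modulation formats.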
Supervisor:
Code Constructions for Burst Metrics
Coding theory, Lattices, Discrete mathematics
Description
This thesis is concerned with developing code constructions for a new weight function (and associated metric) on (Z/qZ)^n called the unit-burst weight, suitable for measuring same-symbol burst errors. A unit burst is a vector whose entries are one on a run of consecutive positions and zero elsewhere.
Any vector v in (Z/qZ)^n can be written as a (not necessarily unique) linear combination of these bursts. The burst weight is then the minimum number of bursts that need to be added or subtracted to produce v.
This metric has a connection to the A_n root lattice, a sublattice of Z^(n+1) consisting of the vectors whose entries sum to zero. More precisely, the unit bursts correspond to the shortest vectors of A_n, called roots, and the burst weight corresponds to the smallest decomposition of a lattice point into roots.
We have already derived some basic properties and algorithms for this new metric and now would like to find some bounds and code constructions achieving those bounds.
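For small parameters, the burst weight can be checked directly from its definition. The following Python sketch (purely for intuition, not one of the algorithms mentioned above) computes it by breadth-first search over (Z/qZ)^n, adding or subtracting one unit burst per step:

from collections import deque

def burst_weight(v, q):
    # Brute-force unit-burst weight of v in (Z/qZ)^n by BFS from the zero vector.
    # Exponential in n and q; only meant to verify small examples.
    n = len(v)
    bursts = [tuple(1 if i <= k <= j else 0 for k in range(n))
              for i in range(n) for j in range(i, n)]
    target = tuple(x % q for x in v)
    dist = {(0,) * n: 0}
    queue = deque([(0,) * n])
    while queue:
        u = queue.popleft()
        if u == target:
            return dist[u]
        for b in bursts:
            for sign in (1, -1):
                w = tuple((u[k] + sign * b[k]) % q for k in range(n))
                if w not in dist:
                    dist[w] = dist[u] + 1
                    queue.append(w)

print(burst_weight([1, 1, 0, 0], 5))   # a single unit burst: weight 1
print(burst_weight([1, 2, 2, 1], 5))   # two overlapping bursts: weight 2
print(burst_weight([1, 0, 4, 0], 5))   # needs a subtracted burst mod q: weight 2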
Contact
anna.baumeister@tum.de, hugo.sauerbier-couvee@tum.de
Supervisor:
Connecting Private Information Retrieval with Private Set Intersection
private information retrieval, private set intersection
Description
Private Information Retrieval (PIR) is the problem of retrieving a desired data item from a server while preventing the server from learning which item was retrieved.
Private Set Intersection (PSI) is the problem of multiple parties comparing their databases and computing the common elements, without revealing any information about the data they do not commonly possess.
The goal of this project is to understand both problems and to compare them. The main target is to design a protocol that can convert any PIR scheme to a PSI scheme and/or a protocol that can convert any PSI scheme to a PIR scheme.
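For intuition, here is a toy Python example of the classical two-server linear PIR idea over a bit database (a textbook illustration, not one of the schemes in the references below). The user queries one server with a uniformly random index set and the other with the same set with the desired index toggled; each query alone reveals nothing about the desired index, while the XOR of the two answers equals the desired bit:

import secrets

def two_server_pir(database, i):
    # Toy 2-server PIR: each server sees a uniformly random subset of indices,
    # so neither learns i on its own (assuming the servers do not collude).
    n = len(database)
    s1 = {j for j in range(n) if secrets.randbits(1)}   # uniformly random subset
    s2 = s1 ^ {i}                                        # same subset with index i toggled

    def answer(s):                                       # a server XORs the queried bits
        return sum(database[j] for j in s) % 2

    return (answer(s1) + answer(s2)) % 2                 # XOR of answers = database[i]

db = [1, 0, 1, 1, 0, 0, 1, 0]
print([two_server_pir(db, i) for i in range(len(db))])   # matches db entry by entry
print(db)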
References:
[1] Z. Wang, K. Banawan, and S. Ulukus, “Multi-Party Private Set Intersection: An Information-Theoretic Approach,” IEEE Journal on Selected Areas in Information Theory, vol. 2, no. 1, pp. 366–379, Mar. 2021, doi: 10.1109/JSAIT.2021.3057597.
[2] Z. Wang, K. Banawan, and S. Ulukus, “Private Set Intersection Using Multi-Message Symmetric Private Information Retrieval,” in Proc. 2020 IEEE International Symposium on Information Theory (ISIT), Jun. 2020, pp. 1035–1040, doi: 10.1109/ISIT44484.2020.9174221.
Prerequisites
Channel Coding
Security in Communications and Storage
Supervisor:
Private and Secure Federated Learning
Description
In federated learning, a machine learning model shall be trained on private user data with the help of a central server, the so-called federator. This setting differs from other machine learning settings in that the user data shall not be shared with the federator for privacy reasons and/or to decrease the communication load of the system.
Even though only intermediate results are shared, extra care is necessary to guarantee data privacy. An additional challenge arises if the system includes malicious users who deviate from the protocol and send corrupted computation results.
The goal of this work is to design, implement and analyze coding- and information-theoretic solutions for privacy and security in federated learning.
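As a minimal illustration of one ingredient of such solutions (toy data and a simplified setup are assumed), the following Python sketch shows federated averaging with pairwise additive masking in the spirit of secure aggregation: each pair of users agrees on a random mask that one adds and the other subtracts, so the federator can recover the average update without seeing any individual update in the clear:

import numpy as np

rng = np.random.default_rng(4)
num_users, dim = 4, 3
updates = [rng.normal(size=dim) for _ in range(num_users)]   # local model updates (toy data)

masked = [u.copy() for u in updates]
for u in range(num_users):
    for v in range(u + 1, num_users):
        m = rng.normal(size=dim)     # pairwise mask, assumed agreed upon out of band
        masked[u] += m               # user u adds the mask
        masked[v] -= m               # user v subtracts it, so it cancels in the sum

aggregate = sum(masked) / num_users  # what the federator computes from the masked updates
true_avg = sum(updates) / num_users
print(np.allclose(aggregate, true_avg))   # True: only the aggregate is revealed

Handling users that drop out or send corrupted masked updates is exactly where the coding- and information-theoretic techniques of this work come into play.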
Prerequisites
- Information Theory
- Coding Theory (e.g., Channel Coding)
- Machine Learning (Theory and Practice)
Supervisor:
Research Internships (Forschungspraxis)
Thesis in Optical Communications and Receiver Design
Description
Please reach out if you are interested in a thesis in any of my research fields. Possible areas include optical communications, particularly physical modeling and nonlinearity mitigation for single-mode fiber, and aspects of receiver design, such as receivers for channels with memory.
A good background in optical communication systems and applied information theory is preferable, but the requirements generally depend on your interests.
Please include a description of your interests and corresponding academic background in your application. If you have a thesis idea, I am happy to discuss your suggestions. Also, I am available to supervise external theses as long as they are in my field of expertise.
Supervisor:
Beyond Shannon: Exploring Rényi Entropy and Its Applications
Description
A foundational concept in information theory is Shannon entropy. However, Shannon entropy does not always provide sufficient flexibility when dealing with various optimization problems, robustness considerations, or scenarios where fine control over uncertainty quantification is required. In 1961, Rényi provided a parametric generalization of Shannon entropy [1], allowing for a more nuanced analysis of information measures.
Rényi entropy has found applications in diverse fields such as hypothesis testing, machine learning, privacy and security (e.g., differential privacy), and statistical physics. The aim of this project is to understand the differences between the Shannon and Rényi versions of entropy, conditional entropy, and divergence, as well as their applications in both theoretical and applied research.
[1] A. Rényi, “On measures of entropy and information,” in Proc. 4th Berkeley Symp. Mathematics, Statistics, and Probability, vol. 1, Berkeley, CA, USA, 1961, pp. 547–561.
Supervisor:
Topics in Intelligent Reflecting Surfaces (IRS), Integrated Sensing and Communication, and Wireless Communication
Description
I may not always have prepared thesis topics available. Please feel free to reach out if you are interested in working on a thesis within any of my research areas.
Prerequisites
- Wireless Communication
- Mobile Communication
- Information Theory
- Multi-User Information Theory
Supervisor:
Thesis in Polar Coding, Probabilistic Shaping, and Applied Information Theory
Description
I may not always have prepared thesis topics available. Please feel free to reach out if you are interested in working on a thesis within any of my research areas.
Supervisor:
Code Constructions for Burst Metrics
Coding theory, Lattices, Discrete mathematics
Description
This thesis is concerned with developing code constructions for a new weight function (and associated metric) on (Z/qZ)^n called the unit-burst weight, suitable for measuring same-symbol burst errors. A unit burst is a vector whose entries are one on a run of consecutive positions and zero elsewhere.
Any vector v in (Z/qZ)^n can be written as a (not necessarily unique) linear combination of these bursts. The burst weight is then the minimum number of bursts that need to be added or subtracted to produce v.
This metric has a connection to the A_n root lattice, a sublattice of Z^(n+1) consisting of the vectors whose entries sum to zero. More precisely, the unit bursts correspond to the shortest vectors of A_n, called roots, and the burst weight corresponds to the smallest decomposition of a lattice point into roots.
We have already derived some basic properties and algorithms for this new metric and now would like to find some bounds and code constructions achieving those bounds.
Contact
anna.baumeister@tum.de, hugo.sauerbier-couvee@tum.de
Supervisor:
Connecting Private Information Retrieval with Private Set Intersection
private information retrieval, private set intersection
Description
Private Information Retrieval (PIR) is the problem of retrieving a desired data item from a server while preventing the server from learning which item was retrieved.
Private Set Intersection (PSI) is the problem of multiple parties comparing their databases and computing the common elements, without revealing any information about the data they do not commonly possess.
The goal of this project is to understand both problems and to compare them. The main target is to design a protocol that can convert any PIR scheme to a PSI scheme and/or a protocol that can convert any PSI scheme to a PIR scheme.
References:
[1] Z. Wang, K. Banawan, and S. Ulukus, “Multi-Party Private Set Intersection: An Information-Theoretic Approach,” IEEE Journal on Selected Areas in Information Theory, vol. 2, no. 1, pp. 366–379, Mar. 2021, doi: 10.1109/JSAIT.2021.3057597.
[2] Z. Wang, K. Banawan, and S. Ulukus, “Private Set Intersection Using Multi-Message Symmetric Private Information Retrieval,” in Proc. 2020 IEEE International Symposium on Information Theory (ISIT), Jun. 2020, pp. 1035–1040, doi: 10.1109/ISIT44484.2020.9174221.
Prerequisites
Channel Coding
Security in Communications and Storage
Supervisor:
Private and Secure Federated Learning
Description
In federated learning, a machine learning model shall be trained on private user data with the help of a central server, the so-called federator. This setting differs from other machine learning settings in that the user data shall not be shared with the federator for privacy reasons and/or to decrease the communication load of the system.
Even though only intermediate results are shared, extra care is necessary to guarantee data privacy. An additional challenge arises if the system includes malicious users who deviate from the protocol and send corrupted computation results.
The goal of this work is to design, implement and analyze coding- and information-theoretic solutions for privacy and security in federated learning.
Prerequisites
- Information Theory
- Coding Theory (e.g., Channel Coding)
- Machine Learning (Theory and Practice)
Supervisor:
Internships
Beyond Shannon: Exploring Rényi Entropy and Its Applications
Description
A foundational concept in information theory is Shannon entropy. However, Shannon entropy does not always provide sufficient flexibility when dealing with various optimization problems, robustness considerations, or scenarios where fine control over uncertainty quantification is required. In 1961, Rényi provided a parametric generalization of Shannon entropy [1], allowing for a more nuanced analysis of information measures.
Rényi entropy has found applications in diverse fields such as hypothesis testing, machine learning, privacy and security (e.g., differential privacy), and statistical physics. The aim of this project is to understand the differences between the Shannon and Rényi versions of entropy, conditional entropy, and divergence, as well as their applications in both theoretical and applied research.
[1] A. Rényi, “On measures of entropy and information,” in Proc. 4th Berkeley Symp. Mathematics, Statistics, and Probability, vol. 1, Berkeley, CA, USA, 1961, pp. 547–561.
Supervisor:
Thesis in Polar Coding, Probabilistic Shaping, and Applied Information Theory
Description
I may not always have prepared thesis topics available. Please feel free to reach out if you are interested in working on a thesis within any of my research areas.