
Reading List

This reading list provides a general, though not exhaustive, overview of recent work in the field of Federated Learning. The first few papers (tutorials and reviews) provide an entry point to the field and discuss general methodological and practical challenges. The papers that follow are more specific and constitute important contributions to the field.

Tutorial/review papers

Improving Communication Efficiency in Federated Learning

Federated Learning requires the participating devices to download neural network parametrizations from a centralized server and upload updated parametrizations back to it at frequent intervals. Since these devices often only have access to bandwidth-limited wireless channels, communication overhead is one of the major limiting factors in Federated Learning.
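One widely used remedy is to compress the model updates before transmission. The sketch below illustrates top-k sparsification, one common compression scheme in which each client transmits only the k largest-magnitude entries of its update; the function names and NumPy setting are illustrative and not taken from any particular paper on this list.

```python
import numpy as np

def top_k_sparsify(update, k):
    # Keep only the k largest-magnitude entries of a flat update vector.
    # Transmitting (indices, values) instead of the dense vector shrinks
    # the upload from n floats to k indices plus k floats.
    idx = np.argpartition(np.abs(update), -k)[-k:]
    return idx, update[idx]

def densify(indices, values, n):
    # Server-side reconstruction of the sparse update as a dense vector.
    dense = np.zeros(n)
    dense[indices] = values
    return dense

# Example: compress a 1M-entry update to its top 1% of coordinates.
rng = np.random.default_rng(0)
update = rng.normal(size=1_000_000)
indices, values = top_k_sparsify(update, k=10_000)
restored = densify(indices, values, update.size)
```

In practice the discarded coordinates are often accumulated locally (error feedback) so that no information is permanently lost across rounds.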

Federated Learning in Adversarial Settings

Federated Learning assumes that all participating clients are benign. Several works highlight potential attacks in adversarial settings that have to be anticipated in real-world deployments. Among other effects, these attacks may cause the jointly trained model to diverge, introduce hidden backdoor functionality, or leak sensitive information.
Robustness against adversaries can be improved with more involved aggregation strategies.
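As a concrete illustration, the sketch below replaces the usual weighted mean of client updates with a coordinate-wise median, one well-known robust aggregation rule; the setup and names are illustrative rather than taken from a specific paper.

```python
import numpy as np

def coordinate_wise_median(client_updates):
    # Aggregate by taking the median of each coordinate across clients.
    # Unlike the mean, the median cannot be dragged arbitrarily far
    # by a small minority of malicious updates.
    stacked = np.stack(client_updates)  # shape: (num_clients, num_params)
    return np.median(stacked, axis=0)

# Nine honest clients plus one client submitting a huge poisoned update:
rng = np.random.default_rng(1)
honest = [rng.normal(0.0, 0.1, size=100) for _ in range(9)]
poisoned = [np.full(100, 1e6)]
aggregate = coordinate_wise_median(honest + poisoned)  # stays near zero
```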
In order to provide formal privacy guarantees, Federated Learning needs to be augmented with additional tools such as differential privacy, secure multi-party computation, and homomorphic encryption.
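To make the differential privacy ingredient concrete, the following sketch shows the standard client-side step of DP-FedAvg-style training: clip the update to a bounded L2 norm, then add Gaussian noise. The constants are purely illustrative; in a real deployment the noise scale is derived from a target (epsilon, delta) by a privacy accountant, and the noise may instead be added to the secure aggregate on the server side.

```python
import numpy as np

def privatize_update(update, clip_norm=1.0, noise_multiplier=1.1, rng=None):
    # Clip the update so its L2 norm is at most clip_norm, bounding the
    # influence of any single client, then add calibrated Gaussian noise.
    # clip_norm and noise_multiplier are illustrative placeholder values.
    rng = rng or np.random.default_rng()
    norm = np.linalg.norm(update)
    clipped = update * min(1.0, clip_norm / max(norm, 1e-12))
    noise = rng.normal(0.0, noise_multiplier * clip_norm, size=update.shape)
    return clipped + noise
```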
However, both robust training and formal privacy mechanisms often aggravate the communication overhead. An important open question is whether privacy, robustness, and communication efficiency can be achieved simultaneously; only preliminary work has been done in this direction.

Statistical Challenges

Conventional distributed training theory assumes statistical homogeneity across the training devices. This assumption typically does not hold in the Federated setting, where every client independently generates its own data. This poses new challenges when it comes to analyzing the convergence properties of Federated Learning algorithms.
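A common way to reproduce this heterogeneity in experiments is to partition a dataset among simulated clients according to a Dirichlet distribution over class labels; small concentration parameters yield highly non-IID splits. The sketch below is an illustrative implementation of this standard trick, not code from any listed paper.

```python
import numpy as np

def dirichlet_partition(labels, num_clients, alpha, rng=None):
    # Distribute the samples of each class across clients according to
    # Dirichlet(alpha) proportions. Small alpha -> each client sees only
    # a few classes (non-IID); large alpha -> close to an IID split.
    rng = rng or np.random.default_rng()
    client_indices = [[] for _ in range(num_clients)]
    for c in range(int(labels.max()) + 1):
        idx = np.where(labels == c)[0]
        rng.shuffle(idx)
        proportions = rng.dirichlet(alpha * np.ones(num_clients))
        cuts = (np.cumsum(proportions)[:-1] * len(idx)).astype(int)
        for client, shard in enumerate(np.split(idx, cuts)):
            client_indices[client].extend(shard.tolist())
    return client_indices

# Example: 5000 samples with 10 classes across 10 clients, strong skew.
labels = np.random.default_rng(2).integers(0, 10, size=5000)
shards = dirichlet_partition(labels, num_clients=10, alpha=0.1)
```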

Beyond Learning a Single Model: Federated Meta- and Multi-Task Learning

The heterogeneous nature of Federated environments often makes training a single model difficult or even undesirable. In these situations, Federated Meta- and Multi-Task Learning approaches may offer better utility by providing a personalized model for every client.
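The simplest personalization baseline, and a useful mental model for the meta-learning view, is to fine-tune the jointly trained model on each client's local data. The sketch below does this for a linear least-squares model; the function and variable names are illustrative.

```python
import numpy as np

def local_finetune(global_weights, X, y, lr=0.1, steps=20):
    # Start from the jointly trained weights and take a few gradient
    # steps on the client's own data, yielding a personalized model.
    w = global_weights.copy()
    for _ in range(steps):
        grad = X.T @ (X @ w - y) / len(y)  # least-squares gradient
        w -= lr * grad
    return w

# After federated training, each client runs
#   w_personal = local_finetune(w_global, X_local, y_local)
# and keeps w_personal privately; only w_global is ever shared.
```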

Frameworks

Federated Learning has been implemented in dedicated libraries for both TensorFlow and PyTorch.