Upcoming Events
2021-09-19 Special Session on "Neural Network Compression and Compact Deep Features: From Methods to Standards" at
IEEE ICIP 2021 in Anchorage, USA.
Past Events
2020-12-11 Tutorial on "Distributed Deep Learning: Concepts, Methods & Applications in Wireless Networks" at
IEEE GLOBECOM 2020 in Taipei, Taiwan.
2020-06-15 Workshop on "Efficient Deep Learning for Computer Vision" at
IEEE CVPR 2020 in Seattle, USA.
2020-05-05 Special Session on "Distributed Machine Learning on Wireless Networks" at
IEEE ICASSP 2020 in Barcelona, Spain.
2020-05-04 Tutorial on "Distributed and Efficient Deep Learning" at
IEEE ICASSP 2020 in Barcelona, Spain.
This webpage collects the publications and software produced as part of a project at Fraunhofer HHI on developing new methods for federated and efficient deep learning.
Why Neural Network Compression?
State-of-the-art machine learning models such as deep neural networks are known to work extremely well in practice. However, since training and executing these models requires extensive computational resources, they may not be applicable in communication systems with limited storage capacity, computational power, and energy resources, e.g., smartphones, embedded systems, or IoT devices.
Our research addresses this problem and focuses on the development of techniques for reducing the complexity and increasing the execution efficiency of deep neural networks.
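To make the compression idea concrete, below is a minimal Python sketch of uniform post-training weight quantization, one basic building block of such techniques. The function name, the 4-bit setting, and the NumPy-based interface are illustrative assumptions, not the project's actual pipeline (DeepCABAC, for instance, additionally entropy-codes the quantized values).

    import numpy as np

    def quantize_uniform(weights, num_bits=4):
        # Toy uniform quantizer: maps float weights onto 2**num_bits
        # evenly spaced levels between the tensor's min and max.
        # Real codecs such as DeepCABAC also entropy-code the result.
        w_min, w_max = weights.min(), weights.max()
        step = (w_max - w_min) / (2 ** num_bits - 1)
        indices = np.round((weights - w_min) / step)
        return w_min + indices * step  # dequantized (lossy) weights

    # Example: the reconstruction error stays small, while the weight
    # alphabet shrinks from float32 values to just 16 symbols.
    w = np.random.randn(256, 256).astype(np.float32)
    w_q = quantize_uniform(w)
    print("levels used:", np.unique(w_q).size, "max error:", np.abs(w - w_q).max())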
Why Federated Learning?
Large deep neural networks are trained on huge data corpora. Therefore, distributed training schemes are becoming increasingly relevant. A major issue in distributed training is the limited communication bandwidth between contributing nodes or prohibitive communication cost in general.
In our research we investigate new methods for reducing the communication cost of distributed training. These include communication delay, gradient sparsification, and optimal encoding of weight updates. Our results show that upstream communication can be reduced by more than four orders of magnitude without significantly harming convergence speed.
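As a minimal sketch of the sparsification idea, the following Python snippet keeps only the largest-magnitude fraction of a gradient tensor before transmission. The function name, the chosen fraction, and the NumPy interface are illustrative assumptions rather than the project's actual Sparse Binary Compression implementation, which additionally binarizes and encodes the surviving values.

    import numpy as np

    def sparsify_top_k(gradient, fraction=0.001):
        # Keep only the k largest-magnitude entries; a client would
        # then transmit just the surviving (index, value) pairs.
        flat = gradient.ravel()
        k = max(1, int(flat.size * fraction))
        top_idx = np.argpartition(np.abs(flat), -k)[-k:]
        sparse = np.zeros_like(flat)
        sparse[top_idx] = flat[top_idx]
        return sparse.reshape(gradient.shape)

    g = np.random.randn(1000, 1000).astype(np.float32)
    g_sparse = sparsify_top_k(g)
    print("nonzero entries:", np.count_nonzero(g_sparse))  # ~0.1% of 10^6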
Software
- DeepCABAC: A universal tool for neural network compression (software)
- Robust and Communication-Efficient Federated Learning from Non-IID Data (software)
- Clustered Federated Learning (software)
Tutorials
- "Recent Advances in Federated Learning for Communication," ITU AI/ML in 5G Challenge, online event, 2020.
- "DeepCABAC: A Universal Compression Algorithm for Deep Neural Networks," 6th Workshop on Energy Efficient Machine Learning and Cognitive Computing, online event, 2020.
- "A Universal Compression Algorithm for Deep Neural Networks," AI for Good Global Summit 2020, Geneva, Switzerland, 2020.
- "Distributed and Efficient Deep Learning," IEEE ICASSP 2020 Tutorial, Barcelona, Spain, 2020.
Slides Part 1 | Slides Part 2
Publications
Efficient Deep Learning
- S Wiedemann, S Shivapakash, P Wiedemann, D Becking, W Samek, F Gerfers, T Wiegand. FantastIC4: A Hardware-Software Co-Design Approach for Efficiently Running 4bit-Compact Multilayer Perceptrons
arXiv:2012.11331, 2020
[bibtex] [preprint]
- S Yeom, P Seegerer, S Lapuschkin, S Wiedemann, KR Müller, W Samek. Pruning by Explaining: A Novel Criterion for Deep Neural Network Pruning
arXiv:1912.08881, 2019
[bibtex] [preprint]
- S Wiedemann, H Kirchhoffer, S Matlage, P Haase, A Marban, T Marinc, D Neumann, T Nguyen, A Osman, H Schwarz, D Marpe, T Wiegand, W Samek. DeepCABAC: A Universal Compression Algorithm for Deep Neural Networks
IEEE Journal of Selected Topics in Signal Processing, 14(4):700-714, 2020
[bibtex] [preprint] [code]
- S Wiedemann, KR Müller, W Samek. Compact and Computationally Efficient Representation of Deep Neural Networks
IEEE Transactions on Neural Networks and Learning Systems, 31(3):772-785, 2020
[bibtex] [preprint]
- S Wiedemann, T Mehari, K Kepp, W Samek. Dithered backprop: A sparse and quantized backpropagation algorithm for more efficient deep neural network training
Proceedings of the CVPR'20 Joint Workshop on Efficient Deep Learning in Computer Vision, 3096-3104, 2020
[bibtex] [preprint]
- A Marban, D Becking, S Wiedemann, W Samek. Learning Sparse & Ternary Neural Networks with Entropy-Constrained Trained Ternarization (EC2T)
Proceedings of the CVPR'20 Joint Workshop on Efficient Deep Learning in Computer Vision, 3105-3113, 2020
[bibtex] [preprint]
- P Haase, H Schwarz, H Kirchhoffer, S Wiedemann, T Marinc, A Marban, K Müller, W Samek, D Marpe, T Wiegand. Dependent Scalar Quantization for Neural Network Compression
Proceedings of the IEEE International Conference on Image Processing (ICIP), 36-40, 2020
[bibtex] [preprint]
- S Wiedemann, A Marban, KR Müller, W Samek. Entropy-Constrained Training of Deep Neural Networks
Proceedings of the IEEE International Joint Conference on Neural Networks (IJCNN), 1-8, 2019
[bibtex] [preprint]
- S Wiedemann, KR Müller, W Samek. Compact and Computationally Efficient Representation of Deep Neural Networks
NIPS Workshop on Compact Deep Neural Network Representation with Industrial Applications (CDNNRIA), 1-8, 2018
[bibtex] [preprint]
Federated Learning
- F Sattler, T Korjakow, R Rischke, W Samek. FedAUX: Leveraging Unlabeled Auxiliary Data in Federated Learning
arXiv:2102.02514, 2021
[bibtex] [preprint]
- F Sattler, A Marban, R Rischke, W Samek. Communication-Efficient Federated Distillation
arXiv:2009.11732, 2020
[bibtex] [preprint]
- F Sattler, T Wiegand, W Samek. Trends and Advancements in Deep Neural Network Communication
ITU Journal: ICT Discoveries, 3(1), 2020
[bibtex] [preprint]
- F Sattler, KR Müller, W Samek. Clustered Federated Learning: Model-Agnostic Distributed Multi-Task Optimization under Privacy Constraints
IEEE Transactions on Neural Networks and Learning Systems, 2020
[bibtex] [preprint] [supplement]
- F Sattler, S Wiedemann, KR Müller, W Samek. Robust and Communication-Efficient Federated Learning from Non-IID Data
IEEE Transactions on Neural Networks and Learning Systems, 31(9):3400-3413, 2020
[bibtex] [preprint]
- D Neumann, F Sattler, H Kirchhoffer, S Wiedemann, K Müller, H Schwarz, T Wiegand, D Marpe, W Samek. DeepCABAC: Plug&Play Compression of Neural Network Weights and Weight Updates
Proceedings of the IEEE International Conference on Image Processing (ICIP), 21-25, 2020
[bibtex] [preprint]
- F Sattler, KR Müller, W Samek. Clustered Federated Learning
Proceedings of the NeurIPS'19 Workshop on Federated Learning for Data Privacy and Confidentiality, 1-5, 2019
[bibtex] [preprint]
- F Sattler, KR Müller, T Wiegand, W Samek. On the Byzantine Robustness of Clustered Federated Learning
Proceedings of the IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 8861-8865, 2020
[bibtex] [preprint]
- F Sattler, S Wiedemann, KR Müller, W Samek. Sparse Binary Compression: Towards Distributed Deep Learning with minimal Communication
Proceedings of the IEEE International Joint Conference on Neural Networks (IJCNN), 1-8, 2019
[bibtex] [preprint]
Contributions to Standardization
- D Becking, H Kirchhoffer, K Müller, W Samek, D Marpe, H Schwarz, T Wiegand. [NNR] HLS for additional framework support.
ISO/IEC JTC1/SC29/WG4 MPEG2020/m55067, 2020.
- H Kirchhoffer, K Müller, W Samek, D Marpe, H Schwarz, T Wiegand. [NNR] Committee draft cleanups, improvements, and bug fixes.
ISO/IEC JTC1/SC29/WG4 MPEG2020/m55068, 2020.
- P Haase, D Becking, H Kirchhoffer, K Müller, W Samek, D Marpe, H Schwarz, T Wiegand. [NNR] CE4 method 19: Results on QP optimizations.
ISO/IEC JTC1/SC29/WG4 MPEG2020/m55073, 2020.
- H Kirchhoffer, P Haase, S Wiedemann, T Marinc, K Müller, W Samek, H Schwarz, D Marpe, T Wiegand. [NNR] CE4: Results on parameter optimization for DeepCABAC (method 18) and local scaling adaptation (method 19).
ISO/IEC JTC1/SC29/WG11 MPEG2020/m54395, 2020.
- P Haase, H Kirchhoffer, S Wiedemann, T Marinc, K Müller, W Samek, H Schwarz, D Marpe, T Wiegand. [NNR] CE2-CE3: Results on dependent scalar quantization.
ISO/IEC JTC1/SC29/WG11 MPEG2020/m53514, 2020.
- P Haase, H Kirchhoffer, S Wiedemann, T Marinc, K Müller, W Samek, H Schwarz, D Marpe, T Wiegand. [NNR] CE3-related: Parameter-Optimization for DeepCABAC.
ISO/IEC JTC1/SC29/WG11 MPEG2020/m53515, 2020.
- S Wiedemann, P Haase, H Kirchhoffer, T Marinc, K Müller, W Samek, H Schwarz, D Marpe, T Wiegand. [NNR] CE2: Results on importance-weighted quantization.
ISO/IEC JTC1/SC29/WG11 MPEG2020/m53516, 2020.
- S Wiedemann, P Haase, H Kirchhoffer, T Marinc, K Müller, W Samek, H Schwarz, D Marpe, T Wiegand. [NNR] CE2-CE3-related: Local parameter scaling.
ISO/IEC JTC1/SC29/WG11 MPEG2020/m53517, 2020.
- K Müller, H Kirchhoffer, T Marinc, S Wiedemann, H Schwarz, W Samek, D Marpe, T Wiegand. [NNR] Additional HLS and decoding process specification for Neural Network Compression (ISO/IEC 15938-17).
ISO/IEC JTC1/SC29/WG11 MPEG2020/m53518, 2020.
- F Sattler, D Neumann, S Wiedemann, K Müller, W Samek, D Marpe, T Wiegand. [NNR] Test Data, Evaluation Framework and Results for NNU / Federated Learning Use Cases.
ISO/IEC JTC1/SC29/WG11 MPEG2020/m52375, 2020.
- K Müller, R Skupin, Y Sanchez, S Wiedemann, H Kirchhoffer, H Schwarz, W Samek, D Marpe, T Wiegand. [NNR] Basic High-Level Syntax for Neural Network Compression (ISO/IEC 15938-17, i.e. MPEG-7 part 17).
ISO/IEC JTC1/SC29/WG11 MPEG2020/m52352, 2020.
- P Haase, H Schwarz, H Kirchhoffer, S Wiedemann, S Matlage, T Marinc, A Marban, K Müller, W Samek, D Marpe, T Wiegand. [NNR] CE2-related: Dependent scalar quantization for neural network parameter approximation.
ISO/IEC JTC1/SC29/WG11 MPEG2020/m52358, 2020.
- S Wiedemann, H Kirchhoffer, S Matlage, P Haase, A Marban, T Marinc, D Neumann, H Schwarz, D Marpe, W Samek, T Wiegand. Report of CEs results.
ISO/IEC JTC1/SC29/WG11 MPEG2019/m48662, 2019.
- S Wiedemann, H Kirchhoffer, W Samek, K Müller, D Marpe, H Schwarz, T Wiegand. Proposal of python interfaces for an NNR test model.
ISO/IEC JTC1/SC29/WG11 MPEG2019/m49867, 2019.
- S Wiedemann, H Kirchhoffer, S Matlage, P Haase, A Marban, T Marinc, D Neumann, A Osman, H Schwarz, D Marpe, W Samek, T Wiegand. Response to the Call for Proposals on Neural Network Compression: End-to-end processing pipeline for highly compressible neural networks.
ISO/IEC JTC1/SC29/WG11 MPEG2019/m47698, 2019.
- W Samek, S Wiedemann. Efficient representations of neural networks.
Focus Group on Machine Learning for Future Networks including 5G, no. ML5G-I-102, 2018.
- W Samek, S Wiedemann, S Stanczak, T Wiegand. Data Formats and Specifications for Efficient Machine Learning in Communications.
Focus Group on Machine Learning for Future Networks including 5G, no. ML5G-I-013, 2017.