Federated Learning
By: Miloudi Amara
Federated learning
Federated learning (FL) is a framework used to train a shared global
model on data across distributed devices, while the data never leaves
any device at any point.
Underlying Architecture
Central Server: the entity responsible for managing the connections between the
entities in the FL environment and for aggregating the knowledge acquired by the FL
clients;
Parties (Clients): all computing devices with data that can be used for training the
global model, including but not limited to: personal computers, servers, smartphones,
smartwatches, computerized sensor devices, and many more;
Communication Framework: consists of the tools and devices used to connect servers
and parties and can vary between an internal network, an intranet, or even the Internet;
Aggregation Algorithm: the entity responsible for aggregating the knowledge obtained
by the parties after training on their local data and for using the aggregated knowledge to
update the global model (a toy sketch of how these four components interact follows the list).
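The toy sketch referenced above: one FL round touching all four components, with a NumPy vector standing in for the model. All names here are illustrative assumptions, not from any framework.

```python
import numpy as np

def client_update(global_weights, lr=0.1):
    """Party side: one step of 'local training' (a random stand-in gradient)."""
    fake_gradient = np.random.randn(*global_weights.shape)
    return global_weights - lr * fake_gradient   # raw data never leaves the party

def aggregate(client_weights):
    """Aggregation algorithm: plain averaging of the received updates."""
    return np.mean(client_weights, axis=0)

# Central server: owns the global model and drives the communication rounds.
global_weights = np.zeros(10)
for round_num in range(3):
    # Communication framework: in reality gRPC/HTTP; here just function calls.
    updates = [client_update(global_weights) for _ in range(5)]  # 5 parties
    global_weights = aggregate(updates)
```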
The learning process
Exchanging Models, Parameters, or Gradients
Exchanging Models: This is the classical approach, where models are exchanged
between server and clients.
Exchanging Gradients: Instead of submitting the entire model to the server, clients in
this method submit only the gradients they compute locally.
Exchanging Model Parameters: This concept is mainly tied to neural networks, where
model parameters and weights are usually used interchangeably; here, clients submit
their updated parameter values rather than a full model object.
Hybrid Approaches: Two or more of the above methods can be combined to form a
hybrid strategy suited to a given application or environment (a sketch contrasting
the exchange modes follows).
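A small illustrative sketch of the two main exchange modes, with random numbers standing in for real training; everything here is an assumption for demonstration.

```python
import numpy as np

w_global = np.zeros(4)

# (a) Exchanging models/parameters: the client sends back its full set of
#     locally trained weights (a random step stands in for training).
w_local = w_global - 0.1 * np.random.randn(4)
server_receives = w_local                      # the whole model travels

# (b) Exchanging gradients: the client sends only its locally computed
#     gradient, and the server applies the optimization step itself.
local_gradient = np.random.randn(4)            # stand-in for a real gradient
w_global = w_global - 0.1 * local_gradient     # server-side update
```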
The categorization of federated learning
➔ Federated learning enables collaborative model training without sharing
raw data.
➔ It is categorized by how client datasets overlap in features and users:
◆ Horizontal Federated Learning (HFL).
◆ Vertical Federated Learning (VFL).
◆ Federated Transfer Learning (FTL).
The categorization of federated learning
Horizontal federated learning
➔ Shared features, different users: Clients have the same set of features.
➔ Focus: Leveraging the diversity of users with the same data structure to enhance model accuracy and generalization.
➔ Example: Multiple banks training a fraud detection model using transaction data (shared features) from different customers (different users).
The categorization of federated learning
Vertical federated learning
➔ Different features, overlapping users: Clients hold different feature sets, though some features may overlap.
➔ Focus: Combining data from participants with complementary information while protecting sensitive features.
➔ Example: Hospitals and insurance companies collaborating on healthcare predictions using medical records (hospital data) and policy data (insurance data), joined through overlapping identifiers such as patient IDs.
The categorization of federated learning
Federated transfer learning
➔ Leveraging pre-trained knowledge: Uses a pre-trained model to guide learning on a new task or on data with different characteristics.
➔ Focus: Accelerating learning on new tasks or data with limited resources, especially when privacy concerns restrict model sharing.
➔ Example: Using a sentiment analysis model trained on public product reviews to personalize recommendations within a specific e-commerce domain.
The categorization of federated learning
What are the key differences?

Features         | Horizontal FL            | Vertical FL                 | Transfer FL
Feature overlap  | High                     | Low / Partial               | None
User overlap     | Low                      | High                        | Varies
Focus            | Data diversity, accuracy | Shared information, privacy | Knowledge transfer
Synchronous vs. Asynchronous Federated Learning
Synchronous federated learning
The server updates the shared central model only after all the participating
devices have sent their model updates.
Example: Federated Averaging (FedAvg); a minimal sketch follows.
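A minimal sketch of FedAvg's server-side step, assuming model weights are NumPy vectors: each client's weights enter the average with a coefficient proportional to its local dataset size.

```python
import numpy as np

def fedavg(client_weights, client_sizes):
    """Average client weight vectors, weighted by local dataset sizes."""
    coeffs = np.array(client_sizes, dtype=float) / sum(client_sizes)  # n_i / n
    return (coeffs[:, None] * np.stack(client_weights)).sum(axis=0)

# The client holding 300 examples contributes 3x the weight of the one with 100.
w = fedavg([np.ones(3), 3 * np.ones(3)], client_sizes=[100, 300])
print(w)   # [2.5 2.5 2.5]
```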
Synchronous vs. Asynchronous Federated Learning
Synchronous federated learning
➔ This approach offers several advantages:
◆ Faster convergence: Synchronization
leads to quicker convergence towards
a more accurate global model.
◆ Better accuracy: The coordinated
updates can result in higher model
accuracy compared to asynchronous
methods.
◆ Reduced staleness: Updates are
always fresh, mitigating the issue of
outdated gradients.
Synchronous vs. Asynchronous Federated Learning
Synchronous federated learning
➔ However, synchronous federated learning
also faces some challenges:
◆ Increased communication overhead:
All devices need to communicate with
the server at every step, leading to
higher bandwidth requirements.
◆ Higher synchronization latency:
Waiting for the slowest device can
introduce delays in the training
process.
◆ Potential bottlenecks: Stragglers (slow or unresponsive devices) can stall the entire training round.
Synchronous vs. Asynchronous Federated Learning
Asynchronous federated learning
The server updates the shared central model as new updates keep coming in.
Examples: SMPC aggregation, secure aggregation with a Trusted Execution
Environment (TEE).
Synchronous vs. Asynchronous Federated Learning
Asynchronous federated learning
➔ This approach offers several advantages:
◆ Relaxed communication
requirements: Devices can update
the model whenever convenient,
reducing communication overhead.
◆ Improved scalability: Asynchronous
learning can handle a large number
of devices more efficiently.
◆ Fault tolerance: The system is more
resilient to device failures or
intermittent connections.
Synchronous vs. Asynchronous Federated Learning
Asynchronous federated learning
➔ However, asynchronous federated
learning also faces some challenges:
◆ Stale gradients: Updates from devices may become outdated
before reaching the server, impacting accuracy (one common
mitigation is sketched after this list).
◆ Slower convergence: The lack of
synchronization can slow down the
overall training process.
◆ Potential for divergence: Individual
models on devices may diverge
significantly from the global model.
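The mitigation promised above: shrink each update's mixing weight with its staleness, in the spirit of FedAsync (Xie et al., "Asynchronous Federated Optimization"). This is a hedged sketch under assumed hyperparameters, not the exact published algorithm.

```python
import numpy as np

def async_update(w_global, w_client, staleness, alpha=0.5):
    """Mix an incoming client model into the global model as it arrives,
    shrinking the mixing weight for stale updates."""
    mix = alpha / (1.0 + staleness)
    return (1.0 - mix) * w_global + mix * w_client

w = np.zeros(3)
w = async_update(w, np.ones(3), staleness=0)   # fresh update: mix = 0.5
w = async_update(w, np.ones(3), staleness=4)   # stale update: mix = 0.1
```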
What Is Aggregation in FL?
● Aggregation methods vary, each with unique advantages and
challenges.
○ Beyond model updates, aggregate statistical indicators
(loss, accuracy).
○ Hierarchical aggregation for large-scale FL systems.
● Aggregation algorithms are crucial for FL success.
○ Determine model training effectiveness.
○ Impact practical usability of the global model.
Different Approaches of Aggregation
➔ Average Aggregation
➔ Clipped Average Aggregation
➔ Secure Aggregation
➔ Differential Privacy Average Aggregation
➔ Momentum Aggregation
➔ Weighted Aggregation
➔ Bayesian Aggregation
➔ Adversarial Aggregation
➔ Quantization Aggregation
➔ Hierarchical Aggregation
➔ Personalized Aggregation
➔ Ensemble-based Aggregation
Averaging Aggregation
This is the original and most widely known approach: the server combines the received
messages, whether they are model updates, parameters, or gradients, by computing their
average value.
If the set of participating clients is denoted by N and their updates by w_i, the
aggregate update w is calculated as follows:
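(Reconstructed; the slide's formula image did not survive extraction, and the plain average is assumed from the definitions above.)

w = \frac{1}{|N|} \sum_{i \in N} w_i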
Clipped Averaging Aggregation
This method is similar to average aggregation, but with an additional step: the model
updates are clipped to a predefined range before averaging.
This helps reduce the impact of outliers and of malicious clients that may transmit
abnormally large updates.
clip(x, c) is a function that clips the values of x to the range [−c, c], where c is
the clipping threshold:
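(Reconstructed; the slide's formula image did not survive extraction.)

w = \frac{1}{|N|} \sum_{i \in N} \mathrm{clip}(w_i, c), \qquad \mathrm{clip}(x, c) = \max(-c, \min(x, c)) \ \text{applied elementwise}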
Secure Aggregation
➔ Techniques such as homomorphic encryption, secure multiparty computation,
and secure enclaves make the aggregation process more secure and private.
These methods can ensure that client data remain confidential during the
aggregation process, which is critical in environments where data privacy is a
high priority.
➔ Secure aggregation is the result of integrating such security techniques with one of
the available aggregation algorithms to create a new, secure algorithm. One of
the most popular secure aggregation algorithms is the differential privacy
aggregation algorithm.
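To make the masking idea concrete, here is a toy pairwise-masking demo in the spirit of Bonawitz et al. (2017). It is a sketch only: the real protocol's key agreement, secret sharing, and dropout handling are all omitted.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 3
updates = [rng.normal(size=3) for _ in range(n)]        # true client updates

# Every pair (i, j) with i < j shares a random mask m_ij: client i adds it,
# client j subtracts it, so every mask cancels in the server's sum.
masks = {(i, j): rng.normal(size=3) for i in range(n) for j in range(i + 1, n)}

masked = []
for i in range(n):
    x = updates[i].copy()
    for j in range(n):
        if i < j:
            x += masks[(i, j)]
        elif j < i:
            x -= masks[(j, i)]
    masked.append(x)                                    # all the server sees

# The server recovers the exact aggregate without seeing any single update.
assert np.allclose(sum(masked), sum(updates))
```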
Differential Privacy Average Aggregation
This approach adds a layer of differential privacy to the aggregation process to ensure
the confidentiality of client data. Each client adds random noise to its model update
before sending it to the server, and the server builds the final model by aggregating
the noisy updates.
● n_i is a random noise vector drawn from a Laplace distribution;
● b is a privacy budget parameter.
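(Reconstructed; the slide's formula image did not survive extraction, and b is assumed to act as the Laplace scale.)

w = \frac{1}{|N|} \sum_{i \in N} (w_i + n_i), \qquad n_i \sim \mathrm{Laplace}(0, b)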
Momentum Aggregation
This strategy addresses the slow-convergence problem in federated learning. Each
client stores a "momentum" term that captures the direction of its past model changes.
Before a new update is sent to the server, the momentum term is folded into the update.
The server aggregates these momentum-enriched updates to build the final model,
which can speed up convergence.
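A minimal sketch of the client-side bookkeeping described above. The slide does not pin down exact update rules, so the decay factor beta and the send logic are assumptions.

```python
import numpy as np

class MomentumClient:
    """Keeps a running momentum of past model changes (a hedged sketch)."""
    def __init__(self, dim, beta=0.9):
        self.beta = beta
        self.velocity = np.zeros(dim)   # direction of past model changes

    def make_update(self, raw_delta):
        # Fold the new local change into the remembered direction, then
        # send the momentum-smoothed update to the server.
        self.velocity = self.beta * self.velocity + raw_delta
        return self.velocity
```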
Weighted Aggregation
In this method, the server weights each client's contribution to the final model update
depending on client performance or on other parameters such as the client's device type,
the quality of its network connection, or the similarity of its data to the global data
distribution. This can give more weight to clients that are more reliable or
representative, improving the overall accuracy of the model.
N denotes the set of participating clients, w_i their model updates, and a_i their
corresponding weights.
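(Reconstructed; the slide's formula image did not survive extraction, and the weights are assumed normalized.)

w = \sum_{i \in N} a_i w_i, \qquad \sum_{i \in N} a_i = 1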
Bayesian Aggregation
In this approach, the server aggregates model updates from multiple clients using Bayesian
inference, which accounts for uncertainty in the model parameters. This can help reduce
overfitting and improve the generalizability of the model.
Adversarial Aggregation
In this method, the server applies a number of techniques to detect and mitigate the impact
of clients submitting fraudulent model updates. These may include outlier rejection,
model-based anomaly detection, and secure enclaves.
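One concrete technique from this family (not named on the slide) is coordinate-wise median aggregation, which resists a single client submitting extreme values. A minimal sketch:

```python
import numpy as np

honest = [np.ones(4), 1.1 * np.ones(4), 0.9 * np.ones(4)]
poisoned = [100.0 * np.ones(4)]                               # fraudulent update

mean_agg = np.mean(honest + poisoned, axis=0)                 # dragged toward 100
median_agg = np.median(np.stack(honest + poisoned), axis=0)   # stays near 1

print(mean_agg[0], median_agg[0])   # 25.75 vs 1.05
```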
Quantization Aggregation
In this approach, model updates are quantized to a lower-bit representation before being
sent to the server for aggregation. This reduces the amount of data to be transmitted and
improves communication efficiency.
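A minimal 8-bit linear quantization sketch (illustrative only; real schemes, e.g. with stochastic rounding, are more careful): the client sends int8 values plus one float scale, roughly a 4x traffic reduction versus float32.

```python
import numpy as np

def quantize(w):
    scale = max(float(np.abs(w).max()), 1e-12) / 127.0
    q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    return q.astype(np.float32) * scale

w = np.random.randn(5).astype(np.float32)
q, scale = quantize(w)
w_hat = dequantize(q, scale)   # close to w, at roughly a quarter of the bytes
```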
Hierarchical Aggregation
In this approach, the aggregation process is carried out at multiple levels of a hierarchical
structure, such as a federation hierarchy. This can reduce communication overhead by
performing local aggregations at lower levels of the hierarchy before passing the results
up to higher levels.
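A two-level sketch: hypothetical edge aggregators average their own clients first, then the server combines the edge results weighted by client counts, which reproduces the flat global average.

```python
import numpy as np

regions = {
    "edge_A": [np.ones(3), 3 * np.ones(3)],   # clients under aggregator A
    "edge_B": [5 * np.ones(3)],               # clients under aggregator B
}

# Lower level: each edge aggregator averages its own clients locally.
partials = {name: (np.mean(ws, axis=0), len(ws)) for name, ws in regions.items()}

# Upper level: the server combines edge results, weighted by client counts.
total = sum(n for _, n in partials.values())
w_global = sum(w * (n / total) for w, n in partials.values())
# -> array([3., 3., 3.]), identical to averaging all three clients directly
```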
Personalized Aggregation
During the aggregation process, this approach takes into account the unique characteristics
of each client's data, so that the global model can be updated in the way most appropriate
to each client's data while still preserving data privacy.
The contribution areas
➔ Improving model aggregation
➔ Reducing convergence time
➔ Handling heterogeneity
➔ Enhancing security
➔ Reducing communication and
computation cost
➔ Handling users’ failures (fault
tolerance);
➔ Boosting learning quality
➔ Supporting scalability,
personalization and generalization.
Top Federated learning frameworks
TensorFlow Federated
➔ TensorFlow Federated (TFF): Building Blocks for
Distributed Learning
◆ Open-source and flexible framework by Google
AI
◆ High-level API for defining federated
computations and algorithms
◆ Supports various machine learning models and
distributed architectures
Top Federated learning frameworks
PySyft
➔ PySyft: Secure and Private Federated Learning with
Python
◆ Secure enclaves for data privacy and
computation
◆ Focus on secure aggregation and model
poisoning prevention
◆ Easy integration with existing Python libraries
and tools.
Top Federated learning frameworks
Flower
➔ Flower: Orchestrating Federated Learning
Workflows
◆ Lightweight and flexible framework for
managing federated training
◆ Focus on orchestration, communication, and
resource management
◆ Agnostic to underlying machine learning
libraries and frameworks.
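A hedged outline of a Flower client, based on the Flower 1.x NumPyClient quickstart API; method signatures and the start call have changed across releases, so treat this as a sketch and check the current docs.

```python
import flwr as fl
import numpy as np

class TinyClient(fl.client.NumPyClient):
    def get_parameters(self, config):
        return [np.zeros(4)]                      # current local weights

    def fit(self, parameters, config):
        updated = [p - 0.1 for p in parameters]   # stand-in for local training
        return updated, 10, {}                    # weights, num examples, metrics

    def evaluate(self, parameters, config):
        return 0.5, 10, {"accuracy": 0.9}         # loss, num examples, metrics

# Connect to a running Flower server (address is an illustrative assumption).
fl.client.start_numpy_client(server_address="127.0.0.1:8080", client=TinyClient())
```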
Top Federated learning frameworks
Choosing the Right Federated Learning Framework
➔ Consider your project needs, target platforms, and security
requirements
➔ Evaluate strengths and weaknesses of each framework
➔ Focus on a framework that aligns with your expertise and comfort
level
Thank you for your time 😊