Firefly Optimization Algorithm for Feature Selection
Seminar Guide: Mr. Vinit Tribhuvan
Unnati Rathi (33365)
Abstract
โ— The modified Firefly Algorithm (FFA) improves feature
selection in machine learning by efficiently reducing
dataset dimensionality, balancing both classification
accuracy and feature reduction.
โ— This algorithm draws inspiration from the natural flashing
behavior of fireflies, and the modified version
incorporates a quasi-reflection learning method to
overcome limitations of the standard FFA.
โ— Validation across various datasets shows that the
modified FFA outperforms other optimization techniques,
such as Particle Swarm Optimization (PSO) and Genetic
Algorithms (GA).
โ— The modified FFA excels in selecting relevant features
while maintaining high classification accuracy, making it a
valuable tool for machine learning tasks.
References
• Ragab, M. (2024). Hybrid firefly particle swarm optimisation algorithm for feature selection problems. Expert Systems, 41(7), e13363.
• Ibrahim, H. T., Mazher, W. J., & Yaseen, Z. F. (2024). Hybrid feature selection approach based on firefly algorithm and simulated annealing for cancer datasets. University of Thi-Qar Journal for Engineering Sciences, 14(1), 1-9.
• Bezdan, T., et al. (2021). Feature selection by firefly algorithm with improved initialization strategy. In Proceedings of the 7th Conference on the Engineering of Computer Based Systems.
• Xu, H., Yu, S., Chen, J., & Zuo, X. (2018). An improved firefly algorithm for feature selection in classification. Wireless Personal Communications, 102, 2823-2834.
• Emary, E., Zawbaa, H. M., Ghany, K. K. A., Hassanien, A. E., & Parv, B. (2015). Firefly optimization algorithm for feature selection. In Proceedings of the 7th Balkan Conference on Informatics (pp. 1-7).
• Johari, N. F., Zain, A. M., Noorfa, M. H., & Udin, A. (2013). Firefly algorithm for optimization problem. Applied Mechanics and Materials, 421, 512-517.
• Yang, X. S., & He, X. (2013). Firefly algorithm: Recent advances and applications. International Journal of Swarm Intelligence, 1(1), 36-50.
• Feature Selection Techniques in Machine Learning. GeeksforGeeks.
Algorithm
• Initialization: The quasi-reflection learning method ensures diverse initial solutions, giving better coverage of the feature space and reducing the risk of premature convergence (a sketch of this step follows the list).
• Attraction and Movement: Fireflies move towards brighter (better) solutions, with attraction proportional to the brightness difference; the standard update rule is given after the list. This mechanism explores the search space to find optimal feature subsets.
• Fitness Function: The fitness function evaluates the quality of feature subsets against two main criteria: classification accuracy and feature reduction. This ensures a balance between model performance and computational efficiency (a wrapper-style sketch follows the list).
• Convergence: The algorithm iterates through the search space, refining feature subsets until an optimal or near-optimal solution is reached. Enhancements such as adaptive parameter control further improve its convergence.
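The following is a minimal sketch of the quasi-reflection-based initialization step, in Python. It assumes the common formulation in which the quasi-reflected counterpart of a solution x in [lb, ub] is drawn uniformly between x and the interval centre (lb + ub) / 2; the function names and the maximization convention are illustrative, not taken from the source.

import numpy as np

def quasi_reflected(pop, lb, ub, rng):
    # Quasi-reflected counterpart of each solution: a uniform draw between
    # the solution itself and the centre of the search interval.
    centre = (lb + ub) / 2.0
    low = np.minimum(pop, centre)
    high = np.maximum(pop, centre)
    return rng.uniform(low, high)

def initialize(n_fireflies, n_features, fitness, lb=0.0, ub=1.0, seed=0):
    # Build a random swarm, generate its quasi-reflected twin, and keep the
    # n_fireflies best members of the combined pool (fitness is maximized).
    rng = np.random.default_rng(seed)
    pop = rng.uniform(lb, ub, size=(n_fireflies, n_features))
    pool = np.vstack([pop, quasi_reflected(pop, lb, ub, rng)])
    scores = np.array([fitness(x) for x in pool])
    return pool[np.argsort(scores)[-n_fireflies:]]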
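The attraction-and-movement step follows the standard firefly update rule (Yang & He, 2013): attractiveness decays exponentially with distance, and each firefly moves toward every brighter neighbour with a small random perturbation.

\beta(r_{ij}) = \beta_0 \, e^{-\gamma r_{ij}^{2}}, \qquad r_{ij} = \lVert x_i - x_j \rVert

x_i \leftarrow x_i + \beta_0 \, e^{-\gamma r_{ij}^{2}} (x_j - x_i) + \alpha \, \epsilon_i

Here \beta_0 is the attractiveness at zero distance, \gamma the light-absorption coefficient, \alpha the randomization step size, and \epsilon_i a random vector.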
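And a sketch of the wrapper-style fitness function and main loop, again in Python with scikit-learn. The weighting alpha = 0.9, the 5-nearest-neighbour classifier, the 0.5 binarization threshold, and the step-size decay factor are illustrative assumptions rather than values given in the source.

import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

def make_fitness(X, y, alpha=0.9):
    # Fitness = alpha * accuracy + (1 - alpha) * (1 - |selected| / |total|),
    # so both classification accuracy and feature reduction are rewarded.
    def fitness(position):
        mask = position > 0.5                     # binarize a continuous position
        if not mask.any():                        # an empty subset gets the worst score
            return 0.0
        acc = cross_val_score(KNeighborsClassifier(n_neighbors=5),
                              X[:, mask], y, cv=5).mean()
        return alpha * acc + (1 - alpha) * (1 - mask.sum() / X.shape[1])
    return fitness

def firefly_feature_selection(X, y, n_fireflies=20, n_iter=50,
                              beta0=1.0, gamma=1.0, step=0.25, seed=0):
    rng = np.random.default_rng(seed)
    fit = make_fitness(X, y)
    pop = initialize(n_fireflies, X.shape[1], fit, seed=seed)   # from the sketch above
    light = np.array([fit(x) for x in pop])
    for _ in range(n_iter):
        for i in range(n_fireflies):
            for j in range(n_fireflies):
                if light[j] > light[i]:           # move firefly i toward brighter firefly j
                    r2 = np.sum((pop[i] - pop[j]) ** 2)
                    beta = beta0 * np.exp(-gamma * r2)
                    pop[i] += beta * (pop[j] - pop[i]) + step * (rng.random(X.shape[1]) - 0.5)
                    pop[i] = np.clip(pop[i], 0.0, 1.0)
                    light[i] = fit(pop[i])
        step *= 0.97                              # simple adaptive step-size decay
    return pop[np.argmax(light)] > 0.5            # boolean mask of selected features

The decaying step size plays the role of the adaptive parameter control mentioned in the Convergence bullet: larger early moves favour exploration, while smaller later moves refine the best subsets.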
Introduction
In today's data-driven world, extracting key insights from large datasets is critical but challenging because of irrelevant or redundant features. Feature selection streamlines this process by identifying the most important features while maintaining or improving model accuracy. Inspired by the natural flashing of fireflies, the Firefly Optimization Algorithm (FFA) offers an innovative solution. The modified FFA, enhanced with quasi-reflection learning, excels at efficiently selecting optimal feature subsets, improving performance while reducing computational complexity.
Conclusion
• This presentation highlights the advancements made in the modified Firefly Algorithm (FFA) for feature selection in machine learning. By incorporating the quasi-reflection learning technique, the enhanced FFA significantly improves exploration of the feature space, overcoming premature convergence and enabling precise identification of optimal feature subsets.
• Compared to traditional methods such as the original FFA, Particle Swarm Optimization (PSO), and Genetic Algorithms (GA), the modified FFA consistently delivers superior results in terms of both accuracy and computational efficiency. Its versatility is evident across a wide range of applications, including healthcare, finance, cybersecurity, and IoT, making it a highly adaptable tool in the machine learning ecosystem.
• In conclusion, the modified FFA proves to be a powerful and efficient approach for feature selection, surpassing conventional techniques. Its ability to handle complex, high-dimensional datasets with precision solidifies its standing as an invaluable resource for enhancing model performance in a variety of real-world scenarios.
Advantages
• Improved Search Efficiency: The integration of quasi-reflection learning enhances the exploration phase, allowing the algorithm to search the feature space more effectively. This reduces the chance of getting trapped in local optima and improves the precision of the selected feature subsets.
• Balanced Feature Selection: By optimizing both classification accuracy and feature reduction, the modified FFA strikes an ideal balance between retaining relevant features and minimizing dataset dimensionality. This yields a compact yet highly informative feature set, boosting model performance while reducing computational overhead.
• Versatility Across Domains: The modified FFA has demonstrated superior performance across a wide range of applications, including healthcare, finance, cybersecurity, and IoT. Its ability to handle diverse datasets and different types of optimization problems makes it highly adaptable to various real-world scenarios.
• Outperformance of Traditional Methods: Compared to traditional algorithms such as Particle Swarm Optimization (PSO) and Genetic Algorithms (GA), the modified FFA consistently achieves higher classification accuracy and computational efficiency, making it a preferred choice for feature selection tasks.
• Scalability for High-Dimensional Data: The algorithm is particularly effective for high-dimensional datasets, where traditional methods often struggle. Its capability to handle large-scale data without significant performance loss makes it suitable for complex machine learning problems.