International Journal of Electrical and Computer Engineering (IJECE)
Vol. 13, No. 3, June 2023, pp. 2652~2659
ISSN: 2088-8708, DOI: 10.11591/ijece.v13i3.pp2652-2659
Journal homepage: http://ijece.iaescore.com
Automatic food bio-hazard detection system
Robinson Jimenez-Moreno¹, Javier Eduardo Martínez Baquero²
¹Mechatronic Engineering, Faculty of Engineering, Universidad Militar Nueva Granada, Bogota, Colombia
²Engineering School, Faculty of Basic Sciences and Engineering, Universidad de los Llanos, Villavicencio, Colombia
Article Info
Article history: Received Jun 3, 2022; Revised Sep 5, 2022; Accepted Sep 11, 2022

ABSTRACT
This paper presents the design of a convolutional neural network architecture oriented to the detection of food waste, in order to generate a low-, medium-, or critical-level alarm. An architecture based on four convolution layers is used, for which a database of 100 samples is prepared. The database is used to tune the hyperparameters that make up the final architecture through the training process. By means of confusion matrix analysis, 100% classification performance is obtained. The network delivers its output to a fuzzy system that, depending on the duration of the detection, generates the different alarm levels associated with the risk.
Keywords:
Convolutional network
Deep learning
Food detection
Fuzzy inference
This is an open access article under the CC BY-SA license.
Corresponding Author:
Javier Eduardo Martinez Baquero
Engineering School, Faculty of Basic Sciences and Engineering, Universidad de los Llanos
Transversal 25 #13-34, Villavicencio, Colombia
Email: jmartinez@unillanos.edu.co
1. INTRODUCTION
Due to the recent isolation events in homes caused by the coronavirus disease 2019 (COVID-19) pandemic, many side effects have emerged as support needs in residential environments, among them assistance in cleaning and/or disinfection tasks. Thus, several studies have aimed to cover different fronts: developments of cleaning robots [1], [2], an Internet of things (IoT) application for a disinfection robot [3], and a robot for cleaning bathroom floors [4]. The purpose of these robots can be varied, for example, grease removal in ventilation ducts [5], waste segregation [6], or recycling tasks [7], [8].
Machine vision systems are a fundamental part of robotic development [9] and can be employed in food sorting and recognition [10]. A relevant aspect within the previously mentioned approach is the treatment of food waste [11], [12], where one of the techniques used is machine vision [13], among which convolutional neural networks [14], [15] stand out. Food waste in residential environments is a source of future bacteria and diseases that must be treated [16], for which convolutional networks are currently used [17]–[19].
On the other hand, fuzzy inference systems are widely used for nonlinear models that are difficult to predict, having recently been applied in research related to COVID-19 [20], bacterial analysis [21], and food adulteration [22], as well as in integration with neuro-convolutional systems [23]. In turn, there are several applications of fuzzy models in alarm generation systems [24]–[26].
Given the relevance of the topic, the development of a neuro-convolutional architecture for waste discrimination is presented below, in order to generate an alert, either for a robotic prototype with the ability to clean or for an alarm system that notifies the risk such waste presents. The alarm is generated by means of a fuzzy inference system that takes the waste detection and the detection duration as inputs. The article is divided into three sections: the first corresponds to the methods and materials used, the second to the analysis of results, and the third to the conclusions obtained.
2. METHOD
The designed automatic system follows the scheme shown in Figure 1. Initially, images of the environment are obtained, ideally from a top view of the table or the kitchen. The current image is input to a convolutional neural network [27], [28], which establishes whether there is food or food residue; this result is evaluated over time by a fuzzy inference system that determines an alarm level according to the duration of the residue. In the following, both the neuro-convolutional architecture and the fuzzy inference system are explained.
Figure 1. Classification scheme
It was established that the robot will generate inspection tours through the determined areas every 6 hours, which will be called an inspection cycle. Each time the risk of food waste is detected, the count variable will be increased; this variable corresponds to an array of 4 elements, where each element is associated with one of the inspection areas. To generate the biohazard alarm, a fuzzy inference algorithm is used, based on the area where the waste is located, as detected by the convolutional network, and on the time spent in this area, measured as the number of cycles in which it was found. In the following, both parts of the algorithm, the training of the network and the fuzzy inference system, are presented.
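As a concrete illustration of this cycle bookkeeping, the following minimal Python sketch keeps one counter per inspection area and increments it whenever waste is detected in that area. The names (AREAS, update_counts) are illustrative, and resetting a counter when an area is found clean is an assumption not stated in the paper.

```python
# Minimal sketch of the inspection-cycle bookkeeping described above.
import numpy as np

AREAS = 4                             # number of inspection areas
CYCLE_HOURS = 6                       # one inspection tour every 6 hours
counts = np.zeros(AREAS, dtype=int)   # cycles with waste detected, per area

def update_counts(detections):
    """detections: list of booleans, one per area, True if waste was found."""
    for area, waste_found in enumerate(detections):
        if waste_found:
            counts[area] += 1         # waste persists: one more cycle at risk
        else:
            counts[area] = 0          # assumption: area was cleaned, reset counter

# Example: waste found in areas 0 and 2 during this 6-hour tour
update_counts([True, False, True, False])
print(counts)   # -> [1 0 1 0]
```

The per-area counter is what later feeds the "time" input of the fuzzy inference system as the number of cycles in which waste was present.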
2.1. Convolutional neural network
The first step consists of establishing the database with which the classification system is trained. In this case, two classes, food and residue, are used. A database of 100 images is used, with a distribution of 70% for training and 30% for testing. Figure 2 shows some samples of each class used: Figure 2(a) waste and Figure 2(b) food.
Figure 2. Training database (a) waste and (b) food
These images are acquired using a conventional webcam with an image resolution of 640×480 pixels and are resized inside the algorithm to 180×180 pixels to reduce the computational cost of training. For this case, the network architecture illustrated in Table 1 is used. It consists of 4 convolution layers in the feature extraction stage, two fully connected layers with 50% dropout in the classification stage, and a final fully connected output. The kernel column presents the number of filters used and the size of each one.
Table 1. Network architecture used
Layer               # Filters   Kernel size
Input (180×180×3)   -           -
Convolution/ReLU    98          8×8
Convolution/ReLU    192         6×6
MaxPooling          -           3×3
Convolution/ReLU    192         3×3
MaxPooling          -           3×3
Convolution/ReLU    320         3×3
MaxPooling          -           2×2
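For reference, the architecture in Table 1 can be expressed as the following Keras sketch. The convolution and pooling sizes follow the table; the widths of the two fully connected layers, the softmax output, and the optimizer are assumptions, since the paper only states that two fully connected layers with 50% dropout precede the final output, and the original work was not necessarily implemented with this library.

```python
# Hedged Keras sketch of the four-convolution architecture in Table 1.
from tensorflow import keras
from tensorflow.keras import layers

model = keras.Sequential([
    keras.Input(shape=(180, 180, 3)),            # resized webcam image
    layers.Conv2D(98, 8, activation="relu"),      # 98 filters, 8x8
    layers.Conv2D(192, 6, activation="relu"),     # 192 filters, 6x6
    layers.MaxPooling2D(pool_size=3),
    layers.Conv2D(192, 3, activation="relu"),     # 192 filters, 3x3
    layers.MaxPooling2D(pool_size=3),
    layers.Conv2D(320, 3, activation="relu"),     # 320 filters, 3x3
    layers.MaxPooling2D(pool_size=2),
    layers.Flatten(),
    layers.Dense(256, activation="relu"),         # assumed width
    layers.Dropout(0.5),                          # 50% dropout (as stated)
    layers.Dense(256, activation="relu"),         # assumed width
    layers.Dropout(0.5),
    layers.Dense(2, activation="softmax"),        # two classes: food, waste
])
model.compile(optimizer="adam", loss="categorical_crossentropy", metrics=["accuracy"])
model.summary()
```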
Figure 3 illustrates the performance obtained in the training of the network. It can be observed that the network reaches 100% classification accuracy after 1,500 iterations, taking just over 29 minutes on a computer with an NVIDIA 1050 GPU with 8 GB of memory. The confusion matrix in Figure 4 illustrates the classification performance on the validation data, showing the correct identification of each of the two classes used. This indicates that there are no errors in the performance of the network, taking into account that the lighting conditions should not vary considerably. In this case, average classification times of 0.4 seconds were obtained, which allows the algorithm to be used in real-time machine vision applications, such as the one proposed.
Figure 3. Final training performance
Figure 4. Confusion matrix
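The real-time use described above amounts to grabbing a webcam frame, resizing it to the network input, and timing the classification. The short sketch below illustrates this loop under assumptions: "model" is the network sketched earlier, the 1/255 normalization is assumed, and the camera index and variable names are illustrative rather than taken from the authors' code.

```python
# Sketch of single-frame classification with a conventional 640x480 webcam.
import time
import cv2
import numpy as np

cap = cv2.VideoCapture(0)                       # conventional webcam
ret, frame = cap.read()
if ret:
    img = cv2.resize(frame, (180, 180))         # reduce computational cost
    img = np.expand_dims(img.astype("float32") / 255.0, axis=0)
    t0 = time.time()
    probs = model.predict(img, verbose=0)[0]    # e.g. [p_food, p_waste]
    print(f"class={probs.argmax()}  confidence={probs.max():.2f}  "
          f"time={time.time() - t0:.3f}s")      # ~0.4 s average reported in the paper
cap.release()
```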
2.2. Fuzzy inference system
The alarm for possible biological risk due to food residues is established by means of a fuzzy inference system: once a food residue is recognized, the cycle in which it is found must be stored. Since it is not possible to predict precisely when a food residue will be found, fuzzy inference models are appropriate for this type of nonlinear and mathematically non-descriptive system. Two inputs are used, one associated with the time measured in cycles and the other associated with the detection of the residue. The output is determined as the percentage of the risk level implied by leaving processed food in the same space for prolonged periods of time and by the time it is kept in the residential environment. Figure 5 illustrates the fuzzy scheme implemented.
Figure 5. Fuzzy system implemented
The time input refers to finding the same or new residues in different time spans; these spans are set to 6-hour periods, corresponding to the average time between meals for the three basic meal cycles in human activity (breakfast, lunch, and dinner). For this purpose, a fuzzy input is established with three membership functions, with linguistic labels of low, medium, and high to denote the temporal perception of the waste food on the site. For the time input, the universe of discourse is established as 10 cycles, where after the 6th cycle (36 hours) a high risk predominates.
This scheme is determined considering that, at a residential level with a family nucleus (2 or more people), a daily cleaning cycle predominates (approximately every 24 hours), which implies waste collection at least once a day. For a standard case with three meals a day, waste generation would occur close to every 18 hours on average (3 cycles), which delimits the transition from low to medium alarm, as shown in Figure 6(a). For the waste food input, as shown in Figure 6(b), two membership functions are established, with linguistic labels of food and waste, in relation to how much waste accumulates: the longer the food remains, the more it becomes waste. The universe of discourse is set from 0% food to 100% waste.
Figure 6. Time and waste food inputs of the fuzzy system: input variable (a) time and (b) waste food
The output of the fuzzy system corresponds to the level of biological risk expressed as a percentage, so it is framed in a universe of discourse from 0 to 100%. Given the nonlinearity in the degradation of food [29], the medium level presents less coverage in the system, and the rule base plays a fundamental role in the output by relating it to the inputs, as seen in Figure 7.
Figure 7. Fuzzy food risk output
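The overall inference system can be sketched with the scikit-fuzzy control API as shown below. The universes of discourse follow the text (10 cycles for time, 0-100% for the waste input and the risk output), but the exact membership-function breakpoints and the rule base are assumptions, since the paper only describes them qualitatively (transition from low to medium around 3 cycles, high risk dominant after cycle 6, a narrower medium set on the output).

```python
# Hedged scikit-fuzzy sketch of the inference system in Figures 5-7.
import numpy as np
import skfuzzy as fuzz
from skfuzzy import control as ctrl

cycles = ctrl.Antecedent(np.arange(0, 11, 1), "time")      # inspection cycles (6 h each)
waste = ctrl.Antecedent(np.arange(0, 101, 1), "waste")     # 0% food ... 100% waste
risk = ctrl.Consequent(np.arange(0, 101, 1), "risk")       # biological risk in %

cycles["low"] = fuzz.trimf(cycles.universe, [0, 0, 3])     # low->medium near 3 cycles (~18 h)
cycles["medium"] = fuzz.trimf(cycles.universe, [2, 4, 6])
cycles["high"] = fuzz.trapmf(cycles.universe, [5, 6, 10, 10])  # high dominates after cycle 6 (36 h)

waste["food"] = fuzz.trimf(waste.universe, [0, 0, 60])      # assumed breakpoints
waste["waste"] = fuzz.trimf(waste.universe, [40, 100, 100])

risk["low"] = fuzz.trimf(risk.universe, [0, 0, 45])
risk["medium"] = fuzz.trimf(risk.universe, [40, 50, 60])    # narrower "medium" coverage
risk["high"] = fuzz.trimf(risk.universe, [55, 100, 100])

rules = [                                                   # assumed rule base
    ctrl.Rule(waste["food"], risk["low"]),
    ctrl.Rule(waste["waste"] & cycles["low"], risk["low"]),
    ctrl.Rule(waste["waste"] & cycles["medium"], risk["medium"]),
    ctrl.Rule(waste["waste"] & cycles["high"], risk["high"]),
]

alarm = ctrl.ControlSystemSimulation(ctrl.ControlSystem(rules))
alarm.input["time"] = 7          # 7 cycles = 42 hours with waste present
alarm.input["waste"] = 90        # classifier output strongly "waste"
alarm.compute()
print(alarm.output["risk"])      # defuzzified risk level in %, high-alarm region
```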
3. RESULTS AND DISCUSSION
Figure 8(a) shows some of the activations of the filter bank resulting from the first convolution layer of the trained network, where the learning of the filters is evident, as the network clearly identifies the plate with food and what is associated with the background of the image. Figure 8(b) shows in detail the activations in the identification of food for both classes: the upper part is the full plate, and the lower part corresponds to food waste. The heat map shows the concentration of learning, which resulted in a high level of accuracy.
Figure 9 shows the level of confidence obtained for each classification category; this output is the one that enters the fuzzy system. Figure 10 illustrates the output of the fuzzy inference system with incremental time variations, so that the three risk levels are generated: Figure 10(a) low, Figure 10(b) medium, and Figure 10(c) high. Through simulations of the algorithm in the environment shown in Figure 11, using a video of the food area at 30 fps, the performance was evaluated, and the results shown in Table 2 were obtained.
Figure 8. Activations of (a) learning process and (b) food and residue testing
Figure 9. Levels of confidence
Figure 10. Alarm activations: (a) low alarm, (b) medium alarm, and (c) high alarm
Figure 11. Simulation test
Table 2. Risk simulation
Risk      True positives (%)
Low       97
Medium    92
High      100
Average   96.3
By means of the simulation, it was possible to validate the temporal relevance of the system, which depends fundamentally on the time during which food was found in the evaluation area. The losses of true positives are due to variations in the level of confidence with which each image in the video is classified, since a threshold of 85% is used to discard frames with lower values, which are usually subject to error. However, the average value of 96.3% shows the effectiveness of the algorithm in generating the alarms.
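The 85% confidence filter described above can be summarized in a few lines of Python. This is only a sketch of the filtering rule: the function name and the class ordering (0 = food, 1 = waste) are illustrative, and "probs" stands for the per-frame softmax output of the classifier.

```python
# Sketch of the per-frame confidence filtering used during the video simulation.
CONFIDENCE_THRESHOLD = 0.85

def accept_detection(probs):
    """probs: list of class probabilities for one frame.
    Returns the class index (0=food, 1=waste) or None if the frame is discarded."""
    confidence = max(probs)
    if confidence < CONFIDENCE_THRESHOLD:
        return None                   # below 85%: usually subject to error, discard
    return probs.index(confidence)

# Example: a borderline frame is discarded, a confident one is kept
print(accept_detection([0.60, 0.40]))   # -> None
print(accept_detection([0.08, 0.92]))   # -> 1 (waste)
```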
4. CONCLUSION
Machine vision systems are an important complement to automation systems, which employ pattern recognition techniques to operate. Within these techniques, neuro-convolutional networks demonstrated high efficiency in the recognition of the two established classes, with reduced classification times and high levels of confidence. It is concluded that the automation of risk levels by means of the presented methodology allows the development of efficient automatic assistants for the prevention of biological risks due to bacterial growth, as in the case presented by food residues.
The analysis of the activations also proved important, since it allowed adjustments to the hyperparameters of the network, facilitating convergence in the selection of the final architecture. At the same time, the fuzzy inference system allows the generation of a natural system that raises alarms about health conditions for decision making, complementing the action of the convolutional network.
ACKNOWLEDGEMENTS
The authors would like to thank Universidad Militar Nueva Granada, which, through its Vice-Rectory of Research, is financing the present project, coded IMP-ING-3405 (2021-2022) and entitled "Prototipo robótico móvil para tareas asistenciales en entornos residenciales," from which the present work is derived, and Universidad de los Llanos for all its support in this project.
REFERENCES
[1] K. Akila, B. Sabitha, and R. Saravanan, “Railway track cleaning robot,” in 2021 International Conference on Advancements in
Electrical, Electronics, Communication, Computing and Automation (ICAECA), Oct. 2021, pp. 1–4. doi:
10.1109/ICAECA52838.2021.9675742.
[2] P. Veerajagadheswar, S. Yuyao, P. Kandasamy, M. R. Elara, and A. A. Hayat, “S-Sacrr: A staircase and slope accessing
reconfigurable cleaning robot and its validation,” IEEE Robotics and Automation Letters, vol. 7, no. 2, pp. 4558–4565, Apr. 2022,
doi: 10.1109/LRA.2022.3151572.
[3] C. McGinn, E. Bourke, and M. F. Cullinan, “An IoT approach for monitoring UV disinfection robots,” in 2021 IEEE International
Conference on Systems, Man, and Cybernetics (SMC), Oct. 2021, pp. 3056–3060. doi: 10.1109/SMC52423.2021.9659310.
[4] Y. Nishida, T. Ura, T. Hamatsu, K. Nagahashi, S. Inaba, and T. Nakatani, “Fish recognition method using vector quantization
histogram for investigation of fishery resources,” in 2014 Oceans - St. John’s, Sep. 2014, pp. 1–5. doi:
10.1109/OCEANS.2014.7003268.
[5] A. Yeshmukhametov, A. Baratova, A. Salemkhan, Z. Buribayev, K. Ozhikenov, and Y. Amirgaliyev, “Design and modeling of self-
sustainable bathroom floor cleaning robot system,” in 2021 21st International Conference on Control, Automation and Systems
(ICCAS), Oct. 2021, pp. 1860–1865. doi: 10.23919/ICCAS52745.2021.9649969.
[6] T. Hitomi, Y. Yamanaka, F. Ito, and T. Nakamura, “Development of a rotary cleaning mechanism using planetary gears for
removing grease deposited in kitchen ventilation ducts,” in 2022 IEEE/SICE International Symposium on System Integration (SII),
Jan. 2022, pp. 473–478. doi: 10.1109/SII52469.2022.9708739.
[7] R. S. Nakandhrakumar, P. Rameshkumar, V. Parthasarathy, and B. Thirupathy Rao, “WITHDRAWN: Internet of things (IoT) based
system development for robotic waste segregation management,” Materials Today: Proceedings, Mar. 2021, doi:
10.1016/j.matpr.2021.02.473.
[8] A. C. Medina, J. F. Mora, C. Martinez, N. Barrero, and W. Hernandez, “Safety protocol for collaborative human-robot recycling
tasks,” IFAC-PapersOnLine, vol. 52, no. 13, pp. 2008–2013, 2019, doi: 10.1016/j.ifacol.2019.11.498.
[9] J. Li, M. Barwood, and S. Rahimifard, “A multi-criteria assessment of robotic disassembly to support recycling and recovery,”
Resources, Conservation and Recycling, vol. 140, pp. 158–165, Jan. 2019, doi: 10.1016/j.resconrec.2018.09.019.
[10] Z. Wang, H. Li, and X. Yang, “Vision-based robotic system for on-site construction and demolition waste sorting and recycling,”
Journal of Building Engineering, vol. 32, Nov. 2020, doi: 10.1016/j.jobe.2020.101769.
[11] W. Song, N. Jiang, H. Wang, and J. Vincent, “Use of smartphone videos and pattern recognition for food authentication,” Sensors
and Actuators B: Chemical, vol. 304, Feb. 2020, doi: 10.1016/j.snb.2019.127247.
[12] K. Xu, M. M. Zheng, and X. Liu, “A two-stage robust model for urban food waste collection network under uncertainty,” in 2021
IEEE International Conference on Industrial Engineering and Engineering Management (IEEM), Dec. 2021, pp. 824–828. doi:
10.1109/IEEM50564.2021.9672895.
[13] E. L. Cosbuc, E.-D. Ungureanu-Comanita, and M. Gavrilescu, “Identification of the risks generated in the environment by food
waste,” in 2021 International Conference on e-Health and Bioengineering (EHB), Nov. 2021, pp. 1–4. doi:
10.1109/EHB52898.2021.9657709.
[14] Z. Shen, A. Shehzad, S. Chen, H. Sun, and J. Liu, “Machine learning based approach on food recognition and nutrition estimation,”
Procedia Computer Science, vol. 174, pp. 448–453, 2020, doi: 10.1016/j.procs.2020.06.113.
[15] P. Furtado, M. Caldeira, and P. Martins, “Human visual system vs convolution neural networks in food recognition task: An
empirical comparison,” Computer Vision and Image Understanding, vol. 191, Feb. 2020, doi: 10.1016/j.cviu.2019.102878.
[16] L. Xiao, T. Lan, D. Xu, W. Gao, and C. Li, “A simplified CNNs visual perception learning network algorithm for foods recognition,”
Computers & Electrical Engineering, vol. 92, Jun. 2021, doi: 10.1016/j.compeleceng.2021.107152.
[17] A. Laila, M. von Massow, M. Bain, K. Parizeau, and J. Haines, “Impact of COVID-19 on food waste behaviour of families: Results
from household waste composition audits,” Socio-Economic Planning Sciences, vol. 82, Aug. 2022, doi:
10.1016/j.seps.2021.101188.
[18] Z. Qiuhao, “Kitchen waste classification based on deep residual network and transfer learning,” in 2021 6th International
Symposium on Computer and Information Processing Technology (ISCIPT), Jun. 2021, pp. 625–629. doi:
10.1109/ISCIPT53667.2021.00133.
[19] E. Aguilar and P. Radeva, “Uncertainty-aware integration of local and flat classifiers for food recognition,” Pattern Recognition
Letters, vol. 136, pp. 237–243, Aug. 2020, doi: 10.1016/j.patrec.2020.06.013.
[20] Z. B. Ozger and P. Cihan, “A novel ensemble fuzzy classification model in SARS-CoV-2 B-cell epitope identification for
development of protein-based vaccine,” Applied Soft Computing, vol. 116, Feb. 2022, doi: 10.1016/j.asoc.2021.108280.
[21] N. E. Dina, A. M. R. Gherman, A. Colniță, D. Marconi, and C. Sârbu, “Fuzzy characterization and classification of bacteria species
detected at single-cell level by surface-enhanced Raman scattering,” Spectrochimica Acta Part A: Molecular and Biomolecular
Spectroscopy, vol. 247, Feb. 2021, doi: 10.1016/j.saa.2020.119149.
[22] P. P. Lal et al., “IoT integrated fuzzy classification analysis for detecting adulterants in cow milk,” Sensing and Bio-Sensing
Research, vol. 36, Jun. 2022, doi: 10.1016/j.sbsr.2022.100486.
[23] S. Dey, R. Roychoudhury, S. Malakar, and R. Sarkar, “An optimized fuzzy ensemble of convolutional neural networks for detecting
tuberculosis from Chest X-ray images,” Applied Soft Computing, vol. 114, Jan. 2022, doi: 10.1016/j.asoc.2021.108094.
[24] G. Guo, J. Qiao, W. Wang, and T. Chai, “A fuzzy leakage alarm method of liquid steel,” IFAC Proceedings Volumes, vol. 30,
no. 13, pp. 43–47, Jul. 1997, doi: 10.1016/S1474-6670(17)44367-9.
[25] M. J. Jafari, M. Pouyakian, A. Khanteymoori, and S. M. Hanifi, “Reliability evaluation of fire alarm systems using dynamic
Bayesian networks and fuzzy fault tree analysis,” Journal of Loss Prevention in the Process Industries, vol. 67, Sep. 2020, doi:
10.1016/j.jlp.2020.104229.
[26] Q. Zheng, Y. Li, and J. Cao, “Application of data mining technology in alarm analysis of communication network,” Computer
Communications, vol. 163, pp. 84–90, Nov. 2020, doi: 10.1016/j.comcom.2020.08.012.
[27] T. Guo, J. Dong, H. Li, and Y. Gao, “Simple convolutional neural network on image classification,” in 2017 IEEE 2nd International
Conference on Big Data Analysis (ICBDA), Mar. 2017, pp. 721–724. doi: 10.1109/ICBDA.2017.8078730.
[28] M. D. Zeiler and R. Fergus, “Visualizing and understanding convolutional networks,” in European conference on computer vision,
Springer, 2014, pp. 818–833. doi: 10.1007/978-3-319-10590-1_53.
[29] C. Anzueto, Mathematical models for food shelf-life estimation. San Salvador, Guatemala: Food and Beverage, 2012.
BIOGRAPHIES OF AUTHORS
Robinson Jiménez-Moreno is an electronic engineer who graduated from Universidad Distrital Francisco José de Caldas in 2002. He received an M.Sc. in engineering from Universidad Nacional de Colombia in 2012 and a Ph.D. in engineering from Universidad Distrital Francisco José de Caldas in 2018. He currently works as an assistant professor at Universidad Militar Nueva Granada, and his research focuses on the use of convolutional neural networks for object recognition and image processing for robotic applications such as human-machine interaction. His profile can be found at https://www.researchgate.net/profile/Robinson-Moreno-2 and https://reddolac.org/profile/RobinsonJimenezMoreno. He can also be contacted at email: robinson.jimenez@unimilitar.edu.co.
Javier Eduardo Martinez Baquero is an electronic engineer who graduated from Universidad de los Llanos, Villavicencio, Colombia in 2002. He completed a postgraduate degree in electronic instrumentation at Universidad Santo Tomas in 2004, a postgraduate degree in instrumentation and industrial control at Universidad de los Llanos in 2020, and an M.Sc. in educative technology and innovative media for education at Universidad Autonoma de Bucaramanga, Colombia in 2013. He currently works as an associate professor at Universidad de los Llanos, and his research focuses on instrumentation, automation, control, and renewable energies. His profile can be found at https://www.researchgate.net/profile/Javier-Martinez-Baquero and https://reddolac.org/profile/JavierEduardoMartinezBaquero. He can also be contacted at jmartinez@unillanos.edu.co.