ARTIFICIAL INTELLIGENCE + QUANTUM COMPUTING = GREATEST
TECHNOLOGICAL REVOLUTION IN HISTORY
Fernando Alcoforado*
This article aims to demonstrate that the combination of Artificial Intelligence and
Quantum Computing will constitute the greatest technological revolution in history.
Quantum computing could greatly accelerate the evolution of Artificial Intelligence
which, once it becomes even more powerful, will in turn contribute to the development
of the quantum computers of the future. This article presents how Artificial Intelligence
and Quantum Computing work and what will result from combining the two.
1. Artificial Intelligence
Artificial intelligence (AI) is a computational technology, or a set of technologies such as
artificial neural networks, algorithms and learning systems, whose objective is to imitate
human mental capabilities such as reasoning, perception of the environment and decision-
making [1]. The technology is developed so that machines can solve a wide range of
problems, from the great complexity of government and industrial management to the
daily tasks of modern men and women. To do this, AI uses
sophisticated learning technology, allowing it to learn from a large set of data and act on
its own. The general objective of AI is to create machines that can operate at the same
level of cognitive capacity as humans, or even surpass them. In recent years, AI has
emerged as a transformative force across multiple industries, revolutionizing the way
companies conduct business [1].
Artificial Intelligence is based on three technologies [1]:
1. Machine Learning is an application of Artificial Intelligence that gives the computer
the ability to learn and improve automatically from its own experience. Machine learning
focuses on developing software that can access data and learn from it. The learning
process begins with observing data in search of statistical patterns, so that good decisions
can be made based on the examples provided. The main objective is thus to make
computers learn automatically, without human intervention (a minimal sketch of this idea
appears after this list).
2. Deep learning is a subset of Machine learning, essentially a neural network with
three or more layers. These neural networks attempt to simulate the behavior of the human
brain (although far from matching its capacity), allowing the machine to “learn” from
large amounts of data. While a single-layer neural network can still make approximate
predictions, additional hidden layers help optimize and refine accuracy. Deep
learning drives many AI applications and services that improve automation by performing
analytical and physical tasks without human intervention. Deep learning technology is
behind everyday products and services (like digital assistants, voice-enabled TV remotes,
and credit card fraud detection) as well as emerging technologies (like self-driving cars).
3. Natural language processing (NLP) is a branch of artificial intelligence that helps
computers understand, interpret, and manipulate human language. NLP draws on many
disciplines, including computer science and computational linguistics, in its quest to
bridge the gap between human communication and computer understanding.
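To make the idea of learning from examples concrete, here is a minimal sketch in Python (my own illustration, not taken from the cited sources; the data and the rule y = 2x + 1 are invented for the example). The program is never told the rule; it recovers it statistically from the examples and can then apply it to inputs it has never seen:

```python
# A minimal sketch: the program is never told the rule y = 2x + 1;
# it recovers it purely from example data (illustrative values only).
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(0, 10, 200)                   # input examples
y = 2.0 * x + 1.0 + rng.normal(0, 0.5, 200)   # observed outcomes, with noise

slope, intercept = np.polyfit(x, y, deg=1)    # "learning": fit parameters to the data
print(f"learned rule: y ~ {slope:.2f}*x + {intercept:.2f}")

# The learned rule can now be applied to new inputs.
print("prediction for x = 7:", slope * 7 + intercept)
```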
Algorithms are the essence of any artificial intelligence system: they are fed with as much
data as possible, as references, so that the system can learn better [2] [3]. An algorithm is
a tool that maps decisions within a system and their possible consequences. Intelligent
algorithms are able to filter, order and structure information. Thus, they autonomously present
content that may, according to the rules of the algorithms, have more or less influence,
excluding other possible information. In general, an algorithm comprises a finite
sequence of executable actions (steps) to solve a problem or, in the most common case in
Computer Science, perform a task. The algorithm itself is not the program, but the
sequence of actions and conditions that must be obeyed for the problem to be solved.
Algorithms are finite sequences of instructions used to solve a problem. For example,
when someone accesses a website, algorithms define the path for the page to open
correctly. When someone interacts with a link, other algorithms are triggered, indicating
what to do [2] [3].
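As an illustration of an algorithm as a finite sequence of executable steps (a sketch of my own, not drawn from the cited sources), binary search below locates a value in a sorted list; every step is explicit and the procedure is guaranteed to terminate:

```python
def binary_search(items, target):
    """Finite sequence of steps that locates `target` in a sorted list
    (returns its index) or reports that it is absent (returns -1)."""
    low, high = 0, len(items) - 1
    while low <= high:                 # the search interval shrinks at every step,
        mid = (low + high) // 2        # so the algorithm always terminates
        if items[mid] == target:
            return mid
        elif items[mid] < target:
            low = mid + 1
        else:
            high = mid - 1
    return -1

print(binary_search([2, 5, 8, 13, 21, 34], 13))  # -> 3
print(binary_search([2, 5, 8, 13, 21, 34], 4))   # -> -1
```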
It is important to distinguish an algorithm from software. An algorithm is a process,
procedure or set of rules to be followed in order to solve a problem, that is, step-by-step
instructions that define how the work must be performed to obtain the desired outcome.
Software, by contrast, is the system that allows the user to interact with the computer,
gives the computer instructions to perform specific tasks and controls the functioning of
the hardware and its operations. Software is a set of instructions to be followed and
executed by a mechanism, be it a computer or an electromechanical device. Software is
the term used to describe programs, apps, scripts, macros and directly embedded code
instructions (firmware) that dictate what a machine should do. Every program running on
a computer, cell phone, tablet, smart TV, video game console, set-top box and so on is
software, whether it is a text editor, a browser, an audio or video editor, a game or a
streaming app [4].
The first advantage of using algorithms is task automation [5]. They can analyze a large
volume of data, in less time than a person would, for example. Thus, they increase the
efficiency of activities. All computer software is made up of algorithms. The evolution of
algorithms allows the emergence of new technologies, such as smartphones, smart TVs,
new applications and operating systems. As new command possibilities appear, algorithms
are refined and, consequently, new potential uses emerge. Transport
and delivery applications, streaming services and movie and music recommendations are
provided by systems that work based on algorithms. Algorithms are, therefore, the
essence of any artificial intelligence system that are fed with as much data as possible, as
references, so that they can learn better [5].
Artificial intelligence promotes the reduction of human error because computers do not
make these errors if they are programmed correctly [5]. With Artificial Intelligence,
decisions are made based on information previously collected by applying a certain set of
algorithms. Thus, errors are reduced and a greater degree of precision becomes
achievable. Artificial intelligence takes risks
instead of humans. This is one of the biggest advantages of Artificial Intelligence because
we can overcome many risk limitations involving human lives by developing an AI robot
that can do risky things for us. The possibilities include going to Mars, defusing
a bomb, exploring the deepest parts of the oceans, mining coal and oil, and many others
[5].
Artificial intelligence helps with repetitive jobs in our day to day work like sending email,
checking documents for errors, and much more [5]. With artificial intelligence, these tasks
can be automated productively, relieving humans of those considered “tiring” and freeing
them to be increasingly creative and productive. Artificial intelligence provides
digital assistance to interact with users, which eliminates the need for human resources.
Digital assistants are also used on many websites to provide what users want by talking
to them about what they are looking for. Some chatbots are designed in such a way that
it is difficult to determine whether we are talking to a robot or a human. Artificial
intelligence also speeds up decision-making: an AI-powered machine works as
programmed and delivers results faster than humans. Artificial intelligence drives
innovation in almost every area, helping humans solve the most complex problems.
2. Quantum Computing
One of the main characteristics of contemporary society is the large-scale use of
information technology. The computer, an icon of information technology, connected to
a network is changing people's relationship with time and space. Informational networks
allow us to expand our ability to think in unimaginable ways. The new technological
revolution has expanded human intelligence. We are talking about a technology that
allows increasing the storage, processing and analysis of information, making billions of
relationships between thousands of data per second: the computer [6]. Current computers
are electronic because they are made up of transistors used in electronic chips, that is, in
integrated circuits or small microelectronic devices generally made up of millions of
components that store, move and process data. Transistors, however, have a physical
limit: there will come a time when it will no longer be possible to further reduce the size
of this smallest and most important component of processors [6].
It is important to highlight that it is in this small device, the transistor, that all information
is read, interpreted and processed [6]. When dealing with very small scales, Physics stops
being as predictable as in macroscopic systems, starting to behave randomly, in a
probabilistic way, subject to the properties of Quantum Physics. This means that one of
the alternatives of the future is the quantum computer. In these computers, fundamental
units of information, called “quantum bits”, are used to solve calculations or simulations
that would take processing times that are impractical in electronic computers, such as
those currently used [6]. It is important to note that the bit is the smallest unit of
information that can be stored or transmitted and that it can only take two values: 0 or 1,
true or false, and so on. Each “binary digit” 0 or 1 is therefore known as a bit. Ordinary
computers work by reducing numbers and instructions to a binary code – a series of zeros
and ones. Technically, these zeros and ones represent whether or not electrical current
passes through a device called a transistor. Inside a microprocessor of your cell phone or
computer there are billions of transistors that, combined, form what are called logic gates.
A conventional computer translates this binary code into physical states, such as on or
off, within its hardware. Each “this or that” distinction would function as a way of storing
binary data. The sequences of binary numbers are then manipulated through banks of
logic gates, printed on silicon chips [7].
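A small sketch (my own illustration, using ordinary Python bitwise operators rather than real hardware) of the two ideas above: numbers reduced to a series of zeros and ones, and logic gates that combine individual bits:

```python
# Binary representation and elementary logic gates (illustrative sketch).
n = 13
print(format(n, "08b"))        # '00001101' — the number as a series of 0s and 1s

a, b = 1, 0                    # two bits, e.g. "current on" / "current off"
print("AND:", a & b)           # 0 — on only if both inputs are on
print("OR:",  a | b)           # 1 — on if at least one input is on
print("XOR:", a ^ b)           # 1 — on if the inputs differ

# Chaining such gates over many bits is how a conventional processor
# manipulates binary data, e.g. adding two numbers:
print(format(0b0101 + 0b0011, "04b"))   # 5 + 3 -> '1000' (8)
```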
Quantum computers work with a logic quite different from that present in electronic
computers. Quantum bits can simultaneously present the values 0 and 1, as a result of a
quantum phenomenon called quantum superposition [6]. These values represent the
binary code of computers and are, in a way, the language understood by machines.
Quantum computers have emerged as the newest answer from Physics and Computing to
the limited capacity of electronic computers, whose processing speed and capacity are
closely tied to the size of their components and whose continued miniaturization is
therefore an inevitable process.
Quantum computers will not serve the same purposes as electronic computers. Quantum
computers are good at working with many variables simultaneously, unlike current
computers, which have many limitations when carrying out this type of task. In this way,
it is expected that quantum computers can be used to simulate extremely complex
systems, such as biological, meteorological, astronomical, molecular systems, etc. The
ease of quantum computers in dealing with complex systems is related to the nature of
quantum bits. An electronic computer bit can only have the value 0 or 1, while quantum
bits can have both values at the same time. In this way, a single quantum bit has a
numerical equivalence of 2 electronic bits. This means that, with just 10 quantum bits, we
would have a computer with a capacity of 1024 bits (2^10 = 1024), while most conventional
computers today work with 64-bit systems [6].
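The growth described here can be made concrete with a classical simulation written in plain numpy (a sketch of my own, not a real quantum program): the state of n quantum bits is described by 2^n complex amplitudes, so 10 qubits already require 1024 numbers, and each additional qubit doubles that count:

```python
import numpy as np

n = 10
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate acting on one qubit

# State of an n-qubit register: a vector of 2**n complex amplitudes.
state = np.zeros(2 ** n, dtype=complex)
state[0] = 1.0                                  # start in |00...0>

# Apply a Hadamard to every qubit (build H (x) H (x) ... (x) H and apply it once).
op = np.array([[1.0]])
for _ in range(n):
    op = np.kron(op, H)
state = op @ state

print(state.size)                               # 1024 amplitudes for only 10 qubits
probs = np.abs(state) ** 2
print(np.allclose(probs, 1 / 1024))             # True: a superposition over all 1024 values
```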
With a conventional classical computer, if it needed to perform 100 different calculations,
it would have to process them one at a time, whereas with a quantum computer, it could
perform them all at once. The current situation where we are forced to use classical
computers for calculations will change drastically. Supercomputers — the highest class
of classical computers — are so large that they take up a large room. The reason is that
100 calculating units are lined up to do 100 different calculations at once; in a real
supercomputer, more than 100,000 smaller computers are lined up. With the arrival of
quantum computers, this will no longer be necessary. That does not mean, however, that
supercomputers will become unnecessary: they and quantum computers will simply be
used for different purposes, much as smartphones and personal computers serve different
needs today [8].
Like the first digital computers, quantum computing offers the possibility of technologies
millions of times more powerful than current systems, but the key to success will be
solving real-world problems in quantum language [9]. We are at the limits of the data
processing power of traditional computers and data continues to grow. Although Moore's
Law, which predicts that the number of transistors in integrated circuits will double every
two years, has proven remarkably consistent since it was formulated in 1965, transistors
are now so small that they can hardly be made any smaller with existing technology. That
is why there is a race among the technology industry's biggest players to be the first to
launch a viable quantum computer, exponentially more powerful than today's machines,
able to process all the data we generate every day and to solve increasingly complex
problems [9].
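The doubling described by Moore's Law is ordinary compound growth, and a quick back-of-the-envelope calculation (my own illustration; the 1971 starting figure of roughly 2,300 transistors for an early microprocessor is only an approximate reference point) shows why it cannot continue indefinitely:

```python
# Moore's Law as compound growth: transistor counts double every two years.
# Illustrative starting point only (a few thousand transistors circa 1971).
start_year, start_count = 1971, 2_300

for year in (1991, 2011, 2031):
    doublings = (year - start_year) / 2
    print(year, f"~ {start_count * 2 ** doublings:,.0f} transistors")

# Each extra 20 years multiplies the count by 2**10 = 1024 — exponential growth
# that eventually runs into the physical size of atoms.
```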
Current computers have limitations, for example in the area of Artificial Intelligence,
where there are no computers with sufficient power or processing speed to support
advanced AI [9]. Thus, the need arose to create an alternative to conventional computers,
one that could tackle AI problems as well as others such as the factorization of very large
numbers into primes, discrete logarithms and the simulation of Quantum Physics problems. Quantum
computers will enable a range of useful applications, such as modeling variations of
chemical reactions to discover new medicines, developing imaging technologies for the
healthcare industry to detect problems in the body, or accelerating the development of
batteries, new materials and flexible electronics, all of which will make a real difference
in data processing. Considering that AI and Machine Learning depend on large data sets (Big
Data) to be effective, it is not difficult to imagine the revolution that the quantum
computer can bring. Many prototypes of quantum computers have already been tested in
laboratories around the world, but their large-scale development is still unknown and
depends on a lot of research and investment [9].
There are fields in which quantum computers have a great advantage over classical
computers, for example, in the areas of chemistry and biotechnology. Material reactions,
in principle, involve quantum effects. A quantum computer, which itself exploits quantum
phenomena, would enable calculations that easily incorporate those
effects and would be very effective in developing materials such as catalysts and
polymers. This can lead to the development of new medicines that were previously
unviable, thus contributing to improving people's health. Additionally, in the area of
finance, for example, as the formulas for options trading are similar to those for quantum
phenomena, it is expected that calculations can be performed efficiently on quantum
computers [9].
According to the MIT Technology Review [10], Artificial Intelligence is changing the
way we think about computing. Computers have not fundamentally changed in 40 or 50
years: they have become smaller and faster, but they are still mere boxes with processors
that carry out human instructions.
computers are produced; 2) the way computers are programmed; and, 3) how computers
are used. Ultimately, this is a phenomenon that will change the function of computers.
The core of computing is moving from number crunching to decision making [10].
The first change concerns how computers and the chips that control them are made [10].
The deep learning models that make today's AI applications work require a different
approach, because they need a large number of less precise calculations to be performed
at the same time. This means that a new type of chip is needed, one that can move data as
quickly as possible, ensuring that it is available whenever needed. When deep learning
arrived on the scene about a decade ago, there were already specialized chips very good
at exactly this: graphics processing units (GPUs), designed to render an entire screen's
worth of pixels dozens of times per second [10].
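A rough sketch of that workload pattern in numpy (my own illustration, run on a CPU rather than a GPU): a deep-learning layer is essentially one large batch of lower-precision multiply-adds, and performing it as a single parallel operation is typically much faster than looping over the items one at a time, which is the kind of arithmetic GPUs parallelize even further:

```python
import time
import numpy as np

# A deep-learning layer is essentially a big batch of multiply-adds.
batch, features, neurons = 1024, 512, 512
x = np.random.rand(batch, features).astype(np.float32)   # lower precision is enough
w = np.random.rand(features, neurons).astype(np.float32)

t0 = time.perf_counter()
out_loop = np.array([xi @ w for xi in x])   # "one at a time": a Python-level loop
t1 = time.perf_counter()
out_vec = x @ w                             # the whole batch in one parallel operation
t2 = time.perf_counter()

print(np.allclose(out_loop, out_vec, rtol=1e-3))
print(f"looped: {t1 - t0:.3f}s  vectorized: {t2 - t1:.3f}s")
# On a GPU the same single matrix multiplication is spread over thousands of
# cores, often in 16-bit precision, which is why these chips suit deep learning.
```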
The second change concerns how computers are told what to do. For the last 40 years,
computers have been programmed; in the next 40 years, they will be trained. Traditionally,
for a computer to do something like recognize speech or identify objects in an image,
programmers first had to write rules for it. With machine learning, programmers no longer
dictate the rules. Instead, they create a neural network through which the computer learns
those rules itself. The next big advances are expected in molecular simulation: training
computers to manipulate the properties of matter could bring global changes in energy
use, food production, manufacturing and medicine. Deep learning already has an
impressive track record. Two of its biggest advances so far, making computers behave as
if they understand human language and recognize what is in an image, are already
changing the way we use them [10].
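A minimal sketch of “trained rather than programmed” (my own illustration, not from the cited article): nobody writes the XOR rule below as code; a tiny two-layer neural network discovers it from four examples by gradient descent:

```python
import numpy as np

# The XOR rule is given only as four examples, never written as explicit rules.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(2, 8)), np.zeros(8)   # hidden layer parameters
W2, b2 = rng.normal(size=(8, 1)), np.zeros(1)   # output layer parameters

for _ in range(5000):
    h = np.tanh(X @ W1 + b1)              # forward pass: hidden layer
    p = h @ W2 + b2                       # forward pass: network output
    dz2 = 2 * (p - y) / len(X)            # gradient of the mean squared error
    dW2, db2 = h.T @ dz2, dz2.sum(axis=0)
    dz1 = (dz2 @ W2.T) * (1 - h ** 2)     # backpropagate through tanh
    dW1, db1 = X.T @ dz1, dz1.sum(axis=0)
    for param, grad in ((W1, dW1), (b1, db1), (W2, dW2), (b2, db2)):
        param -= 0.1 * grad               # gradient-descent update

print(np.round(p, 2))   # should be close to [[0], [1], [1], [0]]: the learned XOR rule
```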
The third change is that a computer no longer needs a keyboard or a screen for humans to
interact with it. Anything can become a computer. In fact, most
household products, from toothbrushes to light switches and doorbells, already have a
smart version. As they proliferate, however, so does our desire to spend less time telling
them what to do. It's like they should be able to figure out what we need without our
interference. This is the shift from number crunching to decision making that defines the
new era of computing, one that envisions computers that tell us what we need to know,
when we need to know it, and that help us when we need them. Now,
machines are interacting with people and becoming increasingly integrated into our lives.
Computers are already out of their boxes [10].
3. The contribution of Quantum Computing to the development of Artificial
Intelligence and vice versa
According to experts in the field, quantum computing can greatly accelerate the evolution
of artificial intelligence, making it far more powerful [11]. The relationship between
quantum computing and AI works in both directions: quantum computing capabilities and
algorithms can be used in AI tasks, and "classical" AI systems can be applied to data
coming from quantum systems in order to improve those systems [12]. A quantum
computer is probabilistic by nature. Thanks to phenomena such as superposition and
entanglement, it can encode probability distributions that are much more complex than
those of classical models, and it can even generate data that truly represent the underlying
probability distribution, free from the biases that affect traditional models. For this reason,
some see these as the first systems in which the use of quantum computers will prove to
be a great advantage [12].
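To make the point about probability distributions concrete, here is a classical numpy sketch (my own illustration, simulating the state rather than running on quantum hardware) of a two-qubit entangled Bell state: its amplitudes encode a joint distribution in which the two bits are perfectly correlated, so sampling it only ever returns matching outcomes:

```python
import numpy as np

# Bell state (|00> + |11>) / sqrt(2), simulated classically.
bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)  # amplitudes of |00>,|01>,|10>,|11>

probs = np.abs(bell) ** 2
for outcome, p in zip(["00", "01", "10", "11"], probs):
    print(outcome, round(float(p), 2))
# -> 00 and 11 each with probability 0.5; 01 and 10 never occur.
# The two bits are perfectly correlated (entangled): the correlation is built
# directly into the state's amplitudes.

rng = np.random.default_rng(1)
samples = rng.choice(["00", "01", "10", "11"], size=10, p=probs)
print(samples)   # only "00" and "11" appear
```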
Another interesting question is whether it would be possible to use quantum computing
to evolve current AIs, creating new types of AIs that are more intelligent or can do more
things [12]. This means that quantum computers are capable of boosting AIs. Any sector
that deals with large volumes of highly complex data would have a lot to gain from the
joint use of these two technologies. The catch is that saying this is almost the same as
saying "all sectors of modern society", including the financial, health and
telecommunications sectors. In the financial sector, virtually every
optimization activity would be improved. It would be possible to optimize capital
allocation, assets, portfolio choice and management, among others. In the area of
medicine and pharmacy there are some uses such as analysis and precise modeling of
molecules for new medicines, selection of groups for clinical trials, hospital resource
optimization processes and even the distribution of health services in a given city, for
example. In telecommunications, as well as in the financial sector, optimization tasks
would be the main focus, such as communication traffic and networks [12].
Optimizing the financial, health and telecommunications sectors shows great potential for
reducing resources consumed, which allows for lower costs and an improvement in the
service offered by these sectors [12]. As for other applications, there will be major
advances in business and technology consultancies that can help their clients visualize
how to apply these emerging technologies to their contexts. This will help companies
from various sectors to better understand the data they have, extract superior intelligence
from it and, consequently, offer better services. The combination of AI and quantum
computing is already a reality beyond academic research, as many companies, from
startups to technology giants, have major initiatives to explore it.
Companies such as Zapata Computing, in the USA, Xanadu, in Canada, and even IBM
have projects, whether internal or in partnership with potential clients and/or industries,
in which the combination of these technologies has been used. An example comes from
the partnership between IBM and Moderna, which came together to combine quantum
computing and AI and develop several treatments based on mRNA, which was the
technology that allowed the emergence of one of the first and most effective vaccines for
COVID-19 during the pandemic [12].
4. Conclusions
From the above, it is quite evident that quantum computing can greatly accelerate the
evolution of Artificial Intelligence, which, by becoming even more powerful, will in turn
contribute to the development of the quantum computers of the future.
REFERENCES
1. ICMCJUNIOR. O que é inteligência artificial? Available on the website
<https://icmcjunior.com.br/inteligencia-artificial/>.
2. INSIGHTS. O que é um algoritmo inteligente? Available on the website
<https://www.portalinsights.com.br/perguntas-frequentes/o-que-e-um-algoritmo-
inteligente>.
3. THIBES, Victoria. Afinal, o que é um algoritmo e o que isso tem a ver com
computação? Available on the website <https://canaltech.com.br/produtos/Afinal-
o-que-e-um-algoritmo-e-o-que-isso-tem-a-ver-com-computacao/>.
4. GOGONI, Ronaldo. O que é software? Available on the website
<https://tecnoblog.net/responde/o-que-e-software/>.
5. ALCOFORADO, Fernando. How artificial intelligence and its intelligent
software and algorithms work. Available on the website
<https://www.linkedin.com/pulse/how-artificial-intelligence-its-softwares-smart-
work-alcoforado-s7cgf>.
6. MUNDO EDUCAÇÃO. Computador quântico. Available on the website
<https://mundoeducacao.uol.com.br/fisica/computador-quantico.htm>.
7. LAPOLA, Marcelo. Como funciona um computador quântico? Físico explica
ciência por trás. Available on the website
<https://revistagalileu.globo.com/colunistas/quanticas/coluna/2023/11/como-
funciona-um-computador-quantico-fisico-explica-ciencia-por-tras.ghtml>.
8. KIDO, Yuzuru. The Present and Future of “Quantum Computers”. Available on
the website <https://social-innovation.hitachi/en/article/quantum-
computing/?utm_campaign=sns&utm_source=li&utm_medium=en_quantum-
computing_230>.
9. MATOS, David. Como a Computação Quântica Vai Revolucionar a Inteligência
Artificial, Machine Learning e Big Data. Available on the website
<https://www.cienciaedados.com/como-a-computacao-quantica-vai-revolucionar-a-
inteligencia-artificial-machine-learning-e-big-data/>.
10. MIT Technology Review. Como a Inteligência Artificial está reinventando o que
os computadores são. Available on the website
<https://mittechreview.com.br/como-a-inteligencia-artificial-esta-reinventando-o-
que-os-computadores-sao/>.
11. SOUZA, Júlia. Com computação quântica, inteligência artificial deve dar salto
gigantesco, dizem especialistas. Available on the website
<https://epocanegocios.globo.com/tecnologia/noticia/2023/04/com-computacao-
quantica-inteligencia-artificial-deve-dar-salto-gigantesco-dizem-
especialistas.ghtml>.
12. AUGUSTO, César. O potencial da combinação entre computação quântica e
inteligência artificial. Available on the website
<https://tiinside.com.br/24/05/2023/o-potencial-da-combinacao-entre-computacao-
quantica-e-inteligencia-artificial/>.
Fernando Alcoforado, awarded the medal of Engineering Merit of the CONFEA / CREA System, member
of the SBPC- Brazilian Society for the Progress of Science, IPB- Polytechnic Institute of Bahia and of the
Bahia Academy of Education, engineer from the UFBA Polytechnic School and doctor in Territorial
Planning and Regional Development from the University of Barcelona, college professor (Engineering,
Economy and Administration) and consultant in the areas of strategic planning, business planning, regional
planning, urban planning and energy systems, was Advisor to the Vice President of Engineering and
Technology at LIGHT S.A. Electric power distribution company from Rio de Janeiro, Strategic Planning
Coordinator of CEPED- Bahia Research and Development Center, Undersecretary of Energy of the State
of Bahia, Secretary of Planning of Salvador, is the author of the books Globalização (Editora Nobel, São
Paulo, 1997), De Collor a FHC- O Brasil e a Nova (Des)ordem Mundial (Editora Nobel, São Paulo, 1998),
Um Projeto para o Brasil (Editora Nobel, São Paulo, 2000), Os condicionantes do desenvolvimento do
Estado da Bahia (Tese de doutorado. Universidade de Barcelona,
http://www.tesisenred.net/handle/10803/1944, 2003), Globalização e Desenvolvimento (Editora
Nobel, São Paulo, 2006), Bahia- Desenvolvimento do Século XVI ao Século XX e Objetivos Estratégicos
na Era Contemporânea (EGBA, Salvador, 2008), The Necessary Conditions of the Economic and Social
Development- The Case of the State of Bahia (VDM Verlag Dr. Müller Aktiengesellschaft & Co. KG,
Saarbrücken, Germany, 2010), Aquecimento Global e Catástrofe Planetária (Viena- Editora e Gráfica,
Santa Cruz do Rio Pardo, São Paulo, 2010), Amazônia Sustentável- Para o progresso do Brasil e combate
ao aquecimento global (Viena- Editora e Gráfica, Santa Cruz do Rio Pardo, São Paulo, 2011), Os Fatores
Condicionantes do Desenvolvimento Econômico e Social (Editora CRV, Curitiba, 2012), Energia no
Mundo e no Brasil- Energia e Mudança Climática Catastrófica no Século XXI (Editora CRV, Curitiba,
2015), As Grandes Revoluções Científicas, Econômicas e Sociais que Mudaram o Mundo (Editora CRV,
Curitiba, 2016), A Invenção de um novo Brasil (Editora CRV, Curitiba, 2017), Esquerda x Direita e a sua
convergência (Associação Baiana de Imprensa, Salvador, 2018), Como inventar o futuro para mudar o
mundo (Editora CRV, Curitiba, 2019), A humanidade ameaçada e as estratégias para sua sobrevivência
(Editora Dialética, São Paulo, 2021), A escalada da ciência e da tecnologia e sua contribuição ao progresso
e à sobrevivência da humanidade (Editora CRV, Curitiba, 2022), a chapter in the book Flood Handbook
(CRC Press, Boca Raton, Florida United States, 2022), How to protect human beings from threats to their
existence and avoid the extinction of humanity (Generis Publishing, Europe, Republic of Moldova,
Chișinău, 2023) and A revolução da educação necessária ao Brasil na era contemporânea (Editora CRV,
Curitiba, 2023).
More Related Content

PDF
HOW ARTIFICIAL INTELLIGENCE AND ITS SOFTWARES AND SMART ALGORITHMS WORK.pdf
PPTX
UNIT 1 IX (1) (2) (2).pptx
PPTX
UNIT 1 IX (1) (2) (3).pptx
PPTX
UNIT 1 IX (1) (1).pptx
PPTX
UNIT 1 IX (1) (2) (1).pptx
PDF
What is artificial intelligence Definition, top 10 types and examples.pdf
PPTX
ARTIFICIAL INTELLLLIGENCEE modul11_AI.pptx
DOCX
artificial intelligence.docx
HOW ARTIFICIAL INTELLIGENCE AND ITS SOFTWARES AND SMART ALGORITHMS WORK.pdf
UNIT 1 IX (1) (2) (2).pptx
UNIT 1 IX (1) (2) (3).pptx
UNIT 1 IX (1) (1).pptx
UNIT 1 IX (1) (2) (1).pptx
What is artificial intelligence Definition, top 10 types and examples.pdf
ARTIFICIAL INTELLLLIGENCEE modul11_AI.pptx
artificial intelligence.docx

Similar to ARTIFICIAL INTELLIGENCE + QUANTUM COMPUTING = GREATEST TECHNOLOGICAL REVOLUTION IN HISTORY.pdf (20)

DOC
AiArtificial Itelligence
DOCX
Understanding Artificial Intelligence A Beginner’s Guide
PDF
How to build an AI app.pdf
PDF
leewayhertz.com-How to build an AI app.pdf
PDF
The advent of artificial super intelligence and its impacts
PDF
Lecture1-Artificial Intelligence.pptx.pdf
PPTX
Artificial Intelligence.pptx
PPTX
AI PROJECT - Created by: Shreya Kumbhar
PDF
How to build an AI app.pdf
PDF
How to build an AI app.pdf
PDF
What Is Artificial Intelligence,How It Is Used and Its Future.pdf
PDF
AI Evolution Beyond Humans _The Age of Machine Superiority.pdf
PDF
AI Evolution Beyond Humans _The Age of Machine Superiority.pdf
PDF
2016 promise-of-ai
PDF
leewayhertz.com-How to build an AI app.pdf
PDF
Artificial Intelligence and Machine Learning
PDF
Artificial intelligence uses in productive systems and impacts on the world...
PDF
Elements of artificial intelligence and usage
PDF
A Study On Artificial Intelligence Technologies And Its Applications
PPT
artificial intelligence
AiArtificial Itelligence
Understanding Artificial Intelligence A Beginner’s Guide
How to build an AI app.pdf
leewayhertz.com-How to build an AI app.pdf
The advent of artificial super intelligence and its impacts
Lecture1-Artificial Intelligence.pptx.pdf
Artificial Intelligence.pptx
AI PROJECT - Created by: Shreya Kumbhar
How to build an AI app.pdf
How to build an AI app.pdf
What Is Artificial Intelligence,How It Is Used and Its Future.pdf
AI Evolution Beyond Humans _The Age of Machine Superiority.pdf
AI Evolution Beyond Humans _The Age of Machine Superiority.pdf
2016 promise-of-ai
leewayhertz.com-How to build an AI app.pdf
Artificial Intelligence and Machine Learning
Artificial intelligence uses in productive systems and impacts on the world...
Elements of artificial intelligence and usage
A Study On Artificial Intelligence Technologies And Its Applications
artificial intelligence
Ad

More from Faga1939 (20)

PDF
DE L'USURPATION DU TERRITOIRE PALESTINIEN PAR ISRAËL AU GÉNOCIDE DANS LA BAND...
PDF
FROM ISRAEL'S USURPATION OF PALESTINIAN TERRITORY TO THE GENOCIDE IN THE GAZA...
PDF
DA USURPAÇÃO DO TERRITÓRIO DA PALESTINA POR ISRAEL AO GENOCÍDIO NA FAIXA DE G...
PDF
L'IMPACT DE L´AUGMENTATION DES TARIFS DOUANIERS DE TRUMP SUR LE BRÉSIL NÉCES...
PDF
O TARIFAÇO DE TRUMP CONTRA O BRASIL EXIGE O FIM DE SUA DEPENDÊNCIA EXTERNA.pdf
PDF
COMMENT FAIRE FACE AUX MENACES DES FORCES DE LA NATURE VENANT DE L'ESPACE.pdf
PDF
HOW TO DEAL WITH THREATS FROM THE FORCES OF NATURE FROM OUTER SPACE.pdf
PDF
COMO ENFRENTAR AS AMEAÇAS DAS FORÇAS DA NATUREZA VINDAS DO ESPAÇO EXTERIOR.pdf
PDF
COMMENT ÉLIMINER LES MALFAITS CONTRE L'HUMANITÉ CAUSÉS PAR LE CAPITALISME À T...
PDF
HOW TO ELIMINATE THE EVILS AGAINST HUMANITY CAUSED BY CAPITALISM THROUGHOUT H...
PDF
COMO ELIMINAR OS MALES CONTRA A HUMANIDADE PROVOCADOS PELO CAPITALISMO AO LO...
PDF
COMMENT FAIRE FACE AUX MENACES DES FORCES DE LA NATURE QUI EXISTENT SUR LA PL...
PDF
HOW TO FACE THREATS FROM THE FORCES OF NATURE EXISTING ON PLANET EARTH.pdf
PDF
COMO ENFRENTAR AS AMEAÇAS DAS FORÇAS DA NATUREZA EXISTENTES NO PLANETA TERRA.pdf
PDF
COMMENT EMPÊCHER LES NÉO-FASCISTES DE RETOURNER AU POUVOIR AU BRÉSIL.pdf
PDF
COMO IMPEDIR A VOLTA DOS NEOFASCISTAS AO PODER NO BRASIL.pdf
PDF
COP 30 ET SES DÉFIS POUR ÉVITER LE RÉCHAUFFEMENT CLIMATIQUE MONDIAL ET UN CHA...
PDF
COP 30 AND ITS CHALLENGES TO AVOID GLOBAL WARMING AND CATASTROPHIC GLOBAL CLI...
PDF
A COP 30 E SEUS DESAFIOS PARA EVITAR O AQUECIMENTO GLOBAL E A MUDANÇA CLIMÁTI...
PDF
THE ENERGY SOURCES THAT HAVE CHANGED THE WORLD THROUGHOUT HISTORY AND THE REQ...
DE L'USURPATION DU TERRITOIRE PALESTINIEN PAR ISRAËL AU GÉNOCIDE DANS LA BAND...
FROM ISRAEL'S USURPATION OF PALESTINIAN TERRITORY TO THE GENOCIDE IN THE GAZA...
DA USURPAÇÃO DO TERRITÓRIO DA PALESTINA POR ISRAEL AO GENOCÍDIO NA FAIXA DE G...
L'IMPACT DE L´AUGMENTATION DES TARIFS DOUANIERS DE TRUMP SUR LE BRÉSIL NÉCES...
O TARIFAÇO DE TRUMP CONTRA O BRASIL EXIGE O FIM DE SUA DEPENDÊNCIA EXTERNA.pdf
COMMENT FAIRE FACE AUX MENACES DES FORCES DE LA NATURE VENANT DE L'ESPACE.pdf
HOW TO DEAL WITH THREATS FROM THE FORCES OF NATURE FROM OUTER SPACE.pdf
COMO ENFRENTAR AS AMEAÇAS DAS FORÇAS DA NATUREZA VINDAS DO ESPAÇO EXTERIOR.pdf
COMMENT ÉLIMINER LES MALFAITS CONTRE L'HUMANITÉ CAUSÉS PAR LE CAPITALISME À T...
HOW TO ELIMINATE THE EVILS AGAINST HUMANITY CAUSED BY CAPITALISM THROUGHOUT H...
COMO ELIMINAR OS MALES CONTRA A HUMANIDADE PROVOCADOS PELO CAPITALISMO AO LO...
COMMENT FAIRE FACE AUX MENACES DES FORCES DE LA NATURE QUI EXISTENT SUR LA PL...
HOW TO FACE THREATS FROM THE FORCES OF NATURE EXISTING ON PLANET EARTH.pdf
COMO ENFRENTAR AS AMEAÇAS DAS FORÇAS DA NATUREZA EXISTENTES NO PLANETA TERRA.pdf
COMMENT EMPÊCHER LES NÉO-FASCISTES DE RETOURNER AU POUVOIR AU BRÉSIL.pdf
COMO IMPEDIR A VOLTA DOS NEOFASCISTAS AO PODER NO BRASIL.pdf
COP 30 ET SES DÉFIS POUR ÉVITER LE RÉCHAUFFEMENT CLIMATIQUE MONDIAL ET UN CHA...
COP 30 AND ITS CHALLENGES TO AVOID GLOBAL WARMING AND CATASTROPHIC GLOBAL CLI...
A COP 30 E SEUS DESAFIOS PARA EVITAR O AQUECIMENTO GLOBAL E A MUDANÇA CLIMÁTI...
THE ENERGY SOURCES THAT HAVE CHANGED THE WORLD THROUGHOUT HISTORY AND THE REQ...
Ad

Recently uploaded (20)

PPTX
Final SEM Unit 1 for mit wpu at pune .pptx
PPTX
Tartificialntelligence_presentation.pptx
PDF
Five Habits of High-Impact Board Members
PDF
WOOl fibre morphology and structure.pdf for textiles
PPTX
The various Industrial Revolutions .pptx
PPTX
Group 1 Presentation -Planning and Decision Making .pptx
PDF
A contest of sentiment analysis: k-nearest neighbor versus neural network
PDF
Univ-Connecticut-ChatGPT-Presentaion.pdf
PDF
Enhancing emotion recognition model for a student engagement use case through...
PPTX
Web Crawler for Trend Tracking Gen Z Insights.pptx
PDF
Getting started with AI Agents and Multi-Agent Systems
PDF
NewMind AI Weekly Chronicles – August ’25 Week III
PPTX
observCloud-Native Containerability and monitoring.pptx
PDF
Getting Started with Data Integration: FME Form 101
PDF
sustainability-14-14877-v2.pddhzftheheeeee
PDF
Unlock new opportunities with location data.pdf
PDF
Assigned Numbers - 2025 - Bluetooth® Document
PDF
Developing a website for English-speaking practice to English as a foreign la...
PPTX
MicrosoftCybserSecurityReferenceArchitecture-April-2025.pptx
PPTX
O2C Customer Invoices to Receipt V15A.pptx
Final SEM Unit 1 for mit wpu at pune .pptx
Tartificialntelligence_presentation.pptx
Five Habits of High-Impact Board Members
WOOl fibre morphology and structure.pdf for textiles
The various Industrial Revolutions .pptx
Group 1 Presentation -Planning and Decision Making .pptx
A contest of sentiment analysis: k-nearest neighbor versus neural network
Univ-Connecticut-ChatGPT-Presentaion.pdf
Enhancing emotion recognition model for a student engagement use case through...
Web Crawler for Trend Tracking Gen Z Insights.pptx
Getting started with AI Agents and Multi-Agent Systems
NewMind AI Weekly Chronicles – August ’25 Week III
observCloud-Native Containerability and monitoring.pptx
Getting Started with Data Integration: FME Form 101
sustainability-14-14877-v2.pddhzftheheeeee
Unlock new opportunities with location data.pdf
Assigned Numbers - 2025 - Bluetooth® Document
Developing a website for English-speaking practice to English as a foreign la...
MicrosoftCybserSecurityReferenceArchitecture-April-2025.pptx
O2C Customer Invoices to Receipt V15A.pptx

ARTIFICIAL INTELLIGENCE + QUANTUM COMPUTING = GREATEST TECHNOLOGICAL REVOLUTION IN HISTORY.pdf

  • 1. 1 ARTIFICIAL INTELLIGENCE + QUANTUM COMPUTING = GREATEST TECHNOLOGICAL REVOLUTION IN HISTORY Fernando Alcoforado* This article aims to demonstrate that the combination of Artificial Intelligence and Quantum Computing will constitute the greatest technological revolution in history. Quantum computing could greatly accelerate the evolution of Artificial Intelligence, which, when it becomes even more powerful, will contribute to the development of quantum computers of the future. This article presents how Artificial Intelligence, Quantum Computing work and what will result from the combination of both. 1. Artificial Intelligence Artificial intelligence (AI) is a computational technology or a set of technologies such as artificial neural networks, algorithms and learning systems whose objective is to imitate human mental capabilities, such as: reasoning, environmental perception and decision- making capacity [1]. The technology is developed with the aim that machines can solve a series of problems, covering everything from the great complexity of government and industry management to the daily tasks of modern men and women. To do this, AI uses sophisticated learning technology, allowing it to learn from a large set of data and act on its own. The general objective of AI is to create machines that can operate at the same level of cognitive capacity as humans, or even surpass them. In recent years, AI has emerged as a transformative force across multiple industries, revolutionizing the way companies conduct business [1]. Artificial Intelligence is based on three technologies [1]: 1. Machine Learning is an application of Artificial Intelligence that provides the computer with the ability to automatically learn and improve from its own experience. Machine learning focuses on developing “software” that can access data and use it to learn from it. The learning process begins with observing data in order to look for statistical patterns and make good decisions based on the examples provided. In this way, the main objective is to make computers learn automatically without human intervention. 2. Deep learning is a subset of Machine learning, essentially being a neural network with three or more layers. These neural networks attempt to simulate the behavior of the human brain although far from matching its capacity allowing the machine to “learn” from the abundance of data. While a single-layer neural network can still make approximate predictions, additional hidden layers can help optimize and refine accuracy. Deep learning drives many AI applications and services that improve automation by performing analytical and physical tasks without human intervention. Deep learning technology is behind everyday products and services (like digital assistants, voice-enabled TV remotes, and credit card fraud detection) as well as emerging technologies (like self-driving cars). 3. Natural language processing (NLP) is a branch of artificial intelligence that helps computers understand, interpret, and manipulate human language. NLP draws on many disciplines, including computer science and computational linguistics, in its quest to bridge the gap between human communication and computer understanding. Algorithms are the essence of any artificial intelligence system that are fed with as much data as possible, as references, so that they can learn better [2] [3]. It is a tool that maps decisions within a system and their possible consequences. 
Intelligent algorithms have the ability and process to filter order and structure. Thus, they autonomously present
  • 2. 2 content that may, according to the rules of the algorithms, have more or less influence, excluding other possible information. In general, an algorithm comprises a finite sequence of executable actions (steps) to solve a problem or, in the most common case in Computer Science, perform a task. The algorithm itself is not the program, but the sequence of actions and conditions that must be obeyed for the problem to be solved. Algorithms are finite sequences of instructions used to solve a problem. For example, when someone accesses a website, algorithms define the path for the page to open correctly. When someone interacts with a link, other algorithms are triggered, indicating what to do [2] [3]. It is important to note that unlike the algorithm, which is a type of process, procedure or set of rules that must be followed to solve any type of calculation, that is, step-by-step instructions that define how the work must be performed in order to obtain the desired outcome, software is a type of system that allows the user to interact with the computer and gives instructions to the computer to perform specific tasks as well as control the functioning of the hardware and its operations. Software is a set of instructions that must be followed and executed by a mechanism, be it a computer or an electromechanical device. Software is the term used to describe programs, apps, scripts, macros and directly embedded code instructions (firmware), in order to dictate what a machine should do. Every computer program, cell phone, tablet, smart TV, video game console, set-top box, etc. it is software, be it a text editor, a browser, an audio or video editor, a game, a streaming app, etc. [4]. The first advantage of using algorithms is task automation [5]. They can analyze a large volume of data, in less time than a person would, for example. Thus, they increase the efficiency of activities. All computer software is made up of algorithms. The evolution of algorithms allows the emergence of new technologies, such as smartphones, smart TVs, new applications and operating systems. With new command possibilities, the algorithms become more improved, and, consequently, new potential uses are developed. Transport and delivery applications, streaming services and movie and music recommendations are provided by systems that work based on algorithms. Algorithms are, therefore, the essence of any artificial intelligence system that are fed with as much data as possible, as references, so that they can learn better [5]. Artificial intelligence promotes the reduction of human error because computers do not make these errors if they are programmed correctly [5]. With Artificial Intelligence, decisions are made based on information previously collected by applying a certain set of algorithms. Thus, errors are reduced and the possibility of achieving accuracy with a greater degree of precision is an achievable possibility. Artificial intelligence takes risks instead of humans. This is one of the biggest advantages of Artificial Intelligence because we can overcome many risk limitations involving human lives by developing an AI robot that can do risky things for us. Among the possibilities, we have going to Mars, defusing a bomb, exploring the deepest parts of the oceans, mining coal and oil and many others [5]. Artificial intelligence helps with repetitive jobs in our day to day work like sending email, checking documents for errors, and much more [5]. 
With artificial intelligence, these tasks can be productively automated and even remove those considered “tiring” for humans and free them to be increasingly creative and productive. Artificial intelligence provides digital assistance to interact with users, which eliminates the need for human resources. Digital assistants are also used on many websites to provide what users want by talking to them about what they are looking for. Some chatbots are designed in such a way that
  • 3. 3 it is difficult to determine whether we are talking to a robot or a human. Artificial intelligence provides faster decisions by making machines make decisions faster than humans. The AI-powered machine works as programmed and will provide results faster. Artificial intelligence drives innovation in almost all areas that will help humans solve most complex problems. 2. Quantum Computing One of the main characteristics of contemporary society is the large-scale use of information technology. The computer, an icon of information technology, connected to a network is changing people's relationship with time and space. Informational networks allow us to expand our ability to think in unimaginable ways. The new technological revolution has expanded human intelligence. We are talking about a technology that allows increasing the storage, processing and analysis of information, making billions of relationships between thousands of data per second: the computer [6]. Current computers are electronic because they are made up of transistors used in electronic chips, that is, in integrated circuits or small microelectronic devices generally made up of millions of components that store, move and process data. This means that transistors have limitations, as there will come a time when it will no longer be possible to reduce the size of one of the smallest and most important components of processors, the transistor [6]. It is important to highlight that it is in this small device, the transistor, that all information is read, interpreted and processed [6]. When dealing with very small scales, Physics stops being as predictable as in macroscopic systems, starting to behave randomly, in a probabilistic way, subject to the properties of Quantum Physics. This means that one of the alternatives of the future is the quantum computer. In these computers, fundamental units of information, called “quantum bits”, are used to solve calculations or simulations that would take processing times that are impractical in electronic computers, such as those currently used [6]. It is important to note that the bit is the smallest unit of information that can be stored or transmitted and that it can only take two values: 0 or 1, true or false, and so on. Each “binary digit” 0 or 1 is therefore known as a bit. Ordinary computers work by reducing numbers and instructions to a binary code – a series of zeros and ones. Technically, these zeros and ones represent whether or not electrical current passes through a device called a transistor. Inside a microprocessor of your cell phone or computer there are billions of transistors that, combined, form what are called logic gates. A conventional computer translates this binary code into physical states, such as on or off, within its hardware. Each “this or that” distinction would function as a way of storing binary data. The sequences of binary numbers are then manipulated through banks of logic gates, printed on silicon chips [7]. Quantum computers work with a logic quite different from that present in electronic computers. Quantum bits can simultaneously present the values 0 and 1, as a result of a quantum phenomenon called quantum superposition [6]. These values represent the binary code of computers and are, in a way, the language understood by machines. 
Quantum computers have proven to be the newest answer in Physics and Computing to problems related to the limited capacity of electronic computers whose processing speed and capacity are closely related to the size of their components. Therefore, its miniaturization is an inevitable process. Quantum computers will not serve the same purposes as electronic computers. Quantum computers are good at working with many variables simultaneously, unlike current computers, which have many limitations when carrying out this type of task. In this way,
  • 4. 4 it is expected that quantum computers can be used to simulate extremely complex systems, such as biological, meteorological, astronomical, molecular systems, etc. The ease of quantum computers in dealing with complex systems is related to the nature of quantum bits. An electronic computer bit can only have the value 0 or 1, while quantum bits can have both values at the same time. In this way, a single quantum bit has a numerical equivalence of 2 electronic bits. This means that, with just 10 quantum bits, we would have a computer with a capacity of 1024 bits (210 = 1024), while most conventional computers today work with 64-bit systems [6]. With a conventional classical computer, if it needed to perform 100 different calculations, it would have to process them one at a time, whereas with a quantum computer, it could perform them all at once. The current situation where we are forced to use classical computers for calculations will change drastically. Supercomputers — the highest class of classical computers — are so large that they take up a large room. The reason is that 100 calculators are lined up to do 100 different calculations at once. In a real supercomputer, more than 100,000 smaller computers are lined up. With the birth of quantum computers, this will no longer be necessary. But, that doesn't mean supercomputers will become unnecessary. They will be used for different purposes, such as smartphones and computers [4]. Like the first digital computers, quantum computing offers the possibility of technologies millions of times more powerful than current systems, but the key to success will be solving real-world problems in quantum language [9]. We are at the limits of the data processing power of traditional computers and data continues to grow. Although Moore's Law, which predicts that the number of transistors in integrated circuits will double every two years, has proven to be extremely consistent since the term was coined in 1965, these transistors are now so small that they could not be manufactured with the technology existing. That's why there's a race among the technology industry's biggest leaders to determine who will be the first to launch a viable quantum computer that would be exponentially more powerful than today's computers to process all the data we generate every day and solve every problem increasingly complex [9]. Current computers have limitations, for example in the area of Artificial Intelligence, where there are no computers with sufficient power or processing speed to support advanced AI [9]. Thus, the need arose to create an alternative computer to the usual ones that could solve AI problems, or others such as the factorization of very large prime numbers, discrete logarithms and simulation of Quantum Physics problems. Quantum computers will enable a range of useful applications, such as modeling variations of chemical reactions to discover new medicines, developing imaging technologies for the healthcare industry to detect problems in the body, or accelerating the way batteries, new materials and flexible electronics are developed. that will make all the difference in data processing. Considering that AI and Machine Learning depend on large data sets (Big Data) to be effective, it is not difficult to imagine the revolution that the quantum computer can bring. 
Many prototypes of quantum computers have already been tested in laboratories around the world, but their large-scale development is still unknown and depends on a lot of research and investment [9]. There are fields in which quantum computers have a great advantage over classical computers, for example, in the areas of chemistry and biotechnology. Material reactions, in principle, involve quantum effects. A quantum computer that used quantum phenomena themselves would enable calculations that could easily incorporate quantum
  • 5. 5 effects and would be very effective in developing materials such as catalysts and polymers. This can lead to the development of new medicines that were previously unviable, thus contributing to improving people's health. Additionally, in the area of finance, for example, as the formulas for options trading are similar to those for quantum phenomena, it is expected that calculations can be performed efficiently on quantum computers [9]. According to the MIT Technology Review [10], Artificial Intelligence is changing the way we think about computing. Computers haven't advanced much in 40 or 50 years, they have become smaller and faster, but they are still mere boxes with processors that carry out human instructions. AI is changing this reality in at least three aspects: 1) the way computers are produced; 2) the way computers are programmed; and, 3) how computers are used. Ultimately, this is a phenomenon that will change the function of computers. The core of computing is moving from number crunching to decision making [10]. The first change concerns how computers and the chips that control them are made [10]. The Deep learning models that make today's AI applications work, require, however, a different approach because they require a large number of less precise calculations to be performed at the same time. This means that a new type of chip is needed that can move data as quickly as possible, ensuring that it is available whenever needed. When Deep learning arrived on the scene about a decade ago, there were already specialized computer chips that were very good at it with graphics processing units (GPUs) designed to display an entire screen's worth of pixels dozens of times per second[10]. The second change concerns how computers are programmed what to do. For the last 40 years, computers have been programmed, and in the next 40 years, they will be trained. Traditionally, in order for a computer to do something like recognize speech or identify objects in an image, programmers have to first create rules for the computer. With Machine learning, programmers no longer dictate the rules. Instead, they create a neural network in which computers learn these rules themselves. The next big advances will come in molecular simulation as training computers to manipulate the properties of matter that can create global changes in energy use, food production, manufacturing and medicine. Deep learning has an amazing track record. Two of the biggest advances of this kind so far are how to make computers behave as if they understand human language and recognize what is in an image and are already changing the way we use them [10]. The third change concerns the fact that a computer no longer needs a keyboard or screen for humans to interact with them. Anything can become a computer. In fact, most household products, from toothbrushes to light switches and doorbells, already have a smart version. As they proliferate, however, so does our desire to spend less time telling them what to do. It's like they should be able to figure out what we need without our interference. This is the shift from number analysis to decision making as a determinant of this new era of computing that envisions computers that tell humans what we need to know and when we need to know it and that help humans when they need them. Now, machines are interacting with people and becoming increasingly integrated into our lives. Computers are already out of their boxes [10]. 3. 
The contribution of Quantum Computing in the development of Artificial Intelligence and vice versa According to experts in the field, quantum computing can greatly accelerate the evolution of artificial intelligence, making it quite powerful [11]. With the relationship between
  • 6. 6 quantum computing and AI, it is possible to have the use of quantum computing capabilities and algorithms in AI tasks, as well as to have "classical" AI systems that are applied to data coming from quantum systems, with the purpose of evolve these systems [12]. A quantum computer has the inherent nature of being probabilistic. Thanks to phenomena such as superposition and entanglement, they are able to encode probability distributions that are much more complex than in classical models, and are even capable of creating data that truly represent the existing probability distribution, and that are not biased, such as happens with traditional models. Therefore, for some, these are the first systems in which the use of quantum computers will prove to be a great advantage [12]. Another interesting question is whether it would be possible to use quantum computing to evolve current AIs, creating new types of AIs that are more intelligent or can do more things [12]. This means that quantum computers are capable of boosting AIs. Any and all sectors that deal with a large volume of data, and data that presents great complexities, would have a lot to gain from the joint work of these two technologies. The problem is that saying this is almost the same as saying "all sectors of modern society" such as the financial, health and telecommunications sectors. In the financial sector, virtually every optimization activity would be improved. It would be possible to optimize capital allocation, assets, portfolio choice and management, among others. In the area of medicine and pharmacy there are some uses such as analysis and precise modeling of molecules for new medicines, selection of groups for clinical trials, hospital resource optimization processes and even the distribution of health services in a given city, for example. In telecommunications, as well as in the financial sector, optimization tasks would be the main focus, such as communication traffic and networks [12]. Optimizing the financial, health and telecommunications sectors shows great potential for reducing resources consumed, which allows for lower costs and an improvement in the service offered by these sectors [12]. As for other applications, there will be major advances in business and technology consultancies that can help their clients visualize how to apply these emerging technologies to their contexts. This will help companies from various sectors to better understand the data they have, extract superior intelligence from it and, consequently, offer better services. The combination of AI and quantum computing is already a reality, in addition to academic research, as many companies, from startups to technology giants, have giant initiatives to explore this combination. Companies such as Zapata Computing, in the USA, Xanadu, in Canada, and even IBM have projects, whether internal or in partnership with potential clients and/or industries, in which the combination of these technologies has been used. An example comes from the partnership between IBM and Moderna, which came together to combine quantum computing and AI and develop several treatments based on mRNA, which was the technology that allowed the emergence of one of the first and most effective vaccines for COVID-19 during the pandemic [12]. 4. 
4. Conclusions
From the above, it is quite evident that quantum computing can greatly accelerate the evolution of Artificial Intelligence, which, by becoming even more powerful, will in turn contribute to the development of the quantum computers of the future.

REFERENCES

1. ICMCJUNIOR. O que é inteligência artificial? Available at: <https://icmcjunior.com.br/inteligencia-artificial/>.
2. INSIGHTS. O que é um algoritmo inteligente? Available at: <https://www.portalinsights.com.br/perguntas-frequentes/o-que-e-um-algoritmo-inteligente>.
3. THIBES, Victoria. Afinal, o que é um algoritmo e o que isso tem a ver com computação? Available at: <https://canaltech.com.br/produtos/Afinal-o-que-e-um-algoritmo-e-o-que-isso-tem-a-ver-com-computacao/>.
4. GOGONI, Ronaldo. O que é software? Available at: <https://tecnoblog.net/responde/o-que-e-software/>.
5. ALCOFORADO, Fernando. How artificial intelligence and its intelligent software and algorithms work. Available at: <https://www.linkedin.com/pulse/how-artificial-intelligence-its-softwares-smart-work-alcoforado-s7cgf>.
6. MUNDO EDUCAÇÃO. Computador quântico. Available at: <https://mundoeducacao.uol.com.br/fisica/computador-quantico.htm>.
7. LAPOLA, Marcelo. Como funciona um computador quântico? Físico explica ciência por trás. Available at: <https://revistagalileu.globo.com/colunistas/quanticas/coluna/2023/11/como-funciona-um-computador-quantico-fisico-explica-ciencia-por-tras.ghtml>.
8. KIDO, Yuzuru. The Present and Future of "Quantum Computers". Available at: <https://social-innovation.hitachi/en/article/quantum-computing/?utm_campaign=sns&utm_source=li&utm_medium=en_quantum-computing_230>.
9. MATOS, David. Como a Computação Quântica Vai Revolucionar a Inteligência Artificial, Machine Learning e Big Data. Available at: <https://www.cienciaedados.com/como-a-computacao-quantica-vai-revolucionar-a-inteligencia-artificial-machine-learning-e-big-data/>.
10. MIT TECHNOLOGY REVIEW. Como a Inteligência Artificial está reinventando o que os computadores são. Available at: <https://mittechreview.com.br/como-a-inteligencia-artificial-esta-reinventando-o-que-os-computadores-sao/>.
11. SOUZA, Júlia. Com computação quântica, inteligência artificial deve dar salto gigantesco, dizem especialistas. Available at: <https://epocanegocios.globo.com/tecnologia/noticia/2023/04/com-computacao-quantica-inteligencia-artificial-deve-dar-salto-gigantesco-dizem-especialistas.ghtml>.
12. AUGUSTO, César. O potencial da combinação entre computação quântica e inteligência artificial. Available at: <https://tiinside.com.br/24/05/2023/o-potencial-da-combinacao-entre-computacao-quantica-e-inteligencia-artificial/>.

* Fernando Alcoforado, awarded the medal of Engineering Merit of the CONFEA/CREA System, member of the SBPC (Brazilian Society for the Progress of Science), of the IPB (Polytechnic Institute of Bahia) and of the Bahia Academy of Education, engineer from the UFBA Polytechnic School and doctor in Territorial Planning and Regional Development from the University of Barcelona, college professor (Engineering, Economics and Administration) and consultant in the areas of strategic planning, business planning, regional planning, urban planning and energy systems, was Advisor to the Vice President of Engineering and Technology at LIGHT S.A.
(electric power distribution company of Rio de Janeiro), Strategic Planning Coordinator of CEPED (Bahia Research and Development Center), Undersecretary of Energy of the State of Bahia and Secretary of Planning of Salvador. He is the author of the books Globalização (Editora Nobel, São Paulo, 1997), De Collor a FHC - O Brasil e a Nova (Des)ordem Mundial (Editora Nobel, São Paulo, 1998), Um Projeto para o Brasil (Editora Nobel, São Paulo, 2000), Os condicionantes do desenvolvimento do Estado da Bahia (doctoral thesis, Universidade de Barcelona, http://www.tesisenred.net/handle/10803/1944, 2003), Globalização e Desenvolvimento (Editora Nobel, São Paulo, 2006), Bahia - Desenvolvimento do Século XVI ao Século XX e Objetivos Estratégicos na Era Contemporânea (EGBA, Salvador, 2008), The Necessary Conditions of the Economic and Social Development - The Case of the State of Bahia (VDM Verlag Dr. Müller Aktiengesellschaft & Co. KG, Saarbrücken, Germany, 2010), Aquecimento Global e Catástrofe Planetária (Viena Editora e Gráfica, Santa Cruz do Rio Pardo, São Paulo, 2010), Amazônia Sustentável - Para o progresso do Brasil e combate ao aquecimento global (Viena Editora e Gráfica, Santa Cruz do Rio Pardo, São Paulo, 2011), Os Fatores Condicionantes do Desenvolvimento Econômico e Social (Editora CRV, Curitiba, 2012), Energia no Mundo e no Brasil - Energia e Mudança Climática Catastrófica no Século XXI (Editora CRV, Curitiba, 2015), As Grandes Revoluções Científicas, Econômicas e Sociais que Mudaram o Mundo (Editora CRV, Curitiba, 2016), A Invenção de um novo Brasil (Editora CRV, Curitiba, 2017), Esquerda x Direita e a sua convergência (Associação Baiana de Imprensa, Salvador, 2018), Como inventar o futuro para mudar o mundo (Editora CRV, Curitiba, 2019), A humanidade ameaçada e as estratégias para sua sobrevivência (Editora Dialética, São Paulo, 2021), A escalada da ciência e da tecnologia e sua contribuição ao progresso e à sobrevivência da humanidade (Editora CRV, Curitiba, 2022), a chapter in the book Flood Handbook (CRC Press, Boca Raton, Florida, United States, 2022), How to protect human beings from threats to their existence and avoid the extinction of humanity (Generis Publishing, Chișinău, Republic of Moldova, 2023) and A revolução da educação necessária ao Brasil na era contemporânea (Editora CRV, Curitiba, 2023).