2. • Do you think that having scientific knowledge gives you power?
• What happens when science becomes the subject of public controversy?
Scientists are mostly well-respected, e.g., Einstein.
3. Common Beliefs
Molecular model of rhinovirus, the virus that causes the common cold and rhinitis (3D illustration). Image credit: Kateryna Kon / Shutterstock
Being cold or damp makes you sick
In reality, colds are caused by viruses, not by
temperature or wet hair. While being chilly might
make you feel uncomfortable, it doesn’t directly lead
to catching a cold unless you’re exposed to the virus.
https://newsnetwork.mayoclinic.org/discussion/mayo-clinic-q-and-a-myths-about-catching-a-cold/
4. Common Beliefs
Sitting too close to the TV strains or permanently damages your eyesight
In reality, sitting close to the TV has no effect on your vision; the distance at which you sit from the screen does not damage your eyes in any way. Sitting close can, however, be a useful signal to parents that a child might be nearsighted: a kid who has trouble seeing the TV will sit closer to see it more clearly.
https://sites.psu.edu/siowfa15/2015/09/15/does-sitting-too-close-to-the-tv-actually-hurt-your-vision/
5. Common Beliefs
If you crack your knuckles, you’ll get arthritis later in life
The sound of cracking knuckles comes from gas bubbles collapsing in the synovial fluid
within the joint, not from anything damaging the bones or cartilage. Studies have shown
no link between knuckle-cracking and arthritis.
One often-cited example is a California physician who reported on an experiment he conducted on himself: over his lifetime, he regularly cracked the knuckles of only one hand. After decades of this behavior he checked X-rays of both hands and found no difference in arthritis between them.
https://www.health.harvard.edu/blog/knuckle-cracking-annoying-and-harmful-or-just-annoying-2018051413797
6. Common Beliefs
Often trust and personal connection are more important than authority or
expertise.
Even when there is connection and trust among people, worldviews, ideologies, religion and reference groups greatly shape how we relate to new scientific knowledge we encounter.
People who teach science in schools usually assume that someone who studies more science
will understand science better and therefore will agree with what scientists say.
In science communication there are many documented cases of people using scientific terms and ideas they have learned to support positions that contradict what nearly all scientists think. It is therefore important to understand not just what people know about science, but how they use their scientific knowledge to decide whom to believe.
This matters when there is scientific consensus, and just as much when the experts cannot agree among themselves and the public has to decide on its own.
7. Science Communication Models
The Deficit Model
This model assumes that if the public is given enough scientific information, they will understand and
support science, leading to better decisions and appreciation of technology. However, research shows this
approach doesn’t effectively change attitudes or behaviors—simply providing facts isn’t enough to bridge
the gap between science and the public.
The Contextual Model
This model improves on the deficit approach by considering the context in which information is shared,
such as timing (e.g., vaccine info during COVID-19 news cycles), accessibility, and public trust in scientists.
While it’s more effective at spreading knowledge, it still doesn’t tackle the deeper reasons for disconnects
between science and various public groups.
The Dialogic Model
This model addresses some root causes of science-society gaps by promoting two-way communication. It
recognizes that people’s views on science (e.g., vaccine decisions) involve more than just trust in facts—
they include economic, logistical, or personal factors. The scientific community must engage in open
dialogue with diverse publics, understanding their needs and concerns to build trust, which is the model’s
core goal.
The Participation Model
This newer model involves the public directly in science through citizen science (where people contribute
to or start research) and policymaking (e.g., influencing animal testing rules or funding). It emphasizes
active public involvement rather than just receiving information, fostering collaboration between science
and society.
Interrelationships Between Models
Researchers suggest these models can overlap or work together depending on the situation: one group might need dialogue, another participation. They're not fixed categories but part of a flexible continuum.
8. Science Communication Models
Changing from a deficit model to a dialogue model.
• The deficit model focuses on increasing knowledge on a subject only by disseminating more facts
to the public.
• The dialogue model gives the public an active role in science communication and allows scientists
to take into account value systems when discussing science with different audiences.
Courchamp et al., 2016
9. The “problem” and the “blame”
Models are “frameworks for understanding what the ‘problem’ is, how to measure the problem, and how
to address the problem”;
the ‘problem’ being
• the public's understanding of and relationship with science, or
• scientists' lack of understanding of, or relationship with, the public.
Regardless, both these conceptions generate ‘blame’ models:
• the public may be blamed for not understanding and appreciating science;
• scientists may be blamed for not understanding the public, not trying to build positive relationships with the public, or not attempting to engage with it.
In some cases, both scientists and the public are blamed for not engaging with each other, and with others, to try to solve important societal issues more deliberately.
In other words, instead of the models being helpful for explaining and informing science communication practice, they can be used to criticize the efforts of practitioners.
10. Which science communication objectives are most important to you?
• Education: Enhance the public’s understanding of science to be able to make informed decisions
• Defense: Contradict science misinformation, disinformation, and fake news
• Popularization: Inform the public about science and distribute scientific content
• Popularization: Make scientific content accessible
• Promotion: Excite the public about science and increase appreciation for science
• Promotion: Gain the public’s support and government funding for science
• Contextualization: Tailor messages to specific audiences and build trust
• Engagement: Enable diverse publics to discuss science issues
• Consultation: Stimulate the public to be involved in public science discourse, express concerns and
raise questions that stem from science and its applications, gain lay knowledge
• Deliberation: Foster the public's role in setting the agenda for science by actively deliberating in public debates on the “why” and “why not” of science, as part of democratic policymaking
• Critique: Acknowledge public critique of the priorities of the science research enterprise, and strive to maximize the societal returns from investments in science for the larger social good
• Collaboration: Encourage co-creation of knowledge, and enable the public to participate in
research endeavors
11. De-jargonization
In science communication, breaking down jargon while still communicating the
importance of research is incredibly difficult.
For most scientists, jargon is a part of their everyday language and they forget that
the public does not understand certain terms.
For a lay person, hearing and not understanding scientific jargon is frustrating, and
often leads to confusion on a subject.
Training scientists to use language carefully, and to understand and relate to the values of any audience, will accelerate our ability to create a dialogue with the general public.
https://scienceandpublic.com/de-jargonizer
https://xkcd.com/simplewriter/ Gretchen Kroh, 2019
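The core idea behind such tools can be illustrated in a few lines of code: flag every word in a text that does not appear in a list of common, everyday words. The sketch below is only an illustration of that idea, not the De-Jargonizer's actual algorithm; the word-list file and the sample sentence are hypothetical placeholders.

```python
import re

# Hypothetical input: a plain-text file with one common, everyday word per line
# (any general word-frequency list would do for this sketch).
with open("common_words.txt", encoding="utf-8") as f:
    common_words = {line.strip().lower() for line in f if line.strip()}

def flag_jargon(text):
    """Return the words in `text` that are absent from the common-word list."""
    words = re.findall(r"[a-z]+", text.lower())
    return sorted({w for w in words if w not in common_words})

sample = "CRISPR-mediated knockouts were validated by qPCR and immunoblotting."
print(flag_jargon(sample))  # terms like 'crispr', 'qpcr', 'immunoblotting' would likely be flagged
```

A real tool rates words by their frequency in a large corpus rather than using a simple in/out list, but the communication lesson is the same: the flagged words are the ones a lay audience is least likely to know.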
12. The ozone layer hole is shrinking
NASA's Aura satellite observations confirm that the ozone hole is shrinking
• The first studies on the harmful effects of CFC gases on the ozone layer were conducted in 1973
by two chemists, Frank Sherwood Rowland and Mario Molina. However, since CFC gases were a
major part of a vast industry, convincing people about their dangers was not easy.
• In the 1980s, concerns over the ozone hole dominated news due to its link to increased UV
radiation, which causes skin cancer and other health issues.
• Ozone is a protective gas in the upper atmosphere that absorbs harmful UV radiation. However,
CFC (chlorofluorocarbon) gases used in industries broke down ozone molecules, leading to
ozone depletion.
• Ultraviolet (UV) radiation from the Sun breaks down the CFC compound, releasing chlorine atoms, which then interact with ozone molecules, breaking them apart and depleting the ozone layer (a textbook version of this reaction cycle is sketched after this list).
• The ozone hole over Antarctica was discovered in 1985, with NASA satellite data confirming its
extent.
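As a reference for the bullets above, the standard textbook form of this catalytic cycle (using CFC-11, CFCl3, as the example compound) is:

```latex
\begin{align*}
\mathrm{CFCl_3} + h\nu &\rightarrow \mathrm{CFCl_2} + \mathrm{Cl} \\
\mathrm{Cl} + \mathrm{O_3} &\rightarrow \mathrm{ClO} + \mathrm{O_2} \\
\mathrm{ClO} + \mathrm{O} &\rightarrow \mathrm{Cl} + \mathrm{O_2} \\
\text{net: } \mathrm{O_3} + \mathrm{O} &\rightarrow 2\,\mathrm{O_2}
\end{align*}
```

Because the chlorine atom is regenerated in the last step, a single atom can destroy many ozone molecules before it is eventually removed from the stratosphere.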
13. The ozone layer hole is shrinking
• The Montreal Protocol was signed in 1987 and came into force in 1989, banning
CFC production. Over time, CFCs were replaced with safer alternatives like HFCs.
• NASA's recent data show about a 20% decrease in ozone depletion since 2005, with declining chlorine levels proving the effectiveness of the CFC ban.
• The ozone hole still exists but is expected to fully close between 2060 and 2080 if
progress continues.
• The Montreal Protocol is seen as a successful global environmental agreement,
unlike the Kyoto Protocol for climate change, highlighting the importance of
public awareness and governmental action.
15. Controversy
Socio-scientific issues are open-ended problems with no single solution; often several
potential solutions may apply.
Scientific principles, theories and data help formulate ways of addressing such issues, but they are not enough, partly because of additional aspects, such as social, political, economic and ethical considerations, on which scientific knowledge has little bearing, if any.
Controversy is an integral part of the social and political context of science, and is
important to scientific progress; avoiding it would be nearly impossible, as well as
counterproductive.
Example: the Paris Agreement on global warming, which the USA and China oppose while most others agree. Communities and countries that are exposed to the dangers of climate change, such as those living in areas prone to desertification, have a specific interest in curbing the carbon emissions that cause global warming, while some stakeholders, such as the fossil fuel industry (coal, petroleum, natural gas), would be handicapped by such measures.
16. In an effort to uphold professional journalistic standards, reporters sometimes present the climate
change issue to the public showing both the findings of the scientists, who attribute it to human activity,
and the claims of those who doubt them. Is this a problem?
A) No, because journalists should always present all aspects of an issue.
B) Yes, because presenting multiple perspectives on climate change is tedious and the public may
become bored with the subject.
C) Yes, because it creates the false impression that there is significant controversy in the scientific
community about climate change.
D) Yes, because it overemphasizes a single topic, such as climate change, drawing attention away
from other issues on the public agenda.
17. Paradox of Choice
Faced with a wide range of choices, we feel frustrated and unsatisfied.
Our tendency in this situation is
• not to choose at all, or
• to choose according to prior beliefs or ideas.
When overwhelmed with information, we
find it much harder to select the relevant
facts and make informed decisions.
18. Bounded Rationality
When faced with uncertainty in decision-making we like to believe our choices are purely
rational.
However, this is often not the case. One of the key theories in judgment and decision-making is
bounded rationality (Herbert A. Simon, 1957).
Ideal conditions for making optimal decisions are rarely available. Because of environmental constraints, time or cost limitations, and cognitive biases, we instead rely on mental shortcuts known as heuristics.
The benefit of this approach is efficiency—it allows for quick decisions that are often
satisfactory.
Suppose we are choosing a cellphone to buy; we may compare the situation to past experiences:
Have we purchased this brand before?
Did I like it?
Did I love it?
Was it reliable?
Another heuristic involves the authority of the
person making the recommendation:
Do they have expertise in technology?
19. Inoculation Theory
Inoculation theory is a social psychological communication theory that explains how an attitude
or belief can be protected against persuasion or influence in much the same way a body can be
protected against disease…
20. According to inoculation theory, if people are told ahead of time that they might hear certain false
information, this
A) confuses them, and is best avoided
B) makes them less likely to believe the false claim and to spread it further (correct)
C) strengthens their belief in the false claim
D) leads them to doubt the person who forewarned them
22. Misinformation
Misinformation can do damage
Misinformation, whether accidental or intentional (disinformation), can cause significant harm. It's
crucial to protect people by making them resilient to misinformation or by debunking it after
exposure.
Misinformation can be sticky!
Fact-checking can reduce belief in false information, but misinformation often continues to
influence people even after correction—this is the "continued influence effect." Effective debunking
is crucial to minimize its impact.
Prevent misinformation from sticking if you can
Misinformation is best prevented through inoculation, explaining manipulative strategies to make
people resistant to future misinformation. This method works best before exposure.
Debunk often and properly
If preemptive measures fail, debunking is necessary. For
effective debunking, provide detailed explanations of why the
information is false and what is true to help "unstick" the
misinformation.
24. FACT: State the truth first
• If it’s easy to do in a few clear words, state what is true first. This allows you to frame the message—you lead with
your talking points, not someone else’s.
• The best corrections are as prominent (in the headlines, not buried in questions) as the misinformation.
• Do not rely on a simple retraction (“this claim is not true”).
• Providing a factual alternative, that is, an alternative that fills the causal “gap” in explaining what happened once the misinformation is corrected, is an effective method of debunking.
MYTH: Point to misinformation
• Repeat the misinformation, only once, directly prior to the correction. One repetition of the myth is beneficial to belief
updating.
• Needless repetitions of the misinformation should be avoided: We know that repetition makes information appear true.
• Corrections are most successful if people are suspicious, or made to be suspicious, of the source or intent of the misinformation.
FALLACY: Explain why misinformation is wrong
Rather than only stating that the misinformation is false, it is beneficial to provide details as to why. Explain
(1) why the mistaken information was thought to be correct in the first place (2) why it is now clear it is wrong (3) why the
alternative is correct.
It is important for people to see the inconsistency in order to resolve it.
FACT: State the truth again
Restate the fact again, so the fact is the last thing people process.
Even with detailed refutations, the effects will wear off over time, so be prepared to debunk repeatedly!
26. Familiarity backfire effect
Repetition makes information more familiar, and familiar information is generally perceived to be more
truthful than novel information.
Because a myth is necessarily repeated when it is debunked, the risk arises that debunking may backfire
by making a myth more familiar.
Early evidence was supportive of this idea, but more recently, exhaustive experimental attempts to induce
a backfire effect through familiarity alone have come up empty.
Thus, while repeating misinformation generally increases familiarity and truth ratings, repeating a myth
while refuting it has been found to be safe in many circumstances, and can even make the correction
more salient and effective.
27. Wardle & Derakhshan 2018 propose a system whereby misinformation and disinformation can
be divided into seven categories, depending on the intent of the disseminator and on the
recipients’ interpretation:
1. Satire or parody, which may be misconstrued as truth;
2. False connection: visuals and headlines that create an impression not supported by the
content (for example, a clickbait headline reading The Secret Trick to Weight Loss, whereas the
article contains no secret);
3. Misleading content: texts or visuals framed in such a way as to misrepresent their original
meaning (such as a distorted graph or a quote that is missing a crucial part);
4. False context: genuine content taken out of its original context (such as a photograph of a
past event with a caption identifying it as a more recent one);
5. Imposter content: information with false credentials (such as a fictitious statement carrying
the signature or logo of an official agency);
6. Manipulated content: genuine content deliberately altered (such as a video slowed down to
create the impression that the speaker is inebriated);
7. Fabricated content (text or images that are purely fictitious).
30. Stellar Distances
1 light year ≈ 9.5 trillion km
A side view of Orion (which we can only imagine seeing) would show that some of the stars are very far apart
from each other, and not really grouped at all. This depiction shows that some stars are a hundred light years
apart.
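For reference, the 9.5-trillion-km figure follows directly from the definition: a light year is the distance light travels in one year.

```latex
1~\text{ly} = c \times 1~\text{yr}
\approx 2.998\times10^{5}~\tfrac{\text{km}}{\text{s}} \times 3.156\times10^{7}~\text{s}
\approx 9.46\times10^{12}~\text{km} \approx 9.5~\text{trillion km}
```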
32. Planets Retrograde
Mercury isn't the only planet that retrogrades; every planet appears to 'step back' at times.
When a planet appears to move backward in the sky, it's in "retrograde." This isn't because
the planet is really moving backward—it’s an optical illusion.
Imagine you're in a car passing another car on the highway. For a moment, it might seem like
the other car is moving backward relative to you, even though both cars are moving forward.
That’s similar to what happens with retrograde motion.
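A toy model makes the geometry concrete. The sketch below is an illustration under simplifying assumptions (circular, coplanar orbits for Earth and Mars with round-number periods), not an ephemeris: it tracks the direction in which Mars appears to sit against the background stars as seen from Earth, and reports when that apparent direction drifts backward even though both planets always move forward along their orbits.

```python
import math

R_MARS = 1.524                   # Mars's orbital radius in AU (circular-orbit assumption)
P_EARTH, P_MARS = 365.25, 687.0  # orbital periods in days

def apparent_longitude(day):
    """Direction of Mars as seen from Earth (radians), for circular coplanar orbits."""
    a_e = 2 * math.pi * day / P_EARTH                        # Earth's orbital angle
    a_m = 2 * math.pi * day / P_MARS                         # Mars's orbital angle
    ex, ey = math.cos(a_e), math.sin(a_e)                    # Earth's position (AU)
    mx, my = R_MARS * math.cos(a_m), R_MARS * math.sin(a_m)  # Mars's position (AU)
    return math.atan2(my - ey, mx - ex)

prev, retro = apparent_longitude(0), False
for day in range(1, 1200):
    lon = apparent_longitude(day)
    step = (lon - prev + math.pi) % (2 * math.pi) - math.pi  # daily change, wrapped to (-pi, pi]
    if step < 0 and not retro:
        print(f"apparent retrograde begins around day {day}")
    elif step > 0 and retro:
        print(f"apparent retrograde ends around day {day}")
    retro, prev = step < 0, lon
```

With these numbers the backward drift recurs roughly every 780 days (the Earth–Mars synodic period), matching the roughly 26-month rhythm of Mars retrogrades in the real sky.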
33. In our study, we argue that the time, effort, and profit losses
incurred by organizations through these personality
assessments are ultimately wasted. Instead, organizations
could save money and achieve better results by providing
astrology knowledge or training to their HR departments. By
doing so, organizations would not only be able to select the
most suitable candidates during recruitment but also place
existing staff in positions that align with their personalities,
thereby increasing productivity. Companies that aim to
foster organizational commitment can easily achieve this by
hiring people from water and earth signs during
recruitment, as Cancer, Pisces, Scorpio, Taurus, Capricorn,
and Virgo employees will remain loyal and responsible in
their roles for many years.
This study examines whether the concepts of personality
and organizational commitment are related to astrological
personality traits.
The survey results led to the following conclusions:
• There is a relationship between zodiac signs and
personality traits.
• Employees from the water signs (Cancer, Scorpio, Pisces)
primarily exhibit emotional commitment.
• Employees from the earth signs (Taurus, Virgo, Capricorn)
show higher continuity commitment.
• Employees from the air signs (Gemini, Libra, Aquarius) do
not demonstrate organizational commitment.
• Employees from the fire signs (Aries, Leo, Sagittarius) have
higher continuity commitment.
34. • Why does astrology today have such a large following and such firmly convinced believers?
35. Science denial in high places
Powerful individuals and organizations have various reasons for rejecting scientific evidence.
In the 17th century, the physicist Galileo Galilei publicly advocated for the Copernican astronomical model but faced strong ideological opposition from the Catholic Church, which labeled his theories heretical.
When the Church put Galileo on
trial, it primarily defended its
actions by asserting that its
interpretation of the scriptures
was the sole truth. In contrast,
modern science deniers often
use scientific or pseudo-scientific
evidence to back their claims,
making it challenging for the
public to identify the flaws in
their arguments.
36. Climate Change
https://www.nasa.gov/technology/nasa-finds-drought-in-eastern-mediterranean-worst-of-past-900-years
Between the years 1100 and 2012, the
team found droughts in the tree-ring
record that corresponded to those
described in historical documents written
at the time.
According to Cook, the range of how
extreme wet or dry periods were is quite
broad, but the recent drought in the
Levant region, from 1998 to 2012, stands
out as about 50 percent drier than the
driest period in the past 500 years, and
10 to 20 percent drier than the worst
drought of the past 900 years.
“If we look at recent events and we start to
see anomalies that are outside this range of
natural variability, then we can say with
some confidence that it looks like this
particular event or this series of events had
some kind of human caused climate change
contribution,” he said.
40. Assignment 1:
Create 2 different shorts/reels
• Explain your field/department/research
• Find a topic of your choice and make a 1.5-minute reel
• Imagine you are explaining your science to a 12-year-old
• Interview a scientist/academic/another student
• Interview another person who will explain their
research or science, etc.
Conditions and rewards:
• Can be in Turkish or English
• You will collaborate with the @metuscicomm Instagram
page
• You can use graphic design, motion graphics, AI, etc.
• Showing your face is optional, but the voice must be yours. You can do a voice-over on a video that you make and own.
• Write a good caption and hashtags to attract attention.
• Add your student ID, tag @metuscicomm and
@metu_odtu
• The top 10 reels will be published on the official @metu_odtu Instagram page.
• Every 50,000 views earns you 5 points toward the final grade.
Editor's Notes
#6: For example, regarding climate change, evolution or vaccination. The assumption is that if people don't agree with what scientists say, then they were probably not taught well enough.