How Does Google Work?
Crawling & Indexing
The journey of a query starts before you
ever type a search, with crawling and
indexing the web of trillions of documents.
How Search Works
These processes lay the foundation — they're how we
gather and organize information on the web so we can
return the most useful results to you. Our index is well
over 100,000,000 gigabytes, and we’ve spent over one
million computing hours to build it. Learn more about
the basics in this short video.
Finding information by crawling
We use software known as “web crawlers” to discover publicly available
webpages. The most well-known crawler is called “Googlebot.”
Crawlers look at webpages and follow links on those pages, much like
you would if you were browsing content on the web. They go from link
to link and bring data about those webpages back to Google’s servers.
The crawl process begins with a list of web addresses from past crawls
and sitemaps provided by website owners. As our crawlers visit these
websites, they look for links to other pages to visit.
The software pays special attention to new sites, changes to existing sites
and dead links.
Computer programs determine which sites to crawl, how often, and how
many pages to fetch from each site.
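The link-following process described above can be sketched as a breadth-first traversal. This is only an illustrative toy: the in-memory `TOY_WEB` dictionary stands in for real HTTP fetches, and the domain names are made up.

```python
from collections import deque

# A toy "web": each address maps to the links found on that page.
# (An assumption for illustration; a real crawler fetches pages over HTTP.)
TOY_WEB = {
    "a.example": ["b.example", "c.example"],
    "b.example": ["c.example", "d.example"],
    "c.example": [],
    "d.example": ["a.example"],
}

def crawl(seed_urls, web):
    """Follow links breadth-first from a seed list, visiting each page once."""
    frontier = deque(seed_urls)   # addresses from past crawls / sitemaps
    visited = set()
    while frontier:
        url = frontier.popleft()
        if url in visited or url not in web:
            continue
        visited.add(url)
        for link in web[url]:     # follow links to discover new pages
            if link not in visited:
                frontier.append(link)
    return visited

print(sorted(crawl(["a.example"], TOY_WEB)))
```

Starting from a single seed, the crawler discovers every page reachable by links, which is the essence of how a seed list plus sitemaps can grow into a crawl of the whole web.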
Google doesn't accept payment to crawl a site more frequently for our
web search results.
We care more about having the best possible results because in the long
run that’s what’s best for users and, therefore, our business.
Organizing information by indexing
The web is like an ever-growing public library with billions of books
and no central filing system. Google essentially gathers the pages
during the crawl process and then creates an index, so we know exactly
how to look things up.
Much like the index in the back of a book, the Google index includes
information about words and their locations. When you search, at the
most basic level, our algorithms look up your search terms in the index
to find the appropriate pages.
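The book-index analogy above maps directly onto a classic inverted index: for each word, record which pages contain it and where. A minimal sketch, with made-up page names:

```python
def build_index(pages):
    """Map each word to the pages (and word positions) where it occurs."""
    index = {}
    for url, text in pages.items():
        for pos, word in enumerate(text.lower().split()):
            index.setdefault(word, {}).setdefault(url, []).append(pos)
    return index

# Hypothetical crawled pages, for illustration only.
pages = {
    "breeds.example": "list of dog breeds with pictures",
    "care.example": "how to care for your dog",
}
index = build_index(pages)
print(index["dog"])  # which pages contain "dog", and at what positions
```

Looking up a search term then becomes a dictionary access instead of a scan of every page, which is what makes searching a very large index feasible.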
The search process gets much more complex from there. When you
search for “dogs” you don’t want a page with the word “dogs” on it
hundreds of times. You probably want pictures, videos or a list of breeds.
Google’s indexing systems note many different aspects of pages, such as
when they were published, whether they contain pictures and videos, and
much more. With the Knowledge Graph, we’re continuing to go beyond
keyword matching to better understand the people, places and things you
care about.
Choice for website owners
Most websites don’t need to set up restrictions for crawling, indexing or
serving, so their pages are eligible to appear in search results without
having to do any extra work. That said, site owners have many choices
about how Google crawls and indexes their sites through Webmaster
Tools and a file called “robots.txt”.
With the robots.txt file, site owners can choose not to be crawled by
Googlebot, or they can provide more specific instructions about how to
process pages on their sites.
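Python's standard library can parse the robots.txt rules described above. The file contents and URLs below are hypothetical, but `urllib.robotparser` is the real stdlib module:

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt: Googlebot may crawl everything except
# /private/, while all other crawlers are disallowed entirely.
ROBOTS_TXT = """\
User-agent: Googlebot
Disallow: /private/

User-agent: *
Disallow: /
"""

rp = RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

print(rp.can_fetch("Googlebot", "http://example.com/page.html"))  # True
print(rp.can_fetch("Googlebot", "http://example.com/private/x"))  # False
```

A well-behaved crawler checks `can_fetch` before requesting each URL, which is exactly the choice robots.txt gives site owners.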
Site owners have granular choices and can choose how content is
indexed on a page-by-page basis. For example, they can opt to have
their pages appear without a snippet (the summary of the page shown
below the title in search results) or a cached version (an alternate
version stored on Google’s servers in case the live page is unavailable).
Webmasters can also choose to integrate search into their own pages
with Custom Search.
Algorithms
You want the answer, not trillions of webpages.
Algorithms are computer programs that look for clues
to give you back exactly what you want.
For a typical query, there are thousands, if not millions, of webpages
with helpful information. Algorithms are the computer processes and
formulas that take your questions and turn them into answers. Today
Google’s algorithms rely on more than 200 unique signals or “clues” that
make it possible to guess what you might really be looking for. These
signals include things like the terms on websites, the freshness of
content, your region, and PageRank.
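One of the signals named above, PageRank, can be illustrated with the classic power-iteration idea: a page's rank is repeatedly redistributed along its outgoing links. This simplified sketch (toy link graph, standard 0.85 damping factor) is not Google's production algorithm:

```python
def pagerank(links, damping=0.85, iterations=50):
    """Iteratively redistribute rank along links (simplified PageRank)."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new = {p: (1.0 - damping) / len(pages) for p in pages}
        for page, outlinks in links.items():
            if outlinks:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new[target] += share
            else:  # dangling page: spread its rank evenly
                for p in pages:
                    new[p] += damping * rank[page] / len(pages)
        rank = new
    return rank

# Toy graph: "a" is linked to by both "b" and "c", so it ends up ranked highest.
links = {"a": ["b"], "b": ["a", "c"], "c": ["a"]}
ranks = pagerank(links)
print(max(ranks, key=ranks.get))
```

In a real ranking system this would be just one of the 200-plus signals combined to score each page.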
Search Projects
There are many components to the search process and the results page,
and we’re constantly updating our technologies and systems to deliver
better results. Many of these changes involve exciting new innovations,
such as the Knowledge Graph or Google Instant. There are other
important systems that we constantly tune and refine. This list of
projects provides a glimpse into the many different aspects of search.
The Evolution of Search
Our goal is to get you to the answer you're looking for faster, creating a
nearly seamless connection between you and the knowledge you seek. If
you’re looking to deepen your understanding of how search has evolved,
this video highlights some important features like universal results and
quick answers.
Experiments: From Idea to Launch
A typical algorithmic change begins as an idea from one of our
engineers about how to improve search. We take a data-driven
approach and all proposed algorithm changes undergo extensive quality
evaluation before release.
Engineers typically start by running a series of experiments, tweaking
small variables and getting feedback from colleagues until they are
satisfied and ready to release the experiment to a larger audience.
Fighting Spam
Every day, millions of useless spam pages are created. We fight spam
through a combination of computer algorithms and manual review.
Spam sites attempt to game their way to the top of search results
through techniques like repeating keywords over and over, buying links
that pass PageRank, or putting invisible text on the screen. This is bad
for search because relevant websites get buried, and it’s bad for
legitimate website owners because their sites become harder to find.
The good news is that Google's algorithms can detect the vast majority
of spam and demote it automatically. For the rest, we have teams who
manually review sites.
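A crude version of detecting the keyword repetition mentioned above is a keyword-density check. The threshold and sample text are invented for illustration; real spam classifiers weigh many signals together:

```python
def keyword_density(text, keyword):
    """Fraction of the words on a page that are one given keyword."""
    words = text.lower().split()
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

def looks_stuffed(text, keyword, threshold=0.3):
    """Flag pages where a single keyword dominates the text.

    The 0.3 threshold is a hypothetical cutoff, not a real-world value.
    """
    return keyword_density(text, keyword) > threshold

spam = "dogs dogs dogs dogs buy dogs dogs cheap dogs dogs"
print(looks_stuffed(spam, "dogs"))  # True: 8 of 10 words are "dogs"
```

Automated checks like this can demote the bulk of obvious spam, leaving only the subtler cases for manual review.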
Identifying Spam
Spam sites come in all shapes and sizes. Some sites are automatically
generated gibberish that no human could make sense of. Of course, we
also see sites using subtler spam techniques. Check out these examples
of “pure spam,” which are sites using the most aggressive spam
techniques. This is a stream of live spam screenshots that we’ve
manually identified and recently removed from appearing in search
results.
Taking Action
While our algorithms address the vast majority of spam, we address
other spam manually to prevent it from affecting the quality of your
results. This graph shows the number of domains that have been affected
by a manual action over time and is broken down by the different spam
types. The numbers may look large out of context, but the web is a really
big place. A recent snapshot of our index showed that about 0.22% of
domains had been manually marked for removal.
Manual Action by Month
Notifying Website Owners
When we take manual action on a website, we try to alert the site's
owner to help him or her address issues. We want website owners to
have the information they need to get their sites in shape. That’s why,
over time, we’ve invested substantial resources in webmaster
communication and outreach. The following graph shows the number of
spam notifications sent to site owners through Webmaster Tools.
Messages by Month
Listening for Feedback
Manual actions don’t last forever. Once a website owner cleans up her
site to remove spammy content, she can ask us to review the site again
by filing a reconsideration request. We process all of the reconsideration
requests we receive and communicate along the way to let site owners
know how it's going.
Historically, most sites that have submitted reconsideration requests are
not actually affected by any manual spam action. Often these sites are
simply experiencing the natural ebb and flow of online traffic, an
algorithmic change, or perhaps a technical problem preventing Google
from accessing site content. This chart shows the weekly volume of
reconsideration requests since 2006.
Reconsideration Requests by Week
Policies
We care deeply about the information you find on Google. We strive for
a consistent approach that puts users first.
We want to organize the world’s information. But what about malware?
What about credit card numbers? There are many tricky issues we think
about on a daily basis. Here you’ll find a list of policies organized around
particular topic areas. We’re starting with policies related primarily to
content removals, but this is a living document and we plan to update
over time. We’d love to get your feedback and suggestions.
Access to Information Comes First
We believe in free expression and the free flow of
information. We try hard to make information available
except for narrowly defined cases like spam, malware,
legal requirements and preventing identity theft.
Algorithms Over Manual Action
The relevance and comprehensiveness of our search
results are central to helping you find what you’re looking for.
We prefer machine solutions to manually organizing information.
Algorithms are scalable, so when we make an improvement, it makes
things better not just for one search results page, but for thousands or
millions. However, there are certain cases where we utilize manual
controls when machine solutions aren’t enough. Learn more about
Algorithms.
Exceptions Lists
Like most search engines, in some cases, our algorithms
falsely identify sites and we make limited exceptions to
improve our search quality. For example, our SafeSearch
algorithms are designed to protect children from adult content online.
When one of these algorithms misidentifies websites (for example
essex.edu) we sometimes make manual exceptions to prevent these
sites from being classified as pornography.
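The exceptions-list idea above (the essex.edu example from the text) can be sketched as a manual allowlist that overrides a classifier's false positives. The naive substring classifier here is a deliberately simplistic stand-in, not how any real adult-content classifier works:

```python
# Manually verified safe sites that the classifier gets wrong.
EXCEPTIONS = {"essex.edu"}

def naive_classifier(domain):
    """Stand-in classifier: falsely flags any domain containing 'sex'."""
    return "sex" in domain

def is_adult(domain):
    """Apply the classifier, but let manual exceptions win."""
    if domain in EXCEPTIONS:
        return False
    return naive_classifier(domain)

print(is_adult("essex.edu"))  # False, despite the substring match
```

Keeping exceptions in a small, human-curated list lets the scalable algorithm handle the common case while a handful of manual entries patch its known mistakes.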
Fighting Spam and Malware
We hate spam as much as you do. It hurts users by
cluttering search results with irrelevant links. We have
teams that work to detect spammy websites and remove
them from our results. The same goes for phishing websites and
malware. Learn more about Fighting Spam.
Preventing Identity Theft
Upon request, we’ll remove personal information
from search results if we believe it could make you
susceptible to specific harm, such as identity theft or
financial fraud. We generally don’t process removals of
national ID numbers from official government websites because in
those cases we consider the information to be public. We sometimes
refuse requests if we believe someone is attempting to abuse these
policies to remove other information from our results.
Fighting Child Exploitation
We block search results that lead to child sexual
abuse imagery. This is a legal requirement and
the right thing to do.
Shocking Content
We want to make sure information is available
when you’re looking for it, but we also want to
be careful not to show potentially upsetting
content when you haven’t asked for it. Accordingly, we may not trigger
certain search features for queries where the results could be offensive
in several narrowly defined categories.
SafeSearch
When it comes to information on the web, we leave it up to
you to decide what’s worth finding. That’s why we have a
SafeSearch filter, which gives you more control over your search
experience by helping you avoid adult content if you’d rather not see it.
MADE BY: Hritik Agarwal
CLASS: 10th
GUIDED BY: Sir Arvindar Singh
SUBMITTED TO: Sir Arvindar Singh