Physics-101 (CSE-D2)
Presentation on Entropy
WELCOME
WE ARE NOT JUST ANOTHER BRICK IN THE WALL
Team members and assigned slides:
KAZI EMAD, B.Sc. in CSE, ID: 191902025, Slides 4-7
IKHTIAR, B.Sc. in CSE, ID: 191902022, Slides 8-11
ASMA, B.Sc. in CSE, ID: 191902027, Slides 12-14
RAIHAN, B.Sc. in CSE, ID: 1919024, Slides 15-18
CONTENTS
Introduction/History
Concept
Definition and Expression of Entropy
Entropy in Real Life
Law of Entropy
Unit of Entropy
Classification
Applications
Conclusion
INTRODUCTION: The word "entropy" was coined by the German physicist Rudolf Clausius (1822-1888) in 1865. The word has a Greek origin: its first part is meant to recall "energy", and the second part comes from "trope", meaning transformation or turning point.
HISTORY: Why is this notion called entropy, anyway? From the American Heritage Book of English Usage (1996): "When the American scientist Claude Shannon found that the mathematical formula of Boltzmann defined a useful quantity in information theory, he hesitated to name this newly discovered quantity entropy because of its philosophical baggage. The mathematician John von Neumann encouraged Shannon to go ahead with the name entropy, however, since 'no one knows what entropy is, so in a debate you will always have the advantage.'"
What is entropy?
The word entropy is sometimes confused with energy. Although they are related quantities, they are distinct.
Energy measures the capability of an object or system to do work.
Entropy, on the other hand, is a measure of the "disorder" of a system. What "disorder" refers to is really the number of different microscopic states a system can be in, given that the system has a particular fixed composition, volume, energy, pressure, and temperature. By "microscopic states", we mean the exact states of all the molecules making up the system.
Entropy = (Boltzmann's constant k) x logarithm of the number of possible states, i.e. S = k ln(N).
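As an illustrative sketch (not part of the original slides), this relation can be evaluated numerically; the microstate count N below is an assumed toy value.

```python
import math

K_B = 1.380649e-23  # Boltzmann's constant in J/K

def boltzmann_entropy(num_microstates: int) -> float:
    """Entropy S = k_B * ln(N) for a system with N equally likely microstates."""
    return K_B * math.log(num_microstates)

# Toy example (assumed value): a system with 10**23 accessible microstates.
print(boltzmann_entropy(10**23))  # about 7.3e-22 J/K
```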
CONCEPT
The idea of entropy comes from a principle of thermodynamics dealing with energy. It usually refers to the idea that everything in the universe eventually moves from order to disorder, and entropy is the measurement of that change.
• Physicists use entropy to measure the amount of disorder in a physical system.
• In information theory, entropy is the expected value (average) of the information contained in each message received.
• It can be considered as the degree of randomness in a message.
Entropy is a thermodynamic property: a quantitative measure of disorder.
Entropy traces its origin to the molecular-movement interpretation given by Rudolf Clausius around 1850.
The concept of entropy is tied to the thermodynamic laws (in particular, the 2nd law of thermodynamics).
It can be visualised through processes of expansion, heating, mixing and reaction.
Entropy is associated with heat and temperature.
Definition and expression of entropy
Entropy may be defined as the property of a system which measures the degree of disorder or randomness in the system.
The word comes from Greek and means transformation.
It is denoted by the symbol 'S'.
Clausius was convinced of the significance of the ratio of the heat delivered to the temperature at which it is delivered.
1. Entropy is the sum total of the entropy due to positional disorder, vibrational disorder and configurational disorder, i.e. randomness due to change of state: S = Sp + Sv + Sc.
2. When a system undergoes a change, the entropy change is equal to the heat absorbed by the system divided by the temperature at which the change takes place: ΔS = S2 − S1 = ∫ dq/T. If the process takes place at constant temperature, this becomes T ΔS = q, or in differential form T dS = dq; this is the second-law expression.
3. From the first law we know that ΔE = q − w, or dE = dq − dw = dq − P dV. For an isothermal change of an ideal gas ΔE = 0, therefore dq = P dV. From the second law dq = T dS; substituting gives T dS = P dV, so ΔS = ∫ P dV / T.
4. Suppose instead the process takes place at constant pressure. Then dq = (q)p = Cp dT, so dS = Cp dT / T. Integrating from state 1 to state 2 (with Cp constant) gives S2 − S1 = Cp ln(T2 / T1). This is the entropy change of the system at constant pressure from room temperature to the reaction temperature, as sketched below.
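A minimal numerical sketch of point 4, assuming a constant heat capacity Cp; the Cp value and temperatures below are illustrative, not taken from the slides.

```python
import math

def entropy_change_constant_pressure(cp: float, t1: float, t2: float) -> float:
    """Delta S = Cp * ln(T2 / T1), valid when Cp is constant over [T1, T2]."""
    return cp * math.log(t2 / t1)

# Assumed example: 1 mol of an ideal monatomic gas (Cp about 20.8 J/(mol*K))
# heated at constant pressure from 298 K to 500 K.
print(entropy_change_constant_pressure(20.8, 298.0, 500.0))  # about 10.8 J/K
```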
Entropy in real life
More disorder means more entropy; more order means less entropy.
Law of Entropy
The second law of thermodynamics states that the entropy of an isolated system always increases, or in other words, entropy can be created but not destroyed.
Unit of Entropy
The SI unit for entropy (S) is joules per kelvin (J/K). The Clausius is an older unit; the relation between the units is 1 Clausius (Cl) = 1 cal/°C = 4.1868 J/K.
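A small sketch of the unit conversion stated above (1 Cl = 1 cal/°C = 4.1868 J/K).

```python
CAL_PER_DEG_C_TO_J_PER_K = 4.1868  # 1 Clausius (cal/degC) = 4.1868 J/K

def clausius_to_si(entropy_in_clausius: float) -> float:
    """Convert an entropy value from Clausius units (cal/degC) to SI units (J/K)."""
    return entropy_in_clausius * CAL_PER_DEG_C_TO_J_PER_K

print(clausius_to_si(1.0))  # 4.1868 J/K
```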
CLASSIFICATION
Thermodynamical entropy
Statistical entropy
Entropy in classical world
Entropy in quantum world
Thermodynamical Entropy
Entropy is defined using the Clausius inequality: ∮ δQ/T ≤ 0. The cyclic integral of δQ/T can be viewed as the sum of all the differential amounts of heat transfer divided by the temperature at the boundary.
∮ δQ/T = 0 for a reversible cycle
∮ δQ/T < 0 for an irreversible cycle
∮ δQ/T > 0 for an impossible cycle
By definition, entropy (S) is the thermodynamic property such that dS = δQrev/T, so that S2 − S1 = ∫ δQrev/T.
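A quick worked sketch of dS = δQrev/T for the simplest case, a reversible isothermal process; the heat and temperature values are assumed, not from the slides.

```python
def entropy_change_isothermal(q_rev: float, temperature: float) -> float:
    """Delta S = Q_rev / T for heat Q_rev transferred reversibly at constant T."""
    return q_rev / temperature

# Assumed example: 3000 J absorbed reversibly at 300 K.
print(entropy_change_isothermal(3000.0, 300.0))  # 10.0 J/K
```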
Statistical Entropy
In 1877, Ludwig Boltzmann developed a statistical entropy S. This suggests the connection between entropy and thermodynamic probability. It may be written as S = F(Ω) = kB ln Ω, where kB is Boltzmann's constant and Ω is the thermodynamic probability (the number of microstates).
Entropy in the classical world
Claude E. Shannon introduced Shannon entropy, used for measuring the entropy of a classical system. Shannon's entropy is HS = Σi Pi log2(1/Pi), where Pi is the probability of the i-th message in the probability distribution.
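A short sketch of Shannon's formula, summing Pi log2(1/Pi) over an assumed probability distribution.

```python
import math

def shannon_entropy(probabilities):
    """H = sum_i p_i * log2(1 / p_i); zero-probability outcomes contribute nothing."""
    return sum(p * math.log2(1.0 / p) for p in probabilities if p > 0)

# A fair coin carries 1 bit of entropy; a biased coin carries less.
print(shannon_entropy([0.5, 0.5]))  # 1.0
print(shannon_entropy([0.9, 0.1]))  # about 0.47
```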
Entropy in the quantum world
Von Neumann entropy is used for measuring the entropy of a quantum system; it gauges the order in a given quantum system. The entropy of a quantum state was introduced by von Neumann. For a state ρ it is defined by S(ρ) = Σi λi log2(1/λi), where λi are the eigenvalues of the density matrix.
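A sketch of the von Neumann entropy computed from the eigenvalues of a density matrix; the example states (a maximally mixed qubit and a pure qubit) are assumed for illustration.

```python
import numpy as np

def von_neumann_entropy(rho: np.ndarray) -> float:
    """S(rho) = -sum_i lambda_i * log2(lambda_i), where lambda_i are eigenvalues of rho."""
    eigenvalues = np.linalg.eigvalsh(rho)           # eigenvalues of the Hermitian density matrix
    eigenvalues = eigenvalues[eigenvalues > 1e-12]  # drop zero eigenvalues (0 * log 0 = 0)
    return float(-np.sum(eigenvalues * np.log2(eigenvalues)))

# Maximally mixed single-qubit state: entropy = 1 bit.
print(von_neumann_entropy(np.array([[0.5, 0.0], [0.0, 0.5]])))  # 1.0

# Pure state |0><0|: entropy = 0.
print(von_neumann_entropy(np.array([[1.0, 0.0], [0.0, 0.0]])))  # 0.0
```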
APPLICATIONS
We use Shannon's entropy in information theory.
It helps identify an information-processing task: data compression, information transmission, teleportation.
Quantum Shannon theory provides a general theory of interconvertibility between different types of communication resources: qubits, cbits, ebits, cobits, sbits…
Information can be stored so that it can be reconstructed at a later time.
Entropy serves as a measure of entanglement. Entropy is a measure of the uncertainty about a quantum system before we make a measurement of its state.
Conclusion
Entropy is the thermodynamic property which is the measure of disorder in a system.
It can be expressed as ΔS = q/T, where q is the heat and T is the absolute temperature.
The term was coined by Rudolf Clausius.
Entropy is mainly associated with heat and temperature.
Disorder can be of 3 types: positional, vibrational and configurational.
Thermobarometric models are an excellent case study where the application of thermodynamic parameters is involved.
The second law of thermodynamics implies that the entropy of the universe is increasing continuously because energy conversion is never 100% efficient, i.e. some heat is always released.
Entropy can be zero: at absolute zero (0 K), all atomic motion ceases and the disorder in a substance is zero; otherwise entropy is always positive.
THANKS FOR YOUR KIND ATTENTION…