Thermodynamics and Statistical Physics
Thermodynamic Approach
- establishes relations between the observed (macroscopic) physical quantities;
- relies mostly on experimentally established dependencies;
- deals with the general physical essence of the problem;
- does not require detailed information about the microscopic structure of matter.
Statistical Approach
- is based on specific models of the micro-structure of matter;
- derives relations between the observed physical quantities starting from the laws of motion of molecules, using the methods of probability theory and mathematical statistics;
- makes it possible to understand random fluctuations of macroscopic parameters.
Probabilities
Distribution of Probabilities
[Figure: a spinning top that comes to rest at a random angle φ]
The CORRECT question: what is the probability that the spinning top stops within a certain range of angles, between φ and (φ + dφ)?
Evidently, this probability MAY depend on φ and MUST be proportional to dφ:
dP(φ to φ + dφ) = f(φ)·dφ
Here f(φ) is the probability distribution function.
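A minimal numerical sketch (added for illustration; the assumption is a top with no preferred direction): in that case f(φ) must be the constant 1/(2π), which a simulated histogram confirms.

import numpy as np

rng = np.random.default_rng(0)
phi = rng.uniform(0.0, 2.0 * np.pi, size=100_000)  # assumed: no preferred stopping angle

# Histogram estimate of f(phi): counts / (number of samples * bin width)
counts, edges = np.histogram(phi, bins=36)
f_est = counts / (phi.size * np.diff(edges))

print(f_est.mean(), 1.0 / (2.0 * np.pi))  # both ~0.159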
[Figure: geometric probability; for a point distributed uniformly over a surface S, dP = dS/S, and over a volume V, dP = dV/V]
Probability Distribution and Average Values
Knowing the distribution f(x) of a random quantity x, we may calculate its average value:
⟨x⟩ = ∫ x·f(x)·dx
Moreover, we may calculate the average of any function ψ(x):
⟨ψ(x)⟩ = ∫ ψ(x)·f(x)·dx
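A short worked example (added; it carries over the uniform top distribution f(φ) = 1/(2π) from above): computing ⟨φ⟩ and ⟨cos²φ⟩ by direct numerical integration.

import numpy as np

# Uniform distribution of the top's rest angle: f(phi) = 1/(2*pi) on [0, 2*pi]
phi = np.linspace(0.0, 2.0 * np.pi, 10_001)
f = np.full_like(phi, 1.0 / (2.0 * np.pi))
dphi = phi[1] - phi[0]

avg_phi = np.sum(phi * f) * dphi                 # <phi>        -> pi
avg_cos2 = np.sum(np.cos(phi) ** 2 * f) * dphi   # <cos^2 phi>  -> 0.5
print(avg_phi, avg_cos2)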
[Figure: molecules in a spherical balloon of volume V with coordinate axes X, Y, Z; by symmetry the average of a vector quantity such as the velocity equals 0]
Calculation “live”…
All of this concerns the uniform distribution of molecules over space in a spherical balloon.
What about the distribution of molecules over velocities and energies?
It can be spherically symmetric, but it cannot be uniform, since formally there is no upper limit on the velocity…
P₀ = P₁ = 0.5 for result types 0 and 1 (a single throw).
N_T – number of tests (throws),
N_i – number of tests with the result of type i,
P_i = N_i/N_T (for N_T → ∞) – probability to obtain the result of type i.
For series of N = 2 throws (the type is the number of ‘1’ results in a series):
P₀ = P₂ = 0.25 – probability of types 0 and 2,
P₁ = 0.5 – probability of type 1.
N – the length of a test series (in this case N = 2),
N_T – total number of test series,
N_i – number of series with the result of type i,
P_i – probability to obtain the result of type i.
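A quick simulation sketch (added; not part of the slides): tallying the number of ‘1’ results in many simulated two-throw series reproduces the probabilities 0.25, 0.5, 0.25.

import random

random.seed(1)
n_series = 100_000
counts = [0, 0, 0]
for _ in range(n_series):
    k = random.randint(0, 1) + random.randint(0, 1)  # number of '1's in a series of two throws
    counts[k] += 1

print([c / n_series for c in counts])  # ~[0.25, 0.50, 0.25]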
Heads and Tails Game
SOME MATHEMATICS
Type | Number of variants producing this type of result
i = 0 | 1 (all N throws are heads)
i = 1 | N (exactly one tail: either at the 1st throw, or at the 2nd, …, or at the last)
i = 2 | N(N−1)/2
i = 3 | N(N−1)(N−2)/6
… | …
i = k | N(N−1)(N−2)·…·(N−k+1)/k! = N!/(k!(N−k)!)
… | …
i = N−1 | N
i = N | 1 (all N throws are tails)
Heads and Tails Game
Ω_k = N!/(k!(N−k)!)
Σ_k Ω_k = 2^N
Probability to obtain the result of type k:
P_k = N!/(k!(N−k)!·2^N)
If N >> 1 we may apply Stirling's approximation:
ln(n!) ≈ n·ln(n/e)
[Plot: P_k versus k for N = 10]
P_k ≈ (2/(πN))^(1/2)·exp(−2(k − N/2)²/N)
Here n = (k − N/2) is the deviation of the obtained result from the average N/2. The probabilities are noticeable only when |n| ≲ N^(1/2) << N.
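A small numerical check (added sketch): comparing the exact P_k = Ω_k/2^N with the Gaussian approximation above for N = 100.

import math

N = 100
for k in (40, 45, 50, 55, 60):
    exact = math.comb(N, k) / 2 ** N
    gauss = math.sqrt(2.0 / (math.pi * N)) * math.exp(-2.0 * (k - N / 2) ** 2 / N)
    print(k, round(exact, 5), round(gauss, 5))  # the two columns agree closely near k = N/2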
Heads and Tails Game
SOME MATHEMATICS
For continuously distributed quantities X:
the differential probability to find the random value X within the range from x to x + dx is
dP(x) = f(x)·dx,
where f(x) is the probability distribution function.
Probability to find the random value X within the range from x₁ to x₂:
P(x₁ ≤ X ≤ x₂) = ∫ from x₁ to x₂ of f(x)·dx
For continuously distributed quantities the probability to have exactly some value x₀ is equal to zero: P(X = x₀) = 0.
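A hedged sketch (added; the standard normal distribution is used as an example): interval probabilities follow from the integral above, and a zero-width interval indeed gives zero probability.

import math

def normal_interval_prob(x1, x2, mu=0.0, sigma=1.0):
    """P(x1 <= X <= x2) for a normal distribution, expressed via the error function."""
    a = (x1 - mu) / (sigma * math.sqrt(2.0))
    b = (x2 - mu) / (sigma * math.sqrt(2.0))
    return 0.5 * (math.erf(b) - math.erf(a))

print(normal_interval_prob(-1.0, 1.0))  # ~0.683, the one-sigma rule
print(normal_interval_prob(0.5, 0.5))   # 0.0: a single exact value has zero probability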
In the case of the Heads and Tails game: μ = N/2, σ = N^(1/2)/2 << N
P_k = (2/(πN))^(1/2)·exp(−2(k − N/2)²/N)  ⇒  f(x) = (1/(σ·(2π)^(1/2)))·exp(−(x − μ)²/(2σ²))
the Normal (Gaussian) Distribution.
EXAMPLE: Heads and Tails Game
For N = 10:  Σ_k Ω_k = 2^N = 2^10 = 1024
Type | Realizations (variants)
k = 0 | 0000000000 – 1 variant
k = 1 | 1000000000, 0100000000, 0010000000, …, 0000000001 – 10 variants
… | …
k = 5 | 0100110011, 1101000101, … – 252 variants in total
For N >>> 1 these numbers may be exponentially high.
1. The more realizations are possible, the higher is the probability of the result.
2. The more realizations are possible, the lower is the “degree of order” in this result.
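A brute-force check (added sketch): enumerating all 2^10 binary strings of length N = 10 and counting them by the number of ones reproduces these statistical weights.

from itertools import product

N = 10
omega = [0] * (N + 1)
for bits in product((0, 1), repeat=N):  # all 2^N possible series
    omega[sum(bits)] += 1

print(omega)       # [1, 10, 45, 120, 210, 252, 210, 120, 45, 10, 1]
print(sum(omega))  # 1024 = 2**10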
S(k) = A·ln(P_k) = A·ln(Ω_k) − A·N·ln 2
The higher the “order”, the lower the entropy.
The higher the probability, the higher the entropy.
P(k and i) = P_k·P_i  ⇒  S(k and i) = S(k) + S(i)
Entropy is an additive function!
If N = 10 (taking A = 1 and dropping the constant term, so that S = ln Ω_k):
S(0) = S(10) = ln(1) = 0
S(5) = ln(252) ≈ 5.5
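A short sketch (added illustration): computing S(k) = ln Ω_k for N = 10 and checking the additivity property numerically.

import math

N = 10
def S(k):
    """Entropy S(k) = ln(Omega_k) with A = 1."""
    return math.log(math.comb(N, k))

print(S(0), S(N))  # 0.0 0.0: the fully ordered results
print(S(5))        # ~5.53: the most probable, least ordered result

# For two independent series the numbers of variants multiply,
# so the entropies add:
print(S(3) + S(4), math.log(math.comb(N, 3) * math.comb(N, 4)))  # equal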
The more realizations are possible, the higher is the probability of the result and the lower is the “degree of order” in it. For long series of tests (N >>> 1) the numbers of variants of realization may be exponentially high, so it is reasonable to apply logarithmic functions.
Any information or communication can be coded as a string of zeros and ones: 0110010101110010010101111110001010111… (binary code).
To any binary code of length N (consisting of N digits, k of which are ones and (N − k) are zeros) we may assign a value of entropy:
S(N, k) = ln Ω(N, k), where Ω(N, k) is the number of variants in which a string can be composed out of k ones and (N − k) zeros.
Communications looking like 00000000… or 11111111… have S = 0.
Communications with equal numbers of ones and zeros have maximal entropy.
The entropy of two communications equals the sum of their entropies.
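A minimal sketch (added; the example strings are arbitrary): computing the entropy of concrete binary messages.

import math

def message_entropy(msg: str) -> float:
    """S(N, k) = ln(N! / (k! (N-k)!)) for a binary string with k ones."""
    n, k = len(msg), msg.count("1")
    return math.log(math.comb(n, k))

print(message_entropy("00000000"))  # 0.0: a perfectly ordered message
print(message_entropy("01100101"))  # maximal for length 8 (k = N/2)
# Independent messages: the variants multiply, so the entropies add.
print(message_entropy("0110") + message_entropy("00101110"))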
Entropy in Informatics
[Photo: Claude Elwood Shannon, 1916–2001]
The row of numbers (n₁, n₂, …, n_K) = n(i) forms what is called the “macro-state” of the system. Each macro-state can be realized by a tremendously great number of variants Ω(n(i)). The number of variants of realization is called the “statistical weight” of the macro-state.
The greater the statistical weight, the greater the probability to realize this state.
The uniform distribution of molecules over space has the biggest statistical weight and thus is usually associated with thermodynamic equilibrium.
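An added sketch (toy model assumed here: N molecules distributed over m equal cells of space): the statistical weight of a macro-state (n₁, …, n_m) is the multinomial coefficient N!/(n₁!·…·n_m!), and the uniform distribution maximizes it.

import math

def weight(ns):
    """Statistical weight of a macro-state (n1, ..., nm): N! / (n1! * ... * nm!)."""
    w = math.factorial(sum(ns))
    for n in ns:
        w //= math.factorial(n)
    return w

print(weight((5, 5, 5, 5)))   # uniform distribution: the largest weight
print(weight((8, 4, 5, 3)))   # uneven: smaller
print(weight((20, 0, 0, 0)))  # all molecules in one cell: weight 1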
Statistical Entropy in Physics.
The entropy is a measure of disorder in molecular systems.
Statistical Entropy in Physics.
For the state of a molecular system with certain macroscopic parameters we may define the Statistical Entropy as the logarithm of the number of possible micro-realizations of this state (its statistical weight Ω), multiplied by the Boltzmann constant:
S = k·ln Ω,  k = 1.38·10⁻²³ J/K
The number of variants of realization of a state (the statistical weight) is higher when the so-called phase volume available to each molecule (atom) is larger:
phase volume Ω₁ ~ V·p³ ~ V·E^(3/2) ~ V·T^(3/2)
As the molecules are completely identical, their permutations change neither the macro-state nor the micro-states of the system. Thus we have to reduce the statistical weight of the state by the factor ~N! (the number of permutations of N molecules).
The phase volume for N molecules shall be raised to the power N:
Ω ~ V^N·T^(3N/2)
For polyatomic molecules, taking into account the possibilities of rotations and oscillations, we substitute 3 by the number of degrees of freedom i:  Ω ~ V^N·T^(iN/2)
Ω ~ V^N·T^(iN/2)/N!;  S = k·ln Ω = k·N·ln(V·T^(i/2)/(N·C)) =
= ν·(R·ln(V/ν) + c_V·ln T + s₀)
The statistical entropy proves to be the same physical quantity, as was earlier defined in thermodynamics without even referring to the molecular structure of matter and heat!
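A closing sketch (added; ν, T, V₁, V₂ are arbitrary example values): for an isothermal expansion of an ideal gas, the statistical formula above gives ΔS = νR·ln(V₂/V₁), identical to the thermodynamic ΔS = Q/T.

import math

R = 8.314            # J/(mol*K), universal gas constant
nu = 2.0             # mol, example amount of gas
T = 300.0            # K, isothermal process
V1, V2 = 1.0, 3.0    # m^3, example volumes

# Statistical entropy change: at constant T only the nu*R*ln(V/nu) term changes
dS_stat = nu * R * math.log(V2 / V1)

# Thermodynamic entropy change: dS = Q/T with Q = nu*R*T*ln(V2/V1) for an ideal gas
dS_thermo = nu * R * T * math.log(V2 / V1) / T

print(dS_stat, dS_thermo)  # identical values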
MEPhI General Physics