Statistical Physics [11/17/2015]

Thermodynamics of complex systems

The topic of this lecture is an introduction to Statistical Physics. In Statistical Physics, one thing people do is explain macroscopic phenomena from the microscopic interactions of the agents, the atoms; we want to see how this was accomplished there, and how we might accomplish similar things in our study of social systems. What people normally do in Physics is use the smallest number of laws to explain the largest number of phenomena that we encounter.

The beauty of this is that if we have only a small number of laws, our chance of making errors is minimized, and if we know that those laws have been tested, and not refuted, by researchers over many generations, then there is something about them we can rely on. In the case of thermodynamics, people previously didn't know what was happening; it turned out to be something statistical.

Prior to the 20th century, the prevailing idea was that, given sufficiently precise tools to observe everything, we would be able to explain everything through Newton's three laws. After the beginning of the 20th century, people began to think in terms of information and statistics. The prevailing idea, including the idea of today, is that we are not able to observe everything about our system, so we have to talk about probability: what is the probability of observing certain things?

In terms of thermodynamics, what people observed includes the following. People previously didn't know what heat is, so some said that heat is perhaps some kind of substance, like atoms, that moves from one place to another. This turned out not to be true, because we can construct many counterexamples. For example, when ice melts into water, everything seems to be the same substance, so where did those heat atoms go, or where did they come from? This is one example.

Another example is that people found they could generate a seemingly unlimited amount of heat, which means there is no conservation law for the amount of heat, because we can keep generating it. Then there was Joule. Joule constructed a way to turn mechanical energy into heat, and he also constructed a way to turn electrical energy into heat. This means that those forms of energy can be interchanged. Another finding that was quite important is the Carnot cycle, which has four stages.

We can use those four stages either to turn some heat into mechanical energy or to turn mechanical energy back into heat. Based on all of this, what people found is ultimately something that looks like the following. We have the internal energy of the system, and this energy is a function of volume, of course, because if we compress or expand the system we change its configuration and its physical composition. There is another observation in thermodynamics, which is that heat always flows from high temperature to low temperature.

If we put two systems together, heat goes from the subsystem at higher temperature to the subsystem at lower temperature, so there is a flow associated with heat. Based on all of those observations, people ultimately wrote things in the following fashion, using a first-order approximation: the change in internal energy is the temperature times the change of entropy minus the pressure times the change of volume. The mindset here is that, first, this is a first-order approximation, and second, we have two different types of variables.

One type is called intensive variables and the other type is called extensive variables. When we combine subsystems into one, the value of an extensive variable of the combined system equals the sum of its values over the subsystems, while differences in intensive variables between subsystems drive changes in the corresponding extensive variables. Extensive variables include things such as weight: if we put two systems together, the weight of the combined system is the sum of the weights of the two subsystems. Pressure, for example, is an intensive variable, because if we put two subsystems with different pressures together, the difference in pressure will drive the movement of molecules from the subsystem with higher pressure to the subsystem with lower pressure. In this notation, the first-order expansion of the internal energy reads:

$d U(S, V) = \left({\partial U\over\partial S}\right)_V d S + \left({\partial U\over\partial V}\right)_S d V$,

where the temperature $T=\left({\partial U\over\partial S}\right)_V$ and the pressure $p=-\left({\partial U\over\partial V}\right)_S$ are intensive variables, while the entropy $S$ and the volume $V$ are extensive variables.

There are several things that are very frequently mentioned or very frequently used.

First, the $\textbf{free energy}$ $F(T,V) = U(S,V) - T S$ is the Legendre pair of the internal energy $U$, with $T$ and $S$ as the Legendre conjugate variables.

We know that it is pretty hard to turn the energy associated with entropy into mechanical energy that we can use. The part of the energy associated with entropy is something we cannot use, but the rest of the energy is something we can use; we call that part the "free energy." If we look at the form of the free energy, we find that it involves a product of an intensive variable and an extensive variable. So we have two objects, the free energy and the internal energy. If we recall our lecture on convex optimization, convex duality and variational methods, what we find is that the free energy is the Legendre-Fenchel pair of the internal energy.
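As a quick check (a standard step, filled in here for completeness): differentiating $F = U - T S$ and substituting $d U = T\, d S - p\, d V$ from above gives

$d F = d U - T\, d S - S\, d T = -S\, d T - p\, d V$,

so the natural variables of $F$ are $(T, V)$: the extensive variable $S$ has been traded for its intensive conjugate $T$.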

Second, the $\textbf{enthalpy}$ $H(S,p) = U(S,V) + p V$ is a Legendre pair of the internal energy $U$, with $p$ and $V$ as the Legendre conjugate variables. (Note this is the enthalpy, not the entropy: the entropy is the extensive variable conjugate to temperature.)

Similarly, the $\textbf{Gibbs free energy}$ $G(T,p) = U - T S + p V$ is a Legendre pair of the internal energy $U$, with $(T,p)$ and $(S,V)$ as the Legendre conjugate variables. The Gibbs free energy is a convex pair (we also call a Legendre pair a convex pair) of the internal energy, with the conjugate variables being two-dimensional: on the one hand $T$ and $p$, temperature and pressure, both intensive variables, and on the other hand $S$ and $V$, entropy and volume, both extensive variables.
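The same substitution $d U = T\, d S - p\, d V$ gives the natural variables of the other two potentials (again a standard computation, added here for reference):

$d H = d U + p\, d V + V\, d p = T\, d S + V\, d p$, so $H = H(S, p)$;

$d G = d U - T\, d S - S\, d T + p\, d V + V\, d p = -S\, d T + V\, d p$, so $G = G(T, p)$.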

Now, it is not very surprising that we encounter convex optimization and variational methods here, because the idea of convex optimization ultimately came from here. Similarly, the idea of the graphical models that we discussed previously in this class also comes from Statistical Physics. Here I just want to review the interpretation of the Legendre transform. We are given a function $f$, and $f$ is a convex function.

We define the Legendre transform:

$f: I\to \mathbf R$ is a convex function.

$f^*(x^*)=\sup_{x\in I} (x^*\cdot x-f(x))$ is its Legendre transform.

So the Legendre transform takes a different variable than the original function. We define the Legendre transform at a point $x^*$ as the supremum, over $x$, of the product of the variable with its conjugate minus the original function: we multiply the variable by its pair, subtract the function $f$, and take the supremum, which gives the Legendre transform evaluated at the conjugate variable. To picture this, put the variable $v$ on the $x$-axis and the variable $p$ on the $y$-axis.

We can express $v$ in terms of $p$ and $p$ in terms of $v$, since $p(v)$ is a monotonically increasing function. The area underneath this curve we call $L$, and the area to the left-hand side of the curve we call $H$. So $L(v)$ is the integral of $p$ from zero to $v$, and $H(p)$ is the integral of $v$ from zero to $p$: $L$ is the area underneath and $H$ is the area to the left. Together they tile the rectangle, so $L(v) + H(p) = p\cdot v$, or, put another way, $H(p) = p\cdot v - L(v)$.

 

 

[Figure: the Legendre transform as complementary areas under and to the left of the curve $p(v)$]

 

$L(v)=\int_0^v p(v')\,dv'$ $\Rightarrow {\partial L\over\partial v}=p$

$H(p)=\int_0^p v(p')\,dp'$ $\Rightarrow {\partial H\over\partial p}=v$

$H(p) = p\cdot v-L(v)$

$L(v)+H(p) = p\cdot v$
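As a small worked example (not from the lecture): take the straight line $p(v) = v$. Then

$L(v)=\int_0^v v'\, dv' = {v^2\over 2}$, $H(p)=\int_0^p p'\, dp' = {p^2\over 2}$,

and on the curve, where $p = v$, indeed $L(v) + H(p) = {v^2\over 2} + {v^2\over 2} = v^2 = p\cdot v$.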

This is one interpretation of the Legendre transform. What we want to notice is that we have two functions, $L$ and $H$; they are Legendre pairs, and they take different variables. The Legendre transform of a function is a function of a different variable, not necessarily the variable of the original function. Actually, these two functions have names: $H$ is what people normally call the energy, the Hamiltonian, and $L$ is the Lagrangian, so $L$ and $H$ go by different names.
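To make the mechanics analogy concrete (a standard textbook example, not part of the original transcript): for a free particle with Lagrangian $L(v) = {1\over 2} m v^2$, the conjugate variable is the momentum $p = {\partial L\over\partial v} = m v$, and

$H(p) = p\cdot v - L(v) = m v^2 - {1\over 2} m v^2 = {1\over 2} m v^2 = {p^2\over 2 m}$,

which is the kinetic energy written in terms of momentum, i.e. the Hamiltonian.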

[Figure: the Legendre transform as the intercept of the supporting line with slope $p$]

 

function $y=f(x)$ corresponds to the

envelope $y=p\cdot x-f^*(p)$, where

$p=f'(x)$ is the slope of $f(x)$ at $x$

$-f^*(p)$ is the $y$-intercept determined by $p$

$f^*(p)$ is convex in $p$

$f(x)=f^{**}(x)$ if $f(x)$ is convex

If we compare the two: people understood thermodynamics before they thought of it in terms of the Legendre transform. We had the science of these quantities, which looked pretty strange; people first had the equation for the increment of energy, and only later found out that all of those quantities have the same form, namely Legendre pairs. If we take $T$ and $S$ as the Legendre conjugate variables, we get the free energy; if we take another set of Legendre conjugate variables, we get another Legendre pair.

This is the main interpretation that we talked about before. We have a function $y = f(x)$ and we want to find its Legendre transform. What we do is take a line with a given slope $p$ and move it up until it first touches the graph of the function; because the function is convex, the line touches it for the first time at some point. At that moment, the (negative of the) intercept on the $y$-axis gives the value of the Legendre pair corresponding to the slope $p$.

This is another interpretation of the same idea, and the interpretations are actually equivalent. What we have is that the Legendre transform at $p$ plus the original function at $x$, where $p$ and $x$ are conjugate, satisfies $f^*(p) + f(x) = p\cdot x$: the total is $x$ times the slope $p = f'(x)$. As we mentioned, people had found many interesting things about thermodynamics, but the question was, how do we explain thermodynamics using the smallest number of laws, the smallest number of assumptions?
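A minimal numerical sketch of the $\sup$ definition above (the function $f(x)=x^2/2$ and the grid are illustrative choices, not from the lecture); its Legendre transform is known in closed form, $f^*(p)=p^2/2$, which the brute-force maximization reproduces:

```python
import numpy as np

# f*(p) = sup_x (p*x - f(x)), approximated by a maximum over a grid.
x = np.linspace(-5, 5, 2001)     # grid over which the supremum is taken
f = 0.5 * x**2                   # a convex function f(x) (illustrative choice)

def legendre(p):
    """Approximate f*(p) by maximizing p*x - f(x) over the grid."""
    return np.max(p * x - f)

for p in [-2.0, 0.0, 1.0, 3.0]:
    print(p, legendre(p), 0.5 * p**2)   # numerical value vs. exact p**2/2
```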

Boltzmann relation and partition function

 

Here $E$ is a scalar variable, the energy, but $E$ could be other things; it could be a vector of features. What we know is that the maximum-entropy probability distribution, given a set of features, the sufficient statistics $E$, takes an exponential form. We have a set of natural parameters corresponding to those features, here $\beta$ corresponding to $E$: given $E$ we can find $\beta$, and given $\beta$ we can find $E$. According to Boltzmann, the probability of the system being in a state with energy $E_j$, where $j$ indexes the states (if we think of a system with a finite number of molecules, the number of energy configurations is finite, so we can actually number them), takes a specific form.

According to Boltzmann, it looks like the following:

$P_j = \exp(-\beta E_j) / Z$, where

– $P_j$ is the probability of observing the $j$-th state with energy $E_j$,

– $Z=\sum_j \exp(-\beta E_j)$ is the **partition function**.
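Before deriving where this form comes from, here is a minimal sketch of the two formulas for a toy system (the energies and the value of $\beta$ are made up for illustration):

```python
import numpy as np

E = np.array([0.0, 1.0, 1.0, 2.5])   # energies E_j of a few discrete states
beta = 1.3                            # inverse temperature

Z = np.sum(np.exp(-beta * E))         # partition function Z = sum_j exp(-beta E_j)
P = np.exp(-beta * E) / Z             # Boltzmann probabilities P_j

print(Z, P, P.sum())                  # probabilities sum to 1 by construction
```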

$Z$ here is called the partition function, and it is the normalization constant that makes this a probability distribution. The conventional, classical way to derive this distribution is the following. We consider a small cylinder, and we put this small cylinder in a heat reservoir. The small cylinder is in energy configuration $j$, with energy $E_j$.

The energy of the reservoir is $E_R$, which is the total energy minus $E_j$. We assume that, throughout this derivation, the total energy of the whole system, reservoir plus small cylinder, is constant, so $E_R + E_j = E$.

The following is a very critical assumption: in an isolated system, all configurations are equally likely. Of course we have a constraint, which is that all configurations must have the same total energy $E$. This is very critical, and it is actually what we previously identified as maximum entropy: at maximum entropy, all configurations are equally likely.

Since all configurations of the total system are equally likely, $P_j$ is proportional to the number of compatible reservoir configurations: $P_j \propto \Omega(E-E_j)$, and $\log\Omega(E-E_j) = \log\Omega(E)-\beta E_j+\cdots$, so $P_j\propto \exp(-\beta E_j)$. What we do here is take the first-order approximation around the total energy $E$: the first term is $\log\Omega(E)$, and $\beta$ naturally comes out as the partial derivative of the log number of configurations with respect to energy, $\beta = \partial\log\Omega/\partial E$. From this we can say that the probability of the system being in state $j$ is proportional to the exponential form above.

Link to thermodynamics

 

${\partial \log Z\over \partial \beta} = -\sum_j E_j P_j = -U$, where $U$ is internal energy

${\partial \log Z\over \partial V} = {1\over Z} \sum_j {\partial\over\partial V}\exp(-\beta E_j(V))= \beta \sum_j p_j P_j = \beta p$, where $p_j = -{\partial E_j\over\partial V}$ is the **pressure** associated with the $j$-th state.

– $d\log Z= {\partial\log Z\over\partial\beta} d\beta + {\partial\log Z\over\partial V} d V=-U\, d\beta+\beta p\, d V = -d(\beta U) + \beta \underbrace{(d U + p\, d V)}_{T d S}$, or $T\, d S = {1\over\beta} d (\log Z+\beta U)$, or $S=k \log Z + {U\over T}$, where $k={1\over\beta T}$ is Boltzmann's constant; equivalently $U-T S=-k T \log Z = F$ is the free energy.

– $S = k\log Z + k\beta U = k \log Z + k \sum_j \beta E_j P_j = -k \sum_j P_j \log P_j$, the entropy relationship.
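These identities can be checked numerically on the same kind of toy system (illustrative energies, units with Boltzmann's constant $k = 1$ so that $T = 1/\beta$):

```python
import numpy as np

E = np.array([0.0, 1.0, 1.0, 2.5])    # illustrative state energies
beta = 1.3                            # inverse temperature (k = 1, T = 1/beta)

def log_Z(b):
    return np.log(np.sum(np.exp(-b * E)))

P = np.exp(-beta * E - log_Z(beta))   # Boltzmann probabilities
U = np.sum(E * P)                     # internal energy U = sum_j E_j P_j

# U = -d log Z / d beta, checked by finite differences
h = 1e-6
U_from_Z = -(log_Z(beta + h) - log_Z(beta - h)) / (2 * h)

S_thermo  = log_Z(beta) + beta * U    # S = k log Z + U/T  (k = 1)
S_shannon = -np.sum(P * np.log(P))    # S = -k sum_j P_j log P_j

print(U, U_from_Z)                    # agree up to the finite-difference error
print(S_thermo, S_shannon)            # agree up to floating-point error
```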
