
Statistical Entropy
Order and Disorder


Statistical Entropy: The application of probability theory to the principle of entropy.

This approach shows that entropy is a measure of the amount of disorder in a system, a fact that is critical to a proper study of origins.


Basic Concept

Statistical entropy is based on the probability of molecular positions, which most often result from molecular motion. The tendency of entropy to increase, seen in the 2nd Law of Thermodynamics, is a result of the fact that high-entropy configurations are more probable than low-entropy configurations.

A common example is a gas confined to one half of a container by a plug.

When the plug is removed, it is highly improbable that all of the gas would remain on one side.

The overwhelmingly more probable situation is that the gas becomes evenly distributed between the two halves of the container.

This is exactly what is observed to occur.
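To get a feel for just how improbable the one-sided arrangement is, here is a minimal Python sketch (an addition, not part of the original article) that computes the probability, assuming each molecule is independently equally likely to be in either half:

def prob_all_on_one_side(n_molecules: int) -> float:
    # Each molecule is in a chosen half with probability 1/2,
    # so all n land in that half with probability (1/2)**n.
    return 0.5 ** n_molecules

for n in (10, 100, 1000):
    print(f"N = {n:>4}: P(all on one side) = {prob_all_on_one_side(n):.3e}")

# For a macroscopic gas (N on the order of 10**23) this probability is far
# too small even to represent as a float: the even distribution is the
# overwhelmingly probable outcome.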



Entropy and Probability

Entropy is directly related to probability by way of the Gibbs entropy formula:

S = −k Σᵢ pᵢ ln pᵢ

Entropy is denoted as S.

k is the Boltzmann constant, k = 1.38 × 10⁻²³ J K⁻¹.

pᵢ is the probability of microstate i occurring as the system fluctuates.

This formula is rather hard to work with, because dealing with the probability of every individual microstate would be tedious; however, these microstates are the key to understanding the relationship between entropy and disorder.
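As an illustration, here is a minimal Python sketch (an addition, not from the article) that evaluates the Gibbs formula for a small, assumed toy distribution of microstate probabilities:

import math

K_BOLTZMANN = 1.38e-23  # Boltzmann constant in J/K

def gibbs_entropy(probabilities):
    # S = -k * sum(p_i * ln p_i); terms with p = 0 contribute nothing.
    return -K_BOLTZMANN * sum(p * math.log(p) for p in probabilities if p > 0)

# Four equally probable microstates (the most disordered case for 4 states):
print(gibbs_entropy([0.25, 0.25, 0.25, 0.25]))   # equals k * ln(4)
# A distribution concentrated on one state (more ordered):
print(gibbs_entropy([0.97, 0.01, 0.01, 0.01]))   # noticeably smaller

Note how the evenly spread distribution yields a larger entropy than the distribution concentrated on one state.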


Equivalent Microstates

Equivalent microstates are the possible arrangements of a system that correspond to its current state. The number of equivalent microstates is denoted as W. Below is the breakdown of possible microstates of a system of 6 dots.

In this case the total number of possible microstates is 2⁶ = 64.

Note that there are 2 fully ordered states (all 6 dots on one side or the other), so W = 2. This gives a probability of 2/64 = 3.125% that either one of them occurs.

The next most ordered states (5 dots on one side and 1 on the other) number 12, so W = 12. This gives a probability of 12/64 = 18.75% that any one of them occurs.

The most disordered states (the remaining mixed arrangements) number 50, so W = 50. This gives a probability of 50/64 = 78.125% that any one of them occurs.

This is a simple case, but it shows that disordered systems have more microstates than ordered ones. The result is that the more disordered a system is, the larger its W, and the more ordered a system is, the smaller its W.
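The 6-dot breakdown above can be checked directly. Here is a minimal Python sketch (an addition, not from the article) that enumerates all 64 microstates and groups them by how many dots are on the left:

from collections import Counter
from itertools import product

# Each of the 6 dots is on the Left or Right half: 2**6 = 64 microstates.
microstates = list(product("LR", repeat=6))
total = len(microstates)

# Group microstates into macrostates by the number of dots on the left.
counts = Counter(state.count("L") for state in microstates)

for n_left in sorted(counts):
    w = counts[n_left]  # W: number of equivalent microstates
    print(f"{n_left} dots on the left: W = {w:2d}, probability = {w / total:.3%}")

# Fully ordered (0 or 6 on the left):  W = 1 + 1 = 2     ->  2/64  = 3.125%
# Next most ordered (1 or 5):          W = 6 + 6 = 12    ->  12/64 = 18.75%
# Remaining mixed states:              W = 15+20+15 = 50 ->  50/64 = 78.125%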


Statistical Entropy Formula

As stated earlier, the number of equivalent microstates, that is, the number of possible ways a given condition can occur, is denoted as W.

Entropy is denoted as S.

k is the Boltzmann constant, with k = 1.38 × 10⁻²³ J K⁻¹.

S = k ln W  

In fact, this formula follows from the Gibbs entropy formula in the special case where all W microstates are equally probable (pᵢ = 1/W): the sum then reduces to S = k ln W. This clearly shows that entropy is a measure of the disorder of a system, because disordered systems have a greater number of equivalent microstates (W) than ordered systems. The larger W is, the more disordered the system; the smaller W is, the more ordered the system. Therefore, the larger the entropy of a system, the more disordered it is, and the smaller the entropy of a system, the more ordered it is.
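Here is a minimal Python sketch (an addition, not from the article) that evaluates S = k ln W for the W values from the 6-dot example, and confirms that it matches the Gibbs formula when all W microstates are equally probable:

import math

K_BOLTZMANN = 1.38e-23  # Boltzmann constant in J/K

def boltzmann_entropy(w: int) -> float:
    # S = k ln W for W equivalent microstates.
    return K_BOLTZMANN * math.log(w)

def gibbs_entropy_uniform(w: int) -> float:
    # Gibbs formula with p_i = 1/W for each of W microstates:
    # S = -k * W * (1/W) * ln(1/W) = k ln W
    p = 1.0 / w
    return -K_BOLTZMANN * w * p * math.log(p)

for w in (2, 12, 50):  # W values from the 6-dot example above
    s = boltzmann_entropy(w)
    assert math.isclose(s, gibbs_entropy_uniform(w))
    print(f"W = {w:2d}: S = {s:.3e} J/K")

# Larger W (more disorder) gives larger S; the most ordered case (W = 2)
# has the smallest entropy.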


Applied Energy and Entropy

The biggest problem with entropy is that it has a tendency to increase. As a result, the big question is: what is needed to decrease entropy? A common answer is energy; that is, applying energy to a system can decrease its entropy. However, this answer is overly simplistic, because when energy is applied to a system, how it affects the system's entropy depends on how the energy is applied.

The best example of this is the difference between construction work and a bomb. Construction work decreases the entropy of a building under construction. On the other hand, a bomb delivering the same amount of energy to the same site will increase the site's entropy. So how energy is applied to a system affects how that energy changes the system's entropy.

What is needed is a general principle that describes this difference: a general principle showing how to produce order from disorder.


In conclusion, statistical thermodynamics shows that entropy is a measure of disorder. It also shows why entropy tends to increase. However, statistical thermodynamics also shows how and when entropy can be decreased.


 
