Statistical Thermodynamics: The branch of physics that applies probability theory to the study of thermodynamics. 
Statistical Thermodynamics is also called Statistical Mechanics, but regardless of the label, it provides a way to relate the microscopic properties of atoms and molecules to the macroscopic world. It also provides a molecular-level interpretation of thermodynamic quantities.
Probability theory: The branch of mathematics dealing with the probability and analysis of random phenomena. 
Probability theory deals with the probability of events occurring, and in Thermodynamics it shows which states are likely and which are even possible. Coins are a common starting point in probability theory because they have two sides, commonly called heads and tails. Furthermore, they have an equal chance of coming up either side.
What makes coins so useful in probability theory is that they can be flipped with an equal chance of getting each result.
1 coin has 2 possibilities 

1 heads 
1 tails 
2 coins have 4 possibilities 
1 set of 2 heads 
2 sets of 1 heads and 1 tails 
1 set of 2 tails 
3 coins have 8 possibilities 
1 set of 3 heads 
3 sets of 2 heads and 1 tails 
3 sets of 1 heads and 2 tails 
1 set of 3 tails 
4 coins have 16 possibilities 
1 set of 4 heads 
4 sets of 3 heads and 1 tails 
6 sets of 2 heads and 2 tails 
4 sets of 1 heads and 3 tails 
1 set of 4 tails 
Note the pattern 
1 try has 2 possibilities 
2 tries has 4 possibilities 
3 tries has 8 possibilities 
4 tries has 16 possibilities 
This results in: 
n tries having 2^{n} possibilities 
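The coin counts listed above can be checked by brute force; here is a minimal sketch in Python (not part of the original article) that enumerates every heads/tails sequence:

```python
from itertools import product

# Enumerate every sequence of heads/tails for n coins and confirm the
# counts above: n coins give 2**n possibilities, and the number of
# sets with a given number of heads follows Pascal's triangle.
for n in range(1, 5):
    outcomes = list(product("HT", repeat=n))
    assert len(outcomes) == 2 ** n
    counts = {}
    for o in outcomes:
        heads = o.count("H")
        counts[heads] = counts.get(heads, 0) + 1
    print(n, "coins:", len(outcomes), "possibilities", counts)
```

For 4 coins this prints 16 possibilities, with the 1-4-6-4-1 split between heads counts shown in the list above.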
Extending this to all situations where each try has an independent outcome gives:
p = x^{n} 
x is the number of possible outcomes for each try, 
n is the number of independent tries. 
p is the total number of possible outcomes of a given number of tries. 
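As a quick illustrative check of p = x^n (the die example is my own addition, not from the article): coins have x = 2 outcomes per try, while a six-sided die has x = 6.

```python
# p = x**n: total outcomes when each of n independent tries has x outcomes.
coins = 2 ** 4  # 4 coin flips  -> 16 possibilities, as listed above
dice = 6 ** 3   # 3 die rolls   -> 216 possible sequences
print(coins, dice)  # 16 216
```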
When the possibilities are not independent, the situation is a little different, because with dependent outcomes, once one option is used, the number of available outcomes changes. Consider a room with five chairs, into which five people come and sit down.

The first person that comes into the room has five choices as to where to sit. 

Now the second person that comes into the room has only four choices as to where to sit since one seat is occupied. 

When the third person comes into the room, they have only three choices as to where to sit since two seats are now occupied. 

As a result when the fourth person comes into the room they have only two choices remaining as to where to sit since three seats are now occupied. 

Unfortunately for the poor fifth person that comes into the room there is only one seat left and no choices left as to where to sit since all other seats are now occupied. 

This produces a filled room, but there are 120 different ways the seats could have been filled.
To get the total number of ways the people can sit in the room, we need to multiply the number of choices available to each of them. If the number of possible arrangements is labeled p, then the result we get is p = 5*4*3*2*1 = 120 possible arrangements. In mathematics this product is called a factorial, designated N!. In the case we have here it is 5! = 5*4*3*2*1 = 120; in the general case it is N! = N*(N-1)*(N-2)*…*1.
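The seating count above can be verified directly; a short sketch in Python (the chair labels are hypothetical, added for illustration):

```python
from itertools import permutations
from math import factorial

chairs = ["A", "B", "C", "D", "E"]
# Every distinct order in which five people can fill five chairs:
arrangements = list(permutations(chairs))
print(len(arrangements))  # 120, i.e. 5! = 5*4*3*2*1
assert len(arrangements) == factorial(5) == 5 * 4 * 3 * 2 * 1
```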
In simple cases, such as those with two independent outcomes, the results tend to balance out over the long haul, and this occurs even with larger numbers of possibilities. However, in more complicated situations a bell-shaped distribution is produced. This bell shape approaches a smooth bell curve as the number of tries gets extremely large. It is also the pattern that is produced by the breakdown of the probabilities of patterns within specific cases.
Statistical Entropy is based on the probability of molecular positions, which most often results from molecular motion. The tendency of entropy to increase that is seen in the 2nd Law of Thermodynamics is a result of the fact that high entropy configurations are more probable than low entropy configurations.
A common example is a gas confined by a plug to one half of a container. 
When the plug is removed, it is highly improbable that all of the gas would remain on one side. 

The overwhelmingly more probable situation is that the gas becomes evenly distributed between both halves of the container. This is what is actually observed to occur. 
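A rough sketch of why this is so (an illustrative calculation, not from the original): each molecule independently has a 1/2 chance of being in the left half, so the probability of finding all N molecules on one side is (1/2)^N, which collapses toward zero even for modest N.

```python
# Probability that all N gas molecules happen to be on one side: (1/2)**N.
for N in (1, 10, 100):
    print(N, "molecules:", 0.5 ** N)
# For N = 10 this is already 1/1024; a real gas has ~10**23 molecules.
```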
Entropy is related to probability by the fact that more probable conditions have more possible ways to occur than less probable ones.
The number of possible ways for a given condition to occur is denoted as W. 
Entropy is denoted as S. 
k is the Boltzmann constant = 1.38 × 10^{-23} J K^{-1} 
The relationship between entropy and the number of possible ways for a given condition to occur is:
S = k ln W
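The Boltzmann relation S = k ln W can be evaluated directly; a minimal sketch in Python (added for illustration, with k from the SI definition):

```python
import math

k = 1.380649e-23  # Boltzmann constant in J/K (exact SI value)

def entropy(W):
    """Statistical entropy S = k * ln(W) for W possible configurations."""
    return k * math.log(W)

print(entropy(1))                  # a single configuration gives S = 0
print(entropy(2) < entropy(1000))  # more configurations -> higher entropy: True
```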
The fact that entropy is related to the number of possible configurations that a system may have is why it can be considered a measurement of the degree of disorder in a system, since disordered systems have more possible configurations than ordered systems.
Statistical Thermodynamics provides a way to study molecular motion to show how that motion affects the world we see. It not only shows how that motion affects the laws of Thermodynamics but how those laws actually work at the level of atoms and molecules.
Statistical Laws of Thermodynamics 
Statistical analysis of the 0th Law of Thermodynamics shows that molecular motion would spread evenly between connected systems, putting them all in equilibrium. 
Statistical analysis of the 1st Law of Thermodynamics shows that molecular motion moves energy around without losing it. 
Statistical analysis of the 2nd Law of Thermodynamics shows that molecular motion tends to put systems into their most probable arrangement, which is by definition the arrangement with the highest entropy. 
Statistical analysis of the 3rd Law of Thermodynamics shows that molecular motion has a minimum value, but it also shows that absolute zero cannot be reached. 
Statistical Thermodynamics shows us why the laws of Thermodynamics work the way they do while at the same time connecting the microscopic and macroscopic worlds in a way nothing else can do.
Statistical Thermodynamics shows what entropy is and why entropy tends to increase. It also shows how and when entropy can be decreased.

