When entropy is examined statistically it can be considered a measure of randomness. Now the more random a system is the more disordered it is. The formula for statistical entropy is:
S = k ln Ω

S is entropy.
k is the Boltzmann Constant = 1.380649 × 10⁻²³ J/K.
Ω is the number of equivalent equally probable configurations. This is a direct measurement of disorder.
Random or disordered systems have so many more equivalent equally probable configurations that they can, for practical purposes, be considered inevitable. Now entropy is not quite the same thing as disorder, but it is logarithmically related to disorder. Entropy measures disorder in the way that the Richter scale measures earthquakes or decibels measure sound. The result is that it is accurate to call entropy a measure of disorder. This also means that a reduction in entropy results from an increase in organized complexity.
— J. Philip Bromberg, Physical Chemistry, 1984, p. 690
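The logarithmic relationship between entropy and the number of configurations can be illustrated with a short numeric sketch. The function name and the configuration counts below are illustrative choices, not values from the text; the formula is the statistical entropy S = k ln Ω given above.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, in joules per kelvin

def statistical_entropy(omega: float) -> float:
    """S = k * ln(omega) for omega equally probable configurations."""
    return K_B * math.log(omega)

# Doubling the number of configurations adds only k * ln(2) to the entropy,
# which is the Richter-scale-like (logarithmic) behavior described above.
s1 = statistical_entropy(1e20)
s2 = statistical_entropy(2e20)
print(s2 - s1)  # a tiny absolute change, equal to K_B * ln(2)
```

Because the relationship is logarithmic, even enormous changes in the number of configurations produce modest changes in entropy, which is why entropy works as a compressed scale for disorder.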
While the following concept is related to the 2nd law of thermodynamics, it extends beyond the basic statement of that law. It comes from a statistical analysis of how the manner in which energy is applied to a system affects its entropy. That the manner of application is critical to whether entropy increases or decreases is evident from the difference between construction work and a bomb.
Consider a pile of wood. If a group of people work to organize the wood, a building can be built that has less entropy than the pile. If, however, a bomb of equal energy is applied to the pile, the wood is scattered into a state with more entropy than the original pile.
Now any time energy is applied to a system, there is some degree of randomness in how it is applied. The result is that the randomness of the system is moved towards that of the applied energy.
Ω_s = the number of equivalent equally probable states of the system.
Ω_e = the number of equivalent equally probable states of the application of energy.

Now the statistical formula for entropy is:

S = k ln Ω

This results in:

S_s = k ln Ω_s
S_e = k ln Ω_e

The result is that:

ΔS_max = S_e − S_s     (formula 1)

S_s = entropy of the system.
S_e = entropy of the energy application.
ΔS_max = maximum change in entropy.
The result is that if energy is applied to a system in a manner more random than the system, the system becomes more random. If the same amount of energy is applied in a manner less random than the system, the system becomes less random. This explains why organized work can build a building while a bomb will bring it down. The more organized application of energy acts as an organizing force.
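The sign rule just described can be sketched numerically. The function names and the configuration counts are illustrative assumptions of mine, and the formula ΔS_max = S_e − S_s is reconstructed from the text's definitions of S_s, S_e, and ΔS_max (formula 1).

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def entropy(omega: float) -> float:
    """S = k * ln(omega)."""
    return K_B * math.log(omega)

def max_entropy_change(omega_system: float, omega_energy: float) -> float:
    """Delta-S_max = S_e - S_s, the text's formula 1 (reconstructed)."""
    return entropy(omega_energy) - entropy(omega_system)

# Bomb-like case: energy applied more randomly than the system is arranged.
bomb = max_entropy_change(omega_system=1e10, omega_energy=1e30)
# Organized-work case: energy applied less randomly than the system.
work = max_entropy_change(omega_system=1e30, omega_energy=1e10)

print(bomb > 0)  # entropy can rise
print(work < 0)  # entropy can fall
```

The sign of ΔS_max depends only on which of the two configuration counts is larger, matching the construction-versus-bomb contrast in the paragraph above.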
Evolutionists claim that natural selection communicates information from the environment to populations of organisms. In fact, natural selection communicates nothing; it is only a quality-control mechanism. Natural selection can only select from what already exists, and in and of itself it causes no changes in DNA. The real source of genetic change for evolution is random changes in DNA called mutations.
Now DNA is very organized, while mutations are very random. This results in the following:

S_m > S_DNA

Plugging this into formula 1:

ΔS_max = S_m − S_DNA

This results in:

ΔS_max > 0

This means that random mutations can only increase the entropy of DNA. Furthermore, since:

S = k ln Ω
This means that mutations will increase the randomness of DNA. As a result, the best that natural selection can do is remove the most randomized DNA. So natural selection cannot communicate anything, and therefore the general theory of evolution has no organizing force and no program. Since the general theory of evolution needs to decrease the entropy of DNA yet lacks an organizing force, it would seem to contradict the laws of thermodynamics.
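The argument above can be restated as a small numeric sketch. The configuration counts for DNA and for mutation are hypothetical values of my choosing, used only to show how the inequality propagates through the reconstructed formula 1.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def entropy(omega: float) -> float:
    """S = k * ln(omega)."""
    return K_B * math.log(omega)

# Illustrative (hypothetical) counts: organized DNA is assumed to have far
# fewer equally probable states than a random mutation process.
omega_dna = 1e6
omega_mutation = 1e24

# Formula 1 applied to mutation acting on DNA: Delta-S_max = S_m - S_DNA.
delta_s_max = entropy(omega_mutation) - entropy(omega_dna)
print(delta_s_max > 0)  # positive, per the text's argument
```

With any choice of counts satisfying Ω_m > Ω_DNA, the difference is positive, which is the step the text uses to conclude that mutations can only raise the entropy of DNA.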
A reprint of the original paper on the effect of the application of energy on the entropy of a system.