Statistical entropy is probability theory applied to thermodynamics: it treats entropy as a measure of a system's disorder. Based mainly on the probabilities of the positions of molecules, it explains the tendency, expressed in the Second Law of Thermodynamics, for entropy to increase. This tendency exists because high-entropy configurations are more probable than low-entropy ones.
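The claim that high-entropy configurations are more probable can be illustrated with a standard toy model (a sketch, not part of the original text): molecules distributed between the two halves of a box. The multiplicity Ω(n), the number of microstates with n molecules in the left half, is a binomial coefficient, and the Boltzmann entropy is S = ln Ω (in units of k_B). The even split dwarfs the ordered "all on one side" configuration:

```python
from math import comb, log

N = 100  # total number of molecules in the box (toy model)

def multiplicity(n: int) -> int:
    """Number of microstates with exactly n molecules in the left half."""
    return comb(N, n)

def entropy(n: int) -> float:
    """Boltzmann entropy S = ln(Omega), in units of k_B."""
    return log(multiplicity(n))

# Ordered configuration: all molecules on one side -> a single microstate.
ordered = multiplicity(0)
# Disordered configuration: an even split -> enormously many microstates.
even = multiplicity(N // 2)

print(f"Omega(all left)   = {ordered}")
print(f"Omega(even split) = {even}")
print(f"S(all left)   = {entropy(0):.3f}")
print(f"S(even split) = {entropy(N // 2):.3f}")
```

Since every microstate is equally likely, the even split is roughly 10^29 times more probable than the fully ordered state, which is why entropy increases in practice.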
The central problem with entropy is its tendency to increase, which makes understanding how to decrease it highly important. A common answer is that adding energy to a system is all that is needed to decrease its entropy. This answer is overly simplistic, however: when energy is applied to a system, its effect on the system's entropy depends on how the energy is applied. Consider the difference between construction work and a bomb. Construction work decreases the entropy of a building under construction. A bomb delivering the same amount of energy to the same site, on the other hand, will inevitably increase the site's entropy.
This shows that the manner in which energy is applied to a system determines how that energy changes the system's entropy. What is needed is a general principle that describes this difference. Statistical entropy provides exactly that: it shows how and when entropy can be decreased, that is, how order can be produced from disorder. Understanding this is critical to a proper study of origins.