the concept of ‘entropy’ is very slippery


‘Entropy’ by alexiuss

Entropy was first used in thermodynamics as a measure of the degree of disorder present in a system.

Difficulties in teaching it have led to the alternative concept of ‘energy dispersal’ at a specific temperature.

A classic example is that of a river:

Since ancient times the power of a river has been recognized by the people living along its banks, who depend on it for their survival; such a river often becomes venerated as a god.

When a dam is built, the energy of the river can be captured by a turbine as electricity and used in a variety of ways.

Our modern society is totally dependent on electricity for survival, as it is a flexible provider of energy at points of need. All forms of energy are interconvertible if appropriate technology is used. Taking energy out of a system increases the entropy of that system.

A deck of cards on the floor


I lay the deck out in four rows following the sequence of the stack. It is a Tarot deck, ordered in a non-random way, presumably by whoever played with it before, since I had not shuffled it. What were those rules? Difficult to know: there is no simple rule that describes the sequence. By taking the photograph I have produced a record. Alternatively, I could make a record in a different way by listing the cards: row 1, card 1, Queen of Diamonds (Dame de Carreau strictly speaking, as it happens to be a French deck); row 1, card 2; and so on.

This is a very laborious process!

What has happened to the entropy of the system (me and the cards)? Using my muscles, I have laid the cards out so that they all sit at the level of the lowest card in the original deck. In doing so I have exposed all the cards and the information held in the sequence. How could the sequence have been predicted? Answering that question leads to the history of codebreaking and the mathematical analysis of sequences of events using a Bayesian approach. Turing and Good used these methods at the beginning of the Second World War to decipher the German naval Enigma code. The work was highly classified and remained secret long after the war ended. Good contends that Turing developed the idea of using entropy as a measure of information during this work (page 10).

The use of entropy as a measure of information was set out by Claude Shannon in a classified report to Bell Laboratories in 1945 and later published as ‘Communication Theory of Secrecy Systems’ in the Bell System Technical Journal in 1949. It is a moot point how much communication there had been between Shannon and Turing on this topic during their wartime meetings.
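Shannon’s measure can be made concrete with a short calculation. The sketch below (the 52-card figures are my own illustration, not from the post) shows that a single draw from a well-shuffled deck carries about 5.7 bits of surprise, a known fixed sequence carries none, and the full ordering of a shuffled deck — one of 52! equally likely permutations — carries roughly 226 bits:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over nonzero p."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A single draw from a well-shuffled 52-card deck: all outcomes equally likely.
print(shannon_entropy([1 / 52] * 52))   # about 5.70 bits, i.e. log2(52)

# A completely known sequence (an unshuffled deck): no surprise at all.
print(shannon_entropy([1.0]))           # 0.0 bits

# The whole ordering of a shuffled deck: one of 52! equal permutations.
print(math.log2(math.factorial(52)))    # roughly 225.6 bits
```

This is why laying out the cards and recording their order “extracts” information: the record removes the uncertainty that the shuffled (or unknown) ordering represented.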

