Entropy and related stuff LO17490

Mnr AM de Lange (amdelange@gold.up.ac.za)
Fri, 20 Mar 1998 11:46:56 GMT+2

Replying to LO17304 --

Dear Organlearners,

Joe Cox <diaplan@gte.net> wrote

> I have tried to follow the ideas about entropy and entropy production etc.
> and have so far been unable kernalize the ideas. It seems like there may
> be something in this that I should understand and be able to use but so
> far I'm struggling to turn the ideas which I poorly understand into
> something pragmatic. It all seems so esoteric. Would anyone who is
> knowledgeable in this area have time to put the ideas into a short story
> or illustration that will help me? Am I the only one who is on the
> intellectual sideline here?

Joe, I am very sorry that I could not give your request attention
sooner. I had a lot of pressing work to finish.

You are not alone in feeling that you are on the intellectual
sideline. What you and most others on this list experience with
respect to "entropy" and "entropy production" is the result of what I
call "academical apartheid". Another word for apartheid is
separatism. The separation between the natural sciences and the
humanities is destroying the Academy just as apartheid between the
European and African races has been destroying South Africa.

The great cosmologist Sir Arthur Stanley Eddington considered the
Second Law of thermodynamics the most important law of the physical
universe. He knew enough that he could have written a thick book to
substantiate his claim. Is it not sad that mostly only those who have
studied physics, chemistry or engineering have some knowledge of this
law?

Why do I say sad? It may be the case that this law is responsible for
ALL the organisations in REALITY (material and abstract) and ALL the
changes in them. In other words, it may be the case that this law is
at the very centre of a theory AND practice for everything. Since the
discovery of this law about 150 years ago, more and more pointers to
this very possibility have emerged.

Whether this law will become the rosetta stone for understanding
reality is beside the point. (The Rosetta Stone occupies a
unique position in Egyptian archaeology - do read about it.) The
dreadful point is this: if we know nothing about this law, and the
law is possibly the rosetta stone for understanding reality, then we
will know too little to recognise the actual rosetta stone, whatever
it proves to be.

The Second Law concerns the physico-chemical quantity "entropy".
Although the quantity entropy is closely related to another quantity,
namely "energy", it is not the same thing. The First Law of
thermodynamics concerns the quantity "energy". You may never have
heard of entropy before, but you have certainly heard plenty about
the quantity energy.

You have probably used the word "energy" thousands of times in your
life. I bet that in the majority of cases the use of the word
"energy" was essential to you. Is it then not strange that you have
never before used the word "entropy" to express something which was
essential to you? Either the word "entropy" is not essential, or
there are things which you know about but could not express, since
you did not know the word "entropy". Those things which you know,
but have not yet articulated, make up your "tacit knowledge". What
is the ratio between your tacit knowledge and your expressed
knowledge? 1:1000? 10 000:1? How would you know, if you do not know
how to organise your tacit knowledge into expressed knowledge? How
will you know how to organise it if you know nothing about its
possible rosetta stone, namely entropy?

I have said before that "entropy" and "energy" are closely related.
How closely related are they? They are so closely related that if you
speak about the one without speaking about the other, you are most
probably speaking nonsense. Nonsense? It is such a harsh word. Rick
will not allow such a harsh judgement on this list. If I were Rick, I
would also not allow it. So, let me explain. Although you are
probably speaking in a way which makes sense to you while using only
one of them, you will only discover the lack of sense after you have
learned to use both of them correctly.

It is like a man and a woman who fall perfectly in love with each
other for ever. Before they have known each other, they did not
realise how senseless their lives had been. Now they cannot live
without each other. When the one finally passes away, the other one
follows in a few days. I once knew such a couple in my life and
they filled me with wonder. Were it not for knowing them, I would
never have used them as an example because it seems to be too good to
be true.

Maybe this is not what you wanted to know with my question
"How closely related are they?" One way to discover how closely
related they are is to study the history of the discoveries of the
First Law and the Second Law of thermodynamics. I did it. It is drama
of a quality which matches the fiction of Shakespeare or Goethe. I
would encourage anyone to do it any day. Maybe there is some great
writer among us who will one day capture this great drama. The fact
that these two laws have the numerical (ordinal) names First and
Second may seem dull, but it is another way of pointing to the close
relationship between them.

Maybe this is also not what you wanted to know with the question
"How closely related are they?" Another way to discover how closely
related they are is to study them in terms of the theme
content/form. This theme content/form has been a main theme of
discussion in many philosophies and subjects for the past 2500 years.
Here is the clue to begin a discussion yourself. Entropy is the form
of energy and energy is the content of entropy. As I see it today,
every manifestation of content/form anywhere in the universe
(physical and spiritual) is the result of the energy/entropy
relationship.

Let me explain the content/form relationship of energy/entropy by a
model which we may call my Picture-Pixel Model of entropy. This model
should not be confused with the actual operational definition of
entropy used by physicists, chemists and engineers. There are
currently two other models used to explain entropy, namely Shannon's
Information Model and the Quantum Statistical Model based on
Boltzmann's work in statistical mechanics. Students often mistake one
of these models for the definition of entropy - causing them a lot of
further confusion. I encourage you to avoid the same confusion with
my Picture-Pixel Model of entropy. It is a model, not a definition.

This is the first time ever that I will describe the Picture-Pixel
Model in public. Look at any printed picture with a magnifying glass
to see the pixels (dots). Even better, use the MS Windows Paintbrush
program (or any similar program) to experiment for yourself with what
I have to say.

Consider any picture made up by a rectangular array of pixels (dots),
say a 100x100 array consisting of 10 000 pixels.

Each pixel will represent a quantum of energy. The different colours
will represent quanta with different energy values, say increasing
energy from red, orange, yellow, green, blue to violet.

The organisation of the pixels relative to each other will represent
the entropy. The entropy will be calculated by counting D units
(D = Difference). When counting Ds, note that every pixel (except
those at the borders) is closely bordered (surrounded) by four other
pixels - left, right, top and bottom. The entropy of the picture is
calculated as follows.

Begin with a picture with all its pixels (10 000 of them) in one
colour, say yellow. Each pixel is closely bordered by four others.
Since a pixel and its four neighbours have the same colour, there are
no differences and thus zero Ds. This is the case for all 10 000
pixels. The sum of all the Ds of all the pixels is also zero. Thus
the entropy of this picture has its lowest value. Let us make it
arbitrarily 0D (zero D).

Now change the colour of only one pixel (not a border pixel) into,
say, blue. Let us calculate the D for this blue pixel. The pixel to
its right is yellow, which gives one D. The same applies to the other
three sides of it. The total Ds for this pixel is (1+1+1+1)D = 4D.
Thus the entropy of the picture with a yellow background and one blue
pixel in it is 4D.

Change the colour of another pixel, not close to the first one, also
into blue. The total Ds for this pixel is also 4D. Thus the entropy
of the picture is (4+4)D = 8D. For every isolated blue pixel
introduced into the picture, its entropy increases by 4D.
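This D-counting can be sketched in a few lines of Python. The names
(own_d, entropy) are my own, and the convention of summing Ds only
over the figure (non-yellow) pixels is my reading of the examples
above, not an official definition:

```python
# A sketch of D-counting in the Picture-Pixel Model, before the
# commutation rule introduced further on. Assumption: only the figure
# (non-background) pixels contribute Ds, as in the worked examples.

def own_d(grid, r, c):
    """One D for each of the four neighbours with a different colour."""
    rows, cols = len(grid), len(grid[0])
    return sum(
        1
        for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1))
        if 0 <= r + dr < rows and 0 <= c + dc < cols
        and grid[r + dr][c + dc] != grid[r][c]
    )

def entropy(grid, background="yellow"):
    """Sum the Ds of every figure (non-background) pixel."""
    return sum(own_d(grid, r, c)
               for r in range(len(grid))
               for c in range(len(grid[0]))
               if grid[r][c] != background)

pic = [["yellow"] * 100 for _ in range(100)]
print(entropy(pic))     # all-yellow picture: 0

pic[50][50] = "blue"
print(entropy(pic))     # one isolated blue pixel: 4

pic[20][20] = "blue"    # a second blue pixel, far from the first
print(entropy(pic))     # two isolated blue pixels: 8
```

Experimenting with this sketch is the programmatic equivalent of the
Paintbrush experiment suggested above.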

Let us consider only two blue pixels in the picture, but now
connected to each other in a horizontal fashion. Let us count the Ds.
For the pixel on the left it is 3D (1D to its top, 1D to its left and
1D to its bottom). It is not 4D because there is no difference to its
right - that pixel is also blue, so it gives 0D. Similarly, the
blue pixel on the right has 3D. Thus it SEEMS as if the total entropy
of the picture is (3+3)D = 6D. Hence it appears as if the entropy of
a picture with two connected pixels, namely 6D, is less than the
entropy of a picture with two isolated pixels, namely 8D.

But there is something terribly wrong with this conclusion. These two
pixels are connected like two photons of a laser beam with which we
eventually can produce holograms. We have not taken into account in
terms of entropy the fact that the two blue pixels are CONNECTED. We
have not taken into account that together they form a STRUCTURE. We
have not taken into account that they COMMUTE.

Almost exactly a year ago, I introduced the concept "commutation" to
this list. (See in the archives: Commutation in the LO LO12877 Thu,
13 Mar 1997 14:11:38 GMT.) I stressed the importance of this concept
to LOs. I wish I had the time to write on how one of Senge's five
disciplines, namely team learning, hinges on this concept. I have
also shown in later contributions on the Maturana Seminar how this
concept helps us to interpret some of the rather difficult concepts
of Maturana. I will now use this concept again.

The close contact between the two blue pixels means that they commute
with each other. It is like a man and a woman who fell perfectly in
love with each other forever. The man becomes through the woman and
the woman becomes through the man. Through each other they share much
more. The blue pixel on the left has 3D on its own, but through its
commutation with the blue pixel on the right, it has another 3D. Thus
it has (3+3)D = 6D. Similarly, the blue pixel on the right has 6D.
Thus the entropy of the whole picture is (6+6)D = 12D. Consequently
the entropy of a picture with two connected pixels, namely 12D, is
more than the entropy of a picture with two isolated pixels, namely
8D.

Let us calculate the entropy of a picture with 3 blue pixels
connected in a horizontal line. For the pixel on the left it is its
own 3D plus the 2D of the (centre) pixel to its right, i.e. 5D. For
the pixel in the centre it is its own 2D plus the 3D of the pixel to
its left plus the 3D of the pixel to its right, namely 8D. For the
pixel at the right it is its own 3D plus the 2D of the (centre) pixel
to its left, i.e. 5D. Thus the entropy of the whole picture is
(5+8+5)D = 18D. Consequently the entropy of a picture with three
connected pixels, namely 18D, is more than the entropy of a picture
with three isolated pixels, namely 12D.
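The commutation rule can be added to the D-counting sketch: each
figure pixel also receives the own Ds of the figure pixels touching
it. Treating every adjacent figure pixel as a commuting partner is my
own reading of the worked examples, not a formal definition:

```python
# D-counting with commutation: a figure pixel's Ds are its own Ds plus
# the own Ds of every adjacent figure pixel (assumption: all adjacent
# figure pixels commute).

NEIGHBOURS = ((-1, 0), (1, 0), (0, -1), (0, 1))

def own_d(grid, r, c):
    """One D for each of the four neighbours with a different colour."""
    rows, cols = len(grid), len(grid[0])
    return sum(
        1 for dr, dc in NEIGHBOURS
        if 0 <= r + dr < rows and 0 <= c + dc < cols
        and grid[r + dr][c + dc] != grid[r][c]
    )

def entropy(grid, background="yellow"):
    rows, cols = len(grid), len(grid[0])
    total = 0
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] == background:
                continue
            total += own_d(grid, r, c)
            for dr, dc in NEIGHBOURS:
                nr, nc = r + dr, c + dc
                if (0 <= nr < rows and 0 <= nc < cols
                        and grid[nr][nc] != background):
                    total += own_d(grid, nr, nc)  # commuting partner's Ds
    return total

pic = [["yellow"] * 100 for _ in range(100)]
pic[50][50] = pic[50][51] = "blue"   # two connected blue pixels
print(entropy(pic))                  # (3+3) + (3+3) = 12

pic[50][52] = "blue"                 # three in a horizontal line
print(entropy(pic))                  # 5 + 8 + 5 = 18
```

Note that isolated pixels have no commuting partners, so the earlier
results (4D each) are unchanged by this rule.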

By bringing more blue pixels into the picture, the entropy of the
picture steadily increases. If the incoming pixels are isolated, the
entropy does not increase as much as when the incoming pixels commute
(become closely connected) with existing blue pixels in the picture.
Think of isolated pixels as "chaos of becoming" and commuting pixels
as "order of structure". In other words, although chaos represents an
increase in entropy, order represents an even faster increase in
entropy. Furthermore, as the commuting pixels begin to outline
recognisable forms like a tree, a car or a face, the entropy of the
picture steadily increases!

I am not going to trouble you too much with what happens when we
introduce a third colour (an energy quantum with a different value)
into the picture, say green. You can work out the details yourself.
Here are some clues. A picture with two isolated pixels (one blue and
one green) against a yellow background has an entropy of (4+4)D = 8D.
But if the blue and green pixels commute (are connected), the entropy
becomes ((4+4)+(4+4))D = 16D and not 12D as in the case of two
commuting pixels with the same colour!

This is an extremely important result! By introducing more variety
into the picture, while maintaining the same pattern, the entropy
increases even more. By now Winfried Dressler's hair should begin
rising, because of his passion to learn more about the seven
essentialities of creativity. By introducing the third colour green,
we have formalised the essentiality "quality-variety" (rangeness).

However, some of you may have considered the blue and green pixels
next to each other (and not far away from each other) not as two
commuting pixels, but as two isolated pixels next to each other.
(There is nothing wrong with doing so.) In other words, they are two
connected, but non-commuting pixels. If you did so, the entropy of
the picture must be (4+4)D = 8D and not 16D as when they were
commuting. In other words, the difference between the two cases
formalises the essentiality "connect-beget" (fruitfulness).
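Both readings of the blue-green pair - 16D when commuting, 8D when
not - can be checked with a variation of the same sketch, in which
whether adjacent figure pixels commute is an explicit choice. The
commuting flag is my own parameterisation, used only to make the
connect-beget distinction concrete:

```python
# D-counting where commutation between adjacent figure pixels is a
# modelling choice (assumption: my own flag, not part of the post).

NEIGHBOURS = ((-1, 0), (1, 0), (0, -1), (0, 1))

def own_d(grid, r, c):
    """One D for each of the four neighbours with a different colour."""
    rows, cols = len(grid), len(grid[0])
    return sum(
        1 for dr, dc in NEIGHBOURS
        if 0 <= r + dr < rows and 0 <= c + dc < cols
        and grid[r + dr][c + dc] != grid[r][c]
    )

def entropy(grid, background="yellow", commuting=True):
    """Own Ds of each figure pixel; if the figure pixels are taken to
    commute, each also receives the own Ds of adjacent figure pixels."""
    rows, cols = len(grid), len(grid[0])
    total = 0
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] == background:
                continue
            total += own_d(grid, r, c)
            if not commuting:
                continue
            for dr, dc in NEIGHBOURS:
                nr, nc = r + dr, c + dc
                if (0 <= nr < rows and 0 <= nc < cols
                        and grid[nr][nc] != background):
                    total += own_d(grid, nr, nc)
    return total

pic = [["yellow"] * 100 for _ in range(100)]
pic[50][50], pic[50][51] = "blue", "green"   # connected, different colours
print(entropy(pic, commuting=True))          # (4+4) + (4+4) = 16
print(entropy(pic, commuting=False))         # 4 + 4 = 8

pic[50][51] = "blue"                         # same colour, for comparison
print(entropy(pic, commuting=True))          # (3+3) + (3+3) = 12
```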

Note that as your picture becomes more complex through the coloured
forms depicted in it, you will increasingly have to make use of the
essentiality "associativity-monadicity" (wholeness) to decide what
amounts to a whole figure in the picture.

By now you should get the general idea that the richer the picture is
in organisation (structures and how they are related to each other),
the greater the entropy of the picture.

Now take any picture rich in entropy, i.e. organisation. Calculate
its entropy. Reduce the organisation of the picture as follows. Seek
all the red pixels and move them all together to form one red
horizontal band (100 pixels wide and a few pixels in height). Seek
all the orange pixels and do the same. The orange band will
generally have a different height. Do it for all the colours until a
100x100 picture has been derived with no other organisation than
coloured bands from top to bottom. It will look like a rainbow.
Calculate the entropy of this rainbow. It will be much, much less
than that of the original picture rich in organisation.

This rainbow picture is another way to illustrate what I mean by
"academical apartheid". Each of the faculties of the academy in a
university is like a colour band of the rainbow. Think of one faculty
representing the colour blue. The majority of those working in that
faculty, believe that they are painting a picture. They are indeed
doing it, but only in different shades of blue. Another faculty does
it in red. They all are trying to paint the same picture, namely
reality. But each faculty does it in terms of a filter - blue, red
or whatever other colour. Unfortunately, the picture of all these
faculties, seen from a distance, is a rainbow and nothing else. It is
only when we come closer to each band that we see a picture in this
band. Thus we have identities, but no categoricity. Note that I have
formalised the essentiality "identity-categoricity" (sureness).

Similarly, beware of "organisational apartheid" - the death of a
"learning organisation".

This contribution has now become long, which I did not intend. So
let me finish off by mentioning another essentiality, namely
"becoming-being" (aliveness).

Joe, you have explicitly requested the following with respect to both
"entropy" and "entropy production":
> Would anyone who is knowledgeable in this area have time
> to put the ideas into a short story or illustration that will
> help me?
Well, I did not tell a story, but I hope that my illustration with
the Picture-Pixel model will work.

I will now switch over to the story. The Picture-Pixel Model as I
have described it so far has one immense drawback. It is static and
has no dynamics of its own. It is a being, but not also a becoming.
It is true that "entropy" is a picture. This is what is taught to
students in physics, chemistry and engineering by means of Carnot
Cycles. But they are taught nothing about "entropy production", the
becoming.

The Second Law is principally not about "entropy", but about "entropy
production"! Now what is the difference between the two? The
difference is the same as the difference between a picture (for
entropy) and a movie (for entropy production). It is the movie and
not the picture which has dynamics of its own, which depicts beings
and their becomings.

The more people look at movies, the more they lose their ability to
imagine dynamics into a static picture. In other words, the more
people look at movies, the more their own imagination (part of their
own dynamics) gets quenched. Thus they find it more and more
difficult to read and comprehend a book. A book, even with all the
letters in it, is nothing more than a static being. The less we can
become in our own minds, the more books will become closed to us.

I try to guide my granddaughter Jessica through experiencing this
essentiality. She now has much more passion for having me read her a
story than for watching wonderful stories on TV, even The Lion King.
In less than two hours, I will be picking her and my wife up. This
weekend we will be experiencing becoming in nature. See you next
week, if God permits.

Best wishes


At de Lange
Gold Fields Computer Centre for Education
University of Pretoria
Pretoria, South Africa
email: amdelange@gold.up.ac.za

Learning-org -- Hosted by Rick Karash <rkarash@karash.com>
Public Dialog on Learning Organizations -- <http://www.learning-org.com>