**Next message:** AM de Lange: "Learning and Grace. LO26422" **Previous message:** AM de Lange: "Schrodinger's Cat, his Kittens and Reality? LO26420" **In reply to:** Gavin Ritz: "The dance of LEP on LEC LO26360"

Replying to LO26360 --

Dear Organlearners

Gavin Ritz <garritz@xtra.co.nz> writes:

>What I would like to see is, how specifically one
>can measure the state of an organization (entropy)
>(in the business sense) with respect to (and some
>nice real life examples).

Greetings dear Gavin,

You ask a very difficult question because much of the answer is not yet
known. I will answer you as far as the literature can help us. Then I
will add to that answer from my own experiences on the discovery that LEP
has not only a physical dimension, but also a spiritual one.

CHANGES IN entropy S were first "measured" some hundred and fifty years
ago. I cannot stress enough this "change in entropy" /_\S rather than the
entropy S itself. When a system receives an amount Q of heat reversibly,
the entropy change is given by dividing the heat Q (measured in the unit
joule or J) by the absolute temperature T (measured in the unit kelvin or
K), i.e.

. /_\S = Q/T

In other words, first the heat Q and temperature T are measured and then

the change in entropy /_\S is calculated from these two measurements. This

is why I began this paragraph by putting "measured" in quotation marks.

Knowing the value of a change in entropy /_\S involves not only

measurements, but also a calculation! This cannot be avoided because it is

impossible to measure entropy directly.
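To sketch this "measure then calculate" procedure concretely (the numbers below are hypothetical, chosen only for illustration):

```python
# Hypothetical measurements of a reversible heat transfer.
Q = 500.0    # heat received reversibly, in joule (J)
T = 298.15   # absolute temperature, in kelvin (K)

# The change in entropy is calculated, never measured directly:
delta_S = Q / T   # /_\S = Q/T, in J/K

print(f"/_\\S = {delta_S:.3f} J/K")
```

Note that the only things the instruments ever give us are Q and T; the entropy change itself exists only as the quotient we compute from them.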

So how did physicists come to know the entropy of a system when they could

measure only changes in entropy? They made use of two assumptions. The

first is a mathematical assumption. If the system's entropy was at a value

S(begin) and then changes by /_\S, then the system's entropy reaches the

value S(end) given by

. S(end) = S(begin) + /_\S

Somehow they needed an original value to which they could add all these
changes which they could calculate based on measurements. So they made a

second assumption that

. S(zero kelvin) = 0 J/K

It is often called the "third law", but it is not actually a law. It is
rather a conjecture, namely that at the lowest possible temperature in
nature all systems have no thermal energy and thus no "thermal entropy".
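How the two assumptions work together can be sketched with made-up /_\S values (hypothetical, for illustration only): accumulation starts at the assumed S(zero kelvin) = 0 J/K, and each calculated change is added on.

```python
# Second assumption: take S at zero kelvin to be 0 J/K.
S = 0.0   # S(zero kelvin), in J/K

# Hypothetical sequence of calculated changes /_\S (in J/K)
# as the system is warmed step by step:
steps = [0.8, 1.5, 2.1, 3.4]

# First assumption: S(end) = S(begin) + /_\S, applied repeatedly.
for dS in steps:
    S = S + dS

print(f"S(end) = {S:.1f} J/K")
```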

The basic idea of the reversible Carnot cycle in thermodynamics was to

justify the first assumption, namely that the entropy S is indeed a

systemic quantity. The second assumption also needed justification. Thus

the quest for reaching the lowest possible absolute temperature began.

Unfortunately, the second assumption caused among far too many physicists
the mental model that entropy is a purely thermal property. Furthermore,
this second assumption, together with the endeavour of a few physicists
like Boltzmann to explain the second law of thermodynamics (LEP) in terms
of Newtonian mechanics by using principles of statistics, caused yet
another mental model among most physicists: since (by assumption) the
entropy S is zero at zero temperature, and since the crystal order of
matter is not disturbed by thermal energy at zero temperature, entropy
can express only chaos and not order too.

Gavin, we have to delve deeper into this last mental model, otherwise it
will seriously constrain our thinking on complex systems. The "thermal
entropy"

is indeed zero at zero temperature. But to assume that when this zero

"thermal entropy" is added to the remaining entropy of the system the sum

will still be zero (the "zeroth law") is to assume indirectly that this

remaining entropy is also zero. However, we can also assume any other

finite, constant value for it and not necessarily zero as the constant

value. Should we do this, we will not endanger the first assumption, but

only make it more complex. Firstly, we will have to think more relatively.

Secondly, we will have to interpret this constant, non-zero value. I think

that the best interpretation possible is that this non-zero remaining

entropy expresses the "structural entropy" which exists even at zero

temperature. Once we have freed ourselves from this last mental model, we

will be able to think of entropy as expressing both chaos (such as

"thermal entropy" does) and order (such as "structural entropy" does).

So, is there any example for thinking about "structural entropy" in a

sensible manner? Fortunately, yes. JW Gibbs showed through profoundly

holistic work how to compute the "chemical entropy" of a chemical

substance. This "chemical entropy" is the entropy which a substance has

because of its chemical organisation and not merely because of its

temperature. Unfortunately, this computation is complex because it

requires many measurements covering many different changes, even that

change better known as the chemical reaction. It also requires a high

degree of relativity. Not only is it assumed that the thermal entropy at

zero kelvin is zero, but also that the chemical entropy of all elements

(but not compounds) at zero kelvin is constant. Since chemists work

basically with changes /_\S rather than the boundaries S(begin) and

S(end), what better constant value than the value zero could be given to

the "structural entropy" for each of the elements at zero temperature? To

begin at zero with a complex string of additions to follow is the easiest.

For example, we do it in topographic maps by assuming the height at sea
level to be zero. Had we taken the height at the bottom of the deepest
trench in the sea to be zero, the height at sea level would obviously be
non-zero. Despite this, we would still be able to draw lines of equal
increases in height on topographic maps.
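The sea-level analogy can be sketched directly: shifting the reference value (the assumed "structural entropy" at zero kelvin) shifts every absolute value, but leaves every difference intact. The numbers below are hypothetical.

```python
# Hypothetical calculated changes /_\S, in J/K.
steps = [0.8, 1.5, 2.1]

def final_S(reference):
    """Final entropy when accumulation starts from `reference` J/K."""
    return reference + sum(steps)

S_zero_ref = final_S(0.0)   # reference 0 J/K (the usual convention)
S_other_ref = final_S(5.0)  # any other constant reference would do

# Absolute values differ, but the rise above the reference is the same:
assert abs((S_zero_ref - 0.0) - (S_other_ref - 5.0)) < 1e-9
```

This is why the choice of zero is merely the easiest convention, not a physical necessity.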

The computation of the entropy of chemical substances is so complex,
tedious and expensive that, after almost a century, the entropy of only a
few thousand of the millions of known substances has been computed.

(Note also that most students in physical chemistry struggle immensely
with this complex part of the course ;-) Usually it is substances of
immense industrial value or laboratory significance whose entropy
eventually gets calculated. Once we look at such known entropy values of
chemical substances, the notion of the physicists that entropy expresses
thermal chaos has to be abandoned. These values for chemical compounds
also express order in chemical structure!

Perhaps the most important thing we can learn from chemistry is that
other quantities which depend on the entropy S and total energy E of a
system become more important in managing chemical reactions than S and E
themselves. Here the free energy F or the Gibbs free energy G (somewhat
more complex than F) stands first in line. One look at the /_\G of a
reaction tells the chemist far more than the /_\S and /_\E of that
reaction. The chemist will know how far the reaction will proceed
(chemical efficiency) and what concentrations of the compounds can be
expected at equilibrium.
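That one look at /_\G can be sketched with the standard relation /_\G = -RT ln(K) from chemical thermodynamics (the /_\G value below is hypothetical, chosen only for illustration):

```python
import math

R = 8.314    # gas constant, J/(mol K)
T = 298.15   # temperature, K

# Hypothetical standard free energy change of a reaction, in J/mol.
# A negative /_\G means the reaction favours the products.
delta_G = -20000.0

# /_\G = -R*T*ln(K), so one look at /_\G fixes the equilibrium
# constant K, and with it the equilibrium concentrations:
K = math.exp(-delta_G / (R * T))

print(f"K = {K:.0f}")
```

Here K comes out in the thousands, telling the chemist at a glance that the reaction proceeds far towards the products.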

Another very important thing we learn from chemistry is that all kinds of

chemical reactions are associated with a change of entropy. In other

words, each kind of chemical reaction (like acid-base and redox) is a

different manifestation of entropy production. It means that the

knowledgeable chemist begins to expect changes in entropy not because
measurements and calculations say so, but because the peculiar
manifestations of such entropy changes say so. The more complex a

reaction becomes, the more its unique behaviour becomes the telling

indication of its unique entropy production.

When we plan and build an entire chemical industry, then we will use the

exact entropy values determined by Gibbs' empirical procedure. But when we

want to explore new grounds and speculate on possibilities, the complexity

of Gibbs' procedure becomes a constraint. We need simpler procedures to

quickly get some indication of where we are heading. Any such procedure
is based on a model for the entropy of a complex chemical system rather
than its empirical determination. One such modelling technique

has already been used by physicists employing Newtonian mechanics and

principles of statistics. It is better known as statistical mechanics.

However, this model of statistical mechanics is of little help to the

chemist because it considers all mechanical objects as points without

inner structure. Furthermore, since most chemists are interested in the

various chemical manifestations of entropy production rather than the

entropy production itself, little advancement has been made in creating

models suitable for chemistry.

I myself wanted at some stage to establish a causality pattern between

entropy production and its chemical manifestations. So I have devised a

model based on my concept of commutation. This model made it possible to

compute in a non-standard way (i.e. not measuring heat and temperature) the

entropy of a molecule and thus determine the best fitting structure for

it. Because of the experience gained in creating this commutation model,

I think it is possible to devise other models too, each with its own

non-standard computation of the "entropy" of a system. I put the entropy

in quotation marks because this "entropy computed according to the model"

is not exactly equal to the empirically determined entropy.

Nevertheless, we ought to try to improve each such model so that its

non-standard computations make comparisons possible which are isomorphic

(similar form but not equal content) to comparisons based on standard

entropy values.

Such non-standard models become crucially important with respect to living

systems. We cannot apply the standard procedure to living systems because

we will have to cool them off to absolute zero temperature. No living

system would survive it. One way open to us is to consider the living

system as the emergent whole of a complex bio-chemical system consisting

of hundreds, if not thousands, of bio-chemical reactions. We then add all

these hundreds of chemical reactions together, also making provision for

how they interact with each other so that the whole is more than the sum

of the parts. The calculations will be hideously complex. I once did it

for a simple bacterium and it took me months to compute the entropy of the

bacterium.
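A crude sketch of that summation (all numbers hypothetical, including the interaction correction, which is a made-up placeholder for the provision that the whole is more than the sum of the parts):

```python
# Hypothetical entropy changes of individual bio-chemical reactions, J/K.
reaction_dS = [12.3, -4.1, 7.8, 0.9]

# Simple sum of the parts:
parts = sum(reaction_dS)

# Made-up provision for how the reactions interact, so that the
# whole exceeds the sum of the parts:
interaction_dS = 2.5
whole = parts + interaction_dS

assert whole > parts
print(f"entropy of the whole system = {whole:.1f} J/K")
```

In a real organism there are hundreds of such reactions and the interaction terms are themselves complex, which is why the calculation took months.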

Gavin, the extensive account above on the chemical viewpoint of entropy
(involving ordered structures) is in line with my own experiences rather
than with what you will find in standard textbooks on chemical
thermodynamics. My own experiences with entropy involve not only
chemistry and physics, but also soils, plants and animals up to 1983.

Since then my experiences also involved the mental dimension of entropy

and not merely its material dimension. I now want to stress in terms of

these experiences that we will need more than empirically based

calculations and model based computations to uncover the complex outcomes

of entropy production. Measurements and calculation will bring us to a
certain level of understanding, but to proceed beyond that level into
higher levels of understanding, we will have to take more than measuring
and calculating into account. What else will we have to take into
account? I can summarise it in one sentence -- patterns persisting
through all levels in any complex system. To work with determinations of
entropy production alone, and not also with these persisting patterns, is
like trying to work with our five senses but not also our brain. Not even
a moron would do it.

Allow me to explain it in terms of my own empirical discovery that LEP
has both a material and a mental dimension. In my teaching I became
fed-up with the existing taxonomies of learning objectives because they
helped little with, if not actually constrained, the mastery of
chemistry. So I created hundreds of objectives which would definitely
help the student, however unfitting these objectives might be with
respect to the existing taxonomies.

Afterwards I began to search for taxonomical patterns among these

objectives. I felt like Linnaeus, trying to find order among so many

herbarium and museum species. Perhaps the most striking pattern for him

was the distinction between plants and animals. When comparing my work to

it, the most striking pattern was the distinction between structural

objectives ("beings") and procedural objectives ("becomings"). I also

discovered two other clear patterns, but articulating them became a

nightmare for me. The closest I came was to call them the categoricity and

monadicity patterns.

I then decided to measure how efficiently this taxonomy with its three

strange patterns helped the students in their learning. I considered

various domains (like logic, ethics and systematics) for quantifying

some of their learning. Eventually I decided on logic. I learned as much

as possible of logic in its broadest sense so as to quantify each possible

logical ACT as a CHANGE. My greatest problem at that stage was to decide

between a linear and a non-linear quantification. I chose the non-linear

case because, curiously enough, it was easier for me. Obviously, because

of my choice, I measured only the logical dimension of the students'

learning. My "experiment" began in 1982. I began to plot the learning

performances of the students (based on some 50 000 calculations using some

20 000 logical measurements) graphically. The graphs shocked me. I

expected a statistical indication (bell curve, standard deviation, etc.)

to determine the efficiency of their learning. But the graphs showed a

pattern unique to entropy production in the physical world. Then, as a

scientist ought to do, I first established that this unique pattern could

be repeated. Thereafter in the next year 1983 I established that this

unique pattern could not be falsified.

By way of this unique pattern I had to conclude that LEP acts in both the

physical and spiritual dimensions of reality. This unique pattern is

orders more complex than the pattern

. Q/T(low) - Q/T(high) > 0

which allowed Clausius to infer that a strange law (LEP) was operating

here (see the Primer on Entropy). This latter pattern needs four physical

measurements and three calculations whereas the unique pattern mentioned

above needed orders more of logical "measurements" and calculations.

However, in both cases the patterns were as crucial in identifying LEP as

the measurements and calculations.
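With hypothetical numbers, Clausius' pattern -- four measurements, three calculations -- can be sketched as:

```python
# Four measurements (hypothetical values): the heat Q transferred at
# each reservoir and the two absolute temperatures.
Q = 1000.0      # heat flowing from hot to cold, in J
T_high = 400.0  # temperature of the hot reservoir, in K
T_low = 300.0   # temperature of the cold reservoir, in K

# Three calculations: two quotients and their difference.
gain = Q / T_low          # entropy gained by the cold reservoir
loss = Q / T_high         # entropy lost by the hot reservoir
production = gain - loss  # Q/T(low) - Q/T(high)

# The pattern which pointed Clausius to LEP:
assert production > 0
print(f"entropy production = {production:.3f} J/K")
```

Whatever positive Q and whatever pair of temperatures with T_high > T_low are measured, the difference stays positive -- that persistence is the pattern.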

Gavin, to conclude, I want to stress six points. Firstly, you will have to

create your own non-standard model for computing the entropic changes of

systems as a result of processes in them. Secondly, when you consider a

complex ensemble of computations according to your non-standard model and

they exhibit no pattern typical of the manifestations of entropy
production, improve the model rather than pushing a far-fetched
interpretation onto the existing model. Thirdly, you will break new
ground and thus cannot

expect much guidance from the existing literature, which seldom measures up to

what the seven essentialities require. Fourthly, quantify a dimension of

thinking (like I did with logic) of which its structures and processes

have been well established by other thinkers. Fifthly, always try to
differentiate between those computations based on form and those

computations based on content, because the entropy S is the form with the
energy E as its content. Sixthly, when you want to proceed from energy E and

entropy S to work W and free energy F so as to dance with LEP on LEC, seek

for harmony between form and content in the model.

What patterns are typical manifestations of entropy production -- patterns

which your computational model should be able to lift out? I think

foremost is that pattern which will enable you to distinguish between

intensive and extensive variables so that you subsequently can identify

entropic forces and entropic fluxes. Another important pattern is that one

which will enable you to distinguish between changes close to equilibrium

and changes close to the edge of chaos. Then there are also a number of

patterns (like electrophoresis and chromatography) unique to entropy

production. They all have one feature in common -- they depict a

one-to-many-mapping in some or other feature of the system.

I am deeply under the impression that my advice seems to be woolly or even

mystic. However, I am also under the impression that you are learning

creatively about some key issues on entropy production because you write:

>1- Maximum entropy (no flow of energy)
>2- Dynamic equilibrium (i.e. Si increasing and
>Se flowing out)
>3- Not at equilibrium (Si increasing, Se going in
>or out of the system)

Your issue 1 refers to what chemists generally recognise as the

equilibrium state. Your issue 2 refers to what some irreversible

thermodynamists recognise as thermokinetics -- something between

thermostatics and thermodynamics proper. Your issue 3 refers to the

endless becoming of a system SY together with its surroundings SU. Some

systems become extinct, others exist without change while the rest evolve

into systems with more complex organisations -- all as a result of

non-equilibrium conditions. Thus your three issues indeed cover a very

rich picture.

Lastly, you write:

>This connects with the article on grace and
>learning (LO 26327) the 14th paragraph about
>transfer of E and S to and from a system.
>
>Looking forward to some interesting answers.

I am deeply under the impression that the understanding which I gained by

seeking the wholeness between entropy production from below and loving

grace from above is incomprehensible to many other fellow learners. Even

worse, I have to recommend strongly to fellow learners that they ought to

question my understanding with their own authentic learning rather than

importing it by rote learning. Insight gained by inner irreversible

entropy production /_\(irr)S is vastly superior to understandings imported

from the outside by reversible entropy changes /_\(rev)S. The former

drives all kinds of evolution whereas the latter can easily undo

evolution. The authenticity of fellow learners is much more important to

me than my ramblings on entropy (the picture) and its production (the

movie).

My long answer may not be as interesting as you might have expected. It

says, in a grand summary, that although empirical measurements are

necessary to our understanding, they are not sufficient. We also need to

extend them with computations based on models so as not to destroy the

system when measuring it. With respect to the sufficiency requirement, we

have to rely far more on patterns indicative of entropy production. Our

understanding involves more than measurements because the essentiality

otherness ("quality-variety") cannot ever be reduced to or be replaced by

spareness ("quantity-limit").

Our understanding also involves mental emergences and thus the

essentiality openness. To measure the novel requires an already

established unit, which thus cannot itself be novel. The innovator has no

peers. Thus it is impossible to identify novel emergences with

measurements. It is easier for a blind person to identify any painting as

a great work of art. To identify the miracle of emergences we have to open

ourselves up to patterns persistent in this miracle.

Gavin, I admire your opening up to new dimensions of understanding. You

have surprised me because I sometimes speculated privately that you are

also a dealer in the proliferating junk of systems thinking. Although I

try to avoid judgement, I cannot avoid speculations because they are

integral to the scientific method as I understand it. I need to make

speculations on my initial observations so as to falsify them. The

remaining speculations which cannot be falsified help me to complexify my

own understanding. You are indeed a thinker struggling with managing the

complexity of human systems. The future of humankind depends on thinkers

like you because managing human systems with simplicity is bringing

humankind to the edge of chaos with its inevitable bifurcation. Simplicity

management will eventually propel humankind into an immergence too ghastly

to contemplate. But complexity management will attract humankind into the

emergence of a higher consciousness where harmony rather than

confrontation will be sought.

The more I explore the dance of LEP on LEC, the more I become convinced

that this dance will play a crucial role in overcoming the looming ethical

dilemmas of late. These dilemmas stem from the fact that whereas many

humans now create immensely complex systems, very few humans can manage

these complex systems for the betterment of the entire Creation. Gavin, please

take extreme care when you introduce the complexity of LEP dancing on LEC

to the managers of complex systems whom you facilitate. It will not be as mild as

putting a cat among chickens in a pen.

Entropy production is real, even and especially when we merely speak about

it. When we merely speak about it, we deluge others with our own entropy

production rather than guiding them to produce entropy themselves when learning

about entropy production. Any deluge of entropy is dangerous for all

complex, self-organising systems. How I wish for the seemingly impossible,

namely that every long contribution of mine on entropy production to this

LO-dialogue were not so dangerous. The only way to minimise this danger

when speaking of entropy production, is to do it by way of dialogue in a

LO. Thus I would appreciate your comments as well as the contributions of

fellow learners very much.

With care and best wishes

--At de Lange <amdelange@gold.up.ac.za> Snailmail: A M de Lange Gold Fields Computer Centre Faculty of Science - University of Pretoria Pretoria 0001 - Rep of South Africa

Learning-org -- Hosted by Rick Karash <Richard@Karash.com> Public Dialog on Learning Organizations -- <http://www.learning-org.com>

