by MrBean » Sun 24 Oct 2004, 22:42:52
MonteQuest wrote:
"The only definition there is to consider is the one defined by the second law of thermodynamics. This is the new world view we must embrace."
But isn't defining (dis)order by entropy tautological?
A snip from the book I mentioned (pp. 137-139):
"Such transformations between randomness and simple regular orders are intimately related to the entropy of a system. The notion of entropy is a concept of particular importance, not only in physics but in chemistry and the life sicences. Entropy is popularly described as the measure of disorder in a system. a notion that clearly carries subjective overtons. On the other hand, the science of thermodynamics enables the quantity known as entropy to be measured objectively in terms of the amount of heat and work that is associated with a system. Left to itself, a physical system tends to maximize its entorpy, a process which is therefore associated with decay, disintegrating, "running down", and increasing disorder in the system. But accordint to the metaphor that chaos
is order, an increase in entropy has to be understood in a different way, that is,in terms of a kind of change of order.
Of key importance in this connection is the idea of a range of variation in random and chaotic motion. This idea was introduced earlier in the case of the grouping of shots from a gun. A more interesting example, however, arises from a river that is in chaotic movement. Imagine an irregular and changing whirlpool that fluctuates in a very complex way, but always remains within a certain region of the river. The whirlpool may perhaps be roughly determined by neighboring rocks or features in the riverbed. As the velocity of the river increases, this variation in space may grow. But in addition, there will also be an inward growth of subvortices of ever finer nature. Therefore, a measure of the overall range of variation of the whirlpool should include both of these factors - the inward and the outward growth.
As a matter of fact, in classical mechanics, a natural measure of this kind has already been worked out. Its technical name is phase space, and its measure is determined by multiplying the range of variation of position and the range of variation in momentum. The former, the range of variation in position, corresponds roughly to the changes in location of the vortex as it spreads out into the river and the surrounding water becomes more agitated. The latter, the range of variation of momentum, corresponds to the extent to which the whirlpool is excited internally so that it breaks into finer and finer vortices.
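[My aside, not Bohm's: to make the phase-space measure concrete - this is just standard classical mechanics - for N particles the measure is the product of the position and momentum ranges over all degrees of freedom,

\[ \Gamma \;=\; \prod_{i=1}^{3N} \Delta q_i \, \Delta p_i , \]

so widening either the spatial spread (the \Delta q_i, the vortex wandering over the river) or the internal agitation (the \Delta p_i, the ever finer subvortices) enlarges \Gamma.]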
Clearly the measure in phase space corresponds quite well to an intuitive notion of the overall degree of order involved in the flow. For the more the general location of the vortex expands, the higher is the degree of order; and the more the internal vortices subdivide, the higher also is the degree of order. What is particularly interesting about this measure in phase space is that it corresponds to what is actually used in physics to define entropy.
Entropy is a concept which is of vital importance in many areas of science, yet which lacks a clear physical interpretation. For example, there has been much debate on the extent to which the concept of entropy is subjective or objective. However, with the present approach to the notion of order, chaos and randomness, it is now possible to clarify what is meant by entropy.
Consider an isolated system of interacting particles. Each particle acts as a contingency for all the others, in such a way that the overall motion tends to be chaotic. When such a system is left to itself, it moves toward what is called thermal equilibrium, a condition in which there is no net flow of heat or energy within the system and regular suborders vanish almost entirely. In this state of equilibrium, the entropy of the system is at its maximum. This maximum entropy is therefore associated with the inability of the system to carry out work, transfer useful energy from one region to another, or in any other way generate global orders of activity.
In statistical mechanics the numerical value of this entropy is calculated from the range of random motion in phase space. (To be more exact, it is the logarithm of this measure.) This means that when energy is added to the system, the range of random motion will grow and the corresponding entropy will increase.
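[My aside again: that "logarithm of this measure" is just Boltzmann's entropy formula from standard statistical mechanics, which I'm adding for concreteness. For a macrostate with phase-space volume \Gamma, or equivalently W accessible microstates,

\[ S \;=\; k_B \ln W \;=\; k_B \ln\!\left( \Gamma / h^{3N} \right) , \]

where k_B is Boltzmann's constant and the factor h^{3N} merely makes the 6N-dimensional phase-space volume dimensionless. Pumping energy into the system enlarges \Gamma, so S grows, exactly as the passage says.]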
A change in entropy is therefore a measure of the change in the range of fluctuations that occur within the random order. Entropy now has a clear meaning that is independent of subjective knowledge or judgement about the details of the fluctuation. This approach to entropy does not require any discussion of disorder, which in any case cannot be defined in a clear way. Treating entropy in this fashion avoids many of the difficulties normally associated with this topic, such as the subjective notion of what could be meant by disorder. After all, since entropy is an objective property of a system which can actually be observed with the aid of thermodynamic processes, why then should subjective and ultimately undefinable feelings about disorder affect the objective behavior of such a system?"
This gives only a small glimpse of Bohm's notion of order; I hope you read the whole book.