Review

Structuring Information and Entropy: Catalyst as Information Carrier

20 avenue des Cottages, 69300, Caluire, France
Entropy 2006, 8(3), 113-130; https://doi.org/10.3390/e8030113
Submission received: 16 March 2006 / Accepted: 15 June 2006 / Published: 16 June 2006

Abstract

Many authors have tried to exploit the similarities between the expressions of statistical thermodynamics for entropy and those of Shannon's information theory. In a new approach, we highlight the role of information in chemical systems, in particular in the interaction between catalysts and reactants, which we call structuring information. By means of examples, we present some applications of this concept to the biosphere, covering a very vast domain that extends from the appearance of life on earth to its present evolution.

Introduction: Entropy and Information Revisited

The analogy between entropy and information has been proposed by many authors since Brillouin [1,2], who noted the similarities between the expressions of statistical thermodynamics for entropy and those of Shannon's information theory [3].
For Boltzmann, entropy is linked to the number W of complexions of a system:
S = kB·ln(W)      (1)
kB being Boltzmann's constant = 1.38 × 10⁻²³ J/(molecule·K).
Accordingly, the entropy S of a system can be expressed as follows:
S = -kB·Σ pi·ln(pi)      (2)
pi being the probability of an elementary state characterized by its level of energy as well as by its spatial conformation.
Relation (2) reduces to relation (1) if all of the states are equally probable.
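As a one-line check (a standard derivation added here, not part of the original text), setting pi = 1/W for the W equiprobable states in relation (2) gives:

```latex
S = -k_B \sum_{i=1}^{W} p_i \ln p_i
  = -k_B \sum_{i=1}^{W} \frac{1}{W}\,\ln\frac{1}{W}
  = -k_B \left(-\ln W\right)
  = k_B \ln W
```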
These relations stemming from Boltzmann's theory apply to thermodynamic systems whose different possible configurations are linked to their energy levels. The introduction of the constant kB leads to entropy being expressed in J/(molecule·K).
For any system made up of various elements or objects, Tonnelat has proposed a relation similar in form to (1) [4]:
C = ln(Ω)      (3)
C being the complexity of the system and Ω its realizability, equal to the number of possible states of the system or complexions. C is dimensionless, since the constant kB was not used in this formulation.
The realizability can be linked to energy levels, as in the framework of the Boltzmann theory, as well as to spatial configurations and to any characteristics of the system's elements. For a molecular system, the complexity is comparable to conformational entropy.
Shannon's theory of information proposed to quantify the information contained in any message made of a succession of symbols xi via the following expression:
H = -Σ pi·log2(pi)      (4)
pi being the frequency of use, or probability of appearance, of the symbols xi used to write the message. The set of all of the symbols xi constitutes the alphabet used.
Function H represents the average information carrying capacity of symbols. The information content of a message made of N symbols will therefore be expressed by the product N.H.
More generally, the amount of information Hi linked to an event with a probability of appearance pi can be evaluated using the following expression:
Hi = -log2(pi)      (5)
The information quantity unit as defined by the above relation is called a bit and corresponds to a probability of occurrence of the event of 1/2.
To illustrate these concepts, let us consider the classic case of a number expressed in binary notation stored in computer memory. If we have N digits, the number of complexions W or realizability Ω is 2^N; therefore the complexity is given by:
C = ln(Ω) = ln(2^N) = N·ln(2)
The amount of information stored by this number is obtained from the relation (4) as follows:
I = -N·[Σ pi·log2(pi)]   with   pi = 1/2
And, as is well known, I = N bits.
When an N-digit number is stored, the state of the memory is fixed and the complexity becomes zero. Therefore, it can be said that the introduction of N bits of information reduces the complexity from ln(2^N) to 0.
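A minimal Python sketch of this bookkeeping (an illustration added here; the variable names are ours):

```python
import math

N = 8  # number of binary digits stored in the memory

# Complexity of the unset memory: ln of the number of possible states, 2**N
C_before = math.log(2 ** N)  # = N * ln(2)

# Shannon information of an N-digit binary number, with p = 1/2 for each digit value
p = 0.5
I = -N * (p * math.log2(p) + p * math.log2(p))  # = N bits

# Once the number is stored, the memory has a single possible state: zero complexity
C_after = math.log(1)

print(f"C before = {C_before:.2f}, I = {I:.0f} bits, C after = {C_after:.0f}")
```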
While the similarity between the expressions (2) and (4) is obvious, the equivalence between entropy and the quantity of information contained in a system is more difficult to accept. This link between entropy and information has even been hotly disputed by many authors, Tonnelat in particular [4].
We must always be careful when using analogies that can only be formal and do not correspond to any physical reality. We shall, however, try to revisit this formal analogy between entropy and information and formulate it differently from what has been proposed thus far.
Let us take an example from everyday life and consider the case of sorting a set of N distinguishable objects, characterized by different parameters such as their shape, their color, the material they are made of, their dimensions, etc.
At first we have a set of N objects, mixed randomly, without identification of the characteristics of each object. Dealing with a set of objects, we shall use the complexity C rather than the entropy S to characterize its degree of organization.
The complexion W, or realizability Ω, of this set can be calculated from the number of possible permutations between the objects. We thus write:
W = Ω = N!
The complexity C associated with this situation can be obtained using equation (3), i.e.:
C =ln(Ω)=ln(N!)
Now, suppose we can sort these objects based on n adequate criteria. After sorting, the N objects will be distributed between n groups in which the numbers of objects will be Ni (i=1 to n). The complexity C’ of the new set thus modified is then calculated as follows:
Ω' = ∏(Ni!)   (i = 1…n)
C' = ln(Ω') = ln(∏(Ni!))
The quantity of information corresponding to the set, and necessary to characterize it, can be obtained using Shannon's formula (equation (4)):
H = -Σ pi·log2(pi)
With
pi=Ni/N
The quantity of information is then calculated as indicated above:
I = -Σ [Ni·log2(Ni/N)]
We can therefore compare the variation in this information with the variation in complexity of the system following the sorting performed based on the information linked to the criteria used:
∆C = ln(∏(Ni!)) - ln(N!)
∆I = -Σ [Ni·log2(Ni/N)] + log2(1/N)
As an example, let us assume the following numerical values: N = 100 balls, identical except for their color, of n = 4 different colors: N1 = 25 (black), N2 = 20 (white), N3 = 40 (red), N4 = 15 (green).
From the previous relations, we can calculate ∆C and ∆I:
∆C = -125   ∆I = 184 bits
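These two figures can be checked with a few lines of Python (a sketch added for illustration, using the group sizes above):

```python
import math

N = 100
groups = {"black": 25, "white": 20, "red": 40, "green": 15}

# Complexity before and after sorting (equation (3) with Ω = N! and Ω' = ∏ Ni!)
C_before = math.lgamma(N + 1)                                 # ln(N!)
C_after = sum(math.lgamma(Ni + 1) for Ni in groups.values())  # ln(∏ Ni!)
dC = C_after - C_before

# Information brought in by the sorting criteria (Shannon formula, base 2)
dI = -sum(Ni * math.log2(Ni / N) for Ni in groups.values()) + math.log2(1 / N)

print(f"ΔC ≈ {dC:.0f}   ΔI ≈ {dI:.0f} bits")  # ΔC ≈ -125, ΔI ≈ 184 bits
```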
We may thus conclude that using information when modifying a system leads to a reduction of its complexity.
Many examples of the same type could be proposed to illustrate our conclusion: a limited amount of information can greatly increase the probability of an event and therefore render this event very likely. Therefore, the link between entropy and information, or more generally, between complexity and information could be formulated as follows:
When modifying a system made up of many components, information input can reduce the random complexity of the system and therefore increase its degree of organization.
We suggest calling this information, which modifies the complexity of a system, structuring information, because, when introduced into the given system, it enables its structure to be changed.
The use of Shannon information, as defined above, is nonetheless restricted to systems similar to a message composed of a succession of characters or events. Generally, it is necessary to have an adapted definition of a system's information content. Roederer calls this pragmatic information [12].
The notion of complexity is often used intuitively, which sometimes creates confusion. Two sorts of complexity should be distinguished, however [5]: random complexity and organized complexity. The first involves disorder or chaos while the second corresponds to a highly structured system, often rich in information.
The following table summarizes these various notions by qualitatively indicating their possible evolutions.
Disorder, Chaos → Order
Entropy + Negentropy → Decrease in entropy
Random complexity + Structuring information → Organized complexity
The term "negentropy" was introduced by Brillouin and we use it as a synonym of decrease in entropy.
Any physicochemical transformation is subject to the laws of thermodynamics, which can be summarized as follows:
- Conservation of energy;
- Spontaneous evolution of a closed system if its free energy decreases;
- Any irreversible transformation is accompanied by an increase in entropy.
Since they were formulated, these requirements have seemed to contradict the appearance and development of life on earth, which involves a decrease in entropy. In fact, there is no contradiction: for a system receiving free energy from its environment, local production of negentropy is possible. This argument, accepted by many authors as an explanation of the appearance and development of life, was even considered by E.J. Chaisson as the fundamental mechanism of cosmic evolution [27].
The entropy of a physicochemical system made up of molecules can, in principle, be evaluated using classical thermodynamics. On the other hand, the definition of its complexity, as well as of its information content, is not straightforward. The complexity could be linked to the number of different atomic species involved in the molecules and to their bondings. For this purpose, group contribution methods could perhaps be developed, as has been done for the evaluation of the properties of pure compounds [30].

Application to the biosphere

Hereafter, "biosphere" is taken to mean all space around the earth in which living organisms are present (the atmosphere, hydrosphere and lithosphere). In this vast space, there is a great variety of organisms ranging from plants, single-cell bacteria and animals to humans. Therefore, we are facing a very complex system in which physical as well as chemical mechanisms are involved in a highly intricate way. Almost all of these mechanisms are linked to the energy input in the form of solar radiation.
Most of the energy received by the biosphere is transformed into heat, which means a destruction of available free energy and a production of entropy. As a first approximation, the free energy destruction in the biosphere may be taken as the difference between the energy delivered by the solar radiation and the energy of the waste heat emitted by the earth. Only a small part of this flow of energy is used by the biosphere for the development of living organisms.
These flows of energy and entropy have been evaluated by several authors and the following figures are generally accepted [19]:
  • Net energy input into the biosphere : 119 000 TW
  • Entropy production: around 500 TW/K (see note 1)
  • Energy consumption by living organisms : 95 TW
The bulk of the energy received by the earth is either reflected or degraded into heat and therefore produces entropy. Yet, based on what has been said above, a small part subtracted from the total flow of energy is utilized by the biosphere and used for the development of life.
It is clear that life is related to solar energy, which globally generates negentropy if its degradation into heat is avoided. However, this available free energy must be captured and used to carry out transformations with a decrease in entropy. It is at this stage that information, in any appropriate form, could play its role, as illustrated in the example given in the previous paragraph.
Let us take another very simple example to clearly illustrate how information could enable a physical operation leading to a decrease in entropy. The isothermal mixing of two ideal gases, A and B, produces an increase in entropy ∆S, expressed as follows:
∆S = -n·R·[xA·ln(xA) + xB·ln(xB)]
xA and xB being the molar fractions of A and B in the mixture made up of a total of n moles.
The content of Shannon information that can be associated with this mixture is obtained using equation (4):
I = -n·[xA·ln(xA) + xB·ln(xB)]
Using the above equation depends on whether or not it is possible to know the values of xA and xB. This is relatively easy for an observer with instruments such as a gas chromatograph or a mass spectrometer. However, achieving this requires knowing the identities of the A and B species, i.e. qualitative information about the mixture.
Let us now separate this mixture into its two components A and B, using an appropriate means, such as adsorption on a solid surface or permeation through a membrane. The decrease in entropy of the gaseous system will be –∆S.
As the mixing of gases is not a reversible operation, this decrease in entropy requires an input of external free energy. However, separation is possible only if the information associated with the mixture is used to select the solid adsorbing surface or the membrane in order to differentiate the two gaseous species. It could be said that the solid adsorbent or the membrane uses the information linked to each molecule of gas in order to produce its separation. The process results from the interaction between the gas and the information-selected separation medium.
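A small numerical sketch of this correspondence (added for illustration; an equimolar mixture of one mole total is assumed):

```python
import math

R = 8.314           # gas constant, J/(mol*K)
n = 1.0             # total number of moles in the mixture (assumed)
xA, xB = 0.5, 0.5   # assumed equimolar mixture of ideal gases A and B

# Entropy of isothermal mixing (positive); separation would remove exactly this amount
dS_mix = -n * R * (xA * math.log(xA) + xB * math.log(xB))   # J/K

# Information associated with the mixture, written with natural logarithms as in the
# text, so that I = dS_mix / R
I = -n * (xA * math.log(xA) + xB * math.log(xB))

print(f"ΔS_mix = {dS_mix:.3f} J/K,  I = {I:.3f},  ΔS/R = {dS_mix / R:.3f}")
```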
When dealing with physicochemical systems involving atoms, ions and molecules, the information linked to these components is to be found in the electronic configuration responsible for the existence of chemical bonds. These bonds take various forms (ionic, covalent, coordination, orbitals, etc..) depending on the number of electrons involved and their localization. The spatial conformation of molecules also carries information [29].
As stated by E.J. Chaisson, at the most basic level, gravitation is a promoter in the evolution of all organization [27]. Therefore, for systems made up of large amounts of materials, fundamental laws, like Newton’s mass attraction law, could be considered as a source of information. Such an example may be found in the transformation of heat generated by the absorption of solar radiation at the earth's surface, generating a temperature gradient in the troposphere and, consequently, wind around the globe. In this case, the promoters of organization are gravity (F=mg) and the behavior of the gases (PV=nRT), allowing the transformation of heat into kinetic energy, and therefore a reduction in entropy. At the same time, the vaporization of water at the surface of the globe and its condensation at high altitude generates hydraulic potential energy. As a result, a giant natural thermal engine produces both kinetic and potential energy, utilizing air and water as working fluids and the difference in temperature between the earth's surface and the stratosphere, the driving force being linked to gravity that steers the transformation.
The flux of energy for these two phenomena is estimated at approximately 10² W/m² of the earth's surface [17]. If this process is considered as a thermal engine working between the temperature of the surface of the globe (around 300 K) and that of the high atmosphere (around 250 K), we can calculate a maximum yield of 17%. It may therefore be deduced that the mechanical energy thus generated (17 W/m²) corresponds to a production of negentropy of 17/300 W/(K·m²), or, for the whole globe, approximately 29 TW/K. However, we must note that this mechanical energy will eventually be degraded into heat and dissipated in the form of infrared radiation.
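The arithmetic of this estimate can be laid out as follows (our own sketch; the earth's surface area of about 5.1 × 10¹⁴ m² is an input we supply, not a figure from the text):

```python
# Rough estimate of the atmospheric "thermal engine" described above.
T_hot, T_cold = 300.0, 250.0   # K: surface and high-atmosphere temperatures
flux = 100.0                   # W/m^2: heat flux driving winds and the water cycle
area = 5.1e14                  # m^2: approximate total surface of the globe (our input)

eta = 1 - T_cold / T_hot                           # maximum (Carnot) yield ≈ 17%
mech = eta * flux                                  # mechanical power ≈ 17 W/m^2
negentropy_flux = mech / T_hot                     # ≈ 0.056 W/(K*m^2)
negentropy_total = negentropy_flux * area / 1e12   # ≈ 28-29 TW/K for the whole globe

print(f"yield = {eta:.0%}, mechanical power = {mech:.0f} W/m^2, "
      f"negentropy ≈ {negentropy_total:.0f} TW/K")
```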
This example shows, as previously said, that a flow of free energy is required to generate negentropy. However, we must point out that this is not sufficient. The system must present a kind of organization allowing the saving of free energy and the production of negentropy. This is valid for any thermal engine. To illustrate this, let us compare a furnace and a gas turbine fueled with natural gas. The furnace produces heat exclusively (i.e. entropy), whereas the turbine generates mechanical power as well as heat. This difference is linked to the internal organization of the gas turbine.

Chemical reactions

It is worth noting that the biochemical reactions involved in life processes take place under conditions chemists consider mild compared with those used in industry. Generally, the pressure is close to one atmosphere, while the temperature is around 20 °C, generally ranging from 0 °C to 50 °C. In spite of these mild operating conditions, the rates of the reactions involved are surprisingly high, thanks to the use of very active and highly selective catalysts.
Various theories have successively been proposed to formulate and explain the expressions of chemical kinetics observed in practice [6]. Thus, for a simple reaction, the proposed expressions always include two terms that may appear as follows, in a simplified way:
r = k·α·exp(-∆G*/(kB·T))      (6)
With:
  • r: rate of a simple reaction of the type A→B or A+B→C
  • α: a term bringing in the concentrations of the reagent(s)
  • ∆G*: variation in the free energy of the reactive molecules upon activation (J/molecule)
  • k: kinetic constant
  • kB: Boltzmann's constant
  • T: Temperature at which the reaction takes place (K)
Several models have successively been proposed to explain the form of equation (6). By taking into account an energy barrier that the reactants must cross before being able to react, a model has progressively been refined into what is often called the theory of the transition state.
The main hypothesis of this model is that the reactants, for example A and B, form an activated transition state AB* before transformation into reaction product C. The rate of reaction is thus written proportional to the concentration of this intermediate compound AB*, which moreover is supposed to be in thermodynamic equilibrium with the reactants A and B.
This equilibrium is typically expressed as:
CAB* / (CA·CB) = Keq      (7)
The equilibrium constant depends on the temperature and the variation in free energy (∆G*) accompanying the formation of the transition state:
Keq = exp(-∆G*/(kB·T))      (8)
In addition, we have: ∆G*=∆H*-T·∆S*
  • ∆H* being the enthalpy of activation, expressed in J/mol
  • ∆S* being the entropy of activation, expressed in J/(mol·K)
  • R: constant of perfect gases = 8.31 J/(mol·K)
These variations in enthalpy and entropy are relative to the formation of the transition state complex from the reactants.
Equation (6) can then be written:
r = k·α·exp(∆S*/R)·exp(-∆H*/(R·T))      (9)
In practice, to express experimental kinetic measurements relative to an A+B→C reaction, the kinetic constant is often written as follows:
kA+B = A·exp(-E/(R·T))
kA+B being the kinetic constant for reaction A+B→C
E being the energy of activation typically expressed in J/mol.
A is called the pre-exponential factor:
A = k·exp(∆S*/R)
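To make the respective roles of ∆H*, ∆S* and k concrete, here is a short Python sketch of equation (9); all numerical values are arbitrary illustrations, not data from the article:

```python
import math

R = 8.314  # J/(mol*K)

def rate(k, alpha, dS_act, dH_act, T):
    """Rate of a simple reaction following equation (9): r = k*α*exp(ΔS*/R)*exp(-ΔH*/(R*T))."""
    return k * alpha * math.exp(dS_act / R) * math.exp(-dH_act / (R * T))

# Arbitrary illustrative values (not from the article)
T = 300.0        # K
alpha = 1.0      # concentration term
dH_act = 80e3    # J/mol: activation enthalpy, same order with or without catalyst
dS_act = -120.0  # J/(mol*K): activation entropy, also kept identical in both cases
k0 = 1.0         # kinetic constant without catalyst (arbitrary units)

r_uncat = rate(k0, alpha, dS_act, dH_act, T)
r_cat = rate(1e6 * k0, alpha, dS_act, dH_act, T)  # the catalyst acts on k, not on ΔH* or ΔS*

print(f"rate ratio (catalysed / uncatalysed) ≈ {r_cat / r_uncat:.1e}")
```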
For catalytic reactions, it is often considered that the catalyst has the effect of lowering the free energy of activation for the formation of the transition state compound. However, as the enthalpy of activation of catalytic reactions is often of the same order of magnitude as that of homogeneous reactions (between 40 and 200 kJ/mol), the acceleration of the reaction due to the catalyst should then be linked to the pre-exponential factor A, in which the entropy of activation is lumped.
The entropy of activation is often negative, and therefore the increase in rate cannot be due to it, but rather to the large increase in the rate constant k of equation (9).
This can be interpreted by the fact that the catalyst favors the formation of a transition complex, which is much more reactive and therefore responsible for the acceleration of the reaction. In the same way, this will also affect the selectivity.
Qualitative explanations of the specific action of the catalyst have often been proposed [6]. The geometric structure of the solid surface of the catalyst was first considered: for example, the inter-atomic distances of the metal elements responsible for the catalytic activity are similar to the bond lengths of the reactants or of the products. The most striking example is the reaction in which benzene is formed from acetylene on a palladium surface. Not only are the inter-atomic distances of the palladium atoms equal to the bond length between the carbon atoms of acetylene, but furthermore the atomic network of the palladium forms hexagons of the same size as the benzene molecule.
Other types of catalyst, whose geometric characteristics are of prime importance, are called shape-selective: their activity and selectivity are linked to the size and shape of the molecules to be transformed. Many industrial processes use such catalysts, made of zeolites.
To have a more general overview of the roles a catalyst can play, we may list some of the ways a catalyst can act, as presented by R. Masel [6]:
- By stabilizing the transition state,
- By holding the reactants in close proximity,
- By holding the reactants in the right configuration to react,
- By blocking side reactions,
- By stretching bonds to make them easier to break.
This need to form a transition compound between catalyst and reagent with the best possible spatial configuration for the reaction to take place is certainly even more imperative when homogeneous catalysts or enzymes are used.
Very illustrative examples can be found in polymerization reactions. The case of polypropylene production is quite interesting since two forms - isotactic or syndiotactic, depending on the respective positions of the methyl groups - can be obtained [6]. The isotactic form is produced industrially by using a Ziegler-Natta catalyst made of Ti surrounded by organic poly-aromatic ligands. The active site is the Ti atom, but the ligands play two different roles: one is the binding site and the other prevents the propylene from twisting in order to get methyl groups onto the same side of the polymer chain.
With a similar aim, a recent publication has shown that polymers with a stereocomplex structure can be synthesized by using a molecular-scale stereoregular template as catalyst, in order to transfer its stereoregular structure to the formed polymer [25].
Concerning enzymes, which are at the basis of all biological activity, specialists have advocated the formation of stereospecific complexes [10]. To underline this essential aspect, enzymatic reactions are often represented by means of very suggestive schemes (lock and key, for example), clearly showing this geometric and steric character [11]. Furthermore, this adapted spatial configuration is often at the origin of the enantio-selectivity of the reactions [13]. Several authors have shown that stereoselectivity is due to the entropy term [14,24].
Enzymatic catalysts also stabilize the transition state. This capacity has recently been exploited by building catalytic antibodies that selectively stabilize the transition state for a given reaction. For example, ester hydrolysis and the Diels-Alder reaction have been the subject of many studies aimed at synthesizing esterase antibodies and Diels-Alderases [20]. In these situations the enzyme is considered as an entropy trap: the formation of the transition state is accompanied by a decrease in activation entropy, with ∆S* values for the Diels-Alder reaction ranging between -30 and -40 J/(mol·K).
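As a rough order-of-magnitude illustration (our own arithmetic, not taken from the cited studies), the entropic factor exp(∆S*/R) corresponding to such activation entropies can be evaluated as follows:

```python
import math

R = 8.314  # J/(mol*K)

# Entropic contribution exp(ΔS*/R) to the rate, for the quoted range of activation entropies
for dS_act in (-30.0, -40.0):   # J/(mol*K)
    print(f"ΔS* = {dS_act:.0f} J/(mol·K)  ->  exp(ΔS*/R) ≈ {math.exp(dS_act / R):.1e}")
```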
Reactions in solutions in more or less polar solvents have also been investigated. The variations in rate were correlated to the polarity of the solvents and to their ability to form an active intermediate compound [6,9].
Nonetheless, we must keep in mind the fact that a high rate of reaction does not necessarily entail a complete conversion of the reactants into products. To accomplish this, the thermodynamics of the transformation must be favorable, i.e. the total variation in free energy must be negative. The role of the transition activated complex is to direct the transformation, but certainly not to let the system evolve towards a state that is not acceptable by thermodynamics.
The catalyst, whatever its physical form (heterogeneous, homogeneous, in solution, enzyme, etc.), may therefore be considered as an information carrier and its structuring effect appears as soon as there is an interaction with the reactant(s) to form the activated complex. This interaction could therefore be analyzed in terms of information exchange between the catalyst and the molecules of reactant(s).
When applying the above approach to chemical reactions and before extending it to other systems, it must be clear that the structuring information concerns the formation of a transition state, which, in favorable environment conditions, will be transformed into the final products of the reaction.
By considering the kinetics of catalytic reactions to establish a link between entropy and information, we have concluded that even the simplest catalyst carries structuring information responsible for the increases in rate and selectivity. For homogeneous catalysts and enzymes, the role of spatial conformation, coupled with structural information, was already recognized as very important [7,11]. Furthermore, the link between the selectivity of these complex catalysts and activation entropy has been shown by several authors [14].
Associating the specificity of enzymes with the information carried by these molecules has led to the progressive understanding of mechanisms involved in cell reproduction and the explosion of the knowledge relative to the genome [10]. The concept of genetic information is now a universally accepted means of expressing the discoveries of molecular biology [2].
Antigens operate with the same mechanism, which associates molecular structures bearing complementary information. An antigen provokes an immunological reaction that is linked to the physicochemical nature of its functional groups; it is detected by the recognition structures of the immune system. The concept of the antigenic determinant site is one of the fundamental paradigms of immunology: any antigen molecule can be considered, as regards its ability to react with specific recognition molecules, as a molecular structure involved in a specific connection with stereospecifically complementary antibody sites. In a similar way, through its surface proteins, a virus recognizes specific membrane receptors of the target cells.
It is certainly difficult to evaluate the amount of information involved in the interaction between catalyst and reactant(s), except in a relative way. As a first approximation, one could take the ratio between ∆I and ∆S obtained for the separation of a mixture of two gases: ∆I = ∆S/R. It is now possible to calculate free energy surfaces and activation entropies [8]. This could enable the relation between entropy and information to be quantified. By considering the various possible ways of binding the molecules during the formation of the transition state complex (covalent or hydrogen bonds, orbitals, electric charges, etc.), additivity rules could perhaps be progressively established in order to predict binding affinities and estimate simultaneously the entropy and the amount of information involved in this fundamental step [26]. Furthermore, it must be noted that a new area of research called "molecular information theory" is being developed to study genetic systems and their interactions [23].

Extensions to other areas

We will now try to identify some scientific fields in which the concept of structuring information can bring a new, unifying view. J.G. Roederer, who introduced the concept of information-driven interactions in biological systems, has recently proposed a similar approach [12].
While the development and transmission of life in all of its forms are relatively well understood, a gray area still remains concerning the appearance of life on earth [15,16]. One popular hypothesis concerns the participation of minerals in the building of the first organic molecules necessary for the construction of living systems. Before the appearance of life on earth, the various chemical elements were indeed found only in the form of gases, salt water and various solid minerals. However, some kinds of organization appeared through the formation of crystals and other mineral structures such as clays.
Among the proposed hypotheses, the following one has often been cited: simple molecules (H2O, CO2, CH4, NH3, etc..) present in the gaseous atmosphere or dissolved in the oceans were adsorbed on the surface of minerals. Next, various catalytic reactions produced more complex molecules, used as precursors for obtaining the chemical functions necessary for life (amino-acids, sugars, fatty acids, etc.).
Another hypothesis, called the "hydrothermal vent theory", states that submarine volcanic hot water jets could create a favorable environment for prebiotic chemistry [28]. In this context, minerals, mostly the iron and nickel sulfides abundant in deep-sea volcanic vents, could provide catalytic species (solid or in solution) as well as free energy [31].
It must be pointed out that sunlight plays a particularly important role by furnishing the free energy required for the formation of increasingly complex organic molecules [17].
Going back to our presentation on catalytic reactions, we could thus propose that the minerals of the early earth provided the structuring information required for the prebiotic reactions. Afterwards, the amount of useful structuring information increased while diversifying, with new chemical compounds slowly being synthesized.
Let us review the various information carriers used by the catalytic reactions considered thus far:
Solid catalysts [6]: crystalline structure, electric charge distributed on atoms at the surface, electron mobility, etc.
Homogeneous catalysts [18]: anions or cations possessing one or several electric charges, hydrogen bond, electronic density distribution in the various molecular orbitals, etc.
Enzymes: the same as for the homogeneous catalysts, but with the spatial distribution of the active groups being of greater importance [11].
Pursuing our inventory of information carriers in the living world, we find components structured in a much more systematic way. The best example is unquestionably the genetic code, written using only four letters, the four bases: Adenine (A), Cytosine (C), Guanine (G) and Thymine (T). Using these four letters, three-letter words are formed, the triplets or codons responsible for the synthesis of the proteins that make up living organisms. Only 20 amino-acids are thus assembled, through enzyme catalysis, to form proteins that vary widely in terms of composition and molecular weight. In these complex processes, the carriers of the structuring information may be identified: the codons, which use RNA to select the enzymes necessary for protein synthesis. All this information is stored in the DNA, from which it can be extracted whenever required.
The association of the 4 bases in 3-letter words, the triplets or codons, makes it possible to form 4³ = 64 words, a figure much greater than 20, the number of amino-acids to be selected during the synthesis of proteins. We must underline the enormous amount of information stored in the DNA, which, in human beings, contains about 75,000 genes, each gene requiring between 2,000 and 5,000 bases, without counting the intergenic zones that account for more than 70% of the DNA.
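A small, illustrative calculation of the capacity of this code (our own back-of-the-envelope figures, in the spirit of Shannon's formula):

```python
import math

bases = 4          # A, C, G, T
codon_length = 3
amino_acids = 20

codons = bases ** codon_length        # 64 possible triplets
bits_per_codon = math.log2(codons)    # 6 bits carried by each codon
bits_needed = math.log2(amino_acids)  # ≈ 4.32 bits needed to select one of 20 amino-acids

print(f"{codons} codons, {bits_per_codon:.0f} bits per codon, "
      f"{bits_needed:.2f} bits needed per amino-acid (the code is redundant)")
```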
As we have pointed out, a limited amount of information may greatly modify the course of events. A typical example is the appearance of a genetic disease linked to a few additional coding triplets in a chromosome (e.g. Huntington's disease).
It is interesting to compare protein synthesis with the mass production carried out by the manufacturing industry (cars, electronics, household appliances, etc.). Indeed, we find the same production stages [22]:
1–Establishment of drawings, intended to store the information necessary for manufacturing (DNA or descriptive documents with specifications);
2–An assembly line which, using existing elements, will assemble them according to the information stored beforehand. The cell uses the RNA to get the information from the DNA and assembles pre-existing amino-acids. In industry, a computer-controlled assembly line uses robots to put together various prefabricated elements.
3–The finished assembly (protein or object) is then sent to the place where it can perform the specific function it was designed for.
Through this repetitive process, a large number of systems can be produced with identical characteristics corresponding to the functional information stored in their complex structure.
Another important feature shared by the two classes of systems (biological or manufactured) is that they have a limited life cycle, at the end of which they will be out of service, thrown away and thus worthless. Their functional complexity is then transformed into a random complexity, due to the disorder resulting from the mixture of the various parts they are built of (materials, molecules or chemical elements). We may consider that, at the end of life, both for living and for industrial systems, the order generated during manufacturing is transformed into disorder. Therefore, new information input will be required to separate the various chemical compounds (amino-acids, for example) or mechanical and electronic parts in order to reuse them. If there is no sorting with the aim of recycling, the total destruction of the organized system produces the basic elements which were originally used to manufacture it. We can provide two examples of such processes: anaerobic fermentation of biological waste, generating mainly carbon dioxide and methane, and incineration of mixed waste, producing carbon dioxide and ashes containing the other chemical elements in various oxidized forms.
Continuing our examination of information carriers in the living world, we can observe what takes place in the animal kingdom, which human beings are part of.
Generally speaking, individuals can exchange information using one or more of their 5 senses - sight, hearing, touch, smell and taste - as receivers. Signals can thus be sent to one of these receivers; this may involve the emission of sounds or shouts, gestures and varied postures, touches, excretion of smells (pheromones, for example).
As far as human beings are concerned, new means of communication have progressively come into use. Drawing (cave paintings, for example) appeared very early on as an information carrier. A great leap forward was made with the appearance of language, which radically differentiated man from other animal species. Another means of communication, music, also an information carrier, appeared relatively early. The advent of writing later brought a new, very powerful information carrier. In addition, writing, like images, allows information to be stored and used at will. As with the 4 bases of the genetic code, the creation of various alphabets allowed words and sentences to be used to express a wide variety of information. Printing then facilitated widespread distribution of information, in particular via the news media.
Later on, computers came to complement these means and offer enormous capabilities for handling and storing increasing volumes of information, essentially in the form of texts or images. The code used contains only two characters, 0 and 1. The amount of information contained in a text is calculated using Shannon's formula and is expressed in megabits, gigabits or terabits, as the technology evolves.
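As an illustration of this calculation (a sketch added here; the sample text is arbitrary), Shannon's formula (4) applied to a string of characters:

```python
from collections import Counter
import math

def shannon_bits(text: str) -> float:
    """Information content N·H of a message, per Shannon's formula (equation (4))."""
    counts = Counter(text)
    n = len(text)
    H = -sum((c / n) * math.log2(c / n) for c in counts.values())  # bits per symbol
    return n * H

message = "structuring information"   # arbitrary sample message
print(f"{shannon_bits(message):.1f} bits for {len(message)} characters")
```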
More recently, new communication channels - radio, telephone, cinema, television and Internet - have appeared. They allow us to very quickly disseminate information to a large number of persons, by combining writing, speech and animated image, thus increasing its contents and efficiency.
The receiver can react by in turn becoming a sender and answering the received message: this is dialogue. The sender can also transmit his or her message to a number of receivers, who may or may not receive it; that is the case of radio or television broadcasts.
A large number of senders can simultaneously transmit messages to many receivers; if the amount of emitted information becomes too high, it can turn into noise that the receivers can no longer receive effectively. As the sender is in many cases also a receiver, we can observe the generation of echo and the phase synchronization of the sent messages. An example of such a phenomenon is the applause in a theatre, which quickly falls into phase [21]. In the field of mass media, we quite often observe the repetition of the same information, sometimes resulting in what is called "single thought".
Concerning life in society, the mass media are often used to condition the individuals receiving their information. Advertising is a typical example of the diffusion of structuring information: slogans must hold the interest of large numbers of individuals by stimulating their fundamental reflexes. Finally, the expansion of fast communication networks throughout the entire planet has led to interaction among all human beings, which is now known as "globalization".
Finally, we must clearly point out some peculiarities of information exchange. There must be a sender and a receiver able to receive the emitted information. The receiver must also be capable of deciphering the message and using it. Under these conditions the information can become structuring, as explained above for the reactant + catalyst pair. Nevertheless, if the model proposed for a catalyst is used, it should be clear that the structuring information creates an intermediate state that often cannot be observed. When the environment is favorable, this intermediate state can lead to a new, more structured configuration.

Conclusion

Revisiting the analogy between entropy and information, and applying it to the transition state theory proposed in the framework of chemical kinetics, we have arrived at the new concept of structuring information. The information exchanged between the catalyst and the reagents enables the formation of a transition compound with an adapted structure and in an activated state, which helps to accelerate the reaction and make it particularly selective.
Through various examples, we have reviewed the application of this concept to the biosphere, covering a very vast domain ranging from the appearance of life on earth to its present evolution. Thus, the development of living organisms can be considered from two complementary aspects: the use of solar radiation energy, and the organizational progress made possible by internal structuring information.

References

1. Brillouin, L. La science et la théorie de l'information; Masson: Paris, 1959.
2. Atlan, H. L'organisation biologique et la théorie de l'information; Hermann, Editeurs des Sciences et des Arts: Paris, 1992.
3. Shannon, C.E. A mathematical theory of communication. Bell Syst. Tech. J. 1948, 27, 379–423; 623–656.
4. Tonnelat, J. Thermodynamique probabiliste: un refus des dogmes; Masson: Paris, 1991.
5. Delahaye, J.-P. Information, complexité et hasard; Editions Hermès: Paris, 1994.
6. Masel, R.I. Chemical Kinetics and Catalysis; Wiley-Interscience, 2001.
7. Wolfenden, R.; Snider, M.J. The depth of chemical time and the power of enzymes as catalysts. Acc. Chem. Res. 2001, 34, 938–945.
8. Štrajbl, M.; et al. Calculations of activation entropies of chemical reactions in solution. J. Phys. Chem. B 2000, 104, 4578–4584; Ab initio evaluation of the free energy surfaces for the general base/acid catalysed hydrolysis. J. Phys. Chem. B 2001, 105, 4471–4484.
9. Nevecna, T.; et al. A study of effects of temperature and medium on reaction of trimethylamine with ethyl bromide. Collect. Czech. Chem. Commun. 1994, 59, 1384–1391.
10. Monod, J. Le hasard et la nécessité; Editions du Seuil: Paris, 1970.
11. Copeland, R.A. Enzymes; Wiley-VCH, 2000.
12. Roederer, J.G. On the concept of information and its role in nature. Entropy 2003, 5, 3–33; Information and Its Role in Nature; Springer-Verlag: Berlin, 2005.
13. Wang, D.Z. Conservation of helical asymmetry in chiral interactions. CPS: orgchem/0403001, 2004.
14. Barteri, M.; Pispisa, B. Coupling between binding-induced conformational phenomena: stereospecific effects in asymmetric reactions. J. Chem. Soc., Faraday Trans. 1 1982, 78, 2073–2084.
15. Maurel, M.-C. Les origines de la vie; Editions Syros: Paris, 1994.
16. Smith, J.M.; Szathmary, E. The Origins of Life; Oxford University Press, 1994.
17. IPCC. Climate Change 2001: The Scientific Basis; Cambridge University Press, 2001.
18. Bhaduri, S.; Mukesh, D. Homogeneous Catalysis: Mechanisms and Industrial Applications; Wiley-Interscience, 2000.
19. Rosen, M.A.; Scott, D.S. Entropy production and exergy destruction: Part I. Int. J. Hydrogen Energy 2003, 28, 1307–1313.
20. Gouverneur, V.; Reiter, M. In Advances in Organic Chemistry; Atta-ur-Rahman, Ed.; Bentham Science Publishers, 2005; Vol. 1, pp. 519–540.
21. Néda, Z.; et al. The sound of many hands clapping. Nature 2000, 403, 849–850.
22. Ayres, R.U. Information, Entropy and Progress: A New Evolutionary Paradigm; American Institute of Physics Press, 1994.
23. Schneider, T.D. New approaches in mathematical biology: Information theory and molecular machines. In Chemical Evolution: Physics of the Origin and Evolution of Life; Kluwer Academic Publishers: The Netherlands, 1996; pp. 313–321.
24. Takahiro, T.; et al. Chiral perturbation factor approach reveals importance of entropy. J. Org. Chem. 2002, 67, 6593–6598.
25. Serizawa, T.; et al. Polymerization within a molecular-scale stereoregular template. Nature 2004, 429, 52–55.
26. Dill, K.A. Additivity principles in biochemistry. J. Biol. Chem. 1997, 272, 701–704.
27. Chaisson, E.J. Cosmic Evolution: The Rise of Complexity in Nature; Harvard University Press, 2001.
28. Hazen, R.M. Genesis: The Scientific Quest for Life's Origin; Joseph Henry Press: Washington, 2005.
29. Loewenstein, W.R. The Touchstone of Life; Oxford University Press, 1999.
30. Vidal, J. Thermodynamics; Editions Technip: Paris, 2003.
31. Wächtershäuser, G. Life as we don't know it. Science 2000, 289, 1307–1308.
Note 1: 1 TW/K = 10¹² W/K, with a reference environment temperature of the earth of 282 K.
