Introduction to Gibbs Paradox - Essay Example

Summary
This essay "Introduction to Gibbs Paradox" presents a detailed analysis of the Gibbs paradox and of its relationship to other theories, in particular information theory and thermodynamic entropy. A simple derivation of the entropy of an ideal gas is based on the Boltzmann distribution…

Extract of sample "Introduction to Gibbs Paradox"

This paper presents a detailed analysis of the Gibbs paradox and of its relationship to other theories, in particular information theory and thermodynamic entropy.

A Brief Introduction to the Gibbs Paradox:-
In statistical mechanics, a simple derivation of the entropy of an ideal gas based on the Boltzmann distribution yields an expression for the entropy which is not extensive (that is, not proportional to the amount of gas in question). This leads to an apparent paradox known as the Gibbs paradox, allowing, for instance, the entropy of closed systems to decrease, in violation of the second law of thermodynamics. The paradox is averted by recognizing that the identity of the particles does not influence the entropy. In the conventional explanation, this is associated with the indistinguishability of particles in quantum mechanics. However, a growing number of papers now take the perspective that it is merely the definition of entropy that is changed to ignore particle permutations (and thereby avert the paradox). The resulting equation for the entropy of a classical ideal gas is extensive and is known as the Sackur-Tetrode equation.
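To make the failure of extensivity concrete, here is a minimal numerical sketch in Python. It assumes units with kB = 1 and a fixed thermal de Broglie wavelength lam (i.e., constant temperature); the function names are illustrative, not from any source. An extensive entropy must double when N and V are doubled together; the naive Boltzmann-counting expression fails this test, while the Sackur-Tetrode form passes it.

```python
import numpy as np

def entropy_naive(N, V, lam=1.0):
    # Boltzmann counting without the 1/N! factor: S = N * (ln(V / lam^3) + 3/2)
    return N * (np.log(V / lam**3) + 1.5)

def entropy_sackur_tetrode(N, V, lam=1.0):
    # With the 1/N! correction: S = N * (ln(V / (N * lam^3)) + 5/2)
    return N * (np.log(V / (N * lam**3)) + 2.5)

# Doubling N and V together should exactly double an extensive entropy.
N, V = 1000.0, 1.0e6
print(entropy_naive(2 * N, 2 * V) / entropy_naive(N, V))                    # ~2.09, not 2
print(entropy_sackur_tetrode(2 * N, 2 * V) / entropy_sackur_tetrode(N, V))  # exactly 2.0
```

The Sackur-Tetrode ratio is exactly 2 because the correction makes the entropy depend on V only through the intensive combination V/N.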
Information Theoretic Entropy:-
Before going deep into information-theoretic entropy, let me first give a brief and simple definition of information. Information (I) is the amount of data remaining after data compression. To illustrate the term, consider the following example. If the total amount of data is L, then entropy (S) in information theory is defined as the information loss, so that L = S + I. Take a 100 GB hard disk: L = 100 GB. A freshly formatted hard disk has S = 100 GB and I = 0 (a toy numerical illustration is given after this section). Based on this definition of information and on the definition of (information theory) entropy as information loss, S = L - I, or, in cases where the absolute values are unknown, ΔS = ΔL - ΔI, three laws of information entropy are proposed:
The first law of information theory: the total amount of data L (the sum of entropy and information, L = S + I) of an isolated system remains unchanged.
The second law of information theory: the information I of an isolated system decreases to a minimum at equilibrium.
The third law of information theory: for a solid structure of perfect symmetry (e.g., a perfect crystal), the information I is zero and the (information theory) entropy S is at its maximum.
If entropy change is information loss, ΔS = -ΔI, the conservation of L is easily satisfied: ΔL = ΔS + ΔI = 0. Another form of the second law of information theory is: the entropy S of the universe tends toward a maximum. The second law given here can be taken as a more general expression of the Curie-Rosen symmetry principle [5,6]. The third law given here, in the context of information theory, reflects the fact that symmetric solid structures are the most stable ones.
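The hard-disk bookkeeping above can be written out in a few lines of Python. This is only a toy illustration of the definitions L = S + I and S = L - I, not a statement about real file systems; the 40 GB figure is hypothetical and stands for data that remains 40 GB even after ideal compression.

```python
# Toy bookkeeping for L = S + I (all quantities in GB).
L = 100.0  # total amount of data the disk holds

def entropy(information):
    # Entropy defined as information loss: S = L - I
    return L - information

I_formatted = 0.0            # a freshly formatted disk carries no information
I_written = 40.0             # hypothetical: 40 GB of incompressible data written
print(entropy(I_formatted))  # 100.0 -> S at its maximum
print(entropy(I_written))    # 60.0
# Conservation of L: dL = dS + dI = (60.0 - 100.0) + (40.0 - 0.0) = 0
```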
Indistinguishable Particles:-
Two particles are called identical if the values of all their inner attributes agree. The state space H must be so constituted that the transposition of two identical particles is defined for every vector in H (quantum case) or for every phase-space point in H (classical case). Two identical particles are called indistinguishable if every pure quantum state (every classical microstate) is invariant under transposition of these two particles; otherwise, the two particles are called distinguishable. Two non-identical particles are always considered distinguishable.

Resolution of the Paradox in Terms of Indistinguishable Particles:-
By the definitions of the preceding section, two particles are distinguishable if they are either non-identical (that is, if they have different properties) or if they are identical and there are microstates which change under transposition of the two particles. The first version of the Gibbs paradox (GP1) is demonstrated and subsequently analyzed. The analysis shows that, for (quantum or classical) systems of distinguishable particles, it is generally uncertain which particles they consist of. The neglect of this uncertainty is the root of GP1. For the statistical description of a system of distinguishable particles, an underlying set of particles, containing all particles that in principle qualify for being part of the system, is assumed to be known. Which elements of this underlying particle set the system is composed of differs from microstate to microstate. Thus, the system is described by an ensemble of possible particle compositions. The uncertainty about the particle composition contributes to the entropy of the system. Systems for which all possible particle compositions are equiprobable will be called harmonic. Classical systems of distinguishable identical particles are harmonic as a matter of principle; quantum or classical systems of non-identical particles are not necessarily harmonic, since for them the composition probabilities depend individually on the preparation of the system.

Relation between Shannon's Information Entropy and Thermodynamic Entropy:-
The inspiration for adopting the word "entropy" in information theory came from the close resemblance between Shannon's formula and very similar known formulae from thermodynamics. In statistical thermodynamics, the most general formula for the thermodynamic entropy S of a thermodynamic system is the Gibbs entropy,

S = -kB Σi pi ln pi,

where kB is the Boltzmann constant and pi is the probability of microstate i. The Gibbs entropy was defined by J. Willard Gibbs in 1878, after earlier work by Boltzmann (1872). It translates over almost unchanged into the world of quantum physics to give the von Neumann entropy, introduced by John von Neumann in 1927,

S = -kB Tr(ρ ln ρ),

where ρ is the density matrix of the quantum mechanical system and Tr is the trace. At a multidisciplinary level, connections can be made between thermodynamic and informational entropy, although it took many years for the theories of statistical mechanics and information theory to develop to the point where the relationship became fully apparent. Indeed, in the view of Jaynes (1957), thermodynamics should be seen as an application of Shannon's information theory: the thermodynamic entropy is interpreted as an estimate of the amount of further Shannon information needed to define the detailed microscopic state of the system, information that remains uncommunicated by a description solely in terms of the macroscopic variables of classical thermodynamics. For example, adding heat to a system increases its thermodynamic entropy because it increases the number of possible microscopic states of the system, thus making any complete state description longer.
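The two formulas above can be checked numerically. The following sketch (in units where kB = 1; function names are illustrative) computes the Gibbs entropy of a discrete probability distribution and the von Neumann entropy from the eigenvalues of a density matrix. For a diagonal density matrix the two coincide, reflecting how the quantum formula reduces to the classical one.

```python
import numpy as np

def gibbs_entropy(p, kB=1.0):
    # S = -kB * sum_i p_i ln p_i, with the convention 0 * ln 0 = 0
    p = np.asarray(p, dtype=float)
    nz = p[p > 0]
    return -kB * np.sum(nz * np.log(nz))

def von_neumann_entropy(rho, kB=1.0):
    # S = -kB * Tr(rho ln rho), evaluated via the eigenvalues of rho
    return gibbs_entropy(np.linalg.eigvalsh(rho), kB)

p = [0.5, 0.25, 0.25]
print(gibbs_entropy(p))                 # ~1.0397 (in units of kB)
print(von_neumann_entropy(np.diag(p)))  # same value: diagonal rho is the classical case
```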
Maxwell's demon can (hypothetically) reduce the thermodynamic entropy of a system by using information about the states of individual molecules; but, as Landauer (1961) and co-workers have shown, to function the demon himself must increase thermodynamic entropy in the process, by at least the amount of Shannon information he proposes first to acquire and store; and so the total entropy does not decrease (which resolves the paradox).

Entropy as Information Content:-
Entropy is defined in the context of a probabilistic model. Independent fair coin flips have an entropy of 1 bit per flip. A source that always generates a long string of B's has an entropy of 0, since the next character will always be a 'B'. The entropy rate of a data source is the average number of bits per symbol needed to encode it. Shannon's experiments with human predictors show an information rate of between 0.6 and 1.3 bits per character, depending on the experimental setup; the PPM compression algorithm can achieve a compression ratio of 1.5 bits per character on English text. From the preceding examples, note the following points:
1. The amount of entropy is not always an integer number of bits.
2. Many data bits may not convey information. For example, data structures often store information redundantly, or have identical sections regardless of the information in the data structure.
Shannon's definition of entropy, when applied to an information source, can determine the minimum channel capacity required to reliably transmit the source as encoded binary digits. The formula can be derived by calculating the mathematical expectation of the amount of information contained in a digit from the information source. Shannon's entropy measures the information contained in a message, as opposed to the portion of the message that is determined (or predictable). Examples of the latter include redundancy in language structure and statistical properties relating to the occurrence frequencies of letter or word pairs, triplets, etc.
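A simple way to estimate such figures is to compute the empirical entropy of the character distribution of a text, as in the sketch below. Note that this zeroth-order estimate ignores correlations between characters (the redundancy of language structure just mentioned), which is why it comes out well above Shannon's 0.6-1.3 bits per character for English.

```python
from collections import Counter
from math import log2

def bits_per_character(text):
    # Empirical (zeroth-order) Shannon entropy of the character distribution,
    # in bits per character: H = -sum_c p(c) * log2 p(c)
    counts = Counter(text)
    n = len(text)
    return -sum((k / n) * log2(k / n) for k in counts.values())

print(bits_per_character("BBBBBBBB"))  # 0.0: the next character is always 'B'
print(bits_per_character("HTHTTHHT"))  # 1.0: like independent fair coin flips
print(bits_per_character("the quick brown fox jumps over the lazy dog"))  # > 4 bits/char
```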
Limitations of Entropy as Information Content:-
There are a number of entropy-related concepts that mathematically quantify information content in some way: the self-information of an individual message or symbol taken from a given probability distribution; the entropy of a given probability distribution of messages or symbols; and the entropy rate of a stochastic process. (The "rate of self-information" can also be defined for a particular sequence of messages or symbols generated by a given stochastic process; it will always equal the entropy rate in the case of a stationary process.) Other quantities of information are also used to compare or relate different sources of information. It is important not to confuse these concepts; often it is only clear from context which one is meant. For example, when someone says that the "entropy" of the English language is about 1.5 bits per character, they are actually modeling the English language as a stochastic process and talking about its entropy rate.

Conclusion:-
The Gibbs paradox and information entropy are two concepts that have a great deal in common, and much of thermodynamics rests on them. In this report I have tried to bring out the importance of these two topics, using simple language to make them easier to follow.

References:-
1. Lin, S.-K. Understanding structural stability and process spontaneity based on the rejection of the Gibbs paradox of entropy of mixing. Theochem-J. Mol. Struc. 1997, 398, 145-153.
2. Jaynes, E.T. The Gibbs paradox. In Maximum Entropy and Bayesian Methods; Smith, C.R.; Erickson, G.J.; Neudorfer, P.O., Eds.; Kluwer Academic: Dordrecht, 1992; pp. 1-22.
3. Lin, S.-K. Gibbs paradox of entropy of mixing: Experimental facts, its rejection, and the theoretical consequences. J. Theoret. Chem. 1996, 1, 135-150.
