Despite the similarities between the two concepts, there is an important difference between them. The information entropy, denoted H, can be calculated for almost any probability distribution, the message being taken as "the event represented by i, which has probability pi, takes place" out of the space of possible events. The thermodynamic entropy S, by contrast, refers specifically to thermodynamic probabilities pi (Skyttner, 2005). In addition, the thermodynamic entropy is dominated by the different kinds, spacings, and arrangements of the system's energy states that are possible on a molecular scale. By comparison, the information entropy of any macroscopic event is so small as to be completely irrelevant (Skyttner, 2005). A connection can nevertheless be made between the two when the probabilities in question are the thermodynamic probabilities.
The Gibbs entropy, S = −k_B Σ_i pi ln pi, can then be seen as simply the amount of Shannon information needed to define the microscopic state of the system in detail, given its macroscopic description.
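The relationship can be made concrete with a minimal numerical sketch (not from Skyttner; the distribution here is a hypothetical illustration): when the probabilities fed into Shannon's formula are the thermodynamic probabilities of the microstates, the Gibbs entropy is just the Shannon entropy in nats multiplied by Boltzmann's constant.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K (exact SI value)

def shannon_entropy_nats(probs):
    """Shannon information entropy H = -sum(p_i * ln p_i), in nats."""
    return -sum(p * math.log(p) for p in probs if p > 0)

def gibbs_entropy(probs):
    """Gibbs entropy S = -k_B * sum(p_i * ln p_i) = k_B * H."""
    return K_B * shannon_entropy_nats(probs)

# Hypothetical example: four equally likely microstates.
p = [0.25, 0.25, 0.25, 0.25]
H = shannon_entropy_nats(p)  # equals ln 4, about 1.386 nats
S = gibbs_entropy(p)         # equals k_B * ln 4, in J/K
```

The tiny factor k_B is what the text alludes to: any informationally meaningful amount of Shannon entropy corresponds to a negligible thermodynamic entropy on the macroscopic scale.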