Entropy's role in Information Theory
Excerpting the application of Claude Shannon's work to thermodynamics here: the information entropy of a system is the amount of "missing" information needed to determine a particular microstate, given the macrostate -- a continuum from least information (perhaps the complete absence of information) to complete information. Shannon, whose sights were on electronic communications at the time, sought to express mathematically the "informational value" of a message communicated over a channel; that work is widely regarded as the beginning of Information Theory. It soon became clear that the concept applied equally well to thermodynamic entropy.
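The note above doesn't state the formula, but the quantity Shannon defined is H = -Σ pᵢ log₂ pᵢ, measured in bits. A minimal sketch (the distributions below are illustrative examples, not from the original text) showing the continuum from complete information (zero entropy) to maximal missing information:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over nonzero p."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A certain outcome carries no missing information;
# a fair coin carries the maximum for two outcomes.
print(shannon_entropy([1.0]))        # 0.0 bits: outcome fully determined
print(shannon_entropy([0.9, 0.1]))   # ~0.47 bits: biased coin, some uncertainty
print(shannon_entropy([0.5, 0.5]))   # 1.0 bit: maximal uncertainty for two outcomes
```

The `p > 0` guard reflects the convention that 0·log 0 = 0, so impossible outcomes contribute nothing to the missing information.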
Claude Shannon: Bell Labs engineer, IAS fellow, MIT faculty member (theory developed in the 1940s)