$\begingroup$

I just found the following paragraphs from J. R. Pierce, An Introduction to Information Theory, p. 206, which I surmise would also support the idea that one could practically use a computer and a deterministic program to turn a string A into a string B of higher Shannon entropy:

"We can regard any process which specifies something concerning which state a system is in as a message source. This source generates a message which reduces our uncertainty as to what state the system is in. Such a source has a certain communication-theory entropy per message. This entropy is equal to the number of binary digits necessary to transmit a message generated by the source. It takes a particular energy per binary digit to transmit the message against a noise corresponding to the temperature T of the system."

"The message reduces our uncertainty as to what state the system is in, thus reducing the entropy (of statistical mechanics) of the system. The reduction of entropy increases the free energy of the system. But the increase in free energy is just equal to the minimum energy necessary to transmit the message which led to the increase of free energy, an energy proportional to the entropy of communication theory."

"This, I believe, is the relation between the entropy of communication theory and that of statistical mechanics. One pays a price for information which leads to a reduction of the statistical-mechanical entropy of a system. This price is proportional to the communication-theory entropy of the message source which produces the information. It is always just high enough so that a perpetual motion machine of the second kind is impossible."
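Pierce's accounting can be made explicit with a short derivation, on the standard (Szilard/Landauer) assumption that the minimum energy cost per binary digit against thermal noise at temperature $T$ is $k_B T \ln 2$; the symbols $H$, $\Delta S_{\text{stat}}$, and $\Delta F$ below are my notation, not Pierce's:

$$E_{\min} = H \, k_B T \ln 2, \qquad \Delta S_{\text{stat}} = -H \, k_B \ln 2, \qquad \Delta F = -T\,\Delta S_{\text{stat}} = H \, k_B T \ln 2 = E_{\min},$$

where $H$ is the communication-theory entropy of the message in bits. The free energy gained from learning the system's state, $\Delta F$, exactly equals the minimum energy $E_{\min}$ spent transmitting the message, which is why no net work can be extracted and a perpetual motion machine of the second kind remains impossible.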