A more general definition of entropy was proposed by Boltzmann (1877) as S = k ln W, where k is Boltzmann’s constant (giving S units of J⋅K−1) and W is the number of possible microstates of the system, tying entropy to statistical mechanics. Szilard (1929) suggested that entropy is fundamentally a measure of the information content of a system. Shannon (1948) defined informational entropy as $$S=-\sum_{i} p_{i}\log_{b} p_{i}$$ where $p_i$ is the probability of finding message number i in the defined message space, and b is the base of the logarithm used (typically 2, giving units of bits). Landauer (1961) proposed that informational entropy is interconvertible with thermodynamic entropy, such that a computational operation that erases 1 bit of information generates at least k ln 2 of thermodynamic entropy. This prediction has recently been verified experimentally in several independent studies (Bérut et al. 2012; Jun et al. 2014; Hong et al. 2016; Gaudenzi et al. 2018).
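As a minimal numerical sketch of these two formulas (not drawn from the cited studies), the snippet below computes Shannon entropy for a simple two-outcome distribution and the Landauer bound per erased bit; the temperature of 300 K is an assumed illustrative value.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K

def shannon_entropy(probs, base=2):
    """Shannon entropy of a discrete distribution; base 2 gives bits."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# A fair coin carries 1 bit of entropy; a biased coin carries less.
print(shannon_entropy([0.5, 0.5]))  # 1.0 bit
print(shannon_entropy([0.9, 0.1]))  # ~0.47 bits

# Landauer's bound: erasing 1 bit generates at least k ln 2 of thermodynamic
# entropy, corresponding to at least k*T*ln 2 of dissipated heat at temperature T.
T = 300.0  # assumed room temperature in kelvin (illustrative)
print(K_B * math.log(2))      # ~9.57e-24 J/K of entropy per erased bit
print(K_B * T * math.log(2))  # ~2.87e-21 J of heat per erased bit at 300 K
```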