An advantageous method for understanding complexity is information geometry theory. In particular, a dimensionless distance, called the information length L, permits us to describe time-varying, non-equilibrium processes by measuring the total change in the information along the evolution path of a stochastic variable, or equivalently the total number of statistically different states the variable passes through in time. Here, we elucidate the meaning of the information length L and the information rate Γ = dL/dt in light of thermodynamics (the entropy production rate Ṡ_T, non-equilibrium free energy F, microscopic chemical potential μ, etc). In particular, the average ⟨∂_t μ⟩ gives the average rate of work (power), while the second moment ⟨(∂_t μ − ∂_t V)²⟩ is proportional to Γ². Here, the angular brackets denote an average and V is the potential. The upper bound on the entropy production rate Ṡ_T is set by the product of Γ and the RMS value of the fluctuating part δμ = μ − ⟨μ⟩. Specifically, in the case of the non-autonomous Ornstein-Uhlenbeck process for a stochastic variable x, we show that Ṡ_T is bounded above by Γ² up to the fluctuation normalization σ² = ⟨(δx)²⟩, where σ is the standard deviation and δx = x − ⟨x⟩ is the fluctuating component of x. The equality σΓ = √(D Ṡ_T) holds in the (isothermal) case where σ and the temperature D are constant. We discuss the implications of L as a proxy for the entropy production along an evolution path and for understanding self-organization.
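As a minimal numerical sketch of the quantities in the abstract: for a Gaussian process such as the Ornstein-Uhlenbeck (OU) process, the information rate reduces to the exact expression Γ² = (∂_t⟨x⟩)²/σ² + 2(∂_t σ)²/σ², and L = ∫Γ dt. The code below evolves the mean and variance of a non-autonomous OU process whose potential minimum is suddenly shifted, then integrates Γ. The drift strength g, temperature D, and shift size are illustrative choices, not values from the paper; the known result that L = |Δ⟨x⟩|/σ when σ stays constant serves as a sanity check.

```python
import numpy as np

def information_length(mean, std, dt):
    """L = ∫ Γ dt for a Gaussian process, with Γ² = (∂tμ)²/σ² + 2(∂tσ)²/σ²."""
    dmu = np.gradient(mean, dt)
    dsig = np.gradient(std, dt)
    gamma = np.sqrt(dmu**2 / std**2 + 2.0 * dsig**2 / std**2)  # information rate Γ(t)
    return np.sum(gamma) * dt

# Non-autonomous OU process: dx = -g (x - f(t)) dt + sqrt(2D) dW.
# Its moments obey d⟨x⟩/dt = -g(⟨x⟩ - f(t)) and dσ²/dt = -2gσ² + 2D.
g, D, dt, n = 1.0, 0.1, 1e-3, 20000          # illustrative parameters
t = np.arange(n) * dt
f = 1.0 * (t > 0)                            # sudden shift of the potential minimum
mean = np.empty(n)
var = np.empty(n)
mean[0], var[0] = 0.0, D / g                 # start from the old equilibrium state
for i in range(n - 1):                       # forward-Euler update of the moments
    mean[i + 1] = mean[i] - dt * g * (mean[i] - f[i])
    var[i + 1] = var[i] + dt * (-2.0 * g * var[i] + 2.0 * D)

L = information_length(mean, np.sqrt(var), dt)
print(f"information length L ≈ {L:.3f}")     # ≈ |Δ⟨x⟩|/σ = 1/√(D/g) here, since σ is constant
```

Because the process starts in equilibrium, the variance stays pinned at D/g, so the relaxation only translates the distribution; L then counts the net mean displacement in units of the standard deviation, consistent with the isothermal case σΓ = √(D Ṡ_T) discussed in the abstract.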
Journal: Journal of Statistical Mechanics: Theory and Experiment
Published: 24 Sept 2021
Bibliographical note: Publisher Copyright:
© 2021 IOP Publishing Ltd and SISSA Medialab srl.
Funder: Leverhulme Fellowship (RF-2018-142-9)
Keywords
- fluctuation phenomena
- stochastic processes
- stochastic thermodynamics
ASJC Scopus subject areas
- Statistical and Nonlinear Physics
- Statistics and Probability
- Statistics, Probability and Uncertainty