-
Shannon vs Universal Compression
February 16, 2023
References: An Introduction to Kolmogorov Complexity and Its Applications - Li & Vitanyi 2008 (L&V); Elements of Information Theory - Cover & Thomas, 2nd ed. 2006 (C&T); A Mathematical Theory of Communication - Claude Shannon 1948 (Shannon). AIT = Algorithmic Information Theory…
-
Variational Inference
July 6, 2022
This is a primer on variational inference in machine learning, based on sections of Jordan et al. (An Introduction to Variational Methods for Graphical Models; 1999). I go over the mathematical forms of variational inference, and I include a discussion on what it means for something to be “variational.” I hope this conveys a bit of the generating ideas that give rise to the various forms of variational inference. …
-
Ideal Gas Entropy Derivation
June 21, 2022
Derivation of the change in entropy formula for an ideal gas (used in the Carnot Cycle post) from state space volumes. Discussion about connections between the observer’s information about the gas and how that relates to the reversibility of transformations applied to the gas. …
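For reference (this is the standard textbook form, not the post's state-space-volume derivation itself), the change in entropy of $n$ moles of an ideal gas between states $(T_i, V_i)$ and $(T_f, V_f)$ is

$$ \Delta S = n C_V \ln\frac{T_f}{T_i} + n R \ln\frac{V_f}{V_i}, $$

with $C_V = \tfrac{3}{2}R$ for a monatomic gas.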
-
Liouville Supplemental: Bertrand Paradox
April 5, 2022
I reframe the Bertrand paradox as the statement that uniformity of measure is relative to choice of coordinate system. The objective-Bayesian approach to the problem of priors is to assign a maximally uninformative prior to the given possibility space. What is considered maximally uninformative can be derived with the maximum entropy principle - a generalization of the principle of indifference. In many cases this ends up being a uniform prior. However, we run into a problem since uniformity is relative to choice of coordinates. This is relevant to physics since there is no preferred coordinate system to work in. …
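A minimal illustration of the coordinate-relativity point (my sketch, not taken from the post): a variable that is uniform in one coordinate is generally non-uniform in a nonlinearly related coordinate.

```python
import random

random.seed(0)
N = 100_000

# Sample x uniformly on [0, 1], then look at y = x**2.
# If "uniform" were coordinate-independent, y would also be uniform
# and P(y < 0.25) would be 0.25. In fact P(y < 0.25) = P(x < 0.5) = 0.5.
ys = [random.random() ** 2 for _ in range(N)]
frac = sum(y < 0.25 for y in ys) / N
print(f"P(y < 0.25) ~ {frac:.3f}")  # close to 0.5, not 0.25
```

The same mechanism underlies the Bertrand paradox: "random chord" means different measures depending on which coordinate (angle, midpoint, radial distance) you choose to make uniform.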
-
Liouville Supplemental: Coordinate Transformations
April 5, 2022
This is supplemental material for Liouville's Theorem. Specifically I go through a few examples of phase space transformations, canonical and non-canonical. I also show that we can turn arbitrary configuration space transformations into canonical phase space transformations, a result that will be useful for my discussion about the Bertrand paradox (Liouville's Theorem#the-bertrand-paradox). …
-
Liouville's Theorem
April 5, 2022
Liouville’s Theorem states that the size of a state region of any closed system remains constant as the system evolves through time. This has consequences for connections between information and physics. …
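A small numerical check of the area-preservation claim (my sketch, not from the post): one symplectic-Euler step for a unit-mass harmonic oscillator is a linear map on phase space, and its Jacobian determinant is 1, so the step preserves phase-space area.

```python
dt = 0.1

# Symplectic Euler for H = p**2/2 + q**2/2 (unit mass, unit spring constant):
#   p' = p - dt * q
#   q' = q + dt * p'
# Written as a linear map (q, p) -> (q', p'), the Jacobian matrix is:
#   [[1 - dt**2, dt],
#    [-dt,       1 ]]
J = [[1 - dt**2, dt],
     [-dt, 1.0]]
det = J[0][0] * J[1][1] - J[0][1] * J[1][0]
print(det)  # 1.0 up to floating-point rounding
```

This checks a single discrete step; Liouville's theorem is the corresponding statement for the continuous-time Hamiltonian flow.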
-
The Carnot Cycle
March 17, 2022
This is a formal description of the Carnot cycle which I hope is a useful reference for anyone who wants to quickly ramp up on thermodynamics. The Carnot cycle is often used as a canonical introduction to classical thermodynamics (specifically the thermodynamics of ideal gases) since it nicely illustrates the relationship between the entropy, temperature and volume of a gas. …
-
The Reversibility Problem
March 1, 2022
This is my exploration into formalizing the reversibility problem, i.e. the question “Which processes are reversible?” My long-term goals are to: formally define what it means for any process to be reversible, regardless of equilibrium considerations; clarify the connection between information and reversibility (and by extension the connection between information and entropy); and clarify (make well defined) the meaning of statements like “breaking a glass is irreversible because the entropy of the broken glass is higher than the entropy of the unbroken glass,” and “the entropy of the universe is monotonically increasing.” …
-
Szilard Cycle Particle-Piston Interaction Model
February 17, 2022
Admittedly my first-pass interaction model in Reversible Szilard Cycle Problem#part-ii-information-is-never-lost is not physically realistic. By interaction model, I mean how the particle and piston interact over time. Here I explore some alternative interaction models that try to be more realistic. My main question is whether which-side information is still preserved. …
-
Why Doesn't Uncopying Defeat The 2nd Law?
February 17, 2022
In Reversible Szilard Cycle Problem I pondered whether uncopying a bit of information at the end of the Szilard cycle makes the full cycle reversible, apparently getting around the 2nd law. This “loophole” is much more pervasive in thermodynamics than the Szilard engine. I will go through its generalization in #Maxwell’s Superdemon. I assume the 2nd law holds, so in #Slaying The Superdemon I consider some possible reasons why this loophole doesn’t work.