Entropy from first principles

If there are a certain number of things, then there is a minimum amount of information that it takes to say which thing you have.
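To make that concrete: a minimal sketch in Python of the standard counting rule, that singling out one of N equally likely things takes at least log₂ N bits (i.e. yes/no answers). The function name here is my own illustration, not something from the text.

```python
import math

def min_bits(n_things: int) -> float:
    """Minimum number of bits (yes/no answers) needed to single out
    one of n_things equally likely possibilities."""
    return math.log2(n_things)

# With 8 things, 3 yes/no questions are always enough, and
# no scheme can guarantee fewer.
print(min_bits(8))   # 3.0
```

Doubling the number of things adds exactly one more bit: `min_bits(16)` is 4.0.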

By “things you have” we could mean objects you’re selecting between, or messages you’re receiving, or states you’re describing, or worlds you live in.

By “say” we could mean decide, or record, or communicate, or come to believe.

If the number of possible things is infinite, there are still ways of narrowing in on which thing you have.

For many different types of things, we have given this information different names; but the rules are all very similar.

If the thing you have is hot, then there are a few more rules; this is where the rules were first discovered.

If the thing changes over time, there are laws that dictate how much the information changes.

The thing you have could be a small part of a much bigger thing. The rules can be made to apply here.

If you would like to change the thing you have, there are rules about how much you can do that, and at what cost.

Here, you can come to understand these rules.

Introduction to abstract entropy

Dealing with infinite entropy

A dynamical systems primer for entropy and optimization