I’m attempting a read-through of this, inspired by Andy Matuschak (see)
What kind of problem does probability theory solve?
Although initial conditions appear to be the same, the outcome is not knowable
Faced with a non-deterministic outcome, we can derive some form of regularity.
We do not know the outcome of an event, yet we can still say something about its outcome.
We know nothing about the outcome of a coin toss, yet we can say something about the outcome of an infinite number of coin tosses.
This helps tackle the ‘uniformity of nature’ assumption, and because we never claim to know any single outcome, it implies no contradiction.
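The coin-toss idea above can be sketched as a simulation: any single toss is unknowable, but the long-run fraction of heads shows regularity. This is only an illustrative sketch; the function name and seed are my own choices.

```python
import random

def running_heads_fraction(n_tosses, seed=0):
    """Simulate fair coin tosses and return the fraction of heads."""
    rng = random.Random(seed)
    heads = sum(rng.random() < 0.5 for _ in range(n_tosses))
    return heads / n_tosses

# A single toss tells us nothing, but the long-run fraction settles near 1/2.
for n in (10, 1_000, 100_000):
    print(n, running_heads_fraction(n))
```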
How does probability manifest itself as a ‘style of thinking’?
If I think about a computational ‘style of thinking’, it brings to mind states and transitions: algorithms, or complete knowledge about a process. In this sense, it is a way of thinking that searches for primitives that carry out the operation of a mechanism.
In probability, our first step is to enumerate, or intuitively grasp (a reduction of information), the range of possible outcomes. In the example of the roll of a die, we can enumerate all possible (plausible) outcomes. In the case of the future, there is an infinite number of states, so some reduction of this space is needed.
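A small sketch of this enumeration step: for a die the sample space can be written out explicitly, and when a compound space grows large, grouping outcomes into events is one way to reduce it. The event chosen here ("the sum is 7") is just an illustrative example.

```python
from itertools import product

# Enumerate the sample space of a single die roll.
die = {1, 2, 3, 4, 5, 6}

# A compound experiment (two rolls) has the Cartesian product as its space.
two_rolls = set(product(die, die))
print(len(two_rolls))  # 36 outcomes

# One form of reduction: group outcomes into an event of interest.
sum_is_seven = {pair for pair in two_rolls if sum(pair) == 7}
print(len(sum_is_seven))  # 6 outcomes
```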
How might probability models differ from a model you’re familiar with?
A model helps explain our impressions of reality.
Models are also dynamic in that different models can explain the same thing, as long as they’re useful.
What role does symmetry play in establishing a measure called probability?
Symmetry across outcomes implies indifference. The measure is distributed with no preference for any one outcome.
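Indifference across symmetric outcomes can be made concrete: the unit of measure is split evenly, with no face preferred. A minimal sketch using exact fractions:

```python
from fractions import Fraction

# Under symmetry (indifference), the unit of measure is split evenly.
outcomes = [1, 2, 3, 4, 5, 6]
measure = {o: Fraction(1, len(outcomes)) for o in outcomes}

print(measure[1])             # each face gets 1/6
print(sum(measure.values()))  # the measure totals 1
```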
What role does independence play in establishing probability as a measure?
Independence allows the measure of the product sample space to scale by the Cartesian product of the sample spaces.
If they were not independent, the measure of one outcome, when placed in an ordered pair with the other, would change by an unknown amount.
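The scaling by the Cartesian product can be sketched directly: under independence, the measure of each ordered pair is the product of the component measures. The coin-and-die choice is just an illustrative pairing.

```python
from fractions import Fraction
from itertools import product

# Two independent trials: a fair coin and a fair die.
coin = {"H": Fraction(1, 2), "T": Fraction(1, 2)}
die = {face: Fraction(1, 6) for face in range(1, 7)}

# Independence lets the product space's measure be built by multiplication:
# P((c, d)) = P(c) * P(d) for every pair in the Cartesian product.
joint = {(c, d): coin[c] * die[d] for c, d in product(coin, die)}

print(len(joint))           # 2 * 6 = 12 ordered pairs
print(sum(joint.values()))  # still a probability measure: totals 1
```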
What is the relation between randomness of a trial and independence of a trial?
Randomness is the absence of a pattern. This implies that previous trials tell us nothing about the outcome of the next trial, which is the definition of independence. Discovering independence is the same as classifying a trial as random.
Why is a random sample a misnomer?
Randomness is an a priori concept, having no physical meaning. Once a sample is drawn, it is no longer random: the outcome has been determined.
Before the roll of a die, the outcome could take on 6 possible values. When rolled, the outcome takes on a fixed value.
How might we define the randomness of a sequence produced by a source?
To define randomness would mean defining the lack of pattern in what a source produces. We can’t do this directly, but we can compare randomness in the form of the minimal algorithm that could generate such a sequence. In this sense 1010101 is more random than 00000. This reflects the complexity involved in producing the sequence: the greater the complexity, the higher the randomness. A random string is one which can’t be compressed.
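A rough way to play with this idea: use a general-purpose compressor as a crude stand-in for the "minimal algorithm" (true Kolmogorov complexity is uncomputable, and zlib is only a proxy). Patterned strings compress far better than an irregular one.

```python
import random
import zlib

def compressed_size(s: str) -> int:
    """Bytes after zlib compression: a crude proxy for the length of
    the shortest program that could regenerate the string."""
    return len(zlib.compress(s.encode()))

constant = "0" * 1000        # maximally regular: "repeat '0' 1000 times"
patterned = "10" * 500       # regular: "repeat '10' 500 times"
rng = random.Random(0)
irregular = "".join(rng.choice("01") for _ in range(1000))

# The less compressible a string, the more 'random' in this sense.
print(compressed_size(constant), compressed_size(patterned), compressed_size(irregular))
```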
How might we interpret given information from the point of view of the sample space?
The more information we have, the more cases can be removed from the initial space. The ‘new’ sample space now contains a more refined set of the cases of interest relative to the possible cases.
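This refinement can be sketched as conditioning on a die roll: learning "the outcome is even" removes cases, and probabilities are then re-expressed relative to the smaller space. The particular events here are illustrative.

```python
from fractions import Fraction

# Full sample space of a die roll.
space = {1, 2, 3, 4, 5, 6}

# Learning "the outcome is even" removes cases, refining the space.
evidence_even = {o for o in space if o % 2 == 0}

# Cases of interest are counted relative to the refined space:
# P(outcome > 3 | even) = |{4, 6}| / |{2, 4, 6}|
of_interest = {o for o in evidence_even if o > 3}
print(Fraction(len(of_interest), len(evidence_even)))  # 2/3
```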