Technical notes · Information (I)
I as Structured Difference
This page formalizes Information (I) as structured difference: patterns in Relation (R) that are stable enough to encode, transmit, and transform distinctions. We connect the IO notion of information to standard tools from information theory (entropy, mutual information) while emphasizing its ontological rather than purely statistical status.
Abstract
In Informational Ontology, information is not an abstract measure attached to messages, but the structural form of difference itself when organized through relation under constraint. Starting from Δ and R, we define informational structures as those relational patterns that support reliable encoding and transformation. Using entropy-like measures as a guide (rather than an ultimate foundation), we show how information emerges from constrained variation and why I is the natural bridge from structural ontology to awareness in the next stage.
1. Preliminaries and Notation
Retaining U as a domain and Δ, R as defined in prior modules, we now introduce a more explicitly information-theoretic vocabulary.
- A state space S is a set of possible configurations (e.g. of a system, signal, or environment).
- A random variable X on S is a function from an underlying sample space Ω to S; for our purposes, we can treat X as a variable ranging over S with associated probabilities p(s).
- The Shannon entropy of X is H(X) = −∑ p(s) log p(s), with the sum over s ∈ S, measuring expected surprise.
- The mutual information of X and Y is I(X; Y) = H(X) + H(Y) − H(X, Y), measuring the reduction in uncertainty about one variable obtained by observing the other.
In IO, these quantities are interpreted as measures over structures grounded in Δ and R, not as free-floating mathematical artifacts.
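As a concrete illustration, the sketch below computes these quantities for a small discrete joint distribution. The distribution, variable names, and helper function are illustrative assumptions, not part of the IO formalism.

```python
import math

def entropy(probs):
    """Shannon entropy H = -sum p log2 p, skipping zero-probability outcomes."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A small joint distribution p(x, y) over two binary variables (illustrative values).
joint = {
    ("x0", "y0"): 0.4,
    ("x0", "y1"): 0.1,
    ("x1", "y0"): 0.1,
    ("x1", "y1"): 0.4,
}

# Marginals p(x) and p(y), obtained by summing out the other variable.
px, py = {}, {}
for (x, y), p in joint.items():
    px[x] = px.get(x, 0.0) + p
    py[y] = py.get(y, 0.0) + p

H_X = entropy(px.values())
H_Y = entropy(py.values())
H_XY = entropy(joint.values())

# Mutual information via I(X; Y) = H(X) + H(Y) - H(X, Y).
I_XY = H_X + H_Y - H_XY
print(f"H(X)={H_X:.3f}  H(Y)={H_Y:.3f}  H(X,Y)={H_XY:.3f}  I(X;Y)={I_XY:.3f}")
```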
2. From R to I: Structured Difference Under Constraint
Informally, we can express the R → I step as:
Information is what relational structure looks like when it is constrained, repeatable, and exploitable by systems.
Consider a system with state space S and a set of admissible transitions T ⊆ S × S. Let RS be the relation capturing which states can be distinguished (via Δ) and how they are connected (via T). An informational structure is present when:
- not all transitions are allowed (constraints), and
- those constraints can be exploited to reduce uncertainty about S.
Definition 1. An informational channel in IO is a triple (S, T, μ) where:
- S is a Δ-structured state space,
- T ⊆ S × S is a Δ-consistent transition relation (only between distinct states),
- μ is a measure assigning probabilities or frequencies to transitions in T.
The informational content of such a channel depends on how μ departs from maximal entropy over admissible transitions.
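One way to read Definition 1 quantitatively, sketched below under illustrative assumptions, is to compare the entropy of μ over the admissible transitions T with the maximum entropy log |T| attained by the uniform measure; the gap (which equals the KL divergence of μ from uniform) then serves as a rough proxy for how far the channel departs from maximal entropy. The state space, transition set, and weights are toy examples, not a canonical construction.

```python
import math

# A toy Δ-structured state space and its admissible (Δ-consistent) transitions T.
states = ["a", "b", "c"]
transitions = [("a", "b"), ("b", "c"), ("c", "a"), ("a", "c")]  # not every pair is allowed

# mu: a probability measure over the admissible transitions (illustrative weights).
mu = {("a", "b"): 0.5, ("b", "c"): 0.3, ("c", "a"): 0.1, ("a", "c"): 0.1}

H_mu = -sum(p * math.log2(p) for p in mu.values() if p > 0)
H_max = math.log2(len(transitions))   # entropy of the uniform measure on T
departure = H_max - H_mu              # equals KL(mu || uniform), in bits

print(f"H(mu)={H_mu:.3f} bits, H_max={H_max:.3f} bits, departure={departure:.3f} bits")
```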
3. Entropy as a Measure of Differentiation
Shannon entropy quantifies how evenly a probability distribution spreads over its support: uniform distributions spread realized differences maximally evenly, while highly peaked distributions concentrate realization on a few specific differences. In IO, this is interpreted structurally:
Proposition 1. For a fixed Δ-structured state space S, higher entropy distributions over S correspond to more even distribution of realized differences, while lower entropy distributions correspond to concentration on specific differences.
This does not make entropy the essence of information; rather, it is a way to quantify how a system's realized states explore the possibility space carved out by Δ and R.
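The contrast in Proposition 1 can be made concrete with two distributions over the same four-element state space, as in the sketch below (the numerical values are illustrative).

```python
import math

def entropy(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Same Δ-structured state space, two different ways of realizing its differences.
even   = [0.25, 0.25, 0.25, 0.25]   # realized differences spread evenly: H = 2 bits
peaked = [0.85, 0.05, 0.05, 0.05]   # realization concentrated on one difference

print(f"H(even)   = {entropy(even):.3f} bits")
print(f"H(peaked) = {entropy(peaked):.3f} bits")
```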
4. Mutual Information and Structural Coupling
Mutual information I(X; Y) measures how much knowing X reduces uncertainty about Y. Ontologically, this corresponds to structural coupling: patterns in one system systematically track patterns in another.
Definition 2. Two Δ-structured systems S1, S2 are informationally coupled if there exist variables X on S1 and Y on S2 such that I(X; Y) > 0 with respect to some joint measure μ.
In IO, this is the formal backbone of representation: a system "represents" something when its internal states stand in a stable informational coupling with external states. This becomes crucial for Awareness (A), where such couplings become internalized as models.
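A minimal sketch of Definition 2, under assumed toy values: let Y on S2 be a noisy copy of X on S1 (it tracks X with probability 0.9), build the joint measure, and check that I(X; Y) > 0. The coupling probability and variable encoding are illustrative.

```python
import math

def entropy(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

# X is uniform over two states of S1; Y (on S2) copies X with probability 0.9.
p_x = {0: 0.5, 1: 0.5}
p_y_given_x = {0: {0: 0.9, 1: 0.1}, 1: {0: 0.1, 1: 0.9}}

# Joint measure mu over (X, Y), and the marginal of Y.
joint = {(x, y): p_x[x] * p_y_given_x[x][y] for x in p_x for y in (0, 1)}
p_y = {y: sum(joint[(x, y)] for x in p_x) for y in (0, 1)}

I_XY = entropy(p_x.values()) + entropy(p_y.values()) - entropy(joint.values())
print(f"I(X; Y) = {I_XY:.3f} bits > 0: S1 and S2 are informationally coupled")
```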
5. Structural vs. Statistical Information
IO distinguishes between:
- Structural information: relational patterns that exist independently of any particular probability measure (e.g. possible state transitions, symmetries, invariants).
- Statistical information: numerical measures like H and I(X; Y) that quantify how those structures are actually traversed or realized.
Structural information is primary in IO: it is what reality is like in virtue of Δ and R. Statistical information is derivative: it is how reality happens to be explored by specific processes.
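The distinction can be illustrated by holding a transition structure fixed and varying only the measure over it, as in the sketch below (with illustrative numbers): the structural information stays the same while the statistical information differs.

```python
import math

def entropy(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Structural information: one fixed set of admissible transitions (a constraint on S x S).
structure = [("s0", "s1"), ("s1", "s2"), ("s2", "s0")]

# Statistical information: two different processes realizing that same structure.
process_a = [1 / 3, 1 / 3, 1 / 3]   # explores the structure evenly
process_b = [0.8, 0.1, 0.1]         # dwells mostly on one transition

print(f"Same structure ({len(structure)} admissible transitions), different statistics:")
print(f"  H(process_a) = {entropy(process_a):.3f} bits")
print(f"  H(process_b) = {entropy(process_b):.3f} bits")
```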
6. Information as Precondition for Awareness
Awareness, in IO, arises when informational structures become integrated and self-referential. The step from I to A is not a leap from physics to mystery, but a tightening of information into closed loops and internal models.
In the next technical module, we formalize Awareness as what happens when information about differences is not merely present, but maintained, integrated, and recursively applied by a system to itself and its environment.