Information and entropy
Tags
Ready to Publish
Publish Date
Slug
Excerpt
Related Posts
Featured
External Link
Extra Info
Status
Source URL
bear://x-callback-url/open-note?id=0946245D-48D8-4EBD-9857-8783E61348EF-11205-000002BC1CF8C6E6
Priority
C
Authors
Information is…
::“a reduction in (or resolution of) uncertainty”::
Prediction capability. A unit of prediction (in the context of “What is important” and “What is”).
::“a measure of our ignorance”::
Claude Shannon
::“the difference that makes a difference.”::
Gregory Bateson
::the logarithm of the inverse of the probability of an event::, i.e. I = log(1/Pₑ)
Vlatko Vedral
If something is completely predictable, there is zero information
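A minimal sketch of that formula (assuming the usual base-2 logarithm so the unit is bits; the function name is mine):

```python
import math

def self_information(p_event: float) -> float:
    """Information content of an event with probability p_event, in bits: I = log2(1/p)."""
    return math.log2(1 / p_event)

print(self_information(0.5))   # 1.0 bit: a fair coin flip
print(self_information(1.0))   # 0.0 bits: completely predictable, so zero information
print(self_information(0.01))  # ~6.64 bits: rarer events carry more surprise
```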
Entropy is…
a measure of unpredictability
Wikipedia
a measure of missing information
the average amount of “surprise” associated with a set of events
::information that is hidden from view::. In other words, the amount of missing information
Leonard Susskind
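A similar sketch of entropy as the average surprise over a distribution (the base-2 convention and the function name are my assumptions):

```python
import math

def entropy(probabilities: list[float]) -> float:
    """Shannon entropy H = -sum(p * log2(p)): the expected surprise of a distribution, in bits."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

print(entropy([0.5, 0.5]))                # 1.0 bit: maximally unpredictable coin
print(entropy([0.99, 0.01]))              # ~0.08 bits: nearly predictable, little missing information
print(entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0 bits: four equally likely outcomes
```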
Shannon’s information theory was based on these precepts (sketched in code after the list):
- Encode everything into bits
- Compress it
- Add error detection and correction algorithms
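A toy pass through those three steps using Python’s standard library (zlib compression and a CRC-32 checksum stand in for real source and channel coding; everything beyond the three precepts is my own illustrative assumption):

```python
import zlib

message = "information is the resolution of uncertainty"

# 1. Encode everything into bits (here: UTF-8 bytes).
encoded = message.encode("utf-8")

# 2. Compress it (remove statistical redundancy).
compressed = zlib.compress(encoded)

# 3. Add error detection (a CRC-32 checksum; a real channel code would also correct errors).
checksum = zlib.crc32(compressed)
packet = checksum.to_bytes(4, "big") + compressed

# Receiver side: verify the checksum, then decompress.
received_checksum = int.from_bytes(packet[:4], "big")
payload = packet[4:]
assert received_checksum == zlib.crc32(payload), "corruption detected"
print(zlib.decompress(payload).decode("utf-8"))
```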
Value-theories = Information
A Value-theory represents information. As we grow, we learn how to reduce the uncertainty of how to influence our future. As your father said to you: life is about creation. An old friend of mine used to say that “we are the whole learning how to become itself”. That is it.
From “More reporting on my progress on OGFOL today”
Knowledge
The purpose of knowledge is prediction (of a meaningful outcome)