Information entropy
Entropy is a concept from information theory. It tells how much information there is in an event. In general, the more uncertain or random an event is, the more information it contains. The concept of information entropy was created by the mathematician Claude Elwood Shannon.
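Shannon measured entropy in bits. For a set of possible outcomes with probabilities p, the entropy is the sum of p × log2(1/p) over all outcomes. The short Python sketch below shows this idea; the function name entropy_bits and the sample probabilities are only for illustration.

```python
import math

def entropy_bits(probabilities):
    """Shannon entropy of a list of probabilities, measured in bits.

    The probabilities should be between 0 and 1 and sum to 1.
    Outcomes with probability 0 add nothing to the total.
    """
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A fair coin flip is as uncertain as possible for two outcomes: 1 bit.
print(entropy_bits([0.5, 0.5]))   # 1.0

# A biased coin is easier to predict, so it has less entropy.
print(entropy_bits([0.9, 0.1]))   # about 0.47

# An outcome that is certain has no entropy at all.
print(entropy_bits([1.0]))        # 0.0
```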
Example
Let's look at an example. If someone is told something they already know, they get very little information. It is pointless to tell them what they already know.
If they are told about something they knew little about, they get much more new information. This information is valuable to them, because they learn something new.
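This idea can be put into numbers. The amount of information in one event is log2(1/p) bits, where p is how likely the event was. The small Python sketch below shows that being told a certain thing gives 0 bits, while being told something unlikely gives many bits; the function name surprisal_bits is only for illustration.

```python
import math

def surprisal_bits(p):
    """Bits of information gained from learning an event with probability p."""
    return -math.log2(p)

# Something you already knew (probability 1) gives no new information.
print(surprisal_bits(1.0))      # 0.0

# The result of a fair coin flip gives exactly 1 bit.
print(surprisal_bits(0.5))      # 1.0

# A very surprising fact (probability 1 in 1024) gives 10 bits.
print(surprisal_bits(1 / 1024)) # 10.0
```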
See also
- Entropy encoding
- Kolmogorov-Sinai entropy in dynamical systems
- Theil index
External links
- Information is not entropy, information is not uncertainty! - a discussion of the use of the terms "information" and "entropy".
- I'm Confused: How Could Information Equal Entropy? - a similar discussion on the bionet.info-theory FAQ.
- Java "entropy pool" for cryptographically-secure unguessable random numbers
- Description of information entropy from "Tools for Thought" by Howard Rheingold