Entropy


Entropy is one of the most important concepts in physics. In information theory it quantifies information, and information in turn is associated with a probability distribution.

The way in which probability distributions are associated with information is through the uncertainty, or certainty, associated with a particular distribution. For example, if the probability of a certain event is one, then the probability of any other event occurring is zero, and we have no uncertainty about the outcome of the experiment. On the other hand, if we have a set of events that are all equally likely, then there is no good guess as to which event will occur, because no one event is more likely than the others. In between these two extremes is the case where some event, or events, are more likely than others; we then have some information about which event is more likely, but not complete information about which event will occur.

Shannon provided a quantitative measure of this uncertainty. For a probability distribution $\{p_i\}$, the Shannon entropy is $H = -\sum_i p_i \log_2 p_i$, which is zero for a certain event and maximal for a uniform distribution.
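
As an illustration (not part of the original article), here is a minimal Python sketch computing the Shannon entropy for the three cases described above; the function name shannon_entropy is chosen for this example:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum_i p_i log2(p_i), in bits.

    Terms with p_i = 0 contribute nothing, since p log2(p) -> 0 as p -> 0.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A certain event: no uncertainty about the outcome, H = 0 bits.
print(shannon_entropy([1.0, 0.0, 0.0, 0.0]))      # 0.0

# Four equally likely events: maximal uncertainty, H = log2(4) = 2 bits.
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0

# An intermediate case: partial information about the likely outcome.
print(shannon_entropy([0.7, 0.1, 0.1, 0.1]))      # about 1.36 bits
```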