The measure of how much information a source conveys to a destination is called entropy.
"The exact amount of transmitted information depends on how much the designation knows beforehand, and on the number of choices the source has in deciding what message to select from a number of alternatives".
Entropy can be thought of as the degree of similarity of function or organization of any system, expressed in terms of probability. The idea of entropy was first introduced in physics: in thermodynamics, the tendency for a closed system to run down from an organized, improbable state to a disorganized, more probable, and chaotic state is considered an increase in entropy.
In open systems, by contrast, there is a tendency toward higher order, because the system can take in energy from outside itself to counteract the increase in entropy. This source of energy is known as information.
In other words, to exchange information is to give more order and organization to a system and to reverse the direction of the change in entropy. In short, information is the reduction of uncertainty, which is the freedom of choice available in a system.
Among the patterns of form or behavior of a system, lack of originality, uniqueness, and distinction indicates high probability and thus low information. Information theory then provides measurement criteria based on the amount of information a message contains: the less probable the occurrence of a message, the more information it can carry. The concept is like measuring electricity as the potential difference between two points of a wire: it measures the uncertainty before and after an event.
- Asghar Talaye Minai, Architecture as Environmental Communication, 1984
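For the quantitative side of "less probable means more information," the passage is gesturing at Shannon's standard formulation. A minimal sketch, in my notation rather than Minai's, assuming messages x are drawn from a known probability distribution p(x):

\[
I(x) = -\log_2 p(x), \qquad H = -\sum_x p(x)\,\log_2 p(x)
\]

Here I(x) is the information, in bits, carried by a single message x, and H is the entropy, the average information per message. A message that was certain beforehand (p(x) = 1) carries 0 bits, while one with probability 1/8 carries 3 bits; H is largest when all choices are equally likely, which matches the idea of information as the freedom of choice available in a system.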