

The concept of ``information''

Claude Shannon (mathematician at Bell Telephone Laboratories, designer of switching circuits that implemented Boolean operations, 1948): started from the observation that the quantity of information transmitted over a channel depends on variation in a signal - in the simplest case, on or off. The basic unit of information, a bit or binary digit, distinguishes between two alternatives. Adding a second bit doubles the number of alternatives to 4 (00, 01, 10, 11); a third bit doubles them again to 8. The informativeness of an event depends on the number of alternatives it excludes.
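
A minimal sketch of this arithmetic (the function names below are illustrative, not from the lecture): each added bit doubles the number of distinguishable alternatives, and conversely a choice among N equally likely alternatives carries log2(N) bits.

    import math

    def alternatives(bits):
        """Number of equally likely alternatives that `bits` binary units can distinguish."""
        return 2 ** bits

    def information_bits(n_alternatives):
        """Information (in bits) carried by a choice among n equally likely alternatives."""
        return math.log2(n_alternatives)

    print(alternatives(1))        # 2  (on / off)
    print(alternatives(2))        # 4  (00, 01, 10, 11)
    print(alternatives(3))        # 8
    print(information_bits(8))    # 3.0 bits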

Extending this concept to more than two alternatives with unequal probabilities: for English text, Shannon defined informativeness in terms of the average number of guesses needed to predict the next letter. Redundancy is then the reciprocal of the average number of guesses needed to produce the correct letter. Printed English yielded a redundancy estimate of about 50% (an average of two guesses was needed per letter).
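
The link between the guessing figure and the 50% estimate, as stated above, can be sketched as follows (a rough illustration of the lecture's simplified relation, not Shannon's full procedure; the names are invented):

    def redundancy_estimate(avg_guesses_per_letter):
        """Lecture's simplification: redundancy is the reciprocal of the
        average number of guesses needed to produce the correct next letter."""
        return 1.0 / avg_guesses_per_letter

    print(redundancy_estimate(2.0))   # 0.5 - the ~50% redundancy estimate for printed English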

George Miller (Dept. of Psychology, Harvard, from 1946): interested in mathematical analysis and information theory, in particular the work of Claude Shannon.

In 1956 Miller wrote ``The magical number seven, plus or minus two: Some limits on our capacity for processing information'' - a synthesis of previously known results.

The limitation is built into us either by learning or by the design of the nervous system. We use several techniques to increase the memory span; one of these is chunking. IBMCIABBCUSA exceeds the span, but if chunked into IBM, CIA, BBC, USA it can be remembered easily; similarly BSPBJPTDPDMKSP chunks into BSP, BJP, TDP, DMK, SP.
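
A toy illustration of chunking (the dictionary of familiar chunks and the greedy grouping are assumptions made for illustration, not part of the lecture):

    KNOWN_CHUNKS = {"IBM", "CIA", "BBC", "USA", "BSP", "BJP", "TDP", "DMK", "SP"}

    def chunk(letters, known=KNOWN_CHUNKS, max_len=3):
        """Greedily group a letter string into familiar chunks, falling back to single letters."""
        chunks, i = [], 0
        while i < len(letters):
            for size in range(max_len, 0, -1):
                piece = letters[i:i + size]
                if piece in known or size == 1:
                    chunks.append(piece)
                    i += size
                    break
        return chunks

    print(chunk("IBMCIABBCUSA"))     # ['IBM', 'CIA', 'BBC', 'USA'] - 4 chunks instead of 12 letters
    print(chunk("BSPBJPTDPDMKSP"))   # ['BSP', 'BJP', 'TDP', 'DMK', 'SP'] - 5 chunks instead of 14 letters

The point is that the 7-plus-or-minus-2 limit applies to chunks, not raw letters, so recoding the letters into familiar units brings the string back within the span.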

Other methods include making relative rather than absolute judgements and coding entities along several dimensions.

