Information Theory


A branch of applied mathematics concerned with the quantification of information. It was originally developed by Claude E. Shannon to establish fundamental limits on compressing data and on storing and communicating it reliably.
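Shannon's central measure is entropy, the average number of bits needed per symbol from a source. As a minimal sketch (the `entropy` helper and the example distributions are illustrative, not from the original text):

```python
import math

def entropy(probs):
    """Shannon entropy in bits of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin yields one full bit per toss; a biased coin yields less,
# which is what makes its outcomes compressible.
print(entropy([0.5, 0.5]))  # 1.0
print(entropy([0.9, 0.1]))  # roughly 0.47
```

The lower the entropy, the further the data can, in principle, be compressed.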

A more recent branch of information theory characterises the amount of information a stream of data conveys by the length of the shortest program that could generate it. However, as Chaos Theory shows, seemingly complex results can arise from simple systems. The implication for Information Architecture is that innovative approaches can often deliver extremely powerful metaphors that make apparently intractable information structures manageable.
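The chaos-theory point can be seen in one line of arithmetic: the logistic map below is a tiny program, yet its output stream looks statistically complex. This is an illustrative sketch (the function and its parameters are chosen for the example, not taken from the text):

```python
def logistic_stream(x0=0.4, r=3.99, n=32):
    """Iterate the logistic map x -> r*x*(1-x), a one-line chaotic rule."""
    x, out = x0, []
    for _ in range(n):
        x = r * x * (1 - x)
        out.append(x)
    return out

# Threshold the orbit into bits: an apparently patternless stream,
# even though the generating program is only a few characters long.
bits = ''.join('1' if x > 0.5 else '0' for x in logistic_stream())
print(bits)
```

In the shortest-program sense, this stream carries very little information despite its complex appearance.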

Relation to Names

One place where information theory has a direct impact is on the names we give to things. For example, the international telephone dialling scheme is set up so that the most commonly dialled numbers are the shortest.
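The same frequent-gets-short principle underlies variable-length codes such as Huffman coding. As a toy sketch (the symbol frequencies are invented for illustration), the builder below returns only the code lengths:

```python
import heapq

def huffman_lengths(freqs):
    """Huffman code lengths for a symbol->frequency map: the most
    frequent symbols end up with the shortest codewords."""
    # Each heap entry: (frequency, tiebreaker, {symbol: depth-so-far}).
    heap = [(f, i, {s: 0}) for i, (s, f) in enumerate(freqs.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        fa, _, a = heapq.heappop(heap)
        fb, _, b = heapq.heappop(heap)
        # Merging two subtrees pushes every contained symbol one level deeper.
        merged = {s: d + 1 for s, d in {**a, **b}.items()}
        heapq.heappush(heap, (fa + fb, counter, merged))
        counter += 1
    return heap[0][2]

lengths = huffman_lengths({'e': 12, 't': 9, 'a': 8, 'q': 1, 'z': 1})
print(lengths)  # 'e' gets the shortest code, 'q' and 'z' the longest
```

Short dial codes for busy routes and short codewords for common letters are the same optimisation.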

Relation to the User Interface

Many user interface designers also adopt concepts from Information Theory. In this case, however, they measure the number of interaction events needed to achieve a particular goal.

Under this scheme they enumerate all the actions that users want to perform, assign weights to each to account for the relative frequencies of the different actions, and then adjust the various menus, forms and other interaction elements to minimise the average number of navigation events.
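The metric being minimised is just a frequency-weighted average. A minimal sketch, assuming a hypothetical menu layout (the action names, frequencies and event counts are invented for the example):

```python
def expected_events(actions):
    """Frequency-weighted average number of interaction events.
    `actions` maps an action name to (relative frequency, events needed)."""
    total = sum(f for f, _ in actions.values())
    return sum(f * e for f, e in actions.values()) / total

# Hypothetical layout: frequent actions are placed where they cost fewer events.
layout = {'save': (50, 1), 'print': (30, 2), 'export': (15, 3), 'settings': (5, 4)}
print(expected_events(layout))  # 1.75 events per action on average
```

Designers compare this figure across candidate layouts and keep the one with the lowest average.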


