Lavoisier S.A.S.
14 rue de Provigny
94236 Cachan cedex
FRANCE

Opening hours: 8:30–12:30 / 13:30–17:30
Tel.: +33 (0)1 47 40 67 00
Fax: +33 (0)1 47 40 67 02


Canonical URL: www.lavoisier.fr/livre/mathematiques/novelty-information-and-surprise/descriptif_4745946
Short URL or permalink: www.lavoisier.fr/livre/notice.asp?ouvrage=4745946

Novelty, Information and Surprise (2nd ed., 2022), Information Science and Statistics series

Language: English

Author: Günther Palm


This revised edition offers an approach to information theory that is more general than the classical approach of Shannon. Classically, information is defined for an alphabet of symbols or for a set of mutually exclusive propositions (a partition of the probability space Ω) whose probabilities add up to 1. The new definition is given for an arbitrary cover of Ω, i.e. for a set of possibly overlapping propositions. The generalized information concept is called novelty, and it is accompanied by two derived concepts, information and surprise, which describe "opposite" versions of novelty: information is related more closely to classical information theory, and surprise to the classical concept of statistical significance. In the discussion of these three concepts and their interrelations, several properties and classes of covers are defined, which turn out to form lattices. The book also presents applications of these concepts, mostly in statistics and in neuroscience.
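To make the contrast concrete, here is a minimal Python sketch, not taken from the book: it computes the classical Shannon information of a partition and a novelty-style quantity, −log2 of the probability of the proposition describing each outcome, for an overlapping cover. The probability space, the cover, and the rule of picking the least probable proposition containing an outcome are illustrative assumptions, not Palm's exact definitions.

```python
# Sketch only: partition vs. overlapping cover on a small probability space.
# The sets and the description rule below are illustrative assumptions.
from math import log2

# A finite probability space: outcomes with probabilities summing to 1.
p = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}

def prob(event):
    """Probability of a proposition, i.e. a set of outcomes."""
    return sum(p[w] for w in event)

# Classical case: a partition -- mutually exclusive propositions covering the space.
partition = [{"a"}, {"b"}, {"c", "d"}]
shannon_info = sum(prob(A) * -log2(prob(A)) for A in partition)

# Generalized case: a cover -- propositions may overlap.
cover = [{"a", "b"}, {"b", "c"}, {"c", "d"}, {"a", "d"}]

def describe(w, cover):
    """Illustrative description rule: the least probable proposition containing w."""
    return min((A for A in cover if w in A), key=prob)

# Expected novelty under this description: E[-log2 P(d(omega))].
expected_novelty = sum(p[w] * -log2(prob(describe(w, cover))) for w in p)

print(f"Shannon information of the partition: {shannon_info:.3f} bits")
print(f"Expected novelty under the cover description: {expected_novelty:.3f} bits")
```

With overlapping propositions the probabilities assigned to the chosen propositions no longer sum to 1, which is exactly where the classical entropy formula stops applying directly and the more general novelty concept takes over.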


Contents:

Surprise and Information of Descriptions: Prerequisites; Improbability and Novelty of Descriptions; Conditional Novelty and Information.
Coding and Information Transmission: On Guessing and Coding; Information Transmission.
Information Rate and Channel Capacity: Stationary Processes and Information Rate; Channel Capacity; Shannon's Theorem.
Repertoires and Covers: Repertoires and Descriptions; Novelty, Information and Surprise of Repertoires; Conditioning, Mutual Information and Information Gain.
Information, Novelty and Surprise in Science: Information, Novelty and Surprise in Brain Theory; Surprise from Repetitions and Combination of Surprises; Entropy in Physics.
Generalized Information Theory: Order- and Lattice-Structures; Three Orderings on Repertoires; Information Theory on Lattices of Covers.
Bibliography. Index.

Günther Palm studied mathematics in Hamburg and Tübingen. After completing his studies (Master's in 1974, Ph.D. in 1975) he worked on nonlinear systems, associative memory and brain theory at the Max Planck Institute for Biological Cybernetics in Tübingen. In 1983/84 he was a Fellow at the Wissenschaftskolleg in Berlin. From 1988 to 1991 he was Professor of Theoretical Brain Research at the University of Düsseldorf. Since then he has been Professor of Computer Science and Director of the Institute of Neural Information Processing at the University of Ulm, where his focus is on information theory, pattern recognition, neural networks, and brain modelling.

Provides definitions of useful new concepts: description, novelty, surprise, template

Discusses new viewpoints on information theory in relation to the natural sciences

Demonstrates a method for analyzing neuronal spike trains (burst surprise); a sketch of this kind of calculation follows below
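As a rough illustration of the spike-train analysis mentioned in the last point, the sketch below computes a Poisson-style burst surprise: the improbability, in bits, of seeing at least n spikes in a short window given a homogeneous background firing rate. The rate, window length and spike count are made-up values, and the calculation is a generic Poisson-tail surprise rather than the book's exact burst-surprise definition.

```python
# Hedged sketch of a Poisson-style "burst surprise" for a spike train:
# the surprise of observing at least n spikes in a window of length T at
# background rate lamb is -log2 P(N >= n) with N ~ Poisson(lamb * T).
from math import exp, log2

def poisson_tail(n, mu):
    """P(N >= n) for N ~ Poisson(mu)."""
    term = exp(-mu)                # P(N = 0)
    cdf = 0.0
    for k in range(n):             # accumulate P(N <= n-1)
        cdf += term
        term *= mu / (k + 1)
    return max(1.0 - cdf, 1e-300)  # guard against rounding to zero

def burst_surprise(n_spikes, window_s, rate_hz):
    """Surprise (in bits) of n_spikes within window_s at background rate_hz."""
    mu = rate_hz * window_s
    return -log2(poisson_tail(n_spikes, mu))

# Example with made-up numbers: a burst of 8 spikes in 50 ms against a 5 Hz background.
print(f"{burst_surprise(8, 0.05, 5.0):.2f} bits of surprise")
```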

Publication date:

293 pages

15.5 x 23.5 cm

Available from the publisher (lead time: 15 days).

137.14 €
