
@hjalli I've certainly heard this type of usage for taxis in the past (particularly for trips to airports). It's not a bad metric mathematically: cab pricing takes into account not only mileage but also driving time, so it can better differentiate a long country commute from a much shorter drive through heavy traffic in a major city. An Uber version presumably also folds in the surge-pricing variable, which indicates how readily a ride can be had at that time of day. An Uber price could thus convey considerably more information to the listener in a much shorter sentence.
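
To make that concrete with a toy example: here's a sketch of a fare function where the base rate, per-mile and per-minute charges, and surge multiplier are all invented for illustration, not taken from any actual pricing schedule. Two trips of identical mileage come out very differently once time and surge enter.

```python
# Toy fare model -- all rates below are assumptions for illustration only.
def fare(miles: float, minutes: float, surge: float = 1.0) -> float:
    BASE, PER_MILE, PER_MINUTE = 2.50, 1.75, 0.30  # hypothetical rates
    return surge * (BASE + PER_MILE * miles + PER_MINUTE * minutes)

# Same 20 miles, very different trips:
print(fare(miles=20, minutes=25))             # 45.00 -- open highway
print(fare(miles=20, minutes=75, surge=1.4))  # 84.00 -- rush-hour city traffic
```

A bare "20 miles away" collapses those two trips into one number; the fare keeps them apart.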

 

Ooh! There's an arXiv math.IT bot?! @mathITbot

 
 
 

There aren't a lot out there, but here are the ones I'm aware of:
* Thomas Cover (YouTube): https://www.youtube.com/user/classxteam
* Raymond Yeung (Coursera): https://www.coursera.org/course/informationtheory (may require an account to see the three or more archived versions)
* Andrew Eckford/York University (YouTube): Coding and Information Theory https://www.youtube.com/channel/UCEFL7YLqmfa8fMW8iExYF2g
* NPTEL: Electronics & Communication Engineering http://nptel.ac.in/courses/117101053/

Fortunately, most are pretty reasonable, though they vary in their coverage of topics. I'd be glad to hear about others, good or bad, if anyone is aware of more. The top two are from professors who've written two of the most common textbooks on the subject. If I recall correctly, a version of the Yeung text is available for download through his course interface.

 

@bestteenring Let's continue the discussion when you have time. I'm still intrigued.

 

Best way to start the holiday weekend #WhatImReading

"Why Information Grows" with homemade rosemary bread

 

The Evolution of Information Gathering: Operational Constraints
Cynthia F. Kurtz
1991 Master's Thesis, SUNY Stony Brook, Ecology & Evolution

Abstract: I present two new approaches to the study of information in foraging theory. First, rather than determine the cost a forager should pay to obtain information, I concentrate on the consequences of information use in an interacting population. I describe a density-dependent model which tracks genotypes with high and low information access through evolutionary time. Stable polymorphisms result. I suggest that the value of information is not monotonically increasing. Second, I present a scheme for partitioning the information used in the decision making process. Three types of information are recognized: internal information, or an individual's internal state; external information, or environmental factors; and relational information, or rules for predicting transformations of internal state. Interactions between the three types are examined in an extension of the basic model.
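
To give a feel for what a model like this can do, here's a minimal sketch of a density-regulated population with two genotypes, where I've assumed the payoff of information access is negatively frequency-dependent (informed foragers compete with one another for the same discovered resources). Every parameter value, and the fitness functions themselves, are my assumptions for illustration; this is not Kurtz's actual model.

```python
# Minimal sketch, NOT the thesis's model: two genotypes under logistic
# density regulation; the benefit of information access is assumed to be
# negatively frequency-dependent. All parameter values are invented.

K = 1000.0        # carrying capacity (assumed)
COST = 0.05       # fixed cost of information access (assumed)
BENEFIT = 0.20    # benefit of information when informed types are rare (assumed)

high, low = 10.0, 990.0   # informed vs. uninformed genotype counts

for generation in range(500):
    n = high + low
    p = high / n                                # frequency of informed genotype
    w_high = 1.0 + BENEFIT * (1.0 - p) - COST   # informed fitness
    w_low = 1.0                                 # uninformed baseline
    mean_w = p * w_high + (1.0 - p) * w_low
    growth = 1.0 + 0.5 * (1.0 - n / K)          # logistic density regulation
    high *= growth * w_high / mean_w
    low *= growth * w_low / mean_w

# Selection balances where BENEFIT * (1 - p) = COST, i.e. p = 0.75:
print(f"informed frequency: {high / (high + low):.2f}")   # ~0.75, a stable mix
```

The point, as in the abstract, is that neither genotype simply wins: the value of information falls as more of the population uses it, and a stable polymorphism results.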

 
 

"A few exciting words": information and entropy revisited | Lyn Robinson and David Bawden - Academia.edu

Abstract: A review is presented of the relation between information and entropy, focusing on two main issues: the similarity of the formal definitions of physical entropy, according to statistical mechanics, and of information, according to information theory; and the possible subjectivity of entropy considered as missing information. The paper updates the 1983 analysis of Shaw and Davis. The difference in the interpretations of information given respectively by Shannon and by Wiener, significant for the information sciences, receives particular consideration. Analysis of a range of material, from literary theory to thermodynamics, is used to draw out the issues. Emphasis is placed on recourse to the original sources, and on direct quotation, to attempt to overcome some of the misunderstandings and oversimplifications that have occurred with these topics. While it is strongly related to entropy, information is neither identical with it, nor its opposite. Information is related to order and pattern, but also to disorder and randomness. The relations between information and the “interesting complexity,” which embodies both patterns and randomness, are worthy of attention.
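
For reference, the formal similarity the authors discuss is easy to state: with the same probabilities over states, the two definitions differ only by the constant factor k_B ln 2. This is my summary of the standard textbook forms, not the paper's notation.

```latex
\begin{align}
  H &= -\sum_i p_i \log_2 p_i   && \text{(Shannon: information, in bits)} \\
  S &= -k_B \sum_i p_i \ln p_i  && \text{(Gibbs: physical entropy, in J/K)} \\
  S &= (k_B \ln 2)\, H          && \text{(when the $p_i$ coincide)}
\end{align}
```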

 

"Waiting for Carnot": information and complexity | Lyn Robinson and David Bawden - Academia.edu

Abstract: The relationship between information and complexity is analysed, by way of a detailed literature analysis. Complexity is a multi‐faceted concept, with no single agreed definition. There are numerous approaches to defining and measuring complexity and organisation, all involving the idea of information. Conceptions of complexity, order, organization and ‘interesting order’ are inextricably intertwined with those of information. Shannon’s formalism captures information’s unpredictable creative contributions to organized complexity; a full understanding of information’s relation to structure and order is still lacking. Conceptual investigations of this topic should enrich the theoretical basis of the information science discipline, and create fruitful links with other disciplines which study the concepts of information and complexity.
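
A quick numerical illustration of why Shannon's measure alone doesn't capture "interesting order": first-order symbol entropy is low for pure repetition, near-maximal for pure noise, and intermediate for meaningful text, so it ranks strings by unpredictability rather than by organization. This sketch assumes per-symbol frequency entropy is the quantity of interest; it ignores sequential structure entirely.

```python
from collections import Counter
from math import log2
import random
import string

def symbol_entropy(s: str) -> float:
    """First-order Shannon entropy in bits per symbol (ignores ordering)."""
    n = len(s)
    return -sum((c / n) * log2(c / n) for c in Counter(s).values())

ordered = "a" * 1000                                           # pure repetition
text = "the relations between information and complexity " * 20
noise = "".join(random.choices(string.ascii_lowercase + " ", k=1000))

for label, s in [("ordered", ordered), ("text", text), ("noise", noise)]:
    print(f"{label:8s} {symbol_entropy(s):.2f} bits/symbol")
# ordered prints 0.00; noise comes out highest (near log2(27) ~ 4.75);
# the English text lands in between -- neither extreme is "interesting".
```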

 

Arieh Ben-Naim's book is out next week: Information, Entropy, Life and the Universe http://amzn.to/1el5ocG

 

8th Annual North American School of Information Theory (NASIT), August 10-13, 2015, UC San Diego, La Jolla, California

The School of Information Theory will bring together over 100 graduate students, postdoctoral scholars, and leading researchers for four action-packed days of learning, stimulating discussions, professional networking, and fun activities, all on the beautiful campus of the University of California, San Diego (UCSD) and in the nearby beach town of La Jolla.

 

The Homological Nature of Entropy

Abstract: We propose that entropy is a universal co-homological class in a theory associated to a family of observable quantities and a family of probability distributions. Three cases are presented: (1) classical probabilities and random variables; (2) quantum probabilities and observable operators; (3) dynamic probabilities and observation trees. This gives rise to a new kind of topology for information processes, which accounts for the main information functions: entropy, mutual informations at all orders, and Kullback–Leibler divergence, and generalizes them in several ways. The article is divided into two parts that can be read independently. In the first part, the introduction, we provide an overview of the results, some open questions and future lines of research, and a brief discussion of the application to complex data. In the second part we give the complete definitions and proofs of theorems A, C, and E from the introduction, which show why entropy is the first homological invariant of a structure of information in four contexts: static classical or quantum probability, and dynamics of classical or quantum strategies of observation of a finite system.

Keywords: Shannon information; homology theory; homotopy of links; mutual informations; Kullback–Leibler divergence; trees; monads; partitions
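
For orientation, these are the standard identities among the information functions the abstract names; as I read it, the paper's framework recasts relations like the chain rule below as cocycle conditions. The notation here is the usual textbook one, not the paper's.

```latex
\begin{align}
  H(X,Y) &= H(X) + H(Y \mid X)    && \text{(chain rule)} \\
  I(X;Y) &= H(X) + H(Y) - H(X,Y)  && \text{(mutual information)} \\
  D_{\mathrm{KL}}(p \,\|\, q) &= \sum_i p_i \log \frac{p_i}{q_i}
                                  && \text{(Kullback--Leibler divergence)}
\end{align}
```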