Language change as a (random?) walk in entropy space

This talk explores language change from an information-theoretic perspective. The word entropy of more than 1000 written languages from over 100 families is mapped along two dimensions, measuring lexical diversity and the recurrence of the same word combinations. Both turn out to be highly constrained, in the sense that the world's languages fall into relatively narrow ranges along both dimensions. Zooming into these ranges, however, there is also considerable variation between languages. As languages change over time, they move around in this entropy space. A fundamental question is whether this walk through entropy space is purely random, directed, or a combination of both. As a first step toward answering this question, I explore how ancient languages compare to their modern counterparts within the same language family.
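The abstract does not specify which estimators are used, but a minimal sketch of the general idea is given below: each corpus is placed at a point in a two-dimensional "entropy space", here approximated by plug-in estimates of unigram word entropy (lexical diversity) and the conditional entropy of a word given the previous word (predictability of recurring word combinations). The function name, the choice of estimators, and the toy corpora are illustrative assumptions, not the talk's actual pipeline.

```python
# Sketch (assumed estimators, not the talk's method): place a tokenized text
# at a point in a two-dimensional entropy space.
#   H1 -- plug-in unigram word entropy (lexical diversity)
#   H2 -- conditional entropy H(w_t | w_{t-1}), via H(bigram) - H(context)
# Serious work would use bias-corrected estimators and much larger corpora.
from collections import Counter
from math import log2


def entropy_coordinates(tokens: list[str]) -> tuple[float, float]:
    """Return (unigram entropy, bigram conditional entropy) in bits per word."""
    n = len(tokens)
    unigrams = Counter(tokens)
    bigrams = Counter(zip(tokens, tokens[1:]))

    # H1: plug-in estimate over the unigram distribution
    h1 = -sum((c / n) * log2(c / n) for c in unigrams.values())

    # H2: H(w_{t-1}, w_t) - H(w_{t-1}) = H(w_t | w_{t-1})
    m = n - 1
    h_joint = -sum((c / m) * log2(c / m) for c in bigrams.values())
    contexts = Counter(tokens[:-1])
    h_context = -sum((c / m) * log2(c / m) for c in contexts.values())
    return h1, h_joint - h_context


# Toy usage: two tiny samples land at different points in entropy space.
sample_a = "the cat saw the dog and the dog saw the cat".split()
sample_b = "one two three four five six seven eight nine ten".split()
print(entropy_coordinates(sample_a))
print(entropy_coordinates(sample_b))
```

Tracking such coordinates for ancient and modern stages of the same family would then show whether the resulting trajectories look like random walks or drift in a consistent direction.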
