IRE TRANSACTIONS---INFORMATION THEORY, 1956

The Bandwagon

CLAUDE E. SHANNON

Information theory has, in the last few years, become something of a scientific bandwagon. Starting as a technical tool for the communication engineer, it has received an extraordinary amount of publicity in the popular as well as the scientific press. In part, this has been due to connections with such fashionable fields as computing machines, cybernetics, and automation; and in part, to the novelty of its subject matter. As a consequence, it has perhaps been ballooned to an importance beyond its actual accomplishments. Our fellow scientists in many different fields, attracted by the fanfare and by the new avenues opened to scientific analysis, are using these ideas in their own problems. Applications are being made to biology, psychology, linguistics, fundamental physics, economics, the theory of organization, and many others. In short, information theory is currently partaking of a somewhat heady draught of general popularity.

Although this wave of popularity is certainly pleasant and exciting for those of us working in the field, it carries at the same time an element of danger. While we feel that information theory is indeed a valuable tool in providing fundamental insights into the nature of communication problems and will continue to grow in importance, it is certainly no panacea for the communication engineer or, a fortiori, for anyone else. Seldom do more than a few of nature's secrets give way at one time. It will be all too easy for our somewhat artificial prosperity to collapse overnight when it is realized that the use of a few exciting words like information, entropy, redundancy, do not solve all our problems.

What can be done to inject a note of moderation in this situation? In the first place, workers in other fields should realize that the basic results of the subject are aimed in a very specific direction, a direction that is not necessarily relevant to such fields as psychology, economics, and other social sciences. Indeed, the hard core of information theory is, essentially, a branch of mathematics, a strictly deductive system. A thorough understanding of the mathematical foundation and its communication application is surely a prerequisite to other applications. I personally believe that many of the concepts of information theory will prove useful in these other fields - and, indeed, some results are already quite promising - but the establishing of such applications is not a trivial matter of translating words to a new domain, but rather the slow tedious process of hypothesis and experimental verification. If, for example, the human being acts in some situations like an ideal decoder, this is an experimental and not a mathematical fact, and as such must be tested under a wide variety of experimental situations.

Secondly, we must keep our own house in first class order. The subject of information theory has certainly been sold, if not oversold. We should now turn our attention to the business of research and development at the highest scientific plane we can maintain. Research rather than exposition is the keynote, and our critical thresholds should be raised. Authors should submit only their best efforts, and these only after careful criticism by themselves and their colleagues. A few first rate research papers are preferable to a large number that are poorly conceived or half-finished. The latter are no credit to their writers and a waste of time to their readers. Only by maintaining a thoroughly scientific attitude can we achieve real progress in communication theory and consolidate our present position.

"Seldom do more than a few of nature’s secrets give way at one time. It will be all too easy for our somewhat artificial prosperity to collapse overnight when it is realized that the use of a few exciting words like information, entropy, redundancy, do not solve all our problems."
"Authors should submit only their best efforts, and these only after careful criticism by themselves and their colleagues. A few first rate research papers are preferable to a large number that are poorly conceived or half-finished."
Shannon's citation history, dating back to when Google Scholar started tracking it... I wonder what he would think of the explosion in both scientific publishing and in articles citing/using information theory...
![imgur](https://imgur.com/HPM6y5U.png)
The decoder from the famous channel diagram in "A Mathematical Theory of Communication" takes the received signal and decodes it into a message that reaches the destination.
![Imgur](https://imgur.com/UB46Z56.png)
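A toy illustration of the decoder's job (this example is not from the paper): with a 3-repetition code over a noisy binary channel, the encoder sends each message bit three times, and the decoder recovers the message from the received signal by majority vote.

```python
def encode(bits):
    """Repeat each message bit three times before transmission."""
    return [b for b in bits for _ in range(3)]

def decode(received):
    """Majority vote over each group of three received symbols."""
    return [int(sum(received[i:i + 3]) >= 2)
            for i in range(0, len(received), 3)]

message = [1, 0, 1]
signal = encode(message)          # [1,1,1, 0,0,0, 1,1,1]
signal[1] ^= 1                    # the channel flips one symbol
assert decode(signal) == message  # a single error per block is corrected
```

Any single flipped symbol per block is corrected, at the cost of tripling the transmission length; Shannon's channel coding theorem shows far more efficient trade-offs are possible.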
"I personally believe that many of the concepts of information theory will prove useful in these other fields - and, indeed, some results are already quite promising - but the establishing of such applications is not a trivial matter of translating words to a new domain."
These words still ring true today, both for information theory and for any other bandwagon tool/technique (deep learning trends come immediately to mind).
Some of the famous formulas/concepts from information theory: entropy, mutual information, KL divergence, channel capacity. Here is a short mathematical overview of some of these concepts: https://www.cs.cmu.edu/~odonnell/toolkit13/lecture20.pdf
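A minimal sketch of a few of these quantities, using only the standard definitions (entropy H(p), relative entropy D(p||q), and the capacity C = 1 - H(f) of a binary symmetric channel with crossover probability f):

```python
import math

def entropy(p):
    """Shannon entropy H(p) in bits."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def kl_divergence(p, q):
    """Relative entropy D(p || q) in bits (assumes q_i > 0 wherever p_i > 0)."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def bsc_capacity(f):
    """Capacity of a binary symmetric channel with crossover probability f."""
    return 1 - entropy([f, 1 - f])

print(entropy([0.5, 0.5]))                               # 1.0 bit: a fair coin
print(round(entropy([0.9, 0.1]), 3))                     # 0.469: a biased coin
print(round(kl_divergence([0.5, 0.5], [0.9, 0.1]), 3))   # 0.737
print(round(bsc_capacity(0.1), 3))                       # 0.531
```

A fair coin carries a full bit per toss, a biased coin less, and a channel that flips 10% of its bits can still carry about half a bit per use - the kind of specific, deductive result Shannon means by the "hard core" of the subject.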
This was written in 1956, 8 years after Shannon's work on information theory was first published in "A Mathematical Theory of Communication". In those 8 years, information theory began to make its mark, setting the stage for the information age. Shannon focused less on the applications of information theory, so perhaps he would be surprised by just how many fields it has been applied to, and by the digital revolution it led to. From circuits to microchips, data storage, cell phones, machine learning and social networks, information theory has continued to be foundational to so much technology and theory - which is not to say it wasn't a bandwagon in 1956.
"Research rather than exposition is the keynote, and our critical thresholds should be raised." Indeed!
Poor Shannon would be rolling in his grave if he saw the explosion of quantity over quality in today's academic publishing. Perhaps there is some merit to giving voice to a wide range of authors, but it certainly is exhausting to wade through the multitudes of articles that add little to our collective knowledge.
It seems buzzwords held disproportionate sway even in Shannon's day.
This paper is by Claude Shannon, the father of information theory and one of the most creative and important scientists of the 20th century. Information theory transformed our thinking about the quantification, communication and storage of information, and Claude Shannon's "A Mathematical Theory of Communication" lays the foundation for the field. Claude Elwood Shannon (April 30, 1916 – February 24, 2001) was a master inventor and tinkerer and had a storied career. Among his accomplishments: he proved that electrical circuits could implement any expression of Boolean algebra (fundamental to electronic digital computers); he made major advancements in cryptanalysis as a codebreaker during World War II; and he worked at the storied Bell Labs, inventing the field of information theory (which subsequently allowed the digital revolution to occur).
Some interesting facts about Shannon: he invented the first wearable computer (a device to improve the odds in playing roulette); he loved building robots and AI devices, and famously built a maze-solving mechanical mouse while at Bell Labs; when he moved to Massachusetts to be a professor at MIT, his house was famously called the Entropy House; and he became a multi-millionaire from investments in technology companies.
More about Shannon here: https://en.wikipedia.org/wiki/Claude_Shannon
A few other annotated papers by Shannon:
1) Prediction and Entropy of Printed English:
https://fermatslibrary.com/s/prediction-and-entropy-of-printed-english-2
2) Scientific aspects of juggling:
https://fermatslibrary.com/s/scientific-aspects-of-juggling
Shannon mouse: https://www.youtube.com/watch?v=vPKkXibQXGA