Rethinking about Information—Semi-sesquicentennial Later
The 1940s were a fascinating era. Yes, it was a time of war, but it was also a time of recovery and of rethinking our collective future. The decade began in bloodshed and ended with some of the most significant developments in human history, including the creation of the United Nations and the independence or formation of several key countries around the world, including India, Israel, and the People’s Republic of China. It was also the time when many scholars, emerging from war-torn nations and nationalism, began thinking about a collective vision of a world in which scientific advances could better humankind. That is where we find some of the most profound ideas and innovations that laid the foundation of the modern notion of “information.” Yes, I am choosing not to call it “information science” or “information technology.” That’s because these innovations were not bound to a specific technology or discipline. They were, instead, about rethinking what “information” is, what it can do, and how it could affect humanity at large.
Take, for example, Claude Shannon. Working at Bell Labs in New Jersey, he was interested in measuring information as if it were some kind of matter or currency. To do so, he came up with a new unit of measurement, which he called the bit (short for “binary digit”). This was an absurd thing to think about at the time, and Shannon’s original 1948 paper in The Bell System Technical Journal went almost unnoticed. Next to the invention of the transistor at Bell Labs around the same time (in late 1947), the idea of quantifying information didn’t seem so groundbreaking.
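To make Shannon’s measure concrete, here is a small illustrative sketch (not from the essay itself): his entropy formula assigns an information content, in bits, to any probability distribution over outcomes. The function name `entropy_bits` is my own label for the standard formula.

```python
import math

def entropy_bits(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over nonzero p."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin flip carries exactly one bit of information.
print(entropy_bits([0.5, 0.5]))  # → 1.0

# A heavily biased coin carries less, because its outcome is more predictable.
print(entropy_bits([0.9, 0.1]))  # ≈ 0.47 bits
```

This is exactly the sense in which information becomes measurable “matter”: the less predictable a message, the more bits it takes to convey it.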
Around that time, another brilliant mind, Vannevar Bush, was thinking about information in a different way while heading the Office of Scientific Research and Development (OSRD). In his 1945 essay “As We May Think,” he envisioned a machine he called the Memex. The machine was modeled after an office desk—the state of the art for storing and processing information at the time. The Memex was meant to take all of one’s collection of information (think books, dictionaries, encyclopedias), store it all in some compressed fashion, and retrieve relevant pieces when needed. It would do so by making appropriate associations among different segments and sources of information, much as a human brain does. This idea of building human-like associative memory into an electro-mechanical device paralleled the work of Warren McCulloch and Walter Pitts, which resulted in the McCulloch-Pitts neuron in 1943, considered to be the first mathematical model of a biological neuron in the human brain.
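The McCulloch-Pitts neuron is simple enough to sketch in a few lines. This is an illustrative rendering of the standard 1943 model, not code from any of the original work: binary inputs are summed and compared against a threshold, and by varying the threshold the same unit computes different logical functions.

```python
def mp_neuron(inputs, threshold):
    """McCulloch-Pitts unit: fires (returns 1) iff the number of
    active binary inputs meets or exceeds the threshold."""
    return 1 if sum(inputs) >= threshold else 0

# With two inputs and threshold 2, the unit computes logical AND.
print(mp_neuron([1, 1], 2))  # → 1
print(mp_neuron([1, 0], 2))  # → 0

# With threshold 1, the very same unit computes logical OR.
print(mp_neuron([0, 1], 1))  # → 1
```

That a network of such threshold units could, in principle, compute any logical function is what made the model so influential as a bridge between biology and computation.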
—Shannon wants to feed not just data to a Brain, but cultural things!—
Shannon and Bush were not the only ones at the time interested in redefining how we think about information and what intelligent things we can do with it. Across the pond, the British codebreaker Alan Turing was working on combining data and logic to solve computational problems. In other words, he was after building a machine that could take in data and instructions as inputs and process any kind of logic. This was, at least in theory, a universal machine, not bound by its hardware. The idea contained the blueprint for the modern computer.
In 1943, Turing visited Bell Labs and met Shannon. Turing exclaimed, “Shannon wants to feed not just data to a Brain, but cultural things! He wants to play music to it!” Later, however, Turing seemed to embrace this idea completely. In 1949, he told The Times of London, “I do not see why it (the machine) should not enter any one of the fields normally covered by the human intellect, and eventually compete on equal terms. I do not think you can even draw the line about sonnets, though the comparison is perhaps a little bit unfair because a sonnet written by a machine will be better appreciated by another machine.”
To put these ideas by Shannon, Bush, and Turing in perspective, consider that the first computer announced to the public—ENIAC—was unveiled in 1946. The first stored program ran on a computer in 1948. And it wasn’t until 1953 that IBM started selling data-processing machines to businesses. Such a progression from vision to reality is not unprecedented. Arthur C. Clarke wrote The Sentinel in 1951, which became the movie 2001: A Space Odyssey in 1968, a year before the first human walked on the moon. Sometimes a vision comes off as speculation and turns out to be completely detached from reality. Other times, it helps drive innovation.
The United Nations celebrated its 75th anniversary (semi-sesquicentennial) in 2020, and as we approach other semi-sesquicentennials (Bush’s Memex, Shannon’s theory of information, the Turing Test), it is important to reflect on how those ideas and visions have played out over the decades and where we go from here. Of course, I don’t mean to imply that nothing else profound has happened in these 75 years. But so much of what has happened in this time traces its origins to those formative years of information that I have to single out that era.
Information has become even more central to our lives and our future. And perhaps it’s time to rethink what it is and what we should do with it. With the launch of this new platform—Information Matters—we embark on a new journey to explore exactly that. We will ask—not just the experts and loud voices, but also ourselves—what information means to us and how we could use it for the betterment of society. We will learn from each other, and sometimes argue, too. Both are equally important for our advancement. It matters that we do this. It matters that we are here. The discourse matters. The dissemination of these ideas matters. Information matters.
Cite this article in APA as: Shah, C. (2021, August 7). Rethinking about information—Semi-sesquicentennial later. Information Matters. Vol. 1, Issue 8. https://r7q.22f.myftpupload.com/2021/08/rethinking-about-information-semi-sesquicentennial-later/