Charles Edge

A Visual Representation of The History of Computing (and what it means)

The end of the year is a great time to reflect and think deeper about how we are changing, how the world is changing, and what our place in the world is. Software is eating the world. But hardware has always been able to eat software. Hardware can also just arbitrarily choose not to bootstrap an operating system or load a program (or so a few tattered floppies from college seem to indicate).

Sitting through hundreds of pitches a year from talented founders really makes you think about how much technology is taking over every aspect of our lives. We store large amounts of data in public clouds. We use our devices for work, to connect with our families, to occupy our free time with games, to help us stay fit at the gym, and even to help us navigate the world when driving or taking public transit. We are firmly in the era of ubiquitous computing.


We took a few decades to get comfortable with virtual everything. Then the dot-com bubble put the idea of everything on the web in our heads. Given that we have devices on us at all times now, we’ve transitioned to an app for everything. If we can break a process down into combinatorial math or basic logic, we can automate it. The machine learning capabilities becoming ubiquitous on even the cheapest devices mean we can go further still and have the devices observe patterns of every ilk. Sure, they might get a movie recommendation wrong, but they get far more right than wrong!


I’ve been meaning to write about the generations of computing laid out by most modern computing historians for a while, but have been unable to find the original source of the generational model. To summarize, the five generations roughly follow the advent of the technology used for the central processing of information:

  • First Generation (1940-1956): Vacuum tubes for processing, magnetic drums for memory

  • Second Generation (1956-1963): Transistors (invented in 1947 at Bell Labs but didn’t fully replace tubes immediately) for processing and often magnetic cores for memory

  • Third Generation (1964-1971): Integrated circuits

  • Fourth Generation (1971-2010): Microprocessors

  • Fifth Generation (2010-Present): Systems on a Chip (SoCs) with cores for machine learning

We can then look at how the technology evolved based on the processors in the devices we use. These follow a few trends as we moved from generation to generation. Behind them flow new programming languages and even operating systems that improve our ability to interact with the underlying software. Sometimes the change is gradual and sometimes it comes in bursts, but it is consistent when plotted on a timeline. Let's look at that timeline (access it on Miro.com here or use the embedded Miro board below).


Here's the timeline as a downloadable PDF if the embed doesn't work: ComputingHistoryTimeline.pdf (173KB).

These interconnections and advances are relevant (and even critical) to understand because the trends are consistent with technological adoption in the current age. Let’s go through a few of these trends.

  • Devices get smaller. First Generation devices began as entire floors of buildings and were reduced to the size of rooms; transistorized computers got down to the size of a refrigerator. Integrated circuits ushered in the era of the minicomputer (roughly the size of a 4U or 6U server), microprocessors got smaller still, and Fifth Generation devices fit in our pockets or on our wrists. In the timeline, we include MOSFET scaling to show how transistors shrink over time.

  • Power consumption per device goes down. Tubes take a lot of power to fire up and run. Transistors take less, but they added up and still created a lot of heat. Once we moved multiple transistors into an integrated circuit package, the power draw of a microprocessor became somewhat deterministic. We might say the Fifth Generation is more about systems on a chip than artificial intelligence, as it’s typically labelled. The move to SoCs has seen the lowest power consumption yet, although as transistors approach the size of subatomic particles, we may need more shielding and power again. But even that seems like something scientists are working to solve.

  • Devices became more pervasive with each generation. The first behemoth mainframes were used primarily for scientific research, Cold War-era cryptography, and very large businesses. They slowly spread on the backs of sellers from companies like IBM through the Second Generation, and minicomputers in the Third Generation brought the price down to the point that most any business could afford one if it wanted to. But most companies had just that one until the Intel 4004 CPU paved the way for microcomputers from early PC companies. Slowly the devices spread to every desk in the office and most homes, at least until the arrival of the SoC and then the age of mobility in the Fifth Generation.

  • Standards become more important. Each generation saw devices do more, store more, and process more repetitive tasks with a higher likelihood of success. So we began to rely on them more and more. Not only could they help humans with repetitive scientific tasks, but they could also begin to replace the people who performed those tasks. First in science, to solve differential equations; then, as we entered the Second Generation, computers were used for missile defense and the business uses began to emerge, like the intent of the Bell Labs research to automate the work previously done by telephone operators to connect phone calls. Then came business automation as we entered the Third Generation, nearly every process automation as we entered the Fourth, and finally knowledge work as we entered the Fifth. Standards allow that spread to be controlled. We slow down innovation in exchange for reliability across vendors and interoperability. And so with a greater international marketplace following World War II we saw ISO founded in 1947, with the rapid spread of electrical equipment we saw the IEEE founded in 1963, and with the rise of the Internet we saw the IETF in 1986. These bodies help govern everything from weights and measures to 5G in our modern era.

  • Computers become more interactive. We loaded information into the original mainframes by patching wires to carry electrons through tubes. We then loaded punch cards, and we hooked up a keyboard and monitor when computers got interactive. That interactivity and the limited capacity of computing resources led us on a journey to network computers, first over wired links, then wireless, then over cellular and satellite communications, until we could chat with friends, family, and coworkers by voice or text or email or whatever means the urgency of a given communication demanded. Interactivity came first with the computer, then with each other using the computer as a medium.

  • Code gets more re-usable. Punch cards ran only on the system they were punched for throughout the first two generations of computing. Early programming languages appeared in the Second Generation, and by the Third Generation operating systems were born and languages like C could even run across them. Object-oriented code began early in the Fourth Generation, and by the Fifth we had REST interfaces for web services and the era of the public API, with pretty much everything available as a service.

  • Devices become more stable and require less maintenance. This is important if we are to rely on devices. First Generation tubes were prone to burning out when they were flipped between the on and off positions. As the generations went on, devices got more dense, with fewer moving parts, and by the Fourth Generation companies were making devices that weren’t field-serviceable any longer. This extends to the kernels that our operating systems sit on and then to how those segment memory and file system storage in the Fifth Generation, as we have to be more careful than ever not to allow an app to run amok (for stability and security reasons).

  • Devices got cheaper to design, manufacture and purchase. Imagine a time when core memory had to be manually woven together or when Seymour Cray hand-drew his processors. We have tools to build tools, tools to design tools, and tools to repair tools. As tooling becomes more mature and more precise, we can print just about anything quickly and efficiently and even split the development from the manufacturing (e.g. fabless chip design houses or offshored software development). By the start of the Fourth Generation there was a race to the bottom in pricing, resulting in price wars between companies like Commodore and Atari. This has come and gone in waves for decades, which helps continue the mass proliferation of devices into every aspect of our lives.

  • Peripheral and component options exploded. We traded early punch cards for paper tape, and then a keyboard and monitor as computers became more interactive. Time-sharing systems in the Third Generation even allowed multiple terminals, and supporting that multitasking required more internal components. We then invented gestures, offloaded graphics to GPUs, hooked up cameras, grabbed a mouse, and of course the old gaming joysticks once we had microprocessors in the Fourth Generation. The Fifth Generation has seen many of those move to wireless forms of connectivity, allowing for more flexibility to control devices (which are usually themselves miniature computers) from voice assistants, and so we replace many of the electrical devices in our homes (such as light switches and HVAC controls) with internet-enabled alternatives.

  • We became more dependent on the devices. The original computers were used primarily for scientific research and built as one-offs by scientific teams at universities (similar to quantum computing five years ago). But as they spread to business, people got checks with holes punched in them for processing, then traded those for cards with magnetic stripes, and now tap NFC chips on our phones to enable transactions. We need maps to find our way, rely on alarms to get up in the morning instead of traditional clocks, shop online, trade stocks online, and bank online. We depend on these things sometimes because they’re more efficient, sometimes because they’re easier (and thus improve our quality of life), and sometimes because they more consistently provide telemetry into aspects of our lives than we can get from our own qualitative analysis.

  • Computing companies got wealthier. There’s a reason “big tech” is constantly in the news. It’s big business. Burroughs paid roughly $4.8 billion to merge with Sperry in 1986, forming the second largest computing company in the world at the time. IBM now has a market cap of over $100 billion, but companies that are far younger (or were born in subsequent generations) are even wealthier, like Apple, which is nearing a $3 trillion market cap. We see this in sub-generations: Netscape made plenty of millionaires, but Google made billionaires. And we see it in the transition from brick and mortar to online: Barnes and Noble sits on a $367 million market cap while Amazon is worth a staggering $1.7 trillion in comparison.

  • We get more productive. Those rewards in terms of wealth are there for a reason: productivity and profit. The First Generation of computers made some of the best scientists in the world more productive, allowing them to do things like develop the hydrogen bomb. The Second Generation spread throughout academia, and as we turned the page to the Third Generation computing spread through the military-industrial-university complex and from the biggest companies down to any company. Personal computing in the Fourth Generation gave anyone accounting software, the ability to print, spreadsheets, and let us leverage computing for leisure - so, games. And by the Fifth Generation we went from sharing access to a computer to ubiquitous computing, where many of us have an average of three devices on us most of the time. They enable us to be hyper-productive and put ourselves on the strictest of schedules, diets, workout regimens, and so on. Consider a time when we might update cells across an entire book of sheets by hand, maybe using an older electro-mechanical calculator; now we just sum a column in about a second, or pull data in from a SQL database or a REST endpoint through Postman, since someone else has probably already done whatever task we need to do (a short sketch of that kind of task follows this list).

  • Intellectual property needs greater protections. The standards bodies represent one way to protect competitive marketplaces, but for vendors who choose to take products to market on their own, we've seen an evolution in patent and copyright law to protect the increasingly expensive cost of research and development. Computers from the First and Second Generations relied primarily on patents, but as software became portable in the Third Generation we had to explore how copyright case law applied to source code and assembled code. And as design became increasingly important, a new body of law emerged to protect design elements. These protections reward organizations that put new and novel solutions on the market for a duration, in order to fuel further innovation.

  • We put funding new companies and ideas on an assembly line. There are so many good ideas. When they work, we think of them as innovations. Companies form to take innovations to market and help spread their benefits throughout the world. Rich patrons funded research for centuries, but as the Industrial Age unfolded we got new ways of funding companies. Venture capital emerged following World War II, as Georges Doriot and others wanted to push the world forward faster. That helped spur the Second Generation of computing, as Doriot made massive gains for his limited partners on Digital Equipment, who ushered in transistorized computing out of the TX-0 work done at MIT. Arthur Rock helped bring in the Fourth Generation by backing Intel, Apple, and others. And the number (and types) of funding sources continue to grow, with organizations like Y Combinator and Techstars going downstream to the seed stage and private equity firms like KKR, Bain, and Blackstone helping take on the later stages (and the public markets, of course).

  • We have more telemetry. Computers stored more and more data during the Third Generation, when magnetic storage and tape allowed us to keep information permanently. Banks, hospitals, universities, and every type of organization could store our records and look them up later. As we moved to the Fourth Generation we could store our own data on floppies and CDs, and with the rise of the web and the invention of REST in 2000, we started being able to exchange information as a service. Now we can see incredibly detailed information mashed up (sorry to use an old buzzword there) in any app that decides to pull it in. That's data, and of course the advancements in video don't even need a mention here.

  • We get more collaborative. When Kemeny and Kurtz wrote BASIC, they had been taking a train to MIT to process the punch cards they loaded onto the system there. At the time, collaboration meant having a person like John McCarthy tell them they should look into getting grants to build their own time-sharing system. And so they built one, networked it, and provided it as a service to others, just as PLATO had done for students at the University of Illinois in the 1960s. Those Third Generation systems evolved into tools like Lotus Notes and Microsoft Exchange in the Fourth Generation, and then from email and groupware to apps for everything and private and corporate chat built into most tools we use today. Collaboration across disciplines, cultures, and sections of the population is the wellspring of innovation.

  • We get access to more information. The Memex was a system foreshadowed in the essay “As We May Think” by the great Vannevar Bush in 1945. He told us of a future where we could have a device on our desk that gave us access to all the written information in the world and allowed us to quickly get at the exploding branches of science in the post-war era. We started cataloging data as soon as we had permanent storage, but the arrival of networks of computers that could communicate with each other over an internet in the Fourth Generation saw an explosion of people creating content, cataloging everything from recipes to the history of the world. And then, with user-generated content and the rise of Web 2.0 as we turned the page to the Fifth Generation, we got an even deeper understanding of one another. Perhaps too deep sometimes.
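
To make the productivity trend above a little more concrete, here is a minimal sketch, in Python, of the kind of task mentioned in that bullet: pulling rows from a REST endpoint and totaling a column, the sort of work that once meant re-footing a ledger by hand. The endpoint URL, the JSON shape, and the column name are all hypothetical, purely for illustration; any API that returns a list of records would do.

    # Minimal sketch: fetch rows from a (hypothetical) REST endpoint and total one column.
    # Assumes the endpoint returns JSON shaped like [{"amount": 12.5, ...}, ...].
    import requests

    def sum_column(url: str, column: str) -> float:
        """Fetch rows from a REST endpoint and sum a single numeric column."""
        response = requests.get(url, timeout=10)
        response.raise_for_status()
        rows = response.json()
        return sum(float(row[column]) for row in rows)

    if __name__ == "__main__":
        # Hypothetical sales endpoint; swap in any service that returns a list of records.
        total = sum_column("https://example.com/api/sales", "amount")
        print(f"Total: {total:,.2f}")

A generation or two ago that total meant a person, a calculator, and an afternoon; today it is a dozen lines of code that run in about a second.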

There are so many other trends, but these are some of the most impactful. Our lives are so different from those of the people who sat reading Bush’s article in The Atlantic in 1945. And yet the assembly line of innovation carries on, and the next developments are likely to continue shaping our lives in new and yet familiar ways. This is where the timeline helps us get at new insights.

One reason to observe these impacts on our overall cultures and lives is that by looking at the trends through the past, we can infer certain things about the future. Let’s take this diagram we’ve been developing on the history of programming languages, operating systems, processing technology, etc.


Now let’s ask what social coding, re-usable bits of object-oriented code, service-oriented architectures, voice assistants, deep learning, sensors on and in everything, and other aspects of technology do to the way we live our lives. Another trend through the generations of computing is that a new evolution arrives, the new advancement gets shoehorned into architectures from the previous generation, and at some point we rethink how we do things and refactor around the evolution - thus realizing an even greater return on it. Which leads us to the final question here: what is next for each tool in the computing arsenal that runs our modern lives, and how can we link it all together in new, interesting, and innovative ways?
