We talk a lot about “digital.” But, ironically, digital isn’t binary. It’s a gradual encroachment of technology at the intersection of humans and computers. If public services don’t keep up with shifts that have radically democratized technology, we’ll miss huge opportunities to make government a platform atop which people can build their own tools and interfaces.
It’s useful to think of our relationships to digital tools as a spectrum. At one end is the dawn of computing: People toiling over analog computers, sliding beads on an abacus, flipping knobs to calibrate a bomb sight, or adjusting a slide rule. This was the start of our relationship with computers, and it was ugly, hard, and slow.
The workings of the Norden bomb sight, an early analog computer (http://www.twinbeech.com/images/bombsight/bifnordennomenclature.jpg, Public Domain, https://commons.wikimedia.org/w/index.php?curid=3771841)
As we separated software from hardware, we used punched cards—an idea stolen from looms—to store our programs. A single wrongly-punched hole could ruin an entire programming job, and computers were scarce. We stayed up late for free time on the mainframe, adjusting our schedules to the machines.
This deck contained a single program—and every hole had to be in the right place, entered in the correct order. (ArnoldReinhold – Own work, CC BY-SA 3.0, https://commons.wikimedia.org/w/index.php?curid=16041053)
Back then, computing was so valuable there weren’t any spare cycles left to make things easier for the humans. Early digital programming was as close to the vacuum tubes, transistors, and chips as possible. It was literally machine language. My first computer, an Apple ][ clone, came with few instructions; I had to resort to manuals that explained how to program its CPU (the 6502).
Even machine language programming is abstracted from the actual physical machine. To understand how complicated this makes things, here’s a trip down memory lane:
- Binary digits (ones and zeros) are electrical settings (on and off).
- Four binary digits store a number from 0 to 15. Those sixteen values can be written in hexadecimal, counting with 0-9 and A-F. Traditionally, we prefix these numbers with a “#”.
- Eight binary digits store numbers from 0 to 255, which was plenty for early computers. That means a value like #A4 is 164 in decimal.
- On the Apple, each hex value had a specific meaning. So, for example, #A9 means “Store the number I’m about to give you in the accumulator.”
- To make this a bit easier to remember, you can use a mnemonic: “LDA” means “Load the accumulator.”
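The arithmetic in these bullets is easy to verify for yourself; a quick Python sketch:

```python
# Checking the binary/hex arithmetic from the list above.
assert 0b1111 == 15          # four binary digits: 0 to 15
assert 0b11111111 == 255     # eight binary digits: 0 to 255
assert 0xA4 == 164           # the value #A4 is 164 in decimal
assert int("A4", 16) == 164  # parsing the hex text "A4"
print(hex(164))  # 0xa4
```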
Which means machine language code looks like this:
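As a sketch of the idea, here is a radically simplified Python model (an illustration, not a real 6502 emulator) stepping through the two-byte program #A9 #A4, i.e. “LDA #A4”:

```python
# A simplified sketch of how a CPU steps through machine code.
# The program is just two bytes: A9 ("load the accumulator with
# the next byte") followed by A4 (the value 164).
program = [0xA9, 0xA4]

def run(program):
    """Step through the bytes, treating 0xA9 as LDA immediate."""
    accumulator = 0
    pc = 0  # program counter: which byte we're looking at
    while pc < len(program):
        opcode = program[pc]
        if opcode == 0xA9:  # LDA #: load the byte that follows
            accumulator = program[pc + 1]
            pc += 2
        else:
            raise ValueError(f"unknown opcode {opcode:#04x}")
    return accumulator

print(run(program))  # 164, i.e. hex #A4
```

The raw bytes are the program; the mnemonic “LDA” is only a human convenience layered on top of them.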
- Higher-level languages like BASIC made programs far easier for humans to read and write. BASIC had a problem, though: It was slow. But through the inexorable progress of Moore’s Law, that changed.
- We developed other languages that could be compiled into machine code to make them faster.
- We developed more human interfaces—mice, graphics cards, GUIs, voice recognition, speech, and so on.
- Screens became inputs, which meant the interface could show you only what was relevant to the task at hand, in context.
Today, even a cheap computer like the Raspberry Pi outstrips human senses in many ways. That means there’s spare computing power to devote to making things easier. Computers are more than meeting us halfway: Most of the code in my phone’s calculator app, from buttons to display, is devoted to my human senses. Very little of that code is actually about doing math. Access to computing has changed dramatically.
SaaS platforms have democratized the building of digital things. You can mock up an app in Figma or Canva, and get someone halfway around the world to build it for you, then run it on a distributed cloud without ever touching a server.
Anyone can set up a presence online with a blog that looks better than most dot-com websites circa 2000. Many SaaS tools offer “no-code” programming—you can configure a pretty robust CRM without knowing how databases work—and for more advanced development there are platforms like Bubble and Webflow.
At the bleeding edge, software development is becoming less about writing code, and more about managing data. With GPT-3, you write prompts, often with surprising results. For example, you might write:
And GPT-3 would respond with:
(You can read the complete response here.)
There’s another, more philosophical, change underway. My Apple did my bidding: I turned it on, and then gave it instructions. I, on the other hand, do my iPhone’s bidding. Instead of a patiently waiting cursor, I get a constellation of tiny red dots reminding me of my digital obligations.
Put another way: I used to write code that produces data; now, I create data that produces code. I used to tell a machine what to do; now, it tells me.
The spectrum from machine-convenient to human-convenient tech interfaces.
These are huge, tectonic shifts. And yet we haven’t really embraced them in the public sector. Rather than giving people canned reports, PDFs, and static tools, we need to give them APIs, no-code interfaces, and the ability to play with the data that is rightfully theirs. We talk about service design, but what about letting citizens design the services themselves? And not just geeky civic technologists, but average citizens?
When we say, “government as platform,” are we thinking big enough?
Perhaps one of the roles of tomorrow’s government services is to be a platform atop which everyone can build their own apps and tools, whether that means simply analyzing the vast troves of data governments hold on our behalf, or creating dashboards for our digital selves. To do otherwise is to ignore a century of change in computing, and to deny our citizens their digital agency.