• To go on a bit about what I think a computer could look like: you don’t need to know how physicists and electrical engineers describe exactly what happens physically inside a real CPU to think about computers in a less “magical” way.

    Kids might also have heard people say “an abacus is the simplest computer” or “the pocket calculator you use in your maths class is a particularly simple example of a real computer”.

    You can make a kind of “computer” just by putting a bunch of switches or levers in a row and manually flicking them on and off to run simple machine code, I believe. Computers are just as much about mathematical abstraction as they are about physical engineering, if not more.

    All you really need to make a kind of computer is things laid out in front of you that can be in a particular state (either 0 or 1). You could write a few 0s and 1s on a bit of paper or do the same thing with on/off switches. Then all you need is to decide on some rule where you look at what the states are and use it to change them or write out the new states.
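
    If it helps, here’s roughly what I mean written out in Python (just a toy I made up for this post, nothing official): a row of “levers” plus one rule, applied over and over. The rule here happens to be “add one in binary”, but it could be anything.

```python
# A row of "levers": each is 0 or 1. The "rule" here is just "add one in binary",
# applied over and over, the same way you'd flip the levers by hand.
state = [0, 0, 0, 0]  # four levers, all off

def apply_rule(bits):
    """Look at the current states, write out the new states (here: +1 in binary)."""
    new = bits[:]
    carry = 1
    for i in reversed(range(len(new))):  # rightmost lever is the least significant
        total = new[i] + carry
        new[i] = total % 2
        carry = total // 2
    return new

for step in range(5):
    print(step, state)
    state = apply_rule(state)
```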

    I believe it should even be possible, especially if you are still manually changing the states, to use this idea to build something that behaves very much like a basic pocket calculator, as a proof of concept.

    You could try making a low-resolution screen out of a grid of square bits of paper or cardboard that are different colours on each side and spin around, like the revolving bookcase hiding a secret room in a mystery novel. Those are basically pixels that are on or off, making a black and white terminal.

    Displaying digits and symbols like + and - by turning these pixels on and off should also be possible, but could be a bit of a pain. In fact, you might have noticed some of the simplest pocket calculators don’t even bother with a screen as complicated or versatile as what I’m describing; they get by with seven-segment displays, because that’s all they need.
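
    As a very rough sketch of the flipping-card screen (again my own toy example, and the tiny made-up font only covers the characters “2+2=4” needs):

```python
# A rough sketch of the "grid of flipping cards" idea: each character is a
# 3x5 patch of pixels, 1 = dark side showing, 0 = light side showing.
# The tiny font below is made up and only covers what "2+2=4" needs.
FONT = {
    "2": ["111", "001", "111", "100", "111"],
    "4": ["101", "101", "111", "001", "001"],
    "+": ["000", "010", "111", "010", "000"],
    "=": ["000", "111", "000", "111", "000"],
}

def render(text):
    """Print a row of characters as flipped (#) and unflipped (.) cards."""
    for row in range(5):
        line = ""
        for ch in text:
            for bit in FONT[ch][row]:
                line += "#" if bit == "1" else "."
            line += " "  # a one-card gap between characters
        print(line)

render("2+2=4")
```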

    The point is that once you can display numbers and symbols on your screen based on how the long line of levers is set up, and essentially do addition, the only other thing you need is user input.
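
    Addition on a row of levers is just column addition with carrying, done in binary. A small sketch of that (a ripple-carry style adder over lists of bits, in the same made-up Python style as before):

```python
# Add two 8-bit numbers the way the levers would: column by column, carrying as you go.
def add_bits(a, b):
    """a and b are lists of 0/1 bits, most significant bit first, same length."""
    result = [0] * len(a)
    carry = 0
    for i in reversed(range(len(a))):  # work from the rightmost column leftward
        total = a[i] + b[i] + carry
        result[i] = total % 2          # what stays in this column
        carry = total // 2             # what gets carried into the next column
    return result, carry

two = [0, 0, 0, 0, 0, 0, 1, 0]         # the number 2 in binary
answer, overflow = add_bits(two, two)
print(answer, overflow)                # [0, 0, 0, 0, 0, 1, 0, 0] 0  ->  4, no overflow
```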

    “User input” here is kinda tricky. If you actually made this computer by manually changing the levers, you don’t necessarily care, since you understand it so intimately that you can just change around the other levers describing what the screen should look like and what numbers are stored. I’m not gonna say too much about trying to make buttons. You could press a button and then store that it was pressed during that cycle of executing your kind of machine code, but that would be pointless if you were manually flipping all the levers anyway.
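
    If you did want buttons, one way to picture it (a made-up sketch, not anyone’s real design) is a flag that gets latched when the button is pressed and cleared once the program has looked at it:

```python
# Toy idea of "store that the button was pressed during this cycle".
# The presses are faked with a list here, just to show the latching.
fake_presses = [False, True, False, False, True]

button_latch = False
for cycle, pressed in enumerate(fake_presses):
    if pressed:
        button_latch = True   # remember the press until the program reads it
    # ...the rest of the cycle's lever-flipping would happen here...
    if button_latch:
        print(f"cycle {cycle}: the program sees the button was pressed")
        button_latch = False  # clear it once it has been handled
```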

    If you were following and most of what I said seemed right, you can hopefully see that you’ve hypothetically built something like a pocket calculator. You’re probably not actually unironically doing all the steps and making a proper computer if you’re reading this, but hopefully this computer has been built up as more of a mathematical object that changes state based on certain rules.

    Anybody who isn’t a computer genius would probably cry (or at least tell someone to fuck off) if they were asked to correctly run through every step to put in 2+2, have this calculator work out the result, then display it on the screen.

    You can imagine a pissed off “IT worker” going through and actually doing all of this though. This can be compared to someone who’s skilled with an abacus.

    But the abacus person would be way faster. It would be no competition. That’s because your silly excuse for a computer has a CPU frequency of the reciprocal of whatever unimaginably tedious period of time it takes your “IT worker” to figure out what they’re even supposed to be doing in this crazy job and do all the steps correctly.
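
    To put a (completely made-up) number on that: if one full cycle of lever-flipping takes the worker ten minutes, the “clock frequency” is just the reciprocal of that period.

```python
# Made-up numbers: one full "cycle" of lever-flipping takes the worker 10 minutes.
period_seconds = 10 * 60
frequency_hz = 1 / period_seconds
print(frequency_hz)  # roughly 0.0017 Hz, versus billions of Hz for a real CPU
```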

    So at this point I hope I’ve given at least some idea of what a computer is. You can think about it as just being a CPU that’s usually somehow connected to an input and a display, where a bunch of levers (or something else) move around and it does calculations. Every calculator is arguably a computer.

    A computer is a kind of mathematical concept: a state machine that does calculations by constantly changing state.

    I think a computer is more of a concept though. It doesn’t even have to actually exist. If you can do binary arithmetic in your head by thinking of some binary numbers and trying to add them or move them around based on certain rules, you’re creating a very simple computer in your own mind.

    People are sort of computers. Binary numbers are obviously analogous to base 10 or whatever else. If you can do a basic calculation that isn’t by literally dragging little bits of food together like an infant, then you’re using some kind of computer. That should be especially clear if you try to remember numbers to use in future calculations. You might even remember certain values in your own head like pi. Look up the Boltzmann brain if you want to as well.

    A lot of people don’t like the idea that people are computers though, and I am in no way convinced everyone is an automaton. People could just have a computer sometimes running in their head. Also the mind seems to go beyond being just a concept itself, even though computers can be used to write computer-based procedures (sets of instructions).

    Getting back to the point, the original “calculators” (and “computers”) were people: workers in offices who did sums by hand or with tools like abacuses and adding machines. That was just the word for them.

    Alan Turing came up with his theoretical computer while trying to solve a maths problem; the famous codebreaking machines that decoded the Nazis’ Enigma messages came later, during the war. I don’t know whether he imagined the stuff people would be doing with computers now.

    There’s one more thing I wanted to say that you may already have realised: computers can do a lot more than be calculators. They can use a set of mathematical steps (a program) to decide what to do or display. If you made your screen out of a grid of spinning cards like I said before (and you had enough levers), there’s nothing stopping you from having more complicated rules describing how to make a text terminal or run Doom. An easier example would be a computer that does absolutely nothing except run the original Pong game. Very old consoles that didn’t even have any kind of boot menu were like this.
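
    To make the Pong-ish example concrete, here’s a deliberately tiny sketch (my own toy again, nothing to do with how the real consoles worked): a fixed program whose only rule is to bounce one pixel back and forth across a one-line card screen.

```python
import time

WIDTH = 16             # how many cards across our pretend one-line screen is
pos, direction = 0, 1  # the "ball" position and which way it's moving

# This computer does absolutely nothing except run this one fixed program.
# Each pass through the loop is one "cycle" of the machine.
for _ in range(60):
    row = ["."] * WIDTH
    row[pos] = "#"                 # flip the card where the ball is
    print("".join(row), end="\r")  # redraw the one-line "screen" in place
    pos += direction
    if pos in (0, WIDTH - 1):      # bounce off the edges
        direction = -direction
    time.sleep(0.05)
print()
```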

    Everything else we do on computers nowadays should be possible if you’re clever enough and work it all out (though going beyond a certain point would be very tedious and not very useful to anyone). And if you wanted something like wifi, you’d also need to figure out how to connect it to some kind of radio.

    The first step in going a lot further, though, would be to make a full text terminal rendering letters, and to try to implement your own version of an assembly language you could convert to machine code.
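
    A toy assembler is really just a lookup from names to numbers. Here’s a sketch with a completely made-up three-instruction language (every mnemonic and opcode is invented for the example), plus an equally naive machine to run the result:

```python
# A made-up 3-instruction "assembly language" and its made-up numeric encoding.
OPCODES = {"LOAD": 1, "ADD": 2, "SHOW": 3}

def assemble(lines):
    """Turn text like 'LOAD 2' into a flat list of numbers (our 'machine code')."""
    code = []
    for line in lines:
        parts = line.split()
        code.append(OPCODES[parts[0]])
        code.extend(int(p) for p in parts[1:])
    return code

def run(code):
    """A very naive machine: one accumulator register, walking the code left to right."""
    acc, i = 0, 0
    while i < len(code):
        op = code[i]
        if op == OPCODES["LOAD"]:
            acc, i = code[i + 1], i + 2
        elif op == OPCODES["ADD"]:
            acc, i = acc + code[i + 1], i + 2
        elif op == OPCODES["SHOW"]:
            print(acc)
            i += 1

machine_code = assemble(["LOAD 2", "ADD 2", "SHOW"])
print(machine_code)  # [1, 2, 2, 2, 3]
run(machine_code)    # prints 4
```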

    Anyway, I am not a computer expert at all and have 0 computer science qualifications, but that’s how I understand computers. Feel free to correct me, or to tell me if you read it all, fully agree, and are a computer expert. That description may have lacked some details, but hopefully I built up the concept of a computer and didn’t make any serious mistakes.

    I don’t really care about every single detail of how tiny computer chips work in real life, at least not for the purpose of having a concept of what a computer is.

    TL;DR: I think of computers more as calculators, think that explains most of what they do, and haven’t spent countless years studying how computers work. I’ve never actually done it, but I can explain the idea behind creating a fully fledged computer in Minecraft or something like that (wiring systems make some of this a lot less manual).

    • The ideas behind getting from there to a general purpose modern computer like we have today are not hugely difficult to get some intuition for.

      If you wanna get only slightly more complicated than everything I explained there about how I see computers, you can switch gears to imagining actually trying to create a Minecraft computer you can code on and play computer games on.

      Having a wiring system like redstone could automate all the logic of how the states change, if you were clever enough with how you designed it. You could also design a similar, more mechanical computer in real life, using switches that actually flip on and off based on an initial state, and go from there. A computer scientist from the future, stuck in a society that does very little (if anything) with electricity like we have today, could actually build a computer like this over a long stretch of time and wow people living before computers were invented.
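
      The “clever wiring” part mostly comes down to logic gates. A sketch (Python standing in for redstone, obviously, and all made up for illustration) of a few gates and an adder built only out of them:

```python
# Each "gate" takes 0/1 inputs and produces a 0/1 output, like a bit of redstone wiring.
def AND(a, b): return a & b
def OR(a, b):  return a | b
def XOR(a, b): return a ^ b

def half_adder(a, b):
    """Add two single bits: returns (sum_bit, carry_bit)."""
    return XOR(a, b), AND(a, b)

def full_adder(a, b, carry_in):
    """One column of binary addition, built only out of the gates above."""
    s1, c1 = half_adder(a, b)
    s2, c2 = half_adder(s1, carry_in)
    return s2, OR(c1, c2)

print(full_adder(1, 1, 1))  # (1, 1): one plus one plus one is binary 11
```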

      Turing was actually just solving a maths problem (the Entscheidungsproblem, roughly “can every mathematical statement be decided by a mechanical procedure?”), so any kind of invention that can solve the same kind of problem (especially in an automated way) is a device essentially equivalent to Turing’s computer.

      You could design a computer in Minecraft (or in real life) that actually just had switches or buttons for user input, with all the other switches hidden somewhere else and flipping automatically in complicated ways to activate or deactivate pixels on a screen. This would be way faster than the “Information Technology worker” from before (who would need a lot of specialist knowledge and skill about how the computer was built), and you could much more easily make a general purpose computer from there.

      That more mechanical or Minecraft-y computer is in theory only a little bit more sophisticated (since it follows the same basic rules anyway), but it can be made to do anything we do on computers today by building up enough layers of code. Someone with enough technical knowledge could write their own full compiler tower for a computer/CPU architecture and go all the way to writing high-level interpreters, so their machine can run modern Python code.
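
      Just to gesture at what one layer higher up that tower looks like, here’s a miniature interpreter for a made-up two-statement language (nothing to do with how real Python is implemented, just the shape of the idea):

```python
# A miniature interpreter: the "high-level language" here has only two statements,
# 'set name number' and 'print name'. Real compiler towers stack many such layers.
def interpret(source):
    variables = {}
    for line in source.splitlines():
        parts = line.split()
        if not parts:
            continue
        if parts[0] == "set":
            variables[parts[1]] = int(parts[2])
        elif parts[0] == "print":
            print(variables[parts[1]])
        else:
            raise SyntaxError(f"unknown statement: {line}")

interpret("""
set x 2
set y 2
print x
print y
""")
```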

      In the early days of computers, when some very technically skilled people had access to computers they could write assembly on, the natural next step was to write a general purpose operating system (instead of just having one set of instructions to do a particular thing). For the handful of experts across the world who could do it, that was easier than it sounds.

      An early general purpose computer with an operating system would have started up to a command prompt, where you could start processes in the shell; the kernel would keep track of each process’s values and flip between them while the switches were going crazy and making flipping (or, more likely, a lot of whirring) noises.
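
      The “flipping between processes” bit is basically a scheduler. A cartoonishly simplified sketch of round-robin switching between a few fake processes (not how any real kernel is written, just the idea):

```python
from collections import deque

# Each fake "process" is just a name and how many time slices it still wants.
processes = deque([
    {"name": "editor",   "steps_left": 3},
    {"name": "compiler", "steps_left": 5},
    {"name": "game",     "steps_left": 2},
])

# Round-robin: give each process one "time slice", then send it to the back of the queue.
while processes:
    proc = processes.popleft()
    proc["steps_left"] -= 1
    print(f"kernel gives a time slice to {proc['name']} ({proc['steps_left']} slices left)")
    if proc["steps_left"] > 0:
        processes.append(proc)
```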

      • You could write and run any program on that kind of computer with a general purpose OS, and the operating systems we use today were written only a few decades ago by teams of very clever people who were quick to think along these lines.

        Programs stored on your computer could be run from the shell, and you could easily write your own simple ones to do more complicated calculations like factorials. You could start trying to do things like give a very precise estimate for pi based on a mathematical series by writing a computer program. Again, computers are basically calculators, and any kind of calculator that saves you from manually adding stuff up on bits of paper is arguably a kind of computer.
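
        For instance (my own quick examples, not anything historical): a factorial, and a pi estimate from the Leibniz series, which converges slowly but is easy to write down.

```python
def factorial(n):
    """n! the straightforward iterative way."""
    result = 1
    for k in range(2, n + 1):
        result *= k
    return result

def estimate_pi(terms):
    """Leibniz series: pi/4 = 1 - 1/3 + 1/5 - 1/7 + ... (converges slowly)."""
    total = 0.0
    for k in range(terms):
        total += (-1) ** k / (2 * k + 1)
    return 4 * total

print(factorial(10))           # 3628800
print(estimate_pi(1_000_000))  # about 3.14159...
```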

        These computers were the best kind of calculators ever invented, though. They were a cutting-edge technology that a kind of scientist with coding skills could use to run complicated calculations. This was the start of “computer labs” and “computer scientists”, who were glorified coders, but not glorified engineers.

        If we go back to what this kind of computer looks like internally, a computer lab was often really just one computer. The machine was made from huge racks of circuitry connected together, with different parts doing different things. Around it would be rows of early keyboards on long cables, paired with screens that only displayed monochrome text, for people sat at desks doing maths.

        If you were an early computer scientist and what you cared about was doing mathematical and scientific calculations, this is what computing was like in around the 70s (I think).

        Videogames and consoles were barely part of this world. Pong consoles and early Atari machines were available around this time, but they were nothing like the gaming that came later.

        If you’ve ever seen the Angry Video Game Nerd (AVGN) on YouTube, especially his early videos, he introduced these kinds of old machines to a far younger generation. His videos show what it’s like to play games on them. Kids with more advanced consoles would watch him to see what playing videogames was originally like.

        AVGN amused the vast majority of his target audience just by showing these games and cussing at them while trying to play. The videos were in a way set in that era, and his humour in the early AVGN videos came from complaining about gaming in the late 70s and the 80s. He’s a collector of old consoles and computers who acts out what it would be like to live in a time when that was your computer. You could even think of his persona as a computer scientist who only had digital videogame machines (a very limited form of computer) and couldn’t do anything he wanted on them.

        But for all the computer scientists had, computer labs had no graphics. Early consoles had graphics. The advantage of a computer scientist’s computer was that you could write out calculations to do and make your own programs.

        Most consoles AVGN shows from around the 70s or early 80s were not general purpose computers at all. They didn’t run an operating system or kernel, really.

        From there, we can get to a “modern computer” that “feels like magic” fairly easily. Even on that huge computer, you could do the same things you can do on an actual modern laptop. It would be way slower, because the circuitry is far less intricate and fast-running, but the fully general purpose computer that does almost anything you want is, given the time, quite straightforward to reach from this point.

        What we actually want is a general purpose computer that’s almost just like what we have today: one you can program on AND that renders graphics. How this actually happened in computing history started with Unix, and parts of Unix are still a standard, hard to ever change, on almost every computer used today.