In the land of Potato PCs
Ittesaf Yeasir
Posted on July 2, 2024
The term "Potato Computer" is used for essentially underperforming PCs. For instance, if you get 30 FPS on a game that isn't particularly resource-heavy, you have yourself a Potato PC.
Here's an actual potato PC for reference
But how much of a potato is it, really?
Okay, let's picture this for a second: you take a PC and start stripping away its features. You take away a lot of its RAM and storage, maybe some of its I/O, and keep going down that route. What you will eventually have is something we can't even call a computer, literally. But where exactly do we draw that line? What exactly IS a computer?
No matter how much you strip its features down, as long as it can perform three basic operations (process data, store data, and manipulate data), it will still be considered a computer. Now what kind of processing are we talking about here? It can be as simple as having the ability to add two numbers.
With that knowledge, a computer that can just add two numbers would be one hell of a potato, wouldn't it?
Let's get back to the analogy of stripping features off.
Let's say you have a calculator app that you want to strip features from without losing functionality. The first thing you could do is get rid of the graphical interface. So now, your calculator looks kinda like this:
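Here's a rough sketch of the idea in Python; the exact code doesn't matter, only the fact that the buttons and the window are gone and the core functionality remains:

# A bare-bones calculator: no GUI, just text in and text out.
def calculate(a, op, b):
    if op == "+":
        return a + b
    if op == "-":
        return a - b
    if op == "*":
        return a * b
    if op == "/":
        return a / b
    raise ValueError("unknown operator: " + op)

while True:
    line = input("> ")          # e.g. "2 + 3", or "q" to quit
    if line.strip() == "q":
        break
    left, op, right = line.split()
    print(calculate(float(left), op, float(right)))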
You can even go one step further and take away the interface we use to write that code, the high-level language itself. That will leave you with what we call "Machine Code", or essentially just zeroes and ones.
Interacting with the computer at such a level is known as low-level computing. Essentially, the more features we strip away, the lower we go.
Operating at a very low level makes us understand the true architecture of a computer and really appreciate the ingenuity of the engineering behind it. And apparently it's something that's taught at uni as well, so I was naturally curious about it. But I had no idea where to start. I didn't want it to get painfully boring, because I knew I'd quit, but I also didn't want it to be "too interesting" in a way that skips over a lot of the fundamentals I need in order to truly appreciate low-level programming.
This is when I discovered NAND to Tetris: a course designed by university professors as an introduction to low-level programming. The name of the course outlines how it operates: you start with a NAND gate and gradually make your way to coding Tetris on your machine.
The course is split into two parts: a hardware part, where you gradually progress to building a 16-bit computer, and a software part, where you write an OS for your computer, create a language, and then code Tetris in that language. Hence the name: NAND to Tetris.
I only completed the first half of the course because the second half didn't interest me as much. With some prior understanding of coding, I had very little trouble going through each module.
Spread over five weeks, the course starts with building the basic logic gates (AND, OR, XOR, etc.) in the first week. Then we move on to creating more complex circuits like half adders, adders, incrementers, program counters, the ALU, and the RAM, until it all connects to form the computer.
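To give a flavour of that first week: every one of those gates can be built from NAND alone. The course has you do this in its own HDL (more on that later), but here's a rough sketch of the same idea in Python:

# NAND is the only primitive; every other gate is wired up from it.
def nand(a, b):
    return 0 if (a and b) else 1

def not_(a):
    return nand(a, a)

def and_(a, b):
    return not_(nand(a, b))

def or_(a, b):
    return nand(not_(a), not_(b))

def xor(a, b):
    return and_(or_(a, b), nand(a, b))

# Quick sanity check: print the truth table for XOR.
for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", xor(a, b))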
What this course really did for me is give me a new lens through which to view operations I already knew about. For instance, IF statements are now multiplexers, a loop is intertwining the I/O of two different components, and so on.
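To make that first point concrete: a multiplexer picks one of two inputs based on a selector bit, which is exactly the choice an if/else expresses in software. A tiny Python sketch of the correspondence (again, not the course's HDL):

def mux(a, b, sel):
    # 1-bit multiplexer: (a AND NOT sel) OR (b AND sel)
    return (a & (1 - sel)) | (b & sel)

def mux_as_if(a, b, sel):
    # The same choice, written the way a programmer usually thinks of it.
    return a if sel == 0 else b

# Both behave identically for every 1-bit input combination.
for a in (0, 1):
    for b in (0, 1):
        for sel in (0, 1):
            assert mux(a, b, sel) == mux_as_if(a, b, sel)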
I can now read stuff like this:
0000000000000010
1110110000010000
0000000000000011
1110000010010000
0000000000000000
1110001100001000
or this: (The following code is not complete)
@R2
M=0       // R2 = 0
@R0
D=M       // D = R0
@STEP
D;JGT     // if D > 0, jump to STEP
@END
0;JMP     // otherwise, jump to END
and have at least some understanding of how these things work.
One thing I should mention is that this course teaches its own version of HDL and uses its own syntax for writing code, which is done intentionally to make the course a bit easier.
Was this absolutely necessary to learn? No.
Was it fun? Hell yeah, it was.
So what's next?
Maybe learning some actual VHDL and building a machine myself? Raspberry Pi?? Other microcontrollers??? I have absolutely no idea. But I am excited to find out.