Binary in fiction is often laughable to an engineer like me. You’ve all seen the same scenes I have. A scientific research facility gets a signal from outer space. After several try/fail cycles, one of the junior scientists has a Eureka! moment.
“Eureka! It’s binary!” the scientist says. [At least, I hope not. I really hope the writer does a better job than that. But, you get the idea.]
Within hours (or even minutes), the scientist translates the entire message to perfection (with images and blueprints, even!) and we understand the message sent by the alien civilization. We can then march off and build our new teleportation device, spaceship, poverty-fixing widget, or, perhaps, merely learn the meaning of life.
Only one problem. Binary doesn’t simply translate directly to English words in a vacuum. Trust me…it doesn’t even do it in a Roomba. (Ba-dump-tis)
The upshot is this: You cannot simply translate binary to English (or any other language) without additional information.
Let’s find out why and how to fix it so that your scientist can at least appear to know what they’re talking about.
Who the hell is this guy, and why is he talking about binary instead of writing?
First, I’ll give you some background so that you’ll understand this isn’t merely an opinion of some random guy on the internet.
Many of you have been here for a while, so you know I’m a speculative fiction writer who mentors other up-and-coming writers. But, many don’t know that I’ve been a professional software engineer for more than 20 years, with nearly 20 years of personal experience prior to that. For those of you doing the math in the background, you’re correct. I’ve been programming computers since I was 10 years old.
Among my various bona fides, I have several years of university-level engineering mathematics under my belt, including a healthy dose of “discrete mathematics”, which is used in the realm of computer and software engineering, and is where Computer Science and other engineering students typically learn how to interact with binary.
In short, I’m the guy who’s going to read your book, or watch the movie based on your screenplay, and leave the experience shaking my head. The problem is that I’m likely your target demographic (geeky “computer dude/dudette” who is immersed in various fandoms, is likely a gamer, and is definitely an avid speculative fiction reader).
So, how do we turn this hot mess around? We learn just enough binary to be dangerous.
Don’t run away scared just yet! We’ll keep this simple. But, when it’s over, your readers will think you’re an expert.
What is Binary?
Binary is nothing more than the base-2 number system.
“Wait! You’ve lost me already! Base WHAT?! What’s a base?”
You’re already familiar with the decimal system. It’s the numbers you use every day, from buying gas, to paying your bills, to counting how many words you didn’t write today.
Sorry. Couldn’t resist. Back to the numbers…
What you may not be aware of is that the decimal system is also referred to as the “base-10” system. Why? Because every digit can have 10 possible values, from 0 to 9.
In binary, which is “base-2”, each digit, which we call a “bit”, can have 2 possible values: 0 or 1.
That’s it. No magic. Nothing particularly scientific about it. Not any more than base 10, at least. It’s just a number system.
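If you want to see this for yourself, most programming languages can render a number in any base. Here’s a quick sketch in Python (my choice here; nothing about binary requires a particular language):

```python
# Counting from 0 to 5 in base 10 and in base 2.
# Same values, different digits -- that's all a "base" means.
for n in range(6):
    print(n, format(n, "b"))  # format(n, "b") renders n in binary
```

Running it prints `0 0`, `1 1`, `2 10`, `3 11`, `4 100`, `5 101`. Notice how binary “rolls over” to a new digit every power of 2, just as decimal rolls over every power of 10.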
Okay. So what?
The numbers themselves don’t mean anything. There is no direct “binary-to-English” (or “binary-to-whatever-language”) converter; any such converter has to make certain assumptions first.
The binary number “01001110”, by itself, is just a number, and a meaningless one without context. It contains 8 “bits”; a group of 8 bits, in computer speak, is known as a “byte”. So, how do we derive English (or some other language) from this byte of binary?
It’s simple. You need a lookup table.
In most computer systems, we use a lookup table known as an ASCII table. ASCII is an acronym that stands for American Standard Code for Information Interchange.
Here’s an example:
That table is particularly nice in that it includes not only decimal, but also binary, octal (base-8), and hexadecimal (base-16). Many tables are abbreviated and only include the decimal values.
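You can generate a few rows of such a table yourself. A small Python sketch (the language choice is mine, not the table’s):

```python
# Print a few rows of an ASCII-style lookup table: the character, then its
# code in decimal, binary, octal, and hexadecimal.
for ch in "ABC":
    code = ord(ch)  # ord() returns the character's ASCII code
    print(ch, code, format(code, "08b"), format(code, "o"), format(code, "x"))
```

This prints `A 65 01000001 101 41`, `B 66 01000010 102 42`, and `C 67 01000011 103 43` — four different notations for the same underlying value in each row.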
How do we work with binary?
Let’s take a look at how binary converts to something we all know: the decimal system. The first thing you have to understand is how the “weight” of the digits increases as we move from right to left. In the decimal system, the base is 10, so each digit’s weight increases by a power of 10 as we move from right to left. The number 128, for example, has an “8” in the “ones” slot, a “2” in the “tens” slot, and a “1” in the “hundreds” slot. 100 + 20 + 8 = 128.
With me so far? We can put this into a handy table as follows. The top row is the “weight” of the digit. Notice how the weight increases by a power of 10 as we move from right to left.

Weight | 100 | 10 | 1
Digit  |   1 |  2 | 8
Easy so far, right? This is stuff we learned back in elementary school.
In binary, since the system is base-2, the numbers increase in “weight” by powers of 2 as we move from right to left. Let’s take our binary number from above, 01001110, and plug it into a handy table. The top row is the “weight” of the digit in the decimal system. Notice how the “weight” doubles from right to left.

Weight | 128 | 64 | 32 | 16 | 8 | 4 | 2 | 1
Bit    |   0 |  1 |  0 |  0 | 1 | 1 | 1 | 0
If we were to add one more binary digit to the left of this table, its “weight” would be “256”. After that, “512”, and so on.
Using the above table, we can easily convert the binary to decimal: 64 + 8 + 4 + 2 = 78.
So, the binary number 01001110 is the same as 78 in the decimal system.
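That right-to-left walk is mechanical enough to sketch in a few lines of Python (again, my choice of language, not anything inherent to binary):

```python
# Convert a binary string to decimal by summing each bit times its weight,
# walking right to left and doubling the weight at each step.
bits = "01001110"
total = 0
weight = 1
for bit in reversed(bits):
    total += int(bit) * weight
    weight *= 2
print(total)          # 78
print(int(bits, 2))   # 78 -- Python's built-in base-2 parser agrees
```

The hand-rolled loop and the built-in `int(bits, 2)` produce the same 78, which is a nice sanity check that the weight table above is doing exactly what the computer does.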
Where’s our binary message from outer space?
In order to get there, we’d have to take the ASCII table above (or some other lookup mechanism of your devising that an alien civilization would plausibly know about) and combine it with our binary digit conversion (01001110 = 78) to arrive at the capital-letter “N”.
Hopefully, you can now see why you can’t just take a random binary message and convert it into English. Too many assumptions have to be made for that to be successful. First, both the sender and receiver would have to read English. Second, both the sender and receiver would have to agree on a data interchange format. We looked at ASCII as a data interchange format, but there are many others, and computer programmers routinely create ad hoc interchange formats to meet specific needs.
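To make those assumptions concrete, here is a sketch that decodes a made-up 24-bit stream (my own example, not anything from the article’s image), assuming both sender and receiver have agreed on 8-bit chunks and the ASCII lookup table:

```python
# A made-up 24-bit stream. The decoding only works because we *assume*
# 8-bit chunks and an ASCII lookup table on both ends.
stream = "010011100100000101010100"
chunks = [stream[i:i + 8] for i in range(0, len(stream), 8)]
decoded = "".join(chr(int(chunk, 2)) for chunk in chunks)
print(decoded)  # NAT
```

Every step bakes in an assumption: the chunk size, the lookup table, even the left-to-right reading order. Change any one of them and the same bits decode to something else entirely.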
And we haven’t even looked at how to transmit/receive binary data, such as images, video, or sound!
Before we go, I’ll leave you with this exercise. See if you can translate the binary message in the image using an ASCII table.
Additional Info (Optional Reading)
[Update – July 3, 2019]
Even if all of the above is in place (you have a lookup table that both you and your alien civilization would plausibly agree on, you understand how to convert from binary to decimal, your alien civilization speaks English, etc.), you can’t begin translating the message until you know the “word size” of the message. In our example here, I used a word size of 8 bits, also known as 1 byte. There’s nothing special about that number outside of human-built computers. Not to the universe, at least. So, you and your alien civilization would have to arrive at the same word size before any communication could begin, or else you’ll be plugging the values into the wrong columns on your table.
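Here is what getting the word size wrong looks like in practice, using a made-up 24-bit stream of my own and two candidate word sizes:

```python
# Slice one made-up 24-bit stream with two different word sizes.
stream = "010011100100000101010100"
for size in (8, 6):
    words = [int(stream[i:i + size], 2) for i in range(0, len(stream), size)]
    print(size, words)
# A word size of 8 yields [78, 65, 84]; a word size of 6 yields
# [19, 36, 5, 20] -- entirely different numbers from the same bits.
```

Same bits, different word size, completely different message. Until both parties agree on the chunking, there is nothing to translate.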
If you’ve been around computers for any length of time, you’ve likely heard the expression “32-bit operating system” or “64-bit processor,” etc. That number is telling you the “word size” of the computer in question. A 32-bit computer can process 32 bits of information at one time; double that for a 64-bit computer. In these systems, the binary number carries a host of information, from the “sign” of a number (positive or negative), to computer “op codes” (numbers that trigger certain operations within the processor), to memory addresses, to values of all different kinds (characters, floating-point values, integers, etc.).
“So, modern computers have moved from 32-bit systems to 64-bit systems because they’re faster that way, right?”
Yes. But, more than that, you’ve learned enough about binary now to understand that a 32-bit system can’t refer to (or “address”, as we call it) as much memory as a 64-bit system, because a 64-bit system has literally twice the digits! If the address on your house, for example, only used 3 digits, then the maximum number you could fit in your address would be 999. If you used 4 digits, it would be 9999, and so on.
“Right, so a 64-bit system can address twice the memory that a 32-bit system can address?”
No. The increase is exponential. Because math. A 32-bit system can address a maximum of 4 Gigabytes of RAM (Random Access Memory). A 64-bit system, on the other hand, can theoretically address 17 billion Gigabytes (we’ll call that 16 Exabytes) of memory!
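The “because math” part is easy to verify yourself. In Python:

```python
# Maximum number of addressable bytes for 32-bit vs. 64-bit addresses.
print(2 ** 32)             # 4294967296 bytes = 4 GiB
print(2 ** 64)             # 18446744073709551616 bytes, roughly 16 exabytes
print(2 ** 64 // 2 ** 32)  # the 64-bit space is over 4 billion times larger
```

Doubling the number of address digits doesn’t double the address space; it squares it. That’s the difference between twice as many houses on the street and a whole new universe of streets.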
Yes, I was cautious and said “theoretically”. That’s not because the numbers are wrong. It’s because we will have moved on to other architectures long before our software requires 16 Exabytes of RAM. Computers have other problems to address before they’ll be able to work with that much data. For example, over the fastest USB connection we currently have at our disposal (USB-C at both ends of the connection), it would take 174,980 days to transfer 16 Exabytes of data. That’s 479 years. The current worldwide average internet speed is 22.82 Mbps (megabits per second) according to Ookla, one of the standard speed testing companies out there. We’ll call it 23 Mbps to make the math easier. At that internet speed, it would take 77,904,206 days to download that 16 Exabyte file. That’s 213,436 years.
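A back-of-the-envelope version of that download calculation looks like this; the exact figure shifts depending on whether you count 16 exabytes as a power of 10 or a power of 2 and what overheads you assume, so treat it as an order-of-magnitude sketch rather than a definitive number:

```python
# Rough estimate: years to move 16 exabytes at ~23 Mbps.
data_bits = 16 * 10 ** 18 * 8   # 16 EB expressed in bits (decimal exabytes)
speed_bps = 23 * 10 ** 6        # 23 megabits per second
seconds = data_bits / speed_bps
years = seconds / (86400 * 365)
print(round(years))             # hundreds of thousands of years
```

However you slice the assumptions, the answer lands in the hundreds of thousands of years, which is the whole point.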
So, while it’s theoretically possible to work with 16 Exabytes of data on a consumer computer system, it won’t be even remotely practical until technology and infrastructure advance by several generations.