Categories
Hardware Programming

Supplementary UC Baseline notes #1: The connection between binary and hexadecimal numbers

For the benefit of my classmates in the UC Baseline program (see this earlier post to find out what it’s about), I’m posting a regular series of notes here on Global Nerdy to supplement the class material. As our instructor Tremere said, what’s covered in class merely scratches the surface, and we should use it as a launching point for our own independent study.

Photo: A slide showing 4 rows of 8 lightbulbs displaying different binary values. Inset in the lower right corner: UC Baseline instructor Tremere lecturing.
The “binary numbers” portion of day 1 at UC Baseline. Tap to see at full size.

There was a lot of introductory material to cover on day one of the Hardware 101 portion of the program, and there’s one bit of basic but important material that I think deserves a closer look, especially for my fellow classmates who’ve never had to deal with it before: How binary and hexadecimal numbers are related.

The problem with binary
(for humans, anyway)

Consider the population of Florida. According to the U.S. Census Bureau, on July 1, 2019, that number was estimated to be 21,477,737 in base 10, a.k.a. the decimal system.

Here’s the same number, expressed in base 2, a.k.a. the binary system: 1010001111011100101101001.

That’s the problem with binary numbers: Because they use only two digits, 0 and 1, they grow in length extremely quickly, which makes them hard for humans to read. Can you tell the difference between 100000000000000000000000 and 1000000000000000000000000? Be careful, because those two numbers are significantly different — one is twice the size of the other!

(Think about it: In the decimal system, you make a number ten times as large by tacking a 0 onto the end. For the exact same reason, tacking a 0 onto the end of a binary number doubles that number.)
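
If you’d like to see that doubling in action, here’s a minimal Python sketch (Python is used later in this post, so I’ll stick with it for the examples I’ve added):

# Tacking a 0 onto the end of a binary number is a left shift,
# which doubles the value -- just as tacking a 0 onto a decimal
# number multiplies it by 10.
n = 0b1010            # decimal 10
print(n << 1)         # 20
print(bin(n << 1))    # 0b10100 -- the same digits with a 0 tacked on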

Hexadecimal is an easier way to write binary numbers

Once again, the problems are that:

  • Binary numbers, because they use only two digits — 0 and 1 — get really long really quickly, and
  • Decimal numbers don’t convert easily to binary.

What we need is a numerical system that:

  • Can represent really big numbers with relatively few characters, and
  • Converts easily to binary.

Luckily for us, there’s a numerical system that fits this description: Hexadecimal. The root words for hexadecimal are hexa (Greek for “six”) and decimal (from Latin for “ten”), and it means base 16.

Using 4 binary digits, you can represent the numbers 0 through 15:

Decimal Binary
0 0000
1 0001
2 0010
3 0011
4 0100
5 0101
6 0110
7 0111
8 1000
9 1001
10 1010
11 1011
12 1100
13 1101
14 1110
15 1111

Hexadecimal is the answer to the question “What if we had a set of digits that represented the 16 numbers 0 through 15?”

Let’s repeat the above table, this time with hexadecimal digits:

Decimal Binary Hexadecimal
0 0000 0
1 0001 1
2 0010 2
3 0011 3
4 0100 4
5 0101 5
6 0110 6
7 0111 7
8 1000 8
9 1001 9
10 1010 A
11 1011 B
12 1100 C
13 1101 D
14 1110 E
15 1111 F

Hexadecimal gives us easier-to-read numbers where each digit represents a group of 4 binary digits. Because of this, it’s easy to convert back and forth between binary and hexadecimal.

Since we’re creatures of base 10, we have single characters for the digits 0 through 9, but no single characters for 10, 11, 12, 13, 14, and 15, which are digits in hexadecimal. To work around this problem, hexadecimal uses the first 6 letters of the Roman alphabet: A, B, C, D, E, and F.
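
If you’d like to experiment with these conversions yourself, here’s a minimal Python sketch using the built-in int() and format() functions, with the Florida population figure from earlier in this post:

# Convert between decimal, binary, and hexadecimal in Python.
n = 21477737                                 # Florida's estimated population
print(format(n, 'b'))                        # binary: 1010001111011100101101001
print(format(n, 'x'))                        # hexadecimal: 147b969
print(int('147b969', 16))                    # back to decimal: 21477737
print(int('1010001111011100101101001', 2))   # also 21477737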

Let’s try representing a decimal number in binary, and then hexadecimal. Consider the number 49,833, which is the two-byte UTF-8 encoding of ©, the copyright symbol, read as a single 16-bit number. Here’s its representation in binary:

1100001010101001

That’s a hard number to read, and if you had to manually enter it, the odds are pretty good that you’d make a mistake. Let’s convert it to its hexadecimal equivalent.

We do this by first breaking that binary number into groups of 4 bits (remember, a single hexadecimal digit represents 4 bits, and “bit” is a portmanteau of “binary digit”):

1100     0010     1010     1001

Now let’s use the table above to look up the hexadecimal digit for each of those groups of 4:

1100     0010     1010     1001
C        2        A        9

There you have it:

  • The decimal representation of the number is 49,833,
  • its binary representation is 1100001010101001,
  • in hexadecimal, it’s C2A9,
  • and when you interpret those two bytes as UTF-8 text, you get this: ©
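
Here’s a quick Python sanity check of all of the above:

# All three notations name the same number.
print(int('1100001010101001', 2))            # 49833
print(0xC2A9)                                # 49833
# The place values agree: C2A9 = 12*4096 + 2*256 + 10*16 + 9
print(12 * 4096 + 2 * 256 + 10 * 16 + 9)     # 49833
# And the two bytes 0xC2 0xA9, decoded as UTF-8, are the copyright symbol:
print(bytes([0xC2, 0xA9]).decode('utf-8'))   # ©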

How to indicate if you’re writing a number in decimal, binary, or hexadecimal form

Because we’re base 10 creatures, we simply write decimal numbers as-is:

49,833

To indicate that a number is in binary, we prefix it with the number zero followed by a lowercase b:

0b1100001010101001

This is a convention used in many programming languages. Try it for yourself in JavaScript:

// This will print "49833" in the console
console.log(0b1100001010101001)

Or if you prefer, Python:

# This will print "49833" in the console
print(0b1100001010101001)

To indicate that a number is in hexadecimal, we prefix it with the number zero followed by a lowercase x:

0xC2A9

Once again, try it for yourself in JavaScript:

// Both of these will print "49833" in the console
console.log(0xc2a9)
console.log(0xC2A9)

Or Python:

# Both of these will print "49833" in the console
print(0xc2a9)
print(0xC2A9)

Common groupings of binary digits and their hexadecimal equivalents

4 bits: A half-byte, tetrade, or nybble

A single hexadecimal digit represents 4 bits, and my favorite term for a group of 4 bits is nybble. The 4 bits that make up a nybble can represent the numbers 0 through 15.

“Nybble” is one of those computer science-y jokes, based on the fact that a group of 8 bits is called a byte. It’s also spelled nibble, and I’ve seen the terms half-byte and tetrade used as well.
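
Since each hexadecimal digit is a nybble, you can pull a byte apart into its two nybbles with a shift and a mask. A minimal Python sketch:

# Extract the high and low nybbles of a byte.
b = 0xC2                  # 1100 0010 in binary
high = (b >> 4) & 0xF     # shift the top 4 bits down: 1100 -> 12 (hex C)
low = b & 0xF             # mask off the bottom 4 bits: 0010 -> 2
print(high, low)                             # 12 2
print(format(high, 'x'), format(low, 'x'))   # c 2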

8 bits: A byte

Two hexadecimal digits represent 8 bits, and a group of 8 bits is called a byte. The 8 bits that make up a byte can represent the numbers 0 through 255 (unsigned), or the numbers -128 through 127 (signed).

In the era of the first general-purpose microprocessors, the data bus was 8 bits wide, and so the byte was the standard unit of data. Every character in the ASCII character set can be expressed in a single byte. Each of the 4 numbers in an IPv4 address is a byte.
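
All the ranges quoted in this section and the ones that follow come from the same two formulas: n bits can represent the numbers 0 through 2^n − 1 unsigned, or −2^(n−1) through 2^(n−1) − 1 in two’s complement signed form. A quick Python sketch that prints them all:

# Unsigned range for n bits: 0 through 2**n - 1
# Signed (two's complement) range: -2**(n-1) through 2**(n-1) - 1
for n in (4, 8, 16, 32, 64):
    print(f"{n:2} bits: unsigned 0..{2**n - 1}, "
          f"signed {-2**(n - 1)}..{2**(n - 1) - 1}")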

16 bits: A word

Four hexadecimal digits represent 16 bits, and a group of 16 bits is most often called a word. The 16 bits that make up a word can represent the numbers 0 through 65,535 (one less than 65,536, the quantity usually called “64K”), or the signed numbers -32,768 through 32,767.

If you were computing in the late ’80s or early ’90s — the era covered by Windows 1 through 3 or Macs in the classic chassis — you were using a 16-bit machine. That meant that it stored data a word at a time.

32 bits: A double word or DWORD

Eight hexadecimal digits represent 32 bits, and a group of 32 bits is often called a double word or DWORD; I’ve also heard the unimaginative term “32-bit word”. The 32 bits that make up a double word can represent the numbers 0 through 4,294,967,295 (one less than 4,294,967,296, the quantity often called “4 gigs”), or the signed numbers −2,147,483,648 through 2,147,483,647.

32-bit processors date back to the 1980s, but 32-bit operating systems and computers became mainstream in the mid-1990s. Some are still in use today, although they’d now be considered older or “legacy” systems.

The IPv4 address system uses 32 bits, which means that it can represent a maximum of 4,294,967,296 internet addresses. That’s fewer addresses than there are people on earth, and as you might expect, we’re running out of them. There are all manner of workarounds, but the real solution is for everyone to switch to IPv6, which uses 128 bits, allowing for about 3.4 × 10³⁸ addresses: enough to assign 100 addresses to every atom on the surface of the earth.
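
Since an IPv4 address is just a 32-bit number written as four bytes, you can convert the familiar dotted form into the underlying number by shifting each byte into place. A minimal Python sketch (the ipv4_to_int helper and the address below are just for illustration):

# An IPv4 address is four bytes packed into one 32-bit number.
def ipv4_to_int(address):
    a, b, c, d = (int(part) for part in address.split('.'))
    return (a << 24) | (b << 16) | (c << 8) | d

print(ipv4_to_int('192.168.0.1'))        # 3232235521
print(hex(ipv4_to_int('192.168.0.1')))   # 0xc0a80001 -- one byte per number
print(2 ** 32)                           # 4294967296 possible addresses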

64 bits: A quadruple word or QWORD

Sixteen hexadecimal digits represent 64 bits, and a group of 64 bits is often called a quadruple word, quad word, or QWORD; I’ve also heard the unimaginative term “64-bit word”. The 64 bits that make up a quad word can represent the numbers 0 through 18,446,744,073,709,551,615 (about 18.4 quintillion), or the signed numbers -9,223,372,036,854,775,808 through 9,223,372,036,854,775,807 (roughly ±9.2 quintillion).

If you have a Mac and it dates from 2007 or later, it’s probably a 64-bit machine. macOS has supported 32- and 64-bit applications, but from macOS Catalina (which came out in 2019) onward, it’s 64-bit only. As for Windows-based machines, if your processor is an Intel Core 2/i3/i5/i7/i9 or AMD Athlon 64/Opteron/Sempron/Turion 64/Phenom/Athlon II/Phenom II/FX/Ryzen/Epyc, you have a 64-bit processor.

Need more explanation?

The Khan Academy has a pretty good explainer of the decimal, binary, and hexadecimal number systems:

Categories
Hardware Process Tampa Bay What I’m Up To

Scenes from Day 3 of the “UC Baseline” cybersecurity program at The Undercroft

Wednesday: Day 3 continued the heavy hands-on portion of Hardware 101, the first segment of my five weeks at UC Baseline, the cybersecurity training program offered by Tampa Bay’s security guild, The Undercroft.

After taking apart and reassembling a desktop, it was time to up the ante and do the same with at least one laptop. I started with a Dell Latitude E5500, a bulky beast by today’s laptop standards, but one that’s more user-serviceable — and more easily taken apart — than most.

First step: Removing the battery.

The bottom panel was easy to pop open. It was held in place by nothing fancier than standard Phillips screws, which provided easy access to the RAM.

Next on the removal list: The optical drive. Once again, pretty straightforward — remove some anchoring screws, and then use a flathead screwdriver tip to push the drive casing out.

The fan was quite easy to remove, as was the CPU heat sink.

Unlike the previous day’s desktop machines’ CPUs, which were in ZIF (zero insertion force) sockets, laptop CPUs aren’t typically swappable, as they’re generally soldered onto the motherboard. This machine had a notebook-grade Core 2 Duo, which was typical for a mid-level laptop in the Windows 7 era.

It was also pretty easy to remove the keyboard…

…and once that was done, detaching the screen was a simple process.

With the disassembly complete, I laid out and labeled the parts that I’d extracted:

“All right, next challenge,” said Tremere, our instructor for the Hardware 101 portion of the course. “Disassemble, then reassemble the small one…”

I flipped it over, pleasantly surprised to see standard Phillips screws that were easy to access:

At this size, a laptop’s battery-to-actual-computer ratio jumps significantly:

This machine was still intended to be somewhat user-serviceable, so the battery and RAM were still easy to remove:

The drive didn’t take much effort to liberate, either:

The fan/heat sink combo didn’t put up much of a fight:

This is a machine made specifically for writing TPS reports and not much else, judging from its CPU. Still, I’m sure it could do a serviceable job running a modern lightweight Linux — assuming it survives my disassembly and subsequent attempt to put it back together again.

Here are both patients, spread out across the operating table…

Re-assembly took a little longer, and I didn’t bother with photos of that process. I did manage to get it back together again, and with no extra parts!

I even got the screen reattached! Later, I found a power adapter, and the machine managed to start and get to the BIOS screen, although the display looked a little dim. Since I’m not trying out for a CompTIA hardware certificate, I’ll simply declare the procedure a success and not get too bogged down with fussy minutiae such as “functioning” and “usable”.

Categories
Hardware Process Tampa Bay What I’m Up To

Scenes from Day 2 of the “UC Baseline” cybersecurity program at The Undercroft

Photo: A red brick building with a wrought iron balcony in a neighborhood of early 1900s brick buildings.
The Undercroft’s building, as seen from its parking lot. Tap to see at full size.

Tuesday was Day 2 of the UC Baseline cybersecurity training program offered by Tampa Bay’s security guild, The Undercroft. I lucked out and got into the inaugural cohort, which means that I’ll spend 8 hours each business day in the classroom (masked and distanced, of course) for the next five weeks.

UC Baseline is made up of a number of separate units, which The Undercroft also offers individually. Week 1 is taken up by the Hardware 101 course, which is all about giving the class — some of whom have a deep technical background, while others don’t — a baseline knowledge of how the machines that make up the systems we’re trying to secure actually work.

I suspect that there’s an additional goal of removing any fear of tinkering.

Day 1 of Hardware 101 was mostly lectures about hardware, starting with logic gates and working all the way up to CPUs and SOCs, and Days 2 and 3 were the “tear down/rebuild” days. Day 2 focused on taking apart and then rebuilding desktops, and Day 3 took it up a notch by doing the same thing with laptops.

One of the goodies that we got (and get to keep) is the toolkit pictured below:

The first exercise was teardown-only. We could choose from a selection of old computers at the back of the room to tear apart, and I picked this old Power Mac G5 from the mid-2000s. These machines are notoriously opaque, and I thought it might be fun to dig through its guts:

The Power Mac G5 was aimed at Apple’s “power user” customer — typically creatives who need serious computing horsepower. This particular machine was used by an advertising agency to do 3D rendering. As such, it’s one of the few Macs that’s easy to open, at least superficially. Take a look at this beautiful Jony Ive-designed latch:

Opening the latch reveals the machine’s aesthetically-pleasing innards, which were covered by a plastic shield. I popped off the shield and got to work.

By the way, that yellow clip in the photo above is connected to my anti-static wrist harness (another goodie we got as part of the course fee). Nobody expected these machines to survive the teardown process, but it never hurts to consistently follow standard safe electronics practices!

The fans slid out surprisingly easily. The machine had a considerable number of them, which surprised me given Steve Jobs’ famous dislike of fan noise, but this computer’s twin G5 processors gave off ridiculous amounts of heat. There’s a reason that Apple switched to Intel processors.

I then removed the cards from the two expansion slots. One was a high-speed network card; the other was a pretty nice 2005-era graphics card:

Next up: The RAM!

After that, I removed the AirPort Extreme wireless NIC, freeing it from both the PCIe slot and its antenna wire:

That took care of the easy part. Time for a photo op:

Here’s what I’d yanked out so far. Note my screw management technique!

And now the hard part: getting to the processors. They’re encased in a pretty anodized aluminum box, and it turned out that the only way into it was to break the “warranty pin” — a plastic pin that acts as proof that a non-Apple-authorized person took a peek inside:

Behind the G5 door were the twin processors and their twin heat sinks:

I finished the teardown by identifying the components I’d extracted.

It was then time to move on to the next patient, a “TPS Reports”-writing desktop computer that we would have to disassemble and reassemble:

This is the kind of machine whose innards a mid-size office’s IT department would regularly need to access, so it opens easily:

Modern computers largely fit together like Lego pieces. Even so, I kept notes on which cables went where.

Here, I’ve relieved the machine of its power supply and optical drive. It was missing a hard drive, so I retrieved one of the spares from the back of the room:

The final part of the assignment: Identify and retrieve the processor. It’s fairly obvious:

Here’s the processor, without the heat sink obscuring it. It’s an AMD Athlon II, which dates from around 2009 / 2010, when Windows 7 was a new thing:

The processor sat in a ZIF (zero insertion force) socket, which makes it easy to remove and then re-seat:

Look at all those pins. We’re a long way from my first processor, the 6502, which had only 40 pins.

Rebuild time! The machine had no RAM, so I grabbed two sticks from the back of the room and inserted them into the primary slots, then put the rest of the machine back together again:

The final test — does it power up?

Success! A quick attachment to a monitor and keyboard showed an old Windows screen. Not bad for my first teardown/reassembly.

Categories
Current Events Hardware

Understanding Apple silicon (lots of videos)

Yesterday, I posted an article positing that WeWork’s CEO might just be indirectly and accidentally responsible for drastically changing the processor industry:

What if WeWork’s jamoke CEO accidentally changed the processor industry?

The article got a record number of pageviews, and I got a number of emails and direct messages asking all sorts of questions about Arm chips, from “What makes Arm processors so different?” to “Has anyone seen an Arm-based Mac in action yet?”

Here are some videos that should provide lots of background material to better help you understand Arm chips and Apple’s move to their own custom silicon.

Let’s start with this CNET supercut of the parts of the WWDC keynote where Tim Cook and company talk about Apple’s transition from Intel chips to their own Arm-based ones:

This is Max Tech’s best guess as to what the Arm-based Mac release timeline will look like:

Many people have a take on what Apple’s move to Arm means. Here are CNET’s top 5 guesses:

Here’s a video from a year ago that asks “Is Intel in trouble? Is ARM the future?”. It’s worth watching for its history lesson about Arm:

Here’s a really quick (under 6 minutes) look at Arm CPUs:

Here’s a more hardcore explanation of how CPUs (in general) work:

CPUs used to be stand-alone things, but we’ve been migrating to SOCs (systems on a chip) for some time. Here’s an explainer:

This Gary Explains video explains the differences between Arm’s and Intel’s architectures:

Here’s a reminder from Computerphile that Arm designs chips — they don’t make them. There’s a difference:

Here’s a treat: an unboxing of Apple’s “developer transition kit”, which registered Apple developers can apply for to test their apps on Apple silicon. It’s a Mac Mini powered by an Apple A12Z chip, the same processor that drives the iPad Pro.

Categories
Current Events Hardware

What if WeWork’s jamoke CEO accidentally changed the processor industry?

 

In order to understand this story, you need to be aware of this news item: SoftBank is considering selling outright, selling part of, or making a public offering of Arm, the British chip design firm behind the chips that power just about every smartphone, a whole lot of IoT devices (including the Raspberry Pi), a fair share of Chromebooks, and soon, Apple’s computers.

SoftBank is considering this move because it needs the money. It has an activist investor that wants to see some changes, because it’s made some embarrassing investments leading to considerable losses of both money ($16.5 billion for the financial year ending March 2020) and face.

One of those embarrassing losses is the fault of Adam Neumann, cofounder and CEO of WeWork, and the jamoke pictured at the top of this article. You may remember the story from last year, where the company — effectively a Regus pretending to be a Netflix — had to delay its IPO due to concerns about its pretend profitability and flaky, cult-of-personality non-leadership.

These concerns led investors to take a closer look at the company’s numbers and at Neumann’s aberrant behavior and business dealings. This in turn led to Neumann stepping down as CEO in September, and to SoftBank taking control of its investment and paying Neumann $1.7 billion to leave the board.

Simply put, Neumann’s hijinks cost SoftBank a lot of money, and it now has an investor putting serious pressure on it to sell off assets to raise cash. Arm could be one of those assets.

At the same time, there are a number of interesting developments where Arm chips are concerned…

At WWDC 2020, Apple announced that they were moving their computers off Intel x86 chips, whose notoriously bad design is really showing its age these days, and onto their own custom Arm-based chips. (Arm has “standard” chip designs, but if you’re a big player, you can work with them to have them design custom chips for you.) The Arm-based processors in the current line of iPhones run circles around not just the processors in Samsung’s flagship phones, but also those in most laptops.

Any talk about what Arm chips will mean for Apple is all speculation right now, but if you want to hear some really good speculation, as well as a decent Arm vs. Intel discussion, check out episode 777 of This Week in Tech:

In that episode of This Week in Tech, host Leo Laporte and his panel agree that Windows PC OEMs will probably end up switching to Arm processors, and they’re not the only ones saying it.

Arm also had a moment in the sun on the supercomputing front: the new holder of the title of “world’s fastest supercomputer”, the Fugaku, is powered by Arm chips.

There’s a pretty good chance that Arm will end up being the de facto chip design to rule them all in the 2020s — and the company behind it is up for sale. In fact, there’s an unnamed interested buyer. I have a guess, and I’m not the only person to have the same idea:

(In case you’re wondering: Apple had $245 billion in its cash reserves last year, and SoftBank bought Arm for $32 billion in 2016.)

What do you think?

Categories
Hardware Humor

Two printer posts; one printer truth

I saw these two posts about printers this morning — one on Twitter, the other on Facebook, in a neighborhood forum where someone was asking for office equipment and furniture that people were no longer using:

I find that I use our home printer about once a year, typically for printing a letter that I need to enclose with a paper form that I’m sending via snail mail.

How often do you use your printer at home (if you have one) these days?

Categories
Current Events Hardware Players Tampa Bay

Win a System76 Thelio Linux desktop in The Mad Botter’s Fourth of July contest!

Mike Dominick’s Tampa Bay-based consultancy The Mad Botter — which develops automation/integration software — has a Fourth of July contest for high school or university undergrad students where the prize is one of System76’s gorgeous Thelio desktop Linux systems!
Mad Botter Fourth of July contest icon (Mad Botter “Bot” dressed as Uncle Sam in front of American flags, fireworks, and balloons)

This is an election year, and The Mad Botter’s contest is an election contest. Contestants are asked to develop an open source project that addresses ballot access or in some other way assists with voting. Perhaps something to help people find the closest polling station? Virtual “I voted” stickers? An aggregator for open information about candidates? A “Yelp” for polling places? (You can find more ideas here.)

Here are the contest details:

  • No purchase is required to enter.
  • Your solution must be posted to a publicly accessible GitHub repository with the appropriate license included.
  • You must be a US high-school or undergraduate college student.
  • If you are below the age of 18, you must provide written parental consent to have your submission considered; this can be done via email.
  • In the event that you win, The Mad Botter INC is granted the right to post a picture of you in the winning announcement and other applicable venues; if you are below the age of 18 your parent or guardian also provides permission for this by consenting to your entering the contest.
  • The winning entry will be the one that shows the most practical potential and creativity and will be selected by The Mad Botter team.
  • All submissions should be sent to sales@themadbotter.com and include a brief bio, an explanation of the solution, and a link to the GitHub repository.
  • Submissions will be accepted until 9/1/2020.

You can find out more at The Mad Botter’s Fourth of July contest page.

Also worth checking out

Mike has a podcast, The Mike Dominick Show, which covers technology and open source.

I was a recent guest on the show (Episode 25), and we talked about how the Toronto tech scene changed from dismal to dynamic, how I stumbled into developer evangelism, learning iOS programming via raywenderlich.com and then joining them, SwiftUI, Python and Burning Man, the hidden opportunities that come with having to stay inside during the pandemic, and more!