Among the many technological breakthroughs of NASA’s Apollo project to land a man on the Moon was the Apollo Guidance Computer that flew onboard Apollo spacecraft. In an era when most computers were refrigerator-sized—if not room-sized—the AGC weighed only about 70 pounds. It was one of the first computers to use integrated circuits.
A team of computer historians got its hands on one of the original AGCs and restored it to working order. A member of the team, Ken Shirriff, then decided to see whether the computer could be used for bitcoin mining.
Mining is a key part of the process for maintaining bitcoin’s shared transaction ledger, or blockchain. To win the right to add a block to the blockchain, you have to solve a difficult problem: finding a block whose SHA-256 hash starts with a minimum number of zeros. The only known way to accomplish this is by brute force: miners create a block with a random nonce and compute its hash value. If the hash value doesn’t have enough leading zeros, the miner changes the nonce and tries again.
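That brute-force loop can be sketched in a few lines of Python. This is not Shirriff's AGC code; the header bytes and difficulty below are made up for illustration, and real bitcoin actually compares the double SHA-256 hash against a numeric target, which "enough leading zeros" approximates:

```python
import hashlib

def mine(header: bytes, leading_zeros: int) -> int:
    """Try successive nonces until the block's double SHA-256 hash
    (in hex) starts with the required number of zeros."""
    nonce = 0
    while True:
        data = header + nonce.to_bytes(4, "little")
        # Bitcoin hashes the block header with SHA-256 twice.
        digest = hashlib.sha256(hashlib.sha256(data).digest()).hexdigest()
        if digest.startswith("0" * leading_zeros):
            return nonce
        nonce += 1

# A toy difficulty of 2 hex zeros succeeds after a few hundred tries
# on average; the ~18 zeros the real network demands is what makes
# mining so expensive.
nonce = mine(b"toy block header", 2)
```

Each extra required zero multiplies the expected number of tries by 16, which is why difficulty can be tuned so finely.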
The required number of zeros is automatically adjusted so that the network produces a new block once every 10 minutes, on average. Currently, a block's hash needs around 18 leading zeros (in its hexadecimal representation) to be accepted by the network, which translates to around 10²² trials to find a valid block.
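The 10²² figure follows directly from the difficulty: each hex digit of a hash takes one of 16 values, so only about one hash in 16¹⁸ starts with 18 zeros. A quick back-of-the-envelope check:

```python
# Each required hex digit is a 1-in-16 event; 18 digits must all be zero.
trials = 16 ** 18
print(f"{trials:.2e}")  # about 4.7e21, i.e. on the order of 10**22
```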
Today, most bitcoin mining is done using specialized hardware capable of computing trillions of hashes per second. Shirriff's software for the Apollo Guidance Computer was quite a bit slower than that: each bitcoin hash calculation took about 10 seconds.
The Apollo Guidance Computer isn’t a very good bitcoin miner
“The computer is so slow that it would take about a billion times the age of the universe to successfully mine a bitcoin block,” Shirriff wrote.
This mostly reflects 50 years of progress in computing hardware. Thanks to Moore’s law, modern chips have vastly more transistors and can operate at much higher clock rates. Custom mining ASICs can compute a huge number of hashes in parallel.
But Shirriff also had to struggle with idiosyncrasies of the AGC that made it a poor fit for bitcoin mining. For example, the AGC used a 15-bit word, in contrast to modern computers that generally use 32- or 64-bit words. The SHA-256 algorithm performs a lot of 32-bit operations, so Shirriff had to split each 32-bit integer into three pieces—a 4-bit piece and two 14-bit pieces—and perform calculations on them separately.
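In Python, the decomposition Shirriff describes might look like the sketch below (this is our illustration, not his AGC assembly; the function names are invented):

```python
def split32(x: int) -> tuple[int, int, int]:
    """Break a 32-bit value into a 4-bit piece and two 14-bit pieces,
    each small enough to fit in a 15-bit AGC word."""
    return (x >> 28) & 0xF, (x >> 14) & 0x3FFF, x & 0x3FFF

def join32(hi: int, mid: int, lo: int) -> int:
    """Reassemble the three pieces into the original 32-bit value."""
    return (hi << 28) | (mid << 14) | lo

# Round-tripping recovers the original value.
assert join32(*split32(0xDEADBEEF)) == 0xDEADBEEF
```

Every 32-bit addition, AND, or XOR in SHA-256 then becomes three narrower operations, with carries propagated between the pieces by hand.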
The AGC also lacked the shift and rotate instructions that are standard on modern computers—and heavily used in a SHA-256 calculation—forcing Shirriff to write subroutines to perform these operations.
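SHA-256 leans heavily on 32-bit rotates like the one below. On a modern machine this is a single instruction; on the AGC, each such one-liner had to become a hand-written subroutine built from the instructions the machine did have. A reference version in Python:

```python
def rotr(x: int, n: int) -> int:
    """32-bit rotate right: the bits shifted off the low end
    wrap around to the high end."""
    return ((x >> n) | (x << (32 - n))) & 0xFFFFFFFF

# Rotating right by 7 and then by 25 (7 + 25 = 32) is a full circle.
assert rotr(rotr(0xDEADBEEF, 7), 25) == 0xDEADBEEF
```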
The AGC’s limited memory was also a handicap:
The AGC, like most computers of the 1960s, used magnetic core memory, storing each bit in a tiny magnetized ferrite ring. Since core memory was fairly bulky, the AGC had just 2K words (approximately 4K bytes) of RAM. The AGC’s addressing scheme made things more complicated since you could only access 256 words unless you used an inconvenient bank-switching mechanism. The problem is that the SHA-256 algorithm uses eight (32-bit) hash values, a 64-word message table, and 8 words of intermediate values. These three arrays alone used up 240 AGC words, leaving about 16 words for everything else (temporary values, subroutine return addresses, loop counters, pointers, etc.) I managed to get everything to fit in one bank by reusing these 16 words for multiple purposes, but I spent a lot of time debugging problems when a variable clobbered a location still in use.
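The quoted 240-word figure checks out once you remember that every 32-bit quantity costs three 15-bit AGC words under the 4/14/14-bit split Shirriff describes:

```python
WORDS_PER_32BIT = 3  # a 4-bit piece plus two 14-bit pieces

hash_values = 8 * WORDS_PER_32BIT        # eight 32-bit hash values
message_schedule = 64 * WORDS_PER_32BIT  # 64-entry message table
intermediates = 8 * WORDS_PER_32BIT      # eight intermediate values

total = hash_values + message_schedule + intermediates
print(total, 256 - total)  # 240 words used, 16 left in the 256-word bank
```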
This is not the first time Shirriff has mined bitcoin on ancient hardware. A few years back, he implemented bitcoin mining on an IBM 1401 computer from the mid-1960s; that machine was even slower than the AGC, taking 80 seconds to compute a single hash. He also programmed a 1970s Xerox Alto to mine bitcoin, at a rate of 1.5 hashes per second.
https://arstechnica.com/?p=1533085