Long Before Trees Overtook the Land, Earth Was Covered by Giant Mushrooms

24 feet tall and three feet wide, these giant spires dotted the ancient landscape

Digging up a Prototaxites fossil
(University of Chicago)



From around 420 to 350 million years ago, when land plants were still the relatively new kids on the evolutionary block and “the tallest trees stood just a few feet high,” giant spires of life poked from the Earth. “The ancient organism boasted trunks up to 24 feet (8 meters) high and as wide as three feet (one meter),” said National Geographic in 2007. With the help of a fossil dug up in Saudi Arabia, scientists finally figured out what the giant creature was: a fungus. (We think.)

The towering fungus spires would have stood out against a landscape scarce of such giants, said New Scientist in 2007.

“A 6-metre fungus would be odd enough in the modern world, but at least we are used to trees quite a bit bigger,” says Boyce. “Plants at that time were a few feet tall, invertebrate animals were small, and there were no terrestrial vertebrates. This fossil would have been all the more striking in such a diminutive landscape.”

Fossils of the organisms, known as Prototaxites, had peppered the paleontological findings of the past century and a half, ever since they were first discovered by a Canadian in 1859. But despite the fossil records, no one could figure out what the heck these giant spires were. The University of Chicago:

For the next 130 years, debate raged. Some scientists called Prototaxites a lichen, others a fungus, and still others clung to the notion that it was some kind of tree. “The problem is that when you look up close at the anatomy, it’s evocative of a lot of different things, but it’s diagnostic of nothing,” says Boyce, an associate professor in geophysical sciences and the Committee on Evolutionary Biology. “And it’s so damn big that whenever someone says it’s something, everyone else’s hackles get up: ‘How could you have a lichen 20 feet tall?’”

That all changed in 2007 when a study came out that concluded the spires were a fungus, like a gigantic early mushroom.

But not everyone was sold on the idea that Prototaxites was an early fungus. No one’s questioning the spires’ existence—people just have trouble imagining that such a huge structure could be a fungus. Researchers trying to refute the fungus idea argued that Prototaxites spires were gigantic mats of liverworts that had somehow rolled up. But in a follow-up study, the scientists who had proposed the fungus idea doubled down on their claim. Science is messy, and despite more than a century of digging, we still don’t know for sure what these huge spires that dominated the ancient Earth really were.

But even though the spire-like mushrooms of yore—or whatever they were—are long gone, don’t feel too bad for funguskind. The largest organism on Earth, says ABC, is still a huge fungal mat, a single organism spread over 2,200 acres of forest in eastern Oregon.


Before Using Birth Control Apps, Consider Your Privacy

Score one for the quantified self-surveillance movement. Last week, the US Food and Drug Administration approved the first-ever over-the-counter digital contraceptive—a polished and almost preternaturally upbeat mobile app called Natural Cycles. Basal body temperature readings and monthly menstruation data feed into an algorithm that tells users whether or not they should be having unprotected sex. Like most forms of birth control, it’s not foolproof; the app has been dogged by reports of unwanted pregnancies that prompted two ongoing investigations by European authorities into its Swedish maker’s marketing claims.

But that hasn’t hurt Natural Cycles’ popularity.


The Land Before Binary

In the early days of computing, 1 + 1 didn’t always equal 10

In the summer of 2016, USDS assigned me to the IRS to work on a project called Online Accounts. When on assignment I tend to wander, because helping out big, complex software projects rarely involves simple, straightforward advice or solutions. You have to figure that before you showed up, the agency had probably already been through many rounds of expensive consultants, working groups, maybe even a few research and development centers. If getting things on the right track were a matter of saying "go to the cloud", someone else would have charged the government at least $350K to hear them say it.

Fortunately, bureaucracies are slow, so there is always plenty of time to explore what else is going on around a project and, from that, garner insight into why the problems directly relevant to your mission have become so intractable … and hopefully learn to solve them.

The IRS has a lot of mainframes. And as millions of Americans recently found out, many of them are quite old. So as I wandered about, meeting different types of engineers and chatting about their day-to-day blockers, I started to learn much more about how these machines worked. It was a fascinating rabbit hole that exposed me to things like “decimal machines” and “2 out of 5 code”. It revealed something to me that I had never considered:

Computers did not always use binary code.

Computers did not always use base 2. Computers did not always operate on just an on/off value. There were other things, different things, that were tried and eventually abandoned. Some of them predate the advent of electronics itself.

Here’s a little taste of some of those systems.

In West World the computers used balanced ternary arithmetic and — uh — were also made of wood

When you’re asking the question “why binary?” it’s useful to consider the humble logic gate. As a basic component of a circuit, logic gates take the boolean expressions that should be familiar to all programmers (AND, OR, NOT…) and give them physical form. A calculation can be thought of as a chain reaction, one logic gate feeding into another logic gate until pulses of voltage represented by 1s and 0s are rearranged into an end state. (For a detailed primer on logic gates, check out Brian Jenkins’s post on the basics of logic gates.)
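To make that chain reaction concrete, here is a minimal Python sketch (my own illustration, not anything from the post or the hardware it describes): gates modeled as functions and wired together into a half adder.

```python
# Minimal sketch: logic gates as Python functions, chained into a half adder.
# Illustrative only; real gates are voltage thresholds, not function calls.

def AND(a, b): return a & b
def OR(a, b):  return a | b
def NOT(a):    return 1 - a

def XOR(a, b):
    # Built from the primitives above: (a OR b) AND NOT (a AND b)
    return AND(OR(a, b), NOT(AND(a, b)))

def half_adder(a, b):
    # One gate feeds the next until the 1s and 0s settle into a sum and a carry.
    return XOR(a, b), AND(a, b)

print(half_adder(1, 1))  # (0, 1): in binary, 1 + 1 really does equal 10
```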

But what if there was a third state? 0 for off, 1 for on, 2 for … really really on? This is technically possible but difficult to implement. We often talk about binary being off/on, but in reality “off” and “on” represent a range of acceptable values on either side. Creating a stable and predictable third state is difficult. In the Soviet Union one approach to this problem was to use two vacuum tubes to represent three distinct states (00, 01, 11), but otherwise a wide variety of factors might cause the voltage to creep up or down that off/on range throughout the life of the electronics.

But what if you’re not using voltage? What if you use a medium where the distinction between states is more strictly drawn? Like, for example, if you build your logic gates out of wooden rods.

The ternary calculating machine of Thomas Fowler is one such machine. Built in 1840, six feet wide by three feet deep and one foot high, it did multiplication and division problems with a series of rods set on a raised platform. The rods could either slide directly into the middle (0 value), overhang on one side (+1), or overhang on the other side (-1), representing three distinct states the machine could clearly identify and manipulate.

Demonstration of the Fowler machine

It’s called balanced ternary. So how does it work? If we think about the conventional binary we all know, each bit represents a power of 2, and you get the value by adding up all the on bits. (FYI, I’m going to reverse the conventional order so that 2⁰ is the leftmost throughout this post.) For example:

binary representation of the number 11

With ternary you do the same thing, except you subtract negative values and add positive ones. Like so:

balanced ternary representation of number 11

The first bit is 3⁰, which is 1; 3¹ equals 3; 3² is 3*3, which equals 9. So 11 is 9 + 3 - 1. You may notice that ternary can represent this number with fewer bits than binary. You may also notice that with a -1 value on the table, you eliminate the need for a separate sign bit to determine whether a number is positive or negative.
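If you want to play with this, here is a rough Python sketch of the conversion (function names are mine); digits come out least significant first, matching the reversed order used above.

```python
# Sketch: integer <-> balanced ternary, digits {-1, 0, 1}, least significant first.

def to_balanced_ternary(n):
    digits = []
    while n != 0:
        r = n % 3
        if r == 2:            # a remainder of 2 becomes -1 plus a carry into the next digit
            digits.append(-1)
            n = n // 3 + 1
        else:
            digits.append(r)
            n //= 3
    return digits or [0]

def from_balanced_ternary(digits):
    return sum(d * 3 ** i for i, d in enumerate(digits))

print(to_balanced_ternary(11))            # [-1, 1, 1]  ->  -1 + 3 + 9 = 11
print(from_balanced_ternary([-1, 1, 1]))  # 11
```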

Once you think about including a third state in the mix, there are actually a couple of different ways of doing ternary. In addition to {-1,0,1} you could also do {0,1,2} (unbalanced ternary), at which point the number 11 would be 2-0-1 (1+1+9). Or you might prefer fractions {0, 1/2, 1}.

Addition works basically the same way it does in binary: you have a truth table spelling out which combinations of -1, 0, 1 lead to what output, plus a carry table. The main advantage is that you use fewer digits as numbers grow, so the process takes fewer steps and uses less electricity. Subtraction just means negating one number (flipping its +1s and -1s) and then adding, which is much nicer than in binary. Multiplication and division are easier mainly because you have fewer digits for each number and therefore fewer steps.
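As a hedged illustration of that truth-table-plus-carry idea, here is a toy ripple-carry adder over {-1, 0, 1} digits, along with negation, which is all subtraction needs. It uses the same digit convention as the conversion sketch above; this is my own sketch, not how any real ternary machine was wired.

```python
# Sketch: balanced-ternary addition with a ripple carry, and negation for subtraction.
# Digits are {-1, 0, 1}, least significant first.

def add(x, y):
    out, carry = [], 0
    for i in range(max(len(x), len(y))):
        s = (x[i] if i < len(x) else 0) + (y[i] if i < len(y) else 0) + carry
        carry = 0
        if s > 1:   s -= 3; carry = 1    # e.g. 1 + 1 = "2" -> digit -1, carry +1
        if s < -1:  s += 3; carry = -1
        out.append(s)
    if carry:
        out.append(carry)
    return out

def negate(digits):
    # Flip every +1 and -1; x - y is just add(x, negate(y)).
    return [-d for d in digits]

eleven = [-1, 1, 1]
print(add(eleven, eleven))           # [1, 1, -1, 1]  ->  1 + 3 - 9 + 27 = 22
print(add(eleven, negate(eleven)))   # [0, 0, 0]      ->  11 - 11 = 0
```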

Ternary circuits build on these concepts and are interesting in their own right. R. X. Seger did a series of technical posts about building ternary logic gates if you’re interested in more detail.

Todo List: 1) Fight Nazis, 2) Build negabinary machine

When the Nazis invaded Poland in 1939 they shut down all the universities, including Poland’s version of MIT, Warsaw University of Technology.

Jan Czochralski, whose research on monocrystals made silicon chips possible decades later, convinced the Nazis to let parts of the school reopen under the understanding that they would help the war effort. So by day they worked on maintenance and repair for the German army and by night they manufactured pistols and grenades for the resistance. Czochralski’s team iterated on the weapons used by the Polish Home Army, particularly their explosives. At the same time a different group of professors analyzed the vulnerabilities of radio and steering devices in German V-2 rockets and delivered that information to the resistance.

Meanwhile much of the university that was not seen as useful to the war effort was still operating underground. Thousands of students took clandestine courses, close to 200 earning engineering degrees during the war years. Twenty PhD candidates wrote and subsequently defended their theses.

After the war, resistance fighters started enrolling in the newly reopened university as students. Jacek Karpiński nearly blew himself up building homemade bombs for the resistance, got shot in the spine during the Warsaw Uprising, then went on to build the world’s first PC (K-202) in 1970 … five years before the MITS Altair 8800. It would take decades for the industry to match its memory capacity and performance.

Leon Łukaszewicz served in the Home Army as a sniper and went on to design several computers that were cutting edge for the time: the ARR, the XYZ, the ZAM-2. He also organized and mentored a team of blind programmers.

Andrew (Andrzej) Targowski survived a German firing squad and crawled out from a pile of dead bodies at the age of seven, then as an adult came up with the concept of the INFOSTRADA (a project the Soviets quickly killed) which eventually became Al Gore’s internet.

From within this mix came the BINEG, a binary negative computer.

In negabinary the base is negative. So instead of 2⁰, 2¹, 2² … it is (-2)⁰, (-2)¹, (-2)² … If you’re quick you might have already figured out the strange impact this has on our bits: multiplying a negative by a negative produces a positive. Therefore every other bit in negabinary is positive.

Jesus Christ, what? 11 in negabinary

Like ternary, negabinary makes negative numbers easier to handle because you do not need a sign bit. Unlike ternary, you end up using more bits in general, even more than you would use in plain ordinary binary. It’s not really a surprise that this one did not catch on.
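For the curious, here is a small sketch (mine, not the BINEG’s logic) of how those digits fall out of a simple divide-by-negative-two loop.

```python
# Sketch: convert an integer to negabinary (base -2), least significant digit first.
# No sign bit needed; negative numbers fall out of the same loop.

def to_negabinary(n):
    digits = []
    while n != 0:
        n, r = divmod(n, -2)
        if r < 0:        # keep each digit in {0, 1} by nudging the quotient
            r += 2
            n += 1
        digits.append(r)
    return digits or [0]

print(to_negabinary(11))   # [1, 1, 1, 1, 1]     ->  1 - 2 + 4 - 8 + 16 = 11
print(to_negabinary(-11))  # [1, 0, 1, 0, 1, 1]  ->  1 + 4 + 16 - 32 = -11
```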

Incidentally, if you’re interested in early computers from other countries Nick Doiron has compiled a pretty comprehensive list here.

2 out of 3 is not just for rock-paper-scissors

Enough fiddling with different base options! Let’s talk about machines that used base 2 but interpreted it in a completely different way than a modern machine would.

I mentioned at the top of this post that my introduction to the land before binary started with an old mainframe at the IRS and something called “2 out of 5 code”.

Also called constant weight code, m out of n code works the way the name suggests. A word must have exactly m bits set to 1 and the rest set to 0. Although this might seem like a strange idea, in the context of super old mainframes and big, unreliable data stores it makes a lot of sense. By requiring exactly m bits to be set to 1, you get built-in error checking. The machine would have to accidentally change multiple bits at once in a very specific way in order for data corruption to go unnoticed.
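The check itself is almost nothing. A minimal sketch, assuming a word is just a list of bits (the function and defaults are mine, for illustration):

```python
# Sketch: the "free" error check in an m-out-of-n constant weight code.
# A word is only legal if exactly m of its n bits are set.

def is_valid(word, m=2, n=5):
    return len(word) == n and word.count(1) == m

print(is_valid([0, 1, 1, 0, 0]))  # True
print(is_valid([0, 1, 0, 0, 0]))  # False; any single flipped bit is caught
```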

Constant weight code can technically be used with any base numeric system, but 2 out of 5 code was especially popular for decimal machines (base 10). Now you’re probably thinking: if base 3 needed three possible states and that was too difficult, then how did decimal machines get ten possible states out of those same parts?

The answer is they didn’t. Instead the architects of these machines found various ways to encode base 10 with hardware that was binary in nature. Not surprisingly, this is called binary-coded decimal, or BCD. What’s tricky about the combination of binary-coded decimal and 2 out of 5 code is that you can’t actually represent all numbers from 0–9 in binary with only two bits on. The number 0 in traditional binary has no bits on, for example. The number 7 is 2⁰+2¹+2² (or 1+2+4); that’s three bits on. So to make it work you have to fiddle with what each bit represents (their weight). Because there are different ways of doing that, there are different flavors of 2 out of 5 code. Some of them assign the values 0, 1, 2, 3, 6 to the five bits, which produces the numbers 1–9 very sensibly with two bits and only needs something weird for 0: 01100.

Other variations of 2 out of 5 code are more familiar because they are used in bar codes. POSTNET (the old barcode system the Post Office used to route mail up until a few years ago) uses the sequence 0, 1, 2, 4, 7. This offers a more elegant solution to the problem of representing 0 with two bits: 0 is encoded as 00011 (4+7), which would otherwise add up to 11.
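Here is a sketch of that digit code (my own illustration; the real POSTNET barcode also adds framing bars and a check digit, which this ignores), with the leftmost bit carrying weight 0 to match the ordering used in this post.

```python
from itertools import combinations

# Sketch: a POSTNET-style 2-out-of-5 digit code with weights 0, 1, 2, 4, 7.
WEIGHTS = [0, 1, 2, 4, 7]

ENCODE = {}
for i, j in combinations(range(5), 2):   # every legal word has exactly two bits set
    s = WEIGHTS[i] + WEIGHTS[j]
    value = s if s <= 9 else 0           # the one pair summing to 11 (4 + 7) stands for 0
    ENCODE[value] = [1 if k in (i, j) else 0 for k in range(5)]

DECODE = {tuple(bits): digit for digit, bits in ENCODE.items()}

print(ENCODE[0])                 # [0, 0, 0, 1, 1]  (4 + 7, read as zero)
print(DECODE[(1, 0, 1, 0, 0)])   # 2  (weights 0 + 2)
```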

If you’re up for exploring more hijinks caused by binary-coded decimal, Daniel Treccia has a great post on Y2K, Bitcoin hashes and BCD -vs- HEX.

I’m not straight; I’m Bi-quinary.

1953 marked the beginning of mass-produced computers with the launch of IBM’s 650. It was a bi-quinary, decimal-based machine, which means its words represented the digits 0–9 like decimal, but the bits making up each word were split into two groups.

The first group had two bits, one representing the number 0 and the other representing the number 5.

The second group had five bits representing the numbers 0–4.

The value of a word in bi-quinary is calculated by adding whichever bit is on in the first part (0 or 5) to whichever bit is on in the second part. As a result you get 0:0, 0:1, 0:2, 0:3, 0:4, 5:0, 5:1, 5:2, 5:3, 5:4. Like 2 out of 5 code, it error-checks automatically by requiring two bits to be on for all legal values.
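A toy sketch of the scheme (my own code, with an assumed bit order of the bi group first rather than the 650’s actual wiring):

```python
# Sketch: a bi-quinary digit as 7 bits, [b0, b5] + [q0, q1, q2, q3, q4],
# with exactly one bit on in each group.

def encode(digit):
    bi, quin = divmod(digit, 5)                 # e.g. 7 -> bi part 5, quinary part 2
    return [1 - bi, bi] + [1 if q == quin else 0 for q in range(5)]

def decode(bits):
    bi_bits, quin_bits = bits[:2], bits[2:]
    assert sum(bi_bits) == 1 and sum(quin_bits) == 1   # the built-in error check
    return 5 * bi_bits[1] + quin_bits.index(1)

print(encode(7))                      # [0, 1, 0, 0, 1, 0, 0]  i.e. 5 + 2
print(decode([0, 1, 0, 0, 1, 0, 0]))  # 7
```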

If you don’t care so much about error checking you can ditch the second bit on the bi side of bi-quinary. Remington’s UNIVAC 60 and UNIVAC 120 did just that, and also changed the weights on the other side in such a way that the bi side bit being on meant either +1 or +9 depending on context.

This all seems bananas until you understand what using these machines was like.

First you punched your FORTRAN program on a key punch machine, along with any data and control cards. But since the 650 had no disk, the FORTRAN compiler was not resident. So to compile your program, you fed the FORTRAN compiler deck into the card reader, followed by your FORTRAN source program as data. After some time, the machine would punch the resulting object deck. Then you fed the FORTRAN run-time library object deck and your program’s object deck into the card reader, followed by any data cards for your program. Your program would run and results would be punched onto yet another deck of cards. To see the results, you would feed the result deck into another machine, such as an IBM 407, to have it printed on paper (if the computer itself had no printer, as original 650s did not). Source

Mainframes of this era did not have displays. Their interfaces consisted of knobs, buttons, and lights. Bi-quinary didn’t just provide machine error checking; a user could look at the panel and know exactly what data was in any part of the machine at any one time. Take a look:

The console of the 650

So while 2 out of 5 code and bi-quinary can both do error checking, it is easier to look at a bi-quinary number and understand its meaning in decimal. The experience ends up being a little more user-friendly.
