Adactio: Journal—Forgetting again

Adactio: Journal—Forgetting again.

I didn’t include the most pernicious and widespread lie of all:

The internet never forgets.

This truism is so pervasive that it can be presented as a fait accompli, without any data to back it up. If you were to seek out the data to back up the claim, you would find that the opposite is true—the internet is in a constant state of forgetting.

Laing writes:

Faced with the knowledge that nothing we say, no matter how trivial or silly, will ever be completely erased, we find it hard to take the risks that togetherness entails.

Really? Suppose I said my trivial and silly thing on FriendFeed. Everything that was ever posted to FriendFeed disappeared three days ago:

You will be able to view your posts, messages, and photos until April 9th. On April 9th, we’ll be shutting down FriendFeed and it will no longer be available.

What if I shared on Posterous? Or Vox (back when that domain name was a social network hosting 6 million URLs)? What about Pownce? GeoCities?

These aren’t the exceptions—this is routine. And yet somehow, despite all the evidence to the contrary, we still keep a completely straight face and say “Be careful what you post online; it’ll be there forever!”

The problem here is a mismatch of expectations. We expect everything that we post online, no matter how trivial or silly, to remain forever. When instead it is callously destroyed, our expectation—which was fed by the “knowledge” that the internet never forgets—is turned upside down. That’s where the anger comes from; the mismatch between expected behaviour and the reality of this digital dark age.

Being frightened of an internet that never forgets is like being frightened of zombies or vampires. These things do indeed sound frightening, and there’s something within us that readily responds to them, but they bear no resemblance to reality.

If you want to imagine a truly frightening scenario, imagine an entire world in which people entrust their thoughts, their work, and pictures of their family to online services in the mistaken belief that the internet never forgets. Imagine the devastation when all of those trivial, silly, precious moments are wiped out. For some reason we have a hard time imagining that dystopia even though it has already played out time and time again.

I am far more frightened by an internet that never remembers than I am by an internet that never forgets.

The promise of the web — Medium

The promise of the web — Medium.

I’m going to steal the phrase and say that if the web didn’t exist, it would be necessary to invent it.

To illustrate what I mean, let’s consider a www-less universe where there is a duopoly of personal communication devices; for the sake of this argument, let’s call them Apple and Android. Both environments have different programming languages, different layout engines, and different interfaces for accessing platform capabilities.

In this alternative universe, companies start to realize that building twice is costly and inefficient. Beyond the labor costs, the communication cost of implementing features on two platforms hurts their ability to iterate and innovate.

So they start to develop tooling that abstracts away the differences between the platforms. They create a common scripting language and transcompiler, a declarative language for UIs, and a common API that delegates to the underlying platform. They write their application once, and it can run on both Apple and Android.
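If the analogy feels abstract, here is one way that delegating layer could be sketched in TypeScript. Everything in it is hypothetical: the NotificationPort interface, the two adapter classes, and the choice of notifications as the capability being abstracted.

```typescript
// Hypothetical common API: one interface, one adapter per platform.
interface NotificationPort {
  show(title: string, body: string): void;
}

// Each adapter wraps the (imaginary) native interface of its platform.
class AppleNotifications implements NotificationPort {
  show(title: string, body: string): void {
    console.log(`[apple] banner: ${title} - ${body}`);
  }
}

class AndroidNotifications implements NotificationPort {
  show(title: string, body: string): void {
    console.log(`[android] toast: ${title} - ${body}`);
  }
}

// Application code is written once against the common interface...
function greet(port: NotificationPort): void {
  port.show("Hello", "Same code on either platform");
}

// ...and each platform build supplies the matching adapter.
greet(new AppleNotifications());
greet(new AndroidNotifications());
```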

Another problem in this world is interoperability between applications. How does a social network application — let’s call it Facebook — reference a spreadsheet created in another program — say, Excel? Both Apple and Android recognize the issue and independently create an addressing system. The systems are similar; applications have a universal identifier and a namespace for referencing resources within the application.
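That addressing scheme is easy to picture as a universal application identifier plus an application-scoped resource path, which is of course the shape of a URL. The format and parser below are invented purely for the sake of the thought experiment:

```typescript
// Hypothetical cross-application address: "<app-id>://<resource-path>".
interface AppAddress {
  app: string;      // universal identifier for the application
  resource: string; // path in that application's own namespace
}

function parseAddress(address: string): AppAddress {
  const match = /^([a-z][a-z0-9.-]*):\/\/(.+)$/i.exec(address);
  if (!match) {
    throw new Error(`Not a valid app address: ${address}`);
  }
  return { app: match[1], resource: match[2] };
}

// A Facebook post could then point at a sheet living in Excel:
const link = parseAddress("excel://workbooks/q3-budget/sheet1");
console.log(link.app);      // "excel"
console.log(link.resource); // "workbooks/q3-budget/sheet1"
```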

Now that applications can link to each other, things are better. But there is still a bad experience when you don’t have the right application installed. Eventually the platform providers recognize this need and add the ability to install an application on demand. They also add a “light install” that automatically removes the application if you don’t continue to use it.
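One way to picture that “light install” is a registry that installs an application on first use and sweeps away anything unused for too long. The class, the method names, and the 30-day window here are all made up for illustration:

```typescript
// Hypothetical "light install": install on first use, evict on disuse.
const EVICT_AFTER_MS = 30 * 24 * 60 * 60 * 1000; // assumed 30-day window

class LightInstaller {
  private lastUsed = new Map<string, number>();

  // Opening an app installs it if missing and refreshes its timestamp.
  open(appId: string): void {
    if (!this.lastUsed.has(appId)) {
      console.log(`installing ${appId} on demand`);
    }
    this.lastUsed.set(appId, Date.now());
  }

  // A periodic sweep removes anything not opened within the window.
  sweep(now: number = Date.now()): void {
    for (const [appId, usedAt] of this.lastUsed) {
      if (now - usedAt > EVICT_AFTER_MS) {
        console.log(`removing unused ${appId}`);
        this.lastUsed.delete(appId);
      }
    }
  }
}

const installer = new LightInstaller();
installer.open("excel"); // installs on demand
installer.sweep();       // nothing evicted yet
```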

Sound familiar?

Philip K. Dick Theorizes The Matrix in 1977, Declares That We Live in “A Computer-Programmed Reality” | Open Culture

In the interview, Dick roams over so many of his personal theories about what these “unexpected things” signify that it’s difficult to keep track. However, at that same conference, he delivered a talk titled “If You Find This World Bad, You Should See Some of the Others,” which settles on one particular theory—that the universe is a highly-advanced computer simulation. (The talk has circulated on the internet as “Did Philip K. Dick disclose the real Matrix in 1977?”).

Finally, Dick makes his Matrix point, and makes it very clearly: “we are living in a computer-programmed reality, and the only clue we have to it is when some variable is changed, and some alteration in our reality occurs.” These alterations feel just like déjà vu, says Dick, a sensation that proves that “a variable has been changed” (by whom—note the passive voice—he does not say) and “an alternative world branched off.”

Dick, who had the capacity for a very oblique kind of humor, assures his audience several times that he is deadly serious. (The looks on many of their faces betray incredulity at the very least.) And yet, maybe Dick’s crazy hypothesis has been validated after all, and not simply by the success of the PKD-esque The Matrix and the ubiquity of Matrix analogies. For several years now, theoretical physicists and philosophers have entertained the theory that we do in fact live in a computer-generated simulation and, what’s more, that “we may even be able to detect it.”

via Philip K. Dick Theorizes The Matrix in 1977, Declares That We Live in “A Computer-Programmed Reality” | Open Culture.

A Brief History of Hackerdom

A Brief History of Hackerdom.

Prologue: The Real Programmers

In the beginning, there were Real Programmers.

That’s not what they called themselves. They didn’t call themselves ‘hackers’, either, or anything in particular; the sobriquet ‘Real Programmer’ wasn’t coined until after 1980, retrospectively by one of their own. But from 1945 onward, the technology of computing attracted many of the world’s brightest and most creative minds. From Eckert and Mauchly’s first ENIAC computer onward there was a more or less continuous and self-conscious technical culture of enthusiast programmers, people who built and played with software for fun.

The Real Programmers typically came out of engineering or physics backgrounds. They were often amateur-radio hobbyists. They wore white socks and polyester shirts and ties and thick glasses and coded in machine language and assembler and FORTRAN and half a dozen ancient languages now forgotten.

From the end of World War II to the early 1970s, in the great days of batch processing and the “big iron” mainframes, the Real Programmers were the dominant technical culture in computing. A few pieces of revered hacker folklore date from this era, including various lists of Murphy’s Laws and the mock-German “Blinkenlights” poster that still graces many computer rooms.

Some people who grew up in the ‘Real Programmer’ culture remained active into the 1990s and even past the turn of the 21st century. Seymour Cray, designer of the Cray line of supercomputers, was among the greatest. He is said to have once toggled an entire operating system of his own design into a computer of his own design through its front-panel switches. In octal. Without an error. And it worked. Real Programmer macho supremo.

The ‘Real Programmer’ culture, though, was heavily associated with batch (and especially batch scientific) computing. It was eventually eclipsed by the rise of interactive computing, the universities, and the networks. These gave birth to another engineering tradition that, eventually, would evolve into today’s open-source hacker culture.

A Brief History of Hackerdom

A Brief History of Hackerdom.

Abstract

I explore the origins of the hacker culture, including prehistory among the Real Programmers, the glory days of the MIT hackers, and how the early ARPAnet nurtured the first network nation. I describe the early rise and eventual stagnation of Unix, the new hope from Finland, and how ‘the last true hacker’ became the next generation’s patriarch. I sketch the way Linux and the mainstreaming of the Internet brought the hacker culture from the fringes of public consciousness to its current prominence.

When Nerds Collide — Medium

When Nerds Collide — Medium.

You might not consider hackers to be a tribe apart, but I guarantee you that many — if not most — hackers themselves do. Eric S. Raymond’s “A Brief History of Hackerdom,” whose first draft dates to 1992, contains a litany of descriptions that speak to this:

They wore white socks and polyester shirts and ties and thick glasses and coded in machine language and assembler and FORTRAN and half a dozen ancient languages now forgotten….

The mainstream of hackerdom, (dis)organized around the Internet and by now largely identified with the Unix technical culture, didn’t care about the commercial services. These hackers wanted better tools and more Internet….

[I]nstead of remaining in isolated small groups each developing their own ephemeral local cultures, they discovered (or re-invented) themselves as a networked tribe.

When Nerds Collide — Medium

Don’t get me wrong, I’m thrilled to bits that every day the power to translate pure thought into actions that ripple across the world merely by virtue of being phrased correctly draws nearer and nearer to the hands of every person alive. I’m even more delighted that every day more and more people, some very similar to me and others very different, join the chorus of Those Who Speak With Machines.

But I fear for my people, the “weird nerds,” and I think I have good reason to. Brain-computer interfaces are coming, and what will happen to the weird nerds when we can no longer disguise our weirdness with silence?

via When Nerds Collide — Medium.