We draw on the same brain networks when we’re reading stories and when we’re trying to guess at another person’s feelings.
We are here because the editor of this magazine asked me, “Can you tell me what code is?”
“No,” I said. “First of all, I’m not good at the math. I’m a programmer, yes, but I’m an East Coast programmer, not one of these serious platform people from the Bay Area.”
I began to program nearly 20 years ago, learning via oraperl, a special version of the Perl language modified to work with the Oracle database. A month into the work, I damaged the accounts of 30,000 fantasy basketball players. They sent some angry e-mails. After that, I decided to get better.
Which is to say I’m not a natural. I love computers, but they never made any sense to me. And yet, after two decades of jamming information into my code-resistant brain, I’ve amassed enough knowledge that the computer has revealed itself. Its magic has been stripped away. I can talk to someone who used to work at Amazon.com or Microsoft about his or her work without feeling a burning shame. I’d happily talk to people from Google and Apple, too, but they so rarely reenter the general population.
There are lots of other neighborhoods, too: There are people who write code for embedded computers smaller than your thumb. There are people who write the code that runs your TV. There are programmers for everything. They have different cultures, different tribal folklore that they use to organize their working lives. If you told me a systems administrator was taking a juggling class, that would make sense, and I’d expect a product manager to take a trapeze class. I’ve met information architects who list and rank their friendships in spreadsheets. Security research specialists love to party.
What I’m saying is, I’m one of 18 million. So that’s what I’m writing: my view of software development, as an individual among millions. Code has been my life, and it has been your life, too. It is time to understand how it all works.
Every month it becomes easier to do things that have never been done before, to create new kinds of chaos and find new kinds of order. Even though my math skills will never catch up, I love the work. Every month, code changes the world in some interesting, wonderful, or disturbing way.
As the economy changes, the skills required to thrive in it change, too, and it takes a while before these new skills are defined and acknowledged.
For example, in today’s loosely networked world, people with social courage have amazing value. Everyone goes to conferences and meets people, but some people invite six people to lunch afterward and follow up with four carefully tended friendships forevermore. Then they spend their lives connecting people across networks.
People with social courage are extroverted in issuing invitations but introverted in conversation — willing to listen 70 percent of the time. They build not just contacts but actual friendships by engaging people on multiple levels. If you’re interested in a new field, they can reel off the names of 10 people you should know. They develop large informal networks of contacts that transcend their organization and give them an independent power base. They are discriminating in their personal recommendations since character judgment is their primary currency.
Similarly, people who can capture amorphous trends with a clarifying label also have enormous worth. Karl Popper observed that there are clock problems and cloud problems. Clock problems can be divided into parts, but cloud problems are indivisible emergent systems. A culture problem is a cloud, as are a personality, an era and a social environment.
Since it is easier to think deductively, most people try to turn cloud problems into clock problems, but a few people are able to look at a complex situation, grasp the gist and clarify it by naming what is going on.
Such people tend to possess negative capability, the ability to live with ambiguity and not leap to premature conclusions. They can absorb a stream of disparate data and rest in it until they can synthesize it into one trend, pattern or generalization.
Such people can create a mental model that helps you think about a phenomenon. As Oswald Chambers put it, “The author who benefits you most is not the one who tells you something you did not know before, but the one who gives expression to the truth that has been dumbly struggling in you for utterance.”
We can all think of many other skills that are especially valuable right now:
Making nonhuman things intuitive to humans: This is what Steve Jobs did.
Many people go through life overwhelmed by options, afraid of closing off opportunities. But a few have fully cultivated moral passions and can help others choose the one thing they should dedicate themselves to.
F. Scott Fitzgerald wrote, “The test of a first-rate intelligence is the ability to hold two opposed ideas in mind at the same time and still retain the ability to function.” For some reason I am continually running across people who believe this is the ability their employees and bosses need right now.
In a world dividing along class, ethnic and economic lines, some people are culturally multilingual. They can operate in an insular social niche while seeing it from the vantage point of an outsider.
One gets the impression we’re confronted by a giant cultural lag. The economy emphasizes a new generation of skills, but our vocabulary describes the set required 30 years ago. Lord, if somebody could just identify the skills it takes to give a good briefing these days, that feat alone would deserve the Nobel Prize.
What we should be saying is that we need more designers who know about code.
The reason designers should know about code is the same reason developers should know about design. Not to become designers, but to empathize with them. To be able to speak their language, and to understand design considerations and thought processes. To know just enough to be dangerous, as they say.
This is the sort of thing that breaks down silos, opens up conversations and leads to great work. But the key is that it also does not impede the ability of people to become true experts in their area of focus.
When someone says they want “designers who can code”, what I hear them saying is that they want a Swiss Army knife. The screwdriver, scissors, knife, toothpick and saw. The problem is that a Swiss Army knife doesn’t do anything particularly well. You aren’t going to see a carpenter driving screws with that little nub of a screwdriver, or a seamstress using those tiny scissors to cut fabric. The Swiss Army knife has tools that work on the most basic level, but they would never be considered replacements for the real thing. Worse still, because it tries to do so much, it’s not even that great at being a knife.
Professionals need specialized tools. Likewise, professional teams need specialized team members.
Technology and society (or technology and culture) refers to the cyclical co-dependence, co-influence, and co-production of technology and society upon each other. This synergistic relationship has existed since the dawn of humankind, from the invention of simple tools to modern technologies such as the printing press and the computer. The academic discipline that studies the mutual impacts of science, technology, and society is called science and technology studies.
Technology has become a huge part of everyday life. The more a society knows about a technology’s development, the better it can take advantage of it. Once an innovation has been introduced and promoted past a certain point, it becomes part of the society. Digital technology has entered nearly every process and activity of the social system; indeed, it has built an additional worldwide communication system on top of the original one.
The creation of computers brought an entirely better way to transmit and store data. Digital technology became commonly used for downloading music and for watching movies at home, whether on DVD or purchased online. Digital music recordings are not quite the same as traditional recording media: they are reproducible, portable, and free to copy.
Game design is the art of applying design and aesthetics to create a game to facilitate interaction between players for playful, healthful, educational, or simulation purposes. Game design can be applied both to games and, increasingly, to other interactions, particularly virtual ones (see gamification).
Game design creates goals, rules, and challenges to define a sport, tabletop game, casino game, video game, role-playing game, or simulation that produces desirable interactions among its participants and, possibly, spectators.
Academically, game design is part of game studies, while game theory studies strategic decision making (primarily in non-game situations). Games have historically inspired seminal research in the fields of probability, artificial intelligence, economics, and optimization theory. Applying game design to itself is a current research topic in metadesign.
I didn’t include the most pernicious and widespread lie of all:
The internet never forgets.
This truism is so pervasive that it can be presented as a fait accompli, without any data to back it up. If you were to seek out the data to back up the claim, you would find that the opposite is true—the internet is in a constant state of forgetting.
Faced with the knowledge that nothing we say, no matter how trivial or silly, will ever be completely erased, we find it hard to take the risks that togetherness entails.
You will be able to view your posts, messages, and photos until April 9th. On April 9th, we’ll be shutting down FriendFeed and it will no longer be available.
What if I shared on Posterous? Or Vox (back when that domain name was a social network hosting 6 million URLs)? What about Pownce? Geocities?
These aren’t the exceptions—this is routine. And yet somehow, despite all the evidence to the contrary, we still keep a completely straight face and say “Be careful what you post online; it’ll be there forever!”
The problem here is a mismatch of expectations. We expect everything that we post online, no matter how trivial or silly, to remain forever. When instead it is callously destroyed, our expectation—which was fed by the “knowledge” that the internet never forgets—is turned upside down. That’s where the anger comes from; the mismatch between expected behaviour and the reality of this digital dark age.
Being frightened of an internet that never forgets is like being frightened of zombies or vampires. These things do indeed sound frightening, and there’s something within us that readily responds to them, but they bear no resemblance to reality.
If you want to imagine a truly frightening scenario, imagine an entire world in which people entrust their thoughts, their work, and pictures of their family to online services in the mistaken belief that the internet never forgets. Imagine the devastation when all of those trivial, silly, precious moments are wiped out. For some reason we have a hard time imagining that dystopia even though it has already played out time and time again.
I am far more frightened by an internet that never remembers than I am by an internet that never forgets.