The Prototype (film) – Wikipedia, the free encyclopedia


The Prototype is an upcoming American science fiction film directed by Andrew Will. It stars Neal McDonough, Joseph Mawle and Anna Anissimova. The film follows Dr. Maxwell (Joseph Mawle), whose thesis arguing that humans will evolve into machines becomes a reality, with Dr. Maxwell himself becoming a machine first: the first Prototype.

In the Beginning… Was the Command Line – Wikipedia, the free encyclopedia


In the Beginning… Was the Command Line is an essay by Neal Stephenson which was originally published online in 1999 and later made available in book form (November 1999, ISBN 0-380-81593-1). The essay is a commentary on why the proprietary operating systems business is unlikely to remain profitable in the future because of competition from free software. It also analyzes the corporate/collective culture of the Microsoft, Macintosh, and free software communities.

Stephenson explores the GUI as a metaphor in terms of the increasing interposition of abstractions between humans and the actual workings of devices (in a similar manner to Zen and the Art of Motorcycle Maintenance)[citation needed] and explains the beauty hackers feel in good-quality tools.

He does this with a car analogy. He compares four operating systems: Mac OS by Apple Computer to a luxury European car, Windows by Microsoft to a station wagon, Linux to a free tank, and BeOS to a Batmobile. Stephenson argues that people continue to buy the station wagon despite free tanks being given away, because people do not want to learn how to operate a tank; they know that the station wagon dealership has a machine shop that they can take their car to when it breaks down.

Because of this attitude, Stephenson argues that Microsoft is not really a monopoly, as evidenced by the free availability of alternative operating systems, but rather has simply accrued enough mindshare to keep people coming back. He compares Microsoft to Disney, in that both are selling a vision to their customers, who in turn “want to believe” in that vision.

Stephenson relays his experience with the Debian bug tracking system (bug #6518): developers from around the world responded within a day. He then contrasts this with Microsoft’s approach. His initial attempt to get the same response from Microsoft was completely frustrating, though he concedes that his subsequent experience was satisfactory. The difference he notes is that Debian developers are personally accessible and transparently own up to defects in their OS distribution, making no bones about the existence of errors, while Microsoft does not.

Snow Crash – Wikipedia, the free encyclopedia


Snow Crash is Neal Stephenson’s third novel, published in 1992. Like many of Stephenson’s other novels it covers history, linguistics, anthropology, archaeology, religion, computer science, politics, cryptography, memetics, and philosophy.

Stephenson explained the title of the novel in his 1999 essay In the Beginning… was the Command Line as his term for a particular software failure mode on the early Apple Macintosh computer. Stephenson wrote about the Macintosh that “When the computer crashed and wrote gibberish into the bitmap, the result was something that looked vaguely like static on a broken television set—a ‘snow crash’ ”.
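The failure mode Stephenson describes, a crash spraying arbitrary bytes into the display bitmap, can be illustrated with a toy sketch (this code is purely illustrative and does not appear in the essay; the function name and dimensions are invented):

```python
import random

# Reproducible "gibberish" for the demonstration.
random.seed(0)

WIDTH, HEIGHT = 32, 8  # arbitrary toy framebuffer size

def snow_crash_frame(width=WIDTH, height=HEIGHT):
    """Fill a 1-bit 'bitmap' with random pixels, as a crashed program
    writing garbage into the framebuffer might. Printed out, the random
    on/off pixels read as static on a broken television set."""
    return [
        "".join(random.choice(" #") for _ in range(width))
        for _ in range(height)
    ]

for row in snow_crash_frame():
    print(row)
```

Because each pixel is independently random, no structure survives in the output, which is exactly why the crashed screen resembles analog TV snow.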

The book presents the Sumerian language as the firmware programming language for the brainstem, which is supposedly functioning as the BIOS for the human brain. According to characters in the book, the goddess Asherah is the personification of a linguistic virus, similar to a computer virus. The god Enki created a counter-program which he called a nam-shub that caused all of humanity to speak different languages as a protection against Asherah (a re-interpretation of the ancient Near Eastern story of the Tower of Babel).

Putting people first » Are we becoming cyborgs?


The New York Times is also taking up the cyborg theme, fortunately more intelligently than CNN.

All the technology and internet use has changed how we interact. But are we also changing what we are?

The New York Times put that question to three people who have written extensively on the subject, and brought them together to discuss it with Serge Schmemann, the editor of the NYT magazine.

The participants: Susan Greenfield, professor of synaptic pharmacology at Oxford. She has written and spoken widely on the impact of new technology on users’ brains. Maria Popova, the curator behind Brain Pickings, a Web site of “eclectic interestingness.” She is also an M.I.T. Futures of Entertainment Fellow and writes for Wired and The Atlantic. Evgeny Morozov, the author of The Net Delusion: The Dark Side of Internet Freedom. He is a contributing editor to The New Republic.

First computer to sing – Daisy Bell – YouTube

“Daisy Bell” was composed by Harry Dacre in 1892. In 1961, the IBM 7094 became the first computer to sing, performing “Daisy Bell.” The vocals were programmed by John Kelly and Carol Lochbaum, and the accompaniment was programmed by Max Mathews. This performance was the inspiration for a similar scene in 2001: A Space Odyssey.

via First computer to sing – Daisy Bell – YouTube.

General semantics – Wikipedia, the free encyclopedia

General semantics is a program begun in the 1920s that seeks to regulate the evaluative operations performed in the human brain. After partial program launches under the trial names “human engineering” and “humanology,”[1] Polish-American originator Alfred Korzybski[2] (1879–1950) fully launched the program as “general semantics” in 1933 with the publication of Science and Sanity: An Introduction to Non-Aristotelian Systems and General Semantics.

General semantics is not generalized semantics. Misunderstandings traceable to the program’s name have greatly complicated its history and development.

General semantics – Wikipedia, the free encyclopedia.


Optical character recognition – Wikipedia, the free encyclopedia

Optical character recognition, usually abbreviated to OCR, is the mechanical or electronic conversion of scanned images of handwritten, typewritten or printed text into machine-encoded text.

It is widely used as a form of data entry from some sort of original paper data source, whether documents, sales receipts, mail, or any number of printed records. It is crucial to the computerization of printed texts so that they can be electronically searched, stored more compactly, displayed on-line, and used in machine processes such as machine translation, text-to-speech and text mining.

OCR is a field of research in pattern recognition, artificial intelligence and computer vision.

Early versions needed to be programmed with images of each character, and worked on one font at a time. “Intelligent” systems with a high degree of recognition accuracy for most fonts are now common. Some systems are capable of reproducing formatted output that closely approximates the original scanned page including images, columns and other non-textual components.
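The early template-matching approach mentioned above can be sketched in a few lines: each character is stored as a small bitmap, and an unknown glyph is classified as whichever template it overlaps most closely. This is a deliberately tiny, hypothetical font (three 5×3 characters), not a real OCR engine:

```python
# Templates for a made-up 5x3 pixel font: "#" = ink, "." = background.
TEMPLATES = {
    "I": ["###", ".#.", ".#.", ".#.", "###"],
    "L": ["#..", "#..", "#..", "#..", "###"],
    "T": ["###", ".#.", ".#.", ".#.", ".#."],
}

def match_score(glyph, template):
    """Count pixels that agree between two equal-sized bitmaps."""
    return sum(
        g == t
        for glyph_row, tmpl_row in zip(glyph, template)
        for g, t in zip(glyph_row, tmpl_row)
    )

def recognize(glyph):
    """Classify a glyph as the template character with the most matching pixels."""
    return max(TEMPLATES, key=lambda ch: match_score(glyph, TEMPLATES[ch]))

# A slightly noisy "L" (one pixel flipped) still matches its template best.
noisy_l = ["#..", "#..", "##.", "#..", "###"]
print(recognize(noisy_l))  # prints L
```

The sketch also shows why such systems worked on one font at a time: the templates encode a single typeface, and a different font would need a different set of bitmaps.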

via Optical character recognition – Wikipedia, the free encyclopedia.