Kay: Yeah. But the thing that traumatized me occurred a couple years later, when I found an old copy of Life magazine that had the Margaret Bourke-White photos from Buchenwald. This was in the 1940s — no TV, living on a farm. That’s when I realized that adults were dangerous. Like, really dangerous.
On Computing as Pop Culture –
The lack of interest, the disdain for history is what makes computing not-quite-a-field.
I think the same is true of most people who write code for money. They have no idea where [their culture came from] — and the Internet was done so well that most people think of it as a natural resource like the Pacific Ocean, rather than something that was man-made. When was the last time a technology with a scale like that was so error-free? The Web, in comparison, is a joke. The Web was done by amateurs.
On Programming –
The most disastrous thing about programming — to pick one of the 10 most disastrous things about programming — there’s a very popular movement based on pattern languages. When Christopher Alexander first did that in architecture, he was looking at 2,000 years of ways that humans have made themselves comfortable. So there was actually something to it, because he was dealing with a genome that hasn’t changed that much. I think he got a few hundred valuable patterns out of it. But the bug in trying to do that in computing is the assumption that we know anything at all about programming. So extracting patterns from today’s programming practices ennobles them in a way they don’t deserve. It actually gives them more cachet.
The best teacher I had in graduate school spent the whole semester destroying any beliefs we had about computing. He was a real iconoclast. He happened to be a genius, so we took it. At the end of the course, we were free because we didn’t believe in anything. We had to learn everything, but then he destroyed it. He wanted us to understand what had been done, but he didn’t want us to believe in it.
“If we make an unbiased examination of the accomplishments made by mathematicians to the real world of computer programming, we are forced to conclude that, so far, the theory has actually done more harm than good. There are numerous instances in which theoretical “advances” have actually stifled the development of computer programming, and in some cases they have even made it take several steps backward!”
“Thus the theoretical calculations, which have been so widely quoted, have caused an inferior method to be used. An overemphasis on asymptotic behavior has often led to similar abuses.”
(This is common in certain groups of programmers. Group-think?)
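Knuth's point about asymptotics can be made concrete with a small sketch (my own illustration, not from his address): an O(n²) algorithm can beat an O(n log n) one on the inputs that actually occur. Here insertion sort, asymptotically "inferior", does fewer comparisons than merge sort on a small already-sorted array.

```python
# Illustrative only: count key comparisons made by an O(n^2) sort
# and an O(n log n) sort on a small, already-sorted input.

def insertion_sort(a):
    a = list(a)
    comparisons = 0
    for i in range(1, len(a)):
        j = i
        while j > 0:
            comparisons += 1          # one key comparison
            if a[j - 1] > a[j]:
                a[j - 1], a[j] = a[j], a[j - 1]
                j -= 1
            else:
                break                 # already in place: stop early
    return a, comparisons

def merge_sort(a):
    comparisons = 0
    def sort(xs):
        nonlocal comparisons
        if len(xs) <= 1:
            return xs
        mid = len(xs) // 2
        left, right = sort(xs[:mid]), sort(xs[mid:])
        merged, i, j = [], 0, 0
        while i < len(left) and j < len(right):
            comparisons += 1          # one key comparison per merge step
            if left[i] <= right[j]:
                merged.append(left[i]); i += 1
            else:
                merged.append(right[j]); j += 1
        merged.extend(left[i:]); merged.extend(right[j:])
        return merged
    return sort(list(a)), comparisons

data = list(range(16))                 # sorted input: a common easy case
_, ins = insertion_sort(data)          # 15 comparisons (linear)
_, mer = merge_sort(data)              # 32 comparisons (log-factor overhead)
print(ins, mer)                        # prints: 15 32
```

Asymptotic analysis describes behavior as n grows without bound; for the small or structured inputs a program actually sees, the constant factors and the input distribution can reverse the ranking.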
“Another difficulty with the theory of languages is that it has led to an overemphasis on syntax as opposed to semantics. For many years there was much light on syntax and very little on semantics; so simple semantic constructions were unnaturally grafted onto syntactic definitions, making rather unwieldy grammars, instead of searching for theories more appropriate to semantics.”
(Is this why the current generation of programmers has such an unfounded aversion to languages that do not read like Java?)
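The syntax/semantics distinction Knuth draws can be sketched in a few lines (my own illustration, not his): a grammar fixes only the shape of a program, while meaning is a separate assignment. The same tiny syntax tree below supports two different semantic functions.

```python
# A tiny abstract syntax: ("lit", n) | ("add", l, r) | ("mul", l, r).
# The grammar constrains only the shape of the tree; what a tree
# *means* is given by a separate semantic function, and different
# semantics can be assigned to the same syntax.

def eval_arith(node):
    """The usual arithmetic semantics."""
    tag = node[0]
    if tag == "lit":
        return node[1]
    if tag == "add":
        return eval_arith(node[1]) + eval_arith(node[2])
    if tag == "mul":
        return eval_arith(node[1]) * eval_arith(node[2])
    raise ValueError(f"unknown node: {tag}")

def eval_size(node):
    """An alternative semantics over the same syntax: count leaves."""
    if node[0] == "lit":
        return 1
    return eval_size(node[1]) + eval_size(node[2])

tree = ("add", ("lit", 1), ("mul", ("lit", 2), ("lit", 3)))  # 1 + 2*3
print(eval_arith(tree), eval_size(tree))                     # prints: 7 3
```

A syntactic theory alone tells you `tree` is well-formed; it says nothing about whether it denotes 7, 3, or anything else, which is exactly the gap Knuth says was papered over by grafting semantics onto syntactic definitions.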
“You see that computer science has been subject to a recurring problem: Some theory is developed that is very beautiful, and too often it is therefore thought to be relevant.”
“My experience has been that theories are often more structured and more interesting when they are based on real problems; somehow such theories are more exciting than completely abstract theories will ever be.”
The Dangers of Computer Science Theory – Donald E. Knuth
[An invited address presented to the International Congress on Logic, Methodology and Philosophy of Science in Bucharest, Romania, September 1971. Originally published in Logic, Methodology and Philosophy of Science 4 (Amsterdam: North-Holland, 1973) 189-195.] Selected Papers on the Analysis of Algorithms – Donald E. Knuth
“This is a personal story of the educational process at one of the world’s great technological universities. Pepper White entered MIT in 1981 and received his master’s degree in mechanical engineering in 1984. His account of his experiences, written in diary form, offers insight into graduate school life in general—including the loneliness and even desperation that can result from the intense pressure to succeed—and the purposes of engineering education in particular. The first professor White met at MIT told him that it did not really matter what he learned there, but that MIT would teach him how to think. This, then, is the story of how one student learned how to think.”
“New approaches to artificial intelligence spring from the idea that intelligence emerges as much from cells, bodies, and societies as it does from evolution, development, and learning. Traditionally, artificial intelligence has been concerned with reproducing the abilities of human brains; newer approaches take inspiration from a wider range of biological structures that are capable of autonomous self-organization. Examples of these new approaches include evolutionary computation and evolutionary electronics, artificial neural networks, immune systems, biorobotics, and swarm intelligence—to mention only a few. This book offers a comprehensive introduction to the emerging field of biologically inspired artificial intelligence that can be used as an upper-level text or as a reference for researchers.”
“Filled with first-hand accounts of ambition, greed, and inspired engineering, this history of the personal computer revolution takes readers inside the cutthroat world of Commodore. Before Apple, IBM, or Dell, Commodore was the first computer maker to market its machines to the public, eventually selling an estimated 22 million Commodore 64s. These halcyon days were tumultuous, however, owing to the expectations and unsparing tactics of founder Jack Tramiel. Engineers and managers share their experiences between 1976 and 1994 of the groundbreaking moments, soaring highs, and stunning employee turnover that came with being on top of the PC world in the early computer business.”
“At first, this hefty new tome from Oxford physicist Penrose (The Emperor’s New Mind) looks suspiciously like a textbook, complete with hundreds of diagrams and pages full of mathematical notation. On a closer reading, however, one discovers that the book is something entirely different and far more remarkable. Unlike a textbook, the purpose of which is purely to impart information, this volume is written to explore the beautiful and elegant connection between mathematics and the physical world. Penrose spends the first third of his book walking us through a seminar in high-level mathematics, but only so he can present modern physics on its own terms, without resorting to analogies or simplifications (as he explains in his preface, “in modern physics, one cannot avoid facing up to the subtleties of much sophisticated mathematics”). Those who work their way through these initial chapters will find themselves rewarded with a deep and sophisticated tour of the past and present of modern physics. Penrose transcends the constraints of the popular science genre with a unique combination of respect for the complexity of the material and respect for the abilities of his readers. This book sometimes begs comparison with Stephen Hawking’s A Brief History of Time, and while Penrose’s vibrantly challenging volume deserves similar success, it will also likely lie unfinished on as many bookshelves as Hawking’s. For those hardy readers willing to invest their time and mental energies, however, there are few books more deserving of the effort.”
“Full of incredible characters, amazing athletic achievements, cutting-edge science, and, most of all, pure inspiration, Born to Run is an epic adventure that began with one simple question: Why does my foot hurt? In search of an answer, Christopher McDougall sets off to find a tribe of the world’s greatest distance runners and learn their secrets, and in the process shows us that everything we thought we knew about running is wrong.”