Does The Tech Industry Need A History Lesson?
from the looking-back-before-you-look-forward dept
Someone anonymously pointed us to a fascinating interview with Alan Kay, the famed computer scientist who is partly responsible for an awful lot of the technology you use today. The interview touches on a variety of interesting subjects (including why he dislikes what computers have become), but perhaps the most interesting is his complaint that the tech industry always looks forward and never looks back. Specifically, he's talking about how few people seem to recognize the ideas that Doug Engelbart showed the world almost forty years ago. Basically, he's upset that in always looking forward, we're either recreating what was done before or completely missing out on some of the better ideas that came before. This is quite interesting because, as we've said plenty of times, innovation is an ongoing process rather than a series of brilliant ideas that come out of nowhere. And part of that process is building on the ideas of those who came before you to make them better. There is something to be said for coming up with alternative routes -- either to the same idea or to different ones -- but it's always helpful to look at what those who came before you have said, to see if there's more that can be built on. So, while there are plenty of stories of history (unfortunately) repeating itself in Silicon Valley, is it time that folks who work in this industry started signing up for history lessons to help them better think about what the future could hold?
Reader Comments
Another part is that most code (if you want to be that specific) is not generic enough to be reused. Many applications that were designed to be "modular" and "re-usable" really aren't. Look at how bloated Windows is, for example. And it hasn't reached a point in its life-cycle where modularity is paying off.
A third part is related to patents and trademarks - a topic you touch on frequently. Companies will try something new just to avoid the claim that "that looks / feels / acts just like our product" and the lawsuit that would follow.
A fourth part is related to the need to be new and different. Any product that is built upon another product is going to be compared to the previous product. Not many startups / ventures are going to want to be saddled with the label of "clone" or the "baggage" of a previous product. So they go their own way.
Re: not invented here
Doomed to repeat history.
There is a train of thought required to create new ideas. Wherever you do your best thinking, you don't come up with new ideas because they "just come to you." You come up with new ideas because you are thinking about ways to improve what you are already doing. If you don't have any problems, you are not likely to think of new ways to do things better.
Dunno
Old topic
(just kidding)
If even
The sentiment is right, but...
He also falls into the all-too-common techie belief that easy to use is stupid. Now that computers are about as common as cars, you can't have a device that is supposed to work for lots of people (and it still has plenty of flaws) and expect them to go through some kind of extensive learning curve. Sure, there could be benefits as you learn something more sophisticated, and more power once you know how to use it, but the computer wouldn't be as ubiquitous today if it weren't so "easy to use" (again, it's not really easy to use yet, but I digress).
I like the sentiment he throws out, and part of the problem he correctly highlights stems from the education we get in technology at school. Back in his day, the computer scientist was really an engineer. Today a computer scientist is a computer scientist, and more often a coder than a traditional engineer. We aren't looking to change the tools, but to use the tools to their optimum.
I could go on, but he's right and he's wrong. Although, when isn't that true?
History is very important!
Yep.. Great Idea like the Tri-nary computer was ignored
-- Tony Fu
Um..Trinary?
Thought not
Die Rats...