Utility Computing Still The Next Next Big Thing

from the it-doesn't-matter-that-IT-doesn't-matter dept

Computing as a utility has been heralded as the "next big thing" for several years now. Although it has yet to make it big, some, like Nicholas Carr, are convinced that it's finally starting to take off. For Carr, it's the logical extension of the idea that a company's IT setup isn't sufficient to gain any competitive advantage, and is thus unnecessary to keep in house. It's a point he's been making for some time, and moves by Microsoft, Google and Sun would appear to back him up that things are happening now. But companies making moves does not in itself vindicate the vision. While broadband and cheap hardware may make it easier to build a gigantic server farm, other factors may hinder utility computing's ascent. Hackers are moving upstream, attacking centralized points like DNS servers -- concentrated computing "power plants" will make particularly inviting targets. The rise of p2p architecture, as a way of reducing bandwidth congestion, also reinforces the importance of having intelligence and power at the edge of the network. Advocates of utility computing like to draw an analogy to electricity (which is definitely a commodity), but even there the industry is moving in a different direction: instead of building massive power plants, a lot of R&D is focused on distributed, on-site power generation. Also, unlike actual power plants, which are heavily regulated and have enormous barriers to entry, an equivalent computing plant will have very little protection from competition. Instead, companies will be investing a lot of money in rapidly depreciating equipment that will be built and sold for much less in a few years. While there are likely to be some useful applications of the model, it's way too early to say that it's finally arrived. As for the companies betting heavily on it, the bet may turn out to be a costly mistake.



Reader Comments



  • Billy Bones, 8 May 2006 @ 6:04pm

    100 percent hacker free

    The societal blemish that is hackers will always be there... like a plantar wart with roots in your brain. We will never be rid of them... so all we can do is keep finding new ways to make them angry and frustrated.

    Eventually these feebs grow up and find out that hacking isn't a very lucrative career and hopefully they get a real job and stop annoying people. But those that don't will most certainly die white, pasty, and alone for lack of sun and companionship. After all, how many hackers are there that look like Angelina Jolie?


  • Anonymous Coward, 8 May 2006 @ 6:48pm

    Hogwash.


  • Danno, 8 May 2006 @ 6:55pm

    Well, if everyone starts writing distributed, concurrent, fault tolerant code in platform independent languages, sure, computing will become a utility.

    But, uh, that's not going to happen for at least 10 years. Java's stranglehold is still too great, and in any case, it's a hard programming model to fit to most applications.

    Distributed databases (maybe not in the relational sense though) that work well will be the herald though.


  • Mathew Schlabaugh, 8 May 2006 @ 11:23pm

    I think the questions here would be on marketability. How exactly are we going to market with this? Has anyone done any market research to determine the demand for such high-powered computing? For example, say I built the world's biggest airplane... Let's call it the "Spruce Goose"... :-) Sure I might marvel at it... But... Who's buying? Money is the bottom line after all. If someone thinks they can make money by doing this I hope they have done their homework first. I think to speak intelligently about this I would have to see the numbers. This IT model would involve some large risks so for those of you attempting this I cross my fingers for you...


  • Bert Armijo, 10 May 2006 @ 10:32pm

    You're interpreting utility computing very narrowly. For instance, are subscribers to Salesforce.com buying into utility computing? Absolutely. They're using a computing resource they don't own or maintain, which is the very definition of utility computing.

    The narrower question of whether enterprises will utilize offsite resources for computing of internally developed and administered applications is still open to debate. In discussions with CIOs of major Fortune 1000 firms, the idea is VERY appealing to them because it attacks the 70% of their budget locked up in operations, allowing them to devote more to development. Several large hosting and outsourcing firms confirm their large clients are asking for this capability.

    IMHO, what's made many people scoff at the whole concept of utility computing is the accepted (in fact, ingrained) practice of installing software directly on the servers that run it. Systems that make this unnecessary are being shipped now. This means IT departments can now shift applications seamlessly between internal servers and external providers, without rewriting code, downtime, or massive amounts of manual labor.

    Once you can move even large, distributed applications easily between providers, why would you continue to sink massive amounts of money into building your own data centers?


