Tech Optimism Is Back

from the wave-2.0 dept

It looks like tech optimism is back in fashion. Here come the glowing press reports about "the next big thing." USA Today starts us off by noticing that the internet is now poised to go to a new level. We've pretty much reached the point on the web that matched many of the initial predictions, which has people wondering where we go from here. USA Today notes that the pieces have been laid down over the past few years to expand the internet into a much more useful tool: wireless devices and technologies extend the web beyond the computer, the core infrastructure of the internet is now a commodity, the concept of standard web services is being accepted, and the tools to build internet businesses are now understood and cheap. All that's left is creating the innovative services and applications. Of course, not everything is about the internet. Business Week is running a big issue on the Innovation Economy, with a ton of articles about what they expect will be our new age of innovation over the next 75 years, covering biotech to nanotech to infotech to energy. If you've been down about the prospects for technology going forward, read through some of this and realize there's still plenty of innovation on the way.



Reader Comments



  1. dorpus, 1 Oct 2004 @ 12:36pm

    Will we ever have decimal computers?

    For the foreseeable future, computers will continue to store data represented in 0's and 1's. Nobody has seriously talked about storing data as digits ranging from 0 to 9. And for the foreseeable future, humans will continue to use the decimal number system.

    The problem with binary computing is that the decimal number "0.1" cannot be accurately stored in memory.

    Computers can represent fractional numbers as negative powers of 2, e.g.

    0.5 = 1/2
    0.75 = 1/2 + 1/4
    0.375 = 0/2 + 1/4 + 1/8
    etc.

    However, it is mathematically impossible to represent the decimal number 0.1 as an exact sum of powers of 2.

    Proof by contradiction:

    Suppose 0.1 can be represented as a finite sum of powers of 2, namely

    n(1)/2 + n(2)/4 + n(3)/8 + ... n(N)/2^N = 1/10

    where each n(.) is either 0 or 1.

    The above equation is equivalent to

    n(N) + 2*n(N-1) + 4*n(N-2) + ... + 2^(N-1)*n(1) = (2^N)/10

    The left side of this equation adds up to some integer, whereas the right side cannot be an integer:

    (2^N)/10 = 2^(N-1)/5

    A power of 2 is never divisible by 5, since 2 and 5 are distinct primes.

    This is a contradiction; therefore 0.1 cannot be represented as a finite sum of negative powers of 2.


    This has vast implications for computer science: computers cannot be fully trusted with calculations involving decimal fractions. Computer makers compensate by carrying many binary digits, but small errors remain; the value actually stored for 0.1 in a double-precision float is something like 0.1000000000000000055511151231257827.

    So yes, if there is a vast conspiracy by the CIA, Trilateral Commission, Microsoft, etc. to hide the truth, it is that computers do make math mistakes. If you add up millions of small numbers, errors do accumulate.
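
    To see this concretely, here is a minimal Python sketch (any language using IEEE 754 binary floating point behaves the same way; the exact digits printed can vary slightly by platform):

    from decimal import Decimal

    # The nearest double-precision value to 0.1 is slightly above 0.1.
    print(Decimal(0.1))
    # 0.1000000000000000055511151231257827021181583404541015625

    # Adding up a million small numbers lets the tiny error accumulate.
    total = 0.0
    for _ in range(1_000_000):
        total += 0.1
    print(total)              # e.g. 100000.00000133288 -- close to, but not exactly, 100000.0
    print(total == 100000.0)  # False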


  2. Gumby, 1 Oct 2004 @ 11:33pm

    Re: Will we ever have decimal computers?

    >> computers cannot be fully trusted with calculations involving decimal-point numbers.
    Actually, just as you can program computers to spell-check text, you can program them to perform exact decimal arithmetic. It's slower than native binary, but if you want your money to add up, it is fairly straightforward to accomplish.
    This is usually more of an issue for financial systems, where pennies count, than for scientific systems, where all bases are more or less equally valid.
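
    For instance, Python's standard decimal module does base-10 arithmetic in software; a quick sketch of the difference (the amounts are made up, the behavior is not):

    from decimal import Decimal

    # Native binary floating point: three dimes don't quite make thirty cents.
    print(0.1 + 0.1 + 0.1 == 0.3)                    # False

    # Software decimal arithmetic: build values from strings so no binary
    # rounding ever happens, and the sums stay exact (just slower).
    dime = Decimal("0.10")
    print(dime + dime + dime == Decimal("0.30"))     # True
    print(sum(Decimal("0.10") for _ in range(1_000_000)))  # 100000.00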


  3. dorpus, 2 Oct 2004 @ 12:01am

    Re: Will we ever have decimal computers?

    Still, people have to remember that programs need to perform such tricks behind the scenes. I demonstrated this fact today to people who have advanced degrees in computer science or related fields, and they didn't remember it right away.


  4. xman, 2 Oct 2004 @ 1:18am

    Re: Will we ever have decimal computers?

    The first computers were decimal (especially if you go back as far as Babbage); we switched to binary because 10-state electronic devices are hard to build.


  5. Raymond, 11 Mar 2005 @ 4:52pm

    Re: Will we ever have decimal computers?

    Actually, 10-state electronic devices are not necessary for decimal computing.
    It takes roughly 3.3 times as many traces on a chip (since 2^3.3 ≈ 10) to represent decimal numbers with binary devices, but the devices themselves stay binary.

    Binary devices aren't even the most efficient use of electronics.
    A wire (or trace) can actually carry three states: positive voltage, zero voltage, and negative voltage.
    Imagine logic circuits built from single-pole, double-throw switches instead of s.p.s.t. switches and you will see what I mean.
    Trinary logic circuits, as this arrangement is called, have been designed but, as far as I know, never implemented.
    That's too bad, since they are inherently the most efficient use of chip space.

    One doesn't have to use a trinary number system with trinary logic circuits.
    Ordinary binary numbers can be used with signed digits, allowing arithmetic without the two's-complement system.
    The mixed signed-digit combinations can be used to represent everything other than numbers: words, instructions, addresses, etc.
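
    The signed-digit idea is easiest to see in balanced ternary, where each digit is -1, 0, or +1, matching the minus/zero/plus voltage levels above. A toy Python encoder, purely for illustration and making no claims about real hardware:

    def to_balanced_ternary(n: int) -> list[int]:
        """Encode an integer as balanced-ternary digits, least significant first.

        Each digit is -1, 0, or +1 -- the three states a minus/zero/plus
        voltage line could carry. The sign lives in the digits themselves,
        so negative numbers need no two's-complement trick.
        """
        if n == 0:
            return [0]
        digits = []
        while n != 0:
            r = n % 3
            if r == 0:
                digits.append(0)
            elif r == 1:
                digits.append(1)
            else:               # r == 2 stands for a digit of -1, carry 1 upward
                digits.append(-1)
                n += 3
            n //= 3
        return digits

    def from_balanced_ternary(digits: list[int]) -> int:
        """Decode least-significant-first balanced-ternary digits back to an int."""
        return sum(d * 3 ** i for i, d in enumerate(digits))

    print(to_balanced_ternary(10))                           # [1, 0, 1]
    print(from_balanced_ternary(to_balanced_ternary(-19)))   # -19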


  6. Raymond, 11 Mar 2005 @ 4:58pm

    Re: Will we ever have decimal computers?

    It's a fact that the translation from decimal arithmetic to binary arithmetic and back again entails inaccuracies.
    The problem comes in the rounding.
    There are situations where only binary coded decimal is used because of this problem.
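
    Binary-coded decimal simply keeps each decimal digit in its own 4-bit group, so the value is never turned into a binary fraction at all. A toy packing routine in Python, for illustration only:

    def pack_bcd(digits: str) -> bytes:
        """Pack a string of decimal digits into packed BCD, two digits per byte."""
        if len(digits) % 2:
            digits = "0" + digits                      # pad to an even length
        return bytes(
            (int(digits[i]) << 4) | int(digits[i + 1])
            for i in range(0, len(digits), 2)
        )

    def unpack_bcd(data: bytes) -> str:
        """Recover the decimal digit string from packed BCD bytes."""
        return "".join(f"{b >> 4}{b & 0x0F}" for b in data)

    packed = pack_bcd("1995")
    print(packed.hex())        # 1995 -- each hex nibble is one decimal digit
    print(unpack_bcd(packed))  # 1995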

    I'd recommend building decimal computers again, since the hardware is now advanced enough to build decimal machines with acceptable performance, but the binary system really is superior to the decimal one.
    If we switched to the binary system generally, there'd be no need to memorize a multiplication table with 45 entries!


