Damien Broderick wrote,
> > The creators of the device said that the trillion cells, acting
> > together, can perform a billion operations per second, with 99.8
> > percent accuracy.
>
> I might be misunderstanding this, but...
>
> Damn! You wouldn't want to run that puppy for five minutes--you'd be down
> to 54.8 percent accuracy. :)
You are right about the raw number of errors. (Compounding 99.8 percent
once per second for five minutes gives 0.998^300, or about 54.8 percent,
so the arithmetic in the joke checks out.) However, errors are not a
problem if you can detect them.
Error-correcting algorithms, using checksums and parity bits, for example,
can detect and correct these errors as they occur, so we don't end up
using bad data. Networking systems can have very high error rates in some
cases, but that does not cause bad data to be transmitted. The receiver
verifies each packet's checksum, and error-correcting codes can repair a
small number of flipped bits. If there are too many errors to correct, the
packet is retransmitted. If too many retransmissions fail to get the data
through, an error tells the user that the file was not transmitted. So
although raw error rates can be high, we never actually use the bad data.
The transfer either works or it doesn't, but we shouldn't (theoretically)
get poor accuracy.
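
To make the detect-and-retransmit loop concrete, here is a minimal Python
sketch. Everything in it is my own toy illustration, not any particular
protocol: the additive checksum, the channel model, and the names
(checksum, noisy_send, transmit, flip_prob) are all made up, and real
links use stronger CRCs plus forward error correction. The control flow
is the point: verified data is accepted, unverifiable data is retried,
and after enough failures an error surfaces to the user instead of
silently handing over corrupt data.

    import random

    def checksum(data: bytes) -> int:
        # Toy additive checksum; real links use CRCs, but the idea is the same.
        return sum(data) % 256

    def noisy_send(frame: bytes, flip_prob: float = 0.01) -> bytes:
        # Toy channel: each byte gets one random bit flipped with probability flip_prob.
        out = bytearray(frame)
        for i in range(len(out)):
            if random.random() < flip_prob:
                out[i] ^= 1 << random.randrange(8)
        return bytes(out)

    def transmit(payload: bytes, max_retries: int = 5) -> bytes:
        frame = payload + bytes([checksum(payload)])  # sender appends the checksum
        for _ in range(max_retries):
            received = noisy_send(frame)
            body, check = received[:-1], received[-1]
            if checksum(body) == check:
                return body  # verified; the receiver accepts the data
            # mismatch: request a retransmission instead of using the bad data
        raise IOError("too many retransmissions; file was not transmitted")

    print(transmit(b"hello, extropians"))

(A weak checksum like this one can be fooled by some multi-bit error
patterns, which is exactly why real protocols use CRCs, but even this
sketch shows why a high raw error rate doesn't translate into bad data
reaching the user.)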
--
Harvey Newstrom <www.HarveyNewstrom.com>
Principal Security Consultant, Newstaff Inc. <www.Newstaff.com>
Board of Directors, Extropy Institute <www.Extropy.org>
Cofounder, Pro-Act <www.ProgressAction.org>