Question

I don't mean programming mistakes, which are ultimately made by a human, but errors the machine itself makes when performing something as simple as adding two numbers.

For what range of x should I expect a mistake when computing 1/x?


Solution

In terms of CPUs, there are three possible sources of mistakes which seem to be in the scope of your question:

  1. Floating point rounding errors. This seems to be what you are getting at with your division example. This type of error is completely deterministic, not random at all! However, if the programming language you are using leaves floating point behaviour underspecified, you may get different errors on different computers (see the sketch just after this list).
  2. Design mistakes in a CPU, such as the infamous Intel Pentium FDIV bug. It's hard to put a probability on this, but fortunately modern CPUs are extensively tested, and even formal methods are used to mathematically prove their correctness to some extent.
  3. Hardware errors caused by radiation, such as cosmic rays. Unless you put your computer inside the reactor of a nuclear power station or something, the probability of errors caused by radiation should generally be negligible. Interestingly, this is actually relevant to certain programming techniques, such as hashing in revision control systems. You can make the argument "Well, it's more likely that we get an error due to a cosmic ray than a hash collision, so it's not worth worrying about the possibility of a hash collision" (a rough estimate of that collision probability is sketched further below).
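To make point 1 concrete, here is a minimal sketch (Python, chosen here purely for illustration; the original answer contains no code) showing that the rounding in an expression like 1/x is bounded to half a unit in the last place under IEEE 754 and is perfectly repeatable:

```python
# Floating point rounding is deterministic: the same operation on the same
# inputs always rounds the same way under IEEE 754 double precision.

x = 3.0
print(1.0 / x)            # 0.3333333333333333 -- the double nearest to 1/3
print((1.0 / x) * x)      # 1.0 -- here the rounding happens to cancel out
print(0.1 + 0.2)          # 0.30000000000000004 -- 0.1 and 0.2 are not exactly representable
print(0.1 + 0.2 == 0.3)   # False, and it is False every time you run it

# The "error" never varies from run to run: it is not a malfunction but the
# defined result of rounding each operation to 53 bits of precision.
assert all(0.1 + 0.2 == 0.30000000000000004 for _ in range(1_000_000))
```

Whatever x you pick, 1/x returns the representable value nearest the true quotient every single time; the error is a property of the number format, not a random fault.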

Other components of a computer, such as storage devices and display devices, are much, much more likely than a CPU to exhibit hardware errors leading to data corruption.
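To put a rough number on the hash-collision argument in point 3: by the birthday bound, the probability of at least one collision among n uniformly random k-bit hashes is approximately n^2 / 2^(k+1). The figures below (160-bit SHA-1 as used by classic Git, and a deliberately generous one billion hashed objects) are illustrative assumptions rather than anything from the original answer:

```python
# Birthday-bound estimate of the chance of any hash collision.
# Assumptions for illustration only: 160-bit hashes (SHA-1, as in classic Git)
# and a billion hashed objects, far more than most repositories ever hold.

n = 10**9            # number of hashed objects (assumed)
k = 160              # hash width in bits (SHA-1)

p_collision = n**2 / 2**(k + 1)
print(f"collision probability ~ {p_collision:.1e}")   # about 3e-31
```

A probability on the order of 10^-31 is vastly smaller than the chance of a radiation-induced fault corrupting the same data over any realistic timescale, which is exactly the point of the argument.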

OTHER TIPS

Following on from @Robin Green's answer, there are actually a few more potential causes of hardware errors besides cosmic rays:

  • Electrical Noise: Thermal noise is present in all electronic circuits, as are effects such as inductive coupling.
  • Quantum events: As features on semiconductors become ever smaller (particularly gate dielectrics) and the number of electrons involved in each state change shrinks, the finite (but small) probability of electrons being in a high-energy state and upsetting a logic state becomes significant.

There are design solutions to all of these problems, but they come at a price we might not want to accept in terms of size, power consumption, and integration density. Radiation-hardened semiconductors are notable for their low integration density, relatively low performance, and high cost.

It's also worth noting that in communications and storage, hardware errors are commonplace; rather than trying to prevent them in the first place, the strategy is to recover from them with error detection and correction techniques, as sketched below.
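As an illustration of that recover-rather-than-prevent strategy, here is a minimal sketch of the simplest possible error-correcting code, a threefold repetition code with majority-vote decoding (Python again, purely illustrative; real links and storage devices use far more efficient codes such as Hamming, Reed-Solomon, or LDPC):

```python
# Error correction by triple redundancy (a repetition code): store every bit
# three times and take a majority vote on read, which survives any single
# flipped copy per group of three.

def encode(bits):
    """Store each bit three times."""
    return [b for bit in bits for b in (bit, bit, bit)]

def decode(coded):
    """Majority vote over each group of three copies."""
    return [1 if sum(coded[i:i + 3]) >= 2 else 0 for i in range(0, len(coded), 3)]

data = [1, 0, 1, 1, 0, 0, 1, 0]
stored = encode(data)

stored[4] ^= 1                    # simulate a single bit flip in storage

assert decode(stored) == data     # the original data is still recovered
print("recovered:", decode(stored))
```

The trade-off is the same one real systems face: the repetition code triples the storage cost to tolerate one flipped copy per bit, whereas the codes used in practice achieve comparable protection with far less overhead.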

I have this conversation with a friend of mine who is very anti-automation - he's a train driver...

"How many mistakes does your PC make when you boot it up? How many decisions does it make during that process? How many mistakes have you made when driving a train?"

(3 in 9 years if you're interested)

Yes, there will be the odd read error due to marginal design and media ageing at some point, but does a computer make a mistake or do humans just cut corners?

Is a cosmic ray a mistake by the computer or the designer? I suspect in the future computers will become complex enough to make what we term a "mistake", but they would need to exhibit their own intent to properly be guilty of that charge.

Never. One reason is that the notion of a 'mistake' is a human category and does not apply to machines. Computers are stupid (it's human programming that makes them look clever), and they cannot fail.

Machines act in accordance with their construction and, in the case of computers, in accordance with the programs they're running. And this is always deterministic - otherwise it would mean that some laws of nature are broken and the whole of human science is, well, some kind of guesswork.

The outcome may not always be what humans expected, but that is always explainable by human factors. There simply is no (and can be no) such thing as a 'computer mistake'.

...otherwise it would mean that some laws of nature are broken and the whole of human science is, well, some kind of guesswork.

If you look into this, then indeed the whole of human science actually is some kind of guesswork, to a certain degree. There are NO ABSOLUTE facts known about anything yet, just approximations and best guesses. Even the very core of science and physics is a faulty model. Off by a very small factor, but faulty nevertheless.

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow