My last posting described a situation in which the Java programming language knowingly produces wrong results. In the example I gave, Java added two positive numbers, produced a negative result and didn't consider it an error. Specifically:
I write this blog for a general audience, so I opted to leave out the technical details of how and why this happens. But if you're not a computer programmer (the official term now being "developer"), it may seem inconceivable that a programming language can't do addition. Here, in a brief detour into nerdville, I'll try to explain it.
You can think of the problem as two pounds of baloney in a one pound bag (the reference being to an episode of the Honeymooners where Ralph gets stuck between two large pipes).
There are two kinds of programming languages, typed and untyped. In a typed language, such as Java, programmers are required to specify a data type for each variable. The numbers in the example were assigned the "int" (short for integer) data type (the actual Java code is in the prior posting).
A number of the "int" type in Java can range from -2,147,483,648 up to 2,147,483,647. Another type, called "short", is used for integers up to 32,767. Still smaller integers can be assigned to the "byte" type, which maxes out at 127. See Primitive Data Types for more.
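You don't have to memorize these limits; Java exposes them as built-in constants. A minimal sketch:

```java
public class IntLimits {
    public static void main(String[] args) {
        // The limits quoted above are built into the language as constants.
        System.out.println(Integer.MAX_VALUE); // 2147483647
        System.out.println(Integer.MIN_VALUE); // -2147483648
        System.out.println(Short.MAX_VALUE);   // 32767
        System.out.println(Byte.MAX_VALUE);    // 127
    }
}
```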
Java stores "int" variables using 32 binary digits (bits). A binary digit is either a zero or a one. Everything to do with computers boils down to a bunch of bits at the lowest level.
The leftmost bit of a Java "int" variable represents the sign; the remaining 31 bits are the number itself. If the leftmost bit is zero, the number is positive; if it's a one, the number is negative. To illustrate, this is what a positive three and a negative three look like.
positive three: 00000000000000000000000000000011
negative three: 11111111111111111111111111111101
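You can ask Java to show you these bit patterns itself, using the standard Integer.toBinaryString method:

```java
public class SignBit {
    public static void main(String[] args) {
        // Integer.toBinaryString shows the raw bits of an int.
        // Leading zeros are dropped, so positive three prints as just "11".
        System.out.println(Integer.toBinaryString(3));
        // Negative three prints all 32 bits, starting with the sign bit of 1.
        System.out.println(Integer.toBinaryString(-3)); // 11111111111111111111111111111101
    }
}
```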
For the sake of simplicity, we can ignore the details of how negative numbers are represented (Java uses a scheme called two's complement), other than the fact that they start with a one bit.
At this point you can see that the mistake Java makes is easily detectable. If you add two "int" type numbers that each have a zero in the leftmost bit, then the correct result must also have a zero in the leftmost bit. In other words, adding two positive numbers should always yield a positive number.
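That sign check is simple enough to write yourself. Here is a sketch (the helper method name is my own, not part of Java):

```java
public class OverflowCheck {
    // Hypothetical helper: detects overflow when adding two ints,
    // using the sign-bit argument described above.
    static boolean additionOverflows(int a, int b) {
        int sum = a + b;
        // Two positives yielding a negative (or two negatives
        // yielding a non-negative) means the result wrapped around.
        return (a >= 0 && b >= 0 && sum < 0)
            || (a < 0 && b < 0 && sum >= 0);
    }

    public static void main(String[] args) {
        System.out.println(additionOverflows(2000000000, 2000000000)); // true
        System.out.println(additionOverflows(1, 2)); // false
    }
}
```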
Where exactly did Java go wrong?
In the decimal number system the largest value that fits in three digits is 999, which is also 10 to the 3rd power minus one. The same formula applies to the binary number system. The largest value that fits in 31 binary digits is 2 raised to the 31st power minus one.
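The same formula can be checked in Java itself, provided you use the wider 64-bit "long" type so the intermediate value has room to exist:

```java
public class PowerOfTwo {
    public static void main(String[] args) {
        // Shifting 1 left by 31 places computes 2 to the 31st power,
        // done in a 64-bit long so it doesn't overflow along the way.
        long twoToThe31 = 1L << 31;          // 2147483648
        System.out.println(twoToThe31 - 1);  // 2147483647
        // That matches the largest possible Java int.
        System.out.println(Integer.MAX_VALUE);
    }
}
```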
You can see this using the calculator built into Windows. Change the view from standard to scientific. It's helpful to also turn on digit grouping, another option under the View menu.
Click on 2, then the pink x-to-the-power-y button, 31, and equals. Subtracting one yields the largest possible integer in a Java "int" variable: 2,147,483,647.
Now click the Bin (for binary) radio button. The calculator shows 31 binary digits, all ones (see above). This is the binary equivalent of all 9s in the decimal number system.
To see the two pounds of baloney in the one pound bag, add one to this binary number.
Much like adding 1 to 99 results in 100 (an extra digit is needed and the low order digits are all zero), this results in a number that needs an extra binary digit on the left, and the remaining binary digits are all zero.
This is where Java goes wrong. While it does the addition exactly like the Windows calculator, it then maps the result back to the "int" data type. Thus, it considers this sequence of bits a negative number because the bit on the left is a one.
In other words, Java adds the two numbers as if they were 32-bit numbers. But they are not; they are 31-bit numbers with a sign bit on the left. Oops.
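The wrap-around is easy to reproduce: adding one to the largest "int" lands on the smallest one, and the bits come out exactly as described, a one followed by 31 zeros.

```java
public class WrapAround {
    public static void main(String[] args) {
        int biggest = Integer.MAX_VALUE; // 2147483647: a zero sign bit, then 31 ones
        int sum = biggest + 1;
        // Java keeps only 32 bits and reads the leftmost as the sign,
        // so the result is the most negative int.
        System.out.println(sum); // -2147483648
        System.out.println(Integer.toBinaryString(sum)); // 10000000000000000000000000000000
    }
}
```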
This is Java addition at the breaking point:
You can see this dynamically at the Inner Int Java Applet by Bill Venners, author of Inside the Java Virtual Machine.
To be clear, this is a Java issue. The results are the same on Windows, Linux, Mac OS X and the many other operating systems that provide a Java Runtime Environment (JRE).
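If you need the true sum, one standard workaround is to widen the operands to "long" before adding, so the arithmetic is done with 64 bits of room. A sketch:

```java
public class WidenFirst {
    public static void main(String[] args) {
        int a = 2147483647;
        int b = 1;
        // Cast one operand to long; the other is promoted automatically,
        // so the addition happens in 64 bits and nothing wraps around.
        long sum = (long) a + b;
        System.out.println(sum); // 2147483648, the correct answer
    }
}
```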
What was the mindset during Java's development that considered returning wrong results better than raising an error condition? Java treats division by zero as an error, yet willingly allows integers to overflow, so that you can add two positive numbers and get a negative result.
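The contrast is easy to demonstrate. Division by zero throws an ArithmeticException, while plain overflow passes silently; Java did eventually offer an opt-in checked version, Math.addExact (added in Java 8, well after the language's design), which throws the same exception on overflow.

```java
public class CheckedAdd {
    public static void main(String[] args) {
        // Division by zero is treated as an error...
        try {
            int q = 1 / 0;
        } catch (ArithmeticException e) {
            System.out.println("division: " + e.getMessage());
        }
        // ...and Math.addExact (Java 8+) finally treats overflow the same way,
        // instead of silently returning a negative number.
        try {
            int s = Math.addExact(Integer.MAX_VALUE, 1);
        } catch (ArithmeticException e) {
            System.out.println("addition: " + e.getMessage());
        }
    }
}
```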
At the very least, computers should be able to compute.