I just read this article about a bug in a sorting/searching algorithm that has been dormant for 50 years:
TL;DR -- the bug is in this line of code:
int mid = (low + high) / 2;
...which "sets mid to the average of low and high, truncated down to the nearest integer." The bug occurs when the sum (low + high) exceeds the maximum positive int value, 2^31 - 1: the addition overflows to a large negative number, and mid comes out negative.
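The article's suggested fix is to compute the midpoint without ever summing the two bounds. A minimal Java sketch of the bug and the fix (class and method names are mine, for illustration):

```java
public class MidpointOverflow {

    // Naive midpoint: low + high can exceed Integer.MAX_VALUE and wrap negative
    static int naiveMid(int low, int high) {
        return (low + high) / 2;
    }

    // Safe midpoint: high - low never overflows when 0 <= low <= high
    static int safeMid(int low, int high) {
        return low + (high - low) / 2;
    }

    public static void main(String[] args) {
        int low = 1;
        int high = Integer.MAX_VALUE;
        // The sum wraps around to a negative value, so the "midpoint" is negative
        System.out.println(naiveMid(low, high)); // -1073741824
        // The subtraction-based form gives the correct midpoint
        System.out.println(safeMid(low, high));  // 1073741824
    }
}
```

The article also mentions `(low + high) >>> 1` as an alternative: the unsigned right shift treats the wrapped sum as an unsigned 32-bit value, so the division by two still lands on the right answer.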
As data sets get bigger and bigger, these kinds of errors are going to crop up more and more frequently.