Why does this supposedly infinite loop program terminate?


I was talking to my friend about these two pieces of code. He said the Python one terminates but the C++ one doesn't.

Python:

arr = [1, 2, 3]
for i in range(len(arr)):
  arr.append(i)
print("done")

C++:

#include <iostream>
#include <vector>
using namespace std;

int main() {
  vector<int> arr{1,2,3};
  for(int i = 0; i < arr.size(); i++){
    arr.push_back(i);
  }
  cout << "done" << endl;
  return 0;
}

I challenged that and ran it on two computers. The first one ran out of memory (std::bad_alloc) because it only had 4 GB of RAM. My Mac has 12 GB of RAM, and there the program ran and terminated just fine. I thought it wouldn't run forever, because the type returned by vector's size() is an unsigned integer. Since my Mac is 64-bit, I thought it could store 2^(64-2) = 2^62 ints (which is true), but the unsigned integer used for the size seems to be 32 bits for some reason.
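To check what type size() actually returns on a given platform, a quick diagnostic like the following can be run (a sketch; the exact numbers depend on the compiler and target):

#include <iostream>
#include <limits>
#include <vector>

int main() {
  // size() returns std::vector<int>::size_type (normally std::size_t).
  std::cout << "sizeof(size_type): "
            << sizeof(std::vector<int>::size_type) << " bytes\n";
  std::cout << "largest size_type value: "
            << std::numeric_limits<std::vector<int>::size_type>::max() << "\n";
  std::cout << "sizeof(int): " << sizeof(int) << " bytes\n";
}

On a typical 64-bit target this will likely report an 8-byte size_type rather than a 32-bit unsigned int.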

Is this some bug in the C++ compiler, in that it does not make max_size() relative to the system's hardware, so that the overflow causes the program to terminate? Or is it for some other reason?

1 Answer

Bathsheba:

There is no bug in your C++ compiler manifesting itself here.

int is overflowing (due to the i++), and the behaviour of signed integer overflow is undefined. (It's feasible that you'll run out of memory on some platforms before this overflow occurs.) Note that there is no defined behaviour that will make i negative, although wrapping around to a negative value is a common occurrence on machines with 2's complement signed integral types once std::numeric_limits<int>::max() is reached. If i did become -1, say, then i < arr.size() would be false, because the implicit conversion of i to the unsigned type returned by size() turns -1 into a very large positive value.
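To see that implicit conversion in isolation (a small sketch, separate from the undefined overflow itself; the compiler will probably warn about the signed/unsigned comparison):

#include <iostream>
#include <vector>

int main() {
  std::vector<int> arr{1, 2, 3};
  int i = -1;
  // i is converted to the unsigned type of arr.size() before the comparison,
  // so -1 becomes a huge positive value and the loop condition is false.
  std::cout << std::boolalpha << (i < arr.size()) << '\n';  // prints false
}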

The Python version pre-computes range(len(arr)); that is, subsequent appends to arr do not change the range the loop iterates over, so the loop body runs exactly three times.
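A small sketch of the same loop with a counter added makes this visible:

arr = [1, 2, 3]
count = 0
for i in range(len(arr)):   # range(3) is built once, before the loop starts
    arr.append(i)           # growing the list does not grow the range object
    count += 1
print(count)     # 3 -- the body ran exactly the original len(arr) times
print(len(arr))  # 6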