MSVC 2019 and 2022 compiler hangs and is erratic when compiling relatively large std::complex arrays in C++17 and C++20
Using the latest MSVC 2019 and 2022 C++ compilers, the following code hangs while compiling:
#include <complex>
// const int N = 10000;   // about 3 secs
// const int N = 40000;   // about 40 secs
const int N = 200000;     // hangs (over 10 min)
// const int N = 2000000; // normal, 1 sec
std::complex<double> x[N];
int main()
{
    for (size_t i = 0; i < 200; i++)
        x[i] = std::complex<double>(1, 2);
}
Oddly, it does compile at the other sizes, at the very different speeds shown, including much larger sizes. It looks like a bug, and I have reported it. It works fine if the data is in a std::vector of the same size. Any idea what is causing this? My concern is that this may pop up elsewhere or produce bad compilations.
Solution 1:[1]
Move the variable inside main and you'll see a fast compile; but don't panic if it crashes while debugging. Currently you have declared a huge static object, and the compiler is most probably trying to allocate disk space for it. Check the size of the final generated binary (.exe file): expect roughly 16×N bytes, since sizeof(std::complex<double>) is 16. You may find compiler optimization switches that sacrifice execution speed in favor of memory or executable size. And this big file is not alone. There are a bunch of steps before generation of the final binary that - thanks to you - deal with some more gigantic files, including the object (.obj) file. This simple (?) program is taking up gigabytes of your storage.
Edit:
After a certain threshold - which I guess can be tuned with a compiler switch - the compiler resorts to dynamically allocating memory for the huge static object; from the programmer's perspective, however, it is still a static object.
Anyhow, such huge objects belong neither in static storage nor in automatic storage. Put them in dynamic storage instead, via a std::vector, a std::unique_ptr, or some other owning object:
// needs <complex>, <memory>, <vector>
std::vector<std::complex<double>> bigvec(bigN);               // parentheses: bigN zero-valued elements, not {bigN, 0.0}
auto bigptr = std::make_unique<std::complex<double>[]>(bigN); // value-initialized (all zeros)
Because in the end, one way or another, huge objects impose run-time overhead for initialization. I can't imagine how the OS is supposed to load and run a multi-gigabyte executable; that just creeps me out.
Sources
This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.
Source: Stack Overflow