Insertion sort stops working after a million inputs

Your program runs in O(n²) time, since it has two nested loops that both depend on the size of the input. So once you go from 10,000 to 1,000,000 elements, the input is 100 times larger and the program takes about 100² = 10,000 times longer to complete. On top of that, your dataset may have fit in the processor's cache before, but with 1,000,000 elements it no longer does, which slows things down even more.
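
For reference, here is what those two nested loops typically look like. This is a minimal sketch of a standard insertion sort, assuming your implementation follows the usual pattern; it is not your actual code:

```python
# Minimal insertion sort sketch: the outer loop visits every element,
# and the inner loop shifts it left past larger elements. In the worst
# case each of the n elements is compared against O(n) others, hence O(n^2).
def insertion_sort(a):
    for i in range(1, len(a)):          # outer loop: n - 1 passes
        key = a[i]
        j = i - 1
        while j >= 0 and a[j] > key:    # inner loop: up to i shifts per pass
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key
    return a
```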

O(n²) algorithms get very slow very quickly. With an input size of 10⁶, your program needs on the order of (10⁶)² = 10¹² basic steps to complete. A processor runs at a few times 10⁹ operations per second at most, and each step of your algorithm certainly costs more than one machine operation, so the program will take well over 10³ seconds (roughly a quarter of an hour or more) to finish.
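
If you want to check the quadratic scaling empirically instead of relying on the back-of-the-envelope estimate, you can time the sketch above on two small sizes and extrapolate. The sizes and the random input below are arbitrary choices for illustration, and the snippet reuses the `insertion_sort` function from the earlier sketch:

```python
# Rough empirical check of O(n^2) scaling: doubling the input size should
# roughly quadruple the running time, and the measured times let you
# extrapolate to n = 10**6.
import random
import time

def measure(n):
    data = [random.random() for _ in range(n)]
    start = time.perf_counter()
    insertion_sort(data)               # the sketch from above
    return time.perf_counter() - start

t_small, t_big = measure(2_000), measure(4_000)
print(f"4,000 vs 2,000 elements: {t_big / t_small:.1f}x slower")   # expect ~4x
print(f"extrapolated time for 10**6 elements: "
      f"{t_big * (1_000_000 / 4_000) ** 2 / 3600:.1f} hours")
```

If the ratio between the two measurements is close to 4, the quadratic model fits, and the extrapolated figure shows why a million elements is simply out of reach for this approach.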
