int A;
int B;
int C;
C = A % B;
So you compute C from two values you never set: A and B. Uninitialized local variables can hold anything, so their contents are undefined, and so is the result of A % B. Most likely B happens to be 0, and the division the CPU performs for % then raises an arithmetic exception, which is what crashes your program.
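The fix is to give both variables values before using them, and to check the divisor. A minimal corrected sketch (the initial values 7 and 3 are just placeholders, not anything from your code):

    #include <stdio.h>

    int main(void)
    {
        int A = 7;  /* placeholder values; set these however your program needs */
        int B = 3;
        int C;

        if (B != 0) {            /* A % 0 is undefined behavior, so guard it */
            C = A % B;
            printf("C = %d\n", C);
        } else {
            printf("B is zero, cannot compute A %% B\n");
        }
        return 0;
    }

Once both operands are initialized and B is known to be nonzero, the modulo is well-defined and the arithmetic exception goes away.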