I know. And if a compiler stores a number in a variable that's classified as an integer, only the part before the decimal point gets stored.
So my variable DMG is an integer in C.
I calculate something, and the result is 3.14
I store it in the variable DMG.
The next time the program calls on my variable, it will read 3.
Maybe that's what's happening here. So 2.5 becomes 2.
3.643796337899 would become 3.
And 146.953479544 would become 146.