My understanding is that 'div' gives the quotient for integer division and 'mod' gives the remainder.
So, when 15 is divided by 6, the integer quotient (div) is 2 and the remainder (mod) is 3.
15 div 6 = 2 and 15 mod 6 = 3
3 div 6 = 0 and 3 mod 6 = 3 (the quotient is 0, so the whole 3 is left over as the remainder)
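For what it's worth, Haskell's `div` and `mod` behave exactly this way for non-negative operands, so they make a handy sanity check. A minimal sketch; note the two results always reassemble the dividend via (a div b) * b + (a mod b) == a:

```haskell
-- Sanity check of the examples above; for non-negative operands
-- Haskell's div/mod match the description exactly.
main :: IO ()
main = do
  print (15 `div` 6)                       -- 2 (integer quotient)
  print (15 `mod` 6)                       -- 3 (remainder)
  print (3 `div` 6)                        -- 0
  print (3 `mod` 6)                        -- 3
  -- quotient and remainder reassemble the dividend:
  print ((15 `div` 6) * 6 + (15 `mod` 6))  -- 15
```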
If I'm correct, then 20 mod 0 is undefined because division by zero is not allowed.
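Haskell agrees: evaluating 20 `mod` 0 throws a divide-by-zero exception at runtime. If you'd rather have the undefinedness show up in the result instead of a crash, you can wrap it; safeMod below is just an illustrative name, not a standard library function:

```haskell
-- Make "mod by zero is undefined" explicit instead of a runtime error.
-- (safeMod is a hypothetical helper, not part of the standard library.)
safeMod :: Integer -> Integer -> Maybe Integer
safeMod _ 0 = Nothing       -- no defined answer for a zero divisor
safeMod a b = Just (a `mod` b)

main :: IO ()
main = do
  print (safeMod 20 6)      -- Just 2
  print (safeMod 20 0)      -- Nothing
```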
For me 'minint', or 'min int', is the smallest (therefore, usually negative) integer representable on a particular machine. The value depends on the width of the integer type: on an n-bit two's-complement system it is -2^(n-1), so it differs between 8-bit, 16-bit, 32-bit, 64-bit, etc. configurations.
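Again using Haskell as a concrete illustration: `minBound` on each of the standard fixed-width signed types from Data.Int shows the -2^(n-1) pattern, while plain Int is whatever width the platform provides:

```haskell
import Data.Int (Int8, Int16, Int32, Int64)

-- An n-bit two's-complement signed integer bottoms out at -2^(n-1):
main :: IO ()
main = do
  print (minBound :: Int8)   -- -128                 (-2^7)
  print (minBound :: Int16)  -- -32768               (-2^15)
  print (minBound :: Int32)  -- -2147483648          (-2^31)
  print (minBound :: Int64)  -- -9223372036854775808 (-2^63)
  print (minBound :: Int)    -- platform-dependent
```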
Additions or corrections, anybody?