What is the difference between conversion specifiers %i and %d in formatted IO functions (*printf / *scanf)


What is the difference between %d and %i when used as format specifiers in printf?

8/21/2017 7:26:12 AM

Accepted Answer

They are the same when used for output, e.g. with printf.

However, they differ when used as input specifiers, e.g. with scanf: %d scans an integer strictly as a signed decimal number, while %i defaults to decimal but also accepts hexadecimal (if preceded by 0x) and octal (if preceded by 0).

So 033 would be 27 with %i but 33 with %d.

8/15/2019 11:49:31 AM

These are identical for printf but different for scanf. For printf, both %d and %i designate a signed decimal integer. For scanf, both match a signed integer, but %i interprets the input as hexadecimal if it is preceded by 0x, as octal if it is preceded by 0, and as decimal otherwise.

Licensed under: CC-BY-SA with attribution
Not affiliated with: Stack Overflow