I'm curious why the IFC specifies default, short, and long precisions for fundamental types, but not long long?
The IFC writes out:
long long = int with Bit64 precision
long      = int with Long precision
int       = int with Default precision
This seems like an odd discrepancy (the precision of some builtin types is represented via an explicit bit width, while others use a label)?
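For reference, here is a minimal sketch of the encoding as I understand it, paraphrasing the spec's fundamental-type definitions from memory; the enumerator and field names below follow my reading of the spec and may not match it exactly:

```cpp
#include <cstdint>

// A fundamental type is described by a (basis, precision, sign) triple.

enum class TypeBasis : std::uint8_t {
    Void, Bool, Char, Wchar_t, Int, Float, Double, // ...
};

enum class TypePrecision : std::uint8_t {
    Default,  // plain `int` -- a label
    Short,    // `short`     -- a label
    Long,     // `long`      -- a label
    Bit8,
    Bit16,
    Bit32,
    Bit64,    // `long long` -- a bit width, not a label
    Bit128,
};

enum class TypeSign : std::uint8_t { Plain, Signed, Unsigned };

struct FundamentalType {
    TypeBasis     basis;      // Int for all three cases below
    TypePrecision precision;  // where the asymmetry shows up
    TypeSign      sign;
    std::uint8_t  unused;
};

// The mapping in question:
constexpr FundamentalType int_type       {TypeBasis::Int, TypePrecision::Default, TypeSign::Plain, 0};
constexpr FundamentalType long_type      {TypeBasis::Int, TypePrecision::Long,    TypeSign::Plain, 0};
constexpr FundamentalType long_long_type {TypeBasis::Int, TypePrecision::Bit64,   TypeSign::Plain, 0};
```

So `short` and `long` get dedicated labels alongside `Default`, while `long long` is the one standard integer type expressed through the bit-width enumerators.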