Typically C# programmers prefer the C-style keywords for numeric types, which are easier to read. We show that the Int16, Int32, and Int64 struct types are equivalent to the more commonly used short, int, and long keywords.
Type information

Int16 -> short
Int32 -> int
Int64 -> long
Example. This program declares Int16, Int32, and Int64 variables. It then prints the Type object for each at runtime. It does the same for the short, int, and long keywords.
Next: The program's output shows that Int16 is the same runtime type as short, Int32 the same as int, and Int64 the same as long.
C# program that uses Int16, Int32, Int64 types

using System;

class Program
{
    static void Main()
    {
        {
            Int16 a = 1;
            Int32 b = 1;
            Int64 c = 1;
            Console.WriteLine(a.GetType());
            Console.WriteLine(b.GetType());
            Console.WriteLine(c.GetType());
        }
        {
            short a = 1;
            int b = 1;
            long c = 1;
            Console.WriteLine(a.GetType());
            Console.WriteLine(b.GetType());
            Console.WriteLine(c.GetType());
        }
    }
}

Output

System.Int16
System.Int32
System.Int64
System.Int16
System.Int32
System.Int64
Discussion. Should you prefer Int16 to short, Int32 to int, or Int64 to long? Usually not. Conventions favor the short, int, and long keywords. But in programs where the width in bytes is important, you might prefer Int16, Int32, or Int64 for clarity.
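When byte width matters, the sizeof operator can confirm it directly; a minimal sketch:

```csharp
using System;

class Sizes
{
    static void Main()
    {
        // sizeof reports the width of each built-in numeric type in bytes.
        Console.WriteLine(sizeof(short)); // Int16: 2 bytes
        Console.WriteLine(sizeof(int));   // Int32: 4 bytes
        Console.WriteLine(sizeof(long));  // Int64: 8 bytes
    }
}
```

The 16, 32, and 64 in the struct names state these widths in bits, which is why some programs spell them out.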
Tip: In methods where you use int variables, substituting Int32 is a poor choice, as the int keyword is the standard spelling.
Summary. Int16, Int32, and Int64 are important. The short, int, and long keywords are aliases for them. From the perspective of the runtime, programs that specify short, int, and long are actually using Int16, Int32, and Int64.
However: For the most readable programs, the int keyword is usually preferred over Int32.
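Because the keywords are aliases, a typeof comparison confirms the equivalence without printing anything; a minimal sketch:

```csharp
using System;

class Aliases
{
    static void Main()
    {
        // Each keyword and its struct name refer to the exact same type.
        Console.WriteLine(typeof(short) == typeof(Int16)); // True
        Console.WriteLine(typeof(int) == typeof(Int32));   // True
        Console.WriteLine(typeof(long) == typeof(Int64));  // True
    }
}
```

The compiler treats the two spellings identically, so there is no runtime difference, only a style difference.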