Setting a variable to the integer 1 returns 1.0, but setting the same variable to 2 returns 2.
num = 1
num = 2
This can be run from the console.
application.output(num) results in a real number;
entering num in the console shows a real number if it equals 1, and an integer if it is greater than one.
application.output(num) always shows a real number,
while application.output('num ' + num) always shows an integer.
What's up with this? Is it the nature of JavaScript to always handle numbers as real?
JavaScript Numbers are Always 64-bit Floating Point
Unlike many other programming languages, JavaScript does not define different types of numbers, like integers, short, long, floating-point etc.
JavaScript numbers are always stored as double precision floating point numbers, following the international IEEE 754 standard. (source w3schools.com)
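A minimal sketch of this in plain JavaScript (using console.log where the original uses Servoy's application.output):

```javascript
// All JavaScript numbers are IEEE 754 doubles, so an integer literal
// and its floating-point equivalent are the exact same value.
var num = 1;
console.log(num === 1.0);           // true: 1 and 1.0 are identical
console.log(typeof num);            // "number" — no separate int type
console.log(Number.isInteger(1.0)); // true: 1.0 has no fractional part
```

Whether the console displays "1" or "1.0" is purely a formatting choice of the tool; the stored value is the same either way.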
In your application.output call, string concatenation is used, which implicitly converts the number to a string. This is also explained on the page referenced above.
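The string conversion is what hides the decimal: when a number is stringified, a trailing .0 is dropped. A short illustration (again using console.log in place of application.output):

```javascript
var num = 1.0;
// Implicit conversion: concatenating with a string stringifies the number,
// and 1.0 stringifies to "1" with no trailing ".0".
console.log('num ' + num);          // "num 1"
console.log(typeof ('num ' + num)); // "string", not "number"
// Explicit formatting if you actually want a decimal shown:
console.log(num.toFixed(1));        // "1.0"
```

So application.output('num ' + num) prints what looks like an integer simply because the concatenated result is a string, formatted without the decimal point.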