Setting a variable to the integer 1 and outputting it returns 1.0, but setting the same variable to 2 returns 2:
num = 1
num = 2
This can be reproduced from the console:
application.output(num) always results in a real number, while application.output('num ' + num) always shows an integer. Oddly, num entered by itself in the console shows a real if it equals 1 and an integer if it is greater than 1.
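Here's a minimal snippet showing what I mean (the comments are the output I'm seeing in the console):

var num = 1;
application.output(num);          // prints: 1.0   (real)
application.output('num ' + num); // prints: num 1 (integer)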
What's up with this? Is it just the nature of JavaScript to always handle numbers as reals?
thanks,
--Joe.