
Why does setting a variable to one return 1.0?

PostPosted: Fri Aug 31, 2018 1:13 am
by joe26
Setting a variable to the integer 1 returns 1.0, but setting the same variable to 2 returns 2.

num = 1
num = 2
This can be run from the console.

application.output(num) results in a real number.
num entered directly in the console shows a real number if it equals 1 and an integer if it is greater than one.
application.output(num) always shows a real number.
application.output('num ' + num) always shows an integer.


What's up with this? Is it the nature of JavaScript to always handle numbers as real?

thanks,
--Joe.

Re: Why does setting a variable to one return 1.0?

PostPosted: Fri Aug 31, 2018 1:58 pm
by omar
Hi Joe,

JavaScript Numbers are Always 64-bit Floating Point
Unlike many other programming languages, JavaScript does not define different types of numbers, like integers, short, long, floating-point etc.
JavaScript numbers are always stored as double precision floating point numbers, following the international IEEE 754 standard. (source w3schools.com)

For more info see: https://www.w3schools.com/js/js_numbers.asp
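
For example, in plain JavaScript there is only one Number type, so the literals 1 and 1.0 produce exactly the same value. A quick sketch that can be pasted into the console (using the same application.output call as in your post):

var a = 1;
var b = 1.0;
// Both literals are stored as the same 64-bit floating point value,
// so there is no separate integer type to tell them apart.
application.output(a === b);  // true
application.output(typeof a); // "number"
application.output(typeof b); // "number"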

In your application.output call, string concatenation is used, which causes an implicit conversion of the number to a string. This is also explained in the page linked above.
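
To see the difference between your two calls, compare passing the number directly with converting it to a string first. A rough sketch along the same lines (the exact output text may vary by environment, but the conversion behaviour is standard JavaScript):

var num = 1;
// Passed directly, the value is a 64-bit double, which is why it can be shown as 1.0.
application.output(num);
// Concatenation converts the number using JavaScript's standard number-to-string
// rules, which drop the trailing .0 for whole numbers.
application.output('num ' + num);
// Explicit conversions give the same kind of control:
application.output(String(num));    // "1"
application.output(num.toFixed(1)); // "1.0" - force one decimal place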

Re: Why does setting a variable to one return 1.0?

PostPosted: Fri Aug 31, 2018 2:52 pm
by joe26
Hi Omar,

Thank you.

That helps a lot.