Ints and floats in jsarguments
Feb 09 2009 | 12:25 am
I am trying to distinguish between integers and floating-point numbers typed into my [js] object as arguments at instantiation. I have no problem when the floats have digits after the decimal point, e.g. "6.789", but when I want a floating-point representation of a whole number, e.g. "3.", the number seems to be parsed as an integer.
The function below is the one I'm using to test the type of a number. It searches for a decimal point, and if none is found the number is treated as an integer. I thought this would work, but somewhere along the way the decimal point is disappearing and I'm not sure why. Is there any way to stop this happening? Or am I simply doing something wrong along the way? Any help would be appreciated.
lh
// int or float test
function testargs(x)
{
    if (typeof(x) == "number")
    {
        // prepend a character to force string conversion,
        // then look for a decimal point in the result
        var numtostring = "a" + x;
        if (numtostring.indexOf(".") == -1)
            post(x + " is an integer\n");
        else
            post(x + " is a float\n");
    }
}
// end
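
For context, this is roughly how I'm calling the test at instantiation, assuming the arguments arrive through the jsarguments array (jsarguments[0] being the script name):

// call testargs on each typed-in argument at load time
for (var i = 1; i < jsarguments.length; i++)
    testargs(jsarguments[i]);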