
Ints and floats in jsarguments

February 9, 2009 | 12:25 am

I am trying to distinguish between integers and floating-point numbers typed into my [js] object as arguments at instantiation. I have no problem doing this when the floats have digits after the decimal point, e.g. "6.789", but when I want a floating-point representation of a whole number, e.g. "3.", it seems to get parsed as an integer.

The function below is the one I'm using to test the type of a number. It searches for a decimal point, and if none is found the number is treated as an integer. I thought this would work, but somewhere along the way the decimal point is disappearing and I'm not sure why. Is there any way to stop this happening? Or am I simply doing something wrong along the way? Any help would be appreciated.

lh

// int or float test
function testargs(x)
{
    if (typeof x == "number")
    {
        var numtostring = "" + x; // coerce the number to a string
        if (numtostring.indexOf(".") == -1)
            post(x + " is an integer\n");
        else
            post(x + " is a float\n");
    }
}
// end
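The reason the test above can never succeed can be seen outside Max as well: JavaScript has a single Number type, so "3." and "3" are the same value, and stringifying a whole-number float never produces a decimal point. A minimal sketch (plain JS, runnable in Node rather than the [js] object):

```javascript
// JavaScript has only one Number type: 3. and 3 are the same value,
// so the decimal point is gone before any string test can see it.
var a = 3.;   // typed as a "float"
var b = 3;    // typed as an "int"

console.log(a === b);        // true: identical values
console.log("" + a);         // "3": no decimal point survives
console.log("" + 6.789);     // "6.789": a fractional part is kept
```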


February 15, 2009 | 7:11 pm

I assume this is not possible, seeing as no one has replied and my other attempts have failed. I can't find a "proper" way around this, so I've implemented it by passing the values as symbols, i.e. "3.0", and I'm gonna test this and hope for the best.

lh
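The symbol workaround can be sketched in ordinary JavaScript: accept the argument as a string and decide int vs. float yourself before converting. `parsenum` here is a hypothetical helper for illustration, not part of the Max API:

```javascript
// Sketch of the symbol workaround: accept the argument as a string
// ("3.0", "6.789", "5") and classify it before converting to a number.
// parsenum is a hypothetical name, not a Max or standard function.
function parsenum(s) {
    if (s.indexOf(".") === -1)
        return { kind: "int", value: parseInt(s, 10) };
    else
        return { kind: "float", value: parseFloat(s) };
}

console.log(parsenum("3.0"));   // { kind: 'float', value: 3 }
console.log(parsenum("5"));     // { kind: 'int', value: 5 }
```

The type information survives because it travels in the string itself, before Max (or JS) ever coerces the value to a number.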


February 15, 2009 | 8:36 pm

AFAIK there's no way to tell the difference, because by the time you read the arguments from the jsarguments array they are already plain JavaScript numbers.
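To illustrate: once a value is a JS number, the most you can recover is whether it is a whole number, not whether it was typed with a decimal point. A small sketch (the helper name `iswhole` is made up for this example):

```javascript
// Once a value is a JS number, you can only ask whether it is whole,
// not whether it was originally typed with a decimal point.
function iswhole(x) {
    return typeof x === "number" && x % 1 === 0;
}

console.log(iswhole(3.0));    // true: "3." is indistinguishable from 3
console.log(iswhole(6.789));  // false
```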


February 16, 2009 | 11:09 am

Cheers for clearing that up, Emmanuel. I've been dissecting the JavaScript in your ejies utilities; it's very helpful, thanks for sharing it.

lh

