Ints and floats in jsarguments

Feb 9, 2009 at 12:25am


I am trying to distinguish between integers and floating-point numbers typed into my [js] object as arguments at instantiation. I have no problem doing this when the floats have digits after the decimal point, e.g. "6.789", but when I want a floating-point representation of a whole number, e.g. "3.", it seems to get parsed as an integer.

The function below is the one I'm using to test the type of a number. It searches for a decimal point; if none exists, the number is treated as an integer. I thought this would work, but somewhere along the way the decimal point disappears and I'm not sure why. Is there any way to stop this happening, or am I simply doing something wrong? Any help would be appreciated.

lh

// int or float test
function testargs(x)
{
    if (typeof(x) == "number")
    {
        // prepend a letter to force string conversion, then look for a decimal point
        var numtostring = "a" + x;
        if (numtostring.indexOf(".") == -1)
            post(x + " is an integer\n");
        else
            post(x + " is a float\n");
    }
}
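The disappearing decimal point can be reproduced in plain JavaScript, outside Max entirely (a minimal sketch, runnable in Node, with `console.log` standing in for Max's `post`): a literal like `3.` is stored as the number 3, and converting it back to a string yields "3" with no decimal point, so the `indexOf` test reports an integer.

```javascript
// The literal 3. and the integer 3 are the same JavaScript value,
// so the string round-trip loses the trailing decimal point.
var a = 3.;      // parsed as the number 3
var b = 6.789;   // fractional part survives

console.log(String(a));              // "3"  -> looks like an integer
console.log(String(b));              // "6.789"
console.log(String(a).indexOf(".")); // -1, so the test says "integer"
```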
// end

Feb 15, 2009 at 7:11pm

I assume this is not possible, seeing as no one has replied and my other attempts have failed. I'm not sure I can find a "proper" way around this, but I've implemented it using symbols, i.e. "3.0", and I'm going to test this and hope for the best.
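A sketch of that symbol workaround (hedged: it assumes the argument actually reaches the script as a string such as "3.0" rather than a number, and the helper name `argKind` is mine, not from this thread):

```javascript
// Hypothetical helper: accept either a real number or a symbol/string
// such as "3.0", and report which kind of number the caller intended.
function argKind(x) {
  if (typeof x === "string") {
    // a string like "3.0" keeps its decimal point, so we can see it
    return x.indexOf(".") !== -1 ? "float" : "int";
  }
  if (typeof x === "number") {
    // a bare number has already lost the distinction; guess from the value
    return x === Math.floor(x) ? "int" : "float";
  }
  return "other";
}
```

With strings the intent survives (`argKind("3.0")` is "float"), whereas a bare whole-number float can only ever be guessed as "int".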

lh

Feb 15, 2009 at 8:36pm

AFAIK there's no way to tell the difference, because by the time you read them from the jsarguments array they are already numbers in the JS sense.
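This matches the language itself, not just Max: JavaScript has a single numeric type (a double-precision float), so 3 and 3.0 are literally the same value and nothing downstream can recover which spelling was typed.

```javascript
// One number type in JavaScript: int and float literals are indistinguishable.
console.log(typeof 3);    // "number"
console.log(typeof 3.0);  // "number"
console.log(3 === 3.0);   // true
```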

Feb 16, 2009 at 11:09am

Cheers for clearing that up, Emmanuel. I've been dissecting the JavaScript files in your ejies utilities; they're very helpful, thanks for sharing them.

lh

