This is odd, and I can easily work around it, but if I run this AppleScript:
random number from 1 to 1000 as text
The result isn’t text, it’s a real.
It’s easy to avoid with parens (below), but I’m just not sure why it’s doing that. The values it returns are between 1 and 1000, so it is clearly seeing the 1000.
(random number from 1 to 1000) as text
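Checking the class of each form shows what I mean (a quick sketch of what I’m seeing here):
class of (random number from 1 to 1000 as text) --> real
class of ((random number from 1 to 1000) as text) --> text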
I guess your first example is interpreted like this:
random number from 1 to (1000 as text)
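If that reading is right, spelling the parentheses out should reproduce the real result (a quick check, assuming Standard Additions treats both forms the same way):
class of (random number from 1 to (1000 as text)) --> real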
As for the explanation, I’m not 100% sure, but you’re calling the function random number with two parameters, from and to. Each parameter is parsed as an expression, so the parser combines 1000 as text into a single expression for the to parameter.
I got the function syntax from Script Debugger (Window > Dictionary, select Scripting Additions, and search for random number):
set theResult to random number number ¬
from number ¬
to number ¬
with seed integer
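For reference, the parameter variants behave roughly like this (a sketch; the ranges are per the Standard Additions dictionary):
random number --> a real between 0.0 and 1.0
random number 10 --> an integer between 0 and 10
random number from 5 to 15 --> an integer between 5 and 15
random number from 5 to 15 with seed 42 --> reproducible for a given nonzero seed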
random number only returns an integer if its direct parameter is an integer or if both the from and to parameters are integers. My guess is that although (1000 as text) is probably coerced back to an integer for the number value, it’s not an integer at the point where its class is tested.
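That rule is easy to see by varying the class of the bounds (sketch):
class of (random number from 1 to 1000) --> integer (both bounds are integers)
class of (random number from 1 to 1000.0) --> real (one non-integer bound is enough)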
OK, thanks, I think I understand it now.
AppleScript normally coerces text to integer (not real):
set myNum to (1000 as text) as number
class of myNum --> integer
set myNum to (1000.0 as text) as number
class of myNum --> real
So this is an odd behavior of the random number command.
Maybe you’re going too quick there. AppleScript’s number conversion is smart: it converts to integer when it can, and otherwise it converts to real:
set myNum to 123.4 as text
log myNum -->"123.4" (locale depending)
log class of myNum --> text
set myNum to myNum as number
log myNum --> 123.4
log class of myNum --> real
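And the same round trip starting from an integer literal lands back on integer (same sketch, other branch):
set myNum to 123 as text
log class of myNum --> text
set myNum to myNum as number
log myNum --> 123
log class of myNum --> integer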