Some difficult cases in the language need explanation; http://nitlanguage.org/manual/basic_type.html leaves much open.
`print 1000*1000*1000` results in -73741824 on a 32-bit system, which will be surprising for many programmers. I see 8-, 16-, and 32-bit integers, but nothing like bignums or 64-bit integers?
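A minimal workaround sketch, assuming `Int::to_f` behaves as the core library suggests, is to compute in Float and give up exactness:

```nit
# Sketch only: compute in Float to avoid the Int wrap-around on 32-bit builds
# (assumes Int provides to_f, as the core library seems to).
var n = 1000.to_f
print n * n * n   # prints the Float result instead of a wrapped negative Int
```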
```nit
var a: Numeric
a = 1
print a
a = 1.0
print a
```
works as expected, whereas
```nit
var a: Numeric
a = 1.0
print a + 1
```
gives an error. Is that intended? Is there some clever trick to achieve behaviour similar to that of most other languages?
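A hedged sketch of a possible workaround, assuming `+` is declared on `Int` and `Float` but not on the abstract `Numeric`, is to recover the concrete type with an `isa` check first:

```nit
var a: Numeric
a = 1.0
# Adaptive typing narrows `a` inside each isa branch, so `+` resolves again.
if a isa Float then print a + 1.0
if a isa Int then print a + 1
```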
`1.0/0.0` and `1/0` are handled differently. This is so in most other languages, but does it make much sense? In a typed language, a numeric expression returning a "not-a-number" value is a paradox, and maybe #2314 could go.
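For illustration, this is the asymmetry I mean (my assumption of the current behaviour; the Int case presumably aborts at runtime):

```nit
print 1.0 / 0.0   # Float division follows IEEE 754 and prints an infinity
# print 1 / 0     # Int division by zero presumably aborts instead
```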