The "decimal floats" specified in JSON are exactly what is wrong with it; Javascript approaches numerics like Applesoft basic did, you have floats that can often stand in for ints. The JSON specification promises one thing, but it's not supported by the 'reference implementation' that it is based on.

Also it is a lose-lose situation.

Not only are floats wrong in many ways (e.g. I think 0.1 + 0.2 = 0.30000000000000004 makes many people decide "computing is not for me"), but parsing floats (really, parsing any ASCII numbers) is astonishingly slow, and people get "frog boiled" into accepting it. There is no problem with the speed of parsing 10 floats, but parsing a million of them is a real problem, and a million floats is just 4 MB of core, well within what a web app or anything bigger than an 8-bit microcontroller handles routinely.
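
Both complaints are easy to reproduce. A rough sketch (assuming Node.js; the input is a synthetic comma-separated list rather than real JSON, and the timings will vary by machine):

    // The classic rounding surprise: 0.1 and 0.2 have no exact binary representation.
    console.log(0.1 + 0.2); // 0.30000000000000004

    // Rough cost comparison: a million values as ASCII text vs. raw IEEE-754 doubles.
    const n = 1_000_000;
    const text = Array.from({ length: n }, (_, i) => (i * 0.001).toString()).join(",");
    const doubles = new Float64Array(n).map((_, i) => i * 0.001);

    let t = performance.now();
    const fromText = text.split(",").map(Number); // parse every value from its decimal string
    console.log(`ASCII parse: ${(performance.now() - t).toFixed(1)} ms`);

    t = performance.now();
    const fromBinary = new Float64Array(doubles); // a plain memory copy, no parsing at all
    console.log(`binary copy: ${(performance.now() - t).toFixed(1)} ms`);

    console.log(fromText.length === fromBinary.length); // true: same values, wildly different cost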

Like the numerous small problems that keep programmers from using the SIMD instructions in the Intel CPUs they already own, there are multiple problems with floats. Each one can be dismissed by apologists, but when you add them up it's a major drag on the industry.



You're still complaining about floats, and now JavaScript, neither of which is relevant to JSON. JSON is perhaps named poorly, but it merely "was inspired by the object literals of JavaScript" (quoting the spec); there is no reference implementation, and it is defined by its spec.

I also don't really see what alternative you're implying. If you want a human-readable format but need to express non-integer numbers, what do you suggest instead?
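
The only general-purpose workaround I know of is to quote exact values as strings in the JSON and convert them at the edges with an exact type, which is more convention than solution. A minimal sketch (treating the value as money and using BigInt cents purely as an illustration; a real codebase might reach for a decimal library):

    // Carry the exact decimal as a string; no binary float is ever involved.
    const doc = '{"price": "19.99", "currency": "USD"}';
    const order = JSON.parse(doc);

    // Convert "19.99" into an exact integer number of cents.
    const [units, frac = ""] = order.price.split(".");
    const cents = BigInt(units) * 100n + BigInt(frac.padEnd(2, "0").slice(0, 2));
    console.log(cents); // 1999n -- exact, no rounding anywhere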



