I think the data issue is more a question of “How is it big?” rather than “Is it big?” Mainframes do very well with large amounts of data when the scaling comes from the number of records… I’m not sure how they fare when the size per record scales (which I assume is the case with meteorological data). The older models probably struggle, but with the newer models it would not surprise me at all if they could handle that kind of input. It’s definitely a question worth exploring further.

The reason there isn’t some modern language with fixed point built in is that, for most use cases, floating point is faster and easier. I think it might just be a market-force situation. The programmers who need decimal precision, and need it optimized to the point where a library or module would be a burden, are a pretty small minority.
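To make that tradeoff concrete, here’s a minimal Python sketch (my own illustration, not part of the original thread): binary floating point drifts on simple decimal sums, while the standard-library decimal module gives exact decimal arithmetic, but only as an opt-in import rather than a built-in type.

```python
from decimal import Decimal, getcontext

# Binary floating point cannot represent 0.1 exactly, so small
# errors creep into even trivial sums -- a problem for money and
# other values that must be exact in decimal.
print(0.1 + 0.2)                        # 0.30000000000000004

# The decimal module provides fixed-precision decimal arithmetic,
# but you have to reach for it explicitly and wrap your values.
getcontext().prec = 28
print(Decimal("0.1") + Decimal("0.2"))  # 0.3
```

For the small minority who need this, the library route works fine; the friction is just that every value has to be wrapped and every dependency has to agree to use it.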

Author of Kill It with Fire: Manage Aging Computer Systems (and Future Proof Modern Ones)
