I think the data issue is more a question of “How is it big?” than “Is it big?” Mainframes do very well with large amounts of data when the data scales in the number of records… I’m not sure how they fare when the size per record scales (which I assume is the case with meteorological data). The older models probably struggle, but the newer ones… it would not surprise me at all if they could handle that kind of input. It’s definitely a question worth exploring further.
The reason there isn’t some modern language with fixed point built in is that for most use cases floating point is faster and easier. I think it might just be a market-force situation. The number of programmers who need decimal precision, and need it optimized enough that a library or module would be a burden, is a pretty small minority.
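To make the precision tradeoff concrete, here’s a quick sketch in Python (chosen just for illustration) using the standard-library `decimal` module, the kind of library that serves the minority who need exact decimal arithmetic:

```python
from decimal import Decimal

# Binary floating point cannot represent 0.1 exactly, so repeated
# addition drifts away from the exact answer:
float_total = sum(0.1 for _ in range(10))
print(float_total)  # 0.9999999999999999, not 1.0

# A decimal library does exact base-10 arithmetic, at the cost of
# speed and the ceremony of wrapping every literal:
decimal_total = sum(Decimal("0.1") for _ in range(10))
print(decimal_total)  # 1.0
```

That wrapping ceremony is exactly the “burden” in question: finance and similar domains accept it, while everyone else happily takes the faster hardware floats.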