In R, a double that exceeds the largest representable finite value (about 1.8 * 10^308, see .Machine$double.xmax) overflows to infinity (Inf). This can produce wrong results even when the mathematically correct answer is well defined. For example:
> 10^500 * 0
[1] NaN
Multiplying any finite number by 0 gives 0, so the mathematically correct result here is 0. Multiplying infinity by 0, however, is genuinely undefined, so in the following case the result is correct:
> Inf * 0
[1] NaN
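The overflow threshold can be checked directly in an R session; the values in the comments below are what IEEE 754 double precision implies:

```r
# Largest finite double R can represent (IEEE 754 binary64)
.Machine$double.xmax     # about 1.797693e+308

# Stepping past the threshold overflows to Inf
.Machine$double.xmax * 2 # Inf
is.finite(10^308)        # TRUE: still below the threshold
is.finite(10^309)        # FALSE: overflowed to Inf
```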
I propose introducing a new special value (similar to Inf, NA and NaN), say Huge. It would represent a number too large for R to handle, without asserting that the number is actually infinite. The proposed differences and connections between Inf and Huge are the following:
Huge * 0 = 0
(Inf * 0 = NaN)
Inf > Huge = TRUE
Huge > Huge = NA
Huge == Huge = NA
Inf - Huge = Inf
(Inf - Inf = NaN)
All other operations involving Huge would behave the same way as they do for Inf.
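For illustration, the proposed semantics could be prototyped in plain R. The sentinel object, the S3 class name "huge" and the helper huge_mul below are my own inventions for this sketch, not part of any package (sign handling and the other arithmetic rules are omitted for brevity):

```r
# A sentinel object standing in for the proposed Huge value
Huge <- structure(list(), class = "huge")

# Multiplication following the proposed rule Huge * 0 = 0
huge_mul <- function(x, y) {
  if (inherits(x, "huge")) {
    if (is.numeric(y) && y == 0) return(0)  # Huge * 0 = 0, unlike Inf * 0 = NaN
    return(Huge)                            # Huge times anything nonzero stays Huge
  }
  x * y
}

huge_mul(Huge, 0)  # 0
```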
Similarly, a Tiny value could be introduced to handle underflow of values very close to 0.
If you need to work with very large numbers in practice, the Brobdingnag package already offers a more sophisticated solution.
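Brobdingnag represents a number by the logarithm of its magnitude, so 10^500 is representable and its product with 0 comes out as an exact zero rather than NaN. A minimal sketch, assuming the package is installed from CRAN:

```r
library(Brobdingnag)

x <- as.brob(10)^500  # representable: stored internally as 500 * log(10)
x * as.brob(0)        # zero (log magnitude -Inf), not NaN
```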