I'm guessing the reason it's not often used in programming is slower performance and larger memory requirements. I typically see rationals (a numerator/denominator pair) used in arbitrary-precision libraries.
> ..Arithmetic with rational numbers can become unwieldy very quickly: 1/99 − 1/100 = 1/9900, and if 1/101 is then added, the result is 10001/999900.
https://en.wikipedia.org/wiki/Arbitrary-precision_arithmetic
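The growth the quote describes is easy to reproduce with Python's `fractions.Fraction`, which stores an exact, automatically reduced numerator/denominator pair:

```python
from fractions import Fraction

# Exact rational arithmetic: results are kept as reduced num/den pairs.
a = Fraction(1, 99) - Fraction(1, 100)
print(a)  # 1/9900

b = a + Fraction(1, 101)
print(b)  # 10001/999900 -- the denominator has already grown to six digits
```

After just three terms the denominator is 999900; a longer chain of sums like this (e.g. a harmonic series) makes the numerators and denominators grow rapidly, which is where the performance and memory cost comes from.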