> uniquely describe a truly uniformly random number
This is the hard part: picking a uniformly random number "close to" Graham's number.
Graham's number itself is easily described, of course: I can just say "Graham's number". But the numbers "close to" it (say, within +/- 1% of Graham's number) are, almost all of them, impossible to describe.
If you don't believe me, then ship me the hard drives that spell out one such number. The catch: you have to pick it truly at random, and the number of drives required is impossible.
-------------
EDIT:
> Graham's number can be computed in a few lines of code using Knuth's up-arrow notation.
Also, this is wrong. The first number of the pyramid, g1 = 3↑↑↑↑3, can be written directly in up-arrow notation. But even the 2nd number of the pyramid, g2, requires g1 up-arrows to write out, and g1 is already incomprehensibly large (the small tower 3↑↑3 = 3^27 is only ~7.6 trillion, and g1 towers far, far beyond that).

So already g2, and certainly g3 (which requires g2 arrows to describe), involves more up-arrows than there are atoms in this universe. These numbers cannot be written out by a computer program using up-arrow notation alone. And Graham's number is g64, sitting on top of a huge pyramid of such numbers.
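For concreteness: the *recursion* behind up-arrow notation is only a few lines, which is the other commenter's point; it's writing the arrows out literally (or actually running the recursion) that is hopeless. A minimal Python sketch (my own illustration, not from the thread):

```python
def arrow(a: int, n: int, b: int) -> int:
    """Compute a ↑^n b: the hyperoperation with n up-arrows."""
    if n == 1:
        return a ** b          # one arrow is plain exponentiation
    if b == 0:
        return 1               # base case of the hyperoperation recursion
    return arrow(a, n - 1, arrow(a, n, b - 1))

# Small cases finish instantly:
print(arrow(3, 1, 3))   # 3^3 = 27
print(arrow(3, 2, 3))   # 3↑↑3 = 3^(3^3) = 7625597484987
# arrow(3, 4, 3) is g1 -- the definition is correct, but the call
# would never terminate in this universe.
```

The recursive definition is short; the values it denotes are not.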
Again, you're implicitly assuming the representation has to be a string of digits. I'm not sure that I can convince you, or even what to convince you of, as you aren't accepting the premises of the argument. The numbers +/- 1% of Graham's number can be trivially represented with a program. Likewise, here's some people codegolfing programs to output Graham's number[1]. If it seems like I'm cheating by using too powerful of a method, that's the point that the original person was making: there exist real numbers that can't be described even like this.
> Again, you're implicitly assuming the representation has to be a string of digits
No. I'm implicitly assuming that uniquely naming each of ~g64 different numbers requires descriptions of at least log(g64)/log(alphabet size) symbols (for almost all of them, by pigeonhole). Even if every atom in this universe could store a byte (256 distinct states), all the atoms in the universe lined up still couldn't uniquely describe the entire set of numbers in g64 +/- 1%.
Let's say you have a program that accurately describes g64; good job. Now g64 - 1. Now g64 + 1. Now... g64 + 2. Etc. etc. I get that you can represent some of these numbers in a compact form. But I'm asking for something far more difficult.
How many symbols do you need before you can describe every number in g64 +/- (1% of g64)? Well... a lot, it turns out. The programs you'd have to write would fill all the hard drives in the universe before you're done writing even a smidgen of them.
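The pigeonhole step can be made concrete: descriptions are strings, and there are only so many short strings over any alphabet. A toy sketch (my own, with made-up small sizes) counting how many bounded-length descriptions exist at all:

```python
def num_descriptions(k: int, n: int) -> int:
    """Number of strings over a k-symbol alphabet of length <= n."""
    return sum(k ** i for i in range(n + 1))

# With 256 symbols (bytes) and descriptions of at most 4 bytes,
# fewer than 256^5 strings exist in total:
print(num_descriptions(256, 4) < 256 ** 5)   # True
# So no scheme, however clever, can give each of 256^5 + 1 distinct
# numbers its own description of at most 4 bytes -- pigeonhole.
```

The same count with "4 bytes" replaced by "every atom in the universe" is the argument above: the set g64 +/- 1% has more members than there are possible descriptions.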
-------
And my challenge is effectively closer to describing a random number between 50% of g64 and 100% of g64. Almost every such number requires about log(g64) / log(alphabet size) symbols, which is easily beyond the number of atoms, or even the number of arrangements of atoms, in this universe. That's just the innate limitation of language in general.
--------
The numbers in g64 +/- 1% are so numerous that there's no way programs written on any kind of hard drive can single them out. The sheer number of numbers destroys the capabilities of all the hard drives in the world. It doesn't matter what magical notation you use; there are too many of them to describe.
-----------
EDIT: Or maybe this is easier to think about? All programs of size 1 MB or smaller can represent at most 2^(8 * 10^6) distinct numbers, one per distinct program.
All programs of size 1000 zettabytes (10^24 bytes) or smaller can represent at most 2^(8 * 10^24) distinct numbers.

As it turns out, 2^(8 * 10^24) is far smaller than 1% of g64. So you cannot possibly represent the numbers g64 +/- 1% uniquely, even with 1000 zettabytes worth of storage.
If we extend this out by another few hundred orders of magnitude, multiplying the storage by 10^300 or so, the count of representable numbers is still smaller than the size of the range g64 +/- 1%. So even if we used all the atoms of the universe as one giant, intergalactic hard drive, we still couldn't represent this set of numbers.
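A back-of-envelope check with logarithms (my figures, not from the thread; using 3↑↑3 = 3^27 = 7,625,597,484,987): even 3↑↑5, which is microscopic next to g1, already exceeds the number of distinct programs that fit in 10^24 bytes.

```python
import math

# log10 of the number of distinct 1000-zettabyte programs, 2^(8 * 10^24):
log10_programs = 8 * 10**24 * math.log10(2)          # ~2.4e24 digits

# 3↑↑4 = 3^(3↑↑3) = 3^7625597484987, so its log10 is:
log10_3tower4 = 7_625_597_484_987 * math.log10(3)    # ~3.6e12

# 3↑↑5 = 3^(3↑↑4), so log10(3↑↑5) = (3↑↑4) * log10(3).
# Compare the two counts via a second logarithm:
loglog_3tower5 = log10_3tower4 + math.log10(math.log10(3))
loglog_programs = math.log10(log10_programs)

# loglog_3tower5 is ~3.6e12 while loglog_programs is ~24, so
# 3↑↑5 utterly dwarfs the count of all 1000-zettabyte programs:
print(loglog_3tower5 > loglog_programs)   # True
```

And 3↑↑5 is nothing: g1 = 3↑↑↑↑3 towers beyond it, and g64 sits 63 levels of that recursion higher still.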