I'm interested in the mathematical consequences of randomness when randomness is defined as follows. Represent an infinite sequence of digits by a function f from the natural numbers N to the set {0,1,2,3,4,5,6,7,8,9}.

Suppose we say that the infinite sequence f is random if no algorithm exists that, given n, outputs f(n). For example, the digits of pi look random, yet many elementary formulas represent the numerical value of pi exactly. Thus pi can, in principle, be computed to arbitrary accuracy, so it is not random under this definition.
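To make the claim about pi concrete, here is a minimal sketch showing that its digits are algorithmically generated; it uses Machin's formula pi/4 = 4 arctan(1/5) - arctan(1/239) with exact integer arithmetic (the function names and the choice of 10 guard digits are my own, not part of the question):

```python
def arccot(x, unity):
    """arctan(1/x) scaled by `unity`, summed via the Taylor series."""
    total = term = unity // x
    n, sign = 3, -1
    while term:
        term //= x * x               # next power of 1/x in the series
        total += sign * (term // n)
        n, sign = n + 2, -sign
    return total

def pi_digits(digits):
    """First `digits` digits of pi after the 3, returned as one integer."""
    unity = 10 ** (digits + 10)      # 10 guard digits absorb truncation error
    pi = 4 * (4 * arccot(5, unity) - arccot(239, unity))
    return pi // 10 ** 10            # integer: 3 followed by `digits` digits

print(pi_digits(20))  # 314159265358979323846
```

Any such algorithm witnesses that pi's digit sequence is non-random in the sense defined above: f(n) is computable for every n.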

Question: Can it be proven that mathematically random sequences exist with this definition?

As John von Neumann humorously said, ``Anyone who attempts to generate random numbers by deterministic means is, of course, living in a state of sin.''

The criterion I have specified for non-randomness is clearly deterministic. But how do we know that truly random sequences exist?
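One standard route to an affirmative answer is counting: there are only countably many algorithms (each is a finite string), while there are uncountably many digit sequences, so non-computable sequences must exist. The uncountability step is Cantor's diagonal argument, which the following sketch illustrates on a hypothetical finite list of generators (the real argument applies the same construction to an enumeration of all algorithms):

```python
def diagonal(generators):
    """A sequence f with f(n) != generators[n](n) for each listed n."""
    return lambda n: (generators[n](n) + 1) % 10

# Three example digit sequences (illustrative choices, not from the question).
gens = [
    lambda n: 0,             # the all-zeros sequence
    lambda n: n % 10,        # 0, 1, 2, ..., 9, 0, 1, ...
    lambda n: (n * n) % 10,  # last digit of n^2
]

f = diagonal(gens)
for i in range(len(gens)):
    # f disagrees with the i-th listed sequence at position i,
    # so f is not any of the listed sequences.
    assert f(i) != gens[i](i)
```

Note the caveat: the diagonal of an enumeration of all computable sequences is itself not computable (no algorithm can enumerate exactly the computable sequences), which is precisely why the construction yields a sequence that is random in the sense defined above, rather than a contradiction.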