The standard deviation of a distribution does not depend on the sample size. For a uniform distribution it is (b - a)/sqrt(12), where a and b are the limits of the distribution. In your case a = 0 and b = 1, so you should expect std = 1/sqrt(12) = 0.288675 for a sample of any size.
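As a quick sanity check (an illustrative sketch; the seed and sample sizes are arbitrary choices), you can watch the sample standard deviation hover around 1/sqrt(12) no matter how many points you draw:

```python
import numpy as np

# The sample standard deviation of uniform(0, 1) draws stays near
# 1/sqrt(12) ~= 0.2887 at every sample size.
np.random.seed(0)  # arbitrary seed, just for reproducibility
for n in (100, 10_000, 1_000_000):
    sample = np.random.uniform(0, 1, n)
    print(n, sample.std())
```

The printed values fluctuate a little for small n, but they do not shrink as n grows.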
Perhaps what you're looking for is the standard error of the mean, which is std/sqrt(N) and does decrease as the sample size N increases:
In [9]: sample = np.random.uniform(0, 1, 100)
In [10]: sample.std()/np.sqrt(sample.size)
Out[10]: 0.029738347511343809
In [11]: sample = np.random.uniform(0, 1, 1000)
In [12]: sample.std()/np.sqrt(sample.size)
Out[12]: 0.0091589707054713591
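For what it's worth, SciPy wraps the same computation in scipy.stats.sem. One caveat: sem defaults to ddof=1 (the unbiased sample estimate), while ndarray.std defaults to ddof=0, so pass ddof=0 if you want it to match the expression above exactly (sketch assumes SciPy is installed):

```python
import numpy as np
from scipy import stats

np.random.seed(0)  # arbitrary seed, just for reproducibility
sample = np.random.uniform(0, 1, 1000)

# stats.sem defaults to ddof=1; ddof=0 reproduces sample.std()/sqrt(N)
print(stats.sem(sample, ddof=0))
print(sample.std() / np.sqrt(sample.size))
```

With ddof=0 the two lines print the same number; with the default ddof=1 they differ slightly (the difference vanishes as N grows).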