Not really - Benford's law makes the assumption that the logarithm of the numbers is in some sense uniformly distributed, so you'd expect small leading digits to appear much more often than large ones (a leading 1 covers the interval from 1 to 2, which is about 30% of each decade on a log scale).
What is surprising is that this is a reasonable assumption to make - 'real-life' distributions can be modelled as all sorts of things, and yet it seems that many statistics one might collect are spread fairly evenly, on a logarithmic scale, across several orders of magnitude.
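To see the connection, here's a minimal sketch (the sampling range of 0 to 6 orders of magnitude is just an illustrative choice): draw numbers whose logarithm is uniform, tally the first significant digit of each, and the observed frequencies come out close to Benford's predicted log10(1 + 1/d).

```python
import math
import random

def leading_digit(x):
    # First significant digit of a positive number,
    # found by scaling x into the interval [1, 10).
    return int(x / 10 ** math.floor(math.log10(x)))

random.seed(0)
# Numbers whose log10 is uniform over six orders of magnitude
samples = [10 ** random.uniform(0, 6) for _ in range(100_000)]

counts = [0] * 10
for x in samples:
    counts[leading_digit(x)] += 1

for d in range(1, 10):
    observed = counts[d] / len(samples)
    expected = math.log10(1 + 1 / d)  # Benford's law
    print(f"{d}: observed {observed:.3f}  Benford {expected:.3f}")
```

Digit 1 should come out near 0.301 and digit 9 near 0.046, matching the law.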
It seemed a little odd when I read it the first time a while back in a book on Chaos Theory. But once you think about it for a while it doesn't seem that outrageous.
Apparently it's used in the forensic checking of accounts and such, which is a little sneaky: if the accounts have been fiddled, the leading digits will tend towards an even spread, since that is what the human mind assumes will look random.
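A quick illustration of why fabricated figures stand out under that check (a sketch, not any particular auditing tool): an even 1/9 spread of first digits, the pattern a person inventing "random" numbers tends to produce, sits well away from the Benford frequencies, especially for the digits 1 and 9.

```python
import math

# Benford's expected first-digit frequencies vs. an even 1/9 spread
benford = {d: math.log10(1 + 1 / d) for d in range(1, 10)}
even = 1 / 9

for d in range(1, 10):
    gap = benford[d] - even
    print(f"{d}: Benford {benford[d]:.3f}  even {even:.3f}  gap {gap:+.3f}")
```

A digit-1 frequency near 11% rather than 30% is exactly the sort of deviation that flags a set of accounts for a closer look.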