Benford's Law and Where It Came From

1371 Words | Nov 13, 2014 | 6 Pages
According to the Oxford dictionary, Benford's law is the principle that in any large, randomly produced set of natural numbers, such as tables of logarithms or corporate sales statistics, around 30 percent will begin with the digit 1, 18 percent with 2, and so on, with the smallest percentage beginning with 9. The law is applied in testing the validity of statistics and financial records.

Benford's law is a mathematical theory of leading digits that was discovered by the American astronomer Simon Newcomb. In 1881 he noticed that the pages of a book of logarithm tables beginning with the digit 1 were more worn than the pages dealing with higher digits, which looked cleaner and newer by comparison. He calculated that the probability that a number has any particular non-zero first digit is:

P(d) = log10(1 + 1/d)

where d is one of the digits 1, 2, 3, 4, 5, 6, 7, 8, or 9, and P(d) is the probability. Using that formula he concluded that the digits do not appear with equal frequency: the digit 1 appears as the first digit about 30 % of the time, as opposed to the digit 9, which appears less than 5 % of the time. However, he did not provide any theoretical explanation for the phenomenon he described, and it was soon forgotten.

In 1938, Frank Benford, a physicist, also noticed the nonuniform distribution of leading digits. He attempted to test his hypothesis by collecting and analyzing his own data. After making over 20,000 observations, he noticed that the numbers fell into a
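As a small illustration of the formula above (a sketch added here, not part of the original essay), the Benford probabilities for each leading digit can be computed directly in Python. The function name `benford_probability` is chosen for illustration only:

```python
import math

def benford_probability(d):
    """Probability that a number's leading digit is d, per Benford's law:
    P(d) = log10(1 + 1/d) for d in 1..9."""
    if d not in range(1, 10):
        raise ValueError("leading digit must be between 1 and 9")
    return math.log10(1 + 1 / d)

# Probabilities for all nine possible leading digits.
probs = {d: benford_probability(d) for d in range(1, 10)}

# The digit 1 leads about 30.1 % of the time; the digit 9 under 4.6 %,
# and the nine probabilities sum to 1.
print(f"P(1) = {probs[1]:.4f}")  # ~0.3010
print(f"P(9) = {probs[9]:.4f}")  # ~0.0458
```

Note that the probabilities decrease monotonically from digit 1 to digit 9, which matches the wear pattern Newcomb observed in the logarithm tables.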
