Algorithm for determining acceptable variance


I have an application comparing numbers from 2 different reports. It alerts the user when the variance between the 2 numbers is over a threshold. I was going to use a 10% threshold, but realized that when the count is e.g. 10,000, a variance of 10% is too high (meaning 999 off would already be excessive), while when the count is 10, a 10% variance is too low (2-3 out of 10 is an acceptable variance).

I just can't figure out how to do it, besides hard-coding tiers like:

if countA <= 10: acceptableRate = 20%
if countA > 10:  acceptableRate = 15%
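A minimal sketch of the hard-coded tiers described above (the helper names `acceptable_rate` and `is_excessive` are hypothetical, not from the original post):

```python
def acceptable_rate(count):
    # Hard-coded tiers from the question: 20% for small counts, 15% otherwise.
    if count <= 10:
        return 0.20
    return 0.15

def is_excessive(a, b):
    # Flag when the relative variance between the two report numbers
    # exceeds the tier's acceptable rate. Using the larger number as
    # the reference is an assumption; the question leaves this open.
    reference = max(a, b)
    if reference == 0:
        return False
    return abs(a - b) / reference > acceptable_rate(reference)

print(is_excessive(10, 13))    # 3/13 ≈ 23% > 15% -> True
print(is_excessive(100, 110))  # 10/110 ≈ 9% <= 15% -> False
```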

Does anyone know how to explain what I'm trying to do here mathematically, and how it could be implemented? I'm sure this is a simple question for someone better at math than I am.

Besides the formulas mentioned in previous answers, consider using a power of n (where n is the reference number from one report or the other) as the tolerance. Here is Python code, with results, illustrating several different powers of n:

```python
j = 10
for i in range(6):
    print('{:8} {:8.1f} {:8.1f} {:8.1f} {:8.1f}'.format(j, j**0.33, j**.35, j**.37, j**.39))
    j *= 10
```

```
      10      2.1      2.2      2.3      2.5
     100      4.6      5.0      5.5      6.0
    1000      9.8     11.2     12.9     14.8
   10000     20.9     25.1     30.2     36.3
  100000     44.7     56.2     70.8     89.1
 1000000     95.5    125.9    166.0    218.8
```
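To turn the table above into an actual check, the tolerance n**p can be compared against the absolute difference between the two numbers. A sketch, assuming p = 0.35 and using the larger number as n (the answer leaves both choices open; `flag_variance` is a hypothetical name):

```python
def flag_variance(a, b, p=0.35):
    # Power-of-n tolerance: the allowed absolute difference grows as n**p,
    # so large counts tolerate bigger absolute gaps but smaller relative ones.
    n = max(a, b)
    return abs(a - b) > n ** p

print(flag_variance(10, 13))          # diff 3 > 13**0.35 ≈ 2.5 -> True
print(flag_variance(10000, 10020))    # diff 20 < ~25.1 -> False
print(flag_variance(10000, 10030))    # diff 30 > ~25.1 -> True
```

With p around 0.33-0.39 this matches the behavior the question wants: about 2-3 allowed at a count of 10, but only a few dozen allowed at 10,000 instead of a flat 10%.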
