From Rod: In our Prealgebra course, we have been studying box-and-whisker plots. Recently, we learned how to decide whether a data point is an outlier. The book (Math Thematics, McDougal Littell) gave a process: find the interquartile range, then multiply it by 1.5. We add this number to the upper quartile, and any points above that are considered outliers. We also subtract the number from the lower quartile, and any points below that are outliers as well.
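The procedure described above can be sketched in Python. This is a minimal sketch, not the textbook's exact method: quartile conventions vary, and the `"inclusive"` interpolation used by `statistics.quantiles` here may give slightly different quartiles than the by-hand method in Math Thematics. The `k=1.5` default is the multiplier in question.

```python
from statistics import quantiles

def find_outliers(data, k=1.5):
    """Return the points lying outside the fences Q1 - k*IQR and Q3 + k*IQR."""
    # Quartiles via linear interpolation ("inclusive" treats the smallest
    # and largest data values as the 0th and 100th percentiles).
    q1, _, q3 = quantiles(data, n=4, method="inclusive")
    iqr = q3 - q1                  # interquartile range
    lower_fence = q1 - k * iqr     # anything below this is flagged
    upper_fence = q3 + k * iqr     # anything above this is flagged
    return [x for x in data if x < lower_fence or x > upper_fence]

data = [2, 4, 5, 5, 6, 7, 8, 9, 30]
print(find_outliers(data))         # the textbook rule, k = 1.5
print(find_outliers(data, k=3.0))  # a stricter multiplier flags fewer points
```

Changing `k` only moves the fences, so the same code answers the "what if we used 2 or 1.8 instead?" question directly: a larger multiplier flags fewer points as outliers.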

My question: where does this 1.5 originate? Is it the standard for locating outliers, or could we choose any reasonable number (2 or 1.8, for example) to multiply by the interquartile range? If it is a standard, were outliers simply defined via this process, or did statisticians use empirical evidence to suggest that 1.5 is somehow optimal for deciding whether data points are valid?