Detection Limits and Precision When Testing Industrial Water
All too often I get asked whether a high-level method can be used for low-level testing so colorimeter users can consolidate their test kits. I have been there: lugging around a bunch of reagents is not always fun. But low-level and high-level test methods are developed separately because they have different requirements for detection limits and precision.
Let’s use testing for molybdenum as an example. Our high-level test is designed for levels up to 60 ppm as Mo (100 ppm as MoO₄). Its lowest detection limit is 0.7 ppm Mo; however, the precision is +/- 0.4 ppm. If you are trying to measure trace Mo in a sample (say, 1.0 ppm, which is above the method’s detection limit), this method is not adequate for obtaining an accurate result. In other words, my 1.0 ppm Mo reading could actually be anywhere from 0.6 ppm to 1.4 ppm.
Taylor’s low-level test for molybdenum is designed to test up to 3.30 ppm Mo. Its lowest detection limit is 0.03 ppm Mo, and the precision is +/- 0.01 ppm. Using this method to test trace levels of Mo (like the 1.0 ppm sample discussed above), I can have very high confidence in my result because of the improved precision: a reading of 1.00 ppm Mo means the level could be anywhere from 0.99 ppm to 1.01 ppm. This is what I am looking for.
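To make the arithmetic behind this comparison concrete, here is a minimal sketch in Python using the figures from this article. The function name is mine for illustration, not part of any test kit or software; it simply converts a reading and a stated precision into the range of values that reading could represent.

```python
# A minimal sketch: how each method's stated precision translates into
# an uncertainty window around the same 1.0 ppm Mo reading.

def uncertainty_window(reading_ppm: float, precision_ppm: float) -> tuple[float, float]:
    """Return the (low, high) range a reading could represent,
    given the method's stated precision (+/- ppm)."""
    return (reading_ppm - precision_ppm, reading_ppm + precision_ppm)

# Precision figures from the article: +/- 0.4 ppm (high-level test)
# and +/- 0.01 ppm (low-level test).
for name, precision in [("high-level", 0.4), ("low-level", 0.01)]:
    low, high = uncertainty_window(1.0, precision)
    relative = precision / 1.0 * 100  # relative uncertainty at 1.0 ppm
    print(f"{name} test: a 1.0 ppm reading could be {low:.2f}-{high:.2f} ppm "
          f"(+/- {relative:.0f}%)")

# Output:
# high-level test: a 1.0 ppm reading could be 0.60-1.40 ppm (+/- 40%)
# low-level test: a 1.0 ppm reading could be 0.99-1.01 ppm (+/- 1%)
```

Framed this way, the point is easy to see: at 1.0 ppm the high-level test carries +/- 40% relative uncertainty, while the low-level test carries only +/- 1%.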
Of course, the operator plays a big part in the accuracy of the end result too. Glassware needs to be clean, sample sizes measured properly, reagents added correctly, and any wait times observed carefully.
This discussion pertains to all test methods, so review the instructions to be certain you are using the correct test for your intended purpose.