Sample pH adjustment and temperature for BOD test
Posted: Tuesday, July 20, 2010 12:06 PM
Joined: 1/26/2010
Posts: 6

We have conflicting information between our pH meter vendor and our regulator on pH adjustment for BOD. The regulator states that sample temperature needs to be as close to 20 degrees C as possible when recording the pH of the sample even with temperature compensation. The pH meter vendor states that temperature compensation will take into consideration the temperature when giving a pH value and that having the sample temperature near 20 degrees C is unnecessary to get an accurate pH measurement. What do you all think? 

Posted: Tuesday, July 20, 2010 4:39 PM
Joined: 12/31/2009
Posts: 40

They are both right from their own viewpoint and area of concern.  The vendor is correct in stating that the temperature compensation built into the meter eliminates the need to have all samples at the same temperature to achieve comparable measurements.  However, the regulator is correct that for BODs the method states the samples should be brought to 20 C before analysis.  I'm not sure what the exact wording is among the different editions, but generally speaking the language is "Prior to X (where X is some portion of the BOD analysis), bring the sample to 20 ± 2 or 3" (not sure which it is).
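To illustrate what the meter's automatic temperature compensation (ATC) is actually doing: it rescales the electrode voltage by the theoretical Nernst slope at the sample temperature rather than a fixed 25 C slope. Here's a minimal sketch; the function names and the assumption of an ideal isopotential point at pH 7 / 0 mV are mine, not from any particular meter's documentation:

```python
from math import log

R = 8.314      # J/(mol*K), gas constant
F = 96485.0    # C/mol, Faraday constant

def nernst_slope_mv(temp_c):
    """Theoretical electrode slope in mV per pH unit at temp_c (deg C)."""
    t_kelvin = temp_c + 273.15
    return 1000.0 * R * t_kelvin * log(10) / F

def ph_from_mv(e_mv, temp_c, e_iso_mv=0.0, ph_iso=7.0):
    """Convert electrode millivolts to pH using the slope at the sample
    temperature (this is the ATC step). Assumes an ideal isopotential
    point of 0 mV at pH 7."""
    return ph_iso - (e_mv - e_iso_mv) / nernst_slope_mv(temp_c)

print(round(nernst_slope_mv(20.0), 2))  # ~58.17 mV/pH
print(round(nernst_slope_mv(28.0), 2))  # ~59.75 mV/pH
```

The slope difference between 20 C and 28 C is only about 1.6 mV per pH unit, which is why ATC makes readings comparable across sample temperatures; note this corrects the electrode response, not any actual change in the sample's pH with temperature.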

Perry Brake
Posted: Tuesday, July 20, 2010 5:28 PM
Joined: 12/16/2009
Posts: 69

I'll approach the question from a slightly different perspective and say that they are both wrong, at least a little bit.


The range over which temperature compensation corrects for temperatures different from 20° is limited.  If, for example, you are reading the pH of a sample at 28° (for whatever test...pH, BOD, whatever), the temperature compensation capability will be exceeded and you won't get a true pH reading.  I think your pH meter vendor would agree with that.  Just what the limits are depends on the meter/probe.


Your regulator is also a little bit wrong in requiring that the temperature be as close to 20° as possible.  Taken literally, that would require it to be at, for example, 20 ± 0.1° (or ± 0.05° if you prefer) if your NIST-traceable thermometer has 0.1° increments.  The 21st Edition of Standard Methods says for the BOD test that the sample must be at 20 ± 3° before dilution.  I would imagine your regulator meant to say as close to 20° as feasible, or as practical.  I agree with that and haven't yet subscribed to the lenient ± 3° limit allowed by Standard Methods.  Why knowingly allow an avoidable source of random error?  There are already too many sources of random error in the test that can't be avoided.
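For what it's worth, the 20 ± 3° window from the 21st Edition is simple enough to build into a bench-sheet check. A quick sketch (the function name and defaults are mine, just illustrating the tolerance logic):

```python
def bod_temp_ok(temp_c, target_c=20.0, tol_c=3.0):
    """Check a sample temperature against the Standard Methods
    (21st Ed.) 20 +/- 3 C window for the BOD test before dilution.
    Tighten tol_c if, like the poster, you prefer a stricter limit."""
    return abs(temp_c - target_c) <= tol_c

print(bod_temp_ok(21.5))  # True  (within 17-23 C)
print(bod_temp_ok(24.5))  # False (outside the window)
```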