I'm guessing the initial overshoot happened for a combination of reasons. (Residual heat in the stainless steel base of the keg needing time to transfer to the water? With 10+ gallons of water in the keg, convection may not equalize the temperature efficiently? Lag time in the thermal probe?) I'd think an ideal control algorithm would anticipate that a large commanded temperature change requires cutting the burner early to prevent overshoot, while a small 0.5 degree F change wouldn't need that. I'm guessing this is a common control systems problem, so rather than reinvent the wheel: is there a standard algorithm that handles this? Or can PID be used in a situation like this?
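
To make the question concrete, here's a minimal sketch of the kind of discrete PID loop I have in mind (the class name, the gains, and the 0-to-1 burner duty-cycle output are all made up for illustration, not tuned for a real keg):

```python
class PID:
    """Toy discrete PID controller with output clamping and simple anti-windup."""

    def __init__(self, kp, ki, kd, out_min=0.0, out_max=1.0):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.out_min, self.out_max = out_min, out_max
        self.integral = 0.0
        self.prev_error = None

    def update(self, setpoint, measured, dt):
        error = setpoint - measured

        # Derivative term: reacts to how fast the error is shrinking, which is
        # what would let the controller back off the burner *before* the
        # setpoint is reached instead of waiting for the overshoot.
        if self.prev_error is None:
            derivative = 0.0
        else:
            derivative = (error - self.prev_error) / dt
        self.prev_error = error

        # Integral term: without anti-windup, a long full-power heat-up would
        # accumulate a huge integral and cause overshoot regardless of kd.
        self.integral += error * dt

        output = self.kp * error + self.ki * self.integral + self.kd * derivative

        # Clamp to the burner's range; while saturated, undo the integral
        # accumulation for this step (a basic anti-windup scheme).
        if output > self.out_max:
            output = self.out_max
            self.integral -= error * dt
        elif output < self.out_min:
            output = self.out_min
            self.integral -= error * dt
        return output  # commanded burner duty cycle, 0.0 to 1.0
```

The way I understand it, the kd term is what provides the "cut off early" behavior for a big temperature change, and the anti-windup keeps the integral from forcing an overshoot after a long saturated heat-up, while a small 0.5 degree F change would barely engage either term.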