Question: Should I be Re-Evaluating the Average and Limits Every Time?

This question came from a webinar viewer:

“You said that a minimum of 20 data points is a good place to start in order to have a realistic view of the process.

What I was wondering is: Do you re-evaluate the average every time a new data point enters the lot or do you set a window of time which will serve as a sort of benchmark for the rest?”

The short answer is “no.”

My longer response:

Once you’ve established the baseline, you should NOT continually recalculate the average and limits. You should really only do so when there’s been a shift in the system (like a “Rule 2” signal with eight or more consecutive data points above or below the baseline average).

An exception to this might be if I create a PBC with just six data points to start. I might continue revising the average and limits until I get to 15 data points or so, but then I’d lock that in as the baseline.
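For illustration, here is a minimal sketch (plain Python, no external libraries) of how a baseline average and natural process limits might be calculated for an XmR-style Process Behavior Chart and then locked in. The function name and sample data are hypothetical, not from the post; the 2.66 scaling factor applied to the average moving range is the standard one for this type of chart.

```python
def pbc_baseline(values):
    """Return (average, lower_limit, upper_limit) for a locked PBC baseline."""
    average = sum(values) / len(values)

    # Moving ranges: absolute difference between consecutive data points.
    moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
    avg_moving_range = sum(moving_ranges) / len(moving_ranges)

    # Natural process limits from the average moving range (2.66 is the
    # standard constant for an XmR chart).
    lower = average - 2.66 * avg_moving_range
    upper = average + 2.66 * avg_moving_range
    return average, lower, upper


# Example: establish the baseline from 20 data points, then LOCK IT IN --
# don't recalculate as each new point arrives.  (Numbers are illustrative.)
baseline_data = [52, 48, 51, 55, 50, 47, 53, 49, 54, 51,
                 50, 48, 52, 56, 49, 51, 47, 53, 50, 52]
avg, lnpl, unpl = pbc_baseline(baseline_data)
print(f"Average: {avg:.1f}  Limits: {lnpl:.1f} to {unpl:.1f}")
```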

Comments

  1. What if I have a rolling baseline?

    Meaning, this week I received data for the past 12 weeks, and next week I will receive data for the most recent 12 weeks (thus dropping the first week from the data set I received this week).

    And for various reasons, I can’t append the new data to the previous 12 weeks.

    Is this process valuable in an evolving situation like this?

    1. Thanks for your question, JM. I’d rather see more than the most recent 12 data points. I’d rather see more like 24.

      That said, do NOT continually recalculate an average and limits based on the most recent (and visible) 12 data points.

      DO calculate an average and limits based on historical data (or the 12 data points currently visible). Then, LOCK IN the average and limits until you see evidence of a signal, as in the sketch below.
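To show how the locked limits might be used against each new rolling window, here is a hypothetical continuation of the sketch above. The function, run length, and data values are illustrative; “Rule 2” is the eight-or-more-consecutive-points rule mentioned in the post, and it is the trigger for recalculating the baseline.

```python
def rule_2_signal(values, locked_average, run_length=8):
    """Return True if `run_length` or more consecutive points fall on the
    same side of the locked baseline average (a "Rule 2" signal)."""
    run = 0
    last_side = 0
    for v in values:
        side = 1 if v > locked_average else (-1 if v < locked_average else 0)
        if side != 0 and side == last_side:
            run += 1
        else:
            # Start a new run; a point exactly on the average resets the run.
            run = 1 if side != 0 else 0
            last_side = side
        if run >= run_length:
            return True
    return False


# This week's 12 visible data points, judged against the LOCKED average
# (not a freshly recalculated one).  Numbers are illustrative.
locked_average = 51.0
this_weeks_window = [56, 57, 55, 58, 56, 57, 59, 56, 55, 57, 58, 56]
if rule_2_signal(this_weeks_window, locked_average):
    print("Rule 2 signal: the system has shifted; recalculate the baseline.")
```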
