
# Systematic Errors

by

## Roger Barlow

on 16 December 2013


#### Transcript of Systematic Errors

Systematic Errors: what they are and how to handle them
Systematic Error: Reproducible inaccuracy introduced by faulty equipment, calibration or technique
Bevington
WRONG!
An error is not a mistake
Systematic Effects is a general category which includes effects such as background, scanning efficiency, energy resolution, angle resolution, variation of counter efficiency with beam position and energy, dead time etc. The uncertainty in the estimation of such a systematic effect is called a systematic error
Orear
RIGHT!
Energy in a calorimeter: typically reconstructed as E = aD + b from the measured signal D.
The errors on the calibration constants a, b are systematic.
Track momentum from curvature in a magnetic field: p [GeV] = 0.3 B [T] ρ [m].
The error on the field B is systematic.
Branching ratio: e.g. Br = (N - B) / (η N_T), with efficiency η and estimated background B.
The errors on η and B are systematic.
Systematic errors are different:
• They do not fall as the data increase
• They do not show up in chi-squared
So we have to work a little harder
Systematic errors must be added linearly, not in quadrature
Taylor
WRONG!
Systematic errors are errors, obeying the Central Limit Theorem: variances add, so standard deviations add in quadrature. The separation into statistical and systematic errors is just a convention, not universal outside particle physics.
Just use combination-of-errors with the full form
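A minimal sketch of that full combination-of-errors formula, sigma_R^2 = sum_i (dR/dx_i)^2 sigma_i^2 (the function and numbers here are illustrative, not from the talk):

```python
import math

def combined_error(partials, sigmas):
    """Combination-of-errors: sigma_R^2 = sum_i (dR/dx_i)^2 * sigma_i^2,
    for independent uncertainties, statistical and systematic alike."""
    return math.sqrt(sum((p * s) ** 2 for p, s in zip(partials, sigmas)))

# Illustrative example: R = x + 2y with sigma_x = 3, sigma_y = 2,
# so dR/dx = 1 and dR/dy = 2
sigma_R = combined_error([1.0, 2.0], [3.0, 2.0])
print(sigma_R)  # 5.0, i.e. sqrt(3^2 + 4^2)
```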
How to combine results -
two channels, two run periods, two experiments...
This works even for non-Gaussian errors.
Non-Gaussianity only affects the interpretation of one sigma as 68% etc.
That only matters for the final result, and there the CLT will save you.
Vary all your parameters, cuts, etc
WRONG!
(1) Explicit Systematic Errors
(Usually Experimental or MC)
Use combination of errors formula
(2) Implicit Systematic Errors
Examples
Background modelling
Signal modelling
Parametrised
Non-parametrised
Two models give results R1, R2
Quote:
R1 ± |R1 - R2| if you prefer model 1
(R1 + R2)/2 ± |R1 - R2|/2 if you rate them equally
(R1 + R2)/2 ± |R1 - R2|/√12 if you rate them equally and the two models are extreme cases (the √12 comes from the standard deviation of a uniform distribution)
This gives you a ball-park figure. Don't push it.
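These two-model prescriptions can be sketched numerically (the weighting labels are illustrative; the sqrt(12) is the standard deviation of a unit-width uniform distribution, a standard result):

```python
import math

def model_spread(r1, r2, weighting):
    """Ball-park systematic from two model results r1, r2.
    'prefer1'  : quote r1, with error |r1 - r2|
    'equal'    : quote the mean, with error |r1 - r2| / 2
    'extremes' : quote the mean, with error |r1 - r2| / sqrt(12)
                 (uniform distribution between two extreme models)"""
    delta = abs(r1 - r2)
    if weighting == "prefer1":
        return r1, delta
    mean = 0.5 * (r1 + r2)
    if weighting == "equal":
        return mean, delta / 2.0
    return mean, delta / math.sqrt(12.0)

print(model_spread(10.0, 12.0, "prefer1"))   # (10.0, 2.0)
print(model_spread(10.0, 12.0, "equal"))     # (11.0, 1.0)
print(model_spread(10.0, 12.0, "extremes"))  # (11.0, 0.577...)
```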
(Usually Theoretical)
Do combination-of-errors numerically
Adjust parameter plus and minus one sigma
Read off sigma on result
Q:What if the positive and negative
shifts are different?
A: Draw a straight line through them if you possibly can. Avoid asymmetric errors.
Q:What about the error on the slope?
A: Do not add - this is timid and wrong. Subtracting in quadrature is technically correct. Ignoring it is strongly advised.
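The recipe above - shift the parameter by ±1σ, rerun, read off the shift, and draw a straight line through the two points - can be sketched as follows (the analysis function here is a made-up illustration):

```python
def shift_systematic(analysis, nominal, sigma):
    """Numerical combination-of-errors: rerun the analysis with the
    uncertain parameter shifted up and down by one sigma, and take the
    straight line through the two points (half the total shift) as the
    symmetric systematic error on the result."""
    r_up = analysis(nominal + sigma)
    r_down = analysis(nominal - sigma)
    return 0.5 * abs(r_up - r_down)

# Made-up analysis: result depends mildly non-linearly on a calibration c
analysis = lambda c: 3.0 * c + 0.1 * c ** 2

print(shift_systematic(analysis, nominal=2.0, sigma=0.5))  # ~1.7 (local slope 3.4 times sigma 0.5)
```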
Q: Do I need to adjust all my cuts?
A: Yes. But ...
'Systematic errors' are not 'mistakes'
But mistakes still happen
You need to find these 'unknown unknowns'
Techniques
Think
Check
Repeat analysis with
Different channels (electron/muon, positive/negative...)
Different time periods
Different experimental conditions
Different cuts
etc
If your analysis is robust against these, it becomes credible - to you and others
Do analysis - result R
```
Repeat {
    Change something              -> result R'
    if (R' compatible with R)     { tick box, move on }
    else                          { find problem }
}
```
1. Check test, correct mistake
2. Check analysis, correct mistake
3. Find reason why this change might affect result after all.
...
99. Incorporate difference in systematic errors
'Compatible'
Exact equality is too demanding.
Equality within errors is not demanding enough, because R and R' share data.
The difference in quadrature of the two errors, √(σ'² - σ²), is a good measure.
R = 10 ± 4, R' = 12 ± 5: OK (shift 2, against √(25 - 16) = 3)
R = 10 ± 4, R' = 19 ± 5: not OK (shift 9, against 3)
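A minimal sketch of this check, treating the quadrature difference of the two errors as the expected spread of R' - R (using that spread itself as the tolerance is a judgment call, not from the talk):

```python
import math

def compatible(r, sigma, r_prime, sigma_prime):
    """R and R' share data, so the expected spread of R' - R is the
    quadrature *difference* of the two errors, not their quadrature sum."""
    spread = math.sqrt(abs(sigma_prime ** 2 - sigma ** 2))
    return abs(r_prime - r) <= spread

print(compatible(10, 4, 12, 5))  # True:  shift 2 <= spread 3
print(compatible(10, 4, 19, 5))  # False: shift 9 >  spread 3
```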
Enumerate all the effects involved in your analysis, and find the impact of their uncertainty on your result. Add these in quadrature
(2A) Parametrised
(2B) non-parametrised

Actually this separation is tricky.
For some uncertainties, especially Bayesian "theory" errors, the classification is obvious.
Uncertainties in experimental quantities like calibration constants are often determined by an 'ancillary' experiment. More data there would help, but not more data in your experiment.
Sometimes the ancillary experiment is another analysis in your own experiment (e.g. your B decay channel has background from another B decay channel, whose branching ratio measurement is being improved by the student next door). Whether you call that 'statistical' or 'systematic' doesn't really matter (experiments have done both), but you do have to explain what you're doing.
Varying cuts...
Your analysis involves some quantity X (could be mass, transverse momentum...)
Your standard analysis uses X>2.5
You try X>2.4 and X>2.6
Is this a systematic error evaluation or a check?
Could be a systematic
You are measuring the cross section for X>2.5
But you are systematically unsure of X (jet energy scale?) at the 0.1 level, and your '2.5' could be 2.4 or 2.6
Varying the cut is the same as varying the uncertain X-scale
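A tiny numerical illustration of that equivalence (the data values are made up): shifting every X down by 0.1 while cutting at 2.5 selects exactly the same events as cutting the nominal X at 2.6.

```python
# Made-up sample of X values (e.g. jet transverse momenta)
data = [1.8, 2.45, 2.55, 2.75, 3.1]

# X-scale shifted down by 0.1, cut held at the nominal 2.5 ...
shifted_scale = [x for x in data if x - 0.1 > 2.5]

# ... is the same selection as moving the cut to 2.6 on the nominal scale
moved_cut = [x for x in data if x > 2.6]

print(shifted_scale == moved_cut)  # True
```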
Could be a check
Cutting on X removes background - also signal
You optimised at 2.5
2.4 / 2.6 gives higher/lower efficiency with more/less background
Results should all be compatible
Differences:
For a systematic, the cut variation is prescribed; for a check, it is arbitrary.
For a systematic, you expect a change; for a check, you do not.
The 'errors on errors' puzzle
It can be confusing
Not one big table, but two: one big, one little
Examples
Calibration constants
Efficiency
Background
Enumerating all the uncertainties is a challenge
Think!
Ask colleagues for advice (but do not always take it)
Read up, and understand, similar analyses
Think!
There are known knowns. These are things we know that we know. There are known unknowns. That is to say, there are things that we know we don't know. But there are also unknown unknowns. There are things we don't know we don't know.
Donald Rumsfeld
