Plenary session at Heidelberg
News from the LEP W Working Group | E. Lancon |
Eric presented the new LEP W Working Group, set up to improve the collaboration
between the LEP experiments in the WW physics area. The main idea is to merge
the LEPC groups in charge of producing the combined results with the
WW-workshop groups created at the time of the workshops held at Crete and at
CERN. The LEP W Working Group will have three main streams:
- Mark Thomson (OPAL) and Marco Verzocchi (OPAL) will coordinate the W mass and 4f group.
- Oliver Buchmuller (ALEPH) and Paul de Jong (L3) will coordinate the FSI group.
- Martin Gruenewald (L3) will coordinate the TGC group.
Anyone interested in joining the efforts of these groups is welcome to
contact them, Eric, and Frederic.
Review of the LEP W mass combination | F. Teubert |
Frederic presented a review of the LEP W mass combination for Moriond 2000;
the procedure has been unchanged since 1998.
ALEPH, L3 and OPAL provided preliminary W mass measurements using 97-99
data, and DELPHI using 97-98 data.
The LEP energy calibration group provides a correlation matrix between
the different years used in the combination.
Every experiment provides two measurements (lnqq and 4q) for each year.
A covariance matrix is built under a set of hypotheses on the correlations
between years, channels and experiments.
Frederic emphasized the importance of a better understanding of
the correlations between channels and experiments.
For instance, if the correlation between channels from the fragmentation
error were zero, the ALEPH W mass measurement
would improve by 10%! Moreover, if the correlation between experiments
were zero, the LEP W mass measurement would improve
by 17%.
Another question that needs to be answered is the correlation
between BE and CR. If they happen to be 100% anticorrelated,
the ALEPH W mass measurement would improve by 5%, while if they are
100% correlated it would be 16% worse.
What is the correlation between detector systematic errors and fragmentation?
The effect of neglecting some of these correlations can change the error
by 10-20%, which is much larger than any
of the "improvements" shown so far. But, of course, the best approach is to
understand better the sources of these systematic uncertainties;
if the errors are small enough, there is no need to worry about correlations.
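The sensitivity of the combined error to the assumed correlations can be illustrated with a minimal two-measurement BLUE combination; the numbers below are invented for illustration and are not the actual LEP inputs:

```python
import math

def blue2(m1, m2, stat1, stat2, syst1, syst2, rho):
    """BLUE combination of two measurements: statistical errors are
    uncorrelated, systematic errors have correlation rho."""
    c11 = stat1**2 + syst1**2            # total variance of measurement 1
    c22 = stat2**2 + syst2**2            # total variance of measurement 2
    c12 = rho * syst1 * syst2            # covariance from the common systematic
    denom = c11 + c22 - 2.0 * c12
    w1 = (c22 - c12) / denom             # BLUE weight of measurement 1
    mean = w1 * m1 + (1.0 - w1) * m2
    err = math.sqrt((c11 * c22 - c12**2) / denom)
    return mean, err

# Illustrative numbers only (not the actual LEP inputs):
for rho in (0.0, 0.5, 1.0):
    mean, err = blue2(80.45, 80.40, 0.06, 0.07, 0.04, 0.04, rho)
    print(f"rho={rho:3.1f}: combined error = {err*1000:.1f} MeV")
```

The combined error grows with the assumed correlation, which is why setting the inter-experiment correlation to zero "improves" the combination.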
Review of the LEP-TGC combination | J. B. Hansen |
Jorgen made a review of the LEP-TGC combination. Standard TGCs (DKg,
Dgz and Lambda_g) are combined
using 1D, 2D and 3D fits. Neutral TGCs from Zg events (8 couplings)
and from ZZ events (4 couplings) are
also combined, using 1D and 2D fits.
Quartic Gauge Boson Couplings (QGC) are expected to be combined for ICHEP 2000.
The measurements from the LEP experiments are combined using the likelihoods
that each experiment provides
for each coupling, including both statistical and systematic errors.
The combination is done by adding the likelihood
curves. The main problem with this method is that correlations
between experiments and channels are not taken into account.
So far, the measurements were dominated by the
statistical component of the error, but this
is no longer the case. On the other hand, there has not yet been
a careful comparison of the evaluation of systematic
errors by each LEP experiment.
It is not clear what the solution to this problem will be for ICHEP 2000.
The obvious solution would be to do the combination
at the level of observables, but every LEP experiment starts from a different
observable (maybe this is possible
for ALEPH and OPAL).
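A minimal sketch of the curve-adding procedure described above, assuming parabolic (Gaussian) single-experiment curves with invented central values; real TGC likelihoods need not be parabolic:

```python
def parabola(g, g_hat, sigma):
    """-ln L for a Gaussian measurement g_hat +- sigma (toy curve)."""
    return 0.5 * ((g - g_hat) / sigma) ** 2

def combine(curves, g_min=-1.0, g_max=1.0, n=20001):
    """Add -ln L curves on a grid; return the best fit and the
    1-sigma interval read off at delta(-ln L) = 0.5."""
    grid = [g_min + (g_max - g_min) * i / (n - 1) for i in range(n)]
    total = [sum(c(g) for c in curves) for g in grid]
    i_best = min(range(n), key=total.__getitem__)
    nll_min = total[i_best]
    inside = [g for g, v in zip(grid, total) if v - nll_min <= 0.5]
    return grid[i_best], inside[0], inside[-1]

# Two hypothetical single-experiment results for one coupling:
curves = [lambda g: parabola(g, 0.05, 0.10),
          lambda g: parabola(g, -0.02, 0.12)]
best, lo, hi = combine(curves)
print(f"combined: {best:+.3f}  1-sigma interval [{lo:+.3f}, {hi:+.3f}]")
```

Note that nothing in this procedure knows about correlated systematics between the input curves, which is exactly the limitation discussed above.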
The near future of FSI at LEP | O. Buchmuller |
Oliver gave a summary of the plans of the recently created LEP FSI group.
The aim of the group for the near future is to understand the different
conclusions of ALEPH/L3 and DELPHI
w.r.t. BE correlations between W's. The different methods have been
extensively compared, but
no definitive conclusion has been agreed upon yet. The idea is to use
the common MC files,
apply each method and compare results. This should be ready for the
Lisbon WW workshop in November. If the
differences are understood, an attempt to combine the results from
the different experiments could be possible.
The FSI group will also focus on the search for CR effects. Two variables
have been looked at so far: the inclusive
multiplicity of W decays, and the particle flow in the region between
the W's. Since the particle flow has a higher
sensitivity, it should get a higher priority at LEP. Of course, any
new idea to increase the sensitivity to
CR effects is welcome. The plan is to get the results of the four LEP experiments
ready for the Lisbon WW workshop and
attempt a combination afterwards.
Status of WW Monte Carlo Production | J. Ward |
Jason gave a report on the status of the final 1999 WW MC production
and the 2000 WW MC production.
In total 6.5 million WW events have been produced for the analysis
of 1999 data
(900k at 192 GeV, 2600k at 196 GeV, 2400k at 200 GeV and 600k at 202
GeV). The impact of the Linux
cluster is evident!
All the information related to the 1999 MC production can be found
at
http://alephwww.cern.ch/~wardj/WWMC1999/WWMC1999.html
The 2000 MC production is also going quite well. There
are 1350k CC03 WW events
already produced at 204, 205, 206, 207 and 208 GeV. Moreover, there
are 750k 4f WW events
already produced at 204, 206 and 208 GeV. Have a look at Jason's transparencies
for more details.
Status of some recent/new generators | B. Bloch |
Brigitte gave a short summary on the status of the two new MC generators she is interfacing with KINGAL:
KK2f is basically ready for use. No difference larger
than 1% was observed when comparing with equivalent
programs (Zfitter, KKsem) under the same conditions.
The status of KoralW 1.43 is less satisfactory. Work started a long time
ago, but a lot of technical problems
need to be solved. Brigitte would appreciate some help to debug/test
the KRLW03 version. There is some hope to
have it available in KINGAL soon.
News on Zgamma since the ALEPH week | B. Trocme |
Benjamin gave a report on the recent progress in the understanding of
the LEP energy measurement
from Zgamma events. The combination of the hadronic and muonic channels
at energies between
183 GeV and 202 GeV gives a shift of -462 +- 77(stat) +- 100(syst)
w.r.t. the LEP energy calibration
group.
The large discrepancy between data and MC observed in the s'/s distribution
for very collinear muon events
is probably explained by differences in the resolution of the angular
measurements. The hypothesis of a wrong
simulation of soft ISR photons is less likely after the
same effect was seen in Z data.
The discrepancy in resolution is mainly due to the fact that the MC
does not include the beam energy spread.
However, there are still residual discrepancies to be understood. Benjamin
is working on including the
beam energy spread in KK2f and, at the same time, on testing the influence
of the better ISR calculation included
in KK2f but not in KORALZ.
Status of the measurement of the Z mass using Zgamma events | G. Leibenguth |
Guillaume has been working on the implementation of a kinematic fit
on radiative events to measure
the Z mass. His analysis is the same as the one from Benjamin/Eugeni,
except that he measures the
Z mass (instead of the LEP energy), using the LEP beam energy measured
by the LEP energy calibration
group as a constraint in the kinematic fit.
The analysis is just starting and should give an interesting cross-check
for both the W mass analysis and
the measurement of the LEP beam energy from radiative events.
Short biased minutes
Improvements in semileptonic selections
Anne Ealet
Anne presented her studies to improve
the selections of the WW semileptonic channels. The tau channel has
been studied in collaboration with Djamel.
The traditional "topological" tvqq selection
has been compared with a new selection based on Djamel's tau jet reconstruction.
The selection based on the new tau reconstruction alone achieves
52.5% efficiency and 0.795 purity, with a
tau jet reconstructed in all selected events. Combining this selection with the traditional
"global" analysis gives 57% efficiency and 0.78 purity, with
100% of the selected events having a reconstructed tau; this is to be compared with
the traditional "global+topological" selection, which has 54% efficiency,
0.78 purity, and 5% of the selected events without a reconstructed tau.
Anne showed that the tau multiplicity, the tau momentum
and the invariant mass of the hadronic part are better reconstructed in the
events selected by the new topological selection than in the traditional ones, both because
the tau jet is better reconstructed and because a tau jet is reconstructed
in all the events. The present conclusion is that the gain due to the
new selection is not so large for the cross section but can be relevant
for the mass.
Studying the preselection of the semileptonic
events, Anne proposed a possible improvement in the cuts by rejecting events
with visible mass smaller than 50 GeV. This cut should help to remove the
two-photon background (and beam-gas background), which is not very well
simulated. This cut has almost no impact on the efficiencies.
Finally, Anne presented the results of her attempt
to introduce a NN-based cut in the evqq and mvqq selections, replacing the
traditional probability cut, and a NN-based cut in the tvqq selection she
had just presented.
The preliminary results, obtained with the 200
GeV data, show an improvement in the electron channel, while the muon channel
has only a small improvement, probably because the limiting factor in this
channel, which has a higher efficiency than the electron channel, is in
the preselection. For the tvqq channel a first attempt has been made with
a NN with 12 input variables; possible limitations in the improvements
can be removed by reviewing the preselection of the tvqq channel.
In general, the improvement in the statistical
errors is about 2-3% (relative). At 189 GeV the gain is smaller.
Anne will continue to put together all the studies
she has been doing in a new selection procedure.
Systematic studies: lepton id and E12 Maria Girone
Maria updated, with the 1999/2000 data, the studies
she did in past years on the inefficiency due to the E12 cut and on
the comparison of the lepton id (e and mu) performance between real and
simulated data.
With the year 2000 data collected so far, the
inefficiency due to the E12 cut and to the beam-related background or detector
noise, estimated with the random-trigger method, has been evaluated
to be about 7%. Maria presented a preliminary attempt to reduce this inefficiency
by trying to remove from the computation of E12 those calorimetric energy
deposits which are likely to be beam-related background and/or detector
noise. Since about 67% of the random-trigger events with E12 different
from zero have energy in HCAL only, Maria started her study from HCAL.
She removed from the evaluation of E12 those HCAL objects in the horizontal
plane which do not have LCAL energy in the corresponding towers in
front of them, and the HCAL objects in the vertical plane if there is no
signal in the LCAL scintillators placed between the LCAL semi-modules.
Applying this "cleaning" of E12, the inefficiency due to the beams and to
the noise is only about 2%. Maria has been asked to check the level of
the two-photon background with this new definition of E12, to investigate
the simulation of the LCAL scintillator efficiency, and to check in the
real data whether the number of selected events increases as expected due to
the increased efficiency. Similar studies are going on in the SUSY Task
Force.
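The cleaning amounts to vetoing HCAL deposits that are not confirmed by LCAL; a schematic sketch, in which all data structures and function names are hypothetical illustrations (the actual ALEPH code and object model differ):

```python
# Schematic sketch of the E12 cleaning described above.  The object
# layout and the two confirmation callbacks are hypothetical.

def clean_e12(hcal_objects, lcal_energy_in_front, scintillator_fired):
    """Sum E12 while dropping HCAL objects that look like beam-related
    background or detector noise:
      - horizontal-plane objects with no LCAL energy in the towers in front;
      - vertical-plane objects with no signal in the LCAL scintillators.
    Each HCAL object is a dict: {'id': ..., 'plane': str, 'energy': float}."""
    e12 = 0.0
    for obj in hcal_objects:
        if obj['plane'] == 'horizontal' and not lcal_energy_in_front(obj['id']):
            continue  # no LCAL confirmation -> likely noise/background
        if obj['plane'] == 'vertical' and not scintillator_fired(obj['id']):
            continue  # no scintillator signal -> likely noise/background
        e12 += obj['energy']
    return e12

# Toy usage: only the LCAL-confirmed deposit survives.
hcal = [
    {'id': 1, 'plane': 'horizontal', 'energy': 2.0},  # confirmed by LCAL
    {'id': 2, 'plane': 'horizontal', 'energy': 1.5},  # unconfirmed -> dropped
    {'id': 3, 'plane': 'vertical',   'energy': 0.7},  # no scintillator -> dropped
]
print(clean_e12(hcal, lambda i: i == 1, lambda i: False))  # -> 2.0
```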
The lepton id performance in the 1999 data has
been studied in real and simulated data using low-multiplicity (2 tracks)
events, selected to create two high-purity samples of electrons and muons
by applying cuts based on HCAL to select the electrons, and on ECAL and
the analog signal from HCAL (not used in the muon identification algorithm)
to select the muons.
Comparing the number of identified leptons in
real data and in MC, and correcting for the different momentum spectrum
and angular distribution between the samples used in this study and the
leptons from semileptonic WW events, the corrections to be applied
to the lepton id efficiencies from MC are: -0.78 +/- 1.08% for the electrons
and -0.76 +/- 1.98% for the muons, using the 196 GeV data.
Similar figures are obtained with the data collected
at the other center-of-mass energies.
Maria has been asked to check the evaluation
of the statistical uncertainties of the correction factors, which appear
to be overestimated. Possibly an asymmetric uncertainty has to be used
to take into account that the efficiency cannot be larger than one, and
that the probability of having a negative correction factor is larger than
that of a positive one if the MC efficiencies are close to unity.
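The asymmetry near full efficiency can be illustrated with an exact (Clopper-Pearson) binomial interval; a stdlib-only sketch with invented counts, not Maria's actual numbers:

```python
import math

def binom_cdf(k, n, p):
    """P(X <= k) for X ~ Binomial(n, p)."""
    return sum(math.comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

def clopper_pearson(k, n, alpha=0.3173):
    """Exact central (1 - alpha) interval for an efficiency k/n,
    solved by bisection on the binomial CDF (stdlib only)."""
    def solve(f):
        lo, hi = 0.0, 1.0
        for _ in range(60):          # bisection on a decreasing function
            mid = 0.5 * (lo + hi)
            if f(mid) > 0:
                lo = mid
            else:
                hi = mid
        return 0.5 * (lo + hi)
    lower = 0.0 if k == 0 else solve(lambda p: binom_cdf(k - 1, n, p) - (1 - alpha / 2))
    upper = 1.0 if k == n else solve(lambda p: binom_cdf(k, n, p) - alpha / 2)
    return lower, upper

# Illustrative: 98 identified leptons out of 100 -- the interval is
# visibly asymmetric because the efficiency cannot exceed 1.
lo, hi = clopper_pearson(98, 100)
eff = 0.98
print(f"eff = {eff:.2f}  +{hi - eff:.3f} / -{eff - lo:.3f}")
```

For efficiencies near one the downward error is larger than the upward one, which is the asymmetry Maria was asked to consider.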
Semileptonic tau channel selection and mass measurement Djamel Boumediene
Djamel presented the results of the latest step
in the development of a new tvqq selection and mass measurement.
He compared the expected W mass errors obtained
by applying his new tau jet reconstruction to the traditional selection
and to the new selection developed by Anne. Both results have been obtained
both with a 1D fit of the rescaled hadronic mass distribution and with
a 2D fit to the hadronic mass and its error distributions, obtained
by applying a 1C kinematic fit (ABCFIT) to the events. The results on
the 189 GeV data are:
- Old selection, 1D fit: 258 MeV expected error on Mw
- Old selection, 2D fit: 240 MeV
- New selection, 1D fit: 246 MeV
- New selection, 2D fit: 228 MeV
The last figure, rescaled to a luminosity of
200 pb-1, corresponds to an expected error of 215 MeV, to be compared to
the previous ALEPH result of 260 MeV and the best LEP result so far (by
OPAL) of 210 MeV.
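Assuming the quoted expected errors are statistical and scale as 1/sqrt(L), the rescaling can be inverted to show the luminosity implied by the quoted numbers (no inputs beyond those in the text):

```python
import math

def rescale_error(sigma, lumi_from, lumi_to):
    """Statistical error scaling: sigma ~ 1/sqrt(L)."""
    return sigma * math.sqrt(lumi_from / lumi_to)

# Inverting the quoted 228 MeV -> 215 MeV rescaling to 200 pb-1
# gives the implied 189 GeV luminosity:
implied_lumi = 200.0 * (215.0 / 228.0) ** 2
print(f"implied luminosity: {implied_lumi:.0f} pb-1")
print(f"check: {rescale_error(228.0, implied_lumi, 200.0):.0f} MeV")
```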
Status of W mass with the new ntuples Andrea Valassi
Andrea told us that KINFIT has been interfaced to the new ntuples
for the muon and electron semileptonic channels,
and it will be available for the tau channel in a few days (old tau selection
and mass extraction scheme). The code has been tested on shift50 and on
Linux, and he has started a comparison of his results with the results obtained
with the old code.
Andrea found small differences BEFORE the kinematic
fit: in the number of preselected events and, taking into account
the common events only, in the distributions of the lepton energy (mainly
electrons) and of the jet energies. Despite these differences, the distributions
of the reconstructed masses AFTER the kinematic fit are almost identical.
It was agreed at the meeting that the differences in the kinematic fit
input variables have to be understood. Andrea Venturi (and anyone else
who is willing to do it) will look into that.
Andrea's future plans are: release a production
version of the KINFIT interface and produce the official ntuples with the
KINFIT results, then interface the parametrization code with the new ntuples.
After that, possible future activities are (in random order): KINFIT
for the 4q channel, a kinematic fit with visible ISR, and improving the parametrization.
Plans for Osaka Andrea Venturi
Andrea recalled that the only new result for Osaka
related to the leptonic group is the preliminary 2000 cross section analysis.
A decision has to be taken on whether the new improvements
presented in this meeting (E12 cleaning, NN in the e-mu semileptonic channels,
new tvqq selection) should be used for the Osaka results or not. The present
general opinion is that they should not.
This relatively low conference pressure should
be used in the best possible way, especially by the people involved in the mass analysis.
With respect to the systematic studies, Andrea made three points:
1) Bookkeeping: those who are involved
in developing and applying systematic uncertainty studies should make an
effort to document their work with Web pages, draft notes and talks available
on the Web, in order to ease an early open discussion about the evaluation
of the systematic uncertainties and possible iterations and improvements of the
methods.
2) Common systematic sources between the mass and
cross section measurements: taking the list of systematic sources described
in the recent cross section and mass papers, those who are involved in
these measurements should meet and agree on common methods and
tools to estimate the systematic uncertainties from the same sources.
3) New (not yet taken into account) systematic
sources and new (less conservative and avoiding double counting) methods
to estimate the systematic uncertainties: anyone who feels that some uncertainties
have been neglected or are badly evaluated should speak up!
Examples of new sources are: the LEP beam energy
spread and the simulation of the reconstructed jet energy linearity. Examples
of estimation methods which have to be reviewed are: the sagitta corrections
and the calorimeter simulation.