Grid Deployment Board (GDB)
Web - Wiki - Agendas - Minutes
8 October 2008 – GDB
Meeting – Agenda
Role of the Tier-2 Sites – Several presentations and
discussions on what the Experiments see as the role of Tier2s in their
computing models. ALICE, ATLAS, CMS and LHCb presented their plans and GRIF
provided an overview of the Tier-2 Sites in France.
CCRC09 – J.Shiers presented the general status of the
WLCG Services and discussed whether a Combined Challenge for 2009 (CCRC09)
will need to be organized.
Services and Tier-2 Sites – Several activities and
services regarding the WLCG Tier-2 Sites were presented: Tier-2 Accounting,
Tier-2 Monitoring, SAM Tests for Tier 2 Sites, Storage Support at the
Tier-2 Sites.
Batch Systems - A review of the batch systems in use on
WLCG/EGEE was presented, as well as the community support for them.
Middleware Deployment Upgrades – Plans for the expected upgrades of
middleware software and services until January 2009.
7 October 2008 –
Pre-GDB Meeting – Agenda
Middleware Shutdown Plan – Preparation of the GDB
discussion on the following day. Now that LHC data-taking is delayed, should
the middleware plans be changed? Are new upgrades now possible?
Storage Outlook – Expected changes over the next six months.
Is there time for planning new development or new changes?
Pilot Jobs Update – Status of the review on the
Pilot Jobs frameworks of the four LHC Experiments.
9 September 2008 –
GDB Meeting – Agenda
Storage Services Performance – Issues with the Storage Services
from the Experiments' perspectives. SRM/dCache at IN2P3-CC. dCache at SARA.
dCache configuration and release procedure. CASTOR report and status. OSG
Storage Status.
Middleware Status – Presentation of the current
gLite recommended WLCG release. Initial roadmap for 2009 and the shutdown
period. WN Client software installation.
Other Topics - LHCb gLexec and pilot job testing. Emergency tickets in GGUS: the routing
flow of direct GGUS tickets to Sites.
20 October 2008 – Minutes
No special items, but the WLCG Tier-1
Sites were reminded that FTS on SL4 should not yet be deployed.
13 October 2008 -
Minutes
Phasing Out Old MW
Versions - The process for making old versions of middleware services
obsolete is under review. The document is attached to the minutes and
comments are welcome.
Installations of the
gLite WMS and CREAM - ALICE requested that all LCG RBs run on their
behalf be replaced by gLite 3.1 WMSs and that as many sites as possible
deploy the CREAM CE.
6 October 2008
– Minutes
CREAM Released -
The CREAM CE has been released to production, but for now it will not be
selected by default in WMS match-making.
gLite 3.0 is
Obsolete - Next week, many gLite 3.0 services will be made obsolete.
29 September 2008
- Minutes
Attendance by Tier-1 Sites and Experiments was scarce. See the
minutes, but the reports are mostly empty.
8 September 2008
– Minutes
Feedback on Software
Distribution - The feedback from most EGEE ROCs to SA3's proposal for
central software distribution has not been favorable.
GFAL, lcg-utils and
BDII Incompatibilities - The gLite upgrade introduced versions of GFAL
and lcg-utils that were incompatible with BDII V3.0. This negatively
impacted the EGEE production grid.
1 September 2008
- Minutes
CREAM Delayed -
The EMT decided to delay the deployment of the CREAM CE. The WMS currently
in production is not ICE-enabled, but could accidentally match any CREAM
CEs in production. A work-around is being introduced into YAIM.
Software
Distribution Proposal - The release team presented a proposal for a
centralized distribution mechanism for the gLite clients (WN) to all
production sites. It will be discussed at the LCG MB, GDB and EGEE SA1
coordination meetings.
Downtimes Reporting
on Operations Tools – The GOCDB and the CIC portal now have the
required features to declare downtimes for the operations tools. Each
operations tool will be registered in GOCDB. Sites will be able to declare
tool downtimes, which will be broadcast to sites, ROCs and grid operators
on duty.
Architects Forum (AF)
Minutes
- Web
16 October 2008 - Minutes
Configuration LCG_54h - LCG_54h will
be released this week. LCG_55b, the version with no remaining SEAL
dependency, is not urgently needed and will be completed later.
2009 Production Releases - The 2009
"production" version of the AA software stack is requested by
mid-January 2009. The integration tests by the experiments will start when
a stable version of ROOT is available (end of November).
SLC5 Platform - Discussed
the inclusion of SLC5 as a new platform. A plan of action has been defined.
The major question is whether SLC5 binaries should be produced with the
native compiler gcc 4.1 or with gcc 4.3.
Configuration and Installation Proposals - The two
proposals circulated by S.Roiser, on CMT tag name unification and on the
procedure to manage software installations, have been agreed.
2 October 2008
- Minutes
Impact of New LHC Schedule – Having studied the
new LHC schedule, the Experiments would like to have the full software
stack ready by the end of January. The final version of ROOT 5.22 is foreseen
by December 15th. Geant4 9.2 is scheduled for December.
Configuration LCG_55b - A new
configuration (LCG_55b) is in preparation. It will have some minor changes in
the externals, include a separate release of RooFit based on ROOT 5.18, and
bring changes in the Grid packages.
18 September 2008 –
Minutes
Subversion Migration – As a
follow-up to the presentations of Subversion in the AA meeting, the Experiments
decided that they will plan the migration from CVS to SVN only for the
2009-2010 winter shutdown.
Geant4 9.1 patch 03 Released - A new fix
for Geant4 version 9.1 (called patch 03) will be released this week.
VC9 Port - Progress
in the VC9 port was reported. The main remaining issues are being solved, but
several errors are still observed at runtime.
4 September 2008
– Minutes
Middleware Software Distribution - A
new proposal for the distribution of the middleware software was discussed:
a pull mode, in which sites can decide what they want to install, with the
possibility of parallel installations of different versions.
ROOT 5.21/02 Released - The
development version of ROOT 5.21/02 was released at the end of August.
Contrary to what was planned, this release does not include the new
automatic schema evolution.
Management
Board (MB)
Web - Wiki - Members - Agendas - Minutes
28 October 2008 – Agenda, Minutes
LCG
Operations Weekly Report -
J.Shiers presented a summary of status and progress of the LCG Services.
Notes from each daily meeting can be
found on these pages: Link
Installation
Accounting Status – F.Donno presented the
status and progress of her work on determining what CPU resources are
actually installed at the Tier-2 sites, and the problems still to be solved.
22 October 2008 – Agenda, Minutes
Preparation
of Overview Board Meeting – The MB
discussed the main points and issues that will be presented at the OB
meeting on Monday 27 October 2008. The present shutdown of the LHC has a
number of consequences for the planning of WLCG, for the hardware
procurement for 2009, software and service upgrades and validation of the
2009 changes to the WLCG Services.
14 October 2008 – Agenda, Minutes
Mandate
of User Analysis Working Group - I.Bird
proposed a discussion on launching a “User Analysis working group”. Past
discussions involved IT and CERN in order to provide management of disk
space for analysis activity at the CAF at CERN.
It is now time to address the general issue of supporting user
analysis at the WLCG Sites.
LCG
Operations Weekly Report – Summary
of the status and progress of the LCG Operations. The daily meeting summaries
are always available (Link). There is regular local
participation from all CERN IT Physics Groups and systematic remote
participation by BNL, RAL, PIC, NIKHEF and GRIF; other sites only
participate irregularly.
Change
of Benchmarking CPU Units – The MB agreed on the mandate for
the Benchmarking Group. The group needs to define the
conditions under which the new benchmark is run and agree on the conversion
factors to transform the Experiments’ requirements and Sites’ pledges from
the SI2K units to the new benchmarking units. The transition needs to be
prepared carefully, and all costs need to be equivalent to the current ones,
but expressed in the new CPU units.
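The pledge conversion described above amounts to a simple rescaling. The sketch below illustrates it; the conversion factor and the site names are purely hypothetical placeholders, since the actual factor was still to be agreed by the Benchmarking Group.

```python
# Hypothetical illustration of converting CPU pledges from kSI2K to a
# new benchmarking unit. FACTOR is a made-up placeholder value; the real
# conversion factor was still to be agreed by the Benchmarking Group.
FACTOR = 4.0  # assumed: new benchmark units per kSI2K

def convert_pledge(pledge_ksi2k: float, factor: float = FACTOR) -> float:
    """Express a CPU pledge given in kSI2K in the new benchmark units."""
    return pledge_ksi2k * factor

# Example (fictitious) pledges, converted site by site.
pledges_ksi2k = {"Tier1-A": 5000.0, "Tier1-B": 3200.0}
converted = {site: convert_pledge(p) for site, p in pledges_ksi2k.items()}
```

The point of fixing a single agreed factor is that requirements and pledges stay directly comparable before and after the transition, only the unit label changes.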
Tier-2
Accounting Reports - Over the summer there was the
request to improve the Tier-2 accounting reports: the MB discussed and
agreed on the proposal presented by I.Bird.
ATLAS
Quarterly Report – D.Barberis presented the
ATLAS Quarterly Report for 2008Q3.
7 October 2008 – Agenda, Minutes
LCG
Operations Weekly Report -
Detailed “post-mortem” reports were delivered this week regarding RAL, PIC
and NL-T1. Notes from each daily meeting are available: Link
ALICE
Quarterly Report – Y.Schutz presented the
ALICE Quarterly Report for 2008Q3.
CMS
Quarterly Report – M.Kasemann presented the
CMS Quarterly Report for 2008Q3.
LHCb
Quarterly Report – Ph.Charpentier presented
the LHCb Quarterly Report for 2008Q3.
30 September 2008 – Agenda, Minutes
LCG
Operations Weekly Report -
J.Shiers presented a summary of status and progress of the LCG Operations;
the report covered the last two weeks. Notes from each daily meeting can be
found on these pages: Link. The ATLAS conditions DB high load seen at
several Tier-1 sites was discussed; technical discussions were held and a
plan for resolution is being defined. Reminder: problems raised at the daily
operations meeting should have an associated GGUS ticket/elog entry. The
second week was overshadowed by news bulletins (from the DG) about LHC delays.
The clear message at the WLCG session in EGEE’08 was to continue with the
service as it is now, but there are some pending tasks that can now be
planned and scheduled in a non-disruptive way, e.g. the migration of FTS
services at the Tier-0 and Tier-1s to SL(C)4.
Impact of
Accelerator Delay - The MB discussed the new scenario and agreed that
it is important to avoid or minimize effects on 2009 hardware procurement.
It is now possible to perform the upgrades of software and services that had
been postponed, or were not ready. A real CCRC’09 challenge could be needed
if many updates are made. All main upgrades should be finished by the end of
January 2009 in order to be properly tested.
Service Changes at CERN - T.Cass presented the changes
that could take effect because of the delays. The following will be
possible at CERN: (1) Worker Nodes (WN) migrated to SL(C)5 once
certification is complete; (2) FTS 2.1 deployment on SLC4.
16 September 2008 – Agenda, Minutes
2009
Procurement – The MB reviewed the status
and issues concerning 2009 hardware procurement at each Tier-0 and Tier-1
Site. Details on status of purchasing and installations were analyzed.
End
User Analysis at CERN - B.Panzer presented the
proposal, defined by a working group involving IT and the Experiments, on
the organization of end-user analysis at CERN. CERN IT will provide a
CASTOR instance from October 2008 dedicated to user analysis. It will
consist of a storage pool of 100 TB for ATLAS and CMS.
LCG
Operations Weekly Report -
J.Shiers presented a summary of status and progress of the LCG Services.
Notes from each daily meeting can be
found on these pages: Link
9 September 2008 – Agenda, Minutes
High
Level Milestones Update - The
MB reviewed the High Level Milestones for the Tier-0 and Tier-1 sites, in
particular focusing on 24x7 Support, VO Boxes Support and 2008/2009
hardware procurement.
SRM
V2 Reliability Tests – The new SRMv2 tests are
now in production, but with alarming turned off; the Data Management tests
and possible improvements to the availability calculations were also
discussed. The current situation of the SRM instances in EGEE is as
follows: still 278 SRMv1 vs. 296 SRMv2 instances.
Site availability
calculations currently take only SRMv1 interfaces into account, even though
SRMv2 instances are predominant. For this reason the SAM team has developed
SRMv2-specific probes that will be put into production in the near future.
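As a minimal sketch of why the new probes matter for the availability figure: once SRMv2-specific probes are included, a failing SRMv2 endpoint lowers a site's availability even if its SRMv1 tests pass. The probe names and the equal weighting below are illustrative assumptions, not the SAM team's actual algorithm.

```python
# Illustrative sketch only: site availability as the fraction of critical
# probes that passed, with SRMv2 probes counted alongside the older ones.
# Probe names and equal weighting are assumptions, not SAM's real scheme.
def availability(probe_results: dict[str, bool]) -> float:
    """Fraction of critical probes that passed (0.0 if none ran)."""
    if not probe_results:
        return 0.0
    return sum(probe_results.values()) / len(probe_results)

results = {               # hypothetical probe outcomes for one site
    "CE-job-submit": True,
    "SRMv1-get": True,
    "SRMv2-put": False,   # new SRMv2-specific probe failing
    "SRMv2-ls": True,
}
site_avail = availability(results)  # 3 of 4 probes passed -> 0.75
```

With only the SRMv1 probe considered, this site would have looked fully available; including the SRMv2 probes exposes the failing interface.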
General
News and Events
LCG Meetings - Calendar
11-12 November 2008 – Grid Deployment Board (GDB) and Pre-GDB Meetings at CERN.
Pre-GDB Agenda, GDB Agenda
11-12 November 2008 – Distributed Database Operations Workshop at CERN.
Agenda
13-14 November 2008 – WLCG 2009 Data-Taking Readiness Planning Workshop at CERN.
Agenda
9-10 December 2008 – Grid Deployment Board (GDB) and Pre-GDB Meetings at CERN.
Pre-GDB Agenda, GDB Agenda