TWiki > LCG Web > LCGServiceChallenges > ServiceSchedule > SC4ExperimentPlans > AlicePlans (2007-11-01, HarryRenshall)
-- Main.HarryRenshall - 06 Mar 2006

Last Updated 1.11.2007: Clarify site resource offers as the real percentage of requirements and when normalised to 100% of the ALICE requirements.
Updated 05.10.2007: Add the new 2008 Tier-1 pledge percentages (normalised to 100%) and the Full Scale Dress Rehearsal (FDR) plans for end 2007/early 2008.
Updated 25.06.2007: Split off the 2006 plans into a separate linked page and remove the LHC engineering run.
Updated 5.03.2007: Add planning for a full-scale dress rehearsal of p-p running beginning in April 2007 and continuing throughout the year.
Updated 25.10.2006: Continue the data export tests until end 2006 and add resource requirements for all of 2007.
Updated 18 August: Continue the July 300 MB/s export tests until successful.
Updated 4 August: Correct the T1 cpu resources required for the network/reconstruction stress test; in fact half the resources will come from T2 sites.
Updated 12 June: Add the scheduled dates of 24 July to 6 August for the T0 to T1 data export tests.
Updated 2 June: Add the planned July Tier-0 to Tier-1 tests at an aggregate rate from CERN of 300 MB/s.

---+++ ALICE Tier 1 Resource Requirements Timetable for 2006

|Tier 1 FZK-Karlsruhe to provide 20% of 2006 resources|
|Tier 1 IN2P3 to provide 9% of 2006 resources|
|Tier 1 CNAF to provide 7% of 2006 resources|
|Distributed Tier 1 NDGF to provide 9% of 2006 resources|

The detailed 2006 plans have moved to AliceTimeTable2006.

---+++ ALICE Tier 1 Resource Requirements Timetable for 2007/8 (under-resourced by 42% if no USA site)

2007/1Q2008 Tier-1 site resource offers, averaging cpu+disk+tape:

|FZK-Karlsruhe to provide 29% of 2007 Tier-1 cpu resources|
|IN2P3 to provide 17% of 2007 Tier-1 cpu resources|
|CNAF to provide 19% of 2007 Tier-1 cpu resources|
|NDGF to provide 21% of 2007 Tier-1 cpu resources|
|NIKHEF to provide 6% of 2007 Tier-1 cpu resources|
|RAL to provide 1% of 2007 Tier-1 cpu resources|
|USA site to provide 7% of 2007 Tier-1 cpu resources|

2Q2008/9 Tier-1 site resource offers, averaging cpu+disk only.
The percent shares normalised to 100% of the experiment requirements are shown in brackets:

|FZK-Karlsruhe offers 25% (39%) of 2008 Tier-1 cpu+disk resources|
|IN2P3 offers 9% (14%) of 2008 Tier-1 cpu+disk resources|
|CNAF offers 7% (11%) of 2008 Tier-1 cpu+disk resources|
|NDGF offers 12% (18%) of 2008 Tier-1 cpu+disk resources|
|NIKHEF offers 3.5% (5.5%) of 2008 Tier-1 cpu+disk resources|
|RAL offers 1% (1.5%) of 2008 Tier-1 cpu+disk resources|
|USA site offers 7% (11%) of 2008 Tier-1 cpu+disk resources|

---+++ ALICE Distribution of activities over 2007/8

|Month|ALICE Requirements|
|January 2007|During the first quarter, build up to a data challenge of 75% of the last-quarter (data-taking) capacity, using new site capacity as and when available. Require up to 2325 KSi2K cpu, 720 TB disk and 1500 TB tape over the 7 Tier-1s.|
|February|As January: continue building up to the 75% data challenge using new site capacity as and when available. Require up to 2325 KSi2K cpu, 720 TB disk and 1500 TB tape over the 7 Tier-1s.|
|March|As January: continue building up to the 75% data challenge using new site capacity as and when available. Require up to 2325 KSi2K cpu, 720 TB disk and 1500 TB tape over the 7 Tier-1s.|
|April|Require up to 2325 KSi2K cpu, 720 TB disk and 1500 TB tape over the 7 Tier-1s. Start the full-scale dress rehearsal of p-p running, with raw data (at 50 MB/s) and ESD (10% of the raw) exported from CERN, reconstruction at the Tier-1s, and user analysis and simulation at the Tier-2s. The export rate from CERN will reach 55 MB/s: 10 MB/s to CNAF, 9 MB/s to IN2P3, 16 MB/s to FZK, 1 MB/s to RAL, 3 MB/s to NIKHEF, 12 MB/s to NDGF and 4 MB/s to the USA. The data are to be stored in Tape1Disk1-class storage, but with ALICE managing the disk space.|
|May|Require up to 2325 KSi2K cpu, 720 TB disk and 1500 TB tape over the 7 Tier-1s.
The maximum export rate from CERN of 230 MB/s will be 38 MB/s to CNAF, 38 MB/s to IN2P3, 60 MB/s to FZK, 4 MB/s to RAL, 23 MB/s to NIKHEF, 38 MB/s to NDGF and 30 MB/s to the USA.|
|June|Require up to 2325 KSi2K cpu, 720 TB disk and 1500 TB tape over the 7 Tier-1s. Maximum export rate from CERN of 230 MB/s, with the same per-site split as May.|
|July|As June.|
|August|As June.|
|September|As June.|
|October|As June.|
|November|Start the full-scale dress rehearsal of p-p running with detector cosmics plus injected simulated raw data. Raw data (at up to 60 MB/s) and ESD (10% of the raw) are exported from CERN after first-pass reconstruction at CERN. The export rate from CERN will reach 66 MB/s: 7 MB/s to CNAF, 9 MB/s to IN2P3, 26 MB/s to FZK, 1 MB/s to RAL, 4 MB/s to NIKHEF, 12 MB/s to NDGF and 7 MB/s to the USA. The data are to be stored in Tape1Disk1-class storage, but with ALICE managing the disk space.
The injected Monte Carlo data are to be deleted later, but the fraction is not yet known.|
|December|Continue the full-scale dress rehearsal of p-p running with detector cosmics plus injected simulated raw data, as in November (raw data export at up to 60 MB/s plus ESD at 10% of the raw after first-pass reconstruction at CERN, same per-site rates, Tape1Disk1-class storage with ALICE managing the disk space, injected Monte Carlo data to be deleted later). Start reconstruction at the Tier-1s and user analysis and simulation at the Tier-2s (simulation is a continuous Tier-2 activity).|
|January 2008| |
|February|Restart the full-scale dress rehearsal of p-p as the ALICE participation in the CCRC'08 February functional test running (2 weeks planned), with detector cosmics plus injected simulated raw data, and building up the online detector algorithms and quality assurance. Raw data (at up to 60 MB/s) and ESD (10% of the raw) are exported from CERN after first-pass reconstruction at CERN. The export rate from CERN will reach 66 MB/s: 7 MB/s to CNAF, 9 MB/s to IN2P3, 26 MB/s to FZK, 1 MB/s to RAL, 4 MB/s to NIKHEF, 12 MB/s to NDGF and 7 MB/s to the USA. The data are to be stored in Tape1Disk1-class storage, but with ALICE managing the disk space. The injected Monte Carlo data are to be deleted later, but the fraction is not yet known. Continue reconstruction at the Tier-1s and user analysis and simulation at the Tier-2s.|
|March|Continue as February.|
|April|Continue as February. For 2008 running, require 10100 KSi2K cpu, 4000 TB disk and 5800 TB tape over the 7 Tier-1s.|
|May|Continue as February, as the ALICE participation in the CCRC'08 May full nominal p-p rate running (4 weeks planned).|
|June| |
|July|Start of the Pilot Physics Run.|
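The relation between the raw 2008 offers and the bracketed shares (normalised to 100% of what was actually pledged) can be reproduced with simple arithmetic. A minimal sketch, using the 2Q2008/9 cpu+disk offer figures quoted above; the site keys and rounding to one decimal are illustrative choices, not part of the plan:

```python
# Renormalise 2008 Tier-1 cpu+disk offers (real % of the ALICE requirement)
# to shares of the total that was actually pledged.
offers = {
    "FZK": 25.0, "IN2P3": 9.0, "CNAF": 7.0, "NDGF": 12.0,
    "NIKHEF": 3.5, "RAL": 1.0, "USA": 7.0,
}
pledged = sum(offers.values())      # total pledged: 64.5% of the full request
shortfall = 100.0 - pledged         # unmet fraction of the request
normalised = {site: round(100.0 * pct / pledged, 1) for site, pct in offers.items()}
print(f"pledged {pledged}% of the request, shortfall {shortfall}%")
print(normalised)
```

Dropping the USA site's 7% from the 64.5% pledged leaves 57.5%, i.e. a shortfall of about 42%, matching the "under-resourced by 42% if no USA site" figure in the heading above.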
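The per-site export rates quoted in the table can be cross-checked against their aggregate targets by summing each split. A small sketch under that assumption (the dict layout and site order are illustrative); note that the May-October per-site figures actually sum to 231 MB/s against the quoted 230 MB/s aggregate:

```python
# Cross-check the quoted per-site export rates (MB/s) against the aggregate
# rate from CERN. Site order: CNAF, IN2P3, FZK, RAL, NIKHEF, NDGF, USA.
splits = {
    55:  [10, 9, 16, 1, 3, 12, 4],     # April 2007 dress-rehearsal target
    66:  [7, 9, 26, 1, 4, 12, 7],      # Nov 2007 / Feb 2008 target
    230: [38, 38, 60, 4, 23, 38, 30],  # May-October 2007 maximum
}
for target, rates in splits.items():
    total = sum(rates)
    flag = "ok" if total == target else f"off by {total - target:+d}"
    print(f"aggregate {target:3d} MB/s: split sums to {total:3d} MB/s ({flag})")
```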
Topic revision: r23 - 2007-11-01 - HarryRenshall