---+ Geant4 Release Regression Testing on the Grid

%TOC%

---++ Geant4 Production: Instructions

---+++ Introduction

This page describes all steps required to set up and run the Geant4 production. Two areas are of importance for the production procedure:

   * *RUN AREA* /afs/cern.ch/sw/arda/install/DIANE/Geant4/G4Prod
     The required subdirectories are:
      * run: contains all necessary scripts for executing the production
      * cand: directory from which the Geant4 candidate is downloaded to the sites
   * *OUTPUT AREA* /afs/cern.ch/sw/geant4/stat_testing/june08
     The required subdirectories are:
      * results: used for storing all the outputs
      * code: used for storing the Geant4 candidate versions

---+++ Preparing production run

---++++ Setting up environment (required for each run)

All commands assume the *bash* shell.

---+++++ Creating output directory

Create a new output directory in the output area, then create the required subdirectories within it. E.g.

<verbatim>
OUTPUT_DIR=/afs/cern.ch/sw/geant4/stat_testing/june08/results/output_$USER
mkdir -p $OUTPUT_DIR
cd $OUTPUT_DIR
mkdir -p diane
mkdir -p gangadir
</verbatim>

---+++++ Placing DIANE application adapter

Get the DIANE application adapter from CVS:

<verbatim>
cd $OUTPUT_DIR/diane
cvs -d :kserver:isscvs.cern.ch:/local/reps/diane co apps/G4Production
</verbatim>

---+++++ Setting up grid/tool environment

Set up the grid/tool environment by executing the following commands in *bash*:

<verbatim>
bash
source /afs/cern.ch/sw/arda/install/DIANE/Geant4/G4Prod/run/prodsetup_slc4.sh
</verbatim>

The ganga configuration is in =/afs/cern.ch/sw/arda/install/DIANE/Geant4/G4Prod/run/ganga-geant4-june08-config.ini=. Normally you do not need to change it, so make sure that your =~/.gangarc= configuration file does *not* contain unnecessary modifications (use =ganga -g= to create an "empty" configuration file).
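For convenience, the three setup steps above can be collected into a single file. A minimal sketch, assuming the paths shown above and a valid Kerberos token for the CVS =kserver= access; =source= it from a *bash* shell so that the environment settings persist (this helper is not part of the production scripts):

<verbatim>
# setup_g4prod.sh -- illustrative helper; run with:  source setup_g4prod.sh
OUTPUT_DIR=/afs/cern.ch/sw/geant4/stat_testing/june08/results/output_$USER

# Create the output directory and its subdirectories
mkdir -p $OUTPUT_DIR/diane $OUTPUT_DIR/gangadir

# Check out the DIANE application adapter
cd $OUTPUT_DIR/diane
cvs -d :kserver:isscvs.cern.ch:/local/reps/diane co apps/G4Production

# Set up the grid/tool environment
source /afs/cern.ch/sw/arda/install/DIANE/Geant4/G4Prod/run/prodsetup_slc4.sh
</verbatim>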
---++++ Creating the task scripts describing the physics configurations

---+++++ Creating directory for task scripts

Create a subdirectory, which will contain the task scripts, in the directory /afs/cern.ch/sw/arda/install/DIANE/Geant4/G4Prod/run/taskscripts. E.g.

<verbatim>
cd /afs/cern.ch/sw/arda/install/DIANE/Geant4/G4Prod/run/taskscripts
mkdir cand1_QGSP
</verbatim>

---+++++ Creating task scripts

Change to the directory created in the previous step and use the script =create_pyscripts.pl= in combination with the template =executable.template= to create the required task scripts. IMPORTANT: change the reference and candidate names in the template to the current tags before creating the task scripts. E.g.

<verbatim>
perl create_pyscripts.pl -pydir cand1_QGSP -template executable.template
</verbatim>

---++++ Adapting the job description file and placing the candidate version

---+++++ Copying candidate to download directory

Copy the current Geant4 candidate tarball to the directory /afs/cern.ch/sw/arda/install/DIANE/Geant4/G4Prod/cand

---+++++ Preparing DIANE job description file

Make the following two changes in the DIANE job description file /afs/cern.ch/sw/arda/install/DIANE/Geant4/G4Prod/run/runfiles/G4Prod_$USER.run

   1. Change =local_exe_dir= to the full path of the =taskscripts= subdirectory which was created above and which contains the task scripts. E.g. =local_exe_dir = '/afs/cern.ch/sw/arda/install/DIANE/Geant4/G4Prod/run/taskscripts/cand1_QGSP'=
   1. Change =local_cand_name= to the name of the current candidate tarball. E.g. =local_cand_name = 'g4prod-1.tgz'=
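These two edits can also be made from the shell. A minimal sketch using =sed= (illustrative only: it assumes the run file already contains single-quoted =local_exe_dir= and =local_cand_name= assignments, and it uses the example values from above):

<verbatim>
RUNFILE=/afs/cern.ch/sw/arda/install/DIANE/Geant4/G4Prod/run/runfiles/G4Prod_$USER.run
cp $RUNFILE $RUNFILE.bak    # keep a backup before editing

# Point local_exe_dir at the task-script directory created above
sed -i "s|^local_exe_dir *=.*|local_exe_dir = '/afs/cern.ch/sw/arda/install/DIANE/Geant4/G4Prod/run/taskscripts/cand1_QGSP'|" $RUNFILE

# Point local_cand_name at the current candidate tarball
sed -i "s|^local_cand_name *=.*|local_cand_name = 'g4prod-1.tgz'|" $RUNFILE
</verbatim>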
---++++ Invoking production run (on lxb7232.cern.ch)

Run the following command in the directory /afs/cern.ch/sw/arda/install/DIANE/Geant4/G4Prod/run

=env ORBendPoint=giop:tcp::23001 diane-run /afs/cern.ch/sw/arda/install/DIANE/Geant4/G4Prod/run/runfiles/G4Prod_$USER.run=

In another window, submit the worker agents:

<verbatim>
cd /afs/cern.ch/sw/arda/install/DIANE/Geant4/G4Prod/workers
diane-env -d `which ganga` LCG.py --diane-worker-number=10
</verbatim>

Hint: CERN CE selection: =--CE ce117.cern.ch:2119/jobmanager-lcglsf-grid_geant4= (CEs ce101 to ce117 are available)

Hint: to select several CEs at once use the =--CE-list= option; alternatively, the submission can simply be repeated per CE, as in the sketch below.
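A minimal sketch submitting workers to several CEs in a loop, assuming the =--CE= option shown above (the chosen CE numbers are examples):

<verbatim>
cd /afs/cern.ch/sw/arda/install/DIANE/Geant4/G4Prod/workers

# Submit 10 worker agents to each of three CERN CEs
for N in 115 116 117; do
  diane-env -d `which ganga` LCG.py --diane-worker-number=10 \
    --CE ce$N.cern.ch:2119/jobmanager-lcglsf-grid_geant4
done
</verbatim>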
---++++ Useful commands

Most commands accept =--help=.

Killing the master: =diane-master-ping kill=

Checking if the master is alive: =diane-master-ping=

Running a worker interactively for debugging:

<verbatim>
export VO_GEANT4_SW_DIR=/afs/cern.ch/sw/geant4/stat_testing/june08/code/dir32bits
diane-worker-start --workdir=/tmp/blah
</verbatim>
---++++ Managing multiple masters

Make sure that every diane-run is done on a unique port number (=env ORBendPoint=giop:tcp::23NNN=).

Every master (diane-run) starts in its own directory in =$DIANE_USER_WORKSPACE/runs/XXX=. The master prints out its directory at startup.

All commands by default use the *last* started master. However, you may specify the master number (XXX) to be used. The exact syntax depends on the command (this will be made uniform in the next release):

   * to submit worker agents to the master XXX add the following option: =--diane-master=workspace:XXX=
   * to kill master XXX: =diane-master-ping -f $DIANE_USER_WORKSPACE/runs/XXX/MasterOID kill=

There is a helper command =./current_master= which prints the directory of the last started master (this command is in the same directory as the other submission scripts =LCG.py=, =LSF.py=, etc.).
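Putting this together, running two masters side by side could look as follows. A minimal sketch, assuming the port convention above; the second run file =G4Prod_other.run= and the run number =001= are illustrative:

<verbatim>
# Start each master on its own port (in separate windows)
env ORBendPoint=giop:tcp::23001 diane-run /afs/cern.ch/sw/arda/install/DIANE/Geant4/G4Prod/run/runfiles/G4Prod_$USER.run
env ORBendPoint=giop:tcp::23002 diane-run /afs/cern.ch/sw/arda/install/DIANE/Geant4/G4Prod/run/runfiles/G4Prod_other.run

# Submit worker agents explicitly to master 001
diane-env -d `which ganga` LCG.py --diane-worker-number=10 --diane-master=workspace:001

# Kill master 001
diane-master-ping -f $DIANE_USER_WORKSPACE/runs/001/MasterOID kill
</verbatim>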
---+++ NSS2006 Paper

Get the source _(restricted access)_:

<verbatim>
cvs -d /afs/cern.ch/sw/arda/install/DIANE/Geant4/G4Prod/NSS2006_Geant4_paper/cvs co NSS2006_Geant4_paper
</verbatim>

-- Main.JakubMoscicki - 09 Oct 2006

   * [[%ATTACHURL%/Geant4-NSS-SanDiego.ppt][Geant4-NSS-SanDiego.ppt]]: Geant4-NSS-SanDiego.ppt