NSS2008DemoAndTutorial (2009-02-25, JakubMoscicki)
---+ Short Course NSS2008

%TOC%

---++ Setup environment (AFS)

=source /afs/cern.ch/sw/arda/install/DIANE/Geant4/G4DemoNSS2008/env.sh=

---++ Demo 1: compile and run a brachytherapy simulation locally

The objective is to understand the basics of a Geant4 simulation application.

<verbatim>
$ cd apps/G4Brachy
$ source setup.sh
$ gmake
$ ./workdir/bin/Linux-g++/Brachy macros/default.mac 1
$ show_hist brachytherapy.xml 20 30
$ gmake clean
</verbatim>

---+++ Exercise 1: Running a Geant4 application: particle beam

Perform the following steps:

   * Compile the Geant4 particle beam example application. Solution:
<verbatim>
$ cd apps/G4ParticleBeam
$ source setup.sh
$ gmake
</verbatim>
   * Execute the particle beam example using the macro file =electronbeam.mac=. The application simulates 5.5 MeV electrons impinging on a water box and computes the longitudinal energy deposition profile inside the box. By default 100 events are simulated. Solution:
<verbatim>
$ ./workdir/bin/Linux-g++/ParticleBeam macros/electronbeam.mac 1
</verbatim>
   * The simulation program created an XML file (=simoutput.xml=) containing a histogram "energydeposit". Visualize the computed energy deposition profile by executing:
<verbatim>
$ show_hist simoutput.xml energydeposit
</verbatim>
   * Rerun the application with an increased number of events (3000) and visualize the output. Solution: change the following command in the macro file:
<verbatim>
/run/beamOn 3000
</verbatim>
   and execute again:
<verbatim>
$ ./workdir/bin/Linux-g++/ParticleBeam macros/electronbeam.mac 1
$ show_hist simoutput.xml energydeposit
</verbatim>
   * As a next step, change the incident angle of the beam to 60 deg, rerun the application and visualize the results. Solution: change the following command in the macro file:
<verbatim>
/source/incidentAngle 60.0 deg
</verbatim>

---++ Demo 2.0: Ganga Primer

The objective here is to learn how Ganga works using very basic commands.
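Before typing the commands below, it helps to keep Ganga's job model in mind: a job bundles an application, its arguments and a backend (local execution by default, =LCG()= for the Grid), and moves through statuses such as ='submitted'= and ='completed'=. The following plain-Python stand-in is only a sketch of that model (the =ToyJob= class is illustrative and not the real Ganga =Job= API, which the ganga shell provides):

```python
# Plain-Python stand-in for Ganga's job model (illustrative only; the real
# Job class comes from the ganga interactive shell used in this demo).

class ToyJob:
    def __init__(self, exe, args=None, backend='Local'):
        self.exe = exe
        self.args = args or []
        self.backend = backend
        self.status = 'new'       # jobs start in the 'new' state

    def copy(self):
        # Like j.copy() in the demo: same configuration, fresh status.
        return ToyJob(self.exe, list(self.args), self.backend)

    def submit(self):
        self.status = 'submitted'

j = ToyJob('say_hello', ['DRESDEN'])
j.submit()
j2 = j.copy()            # a new, not-yet-submitted job
j2.backend = 'LCG'       # send the copy to the Grid instead
print(j.status, j2.status)  # submitted new
```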
Create a small executable file (using an editor or the shell =cat= command):

<verbatim>
$ cat > say_hello
#!/usr/bin/env bash
echo Hello $*
^D
$ chmod u+x say_hello
$ ./say_hello XXX
Hello XXX
</verbatim>

Submit jobs using ganga (interactively):

<verbatim>
$ ganga
j = Job()
j.application.exe=File('say_hello')
j.application.args = ['DRESDEN']
j.name='Hello Dresden!'
print j
j.submit()
j2 = j.copy()
j2.application.args = ['GRID','AUS', 'DRESDEN!!!']
j2.name='Hello Grid!'
j2.backend=LCG()
j2.submit()
</verbatim>

Useful commands and options in the interactive shell:

<verbatim>
# optionally select the CE, one of: j.backend.CE=
jobs
jobs[-1]
jobs.select(status='submitted')
!ls -l $j.outputdir
j.peek()
j.peek('stdout','cat')
j.peek('stderr','cat')
j2.peek('stdout.gz','zcat')
j2.peek('stderr.gz','zcat')
</verbatim>

---++ Demo 2.1: Geant4 with Ganga

First create a small wrapper file which will source the environment, compile and run the application. The entire =G4Brachy= directory is packed into a tarfile and sent with the job. Use an editor or the =cat= command. Remember to run =gmake clean= in the =apps/G4Brachy= directory to avoid shipping unnecessary binary files.

<verbatim>
$ cd apps
$ tar cfz G4Brachy.tgz G4Brachy
$ cat > run_brachy
#!/usr/bin/env bash
tar xfzv G4Brachy.tgz
cd G4Brachy
source setup.sh
gmake clean  # just in case we forgot to clean before the job was sent
gmake
./workdir/bin/Linux-g++/Brachy macros/default.mac $1
mv *.xml ..  # move all outputs one level up so Ganga will find them
^D
$ chmod u+x run_brachy
</verbatim>

<verbatim>
$ ganga
j = Job()
j.inputsandbox=[File('G4Brachy.tgz')]
j.outputsandbox=['brachytherapy.xml']
j.name='G4Brachy'
j.application.exe=File('./run_brachy')
j.application.args=['1']
j.submit()
j2 = j.copy()
j2.backend=LCG()
j2.submit()
if j.status == 'completed': !show_hist $j.outputdir/brachytherapy.xml 20 30
</verbatim>

---+++ Exercise 2.1: Particle Beam

To be defined.
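The =if j.status == 'completed'= check above only succeeds if the job has already finished when you type it. A small polling loop handles the general case; the sketch below is plain Python against a stand-in job object (=wait_for= and =FakeJob= are hypothetical names for this illustration; only the =status= attribute mirrors the Ganga job object shown above):

```python
import time

# Illustrative polling helper: wait until a job reaches a final state.
# 'wait_for' and 'FakeJob' are hypothetical; only the 'status' attribute
# mirrors the Ganga job object used in the demo.
def wait_for(job, timeout=600.0, poll=1.0):
    """Return True if the job completes within `timeout` seconds."""
    deadline = time.time() + timeout
    while job.status not in ('completed', 'failed'):
        if time.time() >= deadline:
            return False
        time.sleep(poll)
    return job.status == 'completed'

class FakeJob:
    """Stand-in that 'completes' after a few status queries."""
    def __init__(self):
        self._polls = 0

    @property
    def status(self):
        self._polls += 1
        return 'completed' if self._polls > 3 else 'running'

print(wait_for(FakeJob(), timeout=10.0, poll=0.01))  # True
```

At the real Ganga prompt the loop body would query =j.status= in exactly the same way before running =show_hist= on the output.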
---++ Demo 2.2: Geant4 with Ganga (split jobs)

Objective: simulate more events using a fixed number of processors.

Now the code gets more complex, so we will save it in a file and let ganga execute it in one go.

<verbatim>
$ cd apps
$ cat > submit_brachy
j = Job()
j.inputsandbox=[File('G4Brachy.tgz')]
j.outputsandbox=['brachytherapy.xml']
j.name='G4Brachy'
j.application.exe=File('./run_brachy')
j.application.args=['1']
j.backend=LCG()
j.splitter = ArgSplitter()
for i in range(10):
    j.splitter.args.append([str(i)])
j.submit()
^D
</verbatim>

Now you may run all the commands in one go:

<verbatim>
$ ganga submit_brachy
</verbatim>

Start ganga in the interactive mode and have a look at the job - it should contain 10 subjobs:

<verbatim>
$ ganga
j = jobs[-1]
print j
</verbatim>

---+++ Exercise 2.2: Particle Beam

Increase the statistics.

---++ Demo 3.0: Geant4 with Ganga + DIANE on the Grid

Objective: simulate more events using a variable number of processors (up to a specified number) and splitting the job into many smaller pieces.

The easiest way to understand what's going on is to start two windows.

Window 1: the master will receive the intermediate results and will print messages to show what is going on.

<verbatim>
diane-run apps/G4Brachy/macros/iodiumsource-diane.run
</verbatim>

Take note of the run directory.

Window 2: submission of jobs and visualisation.

<verbatim>
ganga LCGSubmitter.py --diane-worker-number=10  # 10 jobs on the Grid
ganga Local.py --diane-worker-number=1  # one job locally
show_hist_live rundir/merged1D_brachytherapy.xml 20 30
</verbatim>

---+++ Exercise 3.0: Particle Beam

IMPORTANT: modify the port number in the run file!

TBD.

-- Main.JakubMoscicki - 17 Oct 2008
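A note on the =merged1D_brachytherapy.xml= file watched by =show_hist_live= in Demo 3.0: each DIANE worker returns a partial histogram, and the master combines them as they arrive. Conceptually the merge of 1D histograms with identical binning is a bin-wise sum, as in this standalone sketch (plain Python lists standing in for the AIDA XML histograms; =merge_histograms= is an illustrative name, not the DIANE API):

```python
# Conceptual sketch of merging per-worker 1D histograms (bin-wise sum).
# Plain-Python stand-in; the real DIANE merge operates on AIDA XML files.

def merge_histograms(histograms):
    """Sum same-length bin-content lists element by element."""
    assert histograms and all(len(h) == len(histograms[0]) for h in histograms)
    return [sum(bins) for bins in zip(*histograms)]

# Three workers, four bins each:
partial = [[1, 0, 2, 1],
           [0, 3, 1, 0],
           [2, 1, 0, 1]]
print(merge_histograms(partial))  # [3, 4, 3, 2]
```

This is why the merged histogram's statistics grow as more workers report back, which is exactly what =show_hist_live= visualizes.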