LCG Grid Deployment - CERN ROC - CERN ROC Procedures




WARNING: This page is obsoleted by CERNROCSamInstallationNew




SFT Test Suite for ROC

The installation of the SFT instance for the CERN ROC follows the model of the one set up for the Pre-Production Service.

Please refer to that page for hints and technical details about the installation; here we give only the pointers specific to the CERN ROC installations.

SFT clients on the AFS UI

Several clients are configured on the AFS UI to be used by the CERN ROC. All of them are installed in the main directory

/afs/cern.ch/project/gd/egee

Here we give the relevant details for each client (if you are already familiar with the SFT configuration, nothing more should be needed).

sft-prod-glite
All the production gLiteCEs; currently uses Antun Balaz's RB.

  Parameter           Value
  Client directory    sft-prod-glite
  RB                  g01.phy.bg.ac.yu
  SFT_GOC_MAP_SELECT  "select GocSite_v0_4.siteID,hostname,sitename,region,inMaintenance from GocSite_v0_4, GocNode_v0_4 where GocSite_v0_4.siteID=GocNode_v0_4.siteID and type='Production' and nodetype='gLite-CE' and monitor='Y' and inMonitoring='Y' order by GocSite_v0_4.siteID"
  Cron status         Enabled (Antonio)

sft-roc-cern
All the uncertified CEs and gLiteCEs in the CERN region (certification SFT).

  Parameter           Value
  Client directory    sft-roc-cern
  RB                  lxb2069.cern.ch
  SFT_GOC_MAP_SELECT  "select GocSite_v0_4.siteID,hostname,sitename,region,inMaintenance from GocSite_v0_4, GocNode_v0_4 where GocSite_v0_4.siteID=GocNode_v0_4.siteID and (nodetype='gLite-CE' or nodetype='CE') and monitor='Y' and inMonitoring='Y' and status<>'certified' and region='CERN' order by GocSite_v0_4.siteID"
  Cron status         Enabled (Antonio)

sft-pps-glite
This client is used as a backup for the PPS SFT (normally run by UPATRAS).

  Parameter           Value
  Client directory    sft-pps-glite
  RB                  lxb2059.cern.ch
  SFT_GOC_MAP_SELECT  "select GocSite_v0_4.siteID,hostname,sitename,region,inMaintenance from GocSite_v0_4, GocNode_v0_4 where GocSite_v0_4.siteID=GocNode_v0_4.siteID and (nodetype='gLite-CE' or nodetype='CE') and type='PPS' and monitor='Y' and inMonitoring='Y' and status='certified' order by GocSite_v0_4.siteID"
  Cron status         Disabled


Changes to the parameters described above need to be reflected in the configuration files as follows:

  Parameter           File(s) to be changed
  Client directory    submit-sft-glite-tests.sh, config-sft.cfg
  RB                  conf/prefRB.lst.glite
  SFT_GOC_MAP_SELECT  conf/defaults.glite
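To make the mapping concrete, here is a small sketch of how the RB of a client could be switched; `set_client_rb` is a hypothetical helper, not part of the sft2 package, and only `conf/prefRB.lst.glite` is the real file name from the table.

```shell
#!/bin/sh
# Hypothetical helper (a sketch, not part of the sft2 package):
# switch the RB used by a client by rewriting conf/prefRB.lst.glite,
# keeping a backup of the previous list.
set_client_rb() {
    client_dir=$1
    new_rb=$2
    rb_list="$client_dir/conf/prefRB.lst.glite"
    cp "$rb_list" "$rb_list.bak"   # keep the old preferred-RB list
    echo "$new_rb" > "$rb_list"    # one preferred RB per line
}

# e.g.: set_client_rb /afs/cern.ch/project/gd/egee/sft-roc-cern lxb2069.cern.ch
```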


The configuration is almost identical for all the clients, except for the parameters listed in the tables above.

  • The configuration file defaults.glite is
    SFT_JOB_SUBMIT_CMD=glite-job-submit
    SFT_JOB_STATUS_CMD=glite-job-status
    SFT_JOB_OUTPUT_CMD=glite-job-output
    SFT_JOB_LOGGING_CMD=glite-job-logging-info
    SFT_JOB_LISTMATCH_CMD=glite-job-list-match
    SFT_JOB_CANCEL_CMD=glite-job-cancel
    
    SFT_PUBLISHER_PROXY=http://lcg-sft-publish.cern.ch:8083/sft/publishTuple
       
    SFT_GOC_MAP_SELECT= See value in the table
    SFT_LCG_VER_FILTER="LCG-[23]_[4567890123]"
       

  • The default flavour of the SFT tests used by these clients is "glite". The defaults file is
    SFT_VO=dteam
    
    # default definitions for status codes
    SFT_OK=10
    SFT_INFO=20
    SFT_NOTICE=30
    SFT_WARNING=40
    SFT_ERROR=50
    SFT_CRITICAL=60
    
    SFT_TYPE=glite
    
    #SFT_LCG_CATALOG_TYPE=edg
    
    SFT_LCG_CATALOG_TYPE=lfc
    SFT_LFC_HOME=/grid/$SFT_VO/SFT
    
    SFT_SAME_PUBLISHER_WSDL=http://gvdev.cern.ch:8080/gridview/services/WebArchiver?wsdl
       

  • The tests.glite file looks like:
    sft-wn 
    sft-softver
    sft-caver --conf data/ca_data.dat --web
    sft-brokerinfo 
    sft-csh
    sft-lcg-rm
    sft-vo-tag
    sft-vo-swdir
    sft-rgma
    sft-rgma-sc
    sft-crl
    sft-apel
       

  • The list of SEs in prefSE.lst is
    grid007g.cnaf.infn.it
    srm.cern.ch
       

  • All clients write in a local working directory
    The SFT clients have been set up to write in /afs/cern.ch/project/gd/egee/ _client directory_ .
    E.g. for the sft-pps-glite client:
    > cat /afs/cern.ch/project/gd/egee/sft-pps-glite/config-sft.cfg
    SFT_WORK=/afs/cern.ch/project/gd/egee/sft-pps-glite/workdir-glite
    LCG_GFAL_INFOSYS=lxb2086.cern.ch:2170
       

    To use it you need to specify the configuration file explicitly on the command line.
    >  ./sftests -c config-sft.cfg submit
    >  ./sftests -c config-sft.cfg status
    >  ./sftests -c config-sft.cfg publish
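The three-step cycle above can be wrapped in one call; `run_cycle` below is a hypothetical convenience function (only `sftests` and `config-sft.cfg` are the real names), stopping at the first step that fails.

```shell
#!/bin/sh
# Sketch: run the usual submit/status/publish cycle for one client.
# run_cycle is a hypothetical name; it must be called from inside a
# client directory containing the sftests executable.
run_cycle() {
    cfg=$1
    for action in submit status publish; do
        ./sftests -c "$cfg" "$action" || { echo "failed at: $action" >&2; return 1; }
    done
}

# Only meaningful when an sftests executable is actually present:
if [ -x ./sftests ]; then
    run_cycle config-sft.cfg
fi
```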
       

  • Submission scripts have been created to run the clients directly from lxb1908. They are almost identical, with the exception of the _client directory_ . E.g. for the sft-roc-cern client:
    > cat /afs/cern.ch/project/gd/egee/sft-roc-cern/submit-sft-glite-tests.sh
    #!/bin/sh
    
    # use PPS ui
    source /afs/cern.ch/project/gd/egee/glite/ui_PPS_glite3.0_RC5/etc/profile.d/grid_env.sh
    
    # use local ui
    #source /etc/glite/profile.d/glite_setenv.sh
    
    cat /afs/cern.ch/user/a/aretico/private/pass | voms-proxy-init -voms dteam -pwstdin
    sleep 2
    /afs/cern.ch/project/gd/egee/sft-roc-cern-glite/sftests -c /afs/cern.ch/project/gd/egee/sft-roc-cern/config-sft.cfg publish
    sleep 2
    
    /afs/cern.ch/project/gd/egee/sft-roc-cern-glite/sftests -c /afs/cern.ch/project/gd/egee/sft-roc-cern/config-sft.cfg submit
    
    exit
    
       

Note on the CERN ROC certification RB

lxb2069.cern.ch is a gLiteWMS, which allows jobs to be sent to both LCG CEs and gLiteCEs. It uses lxb2086.cern.ch as its Information System.

lxb2086 is also a top-level BDII, which uses the configuration file in

/afs/cern.ch/project/gd/egee/www/roc-cern/bdii/cern-roc-all-sites.conf

This file is generated by merging the production BDII configuration file

/afs/cern.ch/project/gd/www/gis/lcg2-bdii/dteam/lcg2-all-sites.conf

with a list of sites under observation by the Cern ROC (e.g. suspended, candidate, uncertified sites)

/afs/cern.ch/project/gd/egee/www/roc-cern/bdii/observed-sites.conf

The script that creates the BDII configuration file (currently run in Antonio's acrontab) is:

[aretico@lxb1908 bdii] cat /afs/cern.ch/project/gd/egee/www/roc-cern/bdii/create-roc-bdii-conf.sh
#!/bin/sh

# to be run in user's crontab
# currently run in acrontab by Antonio
# 05 2 * * * lxplus.cern.ch /afs/cern.ch/project/gd/egee/www/roc-cern/bdii/create-roc-bdii-conf.sh 

BDII_PROD_CONF=/afs/cern.ch/project/gd/www/gis/lcg2-bdii/dteam/lcg2-all-sites.conf
BDII_OBS_CONF=/afs/cern.ch/project/gd/egee/www/roc-cern/bdii/observed-sites.conf
BDII_ROC_CONF=/afs/cern.ch/project/gd/egee/www/roc-cern/bdii/cern-roc-all-sites.conf

cat << EOF > ${BDII_ROC_CONF}
#
# ROC-CERN BDII configuration file.
# 
# This file is generated by the script
#
#    ${0}
#
# It is the result of merging the files
# ${BDII_PROD_CONF} 
# and 
# ${BDII_OBS_CONF}
# 
# Manual modifications by the CERN-ROC team should be done only in
# ${BDII_ROC_CONF}
#

# ----------------------------------
# Start of merged info
# ----------------------------------

EOF

cat ${BDII_PROD_CONF} ${BDII_OBS_CONF} >> ${BDII_ROC_CONF}

cat << EOF >> ${BDII_ROC_CONF}

# ----------------------------------
# End of merged info
# ----------------------------------

EOF
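After a run of the script, a quick sanity check can confirm that the merge completed; `check_merged_conf` is a hypothetical helper that simply looks for the two markers the script writes around the merged info.

```shell
#!/bin/sh
# Sketch: verify that a generated ROC BDII conf file contains both of
# the markers written by create-roc-bdii-conf.sh. check_merged_conf is
# a hypothetical name, not an existing tool.
check_merged_conf() {
    conf=$1
    grep -q "Start of merged info" "$conf" &&
    grep -q "End of merged info" "$conf"
}

# e.g.: check_merged_conf /afs/cern.ch/project/gd/egee/www/roc-cern/bdii/cern-roc-all-sites.conf
```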

Cron jobs

> cat /afs/cern.ch/project/gd/egee/gocdb-xfer/launch-gocdb-xfer.sh

#!/bin/sh

# use PPS ui
source /afs/cern.ch/project/gd/egee/glite/ui_PPS_glite3.0_RC5/etc/profile.d/grid_env.sh

cat /afs/cern.ch/user/a/aretico/private/pass | voms-proxy-init -voms dteam -pwstdin
sleep 2
/afs/cern.ch/project/gd/egee/gocdb-xfer/gocdb-xfer.py

exit
  • > acrontab -l

05 2 * * * lxplus.cern.ch /afs/cern.ch/project/gd/egee/www/roc-cern/bdii/create-roc-bdii-conf.sh
05 * * * * lxplus.cern.ch /afs/cern.ch/project/gd/egee/www/preproduction/bdii/create-pps-bdii-conf.sh > /afs/cern.ch/project/gd/egee/www/preproduction/bdii/create-pps-bdii-conf.log 2>&1
00 * * * * lxb1908.cern.ch /afs/cern.ch/project/gd/egee/gocdb-xfer/launch-gocdb-xfer.sh > /afs/cern.ch/project/gd/egee/gocdb-xfer/gocdb-xfer.log 2>&1
#35 * * * * lxb1908.cern.ch /afs/cern.ch/project/gd/egee/sft-pps-glite/submit-sft-glite-tests.sh > /afs/cern.ch/project/gd/egee/sft-pps-glite/sft-glite-cron.log 2>&1
45 * * * * lxb1908.cern.ch /afs/cern.ch/project/gd/egee/sft-roc-cern/submit-sft-glite-tests.sh > /afs/cern.ch/project/gd/egee/sft-roc-cern/sft-glite-cron.log 2>&1
55 * * * * lxb1908.cern.ch /afs/cern.ch/project/gd/egee/sft-prod-glite/submit-sft-glite-tests.sh > /afs/cern.ch/project/gd/egee/sft-prod-glite/sft-glite-cron.log 2>&1
   

NOTE: connections from lxb1908.cern.ch had to be authorized in advance by the administrators of the GOC DB.

Display

https://lcg-sft.cern.ch/sft-CERN-ROC/lastreport.cgi

Maintenance

Running test SFT jobs (if you are not Antonio)

All regular submission is done through cron jobs run with Antonio's proxy. If you need to debug a client and want to submit manually:

  • You need writing permissions in /afs/cern.ch/project/gd/egee
  • In the client directory make a copy of the config file config-sft.cfg
    cp config-sft.cfg my_config-sft.cfg
  • Edit the copy my_config-sft.cfg, changing the working directory name in it, e.g.:
    > cat my_config-sft.cfg
    SFT_WORK=/afs/cern.ch/project/gd/egee/sft-prod-glite/my-workdir-glite
    LCG_GFAL_INFOSYS=lxb2086.cern.ch:2170
       
  • Create your proxy
  • Use the submit, status and publish commands pointed to your config file
    > ./sftests -c my_config-sft.cfg submit
    > ./sftests -c my_config-sft.cfg status
    > ./sftests -c my_config-sft.cfg publish
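The copy-and-edit steps above can be sketched as one helper; `make_debug_config` is a hypothetical name, while `config-sft.cfg` and `SFT_WORK` are the real names from the client configuration.

```shell
#!/bin/sh
# Sketch: derive a private debug config from a client's config-sft.cfg,
# pointing SFT_WORK at a private working directory and leaving the cron
# config untouched. make_debug_config is a hypothetical helper.
make_debug_config() {
    client_dir=$1
    sed "s|^SFT_WORK=.*|SFT_WORK=$client_dir/my-workdir-glite|" \
        "$client_dir/config-sft.cfg" > "$client_dir/my_config-sft.cfg"
}

# e.g.: make_debug_config /afs/cern.ch/project/gd/egee/sft-prod-glite
```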
       

Upgrade CA

On an AFS UI (example for user=aretico)

setenv CVSROOT :ext:aretico@glite.cvs.cern.ch:/cvs/glite
[aretico@lxplus ~/cvs] cvs co sft2
...
[aretico@lxplus ~/cvs] cd /afs/cern.ch/project/gd/egee/sft-roc-cern-glite
[aretico@lxplus sft-roc-cern-glite] cp data/ca_data.dat data/ca_data.dat.bak
[aretico@lxplus sft-roc-cern-glite] cp ~/cvs/sft2/data/ca_data.dat data/ca_data.dat
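The backup-and-replace steps above can be condensed into a small helper; `update_ca_data` is a hypothetical name, while the `data/ca_data.dat` paths mirror the real sft2 layout shown in the commands.

```shell
#!/bin/sh
# Sketch: refresh a client's CA data from a fresh sft2 CVS checkout,
# keeping a backup of the previous file. update_ca_data is a
# hypothetical helper, not part of the sft2 package.
update_ca_data() {
    client_dir=$1    # e.g. /afs/cern.ch/project/gd/egee/sft-roc-cern-glite
    checkout_dir=$2  # e.g. ~/cvs/sft2 from the CVS checkout
    cp "$client_dir/data/ca_data.dat" "$client_dir/data/ca_data.dat.bak"
    cp "$checkout_dir/data/ca_data.dat" "$client_dir/data/ca_data.dat"
}
```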

-- Main.aretico - 29 Aug 2006

Topic revision: r7 - 2007-07-19 - AntonioRetico
 