This page is OBSOLETE: the DIRAC components can now be retrieved with the System Administration application of the new LHCb DIRAC web portal.

gLite WMS

  • wms216.cern.ch (Dedicated WMS at CERN)
  • wms203.cern.ch (Dedicated WMS at CERN)
  • wms010.cnaf.infn.it (Dedicated WMS at CNAF: Dell, dual quad-core (8 cores), 16GB)
  • lcgwms01.gridpp.rl.ac.uk (Shared WMS at RAL: dual quad-core SMP Xeon 2.33GHz (8 cores), 16GB)
  • lcgwms02.gridpp.rl.ac.uk (Shared WMS at RAL: dual quad-core SMP Xeon 2.33GHz (8 cores), 16GB)
  • wms.grid.sara.nl (Shared WMS at SARA: dual quad-core (8 cores) Dell 1950, 8GB)
  • rb03.pic.es (Shared WMS at PIC: 4 x Intel(R) Xeon(TM) CPU 3.20GHz, 4GB)
  • wms-1-fzk.gridka.de (Shared WMS at FZK: 8 cores, 16GB, latest version of middleware)
  • wms-2-fzk.gridka.de (Shared WMS at FZK: 8 cores, 16GB, latest version of middleware)
  • wms-3-fzk.gridka.de (Shared WMS at FZK: 8 cores, 16GB, latest version of middleware)

BDII

User Interface

  • lxplus.cern.ch (just sourcing /afs/cern.ch/project/gd/LCG-share/sl5/etc/profile.d/grid_env.(c)sh)
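A minimal sketch of that setup step for bash users, with a guard so the script fails gracefully on machines without AFS (the guard and status message are illustrative additions, not part of the official procedure; tcsh users would source grid_env.csh instead):

```shell
# Set up the grid environment on lxplus (bash).
# Path is the one documented above; the readability check is an added safeguard.
GRID_ENV=/afs/cern.ch/project/gd/LCG-share/sl5/etc/profile.d/grid_env.sh
if [ -r "$GRID_ENV" ]; then
    . "$GRID_ENV"
    STATUS="grid environment loaded"
else
    # Not on lxplus (or AFS not mounted): report instead of failing silently
    STATUS="grid_env.sh not found (not on lxplus?)"
fi
echo "$STATUS"
```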

Castor2 Clusters

LFC instances

Production FTSes

  • We now use a single FTS3 instance at CERN for FTS transfers, with RAL as the replacement instance. See [[https://fts3.cern.ch:8449/fts3/ftsmon/#/?vo=lhcb&source_se=&dest_se=&page=1][FTS3 monitoring]] for LHCb FTS3 transfers.
    • Legacy FTS2 endpoints:
    • prod-fts-ws.cern.ch
    • cclcgftsprod.in2p3.fr
    • lcgfts.gridpp.rl.ac.uk
    • fts.grid.sara.nl
    • fts.pic.es
    • fts-fzk.gridka.de
    • fts.cr.cnaf.infn.it
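The FTS3 monitoring link above is just the ftsmon web interface with query parameters in the URL fragment; a small sketch of how such a link is composed for other source/destination storage elements (the variable names are illustrative, only the host and parameter layout come from the link on this page):

```shell
# Compose an FTS3 monitoring (ftsmon) URL like the one linked above.
# Base host/port taken from this page; SE values here are placeholders.
FTSMON_BASE="https://fts3.cern.ch:8449/fts3/ftsmon/#/"
VO="lhcb"
SOURCE_SE=""   # e.g. a source storage element, left empty to match all
DEST_SE=""     # e.g. a destination storage element, left empty to match all
PAGE=1
FTSMON_URL="${FTSMON_BASE}?vo=${VO}&source_se=${SOURCE_SE}&dest_se=${DEST_SE}&page=${PAGE}"
echo "$FTSMON_URL"
```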

Tier-1 VO-boxes

Volhcb nodes (you can get some info with CDBDump volhcbXX)

| *Name* | *Alias* | *OS* | *Hardware* | *Services* | *Warranty* |
| volhcb03 | | SLC5 | Intel(R) Xeon(R) CPU L5420 @ 2.50GHz (2/8), 16GB RAM, 2x160GB HDD | Production machine: accounting services | 12 March 2012 |
| volhcb24 | | SLC5 | Intel(R) Xeon(R) CPU L5420 @ 2.50GHz (2/8), 16GB RAM, 2x160GB HDD | DIRAC production agents and services (2nd machine) | 28 April 2012 |
| volhcb25 | | SLC5 | Intel(R) Xeon(R) CPU L5420 @ 2.50GHz (2/8), 16GB RAM, 2x160GB HDD | LHCbDirac web development | 28 April 2012 |
| volhcb26 | | SLC5 | Intel(R) Xeon(R) CPU L5420 @ 2.50GHz (2/8), 16GB RAM, 2x160GB HDD | VMDIRAC | 28 April 2012 |
| volhcb28 | | SLC5 | Intel(R) Xeon(R) CPU L5520 @ 2.27GHz (2/8), 16GB RAM, 2x160GB HDD | DIRAC - DataManagement optimization development | 22 April 2013 |
| volhcb29 | | SLC5 | Intel(R) Xeon(R) CPU L5520 @ 2.27GHz (2/8), 16GB RAM, 2x160GB HDD | MergingForDQAgents, HammerCloud, a few other production services and agents | 22 April 2013 |
| volhcb30 | lhcb-cert-dirac | SLC6 | Intel(R) Xeon(R) CPU L5520 @ 2.27GHz (2/8), 16GB RAM, 2x160GB HDD | Certification machine, MySQL | 22 April 2013 |
| volhcb31 | | SLC5 | Intel(R) Xeon(R) CPU L5520 @ 2.27GHz (1/1), 2GB RAM, 1x100GB HDD | Jenkins | Virtual Machine |
| volhcb32 | | SLC5 | Intel(R) Xeon(R) CPU L5520 @ 2.27GHz (1/1), 2GB RAM, 1x100GB HDD | ? | Virtual Machine |
| volhcb33 | | SLC5 | Intel(R) Xeon(R) CPU L5520 @ 2.27GHz (1/1), 2GB RAM, 1x100GB HDD | ? | Virtual Machine |
| volhcb34 | | SLC5 | Intel(R) Xeon(R) CPU L5520 @ 2.27GHz (2/8), 48GB RAM, 3x1TB HDD | ? | ? |
| volhcb35 | | SLC5 | Intel(R) Xeon(R) CPU L5520 @ 2.27GHz (2/8), 24GB RAM, 2x500GB HDD | ? | ? |
| volhcb36 | | SLC5 | Intel(R) Xeon(R) CPU L5640 @ 2.27GHz (2/12), 48GB RAM, 3x2TB HDD | Accounting DB | ? |
| volhcb37 | | SLC5 | Intel(R) Xeon(R) CPU L5640 @ 2.27GHz (2/12), 48GB RAM, 3x2TB HDD | Accounting DB | ? |
| volhcb38 | | SLC? | Intel(R) Xeon(R) CPU L5640 @ 2.27GHz (2/12), 48GB RAM, 3x2TB HDD | not in production yet | ? |
| volhcb39 | | SLC? | Intel(R) Xeon(R) CPU L5640 @ 2.27GHz (2/12), 48GB RAM, 3x2TB HDD | not in production yet | ? |

MySQL on demand databases

| *Name* | *Database(s)* |
| dbod-lbprod | JobLoggingDB |

Storage




Collaboration Services

Topic revision: r82 - 2018-04-03 - JoelClosier
 