Cambridge HEP Group - site report
- 2 DIGITAL UNIX (ALPHA) fileservers
- 450GB disk space, DLT2000 stacker, UPS-protected.
- disk mainly RAID0 in StorageWorks pedestals.
- 9 DEC ALPHA workstations
- 3 DIGITAL UNIX, 6 VMS.
- 39 PCs
- Range from Pentium-100MHz to Pentium III-500MHz
- 16 Linux, 16 Windows NT
- 6 Windows 95, 1 Windows 3.1
- SUN workstation (Solaris - for CAD)
- 4 Macs, 5 X-terminals
- Total 600GB+ disk space
- 2 HP LaserJet printers (20ppm+)
- Tektronix Phaser 550 (colour laser)
- Epson A2 colour inkjet (mainly for CAD work)
- Networking
- Majority of systems on switched ethernet (34 @ 10Mbps, 4 @ 100Mbps); the rest (~35) are currently on shared ethernet.
- Departmental connection to campus is now 100Mbps, but only 10Mbps from the HEP group to the department.
- Sundry items
- DIGITAL UNIX(4.0D)
- AFS, CERN software (including SHIFT software for NA48), SAMBA (for WNT)
- Windows NT (4.0SP4)
- 2 nodes as main and backup servers.
- Most software served from shared disk space on D.U. box using SAMBA.
- Exceed, CERN software, Coraldraw, MS Office 97, MS Visual Studio (C++,
FORTRAN) etc. etc.
- Recently bought DriveImage Pro from PowerQuest to copy system disks (especially to help with disaster recovery).
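Serving the NT software from the Digital UNIX box might look roughly like the following smb.conf fragment. This is only an illustrative sketch: the share name, path, and workgroup are invented for the example, not our actual configuration.

```ini
; Hypothetical smb.conf fragment on the Digital UNIX fileserver,
; exporting a read-only software area to the Windows NT clients.
[global]
   workgroup = HEP             ; illustrative workgroup name
   security = user

[ntsoft]
   comment = NT software area served from D.U. disk
   path = /disk/ntsoft         ; hypothetical path on the fileserver
   read only = yes
   browseable = yes
```

NT clients would then map the share (e.g. `net use S: \\server\ntsoft`) and run the packages from there, so only the server copy needs maintaining.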
- Linux (Red Hat 5.1)
- CERN libraries, Maple, AFS, etc. etc. Authorization and home directories common with the other platforms.
- SUN/PCs for CAD
- variety of CAD packages (CADENCE, Powerview, etc., etc.)
- Network charges
- Currently the University charges at departmental level, based on current usage (with the overall bill of course based on last year's usage). The current level (~£4000 per annum) is not excessive, but there is no real incentive (especially for short-term visitors) to moderate use.
- Multiple platforms
- heterogeneous backups, general management, etc. etc.
- Power supply
- quality of the local supply is poor (though it seems to be improving); power cuts are very disruptive.
- UPS use has reduced the disruption, but it is not economic to protect every piece of equipment. However, the recent tendency has been to consider whether a UPS is needed whenever equipment is purchased.
- Disk space serving/management
- what is the best way to do this?
- Software costs
- especially tools for LHC software development - many cannot be obtained
cheaply via CERN.
- Security
- how to find the best balance between preventing successful attacks and preventing the users from working?
- Video conferencing
- group or (possibly) departmental facility?
- not directly related to VC, but we have owned and heavily used a conference phone for several years, and are planning in the very near future to buy a portable data projector (for use by visiting speakers in the lab, and for our people in schools).
- major departmental network upgrade currently being installed, which by the end of this summer will offer 100Mbps switched ethernet to all desktops and gigabit ethernet capability.
- replace on a cycle of ~4-5 years.
- certainly expect to replace one of the existing servers during the next year or so, and to buy significant (~0.5-1.0TB) extra disk space.
- Analysis farm (or "smallholding"?)
- planning to put our toe in the water with a small (~10 PC) farm.