SciNet User Support Library

From SciNetWiki

System Status

All systems up: GPC, TCS, Sandy, Gravity, BGQ, File System, P7, P8, KNL, Viz, HPSS

Mon Mar 20 20:50:00 EDT 2017 File system has recovered.

Mon Mar 20 14:56:05 EDT 2017 Problems with IB fabric or the scratch3 & project3 file systems. We are investigating.

Tue Mar 15 18:00:00 EST 2017 Systems are back online and fully operational.

Tue Mar 15 16:31:39 EST 2017 Power glitch at the data center. Compute nodes went down; we are bringing them back up.

Sun Mar 5 14:34:11 EST 2017 Globus access to HPSS has been re-enabled.

Thu Mar 2 9:29:14 EST 2017 GPC jobs are running again.

Thu Mar 2 01:54:57 EST 2017 The scratch file system went down earlier and most GPC jobs were killed. New GPC jobs are on hold until the disk check finishes in the morning.

Tue Feb 28 2017 16:00:00 EST The transfer of users' files from the old scratch system to the new scratch system has been completed. The new scratch folders are logically in the same place as before, i.e. /scratch/G/GROUP/USER. Your $SCRATCH environment variable will point to this location when you log in. The project folders have also been moved in the same way. Compute jobs have been released and are starting to run. Let us know if you have any concerns. Thank you for your patience.
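A quick way to confirm your environment picked up the migrated scratch location is to inspect `$SCRATCH` after logging in. This is a minimal sketch; the group and user components shown in the fallback are hypothetical placeholders, not real SciNet paths.

```shell
# Sketch: check where $SCRATCH points after the migration.
# The fallback path below uses made-up group/user names for illustration only;
# on SciNet, $SCRATCH resolves to /scratch/G/GROUP/USER for your own account.
SCRATCH="${SCRATCH:-/scratch/g/examplegroup/exampleuser}"
echo "Scratch area: $SCRATCH"
```

If the printed path does not match your expected group and user location, log out and back in so the updated environment is sourced before starting new jobs.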

Tue Feb 28 2017 10:02:45 EST It could take a few more hours for the scratch migration to finish. We still have a dozen or so users to go. Please check this page from time to time for updates.

Mon Feb 27 2017 10:00:00 EST The old scratch was 99% full. Given the current incident, in which scratch became unmounted everywhere, we had little choice but to initiate the transition to the new scratch file system now, rather than the gradual roll-out we had planned earlier.

We estimate the transition to the new scratch will take roughly one day, but since we want all users' data on the old scratch system to be available in the new scratch (at the same logical location), the exact duration of the transition depends on the amount of new data to be transferred over.

In the meantime, no jobs will start running on the GPC, Sandy, Gravity or P7.

In addition, $SCRATCH will not be accessible to users during the transition, but you can login to the login and devel nodes. $HOME is not affected.

The current scratch system issue and the scratch transition no longer affect the BGQ or TCS (although running jobs on the TCS may have stopped this morning), because the BGQ and TCS have their own separate scratch file systems. They also do not affect groups whose scratch space is on /scratch2.

Mon Feb 27 2017 7:20:00 EST The scratch file system is down. We are investigating.

Wed Feb 22 2017 16:17:00 EST Globus access to HPSS is currently not operational. We hope to have a resolution for this soon.

System News

  • Mar 3: GPC: Version 7.0 of Allinea Forge (DDT Debugger, MAP, Performance Reports) installed as a module.
  • Jan 26: New larger (1.8PB) $SCRATCH storage brought online.
  • Oct 24: P8: 2 new Power 8 Development Nodes, P8, with 4x Nvidia P100 (Pascal) GPUs, available for users.
  • Sept 19: KNL: Intel Knights Landing Development Nodes, KNL, available for users.
  • Sept 13: GPC: Version 6.1 of Allinea Forge (DDT Debugger, MAP, Performance Reports) installed as a module.
  • Sept 13: GPC: Version 17.0.0 of the Intel Compiler and Tools are installed as modules.
  • Aug 20: P8: Power 8 Development Nodes, P8, with 2x Nvidia K80 GPUs, available for users.

(Previous System News)

QuickStart Guides

Tutorials and Manuals

What's New On The Wiki

  • Dec 2014: Updated GPC Quickstart with info on email notifications from the scheduler.
  • Dec 2014: HDF5 compilation page updated.
  • Sept 2014: Improved information on the Python versions installed on the GPC, and which modules are included in each version.
  • Sept 2014: Description on using job arrays on the GPC on the Scheduler page.
  • Sept 2014: Instructions on using Hadoop (for the Hadoop workshop held in September).

Previous new stuff can be found in the What's new archive.
