SciNet User Support Library


System Status

All systems up: GPC, TCS, Sandy, File System, Gravity, P7, Viz, BGQ, HPSS


Tue 16 Aug 2016 21:55:20 Cooling restored, filesystem up and OK, bringing up clusters. GPC should be available to users by 11PM (perhaps as early as 10:30PM).

Tue 16 Aug 2016 20:46:37 Water service has been restored to building. Restarting cooling system.

Tue 16 Aug 2016 19:52:32 SciNet-related maintenance and modifications have been completed successfully. Work on the building water valve is expected to be done on time (by 9PM). Once water service is restored, we need to restore cooling, power up the filesystems, and then restart the clusters. It is unlikely that any systems will be available to users before 11PM, and it will take longer to get everything online. Check here for updates.

Tue 16 Aug 2016 07:07:32 Shutdown has started.


Scheduled full-day maintenance shutdown begins:

7AM, Tuesday, 16 Aug

Several projects (adding new 208V circuits for storage, cooling tower maintenance, etc.) are being carried out on the same day, as the landlord needs to shut down the main building water supply (and therefore our cooling system as well) for repairs.

We expect to start bringing systems up at about 10PM, depending on when the water work is done. Check here for further updates during the day.


System News

  • May 3: GPC: Versions 15.0.6 and 16.0.3 of the Intel Compilers are installed as modules (see the module sketch at the end of this section).
  • Feb 12: GPC: Version 6.0 of Allinea Forge (DDT Debugger, MAP, Performance Reports) installed as a module.
  • Jan 11: The 2016 Resource Allocations for compute cycles are now in effect.
  • Nov 23: The quota for home directories has been increased from 10 GB to 50 GB.
  • Nov 23, GPC: Two Visualization Nodes, viz01 and viz02, are being set up. They are 8-core Nehalem nodes with 2 graphics cards each, 64 GB of memory, and about 60 GB of local hard disk. For now, you can log directly into viz01 to try it out. We would value users' feedback, requests for suitable software, help with visualization projects, etc.
  • Nov 16: ARC is being decommissioned. During a transition period, the ARC head node and two compute nodes will be kept up. Users are encouraged to start using Gravity instead.
  • Nov 12, GPC: The number of GPC devel nodes has been doubled from 4 to 8, and the new ones can be accessed using gpc0[5-8].
  • Sept 7, GPC: The number of nodes with 32 GB of RAM has been increased from 84 to 205.
  • July 24, GPC: GCC 5.2.0, with Coarray Fortran support, has been installed as a module.

(Previous System News)
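
Several of the news items above mention software "installed as a module". SciNet clusters use an environment-modules system to make packages such as the Intel compilers and Allinea Forge available in a shell session. The following is a minimal sketch of a typical session on a GPC login or devel node; the exact module names are assumptions based on the items above, so check "module avail" for what is actually installed:

  # List the available versions of a package (here, the Intel compilers).
  module avail intel

  # Load a specific version into the current shell environment.
  # The name "intel/15.0.6" is an assumption; use a name reported by "module avail".
  module load intel/15.0.6

  # Show which modules are currently loaded.
  module list

  # Swap to the other installed version and check the active compiler.
  module switch intel intel/16.0.3
  icc --version

Job scripts should load the same modules the code was compiled with, so that the runtime environment matches the build environment.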

QuickStart Guides

Tutorials and Manuals

What's New On The Wiki

  • Dec 2014: Updated GPC Quickstart with info on email notifications from the scheduler.
  • Dec 2014: Hdf5 compilation page updated.
  • Sept 2014: Improved information on the Python versions installed on the GPC, and which modules are included in each version.
  • Sept 2014: Description of using job arrays on the GPC added to the Scheduler page.
  • Sept 2014: Instructions on using Hadoop (for the Hadoop workshop held in September).

Previous new stuff can be found in the What's new archive.

