Storage

Overview

On all of our clusters, users have access to a global home space, a project space, and, if they belong to a group that has purchased additional storage, a group space. This means that files created from compute nodes on one cluster can be used for calculations on any of our other clusters. All storage systems use GPFS as the filesystem, and each filesystem is backed up nightly.

Filesystem   Location                        Soft Limit   Hard Limit
Home         /home/<username>                100 GB       1 TB
Project      /lcrc/project/<project name>    1+ TB        2+ TB
Group        /lcrc/group/<group name>        no quota     no quota

We also offer groups the ability to purchase their own storage resources to be hosted with us. Purchased storage is accessible across all of our clusters (unless otherwise specified), and we take care of supporting the system, replacing parts, and, where possible, tuning the storage to fit your data model.

If you are interested in learning more about purchasing additional storage resources, please contact us at support@lcrc.anl.gov.

You also have access to local scratch space on a compute node while you have a job running on that node.

Quotas

To prevent individual users from consuming all of the available storage space, quotas are enforced on home and project directories. Home directories have a quota of 100 GB, while project quotas start at 1 TB and can be larger.

These quotas are technically soft limits: if a running job outputs more data than you expected, you can continue writing to your home or project directory up to the hard limit, so the job does not fail immediately, while the hard limit still keeps a runaway job (for example, one stuck in an infinite loop) from filling the filesystem. However, once you exceed your soft limit, you have only a two-week grace period to get back below your quota.

Once you are over your quota and your grace period has expired, you can no longer write files to your home directory, including the cache file used by SoftEnv. This means that your software environment could become corrupted, preventing you from finding executables you have used in the past. If you unexpectedly see “command not found” errors or cryptic error messages upon login, check that you are not over your quota with the following command:

$ /soft/lcrc/bin/lcrc-quota
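
If you are over quota, it can also help to see which directories are using the most space. For example, with standard tools (a minimal sketch assuming GNU coreutils; the path shown is only an example):

$ du -sh ~/* | sort -rh | head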

If you believe your project requires additional storage space, contact us and explain why. At this time we are only accepting requests for project quota increases. If you run out of room in your home directory, you will need to either delete some of the data or move it to your project or group directory.
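
For example, one way to free space in your home directory is to move a large directory into your project space and, if convenient, leave a symbolic link behind (a sketch only; my_results and <project name> are placeholders for your own directory and project):

$ mv ~/my_results /lcrc/project/<project name>/
$ ln -s /lcrc/project/<project name>/my_results ~/my_results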

Home, Project, and Group

Your home, project, and group directories are located on separate GPFS filesystems that are shared by all nodes on the cluster. These filesystems reside on RAID arrays and are served by multiple file servers, which improves performance and protects against the filesystems becoming inaccessible: if one server goes down, the others can continue serving the filesystems.

Pros

  • Global namespace
  • Multi-TB filesystem
  • Large file support (> 2GB)
  • Backed up
  • RAID protection
  • Stable hardware
  • Native InfiniBand support

Cons

  • Moderate performance

Local Scratch Disk

If you need a place to put temporary files that don’t need to be accessed by other nodes, we recommend that you put them on the local scratch disk of the nodes during job runs. Every job gets a job-specific directory on local storage, which can be referenced from your job submission script using the variable $TMPDIR. The normal publicly available Blues nodes offer 15 GB of scratch space, while the ‘biggpu’ queue offers 1 TB.
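
As a rough sketch of how this might look in a submission script (assuming a Slurm-style scheduler; the project name, file names, and program are placeholders, not a prescribed workflow):

#!/bin/bash
#SBATCH --job-name=local-scratch-example
#SBATCH --time=01:00:00

# Stage input into the node-local scratch directory for fast access.
cp /lcrc/project/<project name>/input.dat "$TMPDIR"/

# Run from local scratch.
cd "$TMPDIR"
/lcrc/project/<project name>/my_program input.dat > output.dat

# Copy results back to project space before the job ends;
# $TMPDIR is cleared once the job finishes.
cp output.dat /lcrc/project/<project name>/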

Pros

  • Fast access
  • Large file support (> 2GB)

Cons

  • Unique to each node; not shared between nodes
  • Small capacity (GB-scale on most nodes)
  • Not backed up
  • Cleared out at the end of your job
  • No RAID protection

Backup and Archives

As previously mentioned, all storage systems use GPFS as the filesystem, and each filesystem is backed up nightly. Backups are written to both disk and tape in LCRC. If you need to restore a lost file, please contact support@lcrc.anl.gov. We currently do not offer the ability for users to archive their own files on demand, but we hope to re-implement this in the future.