@@ -88,7 +94,7 @@ We can also create group or shared spaces by request.
### Reduce file counts
-The file system backing our `/staging`space is optimized to handle small numbers of large files. If your job requires many small files, we recommend placing these files in the `/home` directory or compressing multiple files into a single zip file or tarball. See [this table](htc-job-file-transfer#data-storage-locations) for more information on the differences between `/staging` and `/home`.
+The file system backing our `/staging` space is optimized to handle small numbers of large files. If your job requires many small files, we recommend placing these files in the `/home` directory or compressing multiple files into a single zip file or tarball. See [this table](htc-job-file-transfer#data-storage-locations) for more information on the differences between `/staging` and `/home`.
Data placed in our large data `/staging` location should be stored in as few files as possible (ideally, one file per job), and will be used by a job only after being copied from `/staging` into the job working directory. Similarly, large output should first be written to the job's working directory, then compressed into a single file before being copied to `/staging` at the end of the job.
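The copy-in/compress-out pattern described above can be sketched as a job wrapper script. This is a minimal sketch, not a prescribed script: the staging path, archive names, and `./my_program` invocation are placeholders to substitute with your own.

```shell
#!/bin/bash
# Sketch of the /staging copy-in/compress-out pattern.
# STAGING and ./my_program are placeholders -- substitute your own.
STAGING=/staging/u/username

# 1. Copy the single compressed input from /staging into the job's
#    working directory, unpack it, and delete the archive.
cp "$STAGING"/input.tar.gz ./
tar -xzf input.tar.gz
rm input.tar.gz

# 2. Run the program against the unpacked data; it writes ./results/
./my_program --input input/ --output results/

# 3. Compress all output into one archive, copy it back to /staging,
#    and clean up the working directory.
tar -czf results.tar.gz results/
cp results.tar.gz "$STAGING"/
rm -rf input/ results/ results.tar.gz
```

Keeping the copy steps at the very start and very end of the script means the job's scratch space only ever holds unpacked working data, and `/staging` only ever holds single compressed archives.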
@@ -105,7 +111,7 @@ Uploading or downloading data to `/staging` should only be performed via CHTC's
For example, you can use `scp` to transfer files into your `/staging` directory:
```
-$ scp large.file netid@transfer.chtc.wisc.edu:/staging/netid/
+$ scp large.file username@transfer.chtc.wisc.edu:/staging/u/username/
```
{:.term}
@@ -123,7 +129,7 @@ Staged files should be specified in the job submit file using the `osdf:///` or
depending on the size of the files to be transferred. [See this table for more information](htc-job-file-transfer#transfer-input-data-to-jobs-with-transfer_input_files).
```
-transfer_input_files = osdf:///chtc/staging/username/file1, file:///staging/username/file2, file3
+transfer_input_files = osdf:///chtc/staging/u/username/file1, file:///staging/u/username/file2, file3
```
{:.sub}
@@ -136,7 +142,7 @@ Large outputs should be transferred to staging using the same file transfer prot
```
transfer_output_files = file1, file2, file3
-transfer_output_remaps = "file1 = osdf:///chtc/staging/username/file1; file2 = file:///staging/username/file2"
+transfer_output_remaps = "file1 = osdf:///chtc/staging/u/username/file1; file2 = file:///staging/u/username/file2"
```
{:.sub}
@@ -166,14 +172,14 @@ within the user's `/home` directory:
``` {.sub}
### Example submit file for a single job that stages large data
# Files for the below lines MUST all be somewhere within /home/username,
-# and not within /staging/username
+# and not within /staging/u/username
executable = run_myprogram.sh
log = myprogram.log
output = $(Cluster).out
error = $(Cluster).err
-transfer_input_files = osdf:///chtc/staging/username/myprogram, file:///staging/username/largedata.tar.gz
+transfer_input_files = osdf:///chtc/staging/u/username/myprogram, file:///staging/u/username/largedata.tar.gz
# IMPORTANT! Require execute servers that can access /staging
Requirements = (Target.HasCHTCStaging == true)
diff --git a/_uw-research-computing/high-memory-jobs.md b/_uw-research-computing/high-memory-jobs.md
index 0242a2dc..a8cf37f9 100644
--- a/_uw-research-computing/high-memory-jobs.md
+++ b/_uw-research-computing/high-memory-jobs.md
@@ -174,16 +174,13 @@ Altogether, a sample submit file may look something like this:
``` {.sub}
### Example submit file for a single staging-dependent job
-universe = vanilla
-
# Files for the below lines will all be somewhere within /home/username,
-# and not within /staging/username
+# and not within /staging/u/username
log = run_myprogram.log
executable = run_Trinity.sh
output = $(Cluster).out
error = $(Cluster).err
transfer_input_files = trinityrnaseq-2.0.1.tar.gz
-should_transfer_files = YES
# Require execute servers that have large data staging
Requirements = (Target.HasCHTCStaging == true)
@@ -236,7 +233,7 @@ Altogether, a sample script may look something like this (perhaps called
#!/bin/bash
# Copy input data from /staging to the present directory of the job
# and un-tar/un-zip them.
-cp /staging/username/reads.tar.gz ./
+cp /staging/u/username/reads.tar.gz ./
tar -xzvf reads.tar.gz
rm reads.tar.gz
@@ -255,7 +252,7 @@ Trinity --seqType fq --left reads_1.fq \
# Trinity will write output to the working directory by default,
# so when the job finishes, it needs to be moved back to /staging
tar -czvf trinity_out_dir.tar.gz trinity_out_dir
-cp trinity_out_dir.tar.gz trinity_stdout.txt /staging/username/
+cp trinity_out_dir.tar.gz trinity_stdout.txt /staging/u/username/
rm reads_*.fq trinity_out_dir.tar.gz trinity_stdout.txt
### END
diff --git a/_uw-research-computing/htc-docker-to-apptainer.md b/_uw-research-computing/htc-docker-to-apptainer.md
index a8847f26..bbcb31b3 100644
--- a/_uw-research-computing/htc-docker-to-apptainer.md
+++ b/_uw-research-computing/htc-docker-to-apptainer.md
@@ -78,7 +78,7 @@ INFO: Build complete: container.sif
Because container images are generally large, we require users to move these images into their staging directories. While you are still in your interactive job, move the image to your staging directory.
```
-mv container.sif /staging/username/
+mv container.sif /staging/u/username/
```
{:.term}
diff --git a/_uw-research-computing/htc-job-file-transfer.md b/_uw-research-computing/htc-job-file-transfer.md
index 8659664d..6ce81c4a 100644
--- a/_uw-research-computing/htc-job-file-transfer.md
+++ b/_uw-research-computing/htc-job-file-transfer.md
@@ -42,13 +42,6 @@ The HTC system has two primary locations where users can place their files:
The data management mechanisms behind `/home` and `/staging` are different and are optimized to handle different file sizes and numbers of files. It's important to place your files in the correct location to improve the efficiency with which your data is handled and to maintain the stability of the HTC file systems.
-
Need a /staging directory?
-
-
## Transfer input data to jobs with `transfer_input_files`
@@ -57,8 +50,8 @@ To transfer files to jobs, we must specify these files with `transfer_input_file
| Input File Size (Per File)* | File Location | Submit File Syntax to Transfer to Jobs |
| ----------- | ----------- | ----------- |
| 0 - 1 GB | `/home` | `transfer_input_files = input.txt` |
-| 1 - 30 GB | `/staging` | `transfer_input_files = osdf:///chtc/staging/NetID/input.txt` |
-| 30 - 100 GB | `/staging` | `transfer_input_files = file:///staging/NetID/input.txt` |
+| 1 - 30 GB | `/staging` | `transfer_input_files = osdf:///chtc/staging/u/username/input.txt` |
+| 30 - 100 GB | `/staging` | `transfer_input_files = file:///staging/u/username/input.txt` |
| 1 - 100 GB | `/staging/groups`† | `transfer_input_files = file:///staging/groups/group_dir/input.txt` |
| 100 GB+ | | Contact the facilitation team about the best strategy to stage your data |
@@ -73,7 +66,7 @@ Multiple input files and file transfer protocols can be specified and delimited
```
# My job submit file
-transfer_input_files = file1, osdf:///chtc/staging/username/file2, file:///staging/username/file3, dir1, dir2/
+transfer_input_files = file1, osdf:///chtc/staging/u/username/file2, file:///staging/u/username/file3, dir1, dir2/
requirements = (HasCHTCStaging == true)
@@ -120,7 +113,7 @@ transfer_output_files = output_file, output/output_file2, output/output_file3
To transfer files back to `/staging` or a specific directory in `/home`, you will need an additional line in your HTCondor submit file, with each item separated by a semicolon (;):
```
-transfer_output_remaps = "output_file = osdf:///chtc/staging/NetID/output1.txt; output_file2 = /home/netid/outputs/output_file2"
+transfer_output_remaps = "output_file = osdf:///chtc/staging/u/username/output1.txt; output_file2 = /home/username/outputs/output_file2"
```
{:.sub}
@@ -133,7 +126,7 @@ Make sure to only include one set of quotation marks that wraps around the infor
If you want to transfer *all* files to a specific destination, use `output_destination`:
```
-output_destination = osdf:///chtc/staging/netid/
+output_destination = osdf:///chtc/staging/u/username/
```
{:.sub}
@@ -146,7 +139,7 @@ The `osdf:///` file transfer plugin is powered by the [Pelican Platform](https:/
To transfer and unpack files, append a `?pack=auto` at the end of the plugin path of the compressed object to be transferred.
```
-transfer_input_files = osdf:///chtc/staging/netid/filename.tar.gz?pack=auto, input1.txt, input2.txt
+transfer_input_files = osdf:///chtc/staging/u/username/filename.tar.gz?pack=auto, input1.txt, input2.txt
```
This feature is only available for Pelican-based plugins (`osdf://`, `pelican://`) and is not available for `file://` or normal file transfers. This feature is also not recommended for compressed files larger than 30 GB.
diff --git a/_uw-research-computing/htc-overview.md b/_uw-research-computing/htc-overview.md
index c92dc353..d752e99f 100644
--- a/_uw-research-computing/htc-overview.md
+++ b/_uw-research-computing/htc-overview.md
@@ -163,9 +163,9 @@ Each of the disk space values are given in megabytes (MB), which can be converte
### Check `/staging` Quota and Usage
-To see your `/staging` quota and usage, use the `get_quotas
` command. For example,
+To see your `/staging` quota and usage, use the `get_quotas` command. For example,
```
-[NetID@ap2001 ~]$ get_quotas /staging/NetID
+[NetID@ap2001 ~]$ get_quotas
```
{:.term}
@@ -178,8 +178,8 @@ Alternatively, the `ncdu` command can also be used to see how many
files and directories are contained in a given path:
```
-[NetID@ap2001 ~]$ ncdu /home/NetID
-[NetID@ap2001 ~]$ ncdu /staging/NetID
+[NetID@ap2001 ~]$ ncdu /home/username
+[NetID@ap2001 ~]$ ncdu /staging/u/username
```
{:.term}
diff --git a/_uw-research-computing/htc-uwdf-researchdrive.md b/_uw-research-computing/htc-uwdf-researchdrive.md
index 7ef5360e..323c558e 100644
--- a/_uw-research-computing/htc-uwdf-researchdrive.md
+++ b/_uw-research-computing/htc-uwdf-researchdrive.md
@@ -84,7 +84,7 @@ transfer_output_files = outputfile1.txt, outputfile2.txt, outputfile3.txt
You can use `transfer_output_remaps` to place files in different locations:
```
-transfer_output_remaps = "outputfile1.txt = pelican://chtc.wisc.edu/researchdrive//CHTC/outputfile1.txt; outputfile2.txt = osdf:///chtc/staging//outputfile2.txt"
+transfer_output_remaps = "outputfile1.txt = pelican://chtc.wisc.edu/researchdrive//CHTC/outputfile1.txt; outputfile2.txt = osdf:///chtc/staging/u/username/outputfile2.txt"
```
The example above remaps the output files such that only `outputfile1.txt` is placed in ResearchDrive, `outputfile2.txt` is placed in `/staging`, and `outputfile3.txt` is placed in the submit directory on `/home`.