
Commit b237191

Merge pull request #36 from Eventdisplay/readme
readme improvements
2 parents: fba905c + 81ce5b7

1 file changed: +19 -9 lines


README.md

Lines changed: 19 additions & 9 deletions
@@ -94,19 +94,28 @@ e.g.,
 
 ## Running the analysis
 
-Central execution script is [CTA.runAnalysis.sh](CTA.runAnalysis.sh). In the best case, no changes are required to this script.
+Central execution scripts are [CTA.mainRunScriptsReduced.sh](CTA.mainRunScriptsReduced.sh) and [CTA.runAnalysis.sh](CTA.runAnalysis.sh).
+In the best case, no changes are required to these scripts.
 
 e.g., to run the first step of the analysis with evndisp, do
 ```
-./CTA.runAnalysis.sh prod3b-S20deg-SCT EVNDISP 0 2 2 2 2
+./CTA.mainRunScriptsReduced.sh prod6-Paranal-20deg-dark-sq10-LL EVNDISP
 ```
-(or set any other data set, as outlined in ./CTA.runAnalysis.sh)
+(or set any other data set, as outlined in ./CTA.mainRunScriptsReduced.sh)
 
-The script does the following:
+To submit script, check the log file directory printed to the screen (the directory with the UUID) and then run:
+```
+./utilities/submit_scripts_to_htcondor.sh <log file directory> submit
+```
+Try this first without the submit argument and check the `submit.txt` file.
+This assumes the HTCondor job submission system. Gridengine will work after changing the variable `SUBC` from `condor` to `qsub` in the scripts `analysis/*sub*`.
+
+The script `./CTA.mainRunScriptsReduced.sh` does the following:
 
 - read a list of arrays from a subdirectory specificed for your data set in ./CTA.runAnalysis.sh (e.g., prod3b/subArray.prod3b.South-SCT.list)
 - execute scripts to submit jobs from the ./analysis directory
 - all output products are written to *${CTA_USER_DATA_DIR}/analysis/AnalysisData/${DSET}*
+- for all telescope multiplicity dependent analysis, this is done for the multiplicities defined in `NIM-South.txt` and `NIM-South-sub.txt`.
 
 On the list of arrays:
 - arrays are defined by the telescope numbering as defined during the simulations.
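
Note (illustrative, not part of the README): the added lines above describe generating job scripts, doing a dry run, and then submitting to HTCondor. Pulled together, the sequence would look roughly as follows; `<log file directory>` stands for the UUID-named directory printed by the first command.

```
# Generate the job scripts for the EVNDISP step of the prod6 data set.
./CTA.mainRunScriptsReduced.sh prod6-Paranal-20deg-dark-sq10-LL EVNDISP

# Dry run (no "submit" argument): check the generated submit.txt
# in the UUID-named log directory printed by the previous command.
./utilities/submit_scripts_to_htcondor.sh <log file directory>

# Submit the jobs to HTCondor once submit.txt looks correct.
./utilities/submit_scripts_to_htcondor.sh <log file directory> submit
```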
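
On the Gridengine remark in the same paragraph: the README only states that `SUBC` must be changed from `condor` to `qsub` in `analysis/*sub*`. A hypothetical bulk edit, assuming the variable is set as a plain `SUBC="condor"` assignment (check the scripts first and adjust the pattern to the actual quoting):

```
# Hypothetical: switch the submission backend from HTCondor to Gridengine.
sed -i 's/SUBC="condor"/SUBC="qsub"/' analysis/*sub*
```
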
@@ -117,11 +126,12 @@ For a complete analysis, one needs to cycle through all reconstruction steps in
 1. EVNDISP - calibration and image analysis
 2. MAKETABLES and DISPBDT - lookup table filling and disp BDT training (can be done in parallel)
 3. ANATABLES - stereo analysis using lookup tables and disp BDTs
-4. TRAIN - train BDTs for gamma/hadron separation
-5. ANGRES - determine angular resolution for 40% signal efficiency
-6. QC - determine data rates after quality cuts (used for cut optimisation)
-7. CUTS - optimise gamma/hadron cuts and calculate instrument response functions
-8. PHYS - fill instrument response functions
+4. PREPARETMVA - write data products needed for BDT training
+5. TRAIN - train BDTs for gamma/hadron separation
+6. ANGRES - determine angular resolution for 40% signal efficiency
+7. QC - determine data rates after quality cuts (used for cut optimisation)
+8. CUTS - optimise gamma/hadron cuts and calculate instrument response functions
+9. PHYS - fill instrument response functions
 
 ## Testing
 
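Note (illustrative, not part of the README): a complete analysis cycles through the steps above in the given order. A minimal sketch of that cycle, assuming each step name is passed as the second argument exactly as in the EVNDISP example and that MAKETABLES and DISPBDT are accepted as separate step names; the jobs of one step must finish before the next step is started.

```
# Data set name as used in the EVNDISP example above.
DSET="prod6-Paranal-20deg-dark-sq10-LL"

# Reconstruction steps in the order given in the README.
for STEP in EVNDISP MAKETABLES DISPBDT ANATABLES PREPARETMVA TRAIN ANGRES QC CUTS PHYS; do
    ./CTA.mainRunScriptsReduced.sh "$DSET" "$STEP"
    # Submit the generated scripts (see the HTCondor note above) and wait for
    # all jobs of this step to finish before launching the next step.
done
```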