## Getting Started

### 1. Build and transfer the Conda environment to Hawk:

Only the `main` and `r` channels are available using the Conda module on the clusters. To use custom packages, we need to move the local Conda environment to Hawk.

Follow the instructions in [the Conda environment builder repository](https://code.hlrs.de/SiVeGCS/conda-env-builder). The YAML file to create a test environment is available in the `deployment_scripts` directory.

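As a hypothetical sketch of what "build and transfer" can look like, assuming the environment is packed with `conda-pack` (the environment name `dask_env` and the placeholders are assumptions; the linked builder repository describes the exact procedure):

```shell
# Hypothetical sketch: pack a local Conda environment into a relocatable
# archive and copy it to the workspace on Hawk. The environment name
# "dask_env" and the placeholders are assumptions, not taken from this repo.
conda pack -n dask_env -o dask_env.tar.gz
scp dask_env.tar.gz <username>@<hawk_login_node>:<workspace_directory>/
```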
### 2. Allocate workspace on Hawk:

Proceed to the next step if you have already configured your workspace. Use the following command to create a workspace on the high-performance filesystem, which will expire in 10 days. For more information, such as how to enable reminder emails, refer to the [workspace mechanism](https://kb.hlrs.de/platforms/index.php/Workspace_mechanism) guide.

```bash
ws_allocate dask_workspace 10
ws_find dask_workspace  # prints the path to the workspace, which is the destination directory in the next step
```

### 3. Clone the repository on Hawk to use the deployment scripts and project structure:

```bash
cd <workspace_directory>
git clone <repository_url>
```

### 4. Send all the code to the appropriate directory on Vulcan using `scp`:

```bash
scp <your_script>.py <destination_host>:<destination_directory>
```

### 5. SSH into Vulcan and start an interactive job:

```bash
qsub -I -N DaskJob -l select=1:node_type=clx-21 -l walltime=02:00:00
```

Note: For multiple nodes, it is recommended to write a `.pbs` script and submit it using `qsub`.
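A minimal sketch of such a `.pbs` script (the two-node selection and the job name mirror the interactive command above; running `deploy-dask.sh` inside the job, and the script layout in general, are assumptions rather than the repository's confirmed batch setup):

```shell
#!/bin/bash
#PBS -N DaskJob
#PBS -l select=2:node_type=clx-21
#PBS -l walltime=02:00:00

# Run from the directory the job was submitted from, then start the
# Dask deployment (assumes the repository was cloned there; see step 3).
cd "$PBS_O_WORKDIR"
source deploy-dask.sh "$(pwd)"
```

Submit it with `qsub <script_name>.pbs`.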

### 6. Go to the directory containing the code:

```bash
cd <destination_directory>
```

### 7. Initialize the Dask cluster:

```bash
source deploy-dask.sh "$(pwd)"
```
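Once the script finishes, a quick sanity check is to look for the processes it started. This is purely a hypothetical check: the process names below assume `deploy-dask.sh` launches the standard `dask-scheduler` and `dask-worker` executables, which this repository does not confirm.

```shell
# Hypothetical sanity check: list any running Dask cluster processes.
# The process names are an assumption about how deploy-dask.sh works.
pgrep -af "dask-scheduler|dask-worker" || echo "no Dask processes found"
```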