forked from SiVeGCS/dask_template
finalized for documentation upload
parent 7c37774ca5
commit f3b8da05d9
2 changed files with 13 additions and 6 deletions
README.md (19 changed lines)
@@ -13,6 +13,7 @@ Structure:
To do:
- [x] Made scripts for environment creation and deployment in the folder `local_scripts`
- [x] Moved the scripts to `deployment_scripts`
- [x] Added a step about sending the Python file
---
@@ -28,19 +29,19 @@ This repository looks at a deployment of a Dask cluster on Vulcan, and executing
Before running the application, make sure you have the following prerequisites installed in a conda environment:
- [Python 3.8.18](https://www.python.org/downloads/release/python-3818/): This specific Python version is used throughout; select it when creating the Conda environment. For more information, see the Conda documentation for [HLRS HPC systems](https://kb.hlrs.de/platforms/index.php/How_to_move_local_conda_environments_to_the_clusters).
- [Conda Installation](https://docs.conda.io/projects/conda/en/latest/user-guide/install/index.html): Ensure that Conda is installed on your local system. For more information, see the Conda documentation for [HLRS HPC systems](https://kb.hlrs.de/platforms/index.php/How_to_move_local_conda_environments_to_the_clusters).
- [Dask](https://dask.org/): Install Dask using conda.
- [Conda Pack](https://conda.github.io/conda-pack/): Conda-pack packages the Conda environment into a single tarball, which is used to transfer the environment to Vulcan (see the example commands after this list).
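Taken together, the prerequisites above can be set up with commands along the following lines. This is a minimal sketch rather than the repository's own workflow: the environment name `dask-env` is a placeholder, the package list should be extended for your application, and the `deployment_scripts/create-env.sh` and `deploy-env.sh` helpers used in the steps below automate most of it.

```bash
# Hypothetical environment.yaml pinning Python 3.8.18 and the packages
# this workflow needs; extend the dependency list for your own code.
cat > environment.yaml <<'EOF'
name: dask-env
channels:
  - conda-forge
dependencies:
  - python=3.8.18
  - dask
EOF

# Create the Conda environment from the file.
conda env create -f environment.yaml

# Install conda-pack (if it is not already available) and package the
# environment into a single tarball for transfer to Vulcan.
conda install -n base -c conda-forge conda-pack
conda pack -n dask-env -o dask-env.tar.gz
```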
## Getting Started
1. Clone [this repository](https://code.hlrs.de/hpcrsaxe/spark_template) to your local machine:
```bash
git clone <repository_url>
```
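For example, using the repository linked in the previous step (the `.git` suffix and the resulting `spark_template` directory name follow the usual Git conventions):

```bash
git clone https://code.hlrs.de/hpcrsaxe/spark_template.git
cd spark_template
```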
2. Go into the directory and create an environment using Conda and environment.yaml. Note: be sure to add the necessary packages to environment.yaml:
```bash
./deployment_scripts/create-env.sh <your-env>
@@ -52,19 +53,25 @@ Before running the application, make sure you have the following prerequisites installed in a conda environment:
./deployment_scripts/deploy-env.sh <your-env> <destination_host>:<destination_directory>
```
4. Send all the code to the appropriate directory on Vulcan using `scp`:
```bash
scp <your_script>.py <destination_host>:<destination_directory>
```
5. SSH into Vulcan and start a job interactively using:
```bash
qsub -I -N DaskJob -l select=4:node_type=clx-21 -l walltime=02:00:00
```
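Once the interactive job starts, it can be useful to confirm which nodes were allocated. Assuming a PBS-style scheduler (which the `qsub` options above suggest), the assigned node list is exposed through the `PBS_NODEFILE` environment variable:

```bash
# List the compute nodes assigned to this interactive job.
cat "$PBS_NODEFILE"
```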
6. Go into the directory with all the code:
```bash
cd <destination_directory>
```
7. Initialize the Dask cluster:
```bash
source deploy-dask.sh "$(pwd)"