Scheduler initialization throws errors #3

Open
opened 2024-08-27 08:53:02 +00:00 by Rishabh Saxena · 2 comments
Member

When I tried to start the dask-example-pi.py script on a single node, some errors occurred, and I had to do the following steps to fix it:

Had to add the following import to the script: `import os`
Had to start the scheduler in the bash shell: `dask scheduler --port 8786`
Had to start the worker in the bash shell: `dask worker tcp://193.196.155.40:8786`
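For context, the steps above can be sketched as a fixed-up version of the script. This is an assumed structure of dask-example-pi.py (the repository script may differ): the scheduler address `tcp://193.196.155.40:8786` is taken from the report, `DASK_SCHEDULER_ADDRESS` is a hypothetical override variable for illustration, and the Monte Carlo pi estimate is an assumption about what the example computes.

```python
# Sketch (assumptions noted above) of dask-example-pi.py after the fix.
import os      # the missing import reported above
import random


def count_inside(n: int) -> int:
    """Count random points in the unit square that land inside the quarter circle."""
    return sum(random.random() ** 2 + random.random() ** 2 <= 1.0 for _ in range(n))


def estimate_pi_distributed(n_tasks: int = 10, samples_per_task: int = 100_000) -> float:
    """Scatter sampling tasks over the manually started dask cluster."""
    from dask.distributed import Client  # imported here so the sketch runs without dask

    # Address from the report; overridable for testing (hypothetical variable name).
    address = os.environ.get("DASK_SCHEDULER_ADDRESS", "tcp://193.196.155.40:8786")
    with Client(address) as client:
        futures = client.map(count_inside, [samples_per_task] * n_tasks)
        inside = sum(client.gather(futures))
    return 4.0 * inside / (n_tasks * samples_per_task)


# Local sanity check of the sampling logic (no cluster needed):
local_estimate = 4.0 * count_inside(100_000) / 100_000
print(f"local pi estimate: {local_estimate:.3f}")
```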

Message from Felix

Rishabh Saxena self-assigned this 2024-08-27 08:53:12 +00:00

`193.196.155.40` is the login node. What is he running on the login node and why?


How to prevent running on a login node:

```bash
if [ -z "${PBS_NODEFILE+x}" ]; then echo "ERROR: This script is meant to run as part of a PBS job. Don't start it at login nodes."; exit 1; fi
```

Source: /opt/hlrs/non-spack/2022-02/bigdata/spark_cluster/spark-3.3.1-bin-hadoop3/bin/init-spark (on Vulcan)
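The same guard can also live inside the Python script itself, so dask-example-pi.py refuses to start on a login node. This is a sketch, not part of the template: it only assumes that PBS sets `PBS_NODEFILE` inside a job, which is standard PBS behavior; the function name is hypothetical.

```python
# Sketch: Python equivalent of the bash guard above.
# PBS sets PBS_NODEFILE only inside a running job, so its absence
# means we are (most likely) on a login node.
import os
import sys


def require_pbs_job() -> None:
    """Abort unless running inside a PBS job (hypothetical helper name)."""
    if "PBS_NODEFILE" not in os.environ:
        sys.exit("ERROR: This script is meant to run as part of a PBS job. "
                 "Don't start it at login nodes.")
```

Calling `require_pbs_job()` at the top of the script would fail fast with the same message as the bash check.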

Reference: SiVeGCS/dask_template#3