[Jupyter+Spark] Worker and executor memory update
Date:
Tuesday, August 26, 2025 - 10:45am
System(s):
The Jupyter+Spark app now configures worker memory based on the job's memory limit and the number of workers.
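As a rough illustration, a minimal sketch of how per-worker memory could be derived from a job's total memory limit and the requested worker count is shown below. The values, the 10% headroom factor, and the exact formula are assumptions for illustration only; the app's actual calculation may differ.

```python
# Sketch only: derive per-worker memory from a job's total memory limit
# and the number of workers, then pass it to Spark via spark.executor.memory.
from pyspark.sql import SparkSession

job_memory_limit_gb = 64   # hypothetical total memory granted to the job
num_workers = 4            # hypothetical number of Spark workers requested

# Split the job's memory evenly across workers, reserving ~10% headroom
# (assumed here) for the driver and per-executor overhead.
worker_memory_gb = int(job_memory_limit_gb / num_workers * 0.9)

spark = (
    SparkSession.builder
    .appName("memory-config-sketch")
    .config("spark.executor.memory", f"{worker_memory_gb}g")
    .getOrCreate()
)

print(spark.sparkContext.getConf().get("spark.executor.memory"))
```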
The Jupyter+Spark app now allows you to choose the Python version used when setting up the Spark cluster. Selecting a version that matches the one in your Conda environment ensures compatibility and helps prevent issues caused by Python version mismatches.
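For context, a minimal sketch of keeping the cluster's Python consistent with a Conda environment is shown below. The environment name and path are hypothetical; the app handles this selection for you, this only illustrates the standard PySpark environment variables involved.

```python
# Sketch only: point both the driver and the executors at the same Python
# interpreter so the Spark cluster's Python version matches the notebook's
# Conda environment.
import os
from pyspark.sql import SparkSession

# Hypothetical path to a Conda environment's interpreter.
conda_python = os.path.expanduser("~/.conda/envs/my-env/bin/python")

os.environ["PYSPARK_PYTHON"] = conda_python         # Python used by executors
os.environ["PYSPARK_DRIVER_PYTHON"] = conda_python  # Python used by the driver

spark = SparkSession.builder.appName("python-version-sketch").getOrCreate()
```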