runtime error

JAVA_HOME is not set
Traceback (most recent call last):
  File "/home/user/app/app.py", line 35, in <module>
    spark = SparkSession.builder.appName("EnergyAnomalyDetection").getOrCreate()
  File "/home/user/.local/lib/python3.10/site-packages/pyspark/sql/session.py", line 497, in getOrCreate
    sc = SparkContext.getOrCreate(sparkConf)
  File "/home/user/.local/lib/python3.10/site-packages/pyspark/context.py", line 515, in getOrCreate
    SparkContext(conf=conf or SparkConf())
  File "/home/user/.local/lib/python3.10/site-packages/pyspark/context.py", line 201, in __init__
    SparkContext._ensure_initialized(self, gateway=gateway, conf=conf)
  File "/home/user/.local/lib/python3.10/site-packages/pyspark/context.py", line 436, in _ensure_initialized
    SparkContext._gateway = gateway or launch_gateway(conf)
  File "/home/user/.local/lib/python3.10/site-packages/pyspark/java_gateway.py", line 107, in launch_gateway
    raise PySparkRuntimeError(
pyspark.errors.exceptions.base.PySparkRuntimeError: [JAVA_GATEWAY_EXITED] Java gateway process exited before sending its port number.
