I already have a Spark standalone cluster running on my machine, with one master and one worker. However, every time I run anything in Scala it spins up its own local Spark context instead of connecting to that cluster.
How can I make it use the existing Spark master and worker that are already running? I can do this with pyspark, but not with spylon-kernel.
It fails with spylon-kernel -
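For reference, this is roughly what I would expect to work based on the spylon-kernel README: the `%%init_spark` cell magic lets you set launcher options before the Scala session starts, including the master URL. The `spark://localhost:7077` address here is an assumption; it should be whatever master URL your standalone master prints (shown at the top of the master's web UI, usually on port 8080).

```python
%%init_spark
# Point the kernel at the existing standalone cluster instead of
# letting it start a local context. Replace localhost:7077 with
# your actual master URL (assumption, not verified in my setup).
launcher.master = "spark://localhost:7077"
launcher.conf.spark.app.name = "spylon-on-standalone"  # hypothetical app name
```

In pyspark the equivalent would be `SparkSession.builder.master("spark://localhost:7077")`, which is the part that works for me; the question is why the spylon-kernel route fails.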
