I've rolled out Scala-based Spark interfaces to non-programmers in Databricks notebooks, so it's definitely possible, but only if you stick to the basic language features.
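By "basic language features" I mean the plain DataFrame API with vals and named functions, and nothing fancy like implicits, type classes, or custom DSLs. A rough sketch of the style I'm describing (the table columns and paths here are made up for illustration):

    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.functions.col

    object DailyRevenue {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder().appName("daily-revenue").getOrCreate()

        // Plain DataFrame transformations -- reads almost like PySpark
        val orders = spark.read.parquet("s3://my-bucket/orders") // hypothetical path
        val revenue = orders
          .filter(col("status") === "complete")
          .groupBy(col("order_date"))
          .sum("amount")

        revenue.write.parquet("s3://my-bucket/daily_revenue") // hypothetical path
      }
    }

Written like this, the Scala is barely harder to read than the equivalent PySpark.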
I think Scala Spark (using maybe 10% of the language features) is the better technical decision, because it provides big benefits like fat JARs, shading, and better text editor support (see the sketch below), but the worse overall choice for most organizations, because people are generally terrified of Scala.
They'd rather do nothing than write Scala code. I can empathize with their position.
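To make the fat JAR / shading point concrete: with sbt-assembly you bundle all your dependencies into a single JAR and rename colliding packages so your library versions can't clash with whatever ships on the cluster. A rough sketch of what that looks like (plugin/library versions and the shaded package are just examples):

    // project/plugins.sbt
    addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "2.1.5")

    // build.sbt
    // Spark itself is "provided" so it isn't bundled into the fat JAR
    libraryDependencies += "org.apache.spark" %% "spark-sql" % "3.5.0" % "provided"

    // Rename a dependency that collides with the version on the cluster
    assembly / assemblyShadeRules := Seq(
      ShadeRule.rename("com.google.common.**" -> "shaded.guava.@1").inAll
    )

There's no real equivalent of this in the PySpark world, which is why dependency conflicts on shared clusters hurt so much more there.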
Here's a more detailed PySpark vs Scala comparison in case folks are interested: https://mungingdata.com/apache-spark/python-pyspark-scala-wh...