PySpark and similar connectors are probably robust enough now that Spark and Scala are no longer joined at the hip the way they were 10 years ago. If you were working in Spark at the peak of its hype cycle back then, you basically had to use Scala to at least some extent - even if it was just a core team exposing native APIs to other languages - since Scala was the only first-class approach that exposed all the APIs you needed. Nowadays other languages and wrappers have probably caught up enough that Scala is no longer the absolute requirement it once was.
This is very true in my experience. I worked in Spark for 3 years and never touched Scala code. I imagine there are many people using Spark who don't even know it's written in Scala, or whose only interaction with Scala is accidentally stumbling on the Scala Spark documentation when they meant to Google for PySpark.