I installed a custom-built version of Spark and set SPARK_HOME and PYTHONPATH to the corresponding folders, as per this guide.
Then I used pip to install a Python package that has pyspark as a dependency, so I ended up with two versions of pyspark installed: my custom-built pyspark and the one pulled in from PyPI.
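For what it's worth, this is roughly how I'm checking which copy Python actually resolves (the printed paths below are just placeholders for illustration):

```python
import pyspark

# Shows which installation gets imported:
# e.g. .../site-packages/pyspark/__init__.py (PyPI copy)
# vs   $SPARK_HOME/python/pyspark/__init__.py (custom build)
print(pyspark.__file__)

# Shows which version that copy reports
print(pyspark.__version__)
```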
Is there any way to prevent this, i.e. stop pip from installing a second copy of pyspark, or otherwise make sure my custom build is the one that gets used?
Thanks!