How to tell PIP that pyspark has already been installed?


I installed a custom-built version of Spark and set SPARK_HOME and PYTHONPATH to the corresponding folders, as per this guide.

Then I used pip to install a Python package that has pyspark as a dependency, so I ended up with two copies of pyspark installed: my custom-built pyspark and the one from PyPI.
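To confirm the duplicate, a minimal sketch using only the standard library (nothing pyspark-specific is assumed): `importlib.util.find_spec` shows which copy an `import pyspark` would actually resolve, while `importlib.metadata` reports whether pip also registered its own copy.

```python
import importlib.util
from importlib import metadata

# Which pyspark does `import pyspark` resolve? If the PYTHONPATH folder comes
# first on sys.path, this is the custom build.
spec = importlib.util.find_spec("pyspark")
print(spec.origin if spec else "no importable pyspark")

# Is there *also* a pip-installed copy registered in packaging metadata?
try:
    print("pip-installed pyspark:", metadata.version("pyspark"))
except metadata.PackageNotFoundError:
    print("no pip-installed pyspark")
```

If the first line points into SPARK_HOME while the second reports a PyPI version, both copies are present.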

Is there any way to make this not happen?

Thanks!
