Trying to install PyTorch in an Anaconda env, but it throws an error related to libcublas


I am on Ubuntu 20.04 so that I can run ROS1, and I am trying to follow the README in this repo: https://github.com/SYSU-STAR/H2-Mapping/tree/main?tab=readme-ov-file. I set up my conda env in bash:

conda env create -f h2mapping.yaml
conda activate h2mapping 

Then I attempt to install PyTorch (I need CUDA for the following steps in the README):

conda install pytorch torchvision torchaudio pytorch-cuda=12.1 -c pytorch -c nvidia --solver=libmamba

It gives me the following error:

Channels:
 - pytorch
 - nvidia
 - defaults
Platform: linux-64
Collecting package metadata (repodata.json): done
Solving environment: failed

LibMambaUnsatisfiableError: Encountered problems while solving:
  - package pytorch-cuda-12.1-ha16c6d3_5 requires libcublas >=12.1.0.26,<12.1.3.1, but none of the providers can be installed

Could not solve for environment specs
The following packages are incompatible
├─ libcublas 11.10.3.66.*  is requested and can be installed;
└─ pytorch-cuda 12.1**  is not installable because it requires
   └─ libcublas >=12.1.0.26,<12.1.3.1 , which conflicts with any installable versions previously reported.

Any thoughts on what to do differently? I have tried uninstalling and reinstalling CUDA, downloading Anaconda without CUDA installed, and reinstalling Anaconda. I have also tried adding pointers to my CUDA install to my bashrc, and I have tried removing any such pointers as well. Every time, I get the same error.
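To see which already-installed packages are pinning the older libcublas, a quick check (just a sketch, assuming the h2mapping env created above) is to list the CUDA-related packages in the env:

conda list -n h2mapping | grep -iE "cublas|cuda"

Anything it prints at an 11.x version was most likely pulled in by the yaml file and is what conflicts with pytorch-cuda=12.1.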


There is 1 best solution below

Tim Beyer

I was able to install PyTorch by running the command

conda install nvidia/label/cuda-12.1.0::NAMEHERE --solver=libmamba

for each of the offending packages.

For example, you could run

conda install nvidia/label/cuda-12.1.0::libcublas --solver=libmamba

The next time you run the original command, the error will name another conflicting package; repeat the per-package install for it. After a few packages, the original installation will run through.
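If you would rather batch the installs, here is a sketch (the package names below are only examples of CUDA 12.1 libraries that pytorch-cuda depends on; substitute whatever packages your own error output actually lists):

conda install nvidia/label/cuda-12.1.0::libcublas nvidia/label/cuda-12.1.0::libcufft nvidia/label/cuda-12.1.0::libcusolver nvidia/label/cuda-12.1.0::libcusparse --solver=libmamba

Then re-run the original conda install pytorch torchvision torchaudio pytorch-cuda=12.1 ... command, which should now be able to resolve.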