I have been trying to load the NWPU VHR-10 dataset in Google Colab from a directory in Google Drive using the 'os.listdir()' function.
However, I encountered a 'FileNotFoundError' when trying to access the directory where the positive image set is located.
I am getting the following error message:
FileNotFoundError: [Errno 2] No such file or directory: '/content/drive/My Drive/deeplearning_dataset/NWPU VHR-10 dataset/positive image set'
I have confirmed that the dataset is stored in my Google Drive and that I have the correct permissions to access it.
I have also tried modifying the directory paths to ensure that they match the directory names in my Google Drive.
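To narrow down which part of the path is the problem, a check along these lines (using the same path string as above) should show whether each parent directory actually exists on the mounted drive:

import os

path = "/content/drive/My Drive/deeplearning_dataset/NWPU VHR-10 dataset/positive image set"
# Walk up the path and report whether each prefix exists
current = path
while current not in ("/", ""):
    print(os.path.exists(current), current)
    current = os.path.dirname(current)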
Here is the code I am using:
import os
import numpy as np
from google.colab import drive

drive.mount('/content/drive')  # mount Google Drive into the Colab filesystem

pos_img_dir = "/content/drive/My Drive/deeplearning_dataset/NWPU VHR-10 dataset/positive image set"
pos_img_files = [f for f in os.listdir(pos_img_dir) if f.endswith('.jpg')]
pos_img_files = np.array(pos_img_files)
I have also tried using "os.path.join()" to join the base path and each filename, but I still get the same error:
pos_img_dir = "/content/drive/My Drive/deeplearning_dataset/NWPU VHR-10 dataset/positive image set"
pos_img_files = [os.path.join(pos_img_dir, f) for f in os.listdir(pos_img_dir) if f.endswith('.jpg')]
pos_img_files = np.array(pos_img_files)
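For completeness, listing what is actually under the mount point should show how the folder names are spelled on the Drive side (for example, whether the top-level folder appears exactly as 'My Drive'); something like:

import os

# Compare the actual directory names under the mount point with the ones in the path above
for d in ["/content/drive", "/content/drive/My Drive", "/content/drive/My Drive/deeplearning_dataset"]:
    if os.path.isdir(d):
        print(d, "->", os.listdir(d))
    else:
        print(d, "-> not found")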