No such file or directory when using DataLoader.from_pascal_voc

I'm having trouble using DataLoader.from_pascal_voc from TFLite Model Maker. I've successfully mounted Google Drive in Google Colab, and when I print the length of os.listdir it shows the correct number of files. Here is the code that sets the training and testing paths:

training_dir_images = '/content/drive/MyDrive/Skripsi/Data/Training/images'
training_dir_annotations = '/content/drive/MyDrive/Skripsi/Data/Training/annotations'

testing_dir_images = '/content/drive/MyDrive/Skripsi/Data/Test/images'
testing_dir_annotations = '/content/drive/MyDrive/Skripsi/Data/Test/annotations'
print(len(os.listdir(training_dir_images)))
print(len(os.listdir(training_dir_annotations)))
print(len(os.listdir(testing_dir_images)))
print(len(os.listdir(testing_dir_annotations)))
-----------------OUTPUT------------------
1700
1700
300
300

Here's the code when I tried to load the data using DataLoader:

training_data = object_detector.DataLoader.from_pascal_voc(training_dir_images, training_dir_annotations, label_map=labels)

test_data = object_detector.DataLoader.from_pascal_voc(testing_dir_images, testing_dir_annotations, label_map=labels)
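
(labels is a label map I define earlier in the notebook; purely as a hypothetical illustration of its shape, not my actual classes, it looks something like this:)

# Hypothetical label map only -- my real class names differ.
# from_pascal_voc accepts a dict mapping label index to label name;
# indices start at 1 in the Model Maker examples.
labels = {1: 'class_a', 2: 'class_b'}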

Here's the error:

/usr/local/lib/python3.7/dist-packages/tensorflow/python/lib/io/file_io.py in _preread_check(self)
     78                                            "File isn't open for reading")
     79       self._read_buf = _pywrap_file_io.BufferedInputStream(
---> 80           compat.path_to_str(self.__name), 1024 * 512)
     81 
     82   def _prewrite_check(self):

NotFoundError: /content/drive/MyDrive/Skripsi/Data/Training/images/S_9; No such file or directory
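
The missing file in the error message (S_9) has no extension, so I suspect the <filename> field inside my Pascal VOC XML annotations doesn't include the image extension (e.g. S_9 instead of S_9.jpg). As a quick sanity check, this compares the <filename> entries against the files actually in the images folder (assuming standard Pascal VOC XML; the paths are the ones defined above):

import os
import xml.etree.ElementTree as ET

missing = []
for xml_name in os.listdir(training_dir_annotations):
    if not xml_name.endswith('.xml'):
        continue
    tree = ET.parse(os.path.join(training_dir_annotations, xml_name))
    # <filename> should match an existing image, e.g. 'S_9.jpg', not 'S_9'
    filename = tree.getroot().findtext('filename')
    if filename is None or not os.path.exists(os.path.join(training_dir_images, filename)):
        missing.append((xml_name, filename))

print(len(missing), 'annotations whose <filename> does not exist in the images folder')
print(missing[:10])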

I'm using TensorFlow 2.5.0. Is there any solution to this problem? Thank you.


