Split files for train and test in Google Colab

I have successfully trained and tested my model in a local Jupyter notebook, but now I want to run the same code in Google Colab so I can try other, more expensive CNN models. Can somebody please help me figure out what is wrong here? I have uploaded my files to the Google Colab environment from my Google Drive. I want to split the files from 100 folders into train and test sets, but every time I get the error "No such file or directory".

import os
import numpy as np
from sklearn.model_selection import train_test_split

folder = 'sample_data/firmasSINTESISmanuscritas'
number_of_users = 100
count_of_users = 0
for dir in os.listdir(folder):
    print(dir)
    filenames = [
        #os.path.join(os.path.dirname(os.path.abspath(__file__)), folder+'\\'+dir, i) for i in os.listdir(folder+'\\'+dir)
        os.path.join(folder+'\\'+dir, i) for i in os.listdir(folder+'\\'+dir)
    ]
    filenames = filenames[:-1]

    labels = [filename.__contains__('c-') for filename in filenames]
    labels = np.array(labels, dtype=bool).astype(int).tolist()

    x_train, x_test, y_train, y_test = train_test_split(filenames, labels, test_size=0.3, random_state=42)

    filenames_train = filenames_train + x_train
    filenames_test = filenames_test + x_test
    Y_train = Y_train + y_train
    Y_test = Y_test + y_test

    count_of_users += 1
    if number_of_users <= count_of_users:
        break
print('end')

(Screenshot of the error traceback: "No such file or directory")


Solution 1:

Have you already mounted your Google Drive in Colab so that those files are accessible? If not, please do so by following the commands mentioned in this blog post.
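For reference, a minimal sketch of what mounting looks like in a Colab cell; the exact path under MyDrive is an assumption and should point at wherever your dataset actually lives in your Drive:

import os
from google.colab import drive

drive.mount('/content/drive')  # prompts for Google authorization on first run

# Hypothetical location of the dataset after mounting -- adjust to your own Drive layout
folder = '/content/drive/MyDrive/firmasSINTESISmanuscritas'
print(os.listdir(folder))  # quick sanity check that the path resolves before running the split loop

Once that os.listdir call prints your 100 user folders without raising an error, the rest of the splitting code can run against that folder variable.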

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow

Solution 1: Bhargavi