Could you please provide the processed datasets for download? It is quite difficult to download so much of the original raw data, especially since the Oxford RobotCar Dataset website has been down recently.
Hello, thank you for your reply! I ran into a problem while trying to run your code:
Specifically, in the CL-SLAM/slam/slam.py file there is a line that returns the dataset size: return len(self.online_dataset). Every time execution reaches this line, it raises ValueError: len() should return >= 0.
I used KITTI sequence 09 to run main_adapt.py. Is there a problem in the dataset preparation stage? From what I can see, the online_dataset instance does not define a custom __len__() method. What is this call intended to return?
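For reference, len() only works on an object that defines __len__() returning a non-negative integer. A minimal sketch of what I would expect the dataset class to look like (the class and attribute names here are illustrative, not taken from CL-SLAM; the real class would subclass torch.utils.data.Dataset):

```python
class OnlineDataset:
    """Sketch of a dataset that supports len(); names are hypothetical."""

    def __init__(self, samples):
        # e.g. a list of paths to KITTI frames
        self.samples = list(samples)

    def __len__(self):
        # Must return an int >= 0, otherwise Python raises
        # "ValueError: __len__() should return >= 0".
        return len(self.samples)

    def __getitem__(self, idx):
        return self.samples[idx]


ds = OnlineDataset(["frame_000.png", "frame_001.png"])
print(len(ds))  # 2
```

If online_dataset is missing such a method (or it returns a negative value, e.g. from an uninitialized counter), that would explain the error I am seeing.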
The code in make_cityscapes_buffer.py uses mobilenet_v3_small as the feature encoder to construct the replay buffer, but the SLAM process uses the depth network's image encoder to extract features. This looks inconsistent. Should I replace mobilenet_v3_small with the depth network's image encoder to reproduce the results in the paper?
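The reason I ask: replay-buffer retrieval typically compares images by similarity in some encoder's feature space, so the encoder used to build the buffer and the one used at query time should be the same. A toy illustration with cosine similarity (pure Python; the vectors are made up and this makes no claim about CL-SLAM's actual retrieval logic):

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Features of the SAME image from two different encoders generally differ,
# so a similarity computed across encoders is not meaningful.
feat_mobilenet = [0.9, 0.1, 0.3]  # hypothetical mobilenet_v3_small feature
feat_depth_enc = [0.2, 0.8, 0.5]  # hypothetical depth-encoder feature
same_space = cosine_similarity(feat_mobilenet, feat_mobilenet)   # ~1.0
cross_space = cosine_similarity(feat_mobilenet, feat_depth_enc)  # < 1.0
```

So if the buffer is built with one encoder and queried with another, the similarity scores would be computed between incompatible feature spaces.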
Thank you for your contribution.
Could you please tell me how much RAM is required to run main_adapt.py with a replay buffer?
I currently have 64 GB of RAM, but after creating the replay buffer for Cityscapes, main_adapt.py does not run properly. By monitoring the system, I found that insufficient RAM is the cause of the problem.
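In case it helps with debugging, this is the stdlib-only snippet (Unix) I used to log the peak memory of the process at various points while the buffer loads; psutil would give finer-grained numbers, but this avoids an extra dependency:

```python
import resource
import sys

def peak_rss_mib():
    """Peak resident set size of this process in MiB.
    Linux reports ru_maxrss in KiB, macOS in bytes, so the divisor differs."""
    ru_maxrss = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
    divisor = 1024 if sys.platform.startswith("linux") else 1024 * 1024
    return ru_maxrss / divisor

# Call this e.g. after each buffer chunk is loaded to see where RAM grows.
print(f"peak RSS: {peak_rss_mib():.1f} MiB")
```

With this I could see the process climbing toward the 64 GB limit while the Cityscapes buffer was being loaded.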