We present our Smart Waste Locator system, an automated waste-detection system based on computer vision algorithms.
We have created a custom model with a modified MobileNetV3 backbone that produces segmentation maps of the detected waste. This model runs faster than most lightweight image-processing models such as SSD MobileNetV2. In addition, rather than giving just a bounding box, our model outputs a full segmentation map that traces the waste.
- Our Segmentation map model:
- Optimized SSD MobileNetv3:
We can see that the frame rate of our model is around 3 times higher than that of the conventional SSD model. You can check a full output video at "RA26_TheSixthSense_WIMDR/blob/master/much_faster_model/segout_bb.mp4".
- TensorFlow 2.3
- TFLite
- OpenCV
In the repo directory "RA26_TheSixthSense_WIMDR/much_faster_model/", run the "proc_video.py" file with all the above libraries installed. The video source there can be replaced by any file, or by the camera output by initializing the "cam" variable to cv2.VideoCapture(0).
This model works on the server side. Using SSD MobileNetV2, we can identify the different elements of a garbage dump. This helps managers and collectors identify the type of dump, enabling them to facilitate waste segregation during collection.
- TensorFlow 2.3
- OpenCV
In the repo directory "RA26_TheSixthSense_WIMDR/segregation_model/", open the "proc_video.py" file and set the variable IMDIR to an image source. Running the code will show the output image with the type of waste tagged. You can also check some sample outputs in the directory "RA26_TheSixthSense_WIMDR/segregation_model/detect_out/".
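A hedged sketch of how the IMDIR setting could drive the script (the folder walk and detector call here are assumptions; the real "proc_video.py" runs the SSD MobileNetV2 detector on each image):

```python
import glob
import os

# Point this at your own image folder, as described above.
IMDIR = "sample_images"

# Collect the images to process (assuming JPEG inputs).
image_paths = sorted(glob.glob(os.path.join(IMDIR, "*.jpg")))

for path in image_paths:
    # image = cv2.imread(path)          # load the image
    # detections = run_detector(image)  # hypothetical detector call
    print(f"would tag waste types in {path}")
```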
This site and app allow the manager to see the detected waste and assign tasks to ragpickers. These assignments are uploaded to our Firebase DB.
Manager's app : "RA26_TheSixthSense_WIMDR/SmartWasteDetector/"
Manager's website : "RA26_TheSixthSense_WIMDR/SmartWasteWebsite"
- Android and Android Studio
- JavaScript
- CSS
- Google Maps API
- HTML
- Firebase
This app shows ragpickers their assigned tasks and helps them navigate to them. It also lets them mark a task as complete. To do so, the ragpicker takes a picture of the area with the application, and computer vision ensures that the picture contains no garbage. If the picture is clean, the ragpicker can upload the image.
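The "picture is clean" check above can be sketched as a confidence-threshold test (the threshold value and function names are assumptions; the app itself runs a TFLite waste detector on-device):

```python
# Assumed minimum detection confidence for a result to count as garbage.
CLEAN_THRESHOLD = 0.5

def is_area_clean(detection_scores):
    """Return True when no waste detection is confident enough,
    i.e. the photo may be uploaded as proof of completion."""
    return all(score < CLEAN_THRESHOLD for score in detection_scores)

# Example: weak detections mean the area is clean; a strong one blocks upload.
print(is_area_clean([0.1, 0.3]))  # → True: safe to upload
print(is_area_clean([0.8, 0.2]))  # → False: garbage still detected
```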
Ragpicker's app : "RA26_TheSixthSense_WIMDR/Ragpicker-App/"
- Flutter
- TensorFlow Lite
- Google Maps API
- Firebase
This is a basic render of the device, made in Rhinoceros 6. It represents the basic design of the device, which will be mounted on vehicles to detect garbage.
File Location : "RA26_TheSixthSense_WIMDR/Device Render.3dm"
This file requires Rhinoceros 6 to view.
This is a simple webpage that tells the administrator which devices are active. Each device sends a ping to a central server every minute; if no ping is received for 2 minutes, the device is declared offline.
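The one-minute ping / two-minute timeout logic can be sketched as follows (all names and the timestamp-based bookkeeping are assumptions about the server side):

```python
import time

PING_INTERVAL = 60   # seconds between device pings
OFFLINE_AFTER = 120  # seconds of silence before a device is declared offline

last_ping = {}  # device_id -> timestamp of the most recent ping

def record_ping(device_id, now=None):
    """Called by the server whenever a ping arrives from a device."""
    last_ping[device_id] = now if now is not None else time.time()

def device_status(device_id, now=None):
    """Return 'online' or 'offline' based on the last ping time."""
    now = now if now is not None else time.time()
    seen = last_ping.get(device_id)
    if seen is None or now - seen > OFFLINE_AFTER:
        return "offline"
    return "online"

# Example: a device that pinged 90 s ago is still online; after 150 s of
# silence it is declared offline.
record_ping("device-1", now=1000.0)
print(device_status("device-1", now=1090.0))  # → online
print(device_status("device-1", now=1150.0))  # → offline
```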
This is the application we used to create our custom dataset.
- Firebase connectivity
- GPS location fetcher
- Waste Index calculator
- UDP Streaming between source and server for drones
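The UDP streaming between source and server listed above can be sketched with plain sockets (addresses and the placeholder payload are assumptions; the real system streams encoded video frames from the drone as datagrams):

```python
import socket

# --- server side: bind to a port and wait for datagrams ---
server = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
server.bind(("127.0.0.1", 0))       # port 0 lets the OS pick a free port
server.settimeout(2.0)              # don't block forever if nothing arrives
server_addr = server.getsockname()  # the drone would be configured with this

# --- drone/source side: send one encoded frame as a datagram ---
source = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
source.sendto(b"frame-bytes", server_addr)

# --- server side: receive the frame ---
payload, sender = server.recvfrom(65535)  # 65535 = max UDP payload size
print(payload)

source.close()
server.close()
```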