Multisensory Perception - 1st Annual SEAS Hackathon @ the SEAS Sustainable Future Hub
Teams can make use of the provided datasets, scripts, and other resources to explore topics or innovative methods for extracting information from visual and auditory data.
Teams can consider some potential focuses:
- Any topics that interest you the most... Enjoy the Hackathon!
- The usage of image depth for environmental science
- The distribution of different sound events
- The viewshed and urban perceptions
- Some challenges...
- Photo-based viewshed analysis (image2LiDAR)
- Training soundscape prediction based on pairwise comparison labels
- Predicting soundscape using sounds and photos
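For the pairwise-comparison challenge, one common starting point is a Bradley-Terry model: each location gets a latent soundscape score, and the probability that location i is preferred over location j is a sigmoid of the score difference. The sketch below (an illustrative baseline, not a prescribed method; all names and the toy data are hypothetical) fits such scores by gradient ascent with NumPy:

```python
import numpy as np

def fit_bradley_terry(pairs, n_items, lr=0.1, epochs=500):
    """Fit latent scores s so that P(i beats j) = sigmoid(s_i - s_j).

    pairs: list of (winner, loser) index tuples from pairwise labels.
    Returns one score per item; higher means preferred more often.
    """
    s = np.zeros(n_items)
    for _ in range(epochs):
        grad = np.zeros(n_items)
        for w, l in pairs:
            p = 1.0 / (1.0 + np.exp(-(s[w] - s[l])))  # predicted P(w beats l)
            grad[w] += 1.0 - p  # push winner's score up
            grad[l] -= 1.0 - p  # push loser's score down
        s += lr * grad
        s -= s.mean()  # scores are identifiable only up to a constant
    return s

# toy example: item 0 is consistently preferred over 1, and 1 over 2
pairs = [(0, 1), (0, 1), (1, 2), (0, 2)]
scores = fit_bradley_terry(pairs, n_items=3)
```

The fitted scores can then serve as regression targets for a model that takes photo or spectrogram features as input.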
Provided datasets include photos and sounds/spectrograms collected from 200 locations across the city core of Ann Arbor, MI.
- Spatial points of 200 locations
- 360-degree photos (from 200 locations)
- Spectrograms extracted from outdoor sound recordings (from 200 locations)
- Pretrained model for semantic segmentation based on the CamVid dataset
- ESC-50: Dataset for Environmental Sound Classification
- Other data sources:
  - google-streetview: a Python package for scraping Google Street View using the API
  - fastai: a Python package for training neural networks
  - viewscape: an R package for computing land information within a viewshed
- Panoramic photo splitter
- Semantic segmentation
- Visible greenness extraction
- Sound semantic classification
- Image depth estimation
- Pseudo-LiDAR from Visual Depth Estimation: Bridging the Gap in 3D Object Detection for Autonomous Driving
- Audio event recognition
- Parallel loops in Python & R
- NOAA Data Access Viewer
- Salesses, P., Schechtner, K., & Hidalgo, C. A. (2013). The collaborative image of the city: mapping the inequality of urban perception. PloS one, 8(7), e68400.
- Tabrizian, P., Baran, P. K., Van Berkel, D., Mitasova, H., & Meentemeyer, R. (2020). Modeling restorative potential of urban environments by coupling viewscape analysis of lidar data with experiments in immersive virtual environments. Landscape and Urban Planning, 195, 103704.
- Ono, Y., Hara, S., & Abe, M. (2022). Prediction method of Soundscape Impressions using Environmental Sounds and Aerial Photographs. arXiv preprint arXiv:2209.04077.
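As an example of the "visible greenness extraction" step above, once a semantic segmentation model has labeled each pixel, greenness reduces to the fraction of pixels assigned to vegetation classes. The sketch below assumes an integer label mask; the class ID used for vegetation (`4` here) is hypothetical and should be replaced with the IDs from your model's actual label map:

```python
import numpy as np

# Hypothetical vegetation class ID(s); replace with the label IDs
# your segmentation model (e.g., one trained on CamVid) actually uses.
VEGETATION_IDS = {4}

def visible_greenness(label_mask, vegetation_ids=VEGETATION_IDS):
    """Fraction of pixels classified as vegetation in a semantic label mask."""
    mask = np.asarray(label_mask)
    green = np.isin(mask, list(vegetation_ids))
    return float(green.mean())

# toy 4x4 mask: 4 of 16 pixels belong to the vegetation class
mask = np.array([[4, 0, 0, 0],
                 [0, 4, 0, 0],
                 [0, 0, 4, 0],
                 [0, 0, 0, 4]])
print(visible_greenness(mask))  # → 0.25
```

Averaging this metric over the views split from one 360-degree photo gives a per-location greenness estimate.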
- Get a Google account and set up a folder in your Google Drive. In this case, let's assume you create a folder called `multisensory_data`.
- Create a new Google Colab file (Jupyter notebook) in the folder. In Colab, you should connect to your Google Drive and set up the working directory.
Here is what you run in your first code cell:

```python
from google.colab import drive
drive.mount('/content/drive')

# set up the working directory to a folder in Google Drive
import os
root_dir = "/content/drive/My Drive/"

# set folder name (this folder should exist in 'root_dir')
project_folder = "multisensory_data/"

def set_working_directory(project_folder):
    # change the working directory to your project folder
    os.chdir(root_dir + project_folder)
    print('\nYour working directory was changed to ' + root_dir + project_folder +
          "\n\nYou can also run !pwd to confirm the current working directory.")

set_working_directory(project_folder)
```
- Then, you can clone this repo in your working directory with the code below in your second code cell:

```shell
!git clone https://github.com/billbillbilly/Database.git
```
- Finally, you can download the datasets using `datasets.ipynb` in the `scripts` folder.
Well done! Now you are able to start!
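As a quick sanity check after downloading, you can summarize what landed on disk. The snippet below counts files per extension; the folder name `Database` comes from the git clone step above, while the exact subfolder layout depends on what `datasets.ipynb` downloads:

```python
import os

def summarize_folder(path):
    """Count files per extension under 'path' (e.g., the cloned repo folder)."""
    counts = {}
    for root, _, files in os.walk(path):
        for name in files:
            ext = os.path.splitext(name)[1].lower() or "(no extension)"
            counts[ext] = counts.get(ext, 0) + 1
    return counts

# inspect the cloned repo (folder name from the git clone step)
print(summarize_folder("Database"))
```

If the counts of photos and spectrograms match the 200 locations described above, you are ready to go.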