This repo contains a Python project that scrapes individual season data from NHL.com and prepares it for loading into a SQL database.
- Create a virtual environment (if desired)
- Install the dependencies:

```sh
pip install -r requirements.txt
```
Follow the instructions here. You'll need either a local PostgreSQL server to load data into, or a hosted instance whose connection information you can configure later.
I installed Postgres via apt, then created a user and a database:

```sh
sudo apt install postgresql
sudo -u postgres createuser -D -P myusername
sudo -u postgres createdb -O myusername mydatabase
```
The script should just run (within the virtual environment) via

```sh
python scrape.py
```
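For a sense of what the scrape involves, here is a hedged sketch of how one page of skater data might be requested. It assumes the public NHL stats API at `api.nhle.com` with its `cayenneExp` filter syntax; the endpoints and parameters actually used by `scrape.py` may differ.

```python
# Sketch only: assumes the public NHL stats API; the real scrape.py
# may use different endpoints, parameters, or pagination.
def skater_summary_url(season_id: int, start: int = 0, limit: int = 100) -> str:
    """Build a request URL for one page of skater summary stats."""
    return (
        "https://api.nhle.com/stats/rest/en/skater/summary"
        f"?start={start}&limit={limit}"
        f"&cayenneExp=seasonId={season_id}"
    )
```

Paging through results (incrementing `start` by `limit` until the response is empty) is the usual pattern with this API.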
We can then load that data into our database with the following set of commands:

```sh
psql -U myusername -d mydatabase -f create_db.sql
psql -U myusername -d mydatabase -c "\copy players FROM './data/player_bios.csv' delimiter ',' csv header;"
psql -U myusername -d mydatabase -c "\copy player_teams FROM './data/player_teams.csv' delimiter ',' csv header;"
psql -U myusername -d mydatabase -c "\copy skater_stats FROM './data/skater_data.csv' delimiter ',' csv header;"
psql -U myusername -d mydatabase -c "\copy goalie_stats FROM './data/goalie_data.csv' delimiter ',' csv header;"
```
We keep the `\copy` commands outside the SQL script: `\copy` is a `psql` meta-command, and the server-side `COPY` statement a script could use instead requires extra privileges on the database server. It may be possible to work around that, but this seemed easier than setting up authentication.
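If you do want the load to happen from Python rather than the shell, `psycopg2` exposes the same client-side copy via `copy_expert`. A minimal sketch (the connection details and table/path names are illustrative, not part of this repo):

```python
# Optional alternative to the shell \copy commands: psycopg2's
# copy_expert performs the same client-side CSV copy from Python.
def copy_sql(table: str) -> str:
    """SQL for a client-side CSV copy (with header row) into `table`."""
    return f"COPY {table} FROM STDIN WITH (FORMAT csv, HEADER)"

def load_csv(conn, table: str, path: str) -> None:
    """Stream a local CSV file into `table` over an open connection."""
    with conn.cursor() as cur, open(path) as f:
        cur.copy_expert(copy_sql(table), f)
    conn.commit()

# Usage sketch (assumes psycopg2 is installed and the server is running):
#   import psycopg2
#   conn = psycopg2.connect(dbname="mydatabase", user="myusername")
#   load_csv(conn, "players", "./data/player_bios.csv")
```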
With the database built, it should be relatively simple to collect data from the ongoing season and update the tables. The update.py script does this by scraping data for the current season (set in the script), dropping the rows collected so far (except player bios, where only new players are added), and then appending the newly scraped data to the existing tables.
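The drop-then-append step described above can be sketched as follows. This is a simplified illustration using pandas and SQLAlchemy, not the actual update.py; the table and column names are assumptions.

```python
# Sketch of the update flow: delete this season's rows, then append
# the freshly scraped data. (Hypothetical names; update.py may differ.)
import pandas as pd
from sqlalchemy import create_engine, text

CURRENT_SEASON = 20232024  # assumption: set in the script, as noted above

def refresh_table(engine, table: str, df: pd.DataFrame, season_col: str = "season"):
    """Drop rows already stored for the current season, then append df."""
    with engine.begin() as conn:
        conn.execute(
            text(f"DELETE FROM {table} WHERE {season_col} = :s"),
            {"s": CURRENT_SEASON},
        )
    df.to_sql(table, engine, if_exists="append", index=False)
```

For the bios table the logic differs: instead of deleting, only players not already present would be inserted.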
- Scrape season standings and schedule
- Pull settings into a global settings file
- Remove the need for bash scripts by using SQLAlchemy to create the initial tables