- Crawl data from the OpenWeather API
- Load the data into a SQLite relational database
- Schedule the process to run every 3 seconds
✔️ Python 3.7 or above to run the project
- Download Python at https://www.python.org/downloads/
✔️ Python libraries:
- schedule (third-party; installed via Poetry)
- requests (third-party; installed via Poetry)
- sqlite3 (included in the standard library since Python 2.5)
- json (built-in module)
✔️ Environment: Visual Studio Code, with the following extensions:
- Python (to run Python source)
- SQLite Viewer (to view SQLite tables)
✔️ Project Package Management: Poetry
- src/main.py: run this file; it contains the whole process
- src/database.py: contains database operations
- src/utilities.py: contains file/database processing functions
- schema/Schema.jpg: database schema image
- schema/response_description.txt: API data description
- result/weather.json: response value from the API, saved in JSON format
- result/weather.db: SQLite database
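As a rough illustration of the kind of operation src/database.py performs, here is a minimal `sqlite3` sketch; the table name and columns are hypothetical (the actual schema is in schema/Schema.jpg):

```python
import sqlite3

# ":memory:" for demonstration; the project writes result/weather.db
conn = sqlite3.connect(":memory:")

# hypothetical table; see schema/Schema.jpg for the real schema
conn.execute(
    "CREATE TABLE IF NOT EXISTS weather ("
    "city TEXT, temp REAL, unit TEXT, recorded_at TEXT)"
)
conn.execute(
    "INSERT INTO weather VALUES (?, ?, ?, ?)",
    ("Hanoi", 30.5, "metric", "2023-01-01T00:00:00"),
)
conn.commit()

rows = conn.execute("SELECT city, temp FROM weather").fetchall()
```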
- Install a Python 3.x version, VS Code (https://code.visualstudio.com/), and the libraries and extensions listed above
- In the VS Code terminal, run the following commands to install all required packages
>> pip install poetry
>> poetry install
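For reference, a minimal pyproject.toml that `poetry install` could consume might look like the following; the project name, description, and version constraints are illustrative, not taken from the repository:

```toml
[tool.poetry]
name = "weather-crawler"   # hypothetical project name
version = "0.1.0"
description = "Crawl OpenWeather data into SQLite"

[tool.poetry.dependencies]
python = "^3.7"
schedule = "*"   # version constraints are illustrative
requests = "*"

[build-system]
requires = ["poetry-core"]
build-backend = "poetry.core.masonry.api"
```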
- Run the src/main.py file
- "Enter file name (e.g. weather.json)": name the JSON file that will store the crawled data
- "Enter database name (e.g. weather.db)": name the SQLite database
- "Enter location": a city name (e.g. Hanoi, Moscow, New York, Tokyo, Havana, Seoul, Saigon, Paris, Berlin, ...)
- "Enter unit": metric or imperial
- If there are no errors, the program automatically generates two new files containing the crawled data: the .json file and the .db database
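The prompts above map onto a fetch-and-save step roughly like the sketch below. The endpoint is OpenWeather's current-weather API; `API_KEY` is a placeholder you must replace with your own key, and the function names are illustrative, not the project's actual ones:

```python
import json

import requests  # third-party; installed via Poetry

BASE_URL = "https://api.openweathermap.org/data/2.5/weather"
API_KEY = "YOUR_API_KEY"  # placeholder: register at openweathermap.org for a key

def build_params(location, unit):
    # `units` accepts "metric" (Celsius) or "imperial" (Fahrenheit)
    return {"q": location, "units": unit, "appid": API_KEY}

def crawl_weather(location, unit, filename):
    resp = requests.get(BASE_URL, params=build_params(location, unit), timeout=10)
    resp.raise_for_status()
    data = resp.json()
    with open(filename, "w") as f:
        json.dump(data, f, indent=2)  # save the raw response, e.g. weather.json
    return data
```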