Scrapes the publicly available "wiki" (apparently an index of visitors' search queries) from ahaonline.cz, blesk.cz and e15.cz.
Download the BrainfartScraper.sh file and edit the maximum page values to match the highest page number on each target site. Delete the code for any site you do not want to scrape. The resulting .txt files are created in the local directory and contain all search queries, one per line (LF line endings).
Note: Running the script from an HDD is recommended to avoid wear from excessive write operations on an SSD, because the script writes to the output file once for every page (300-1617 writes per wiki)!
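The script itself is not reproduced here, but the per-page loop it describes presumably looks something like the sketch below. The URL pattern, the `<li class="query">` selector, the output file name, and the `extract_queries` helper are all invented for illustration; the real markup of the three sites will differ, and MAX_PAGE is set to 0 here so the sketch runs without network access.

```shell
#!/bin/sh
# Hypothetical sketch of one wiki's scraping loop. URL, selector and
# output name are assumptions for illustration, not taken from the script.

MAX_PAGE=0                 # set to the highest page number on the target site
                           # (0 here so the sketch does nothing by default)
OUT="ahaonline.txt"        # assumed output file name

# extract_queries: print one search query per line from a page's HTML.
# The <li class="query"> markup is an assumption about the site.
extract_queries() {
    sed -n 's/.*<li class="query">\([^<]*\)<\/li>.*/\1/p'
}

: > "$OUT"                 # start with an empty output file
page=1
while [ "$page" -le "$MAX_PAGE" ]; do
    # one append to $OUT per page, matching the write pattern noted above
    curl -s "https://example.invalid/wiki/?page=$page" | extract_queries >> "$OUT"
    page=$((page + 1))
done
```

Appending once per page is what produces the 300-1617 writes mentioned in the note; buffering all pages in memory and writing once at the end would avoid that, at the cost of losing partial results if the run is interrupted.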
Give the file execute permissions:
chmod +x ./BrainfartScraper.sh
Execute:
./BrainfartScraper.sh
Runtime depends on your internet connection; scraping all wikis takes roughly 2 hours.
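Once a run finishes, a quick sanity check is to count lines per output file: since the format is one query per LF-terminated line, the line count equals the number of scraped queries. The file below is a stand-in created so the snippet is runnable on its own; with real output, point wc at the .txt files the script produced.

```shell
#!/bin/sh
# Stand-in output file so this check is self-contained.
printf 'pocasi praha\nkurz eura\n' > demo-wiki.txt

# One query per line, so the line count is the query count.
wc -l demo-wiki.txt
```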
The script's output as of 2023-04-30 is available in the attached text files.