A bash script to quickly configure a Raspberry Pi for web crawling and scraping. It automates the installation of Python, Selenium, Firefox, and other required dependencies, and adds several quality-of-life shell functions.
Choose one of the following methods to get started.
This method is ideal for a fresh Raspberry Pi OS installation, as it uses pre-installed tools.
1. Download, unzip and navigate:

   ```shell
   curl -LO https://github.com/m4-k131/Berry-Picker/archive/refs/heads/main.zip
   unzip main.zip
   cd Berry-Picker-main
   ```

2. Proceed to the Configuration step below.
This method is useful if you already have git installed.
1. Clone and navigate:

   ```shell
   git clone https://github.com/m4-k131/Berry-Picker.git
   cd Berry-Picker
   ```

2. Proceed to the Configuration step below.
After following one of the installation methods, complete the setup with these steps:
1. Edit the environment file: Use a text editor like `nano` to add your credentials and authorized SSH keys.

   ```shell
   nano .env
   ```

   Press `Ctrl+X`, then `Y`, then `Enter` to save and exit.

2. Setup:

   ```shell
   chmod +x setup.sh
   ./setup.sh
   source ~/.bashrc
   ```
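For reference, `.env` files are typically plain shell-style `KEY=value` assignments. The variable names below are illustrative placeholders only, not the actual keys the setup script reads:

```shell
# Hypothetical .env sketch. The real variable names are whatever setup.sh
# expects; these are placeholders showing the file's general shape.
CRAWLER_USER="your-username"                        # assumed credential name
CRAWLER_PASS="your-password"                        # assumed credential name
AUTHORIZED_KEYS="ssh-ed25519 AAAA... you@laptop"    # SSH public key(s) to authorize
```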
The script adds several helpful aliases and functions to your shell:
- `start_crawler`: Activates the Python virtual environment located at `~/venv/crawler/`.
- `workon <venv_name>`: A generic function to activate any virtual environment located in `~/venv/`.
- `lswc [dir] [pattern]`: Counts files and folders in a directory, with an optional pattern.
- `vpn_start` / `vpn_stop` / `vpn_status`: Aliases to manage an OpenVPN connection.
- `sattach`: Re-attaches to the first available detached `screen` session.
- `check_pi`: Provides a verbose, human-readable report on the Pi's power and thermal status to check for undervolting or throttling.
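The exact definitions live in `setup.sh`, but as a rough sketch of the behavior described above (assuming virtual environments live under `~/venv/`; the shipped implementations may differ), `workon` and `lswc` could look like this:

```shell
# Sketch only, not the shipped code: activate a venv by name from ~/venv/
workon() {
    # e.g. `workon crawler` sources ~/venv/crawler/bin/activate
    source "$HOME/venv/$1/bin/activate"
}

# Sketch only: count files and folders in a directory, optionally filtered
lswc() {
    local dir="${1:-.}" pattern="${2:-*}"
    local count
    # -maxdepth 1 keeps the count non-recursive; ! -path excludes the dir itself
    count=$(find "$dir" -maxdepth 1 -name "$pattern" ! -path "$dir" | wc -l)
    # $((count)) strips any whitespace padding some wc implementations emit
    echo "$((count)) entries in $dir matching '$pattern'"
}
```

For example, `lswc ~/Downloads '*.zip'` would report how many ZIP archives sit directly in `~/Downloads`.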