
🍓 Berry-Picker

A bash script to quickly configure a Raspberry Pi for web crawling and scraping. It automates the installation of Python, Selenium, Firefox, and other required dependencies, and adds several quality-of-life shell functions.


Installation

Choose one of the following methods to get started.

Method 1: Using curl + unzip (Recommended)

This method is ideal for a fresh Raspberry Pi OS installation, as it uses pre-installed tools.

  1. Download, unzip and navigate:

    curl -LO https://github.com/m4-k131/Berry-Picker/archive/refs/heads/main.zip
    unzip main.zip
    cd Berry-Picker-main
  2. Proceed to the Configuration step below.

Method 2: Using git

This method is useful if you already have git installed.

  1. Clone and navigate:

    git clone https://github.com/m4-k131/Berry-Picker.git
    cd Berry-Picker
  2. Proceed to the Configuration step below.


Configuration & Execution

After following one of the installation methods, complete the setup with these steps:

  1. Edit the environment file: Use a text editor like nano to add your credentials and authorized SSH keys.

    nano .env

    Press Ctrl+X, then Y, then Enter to save and exit.
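The actual variable names are defined by the repository's `.env` template; the snippet below is only a hypothetical illustration of the kind of values it holds.

```shell
# Hypothetical .env contents -- the real template in the repo defines
# the actual variable names; these are placeholders for illustration.
SSH_AUTHORIZED_KEY="ssh-ed25519 AAAA... user@host"
VPN_USERNAME="your-vpn-user"
VPN_PASSWORD="your-vpn-password"
```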

  2. Make the setup script executable, run it, then reload your shell:

    chmod +x setup.sh
    ./setup.sh
    source ~/.bashrc
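After the setup finishes, you can optionally verify that the core dependencies are on your PATH. The tool names below are assumed from the README's dependency list; adjust them to match what setup.sh actually installs.

```shell
# Sanity check: report which of the expected tools are available.
for tool in python3 pip3 firefox; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "$tool: found"
  else
    echo "$tool: MISSING"
  fi
done
```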

Usage

The script adds several helpful aliases and functions to your shell:

  • start_crawler: Activates the Python virtual environment located at ~/venv/crawler/.
  • workon <venv_name>: A generic function to activate any virtual environment located in ~/venv/.
  • lswc [dir] [pattern]: Counts files and folders in a directory, with an optional pattern.
  • vpn_start / vpn_stop / vpn_status: Aliases to manage an OpenVPN connection.
  • sattach: Re-attaches to the first available detached screen session.
  • check_pi: Provides a verbose, human-readable report on the Pi's power and thermal status to check for undervolting or throttling.
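The real definitions live in what setup.sh appends to your shell configuration. As an illustration only, workon and the throttle-flag decoding at the heart of check_pi might look like the sketch below (assumed implementations, not the script's actual code; the bit meanings are those documented for `vcgencmd get_throttled`):

```shell
# Hypothetical sketch of workon: activate a virtualenv under ~/venv/.
workon() {
  local activate="$HOME/venv/$1/bin/activate"
  if [ -f "$activate" ]; then
    . "$activate"
  else
    echo "workon: no virtual environment named '$1' in ~/venv/" >&2
    return 1
  fi
}

# Hypothetical sketch of the check_pi core: decode the hex value that
# `vcgencmd get_throttled` reports into human-readable flags.
decode_throttled() {
  local val=$(( 16#${1#0x} ))
  local found=0
  if (( val & 0x1 ));     then echo "under-voltage detected"; found=1; fi
  if (( val & 0x2 ));     then echo "ARM frequency capped"; found=1; fi
  if (( val & 0x4 ));     then echo "currently throttled"; found=1; fi
  if (( val & 0x10000 )); then echo "under-voltage has occurred"; found=1; fi
  if (( val & 0x20000 )); then echo "ARM frequency capping has occurred"; found=1; fi
  if (( val & 0x40000 )); then echo "throttling has occurred"; found=1; fi
  if (( found == 0 ));    then echo "no issues"; fi
}
```

For example, `decode_throttled "$(vcgencmd get_throttled | cut -d= -f2)"` would turn a raw value like `0x50005` into a readable list of power and thermal events.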
