- 📸 Preview
- ✨ Features
- 🚀 Getting Started
- ▶️ Running the Bot
- 🐳 Running with Docker
- 🛠️ Configuration
- 💡 Ideas for Extensions
- ⚖️ Legal & Ethical Notes
- 📄 License
Here's how the bot looks in action:
- 🔄 Automatic updates every 2.5 minutes
- 🕒 Live opening hours scraped directly from each library page
- 🪑 Seat information: occupied, free, and percentage
- 📍 Location info for each library branch
- 💾 MySQL persistence for later analysis and visualization
- 🎭 Fun rotating status messages (e.g. "listening to loud students in the library")
```
git clone https://github.com/niclassslua/unimannheim-bib-discord-bot.git
cd unimannheim-bib-discord-bot
npm install
```
- Go to the Discord Developer Portal.
- Click New Application → give it a name → Create.
- Navigate to Bot → click Add Bot.
- Copy the Bot Token (use this in your `.env` file as `DISCORD_TOKEN`).
- Under Privileged Gateway Intents, enable:
- ✅ Message Content Intent
- ✅ Server Members Intent
- Under OAuth2 → URL Generator:
- Select `bot` and `applications.commands`.
- Under Bot Permissions, choose:
- Send Messages
- Embed Links
- Read Message History
- Copy the generated URL and add the bot to your server.
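The generated invite link follows Discord's standard OAuth2 authorize format. In the sketch below, `YOUR_APPLICATION_ID` is a placeholder, and the `permissions` value (83968 = Send Messages 2048 + Embed Links 16384 + Read Message History 65536) is computed from the three permissions listed above; double-check it against the URL the portal generates for you:

```
https://discord.com/api/oauth2/authorize?client_id=YOUR_APPLICATION_ID&permissions=83968&scope=bot%20applications.commands
```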
Copy the provided `.env.example` and fill in your own values:
```
cp .env.example .env
```
Edit `.env` and add:
- 🎫 Your Discord Bot Token
- 🗄️ Your MySQL database credentials (only if you enable DB persistence)
- 💬 The Discord Channel ID + message IDs for each embed
Run the bot in development mode:
```
npm run dev
```
or start it normally:
```
npm start
```
The bot will:
- Log in to Discord
- Fetch the latest occupancy data
- Update the pinned messages in your chosen channel
- Save a snapshot every 4 update cycles to your database (if DB is enabled)
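The "every 4 update cycles" cadence boils down to a simple modulo counter. Here is an illustrative sketch, not the bot's actual code (all names and the commented-out loop are assumptions):

```typescript
// Illustrative sketch of the update loop; names are assumptions, not the bot's real API.
const UPDATE_INTERVAL_MS = 150_000; // one cycle every 2.5 minutes
const SNAPSHOT_EVERY = 4;           // persist to MySQL every 4th cycle

let cycle = 0;

// Returns true on every 4th call, i.e. when a DB snapshot should be written.
function shouldSnapshot(): boolean {
  cycle += 1;
  return cycle % SNAPSHOT_EVERY === 0;
}

// The loop itself would then look roughly like:
// setInterval(async () => {
//   const data = await fetchOccupancy();            // scrape the library page
//   await updatePinnedMessages(data);               // edit the Discord embeds
//   if (shouldSnapshot()) await saveSnapshot(data); // roughly every 10 minutes
// }, UPDATE_INTERVAL_MS);
```

With a 150-second interval, a snapshot every 4th cycle means one database row per branch about every 10 minutes.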
This project provides two Compose setups depending on whether you want persistence:
Use this if you don't want to store data (Discord updates only).
Compose file: `docker-compose.bot.yml`
```
docker compose -f docker-compose.bot.yml up -d --build
```
This sets `USE_DB=false` explicitly, so the bot skips all DB writes.
Use this to run the bot with its own MySQL database.
Compose file: `docker-compose.dev.yml`
```
docker compose -f docker-compose.dev.yml up -d --build
```
- MySQL is started alongside the bot.
- The schema is auto-created from `resources/sql/init.sql`.
- Data persists in the `db_data` volume.
- The bot connects internally to the `db` service with `USE_DB=true`.
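Roughly, the dev Compose file wires these pieces together as sketched below. This is an illustration of the setup described above, not the repo's actual file; service names, the MySQL image tag, and the environment keys are assumptions (the `docker-entrypoint-initdb.d` mount is the official MySQL image's mechanism for running an init script on first startup):

```yaml
services:
  bot:
    build: .
    environment:
      USE_DB: "true"
      DB_HOST: db        # the bot reaches MySQL via the internal service name
    depends_on:
      - db
  db:
    image: mysql:8
    environment:
      MYSQL_DATABASE: ${DB_DATABASE}
      MYSQL_USER: ${DB_USER}
      MYSQL_PASSWORD: ${DB_PASSWORD}
      MYSQL_ROOT_PASSWORD: ${DB_PASSWORD}
    volumes:
      - db_data:/var/lib/mysql
      # the official MySQL image runs any .sql file in this directory on first startup
      - ./resources/sql/init.sql:/docker-entrypoint-initdb.d/init.sql
volumes:
  db_data:
```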
```env
# Discord
DISCORD_TOKEN=your-bot-token-here
TEXT_CHANNEL_ID=123456789012345678
MESSAGE_A3=123456789012345678
MESSAGE_A5=123456789012345678
MESSAGE_EHRENHOF=123456789012345678
MESSAGE_SCHNECKENHOF=123456789012345678

# Database
USE_DB=true # set false to disable DB writes entirely
DB_HOST=localhost
DB_USER=your-db-user
DB_PASSWORD=your-db-password
DB_DATABASE=your-db-name

# Scraper
OCCUPANCY_URL=https://www.bib.uni-mannheim.de/standorte/freie-sitzplaetze/
DEBUG=false
```
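Every value in `.env` arrives as a string, so boolean flags like `USE_DB` and `DEBUG` have to be parsed explicitly. A minimal sketch, assuming the usual `process.env` access (the `envFlag` helper is illustrative, not the bot's actual code):

```typescript
// Illustrative helper for reading boolean flags from the environment.
function envFlag(name: string, fallback = false): boolean {
  const raw = process.env[name];
  if (raw === undefined) return fallback;
  // trim guards against stray whitespace around the value
  return raw.trim().toLowerCase() === "true";
}

const useDb = envFlag("USE_DB"); // false disables all DB writes
const debug = envFlag("DEBUG");
```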
The bot writes snapshots into a `belegung` table:
```sql
CREATE TABLE belegung (
    id INT AUTO_INCREMENT PRIMARY KEY,
    bib VARCHAR(50) NOT NULL,
    percentage INT NOT NULL,
    occupied INT NOT NULL,
    created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);
```
- 📊 Grafana dashboard for historical occupancy data
- 🌐 REST API for other apps to consume the data
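For the dashboard idea, a query like the one below over the `belegung` table would give per-branch hourly averages to chart. This is a sketch in standard MySQL syntax, not something shipped with the project:

```sql
-- Average occupancy per branch and hour, suitable for a time-series chart
SELECT bib,
       DATE_FORMAT(created_at, '%Y-%m-%d %H:00') AS hour,
       AVG(percentage) AS avg_percentage
FROM belegung
GROUP BY bib, hour
ORDER BY hour, bib;
```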
- This project scrapes the public seat availability page of the University of Mannheim Library.
- According to the `robots.txt`, the relevant path `/standorte/freie-sitzplaetze/` is not disallowed for crawlers.
- The data is publicly accessible without authentication and intended to inform students about seat occupancy.
- Requests are made only once every 150 seconds, resulting in negligible server load.
- The bot is non-commercial and purely for student convenience (Discord notifications).
- I fully respect the rights of the University Library and will adapt or disable the scraper immediately if requested.
This project is licensed under the MIT License.
Feel free to use, modify, and share; just credit the author.
Made with ❤️
