In implementing POSSE (publish on your own site, syndicate elsewhere), I need to publish to multiple social networks. Social Publish provides direct API integration for X (Twitter), Mastodon, Bluesky, and LinkedIn, plus an RSS feed for automation.
- Publish the same post to multiple networks from one form
- Upload images with alt-text, or let an LLM generate alt-text automatically
- Provide an RSS feed for external automation tools like IFTTT
My docker-compose setup:

```yaml
version: "3.8"
services:
  # ...
  social-publish:
    container_name: social-publish
    image: ghcr.io/alexandru/social-publish:latest
    restart: always
    healthcheck:
      test: ["CMD-SHELL", "curl --head http://localhost:3000/ || exit 1"]
    ports:
      - "3000:3000"
    env_file:
      - ./envs/social-publish.env
    networks:
      - external_network
```

Where `./envs/social-publish.env` contains:
```sh
# Where the server is hosted, needed for correctly generating the RSS feed
BASE_URL="https://your-hostname.com"

# The server's Basic auth credentials
SERVER_AUTH_USERNAME="your-username"
SERVER_AUTH_PASSWORD="your-password"

# Bluesky credentials
BSKY_HOST="https://bsky.social"
BSKY_USERNAME="your-username"
BSKY_PASSWORD="your-password"

# Mastodon credentials
MASTODON_HOST="https://mastodon.social"
MASTODON_ACCESS_TOKEN="your-access-token"

# Twitter OAuth1 key and secret ("Consumer Keys" in the Developer Portal)
TWITTER_OAUTH1_CONSUMER_KEY="Api Key"
TWITTER_OAUTH1_CONSUMER_SECRET="Api Secret Key"

# LinkedIn OAuth2 credentials
LINKEDIN_CLIENT_ID="your-client-id"
LINKEDIN_CLIENT_SECRET="your-client-secret"

# LLM for alt-text generation (optional)
# Configure the API endpoint, key, and model for your LLM provider.
# For OpenAI:
LLM_API_URL="https://api.openai.com/v1/chat/completions"
LLM_API_KEY="your-openai-api-key"
LLM_MODEL="gpt-4o-mini"
# For Mistral:
# LLM_API_URL="https://api.mistral.ai/v1/chat/completions"
# LLM_API_KEY="your-mistral-api-key"
# LLM_MODEL="pixtral-12b-2409"

# Used for signing authentication tokens (https://jwt.io)
JWT_SECRET="random string"
```

For Bluesky, you'll need an "app password".
- Go here to create one: https://bsky.app/settings/app-passwords
- Copy the password
- Set the `BSKY_PASSWORD` environment variable to it
Keep it safe, as it grants access to everything.
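To sanity-check the app password before wiring it into the container, you can authenticate against the AT Protocol session endpoint directly. A quick curl sketch, with the handle and password as placeholders:

```shell
# Create an AT Protocol session using the app password.
# A successful response is JSON containing "accessJwt" and "did";
# a bad password yields an "AuthenticationRequired" error.
curl -s -X POST "https://bsky.social/xrpc/com.atproto.server.createSession" \
  -H "Content-Type: application/json" \
  -d '{"identifier": "your-handle.bsky.social", "password": "your-app-password"}'
```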
For Mastodon, you'll need an "access token". Here's how to get one:
- Go to: https://mastodon.social/settings/applications
- Create a "New Application"
- Select `write:statuses` and `write:media` for permissions, and unselect everything else
- Click on the newly created application
- Copy "your access token"
- Set the `MASTODON_ACCESS_TOKEN` environment variable to it
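You can verify that the token works with a quick curl against Mastodon's credential-verification endpoint (the host and token below are placeholders):

```shell
# Returns the application's details as JSON when the token is valid;
# an invalid token yields a 401 error.
curl -s -H "Authorization: Bearer your-access-token" \
  "https://mastodon.social/api/v1/apps/verify_credentials"
```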
For Twitter, we're working with OAuth1.
- Go to: https://developer.twitter.com/en/portal/projects-and-apps
- Create a project and app
- In the "Keys and tokens" section of the app, generate "Consumer Keys" and copy the generated "App Key and Secret"
- In the app's settings, go to "User authentication settings" and add `https://<your-domain.com>/api/twitter/callback` as the "Callback URL" (replace `<your-domain.com>` with your domain, obviously)
- Set the `TWITTER_OAUTH1_CONSUMER_KEY` and `TWITTER_OAUTH1_CONSUMER_SECRET` environment variables
- Once the server is running, go to `https://<your-domain.com>/account` and click on "Connect Twitter"
For LinkedIn, we're working with OAuth2.
- Go to: https://www.linkedin.com/developers/apps
- Click "Create app" and fill in the required details
- In the "Auth" tab, copy the "Client ID" and "Client Secret"
- Add `https://<your-domain.com>/api/linkedin/callback` as a redirect URL (replace `<your-domain.com>` with your actual domain)
- In the "Products" tab, request access to:
  - "Sign In with LinkedIn using OpenID Connect" (provides the `openid` and `profile` scopes)
  - "Share on LinkedIn" (provides the `w_member_social` scope)
- Set the `LINKEDIN_CLIENT_ID` and `LINKEDIN_CLIENT_SECRET` environment variables
- Once the server is running, go to `https://<your-domain.com>/account` and click on "Connect LinkedIn"
Note: LinkedIn access tokens expire after 60 days. The system automatically refreshes tokens using the refresh token, which is valid for 1 year. You'll need to reconnect if the refresh token expires.
The application can integrate with LLM providers to automatically generate alt-text descriptions for images. This feature is optional and supports any OpenAI-compatible API (including OpenAI, Mistral AI, and other providers).
Supported providers:
- OpenAI (e.g., GPT-4o-mini): https://platform.openai.com/api-keys
- Mistral AI (e.g., Pixtral): https://console.mistral.ai/api-keys/
- Any OpenAI-compatible API endpoint
Configuration:
- Get an API key from your chosen provider
- Set the environment variables:
  - `LLM_API_URL`: the API endpoint URL (e.g., `https://api.openai.com/v1/chat/completions`)
  - `LLM_API_KEY`: your API key
  - `LLM_MODEL`: the model name (e.g., `gpt-4o-mini` for OpenAI, `pixtral-12b-2409` for Mistral)
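Under the hood, an OpenAI-compatible alt-text request is just a chat completion with an image attached. A minimal sketch of such a call (the prompt wording and image URL are illustrative, not the application's actual prompt):

```shell
# Ask an OpenAI-compatible vision model to describe an image.
# Requires LLM_API_KEY to be set; the response JSON carries the
# generated description under choices[0].message.content.
curl -s "https://api.openai.com/v1/chat/completions" \
  -H "Authorization: Bearer $LLM_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-4o-mini",
    "messages": [{
      "role": "user",
      "content": [
        {"type": "text", "text": "Describe this image as concise alt-text."},
        {"type": "image_url", "image_url": {"url": "https://example.com/photo.jpg"}}
      ]
    }]
  }'
```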
The RSS feed is exposed at `/rss` (e.g., `http://localhost:3000/rss`). Use it with automation tools like ifttt.com if you want additional workflows beyond the direct integrations.
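For example, to check the feed from the command line (whether the feed requires the server's Basic auth credentials depends on your setup, so the `-u` flag here is an assumption; drop it if the feed is public):

```shell
# Fetch the RSS feed, authenticating with the server's Basic auth credentials
curl -s -u "your-username:your-password" "https://your-hostname.com/rss"
```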
This is a Kotlin multiplatform project with:
- Backend: Ktor server with Arrow for functional programming
- Frontend: Compose for Web (Kotlin/JS)
- Build: Gradle with Kotlin DSL
To run the development environment with live reload:
```sh
make dev
```

This starts both the backend server (port 3000) and the frontend dev server (port 3002) with hot reload enabled.
To run backend and frontend separately:
```sh
# Backend only
make dev-backend

# Frontend only
make dev-frontend
```

You can navigate to http://localhost:3002 for the frontend, while the backend is available at http://localhost:3000.
To build the project:
```sh
make build
```

To run tests:

```sh
make test
```

To check and fix code formatting:

```sh
make lint    # Check formatting
make format  # Auto-format code
```

To build and test the Docker images locally:

```sh
# Build and run the JVM image
make docker-run-jvm
```

To run tests in a Docker environment that matches production:

```sh
# Run all tests in Docker
make test-docker

# Run the ImageMagick-specific tests in Docker
make test-imagemagick-docker
```

See the Makefile for all available commands.
This project is licensed under the GNU Affero General Public License v3 (AGPL-3.0). See LICENSE.txt for details.
