Conversation

@djantti (Contributor) commented Feb 2, 2026

We have Flatpak manifests in the tree now, so let's put them to good use! πŸ—οΈ

Right now this PR has the basics for creating single-file Mixxx bundles and uploading them as build artifacts. It's using the official Flatpak actions and containers from Flathub. The container images have all the necessary tools and SDKs included, saving on download time.

https://github.com/flatpak/flatpak-github-actions
https://github.com/flathub-infra/actions-images
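
A minimal job using these actions could look something like the sketch below. The action version, container image tag, and manifest path are assumptions on my part; check the flatpak-github-actions README for the current values.

```yaml
jobs:
  flatpak:
    runs-on: ubuntu-latest
    container:
      # Flathub container image with flatpak-builder and the SDK preinstalled.
      # The exact tag (KDE runtime version) is an assumption.
      image: ghcr.io/flathub-infra/flatpak-github-actions:kde-6.8
      options: --privileged
    steps:
      - uses: actions/checkout@v4
      - uses: flatpak/flatpak-github-actions/flatpak-builder@v6
        with:
          # Produce a single-file bundle and upload it as a build artifact.
          bundle: mixxx.flatpak
          # Assumed location of the manifest in the Mixxx tree.
          manifest-path: packaging/flatpak/org.mixxx.Mixxx.yaml
          cache-key: flatpak-builder-${{ github.sha }}
```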

These can be used for many things - verifying manifests and metadata, creating a package on other events, building Mixxx releases, uploading to repositories and even to Flathub.

There's a lot to do and my GitHub workflow experience is limited, so help is very much appreciated. πŸ™

@daschuer (Member) commented Feb 2, 2026

This looks straightforward. Thank you.

We have discussed providing different streams for our branches and for stable builds, like we do on our PPA.
Did you look into this? Is it still a reasonable idea? How can users switch from one branch to another?
Which infrastructure do we need for this?

How does it relate to:
https://nightly.gnome.org
Can we follow their ideas?

@acolombier (Member) commented
Indeed, this is looking very good already. One thing I would like to see is the Flatpak uploaded to the downloads.mixxx.org server, like any other Mixxx build artifact.

For that reason, it might be easier to merge the new job into the existing build matrix, so you can leverage the prepare-deploy and generate-manifest Python tasks.

Finally, we should look into automating the upload of a .flatpakref as part of the deploy.py script (or elsewhere), so we can easily distribute Mixxx snapshots or releases with something like

flatpak install https://downloads.mixxx.org/snapshots/2.5/mixxx-2.5.4-10-ga161b85ea9-x86_64.flatpakref

Of course, we will also claim the Flathub repo for stable releases, but allowing these flatpakrefs to be easily tested would be really good for users IMO.

I'll try to spin up a PR to your repo when time allows, but feel free to get a head start too :)

@djantti (Contributor, Author) commented Feb 3, 2026

Yeah, the Flatpak build tools and containers are easy to use. I like them a lot!

For hosting a Mixxx Flatpak repo, we could run flat-manager on the download server or just rsync new OSTree repo structures after a daily build or a release. The latter approach is simple, but AFAIK only the latest build would then be available in the repo. In this case we could keep an archive of .flatpak single-file bundles for older releases.
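
The rsync approach could be as simple as the sketch below. The local repo path, server hostname, and server path are all placeholders, not actual Mixxx infrastructure.

```sh
# After flatpak-builder exports the new build into the local OSTree repo,
# regenerate the repo metadata (appstream data, summary, static deltas)...
flatpak build-update-repo --generate-static-deltas repo/

# ...and mirror the whole repo to the static web server.
# Note: --delete is what limits the repo to the latest build only.
rsync -Ha --delete repo/ deploy@downloads.mixxx.org:/srv/flatpak/nightly/
```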

https://github.com/flatpak/flat-manager

GNOME likely uses flat-manager, since they have multiple releases available (see flatpak remote-info --log <repo> <app-id>). KDE only keeps the latest build of each nightly. They also have .flatpakref files conveniently placed in each app repo.

https://cdn.kde.org/flatpak
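
For reference, a .flatpakref is just a small INI-style file pointing at the repo. A hypothetical Mixxx one (the URL and branch are made up for illustration) could look like:

```ini
[Flatpak Ref]
Title=Mixxx
Name=org.mixxx.Mixxx
Branch=master
Url=https://downloads.mixxx.org/flatpak/repo
IsRuntime=false
; Pulls the runtime from Flathub if the user doesn't have it yet.
RuntimeRepo=https://dl.flathub.org/repo/flathub.flatpakrepo
```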

Installs from different branches can be switched using flatpak make-current <branch>, but it's also possible to use different app IDs (org.mixxx.Mixxx, org.mixxx.Mixxx.Nightly etc.). This would allow fully simultaneous installs and keep Mixxx's config files separated. The user would then have multiple desktop entries (Mixxx, Mixxx (Nightly) etc.) on their system.

For Flathub we could eventually request direct upload credentials and ask for the flathub/org.mixxx.Mixxx repo to be archived. But that's definitely something for later. πŸ˜…

@djantti (Contributor, Author) commented Feb 3, 2026

For that reason, it might be easier to merge the new job to the existing build matrix, so you can leverage the prepare-deploy and generate-manifest Python task.

I was tinkering with this earlier, but quite a few steps in the matrix needed to be skipped for the Flatpak build. So I ended up with many nested if: statements and my spider-senses started to tingle.

Maybe there's a neat way to split things into even more separate jobs? It's always difficult to figure out a clean solution with limited knowledge of what is even possible.

Anyway, I'll keep on working with this and add debug extension builds next. And PRs are most welcome! πŸ€—

@daschuer (Member) commented Feb 4, 2026

but it's also possible to use different app IDs (org.mixxx.Mixxx, org.mixxx.Mixxx.Nightly etc.). This would allow fully simultaneous installs and keep Mixxx's config files separated.

We don't have that yet for the other targets. Since it comes with some challenges, I think we should go for flatpak make-current <branch>.

Is the KDE solution a matter of sharing a directory? Would it be possible to maintain this directly on our download server without any active server component?

Let's go in small steps. This is already great because it provides the Flatpaks as artifacts for PRs. There is only one thing we need to double-check: how do they interact with RPM/Debian installs and possible future release streams?
What happens if a user is on a release stream and then installs a PR artifact?

How is signing handled?

@djantti (Contributor, Author) commented Feb 4, 2026

Sure thing, let's go with branches. They're quite flexible. πŸ‘

The way Flatpak branches seem to work is that you can have multiple installations of the same app ID, provided they use different branches. And to make things even more flexible (and sometimes confusing), that limitation is tracked separately for system-wide and per-user installations.

The branch name master is used unless something else is specified at build time. This is the current behavior for the PR and manual Flatpak builds. We can totally keep it like this, so other branch installs won't get overwritten by a PR artifact install. Also, the latest package install always becomes the active one. The active branch can be switched with something like flatpak make-current org.mixxx.Mixxx beta, and it's also possible to run a specific branch with, say, flatpak run org.mixxx.Mixxx//stable.
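
To illustrate the interplay, here's a hypothetical session for a user on a stable branch who tries a PR bundle (branch names and file name are illustrative):

```sh
# Installing a PR bundle built from master makes it the active branch...
flatpak install ./mixxx.flatpak

# ...but the stable install stays intact and can be made current again:
flatpak make-current org.mixxx.Mixxx stable

# Or run a specific branch directly without switching the active one:
flatpak run org.mixxx.Mixxx//master
```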

The KDE nightly Flatpak repos are indeed just directories on a standard web server. I took a peek at the build jobs and they use rsync -Ha --delete to upload new app repos. That's why only the latest build is available for each app. But I'd say it's fine for a simple nightly unstable repo.

But yeah, we probably should take small steps and limit the scope of this PR. Should we just finish up the artifact builds or add anything else?

I'll test next how the branch option behaves in the workflow. There's also a gpg-sign option for package / repo commit signing. I'll give that a try too.

@daschuer (Member) commented Feb 4, 2026

That sounds great. I think the only part we need to consider here is getting the naming convention right, so that it fits your future plans.
