Description
Bug report
- [x] I confirm this is a bug with Supabase, not with my own application.
- [x] I confirm I have searched the Docs, GitHub Discussions, and Discord.
Describe the bug
When subscribing to Postgres changes with a large id=in.(...) filter list, Supabase Realtime fails to register the subscription. The server returns ERROR 54000 (program_limit_exceeded) because the subscription table index row size exceeds the btree limit for subscription_subscription_id_entity_filters_key, so no realtime updates are delivered.
To Reproduce
- Create a Realtime channel and register a postgres_changes subscription with a filter containing a large list of IDs (e.g., id=in.(<60+ UUIDs>)).
- Subscribe to the channel.
- Observe the server error in the realtime logs.
Example supabase-js:
// Placeholders: SUPABASE_URL, SUPABASE_ANON_KEY, and hugeListOfUuids (an array of 60+ UUID strings)
import { createClient } from '@supabase/supabase-js';

const supabase = createClient(SUPABASE_URL, SUPABASE_ANON_KEY);

const channel = supabase.channel('db-changes');

channel.on(
  'postgres_changes',
  {
    event: '*',
    schema: 'public',
    table: 'destinations',
    // Large IN list; this is what triggers the index row size error
    filter: `id=in.(${hugeListOfUuids.join(',')})`,
  },
  (payload) => {
    console.log(payload);
  }
);

channel.subscribe();
Observed from Supabase:
receive error realtime:db-changes system {
  "channel": "db-changes",
  "extension": "postgres_changes",
  "message": "Unable to subscribe to changes with given parameters. An exception happened so please check your connect parameters: [schema: public, table: destinations, filters: [{\"id\", \"in\", \"{...}\"}]]. Exception: ERROR 54000 (program_limit_exceeded) index row size 3056 exceeds btree version 4 maximum 2704 for index \"subscription_subscription_id_entity_filters_key\" ...
    table: subscription
    constraint: subscription_subscription_id_entity_filters_key
    hint: Values larger than 1/3 of a buffer page cannot be indexed. Consider a function index of an MD5 hash of the value, or use full text indexing.
    Index row references tuple (7,3) in relation \"subscription\".",
  "status": "error"
}
Expected behavior
According to the documentation, this filter is limited to 100 values; if there is also a size limit, that should be documented as well. This error can only be surfaced by enabling realtime debug logs in the Supabase client constructor (see the sketch below).
Ideally, Supabase should either change this so the subscription is accepted regardless of filter size (perhaps using the MD5-hash function index suggested in the returned error), or return a clearer error with limit guidance before attempting to insert the oversized filter into the subscription table (e.g., reject with a user-facing limit or a documented maximum size).
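For reference, a minimal sketch of how the debug logs above were surfaced. This assumes the realtime-js logger callback is forwarded through createClient's realtime options; exact option names may differ between versions, and SUPABASE_URL / SUPABASE_ANON_KEY are placeholders:

import { createClient } from '@supabase/supabase-js';

// Assumed: the realtime option bag is passed to the underlying RealtimeClient,
// whose logger callback receives (kind, message, data).
const supabase = createClient(SUPABASE_URL, SUPABASE_ANON_KEY, {
  realtime: {
    logger: (kind, msg, data) => console.log(`realtime ${kind}:`, msg, data),
  },
});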
System information
- Version of supabase-js: [e.g. 6.0.2]
- Running against Supabase Cloud (Hosted)
Additional context
This is a React Native app. We dynamically subscribe to a growing set of destination IDs. Once the filter list grows large, realtime registration fails with the error above and no updates are delivered. If there’s a documented max filter size or a recommended workaround (e.g., server-side views, channel per subset, etc.), that would be helpful.
This can currently be worked around by batching the IDs so that each filter stays well below the limit (10-20 IDs per batch), as sketched below.
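A minimal sketch of that workaround, assuming one channel per batch and an arbitrary batch size of 20 (SUPABASE_URL, SUPABASE_ANON_KEY, and hugeListOfUuids are placeholders):

import { createClient } from '@supabase/supabase-js';

const supabase = createClient(SUPABASE_URL, SUPABASE_ANON_KEY);

// Split the full ID list into small batches so each filter stays well under the index row size limit
function chunk(ids, size) {
  const batches = [];
  for (let i = 0; i < ids.length; i += size) {
    batches.push(ids.slice(i, i + size));
  }
  return batches;
}

// One channel per batch, each with its own smaller id=in.(...) filter
const channels = chunk(hugeListOfUuids, 20).map((batch, index) =>
  supabase
    .channel(`db-changes-${index}`)
    .on(
      'postgres_changes',
      { event: '*', schema: 'public', table: 'destinations', filter: `id=in.(${batch.join(',')})` },
      (payload) => console.log(payload)
    )
    .subscribe()
);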