Description
Hi, thank you for making this great syncing framework. Here at my company we're looking into the "Sync data by time" section of the docs, and for now we've settled on bucketing data by day only. The worst case we're seeing is about a year of data, which is still well under the 1000-bucket limit, but the number of parameters (365) is daunting, since the token would become very large.
We noticed the example that shows the usage of the SQL function json_each:

SELECT value AS partition FROM json_each(request.parameters() ->> 'partitions')

and wondered whether we could also have a generate_series function, just like in Postgres:
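To make the json_each pattern concrete, here is a minimal sketch using Python's sqlite3 module, assuming a SQLite-like dialect with the JSON functions built in. The JSON string below stands in for the value of request.parameters() ->> 'partitions'; it is illustrative data, not taken from the docs.

```python
import sqlite3

# The JSON array plays the role of the 'partitions' request parameter.
partitions_json = '["2026-01-01", "2026-01-02", "2026-01-03"]'

conn = sqlite3.connect(":memory:")
# json_each expands the JSON array into one row per element,
# so each date becomes its own partition value.
rows = conn.execute(
    "SELECT value AS partition FROM json_each(?)",
    (partitions_json,),
).fetchall()
print([r[0] for r in rows])  # one partition per date in the array
```

This is exactly why the parameter list grows with the date range: every partition must be spelled out in the JSON array the client sends.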
SELECT generate_series('2026-01-01'::date, '2026-02-01'::date, '1 day'::interval)::date;
+-----------------+
| generate_series |
|-----------------|
| 2026-01-01 |
| 2026-01-02 |
| ... |
| 2026-01-31 |
| 2026-02-01 |
+-----------------+

This way, we would only need to pass the initial and final dates as parameters.
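For what it's worth, if the engine's dialect is SQLite-like, a recursive CTE can emulate generate_series today. This is a sketch under that assumption (demonstrated with Python's sqlite3), turning just two parameters, a start and an end date, into one row per day:

```python
import sqlite3

start, end = "2026-01-01", "2026-02-01"

conn = sqlite3.connect(":memory:")
# Recursive CTE: start at the first date, add one day per step,
# and stop once the end date is reached (inclusive).
rows = conn.execute(
    """
    WITH RECURSIVE dates(d) AS (
        SELECT date(?)
        UNION ALL
        SELECT date(d, '+1 day') FROM dates WHERE d < date(?)
    )
    SELECT d AS partition FROM dates
    """,
    (start, end),
).fetchall()
print(len(rows))  # 32 days: 2026-01-01 through 2026-02-01 inclusive
```

Only the two boundary dates cross the wire; the per-day expansion happens in SQL, which is the behavior we are hoping a built-in generate_series could provide.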
An alternative we considered was using the inequality operators >, >=, < and <=, which would work for the simple example in the documentation. In our application, however, we also need to filter data for a specific user, so we would have to denormalize the data for every user that has access to each project, and that would be a massive performance hit on our database.