I’m experiencing high CPU usage when using LiveQuery.
I have many active queries and a lot of data in the DB, for example: 5K documents, 25 live queries, and 200 changes per minute (not exact numbers).
The real issue is the CPU usage of the “changed” function (I used Android Studio’s profiler to check CPU).
As a quick fix I thought about lowering the live query frequency. I’ve noticed that live queries run at a 200 ms interval, which is overkill for us. Is there a way to slow it down?
I don’t believe there is a way to control the frequency, although we do coalesce some of the notifications (@Jens?).
That said, 25 live queries seems like quite a lot. Have you considered changing your queries or your data model so you have fewer of them? Do you need all 25 to be live at the same time?
It’s a very complex and dynamic infrastructure with multiple clients on the platform, so we do need that many queries.
I’ve noticed that the frequency constant in the CB code is named “default-something”, which implies it could be made configurable, possibly in a future update. It would really help if we could control that.
Can you tell me whether the live query process is optimized, or whether there is a future task to optimize it further?
25 live queries is probably too many. The way a live query works is that, when a db-changed notification is posted, it runs the query again, then compares the new results to the current ones; if they differ, it posts a notification. There’s some logic to keep it from running the query too often, but no more sophisticated optimizations.
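To make that loop concrete, here is a minimal self-contained sketch of the mechanism described above. This is illustrative only, not Couchbase Lite’s actual implementation; the class name `LiveQuerySketch` and its members are hypothetical:

```java
import java.util.List;
import java.util.function.Consumer;
import java.util.function.Supplier;

// Hypothetical sketch of a live query: on each db-changed notification,
// re-run the query, diff against the last results, and notify only when
// they differ. A simple time-based throttle keeps it from re-running
// too often (the "some logic" mentioned above).
class LiveQuerySketch {
    private final Supplier<List<String>> runQuery;   // re-executes the underlying query
    private final Consumer<List<String>> onChanged;  // invoked only when results differ
    private final long minIntervalMillis;            // throttle interval
    private List<String> lastResults = null;
    private long lastRunAt = 0;
    private boolean hasRun = false;

    LiveQuerySketch(Supplier<List<String>> runQuery,
                    Consumer<List<String>> onChanged,
                    long minIntervalMillis) {
        this.runQuery = runQuery;
        this.onChanged = onChanged;
        this.minIntervalMillis = minIntervalMillis;
    }

    // Called for every db-changed notification.
    void dbChanged(long nowMillis) {
        if (hasRun && nowMillis - lastRunAt < minIntervalMillis) return; // throttled
        hasRun = true;
        lastRunAt = nowMillis;
        List<String> fresh = runQuery.get();      // 1. run the query again
        if (!fresh.equals(lastResults)) {         // 2. compare to current results
            lastResults = fresh;
            onChanged.accept(fresh);              // 3. notify only on a real change
        }
    }
}
```

With 25 of these, every db-changed notification can trigger up to 25 query re-runs, which is where the CPU goes.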
If you’re making a large number of changes at once, one thing you can do is make them in a different Database object, all in one batch. Then your main Database will get only one change notification, after the batch/transaction commits, and your queries will only run once.
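To show why the batch helps, here is a self-contained sketch of the coalescing behavior. The class and method names here are hypothetical stand-ins, not Couchbase Lite’s API (in Couchbase Lite the batch is opened with the `Database.inBatch` method, if I recall correctly):

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative sketch: writes inside a batch are deferred and coalesced
// into a single change notification at commit, so each live query
// re-runs once per batch instead of once per write.
class BatchingDbSketch {
    interface Listener { void dbChanged(List<String> docIds); }

    private final List<Listener> listeners = new ArrayList<>();
    private List<String> pending = null; // non-null while inside a batch

    void addChangeListener(Listener l) { listeners.add(l); }

    void save(String docId) {
        if (pending != null) { pending.add(docId); return; } // defer until commit
        notifyChanged(List.of(docId));  // outside a batch: one notification per write
    }

    void inBatch(Runnable work) {
        pending = new ArrayList<>();
        try { work.run(); }
        finally {
            List<String> changed = pending;
            pending = null;
            if (!changed.isEmpty()) notifyChanged(changed); // one notification per batch
        }
    }

    private void notifyChanged(List<String> docIds) {
        for (Listener l : listeners) l.dbChanged(docIds);
    }
}
```

So 200 individual saves mean up to 200 query re-runs per live query, while the same 200 saves in one batch mean one re-run per live query.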