
Optimize dashboard loading time

Optimizing dashboard loading time is essential for a good user experience. Serenytics provides an intelligent cache for this purpose. Here is how to configure it best for your use case.

Use case 1: Data updated every day

Refreshing the data once a day is often the best compromise between data recency and dashboard loading speed.

Whatever your data source, the process is the same:

  • in your data source configuration, click on "Cache data" and choose the option "until cache invalidation".
  • in the "Automation" menu, create a new task of type "Refresh dashboards cache" and schedule it every day (e.g. at 3 a.m.). Select the dashboard(s) whose cache must be refreshed. When the task runs, the caches of all data sources used in the selected dashboards are automatically invalidated and the fresh data is cached. When a user loads the dashboard during the day, all the data is already in the cache: our compute engine has nothing to compute and the dashboard loads in minimal time.

Use case 2: Weekly or monthly updated data

It's the same principle as before. You just have to schedule the task every week or every month instead of every day.

Use case 3: Live dashboard

If your dashboard must always show the latest data, several strategies are available to get the best loading performance.

If the data source is a SQL source hosted in your infrastructure, and if it's not overloaded:

In this case, the best option is to use our REST API to invalidate the cache from your back-end each time you modify the data.
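As a minimal sketch of this pattern, the hook below POSTs to an invalidation endpoint right after your back-end writes to the database. The host, route, and auth header here are placeholders, not the real Serenytics API: check the REST API documentation for the actual endpoint and authentication scheme.

```python
import urllib.request

API_KEY = "YOUR_API_KEY"                  # placeholder: your Serenytics API key
BASE_URL = "https://api.serenytics.com"   # placeholder host; check the REST API docs

def build_invalidation_request(data_source_uuid):
    """Build the URL and headers for a cache-invalidation call (hypothetical route)."""
    url = f"{BASE_URL}/api/data_source/{data_source_uuid}/invalidate_cache"
    headers = {"X-Api-Key": API_KEY}
    return url, headers

def invalidate_cache(data_source_uuid):
    """Call the (hypothetical) invalidation endpoint right after your data changes."""
    url, headers = build_invalidation_request(data_source_uuid)
    req = urllib.request.Request(url, headers=headers, method="POST")
    with urllib.request.urlopen(req, timeout=10) as resp:
        return resp.status

# Example: call this from the code path that writes to your SQL database,
# so the next dashboard load re-queries the fresh data:
# invalidate_cache("your-data-source-uuid")
```

The important design point is to trigger the invalidation from the same code path that modifies the data, so the cache never serves stale results for longer than one request.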

In every other case:

You should synchronise your data into our internal datawarehouse (and use the new Storage data sources in your dashboards). To do so, you have two solutions:

  • either use a python script to synchronise the data incrementally, and schedule it for instance every 5 minutes if it's fast, otherwise every hour.
  • or push the data updates directly to our datawarehouse each time the data is modified on your side, by using our batch API from your back-end.
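The first solution above can be sketched as follows. This example assumes an `orders` table with an `updated_at` column (so only rows changed since the last run are extracted) and a hypothetical batch endpoint; the real batch API route and authentication are in the Serenytics documentation, and `sqlite3` stands in for your production database.

```python
import json
import sqlite3          # stand-in for your production database driver
import urllib.request

API_KEY = "YOUR_API_KEY"  # placeholder: your Serenytics API key
BATCH_URL = "https://api.serenytics.com/api/data_source/your-uuid/batch"  # hypothetical route

def get_modified_rows(conn, last_sync_iso):
    """Incremental extraction: fetch only rows modified since the previous run."""
    cur = conn.execute(
        "SELECT id, amount, updated_at FROM orders WHERE updated_at > ?",
        (last_sync_iso,),
    )
    return [dict(zip(("id", "amount", "updated_at"), row)) for row in cur.fetchall()]

def push_rows(rows, url=BATCH_URL, api_key=API_KEY):
    """POST the new/updated rows to the (hypothetical) batch endpoint."""
    req = urllib.request.Request(
        url,
        data=json.dumps({"rows": rows}).encode(),
        headers={"X-Api-Key": api_key, "Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        return resp.status

# Scheduled run (e.g. via the Serenytics Automation menu, or cron: */5 * * * *):
# rows = get_modified_rows(conn, load_last_sync_timestamp())
# if rows:
#     push_rows(rows)
#     save_last_sync_timestamp()
```

Because the extraction is incremental, each run only transfers the rows that changed, which is what makes a 5-minute schedule cheap enough to be practical.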


Note: updating the data in the Serenytics internal datawarehouse (especially via the batch method) automatically invalidates the data source cache.