It is important to understand the relationship between the in-memory cache and the amount of memory allocated to the Nathean Analytics hosting environment.

The first element is the hosting application: a certain amount of RAM is allocated to the web application, and this allocation is shared between the application itself and the in-memory cache. When the original solution is designed it is useful to try to size the memory requirements, but in hosted environments the allocated memory can be resized quite easily at any time.

You can view the memory currently in use at any time by opening the About dialog.

The second element is how much memory the cached datasets use. For each dataset that is marked to be cached, the system stores its data in memory, along with the data for the views on that dataset. Obviously, if large datasets are cached, the memory can fill up quickly.

Note: When the total memory used exceeds 90% of the limit for more than 30 seconds, the application is automatically recycled (when running in an Azure hosting plan – see here). This clears the in-memory cache and so can result in slow dashboards, as the cache will be rebuilt on the next access.
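
To make the recycle rule concrete: it is a sustained-threshold check – usage above 90% of the allocated limit for 30 seconds triggers a recycle, which drops the cache. The sketch below is purely illustrative (it is not the Nathean or Azure implementation, and the 1 GB limit, polling interval and psutil-based measurement are assumptions):

    import time
    import psutil  # third-party library used here to read process memory

    MEMORY_LIMIT_MB = 1024      # assumed hosting plan limit, not a Nathean setting
    THRESHOLD = 0.90            # 90% of the limit, as described in the note above
    SUSTAINED_SECONDS = 30      # usage must stay above the threshold for this long

    def used_memory_mb():
        # Resident memory of the current process, in MB (illustrative measure only)
        return psutil.Process().memory_info().rss / (1024 * 1024)

    def monitor(recycle):
        # Call recycle() once memory stays above the threshold for 30 seconds
        breach_started = None
        while True:
            if used_memory_mb() > MEMORY_LIMIT_MB * THRESHOLD:
                breach_started = breach_started or time.monotonic()
                if time.monotonic() - breach_started >= SUSTAINED_SECONDS:
                    recycle()             # clears the in-memory cache
                    breach_started = None
            else:
                breach_started = None     # usage dropped back below the threshold
            time.sleep(5)                 # polling interval (arbitrary)

The practical point is that any sustained spike above the threshold, not just a momentary one, causes the cache to be dropped and rebuilt.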

Strategies for improving memory usage

  1. Reduce the number of cached datasets. Dashboards benefit most from cached datasets; formatted reports benefit less.
  2. On cached datasets, set the default parameters such that only a small number of rows are returned.
  3. Reduce the number of views cached – by default all views are cached, but individual views can be excluded from the caching process – see exclude views from the cache.
  4. Identify datasets which are consuming too much memory:
  • It may be obvious – if a dataset is expected to return a large number of rows or columns, it will consume more memory.
  • Use the field sizes on the dataset field definition to help identify large rows (a rough estimation example is sketched after this list).
  • Examine the content of the cache – you can set up a data connection and dataset which shows the content of the cache at any time. This helps identify the hungriest datasets.
  • If none of the above resolves the issue, it may be that additional memory needs to be provisioned. In the Azure environment this is straightforward, but it does incur additional hosting charges.
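
As a rough way of applying the field-size advice above, the memory footprint of a cached dataset can be estimated as the row count multiplied by the total bytes per row. The sketch below is only an illustration – the field names and sizes are hypothetical, and the product's internal accounting (including the cached views) will add overhead on top:

    # Hypothetical field sizes (bytes per value), as read from the dataset field definition
    field_sizes = {
        "CustomerName": 100,
        "OrderDate": 8,
        "Amount": 8,
        "Notes": 2000,      # wide text fields dominate the per-row cost
    }

    def estimate_cache_mb(row_count, sizes):
        # Rough in-memory footprint: rows x bytes per row, ignoring overhead and cached views
        bytes_per_row = sum(sizes.values())
        return row_count * bytes_per_row / (1024 * 1024)

    print(round(estimate_cache_mb(500_000, field_sizes)))  # roughly 1009 MB for 500,000 rows

An estimate like this also makes it clear why setting the default parameters to return fewer rows (strategy 2) is often the quickest win.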