Caching large JSON documents as string

I have many large, complex document records used for level/menu descriptions: 50-500 KB each, with 1-10,000 properties per document. I intend to use the Spark memory cache for the most frequently accessed items. But is there any performance/load benefit to adding a separate intermediary 'Cache' collection, where each complex document is saved as a serialized string against a key?
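For what it's worth, the intermediary 'Cache' collection I have in mind would work roughly like this (a generic Python sketch of the idea, not Spark's actual API; `StringCache` and the key names are made up for illustration):

```python
import json

# Sketch of the intermediary 'Cache' collection idea: each complex
# document is serialized once and stored as a plain string against a
# key, so later reads pay only one lookup plus one deserialization
# instead of assembling thousands of properties.
class StringCache:
    def __init__(self):
        self._store = {}  # key -> JSON string

    def put(self, key, document):
        # Serialize the whole document up front.
        self._store[key] = json.dumps(document)

    def get(self, key):
        raw = self._store.get(key)
        return None if raw is None else json.loads(raw)


cache = StringCache()
cache.put("menu:main", {"title": "Main", "items": ["a", "b"]})
doc = cache.get("menu:main")
```

The question is whether flattening each document to a single string like this actually reduces load compared with letting the memory cache hold the structured record directly.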

In addition, I cannot find any reference to limits on the Spark cache. Are there limits on the number of entries, total memory size, etc.? And how does the system respond when those limits are exceeded?

Thank you in advance


