I have run into a rather urgent, show-stopping issue with the REST API and need advice on how to resolve it. We use the REST API to enter a large number of records (12 million) into a Data type. First we get the X-GS-JWT token, then we do a batch of PUT requests to populate the data. However, with the rate limit of 300 requests per minute (a 200ms delay between requests), it would take almost 30 days to populate all the data. This also means we have to stop and restart the batch job every hour to generate a new X-GS-JWT key, otherwise the PUT requests start failing, which is unmaintainable given the scope of the task. Can we please get some help? The project deadline is in 4 days and we somehow need to get this data entered.
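For reference, the arithmetic behind the 30-day estimate, using only the figures above:

```python
# Sanity-check the time estimate: 12 million PUTs at 300 requests/minute.
TOTAL_RECORDS = 12_000_000
REQUESTS_PER_MINUTE = 300  # rate limit, i.e. one request every 200 ms

minutes = TOTAL_RECORDS / REQUESTS_PER_MINUTE   # 40,000 minutes
days = minutes / (60 * 24)                      # ~27.8 days

print(f"{minutes:.0f} minutes, roughly {days:.1f} days")
```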
Some advice from someone who has had to steer around things like this.
1. Use a GS callback instead of a single data API PUT. With a callback you can send up to 200K of data in one request.
2. Get a new X-GS-JWT token for each request. It's fast and you won't have to worry about it going bad on you.
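A rough sketch of what the batching side might look like. Only the 200K-per-request limit comes from the advice above; the `{"codes": [...]}` payload shape and the send step are placeholders, so check the docs for your instance before relying on either:

```python
import json

MAX_BYTES = 200_000  # per-callback payload limit mentioned above

def chunk_payloads(codes, max_bytes=MAX_BYTES):
    """Group codes into batches whose JSON encoding stays under max_bytes."""
    overhead = len(json.dumps({"codes": []}).encode())  # fixed envelope cost
    chunks, current, size = [], [], overhead
    for code in codes:
        item = len(json.dumps(code).encode()) + 2  # entry plus ", " separator
        if current and size + item > max_bytes:
            chunks.append({"codes": current})
            current, size = [], overhead
        current.append(code)
        size += item
    if current:
        chunks.append({"codes": current})
    return chunks

# Sending each chunk would then be one POST to the callback URL, e.g.
# (URL and client are placeholders, not the real endpoint):
#   requests.post(CALLBACK_URL, json=chunk)

payloads = chunk_payloads([f"CODE-{i:08d}" for i in range(30_000)])
print(len(payloads), "callback requests cover all 30,000 codes")
```

The byte accounting slightly overestimates each batch (the last entry has no separator), which keeps every payload safely under the limit.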
Thanks a lot for the fast reply, this seems exactly like what we need. Would you please be so kind as to point me in the right direction as to how to create the requests with a GS callback? Right now I am firing the PUT requests using Postman.
You don't even need an X-GS-JWT token for callbacks either - that's another benefit!
That's awesome to hear. Unfortunately I am having some trouble finding any info regarding these callbacks and how I can use them to publish the data I need. Is this part of the REST API or a separate functionality?
I typed too soon, just saw your second response. Thanks, I'll take a look through these.
We rate limit to protect database infrastructure. Circumventing these limits will result in unstable behavior of your database service.
Hammering the DB with inserts will effectively cripple the service for your instance, resulting in timeouts. I strongly advise you not to attempt this.
Best Regards, Patrick.
I believe you misunderstand - this is not something that I need at runtime. I just have to pre-insert this data before the game ships, for the game to consume at a later stage. The project I am working on uses physical codes that players can purchase and enter in the game to gain in-game currency. I have a fixed number of these codes (12,100,000) and I need to have them in the GS database to check against what the player enters.
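If the batched-callback route works out, the back-of-envelope math is encouraging. Assuming, purely for illustration, around 20 bytes per code once JSON-encoded (the real size depends on your code format):

```python
TOTAL_CODES = 12_100_000
BYTES_PER_CODE = 20        # illustrative assumption, not measured
MAX_PAYLOAD = 200_000      # per-callback limit from the earlier reply

codes_per_request = MAX_PAYLOAD // BYTES_PER_CODE        # 10,000 codes/request
requests_needed = -(-TOTAL_CODES // codes_per_request)   # ceiling division

# Even at the original 300 requests/minute, 1,210 requests is a few
# minutes of work rather than nearly a month of individual PUTs.
print(f"{requests_needed} callback requests instead of {TOTAL_CODES} PUTs")
```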