
Too many instruction calls (>22000) in the System Scheduler "Every hour" - how to handle this?!

Hello all,


I have a question about how to handle an "expected behaviour" of Cloud Code.

Due to too many instruction calls (>22000) in the System Scheduler "Every hour", the Cloud Code script timed out and broke. For me that's a big problem, and I'm seeing it for the first time on the live stage - because there are more players than on the testing stage, I believe.

But how can I avoid the problem?

A short description of what I want to do:

Every hour the script iterates through all players, all creatures and all worlds - sometimes, but not every time, some actions take place, based on game logic. In this case "players" is the GameSparks internal collection, while "creatures" and "worlds" are custom data types.
I need to do this every hour. Of course I searched for a solution and came across bulk jobs, one of which I should be able to create in the "Every hour" scheduler. But these jobs are always tied to an array of players.

So my questions:

1. How can I execute a bulk job from "Every hour" on ALL players?
2. How can I execute my logic for creatures and worlds?

Each hour the changes made by the game ("Every hour") and by the players have to be tracked and saved as "stats", so I have to do it this way.

Or maybe you recommend another way of achieving these hourly stats and calculations which I might not be seeing?


Looking forward to reading from you guys!

Best regards,
Sebastian Engfer


You are simply never going to be able to use this platform (and few others would let you either) to iterate over your entire player base (and creatures/worlds) and apply logic to each of them every hour. You will eventually get to the point where it takes LONGER than an hour to execute this logic - and you will never catch up!

Bulk jobs are not a panacea - and are really not designed to handle this sort of load in addition to the regular player initiated gameplay.

Why do you think you need to manipulate your data every hour?   What is the specific use case?
If you are really hell-bent on this approach, you can manipulate the data outside of Cloud Code by using the GameSparks RESTful API to read/write the GS data from the outside (some code you write and execute on your own servers).
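If you do run this from your own servers, the main win is that you can page through the player base in chunks instead of doing one giant pass, so no single run hits a timeout. Here is a minimal sketch of that pattern; `fetchPlayerPage` is a hypothetical stand-in for whatever read API you use (stubbed here with in-memory data), and `applyHourlyLogic` is your own per-player game logic:

```javascript
// Process the whole player base page by page. Each page is small, so a
// failure or restart only loses one chunk, not the entire hourly run.
async function processAllPlayers(fetchPlayerPage, applyHourlyLogic, pageSize) {
  let offset = 0;
  let processed = 0;
  for (;;) {
    const page = await fetchPlayerPage(offset, pageSize);
    if (page.length === 0) break;        // no more players
    for (const player of page) {
      applyHourlyLogic(player);          // your per-player game logic
    }
    processed += page.length;
    offset += pageSize;
  }
  return processed;
}

// --- stub data source standing in for the real API ---
const players = Array.from({ length: 7 }, (_, i) => ({ id: i, gold: 0 }));
const fetchPlayerPage = async (offset, limit) =>
  players.slice(offset, offset + limit);

processAllPlayers(fetchPlayerPage, p => { p.gold += 1; }, 3)
  .then(n => console.log(n, players[6].gold)); // → 7 1
```

You could persist `offset` between runs as well, so a run that gets cut off resumes where it stopped instead of starting over.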

best of luck!



Yes, you can use the GameSparks node JS SDK for this sort of heavy lifting.

https://www.npmjs.com/package/gamesparks-node

You can send the results back via a callback, or connect using a username and password.
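For context, a worker using that package might look roughly like the sketch below. The call names (`initPreviewListener`, `sendAs`) and their signatures are taken from my recollection of the package README, so verify them against the npm page before relying on this; the event short code `"HOURLY_STATS"` and the player id are hypothetical.

```javascript
// Building the request payload is the part we can pin down locally;
// any extra event attributes would sit alongside eventKey.
function buildStatsEvent(eventKey) {
  return { eventKey: eventKey };
}

function runWorker() {
  // npm install gamesparks-node
  const gamesparks = require('gamesparks-node');

  // Assumed signature from the package README:
  // initPreviewListener(apiKey, serverSecret, numConnections, onMessage, onInit)
  gamesparks.initPreviewListener(
    process.env.GS_API_KEY,        // credentials kept out of the source
    process.env.GS_SERVER_SECRET,
    10,                            // number of concurrent connections
    function (error, message) { /* messages pushed from the platform */ },
    function () {
      // Connected: fire a LogEventRequest on behalf of one player.
      gamesparks.sendAs(
        'hypothetical-player-id',
        '.LogEventRequest',
        buildStatsEvent('HOURLY_STATS'),
        function (error, response) { console.log(error || response); }
      );
    }
  );
}

// Only attempt a real connection when credentials are provided.
if (process.env.GS_API_KEY) runWorker();
```

The point is simply that the heavy iteration happens on your own box, with the SDK acting as the transport back into GameSparks.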

You should never manipulate all players' data through a batch. This is very bad design.

Most of the time, you simply need to update when the user connects, by saving the last update time and computing the time difference.

On some rare occasions, you can update when another player "visits" elements that belong to a player.
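The "compute on connect" idea above can be sketched in a few lines: store a `lastUpdate` timestamp on the player, and when they connect, grant however many whole hours have elapsed. The field names (`energy`, `regenPerHour`) are illustrative, not GameSparks API:

```javascript
const HOUR_MS = 60 * 60 * 1000;

// Apply everything the hourly job *would* have done since the player's
// last update, in one shot, when the player actually shows up.
function applyOfflineProgress(player, now) {
  const elapsedHours = Math.floor((now - player.lastUpdate) / HOUR_MS);
  if (elapsedHours <= 0) return player;          // less than an hour: nothing to do
  player.energy += elapsedHours * player.regenPerHour;
  player.lastUpdate += elapsedHours * HOUR_MS;   // keep the remainder for next time
  return player;
}

const p = { energy: 10, regenPerHour: 5, lastUpdate: 0 };
applyOfflineProgress(p, 3.5 * HOUR_MS);
console.log(p.energy, p.lastUpdate); // → 25 10800000
```

Note that `lastUpdate` only advances by whole hours, so the leftover half hour is credited on the next connect instead of being silently lost.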


If that is not sufficient, I believe you can schedule a job to run for a specific action. For example, if a building finishes in 8 hours, you register a job to update the player's data in 8 hours. But I would only recommend that as a last resort.
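As a toy illustration of that "one job per completion time" idea, here is an in-memory queue: register a callback when the building starts, and on each tick run only the jobs that are actually due. In GameSparks Cloud Code the equivalent would be scheduling a module via the platform's scheduler; treat that mapping as an assumption and check the docs for the exact API.

```javascript
// Minimal job queue: jobs carry a due time; tick(now) fires only what is due.
class JobQueue {
  constructor() { this.jobs = []; }
  inSeconds(delay, now, fn) {                 // register fn to run at now + delay
    this.jobs.push({ due: now + delay, fn });
  }
  tick(now) {                                 // run everything whose time has come
    const due = this.jobs.filter(j => j.due <= now);
    this.jobs = this.jobs.filter(j => j.due > now);
    due.forEach(j => j.fn());
    return due.length;                        // how many jobs fired this tick
  }
}

const queue = new JobQueue();
const player = { buildings: [] };
// Building finishes in 8 hours (28800 s): schedule exactly one update for it.
queue.inSeconds(28800, 0, () => player.buildings.push('barracks'));

queue.tick(3600);   // after 1 hour: nothing due yet
queue.tick(28800);  // at the 8-hour mark the single job fires
console.log(player.buildings); // → [ 'barracks' ]
```

The contrast with the hourly sweep is the point: work is proportional to the number of events that actually complete, not to the size of the player base.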
