Answered

Accessing uploaded data

Hi,


I am trying to find a way to browse the data uploaded through GetUploadUrlRequest.

Players will be uploading pictures, and I understand that I can just maintain a list of the upload ids against each player. But right now I am still in development and I would like to access / wipe this data.

Is there any way to do so? For example, to retrieve the list of all upload ids so far, or to browse the folder where they are stored.



Hi Stephane,


You can create a custom collection for uploaded files by doing the following. Navigate to Cloud Code > User Messages > UploadCompleteMessage and paste in the code below.


 

//this will grab the uploadData that we need from the message
var dataToGrab = Spark.getData().uploadData;

//load the collection
var uploadedFiles = Spark.runtimeCollection("uploadedFiles");

//insert the info into our collection
var success = uploadedFiles.insert({"uploadedData": dataToGrab});


Now when you upload a file you can see it by navigating to the NoSql explorer and looking at the "uploadedFiles" collection. You could then query this by playerId to see all of a particular player's uploads.
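For instance, assuming the uploadData payload carries the uploading player's id under a playerId field (a guess at the message shape — check a real document in the NoSql explorer first), a query along these lines in the explorer would list one player's uploads:

```
{"uploadedData.playerId": "<playerId>"}
```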


Next, create an event called "deleteUpload" and give it a single string attribute called "uploadId", set to "Used In Script". In the cloud code for this event, place the following.


//load the collection
var uploadedFiles = Spark.runtimeCollection("uploadedFiles");

//get the uploadId that we passed in through the event
var uploadId = Spark.getData().uploadId;

//delete the file from the system
Spark.getFiles().deleteUploadedFile(uploadId);

//remove the file from our custom collection
var fileToRemove = uploadedFiles.remove({"uploadedData.uploadId": uploadId});

//display what file has been removed
Spark.setScriptData("File Removed", uploadId);



Spark.getFiles().deleteUploadedFile(uploadId); is the important piece of code here, as it is what deletes the file from the system; you can read more about it here. You can customize the above event to suit your needs. For example, you could use "distinct" on the collection to get an array of all the unique uploadIds, then process through them as you want, e.g. deleting them all.


 

var uploadIds = uploadedFiles.distinct("uploadedData.uploadId");
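To make that bulk-delete idea concrete, here is a sketch of the loop. In real Cloud Code, uploadedFiles would come from Spark.runtimeCollection("uploadedFiles") and the delete call would be Spark.getFiles().deleteUploadedFile(uploadId); minimal stand-ins are used below so the logic can be tried outside GameSparks.

```javascript
// Stand-in for the runtime collection (real code: Spark.runtimeCollection("uploadedFiles")).
var uploadedFiles = {
  docs: [
    { uploadedData: { uploadId: "a1" } },
    { uploadedData: { uploadId: "a1" } }, // duplicate record for the same file
    { uploadedData: { uploadId: "b2" } }
  ],
  // stand-in for the collection's distinct("uploadedData.uploadId")
  distinct: function (path) {
    var seen = {};
    this.docs.forEach(function (d) { seen[d.uploadedData.uploadId] = true; });
    return Object.keys(seen);
  },
  // stand-in for the collection's remove(query)
  remove: function (query) {
    var id = query["uploadedData.uploadId"];
    this.docs = this.docs.filter(function (d) {
      return d.uploadedData.uploadId !== id;
    });
  }
};

// Records what the stand-in delete call was asked to remove
// (real code: Spark.getFiles().deleteUploadedFile(uploadId)).
var deleted = [];
function deleteUploadedFile(uploadId) {
  deleted.push(uploadId);
}

// The pattern itself: fetch every unique uploadId, delete each file,
// then drop its tracking record from the custom collection.
var uploadIds = uploadedFiles.distinct("uploadedData.uploadId");
uploadIds.forEach(function (uploadId) {
  deleteUploadedFile(uploadId);
  uploadedFiles.remove({ "uploadedData.uploadId": uploadId });
});
```

You could drop this straight into a "wipeUploads" style event while developing, swapping the stand-ins for the real Spark calls.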


If you have any more questions just let me know.


Thanks,

Liam 

Hi Liam,


Thank you for all the snippets, that will help. However, I was wondering: is there any way to browse/delete the files that I have already uploaded, but for which I did not keep the uploadId?

Answer

Hi Stephane,


I'm afraid you can't directly view what's been uploaded already, but you can get the uploadIds from the NoSql explorer. If you navigate there and select the playerMessage collection, you can use the query {"messageClass": ".UploadCompleteMessage"} to return all the UploadCompleteMessages, get the uploadIds from the results, and delete as required with "Spark.getFiles().deleteUploadedFile(uploadId);". Alternatively, with a Multi Stage Publishing setup in place, if you copy a snapshot from one stage to another, all the runtime and system collections are cleared (including the uploadedFiles system one), leaving you with a fresh data set to work from; you can read more about this setup here.
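To illustrate the extraction step, here is a small sketch of pulling the uploadIds out of documents shaped like those query results. The exact field layout (uploadData.uploadId) is an assumption — check a real document in the NoSql explorer before relying on it.

```javascript
// Sample documents shaped like the UploadCompleteMessage entries returned
// by the {"messageClass": ".UploadCompleteMessage"} query (field layout is
// an assumption for illustration).
var messages = [
  { messageClass: ".UploadCompleteMessage", uploadData: { uploadId: "a1" } },
  { messageClass: ".UploadCompleteMessage", uploadData: { uploadId: "b2" } }
];

// Pull out every uploadId; each can then be removed in Cloud Code with
// Spark.getFiles().deleteUploadedFile(uploadId).
var uploadIds = messages.map(function (m) {
  return m.uploadData.uploadId;
});
```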


Thanks,

Liam

Perfect. Thanks :-)
