I have cried blood finally getting this script to work (I'm new to Google Apps Script and JSON, and only relatively familiar with JavaScript). It works great with a small data set, but as soon as I jump to the large 20,000-item data set, it times out. I can't think of another way to make this more efficient. These prices update every 3 minutes, so I only want to cache them for 10 minutes. I feel like ScriptDb is not appropriate for this. Essentially, I had to store the information in the cache because I had ~500 formulas calling a simple, non-cached version of testing(), and Google/the API didn't allow that many calls. I figured one call and ~500 reads from the cache would be better. Is there some way to make it more efficient that I'm not thinking of?
This is run every 10 minutes to store the information. This piece times out.
function updateCache() {
  var cache = CacheService.getPublicCache();
  var myUrl = "http://www.gw2spidy.com/api/v0.9/json/all-items/all";

  // One API call for the full item list.
  var jsonData = UrlFetchApp.fetch(myUrl);
  var jsonArray = JSON.parse(jsonData).results;

  // Cache each item's sell price for 1500 seconds (25 minutes).
  for (var i = 0; i < jsonArray.length; i++) {
    cache.put(jsonArray[i].data_id + 'sell', jsonArray[i].min_sale_unit_price, 1500);
  }
}
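The only alternative I've come up with so far is to batch the writes with Cache.putAll instead of calling put once per item, something like the sketch below. I haven't actually tested it against the full 20,000-item set, and the chunk size of 1,000 is just a guess on my part, so I don't know whether it really beats the timeout:

function updateCacheBatched() {
  // Untested sketch: group the writes and flush them with putAll in chunks,
  // instead of 20,000 individual cache.put calls. The 1,000-entry chunk size
  // is an assumption, not a limit I have verified.
  var cache = CacheService.getPublicCache();
  var myUrl = "http://www.gw2spidy.com/api/v0.9/json/all-items/all";
  var jsonArray = JSON.parse(UrlFetchApp.fetch(myUrl)).results;

  var batch = {};
  var count = 0;
  for (var i = 0; i < jsonArray.length; i++) {
    batch[jsonArray[i].data_id + 'sell'] = String(jsonArray[i].min_sale_unit_price);
    count++;
    if (count === 1000) {
      cache.putAll(batch, 1500);  // flush every 1,000 entries
      batch = {};
      count = 0;
    }
  }
  if (count > 0) {
    cache.putAll(batch, 1500);  // flush the remainder
  }
}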
This is the function that uses the cache and returns the data.
function testing(itemID) {
  // Read a single item's cached sell price; called from each spreadsheet formula.
  var cache = CacheService.getPublicCache();
  return cache.get(itemID + 'sell');
}
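For context, each of the ~500 spreadsheet formulas calls it roughly like this (the cell reference is just an example; A2 would hold an item's data_id):

=testing(A2)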