Faster imports via shared CDN service
The current way of doing imports is not ideal. For very small datasets it isn't a problem, but 1+ years' worth of data
quickly becomes one.

So...

1. Route all 'imports' via a cache (a CDN or other server/service, such as GitHub)
2. If the cache has data available for date X, get that period from the cache (else get it from the exchange API)
3. Have an option to upload the data on completion (thus making the data available for others)
--
4. Don't store the data as raw db's; store it as e.g. CSV files so that these can be gzipped.
5. After import, convert the gzipped files to db's.
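The flow above could be sketched like this. Python is used as a stand-in (Gekko itself is Node.js), a local directory stands in for the CDN, and the file naming, the `fetch_from_exchange` callback, and the candle column layout are all hypothetical:

```python
import csv
import gzip
import os


def fetch_candles(cache_dir, exchange, pair, date, fetch_from_exchange):
    """Cache-first import (steps 1-3): look for this day's candles in the
    shared cache; on a miss, fall back to the exchange API and write the
    result back so it becomes available for others (step 3).

    Data is kept as gzipped CSV rather than a raw db (step 4)."""
    path = os.path.join(cache_dir, f"{exchange}_{pair}_{date}.csv.gz")
    if os.path.exists(path):
        # Cache hit: decompress and parse the gzipped CSV.
        with gzip.open(path, "rt", newline="") as f:
            return [row for row in csv.reader(f)]
    # Cache miss: hit the exchange API instead (step 2 fallback).
    candles = fetch_from_exchange(exchange, pair, date)
    # Populate the cache on completion (step 3).
    with gzip.open(path, "wt", newline="") as f:
        csv.writer(f).writerows(candles)
    return candles
```

The second import of the same date then never touches the exchange API, which is where the speedup for 1+ years of data would come from.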

So the problem...
Most exchanges have a TOS that prevents the data from being shared or made easily available.
Thus the data would need to be obfuscated so that it only works (easily) via Gekko.
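One way the obfuscation could look, as a minimal sketch: XOR the gzipped bytes with a key baked into the importer, so the cached files aren't directly usable as plain archives outside Gekko. The key and function names are made up for illustration; this is obfuscation, not encryption, and only raises the bar for casual reuse:

```python
import gzip
from itertools import cycle

# Hypothetical shared key baked into the importer.
KEY = b"gekko"


def obfuscate(csv_text: str) -> bytes:
    """Gzip the CSV, then XOR the bytes with KEY so the blob is not a
    valid gzip file on its own."""
    packed = gzip.compress(csv_text.encode("utf-8"))
    return bytes(b ^ k for b, k in zip(packed, cycle(KEY)))


def deobfuscate(blob: bytes) -> str:
    """XOR is symmetric, so applying the key again restores the gzip
    stream, which can then be decompressed back to CSV."""
    packed = bytes(b ^ k for b, k in zip(blob, cycle(KEY)))
    return gzip.decompress(packed).decode("utf-8")
```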
by tommiehansen - 02-13-2018, 10:28 AM