Complete Gekko's datasets - ready files to download and use - Printable Version

+- Gekko Forum (https://forum.gekko.wizb.it)
+-- Forum: Gekko (https://forum.gekko.wizb.it/forum-13.html)
+--- Forum: General Discussion (https://forum.gekko.wizb.it/forum-14.html)
+--- Thread: Complete Gekko's datasets - ready files to download and use (/thread-56661.html)
Complete Gekko's datasets - ready files to download and use - xFFFFF - 04-04-2018

Ready-to-use Gekko SQLite dump files, with no need to import via Gekko's importer and the exchange APIs. Just copy a file into the history directory and you have, for example, the full history of the Binance exchange. The files are updated daily after 23:15 GMT.

Included:
- Binance - FULL history
- Poloniex - all pairs - in progress
- Kraken - all pairs - in progress

Download links and all up-to-date information are on GitHub: https://github.com/xFFFFF/Gekko-Datasets

Soon:
- Kraken all pairs
- Bitfinex all pairs
- Poloniex all pairs
- Gdax all pairs

How do I import pairs from Coinfalcon, BTCC and Bitstamp?

RE: Complete Gekko's datasets - ready to download and use - ankasem - 04-04-2018

Thank you very much. Until now more than half of the strategies could not be tested, and the rest we could not reach.

RE: Complete Gekko's datasets - ready to download and use - Gryphon - 04-05-2018

Awesome, thank you!

RE: Complete Gekko's datasets - ready files to download and use - ankasem - 04-05-2018

Thank you.
------------------------------------------
binance-btc  = binance_0.1.db
binance-usdt = binance_01.db

The file names are the same - how do I put all of them in the history folder?

Thank you.

RE: Complete Gekko's datasets - ready files to download and use - xFFFFF - 04-05-2018

You can merge the files in an external app. This one supports merging databases: https://www.codeproject.com/Articles/220018/SQLite-Compare-Utility

Here is a terminal solution: http://sqlite.1065341.n5.nabble.com/How-do-you-combine-two-SQLite-databases-td19362.html

A database that is too large will decrease Gekko UI performance on every database scan. In my opinion the best solution is to use the Gekko CLI and, for a separate binance-bnb database, create a new folder history_bnb and set something like this in config.js:

Code:
config.adapter = 'sqlite';
config.sqlite.dataDirectory = 'history_bnb'; // point the adapter at the separate folder

RE: Complete Gekko's datasets - ready files to download and use - User1337 - 04-06-2018

(04-04-2018, 10:14 PM)xFFFFF Wrote: Ready to use Gekko's SQLite dumps files.
Without importing via Gekko import and exchange APIs.

Great work, bro!

RE: Complete Gekko's datasets - ready files to download and use - tommiehansen - 04-06-2018

Great! You could use my CLI tools (or copy them) to upload, and users could use the download part to fetch the data easily (and then simply update it via git pull, pulling your latest data by just doing git pull && . download.sh).

One of the benefits is that it creates a single tar.gz archive and automatically uploads it to transfer.sh, plus updates a simple txt file with a link to the latest upload. The limit on transfer.sh is 10 GB per file, but the compressed files usually take up 30-40% of the original size, so it should be good for at least 20 GB of data (if not, one could easily split this into multiple archives).

1. $ git clone https://github.com/tommiehansen/gekko_tools
2. Go to the cli directory within the gekko_tools folder
3. Copy the config file: $ cp config.sample.sh config.sh
4. Edit config.sh and set your Gekko directory if it differs (it defaults to two levels up from the cli directory, so ../../gekko/)
5. Run $ . upload.sh

RE: Complete Gekko's datasets - ready files to download and use - xFFFFF - 04-06-2018

I don't understand. What is the difference between my solution and yours? Did you see my script? https://github.com/xFFFFF/Gekko-Datasets/blob/master/datasets.sh

RE: Complete Gekko's datasets - ready files to download and use - tommiehansen - 04-06-2018

The difference in the upload part is that it becomes a single file, which is much easier to download than X number of different files, and it doesn't use Google Drive and doesn't need Perl either. Also, users could just clone the git repository and do $ . download.sh to refresh the dataset, uncompress it and add it to their Gekko directory, all with one single command. So basically what I already wrote.
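The single-archive idea described above can be sketched in a few lines of Python. This is a hedged illustration only, not the actual gekko_tools scripts: the function name, the flat `*.db` layout of the history directory, and the archive name are assumptions.

```python
# Hedged sketch: bundle every SQLite database in a Gekko-style history
# directory into one tar.gz file, so users download a single archive
# instead of many separate pair databases. Names here are illustrative
# assumptions, not the real gekko_tools implementation.
import tarfile
from pathlib import Path


def pack_history(history_dir: str, archive: str = "datasets.tar.gz") -> int:
    """Pack all *.db files under history_dir into one tar.gz archive.

    Returns the number of databases packed.
    """
    dbs = sorted(Path(history_dir).glob("*.db"))
    with tarfile.open(archive, "w:gz") as tar:
        for db in dbs:
            tar.add(db, arcname=db.name)  # store flat, without parent dirs
    return len(dbs)
```

On the download side, users would then fetch and unpack this one file (e.g. `tarfile.open("datasets.tar.gz").extractall("history")`) rather than downloading each pair database separately, which is the benefit tommiehansen describes.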
RE: Complete Gekko's datasets - ready files to download and use - xFFFFF - 04-07-2018

Now the user can choose which pairs he wants to download; he does not have to download 4 GB of data. I did that on purpose. I do not see the point in writing a download script, because the user can do it with one command:

wget -O binance.zip https://drive.google.com/file/d/18izvqp3-ncOQAAE8cHAImtXmUTJuQLKv/view?usp=sharing && unzip binance.zip && cp *.db gekko/history

Update: GDAX EUR pairs added to Google Drive.
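The database-merge question from earlier in the thread (two downloads that want to live in the same history folder) can also be handled with SQLite's ATTACH, using only Python's standard library. This is a hedged sketch: Gekko keeps one candle table per pair, but the exact table names and schemas are assumptions, so the code simply copies every user table it finds and does not deduplicate overlapping rows.

```python
# Hedged sketch: append every table from one Gekko history database into
# another via SQLite's ATTACH. Table names/schemas are assumptions about
# Gekko's files; rows present in both databases are not deduplicated.
import sqlite3


def merge_history(dest_path: str, src_path: str) -> None:
    """Copy every user table from src_path into the database at dest_path."""
    con = sqlite3.connect(dest_path)
    con.execute("ATTACH DATABASE ? AS src", (src_path,))
    tables = [row[0] for row in con.execute(
        "SELECT name FROM src.sqlite_master WHERE type = 'table'")
        if not row[0].startswith("sqlite_")]  # skip SQLite-internal tables
    for table in tables:
        # Create the table in the destination if missing, then copy the rows.
        con.execute(f'CREATE TABLE IF NOT EXISTS "{table}" AS '
                    f'SELECT * FROM src."{table}" WHERE 0')
        con.execute(f'INSERT INTO "{table}" SELECT * FROM src."{table}"')
    con.commit()
    con.execute("DETACH DATABASE src")
    con.close()
```

For example, `merge_history("history/binance_0.1.db", "binance_usdt_download.db")` would fold a second download into the existing file, at the cost of the larger database (and slower UI scans) that xFFFFF warns about above.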