I have full Binance history in datasets, but no webserver to share it
I had this problem since I enabled multi-server mode for my GAB-tool and needed a way to sync data across servers that are behind a lot of security measures etc.

So I created two bash scripts for this purpose, which can be found at:
https://github.com/tommiehansen/gekko_tools

Basically these two make it possible to simply git pull gekko_tools from any server and sync all my strategies/toml files/configs/history between servers that might not allow a simpler rsync method (Gekko instances behind firewalls, inside Docker containers or whatever).

sync.sh
1. Compresses (tar.gz) all user-related data for Gekko (you'll have to edit the file and set *YOUR* relative path to *YOUR* Gekko folder)
2. Uploads this to transfer.sh (the limit is 10 GB per file, which should be enough since compression also reduces file size by ~50%, meaning you could basically have 19.8 GB of data in Gekko)
3. Outputs the link for you to share/do whatever with
4. Saves the last upload-link in sync/last.txt (for later use)
5. Removes the compressed file

It compresses these directories and files:
gekko/strategies
gekko/config
gekko/web/vue/UIconfig.js

...and this dir if the user presses 'y' (for yes):
gekko/history
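
The steps above could be sketched roughly like this, as shell functions. The function names, argument order and archive layout here are my own assumptions for illustration; the actual sync.sh in the gekko_tools repo is the reference.

```shell
#!/usr/bin/env bash
# Hypothetical sketch of sync.sh's core steps; function names and
# arguments are assumptions, not the actual script.
set -euo pipefail

# Step 1: tar.gz the user-related data, relative to YOUR Gekko folder.
# Pass "y" as the third argument to also include history/.
compress_user_data() {
  local gekko_dir="$1" archive="$2" with_history="${3:-n}"
  local items=(strategies config web/vue/UIconfig.js)
  [ "$with_history" = "y" ] && items+=(history)
  tar -czf "$archive" -C "$gekko_dir" "${items[@]}"
}

# Steps 2-4: upload to transfer.sh, print the share link and save it
# to sync/last.txt for later use by get_replace.sh.
upload_and_record() {
  local archive="$1" url
  url=$(curl -fsS --upload-file "$archive" "https://transfer.sh/$(basename "$archive")")
  echo "$url"
  mkdir -p sync
  echo "$url" > sync/last.txt
}
```

Step 5 is then just removing the archive, e.g. `compress_user_data ../gekko gekko_sync.tar.gz y && upload_and_record gekko_sync.tar.gz && rm gekko_sync.tar.gz`.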

This would then have to be committed somewhere so that last.txt can be synced on the 2nd/3rd/4th server etc (you could simply create a fork and git pull that all the time). The core logic is there anyway, so it should be very simple to modify this to do whatever one would want.
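
One way the commit-and-pull round trip could look, assuming a fork with the sync/ folder checked in (the helper name and commit message are made up):

```shell
#!/usr/bin/env bash
# Hypothetical helper for publishing the refreshed sync/last.txt to a
# fork so other servers can pull it; names here are made up.
set -euo pipefail

publish_link() {
  git add sync/last.txt
  git commit -m "sync: update transfer.sh upload link"
  git push -q origin HEAD
}
```

The other servers then just `git pull` that fork before running get_replace.sh.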

get_replace.sh
1. Gets the last url from sync/last.txt
2. Downloads the file
3. Temporarily uncompresses the data into gekko/ (under the same folder -- yes)
4. Uses rsync to overwrite ../gekko/ with the files in the temporarily created gekko folder (you will again need to set *YOUR* path to *YOUR* Gekko folder)
5. Removes the temporary Gekko-folder
6. Removes the compressed tar.gz file.
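
The restore side could be sketched like this; paths and function names are my assumptions, and the cp fallback for systems without rsync is my own addition, not part of the original script.

```shell
#!/usr/bin/env bash
# Hypothetical sketch of get_replace.sh; the real script in gekko_tools
# is the reference. The cp fallback is an assumption of mine.
set -euo pipefail

# Steps 1-2: read the last upload link and download the archive.
fetch_archive() {
  local archive="$1"
  curl -fsSL -o "$archive" "$(cat sync/last.txt)"
}

# Steps 3-6: unpack into a temporary folder, overwrite the real Gekko
# folder with its contents, then remove the temp folder and the archive.
restore_into() {
  local archive="$1" gekko_dir="$2" tmp
  tmp=$(mktemp -d)
  tar -xzf "$archive" -C "$tmp"
  if command -v rsync >/dev/null 2>&1; then
    rsync -a "$tmp/" "$gekko_dir/"
  else
    cp -a "$tmp/." "$gekko_dir/"
  fi
  rm -rf "$tmp"
  rm -f "$archive"
}
```

Typical use would be something like `fetch_archive gekko_sync.tar.gz && restore_into gekko_sync.tar.gz ../gekko`.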

--

Anyway -- this is just one way to do it.
As said, these bash scripts could easily be modified to do other things, or improved to work better for more users etc.
But by using them we can share history data quite easily.
Anyone could download my history data with the current scripts if they want (but they would also overwrite some of their own settings with mine if they just run get_replace.sh ... so be careful).

--

The beauty of creating bash scripts is that they are "write once, run all the time". Doing all this manually would of course be quite tedious; it's a lot simpler to just run something like sync.sh and be done with it.
RE: I have full Binance history in datasets, but haven't webserver to share - by tommiehansen - 04-03-2018, 01:47 PM
