[Gekko development] Guaranteeing event order in the gekko stream
This is a technical post describing the solution to an event-ordering problem in the architecture of a Gekko stream: the FIFO event emitter.

Gekko Streams have been in Gekko for over a year now. They represent a flexible collection of "plugins" that can all interact with each other using "events". The main idea is that each plugin is a box that consumes events and produces its own events. If you, for example, set up a "Gekko Paper trader", under the hood a number of plugins are created and glued together by Gekko. These plugins are (in the current 0.5.x version):

- Trading advisor: takes candle data, feeds it into a strategy and emits new advice events.
- Paper trader: takes advice events (from the trading advisor), calculates simulated roundtrips and emits new trade events.
- Performance analyzer: takes trade events and calculates overall statistics over them (profit, risk, etc.). This plugin emits roundtrip events.

The beauty of keeping all the above plugins separate is that they are instantly swappable: the only difference between a paper trader and a real trader is the second plugin in the above list being replaced with a plugin that actually trades at a live exchange. Note that the input (advice events) and the output (trade events) are the same, which means the other plugins don't care about what kind of trader is being run (paper trader or live trader), as long as it sends the same events in the same format. The plugins essentially follow an event-based interface model.

The problem

During a backtest Gekko will compose the same set of plugins and pump market data (from a database) into them as fast as they can process it. The problem has to do with a main event called the "candle event", which represents an updated state of the market. This event is used by a lot of different plugins, for example by all three plugins in the list above. In response to the candle event the trading advisor can potentially broadcast an advice event, which will go into the paper trader, which will in turn emit a trade event. That event will go into the performance analyzer. Here is a visualization of what is happening:


[Image: 37642560-1f26e00a-2c50-11e8-8fc8-1c5ee00b941c.png]


It is extremely important that the paper trader only processes the advice event after it has fully processed the candle event, because it needs to simulate a trade based on the current price. The problem is guaranteeing that (which is ultimately determined by the order in which plugins are loaded, which in turn depends on a few different things).

How Gekko 0.5 solves it

The current Gekko (0.5.x and below) does not actually deal with this problem; the implemented workaround is to attach the candle the advice is based on to the advice event. The paper trader completely ignores the raw candle event and always relies on the candle that was attached to the advice event. This works great and is fast, but it's a very limited workaround. In the new event system coming in 0.6 we will also issue events whenever anything in the portfolio changes, which happens on every candle where the paper trader has a long exposure (assuming the price changed). This event is not possible with the current workaround.
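For illustration, a hypothetical shape of such an advice event (the field names are assumptions, not Gekko's exact schema):

```javascript
// Hypothetical shape of the 0.5 workaround: the candle the advice is based
// on travels along with the advice event, so the paper trader never needs
// the raw candle event at all. Field names are illustrative.
const adviceEvent = {
  recommendation: 'long',
  candle: { open: 99.5, high: 100.4, low: 99.1, close: 100.2 }
};

// the paper trader simulates the trade at the attached candle's price
const simulatedPrice = adviceEvent.candle.close;
console.log(simulatedPrice); // → 100.2
```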

In depth

In most scenarios (for example during a backtest) all plugins run as fully synchronous javascript code (in nodejs). This means that their complete execution will block everything else.

If we know that plugins will always execute everything synchronously, we could defer the emitting of events to the next tick of the nodejs event loop. However, doing so means we need to guarantee that the next candle can only come in after all events have been propagated, which can become harder to manage. Especially since you don't want to use setTimeout: it does not offer sub-millisecond precision (which might slow things down if you are queuing up a lot of orders), and there is the overhead of the node event loop. Keeping async plugins in mind, another downside is that it becomes tricky to parallelize multiple async plugins acting on the same event.

A more elegant solution is to turn the native event emitter from nodejs (which behaves like a last in, first out (LIFO) event queue for nested emits) into a FIFO (first in, first out) event queue. Here is a visualization of the native nodejs event emitter applied to a gekko stream:

[Image: 37645125-87a6f0d6-2c58-11e8-9e2e-3180c29a40a9.png]

Take a look at the red numbers, which represent the order in which events are processed: this would cause the paper trader to act on advice using outdated market data (since the new candle has not hit the paper trader yet). Here is the gekko event emitter, which forces all plugins to process an event (the candle event) before propagating the next one (the advice event):

[Image: 37645205-bb5b09e4-2c58-11e8-9d21-c8e7ce90b502.png]




The actual implementation consists of two parts:

1. Create a new class that allows plugins to emit an event without actually propagating it straight away, plus a function to propagate a pending event (see the code).
2. Inside the main loop of the gekko stream, flush all pending events (one by one, recursively) until they are completely processed. Only after this start working on the next candle (see the code).

---

This is just the beginning: having more guarantees in place allows us to make more optimizations without having to worry about nasty race conditions.