05-20-2018, 10:12 PM
I've written a simple .NET Core app that uses a genetic algorithm to tune the RSI BULL BEAR ADX strategy.
What I'm trying to work out is: when measuring the "fitness" of a given set of strategy parameters, is it better to pick the parameter values that simply produce the highest returns, or the ones with the highest Sharpe Ratio?
I started off using the Sharpe Ratio, as I assumed it would steer the tuning towards a mix of "reasonable risk" and "reasonable return", but I've found that in a lot of cases the parameter sets with the highest Sharpe Ratio produce pretty poor returns (whereas lower ratios can produce very good returns).
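For context, this is roughly how I understand the Sharpe calculation over the backtest's per-period returns (the zero risk-free rate and the annualisation factor are just assumptions on my part, so adjust for your candle size). I think it also explains what I'm seeing: because the ratio divides by volatility, a parameter set that barely trades can score very high with almost no profit.

using System;
using System.Collections.Generic;
using System.Linq;

static class Metrics
{
    // Rough sketch of a Sharpe Ratio over per-period (e.g. hourly candle) returns.
    // Assumes risk-free rate = 0; periodsPerYear = 8760 assumes hourly candles.
    public static double SharpeRatio(IReadOnlyList<double> periodReturns, double periodsPerYear = 8760)
    {
        if (periodReturns.Count < 2) return 0;

        double mean = periodReturns.Average();
        double variance = periodReturns.Sum(r => (r - mean) * (r - mean)) / (periodReturns.Count - 1);
        double stdDev = Math.Sqrt(variance);

        if (stdDev == 0) return 0;

        // Dividing by volatility is why a low-return, low-volatility
        // parameter set can still come out with a huge ratio.
        return (mean / stdDev) * Math.Sqrt(periodsPerYear);
    }
}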
Anyone else done some work/research on this?
(PS for anyone interested, I'm using the GeneticSharp NuGet package for the GA implementation)
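In case it helps anyone picture the setup, the fitness class is along these lines. This is only a sketch: the Backtest() method and its BacktestResult fields are placeholders for my own backtester (not part of GeneticSharp), and the namespaces are from the 2.x packages, so they may differ on other versions.

using System;
using GeneticSharp.Domain.Chromosomes;
using GeneticSharp.Domain.Fitnesses;

// Fitness class handed to the GeneticSharp GA. Each chromosome encodes one
// candidate set of strategy parameters.
public class StrategyFitness : IFitness
{
    public double Evaluate(IChromosome chromosome)
    {
        // Decode the gene values back into strategy parameters
        // (RSI length, ADX thresholds, etc.)
        var parameters = ((FloatingPointChromosome)chromosome).ToFloatingPoints();

        // Hypothetical call into my own backtester over historical candles.
        var result = Backtest(parameters);

        // This is the question: which of these makes the better fitness?
        return result.SharpeRatio;
        // return result.TotalReturn;
    }

    private BacktestResult Backtest(double[] parameters)
    {
        // ... run RSI BULL BEAR ADX over historical data and collect stats ...
        throw new NotImplementedException();
    }
}

// Placeholder result type for the backtester.
public class BacktestResult
{
    public double TotalReturn { get; set; }
    public double SharpeRatio { get; set; }
}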