The Cenderawasih Trading algorithm (Patreon)
Introduction
In this post I will show you the performance results I got from the Cenderawasih strategy. That trading algo was very popular on the Freqtrade Discord for a while. So let's get right to it.
Source of the file
There was a time when I was hoarding trading strategy code left and right from the Internet and from GitHub, and back then I did not write down where I got all these sources. Luckily the probable author left his or her name in the code, so all credits for this algo go to you.
I'm just the guy who tests strategies against his own specific benchmark and shows the results on YouTube.
The strategy code
Now, what about the trading algorithm itself?
At first sight, this strategy uses a weighted moving average, the Hull moving average and a rolling volume indicator, since these are part of the code.
```python
def tv_wma(df, length=9) -> DataFrame:
    """
    Source: Tradingview "Moving Average Weighted"
    Pinescript Author: Unknown
    Args :
        dataframe : Pandas Dataframe
        length : WMA length
        field : Field to use for the calculation
    Returns :
        dataframe : Pandas DataFrame with new columns 'tv_wma'
    """
    norm = 0
    sum = 0

    for i in range(1, length - 1):
        weight = (length - i) * length
        norm = norm + weight
        sum = sum + df.shift(i) * weight

    tv_wma = (sum / norm) if norm > 0 else 0
    return tv_wma

def tv_hma(dataframe, length=9) -> DataFrame:
    """
    Source: Tradingview "Hull Moving Average"
    Pinescript Author: Unknown
    Args :
        dataframe : Pandas Dataframe
        length : HMA length
        field : Field to use for the calculation
    Returns :
        dataframe : Pandas DataFrame with new columns 'tv_hma'
    """
    h = 2 * tv_wma(dataframe['close'], math.floor(length / 2)) - tv_wma(dataframe['close'], length)
    tv_hma = tv_wma(h, math.floor(math.sqrt(length)))
    # dataframe.drop("h", inplace=True, axis=1)
    return tv_hma

def rvol(dataframe, window=24):
    av = ta.SMA(dataframe['volume'], timeperiod=int(window))
    rvol = dataframe['volume'] / av
    return rvol
```
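To see what these helpers produce, here is a small self-contained sketch (my own, not part of the strategy) that reproduces `tv_wma` and `tv_hma` standalone with plain pandas and runs them on a synthetic rising price series:

```python
import math
import pandas as pd

def tv_wma(series: pd.Series, length: int = 9) -> pd.Series:
    # Weighted moving average, mirroring the strategy's helper above
    norm, total = 0, 0
    for i in range(1, length - 1):
        weight = (length - i) * length
        norm = norm + weight
        total = total + series.shift(i) * weight
    return total / norm if norm > 0 else 0

def tv_hma(series: pd.Series, length: int = 9) -> pd.Series:
    # Hull moving average: a WMA of the "de-lagged" series h
    h = 2 * tv_wma(series, math.floor(length / 2)) - tv_wma(series, length)
    return tv_wma(h, math.floor(math.sqrt(length)))

# Synthetic, strictly rising prices 1.0 .. 50.0
close = pd.Series([float(i) for i in range(1, 51)])
hma = tv_hma(close, 9)
print(hma.dropna().iloc[-1])  # ~49.4: the HMA trails the last close (50.0) only slightly
```

Note how little the HMA lags a trending series; that low lag is exactly why Hull moving averages are popular for entry/exit offsets.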
The ROI has been set to a ratio of 100 (in Freqtrade terms that is 10,000 percent profit), which effectively disables ROI-based exits.

```python
# ROI table:
minimal_roi = {
    "0": 100.0
}
```
and the stoploss to almost 10 percent loss after entering the trade.
```python
# Stoploss:
stoploss = -0.098
```
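Taken together, these two settings mean exits come almost entirely from the sell signals or the stoploss. The following is a simplified sketch of that behaviour (my own illustration, not Freqtrade's actual exit logic):

```python
# Simplified illustration of how the configured ROI and stoploss act
# on a trade's current profit ratio (1.0 = +100%).
minimal_roi = {"0": 100.0}  # ratio of 100 = 10,000% profit: effectively never reached
stoploss = -0.098           # exit once the trade is down roughly 9.8%

def should_exit(profit_ratio):
    """Return the exit reason for a given profit ratio, or None to keep the trade open."""
    if profit_ratio >= minimal_roi["0"]:
        return "roi"
    if profit_ratio <= stoploss:
        return "stoploss"
    return None

print(should_exit(0.05))   # None: neither threshold is hit, the trade stays open
print(should_exit(-0.10))  # stoploss
```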
This strategy has the 5 minute timeframe in mind, but I will also test it on higher timeframes.

```python
timeframe = '5m'
```
The code also has loads of possibilities for further optimization.
```python
dummy = IntParameter(20, 70, default=61, space='buy', optimize=False)
rsi_buy_ema = IntParameter(20, 70, default=61, space='buy', optimize=False)
buy_rsi_1 = IntParameter(0, 70, default=50, space='buy', optimize=False)
buy_rsi_fast_1 = IntParameter(0, 70, default=50, space='buy', optimize=False)

optimize_buy_hma = False
base_nb_candles_buy_hma = IntParameter(5, 100, default=6, space='buy', optimize=optimize_buy_hma)
low_offset_hma = DecimalParameter(0.9, 0.99, default=0.95, space='buy', optimize=optimize_buy_hma)

optimize_buy_ema = False
base_nb_candles_buy_ema = IntParameter(5, 100, default=6, space='buy', optimize=optimize_buy_ema)
low_offset_ema = DecimalParameter(0.9, 1.1, default=1, space='buy', optimize=optimize_buy_ema)

optimize_buy_vwma = False
base_nb_candles_buy_vwma = IntParameter(5, 80, default=6, space='buy', optimize=optimize_buy_vwma)
low_offset_vwma = DecimalParameter(0.9, 0.99, default=0.9, space='buy', optimize=optimize_buy_vwma)

optimize_buy_volatility = False
buy_length_volatility = IntParameter(10, 200, default=72, space='buy', optimize=optimize_buy_volatility)
buy_min_volatility = DecimalParameter(0, 0.5, default=0, decimals=2, space='buy', optimize=False)
buy_max_volatility = DecimalParameter(0.5, 2, default=1, decimals=2, space='buy', optimize=optimize_buy_volatility)

optimize_buy_volume = False
buy_length_volume = IntParameter(5, 100, default=6, optimize=optimize_buy_volume)
buy_volume_volatility = DecimalParameter(0.5, 3, default=1, decimals=2, optimize=optimize_buy_volume)

# Sell
optimize_sell_hma = False
base_nb_candles_sell_hma = IntParameter(5, 100, default=6, space='sell', optimize=optimize_sell_hma)
high_offset_hma = DecimalParameter(0.9, 1.1, default=0.95, space='sell', optimize=optimize_sell_hma)

optimize_sell_ema = False
base_nb_candles_sell_ema = IntParameter(5, 100, default=6, space='sell', optimize=optimize_sell_ema)
high_offset_ema = DecimalParameter(0.9, 1.1, default=0.95, space='sell', optimize=optimize_sell_ema)

optimize_sell_ema2 = False
base_nb_candles_sell_ema2 = IntParameter(5, 100, default=6, space='sell', optimize=optimize_sell_ema2)
high_offset_ema2 = DecimalParameter(0.9, 1.1, default=0.95, space='sell', optimize=optimize_sell_ema2)

optimize_sell_ema3 = False
base_nb_candles_sell_ema3 = IntParameter(5, 100, default=6, space='sell', optimize=optimize_sell_ema3)
high_offset_ema3 = DecimalParameter(0.9, 1.1, default=0.95, space='sell', optimize=optimize_sell_ema3)
```
But since optimizing would introduce a curve-fitting bias, I will not optimize this algorithm. Instead I will test it on its current merits to see if it performs well 'out of the box'.
Of course, you are always free to optimize this algo yourself, albeit at your own risk.
Furthermore, the algo uses the 1 day timeframe as an informative timeframe.
```python
@informative('1d')
def populate_indicators_1d(self, dataframe: DataFrame, metadata: dict) -> DataFrame:
    dataframe['age_filter_ok'] = (dataframe['volume'].rolling(window=self.age_filter, min_periods=self.age_filter).min() > 0)
    return dataframe
```
Other indicators used in this algo are the RSI and the Squeeze Momentum indicator from the finta library, so be sure to have that library installed before using this algorithm.
```python
# RSI
dataframe['rsi'] = ta.RSI(dataframe, timeperiod=14)
dataframe['rsi_fast'] = ta.RSI(dataframe, timeperiod=4)
dataframe['sqzmi'] = fta.SQZMI(dataframe)
```
There is also a nifty live-data check that ensures there is sufficient volume data: it verifies that the minimum volume over the past 72 periods is greater than 0, implying that the data is 'live' or recent.

```python
dataframe['live_data_ok'] = (dataframe['volume'].rolling(window=72, min_periods=72).min() > 0)
```
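You can see the rolling-minimum trick in isolation with a synthetic volume series (my own toy example, not from the strategy): the flag only flips to True once every candle in the trailing 72-candle window has nonzero volume.

```python
import pandas as pd

# Synthetic volume series: the first 5 candles have zero volume (stale data),
# the remaining 75 are "live"
volume = pd.Series([0.0] * 5 + [100.0] * 75)
live_data_ok = volume.rolling(window=72, min_periods=72).min() > 0

print(bool(live_data_ok.iloc[75]))  # False: the trailing 72 candles still include a zero
print(bool(live_data_ok.iloc[76]))  # True: candles 5..76 all have volume > 0
```

Note that `min_periods=72` also makes the check False for the first 71 candles of a pair's history, which doubles as a crude age filter.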
Next there is a section that calculates various moving averages (Hull Moving Average, Exponential Moving Average, Volume Weighted Moving Average) with specific offsets. These are used to smooth out the price data and identify potential buy or sell signals. You can look into each of these yourself if you want.
```python
if not self.optimize_buy_hma:
    dataframe['hma_offset_buy'] = tv_hma(dataframe, int(self.base_nb_candles_buy_hma.value)) * self.low_offset_hma.value

if not self.optimize_buy_ema:
    dataframe['ema_offset_buy'] = ta.EMA(dataframe, int(self.base_nb_candles_buy_ema.value)) * self.low_offset_ema.value

if not self.optimize_buy_vwma:
    dataframe['vwma_offset_buy'] = pta.vwma(dataframe["close"], dataframe["volume"], int(self.base_nb_candles_buy_vwma.value)) * self.low_offset_vwma.value

if not self.optimize_buy_volatility:
    df_std = dataframe['close'].rolling(int(self.buy_length_volatility.value)).std()
    dataframe["volatility"] = (df_std > self.buy_min_volatility.value) & (df_std < self.buy_max_volatility.value)

if not self.optimize_buy_volume:
    df_rvol = rvol(dataframe, int(self.buy_length_volume.value))
    dataframe['volume_volatility'] = (df_rvol < self.buy_volume_volatility.value)

if not self.optimize_sell_hma:
    dataframe['hma_offset_sell'] = tv_hma(dataframe, int(self.base_nb_candles_sell_hma.value)) * self.high_offset_hma.value

if not self.optimize_sell_ema:
    dataframe['ema_offset_sell'] = ta.EMA(dataframe, int(self.base_nb_candles_sell_ema.value)) * self.high_offset_ema.value

if not self.optimize_sell_ema2:
    dataframe['ema_offset_sell2'] = ta.EMA(dataframe, int(self.base_nb_candles_sell_ema2.value)) * self.high_offset_ema2.value

if not self.optimize_sell_ema3:
    dataframe['ema_offset_sell3'] = ta.EMA(dataframe, int(self.base_nb_candles_sell_ema3.value)) * self.high_offset_ema3.value
```
Buy signals
Now about the populate_buy_trend method:
This part constructs a set of conditions combining the indicators and filters defined above. These conditions include checks against live data, age filters, volatility, volume volatility, price comparisons with the EMAs and HMA, and RSI thresholds.
To give you one example here:
```python
if self.optimize_buy_hma:
    dataframe['hma_offset_buy'] = tv_hma(dataframe, int(self.base_nb_candles_buy_hma.value)) * self.low_offset_hma.value
```
First it checks whether this flag is True, and if so, the offset indicator is recalculated here inside the buy-trend method (which is what hyperopt needs when it varies the parameters). In this case the flag is False, so this block is skipped and the indicator that was precalculated in populate_indicators is used instead.

```python
optimize_buy_hma = False
```
So in a way, you also have the possibility to switch the optimization of certain indicators on and off without having to alter complete blocks of code.
If the configured buy conditions are met, the strategy tags the trade and sets the buy signal, indicating a buy order should be placed.
Sell signals
As for the sell trend method:
This part establishes the set of criteria for selling, which includes price comparisons with various moving averages. Each condition is tagged with a specific label for easier identification.
You can also see here that there are conditions that check whether the configured optimization parameter is switched on or off.
In the end this method combines all sell conditions with an additional check for positive volume. If any of the combined conditions are met, a sell signal is generated by setting sell to 1.
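That combine-and-gate pattern is common in Freqtrade strategies. Here is a hedged, self-contained sketch of it with toy data and made-up column values (not the author's exact code):

```python
from functools import reduce
import pandas as pd

# Toy dataframe standing in for the strategy's precomputed offset indicators
dataframe = pd.DataFrame({
    "close":           [10.0, 11.0, 12.0, 13.0],
    "ema_offset_sell": [11.0, 10.5, 12.5, 12.0],
    "hma_offset_sell": [12.0, 12.0, 11.5, 14.0],
    "volume":          [100.0, 0.0, 100.0, 100.0],
})

# Each sell condition is a boolean Series; any one of them can trigger the exit
conditions = [
    dataframe["close"] > dataframe["ema_offset_sell"],  # price above the offset EMA
    dataframe["close"] > dataframe["hma_offset_sell"],  # price above the offset HMA
]

# OR all conditions together, then gate the result on nonzero volume
dataframe["sell"] = 0
dataframe.loc[
    reduce(lambda a, b: a | b, conditions) & (dataframe["volume"] > 0),
    "sell",
] = 1
print(dataframe["sell"].tolist())  # [0, 0, 1, 1]: row 1 matches but is vetoed by zero volume
```

The volume gate matters: a candle with zero volume usually means missing or stale exchange data, and acting on it would produce fills that could never happen live.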
So all in all:
The strategy uses a combination of momentum (RSI), volume, volatility, and price action (via different moving averages) to identify potential buy and sell opportunities.
It has a mix of short-term and long-term indicators to capture different market dynamics.
And the strategy also includes optimization flags so that you can optimize it if you need to.
But what about the performance right out of the box? Is this trading algorithm worth your time and money?
Backtest results
Let's start with the end result of all the tests over the timeframes I use.
As you can see, the initially chosen timeframe indeed has the best score. It does not have the highest end balance, but since keeping attained wealth is more important than the highest gains at a certain point in time, this timeframe seems to hold the best cards.
However, one thing bothers me a little: there are fewer trades made on the 5 minute timeframe than on some of the higher timeframes. That could be an indication that this algorithm has potential future performance problems and might need some more investigation.
The profit curve, on the other hand, looks very promising. The way this algo holds onto its attained profits is something I have rarely seen in the other algos I tested in the past.
There is also an almost striking similarity between the profit curve and the number of winning trades, which is quite remarkable.
The way this algorithm holds on to its profits also shows in the drawdown curve.
I don't think I have seen another algorithm with such a small maximum and average drawdown. And although this looks really nice on a plot, you should also wonder whether this is realistic performance or not.
Comparing this strategy's attained end balance with the other strategies I tested in the past shows that many others eventually performed better.
However, you should take all factors into consideration and some strategies with higher results also come with higher risks.
Overall the performance indicators show a very mixed picture here.
The win percentage, drawdown score and profit factor are excellent and this strategy seems to fit most of the pairs I test.
But the profit percentage, CAGR, Sortino and Sharpe ratios shed a totally different light on this strategy. Very low numbers here...
And all this mixed information gives me personally some mixed feelings about this strategy.
There are still a lot of cogs to turn and switches to flip to make this strategy perform better on the factors where I have my doubts. And it might deliver very promising results if you find the right combinations for your preferred pairs.
So take all this information into consideration if you decide to turn your attention to this strategy for real trading.
Strategy League
So all in all,
Adding up all the current scores from this particular configuration, this strategy ends up in the upper third of the best performing algorithms. Which is still not bad.
End credits
Thanks for reading this post.
And I will see you in the next one,
Goodbye