Let me explain:
I ran several tests:
- compute ADX 7 with just the minimal lookback size (2 x period - 1, it seems)
- compute ADX 7 with more input data (tried different input sizes: +5, +10, +20 more samples...)
- compute ADX 7 using the stream_ADX version
The results are never the same: the larger the input array, the more the results differ (noticeably different, enough to mean no signal or a deferred signal).
Using stream_ADX after case 1) gives results similar to using case 1) alone.
Is this related to the computation of the true range? What could we do to normalize the behavior?
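For reference, here is a minimal pure-Python sketch of a Wilder-style ADX (my own simplified version, not the library's actual code) that reproduces the effect. Every stage (ATR, smoothed DM, ADX) uses Wilder's recursive smoothing, so the seed values built from the first bars never fully drop out of the result; they only decay by a factor of (period - 1) / period per bar. That is why the last ADX value keeps changing as you feed in more history, and why the short-history and long-history runs converge only gradually:

```python
import math

def adx(highs, lows, closes, n):
    """Return the list of ADX values for period n (first value after 2*n bars)."""
    tr, pdm, mdm = [], [], []
    for i in range(1, len(closes)):
        # True range and directional movement for bar i.
        tr.append(max(highs[i] - lows[i],
                      abs(highs[i] - closes[i - 1]),
                      abs(lows[i] - closes[i - 1])))
        up = highs[i] - highs[i - 1]
        dn = lows[i - 1] - lows[i]
        pdm.append(up if up > dn and up > 0 else 0.0)
        mdm.append(dn if dn > up and dn > 0 else 0.0)

    def wilder(xs):
        s = sum(xs[:n])          # seed = plain sum of the first n values
        out = [s]
        for x in xs[n:]:
            s = s - s / n + x    # recursive smoothing: old data decays but never vanishes
            out.append(s)
        return out

    atr, spdm, smdm = wilder(tr), wilder(pdm), wilder(mdm)
    dx = []
    for a, p, m in zip(atr, spdm, smdm):
        pdi, mdi = 100 * p / a, 100 * m / a
        dx.append(100 * abs(pdi - mdi) / (pdi + mdi))

    adx_vals = [sum(dx[:n]) / n]          # seed ADX = simple average of first n DX
    for d in dx[n:]:
        adx_vals.append((adx_vals[-1] * (n - 1) + d) / n)
    return adx_vals

# Deterministic synthetic bars so the effect is reproducible.
closes = [100 + 5 * math.sin(i / 3) + 0.05 * i for i in range(120)]
highs = [c + 1 for c in closes]
lows = [c - 1 for c in closes]

n = 7
full = adx(highs, lows, closes, n)[-1]                          # all 120 bars
minimal = adx(highs[-2 * n:], lows[-2 * n:], closes[-2 * n:], n)[-1]  # minimal lookback
longer = adx(highs[-60:], lows[-60:], closes[-60:], n)[-1]      # more warm-up history
print(full, minimal, longer)
```

With this sketch, `minimal` differs visibly from `full`, while `longer` is already very close to `full`: with 7-period smoothing the seed's weight shrinks like (6/7)^k, so after a few dozen extra warm-up bars the dependence on input length becomes negligible. If this matches what the library does, feeding a generous fixed warm-up (and discarding it) should make batch and streaming results agree.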