Moving Average Options for Pin to Fiat
-
Hi guys, I said I would have a little look into averages, so I managed to sneak a quick half hour after work and have plodded through a couple of different scenarios.
OK, first I have done a basic (simple) moving average test:
- Add all the values (I used either the last 5 or last 10) together and divide by the number of values, e.g. 5 or 10.
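The simple average above can be sketched in a few lines of Python. This is a minimal illustration, not the actual model from the thread; the function name and sample prices are made up for the example, and it assumes the values are ordered oldest to newest.

```python
def simple_moving_average(values, window):
    """Mean of the last `window` values (a simple moving average)."""
    recent = values[-window:]
    return sum(recent) / len(recent)

# Hypothetical price series, oldest first.
prices = [100, 102, 101, 105, 107, 110, 108, 111, 109, 112]
print(simple_moving_average(prices, 5))  # → 110.0 (mean of the last 5 values)
```

With a 10-value window you would call `simple_moving_average(prices, 10)` instead; a longer window just smooths more.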
Then I have looked at a weighted average option, but again in a simple way:
- For all the values (I used either the last 5 or last 10), assign a weight (when using 10, I assigned this minute weight 10, 1 minute ago weight 9, and so on).
- Then multiply each value by its weight, sum them all together, and divide by the total of the weights used (when using 10, that is 10 + 9 + 8 + … + 1).
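The two steps above can be sketched like this. Again a hypothetical illustration: the function name is mine, and it assumes values are ordered oldest to newest so the most recent value gets the highest weight.

```python
def weighted_moving_average(values, window):
    """Linearly weighted MA: newest value gets weight `window`, oldest gets 1."""
    recent = values[-window:]
    weights = range(1, len(recent) + 1)  # oldest -> 1, newest -> window
    weighted_sum = sum(v * w for v, w in zip(recent, weights))
    return weighted_sum / sum(weights)
```

For example, `weighted_moving_average([1, 2, 3], 3)` is (1·1 + 2·2 + 3·3) / (1 + 2 + 3) = 14/6, which sits closer to the newest value than the plain mean of 2.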
I have whacked them on some charts to see what you think (see attached).
I think tomorrow I will have a look at exponentially weighted moving averages and just for fun a Fibonacci weighted average.
[attachment deleted by admin]
-
Cool. :)
The EMA is easy: average the new point with the previous average, i.e. new EMA = (new point + old EMA) / 2, which is an EMA with smoothing factor ½.
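That "average with the new point" rule is the special case alpha = 0.5 of the general exponential moving average; a small sketch (my own naming, not from the thread):

```python
def ema(values, alpha=0.5):
    """Exponential moving average.

    alpha=0.5 reproduces the rule above: each new EMA is
    (new point + previous EMA) / 2. Smaller alpha smooths more.
    """
    avg = values[0]  # seed the EMA with the first observation
    for v in values[1:]:
        avg = alpha * v + (1 - alpha) * avg
    return avg
```

So `ema([10, 20])` gives 15.0, and older points decay geometrically: after the third update the first point contributes only (1 − alpha)² of its original weight.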
I’d also like to see a volume weighted moving average, where each price is multiplied by the number of coins transacted, and you then divide by the total volume. This makes a LOT of sense in my mind, because one huge transaction should have more influence than a ton of tiny ones. Otherwise it’s trivial to manipulate the average just by doing a bunch of tiny transactions.
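The volume weighting described above could be sketched as follows, assuming trades arrive as (price, volume) pairs; the function name and data shape are illustrative, not from the thread.

```python
def volume_weighted_average(trades, window):
    """Volume-weighted average of the last `window` trades.

    `trades` is a list of (price, volume) pairs, oldest first.
    Each price is weighted by the coins transacted, so one large
    trade outweighs many tiny ones.
    """
    recent = trades[-window:]
    total_value = sum(price * volume for price, volume in recent)
    total_volume = sum(volume for _, volume in recent)
    return total_value / total_volume
```

For example, one trade of 3 coins at 20 and one of 1 coin at 10 give (20·3 + 10·1) / 4 = 17.5, pulled firmly toward the large trade, whereas an unweighted mean would say 15.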
My prediction is you’re going to get your best fit with a volume weighted average over a 72 hour moving window. EMA will probably allow too much motion, as recent spikes will drive things up fast even while the price has remained low for a while, which is bad. The 72 hour window will allow long term market changes to push up the price within a 36 hour period, while low volume spikes are pretty much ignored. Any shorter a time frame and your prices will change from one hour to the next, which is REALLY bad, and without volume weighting tiny transactions will be weighted unfairly.
-
Sounds like a great idea. I’ll see if I can knock up a data model tomorrow and try those out.