We are getting our first encouraging results from an arbitrage model, based on a series of tests running on 10 crypto trading pairs, and we would like to expand to hundreds of pairs to look for new patterns. We are currently running Python code on a DigitalOcean droplet, but we are realizing we probably need about 100 times more CPU than our current $6/month plan provides to get a clearer view of how to proceed with development.
How much should I expect to spend, on average, to keep 1000 trading pairs monitored at tick level? (We are using websockets.) Are there any tricks to lower the compute bill at the end of the month? We are amateurs with day jobs and limited financial resources; what direction should we take to extract the data we need as cheaply as possible?
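One cost lever worth mentioning: the number of websocket connections matters less for CPU than how you subscribe. Many exchanges offer combined streams, so 1000 pairs can be multiplexed over a handful of connections instead of 1000. Below is a minimal sketch of that batching idea; the URL format is an assumption modeled on a Binance-style `/stream?streams=` endpoint, and the pair names are placeholders, so adapt both to your actual exchange.

```python
# Sketch: batch many trading pairs into a few combined-stream websocket
# URLs instead of opening one connection per pair. The endpoint format
# below is an assumption (Binance-style); check your exchange's docs.

BASE = "wss://stream.binance.com:9443/stream?streams="  # assumed endpoint

def batch_streams(pairs, per_conn=200):
    """Group pairs into combined-stream URLs, one URL per connection."""
    streams = [f"{p.lower()}@trade" for p in pairs]
    urls = []
    for i in range(0, len(streams), per_conn):
        urls.append(BASE + "/".join(streams[i:i + per_conn]))
    return urls

# Hypothetical pair names, just to show the scale:
urls = batch_streams([f"PAIR{i}USDT" for i in range(1000)], per_conn=200)
print(len(urls))  # 1000 pairs -> 5 connections
```

Fewer connections means less TLS/ping-pong overhead per tick, and a single `asyncio` event loop can usually drain a few multiplexed streams on one modest core, which is often cheaper than scaling CPU first.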
Submitted October 26, 2020 at 06:30PM by bodytexture