How to handle time-series data for a lot of users?

Hello everyone. My idea: I want to build a web app that analyzes the value of the cryptocurrency each user is holding. Because crypto prices change every minute, I have to write a new OHLC value to the database every minute for each user. The problem is that I don't know which database best handles this use case. If my web app has fewer than 1,000 users, it is easy to handle with any database. But if my web app grows to 100,000-1,000,000 users, I need a database with high write performance, fast querying, and a schema that is easy to design. Which database is suitable and best fits my app idea?

  • My solution is to insert data into a single table in TimescaleDB (rough schema sketch below). Do you believe the query performance will stay good, without latency, if I have 100,000 users and have to perform 100,000 write operations every minute? If so, the table will grow by 100,000 users * 1,440 minutes = 144,000,000 rows per day, which is about 4.3 billion rows after a month. I think this will cause issues as the number of users increases and the data gets bigger. Thanks for your help.
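Here is a minimal sketch of the single-hypertable schema I have in mind, assuming a plain PostgreSQL table turned into a TimescaleDB hypertable; the table and column names are just placeholders:

```sql
-- One OHLC row per user per minute (placeholder names).
CREATE TABLE portfolio_value (
    time    TIMESTAMPTZ    NOT NULL,
    user_id BIGINT         NOT NULL,
    open    NUMERIC(20, 8) NOT NULL,
    high    NUMERIC(20, 8) NOT NULL,
    low     NUMERIC(20, 8) NOT NULL,
    close   NUMERIC(20, 8) NOT NULL
);

-- Partition by time (TimescaleDB).
SELECT create_hypertable('portfolio_value', 'time');

-- Index so "fetch one user's history" queries stay fast.
CREATE INDEX ON portfolio_value (user_id, time DESC);
```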

I think usually folks would not update the value of every user's portfolio with every market movement. Instead, they'd have a table of symbols that gets a new price row inserted every minute, plus a portfolio table for each user. The portfolio stores the amount of each symbol a user holds, and the value is calculated on the fly with a join (see the sketch below). If you want historical values of a user's portfolio at a given point in time, you could introduce a time dimension on the portfolio table as well. This will become easier once we have as-of joins, which is something we're looking at completing soon, but you can already do it today with a little extra effort.
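A rough sketch of what that could look like, assuming TimescaleDB on PostgreSQL; all table and column names here are made up for illustration:

```sql
-- One price row per symbol per minute; this is the only table written to every minute.
CREATE TABLE prices (
    time   TIMESTAMPTZ    NOT NULL,
    symbol TEXT           NOT NULL,
    open   NUMERIC(20, 8) NOT NULL,
    high   NUMERIC(20, 8) NOT NULL,
    low    NUMERIC(20, 8) NOT NULL,
    close  NUMERIC(20, 8) NOT NULL
);
SELECT create_hypertable('prices', 'time');

-- What each user holds; updated only when the user trades.
CREATE TABLE portfolio (
    user_id BIGINT         NOT NULL,
    symbol  TEXT           NOT NULL,
    amount  NUMERIC(30, 8) NOT NULL,
    PRIMARY KEY (user_id, symbol)
);

-- Current portfolio value for one user, computed on the fly from the
-- latest price per symbol.
SELECT p.user_id,
       SUM(p.amount * lp.close) AS portfolio_value
FROM portfolio p
JOIN LATERAL (
    SELECT close
    FROM prices
    WHERE prices.symbol = p.symbol
    ORDER BY time DESC
    LIMIT 1
) lp ON true
WHERE p.user_id = 42
GROUP BY p.user_id;
```

To value a portfolio as of a past point in time, you'd add something like `AND prices.time <= :as_of` inside the lateral subquery; and if holdings change over time, you'd also add a time column (or validity range) to the portfolio table, which is the extra time dimension mentioned above.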