“Time-Series for Developers: What the Heck is Time-Series” Recap and Resources
See step-by-step demos, examples of time-series in real scenarios, and 3 queries you can use immediately.
We recently hosted “Time-Series for Developers” - a 30-minute technical session focused on laying out foundational concepts and showing developers how to apply time-series thinking to their day-to-day work.
What will you learn?
If you’re new to time-series data, you’ll learn (a) what time-series data is in practical terms and (b) the types of questions you can ask and answer with it. If you’re experienced with time-series or TimescaleDB, you’ll (a) brush up on your query skills and (b) learn a few common mistakes to avoid (like making sure you use the right date format - international ISO 8601 vs. the US month-first standard).
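To see why the date-format pitfall matters, here's a minimal PostgreSQL sketch (TimescaleDB is a PostgreSQL extension, so the same parsing rules apply): the same numeric string is read two different ways depending on the session's DateStyle setting, while ISO 8601 input is unambiguous.

```sql
-- PostgreSQL's DateStyle setting decides whether '04/07/2021'
-- means April 7 (MDY, the US default) or July 4 (DMY).
SET datestyle TO 'ISO, MDY';
SELECT '04/07/2021'::date;  -- 2021-04-07

SET datestyle TO 'ISO, DMY';
SELECT '04/07/2021'::date;  -- 2021-07-04

-- Unambiguous ISO 8601 input parses the same way under either setting:
SELECT '2021-07-04'::date;  -- 2021-07-04
```

Sticking to ISO 8601 (`YYYY-MM-DD`) in your inserts and queries sidesteps the ambiguity entirely.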
The session’s broken into four parts:
What’s time-series data?
We focus on practical definitions, not theory - using examples to demonstrate how time-series data is all around us, from package delivery to movie production to money transfer apps.
In my time working at Timescale, I’ve seen all of the above use cases and more, and two examples stand out to me:
- Laika is a major stop-motion animated film studio. They use time-series data to monitor metrics across their infrastructure (render farms, workstations, virtual machines, and more) and analyze resource consumption over the many years it takes to make one of their award-winning movies.
- TransferWise is a fintech company focusing on international money transfers. They use time-series data to measure application performance and store currency rates to better serve their customers, predict future currency fluctuations, and ensure accurate, instant transactions.
How does time-series data help you?
You’ll see how developers use time-series data in their projects every day - and the two key characteristics that differentiate it from other kinds of data.
Viewing time as the primary axis and collecting all data points for a system enables you to analyze the past, monitor the present, and plan for the future.
Let’s code: New York taxi mission
To demonstrate just how powerful time-series analysis can be, we spend 15-20 minutes walking through a mock scenario, where we’re tasked with analyzing NYC taxicab data to find ways to cut carbon emissions, suggest routes to travelers, and more.
You’ll learn how to use pgAdmin, run 3 simple - yet powerful - queries, and JOIN time-series and relational data, all while we analyze a real dataset and answer questions.
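As a sketch of what joining time-series and relational data looks like in the taxi scenario (the `rides` hypertable and `rates` lookup table here are illustrative assumptions, not necessarily the exact tutorial schema):

```sql
-- Join the time-series 'rides' hypertable to a small relational
-- 'rates' lookup table to count rides per rate type on one day.
SELECT rates.description, count(*) AS num_rides
FROM rides
JOIN rates ON rides.rate_code = rates.rate_code
WHERE rides.pickup_datetime >= '2016-01-01'
  AND rides.pickup_datetime <  '2016-01-02'
GROUP BY rates.description
ORDER BY num_rides DESC;
```

Because a hypertable behaves like a regular PostgreSQL table, standard JOINs, WHERE clauses, and aggregates all work unchanged.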
Resources + Q & A
We complete one part of the NYC Taxi mission during the session, but there are two others! They dive into more advanced queries and questions, like using geospatial information to enhance your analysis and special TimescaleDB functions, like time_bucket, to simplify complex SQL queries.
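To give a flavor of what time_bucket simplifies: it generalizes PostgreSQL's date_trunc to arbitrary intervals, so grouping by, say, five-minute windows is a one-liner. A minimal sketch, assuming the same illustrative `rides` table and `pickup_datetime` column as the taxi scenario:

```sql
-- How many rides started in each 5-minute window on Jan 1, 2016?
SELECT time_bucket('5 minutes', pickup_datetime) AS five_min,
       count(*) AS num_rides
FROM rides
WHERE pickup_datetime >= '2016-01-01'
  AND pickup_datetime <  '2016-01-02'
GROUP BY five_min
ORDER BY five_min;
```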
- Follow along with the session recording or dive straight into the tutorial
- Get started with Timescale Cloud
- Join our Slack to ask questions and get help from our engineers and community members
We received questions throughout the session (thank you to everyone who submitted one!), with a selection below:
Is there a maximum recommended table size limit for TimescaleDB?
That depends on what kind of data you’re working with and the read and write patterns you’re expecting. I’d recommend reading our “Hypertable Best Practices” documentation - it’ll help you understand how to configure your hypertables for your scenario.
If you’d like to provide more details or get more help, shoot us a message in our Community Slack and we can chat more about your use case.
How do I claim my $300 Timescale Cloud credit?
If you’re new to Timescale Cloud, head to our signup page. Create an account, and you’ll automatically get $300 in credits to use within your first 30 days.
If you have questions or would like a demo first, set up time with our Cloud Advocacy team.
Which version of pgAdmin are you running? That looks way better than the version I used years ago!
I like how visual pgAdmin is! I’m using pgAdmin 4.14 for Mac – you can download it here.
If some data is inserted incorrectly (for example, a zero reading from a temperature sensor), and we have a log of the data - would we go back and batch-update, or create new records and delete the incorrect ones?
You can do either. Batch-updating the incorrect rows is fine and shouldn’t have much impact on performance. The performance hit comes when you frequently insert data out of order; doing that occasionally, via a batch update, should be fine.
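As a sketch of the batch-update approach (the `conditions` hypertable and `fixed_readings` staging table here are hypothetical names for illustration):

```sql
-- Overwrite the known-bad zero readings in place with corrected
-- values loaded into a staging table; TimescaleDB hypertables
-- support standard SQL UPDATEs.
UPDATE conditions AS c
SET temperature = f.temperature
FROM fixed_readings AS f
WHERE c.device_id = f.device_id
  AND c.time = f.time
  AND c.temperature = 0;  -- only touch the rows that failed
```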
How do you see time-series data impacting fields like medical research?
In general, high-fidelity data regarding what’s happening in your body is a huge area of opportunity.
Personally, I think wearables that continuously monitor health signals will be hugely impactful – for example, Continuous Glucose Monitoring has the potential to have a much larger impact on diabetes patients than the traditional once-a-day finger-prick approach.
To learn about future sessions and get updates about new releases and other technical content, subscribe to our Biweekly Newsletter.
We hope to see you at the next one!