Tune in to the episode 🍿 about “My journey into Postgres monitoring” with Lukas Fittl & Rob Treat.
The latest episode of Path To Citus Con—the monthly podcast for developers who love Postgres—is now out. This episode features guests Lukas Fittl (founder of pganalyze) and Rob Treat (an early Circonus developer) on the topic “My Journey into Postgres Monitoring,” along with co-hosts Claire Giordano and Pino de Candia.
Have you ever asked yourself: “Why is my query so slow?” Or had to figure out which query is slowing things down? Or why your database server is at 90% CPU? According to Lukas, you might find answers to these and many more questions by reviewing your error logs.
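In the spirit of that first question—“Why is my query so slow?”—a common first step is to ask Postgres for the actual execution plan. A minimal sketch (the `orders` table and its columns here are hypothetical, purely for illustration):

```sql
-- Ask Postgres how it actually executed the query, with real timings.
-- BUFFERS shows shared-buffer hits vs. reads from disk.
EXPLAIN (ANALYZE, BUFFERS)
SELECT *
FROM orders
WHERE customer_id = 42
ORDER BY created_at DESC
LIMIT 20;
```

If the plan shows a sequential scan over a large table where you expected an index scan, that is often the answer to the “why is it slow” question.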
If you’re running Postgres on a managed service, what kinds of things do you need to monitor & optimize for, versus what will your cloud service provider handle for you? The episode digs into this question and then segues into monitoring vs. observability: what’s the difference?
Lukas Fittl and Rob Treat, joined by co-hosts Claire Giordano and Pino de Candia, had a broad conversation about all things monitoring: ways or tools to monitor Postgres (pganalyze, pgMustard, pgBadger, pgDash, your cloud provider’s Query Performance Insights, pg_stat_statements, pg_stat_io, & more), access to log files, pain points that people are trying to solve, and the role that AI might play in monitoring databases of the future.
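To give a taste of one of the tools mentioned above, here is a sketch of a query against pg_stat_statements to surface the statements consuming the most execution time. (This assumes Postgres 13+, where the column is named `total_exec_time`; the extension also needs to be in `shared_preload_libraries` and created in your database.)

```sql
-- Requires: CREATE EXTENSION pg_stat_statements;
-- Top 10 queries by cumulative execution time (Postgres 13+ column names).
SELECT queryid,
       calls,
       round(total_exec_time::numeric, 1) AS total_ms,
       round(mean_exec_time::numeric, 1)  AS mean_ms,
       left(query, 60)                    AS query_preview
FROM pg_stat_statements
ORDER BY total_exec_time DESC
LIMIT 10;
```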
Let’s dive into some interesting bits from the episode…
“I think [the biggest pain point] is definitely slow queries, right? Because that's when you get yelled at as a DBA. Is when the queries are slow. And then I think there's that next level, which is like CPU and memory and disk IO… But the thing that causes somebody to come to your desk (as if we still did that) and say, Hey, there's a problem, right? That's slow queries.” – Rob Treat
Queries are used to get data from a database. When you are shopping online and using filters while searching for products, an application is running a query to get you the results. If a filter takes longer than expected to return results, you might get frustrated or go to a different website. Nobody likes slow query responses.
“The thing that still surprises me is most people don't look at their logs for Postgres or at the error logs. And there's so much useful information in there. And if you talk to the typical hacker, they would say, yeah, sure, I'll put a log message right when this happens, but most people just don't look.” – Lukas Fittl
It’s often useful to be reminded of things that seem obvious, because obvious things can be overlooked. In this episode, Lukas reminds us to look at the logs when investigating issues with the database: when you are chasing a problem, the error logs might already contain the answer to your questions.
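Of course, the logs are only useful if Postgres is configured to write the interesting events in the first place. A sketch of postgresql.conf settings that make the log worth reading (the thresholds below are illustrative, not recommendations from the episode—tune them for your workload):

```ini
log_min_duration_statement = 1000   # log any statement running longer than 1s
log_lock_waits = on                 # log sessions waiting longer than deadlock_timeout
log_checkpoints = on                # log checkpoint activity and timing
log_autovacuum_min_duration = 0     # log every autovacuum run
log_temp_files = 0                  # log any query that spills temp files to disk
```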
“You generally are going to need external systems in order to trend data over time. So be aware that whatever your journey is going to look like, probably is going to involve external tools. Whatever those might be because you're going to need some way to trend that data over time in a way to make it easy to understand and analyze.” – Rob Treat
When I started working on Postgres, one of the things I quickly learned about was its rich ecosystem, not only of extensions but also of tooling. Because Postgres is an open-source project that is easy to get started with, and because developers love working on it, a lot of useful tools have been created—including several cool tools that will help you visualize the performance of your database over time.
“So that kind of sparked that initial idea that what if, we just made a dashboard that showed all the queries that have run on your database in, like the last hour, the last 24 hours, just giving you a really clear overview of the database's view of the world, because as an application engineer, you're oftentimes not really seeing that, right?” – Lukas Fittl
Some of you have told us that you like the origin stories we cover in the Path To Citus Con podcast. For example, there are a couple of episodes about how people got started as a developer (and in Postgres). In the case of Lukas and pganalyze, it was interesting to hear what motivated Lukas to start this bootstrapped monitoring company in order to solve his own use case.
Our next podcast episode will be recorded live on Discord on Wed Jan 10 @ 10am PST | 1pm EST | 6pm UTC
You can find all the past episodes for the Path To Citus Con podcast on:
More useful links:
Thanks for listening! And if you enjoy the episodes of the Path To Citus Con podcast for developers who love Postgres, please tell your teammates. Also, we really appreciate ratings and reviews, so more people will discover the podcast and hopefully be as delighted as you are.