Metadata to actionable insights in Grafana: How to view Parseable metrics
Jen Wike Huger
Posted on July 11, 2024
Parseable deployments in the wild are handling larger and larger volumes of logs, so we needed a way to enable users to monitor their Parseable instances.
Typically this would mean setting up Prometheus to capture Parseable ingest and query node metrics and visualize those metrics on a Grafana dashboard. We added Prometheus metrics support in Parseable to enable this use case.
But we wanted a simpler, self-contained approach that allows users to monitor their Parseable instances without needing to set up Prometheus.
This led us to figure out a way to store the Parseable server's internal metrics in a special log stream called pmeta.
This stream keeps track of important information about all of the ingestors in the cluster. This includes information like the URL of the ingestor, commit id of that ingestor, number of events processed by the ingestor, and staging file location and size.
Here is a sample event in the pmeta stream:
{
"address": "http://ec2-3-136-154-35.us-east-2.compute.amazonaws.com:443/",
"cache": "Disabled",
"commit": "d6116e8",
"event_time": "2024-07-02T09:49:05.125255417",
"event_type": "cluster-metrics",
"p_metadata": "",
"p_tags": "",
"p_timestamp": "2024-07-02T09:49:05.540",
"parseable_deleted_events_ingested": 35095373,
"parseable_deleted_events_ingested_size": 10742847195,
"parseable_deleted_storage_size_data": 1549123461,
"parseable_deleted_storage_size_staging": 0,
"parseable_events_ingested": 3350101,
"parseable_events_ingested_size": 1054739567,
"parseable_lifetime_events_ingested": 38445474,
"parseable_lifetime_events_ingested_size": 11797586762,
"parseable_lifetime_storage_size_data": 1732950386,
"parseable_lifetime_storage_size_staging": 0,
"parseable_staging_files": 2,
"parseable_storage_size_data": 183826925,
"parseable_storage_size_staging": 0,
"process_resident_memory_bytes": 113250304,
"staging": "/home/ubuntu/parseable/staging"
}
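The byte-count fields in this event are easier to reason about once they're converted to human-readable units. Here's a small sketch that parses an abridged version of the sample event above and derives a compression ratio from it (the field names come from the event; the helper function and ratio calculation are our own illustration):

```python
import json

# Abridged pmeta event, taken from the sample above
event = json.loads("""{
    "address": "http://ec2-3-136-154-35.us-east-2.compute.amazonaws.com:443/",
    "parseable_events_ingested": 3350101,
    "parseable_events_ingested_size": 1054739567,
    "parseable_storage_size_data": 183826925,
    "parseable_staging_files": 2
}""")

def human_bytes(n: float) -> str:
    """Format a byte count using binary (1024-based) units."""
    for unit in ("B", "KiB", "MiB", "GiB", "TiB"):
        if n < 1024:
            return f"{n:.1f} {unit}"
        n /= 1024
    return f"{n:.1f} PiB"

# Comparing raw ingested size to on-disk size hints at storage efficiency
ratio = event["parseable_events_ingested_size"] / event["parseable_storage_size_data"]
print(f"ingested: {human_bytes(event['parseable_events_ingested_size'])}")
print(f"on disk:  {human_bytes(event['parseable_storage_size_data'])}")
print(f"ratio:    {ratio:.1f}x")
```

The same kind of transformation is what the Grafana dashboard does for you automatically later in this post.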
Let's show you how to visualize this data in a Grafana dashboard.
We'll start by setting up Parseable to collect this pmeta data.
Getting Started with Parseable
Parseable is a cloud-native log management solution that efficiently handles large-scale log data. By integrating Parseable with your infrastructure, you can streamline log ingestion, storage, and querying, making it an essential tool for observability and monitoring.
Choose the installation process that best fits your environment.
To quickly install Parseable using Docker, open the terminal and type the command:
docker run -p 8000:8000 \
-v /tmp/parseable/data:/parseable/data \
-v /tmp/parseable/staging:/parseable/staging \
-e P_FS_DIR=/parseable/data \
-e P_STAGING_DIR=/parseable/staging \
containers.parseable.com/parseable/parseable:latest \
parseable local-store
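Before moving on, it can help to check programmatically that the container is accepting requests. The sketch below probes what we understand to be Parseable's liveness endpoint (/api/v1/liveness); the endpoint path is an assumption, so adjust it if your version differs:

```python
import urllib.request
import urllib.error

def parseable_is_up(base_url: str, timeout: float = 2.0) -> bool:
    """Return True if the Parseable liveness endpoint responds with HTTP 200.

    Assumes a GET /api/v1/liveness endpoint exists; check the docs
    for your Parseable version.
    """
    try:
        with urllib.request.urlopen(f"{base_url}/api/v1/liveness", timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        # Connection refused / timed out: the server isn't reachable yet
        return False

print(parseable_is_up("http://localhost:8000"))
```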
You can verify the installation by navigating to http://localhost:8000 (the port mapped in the command above) in your web browser. Log in using the default credentials (admin/admin) and explore the UI to ensure everything is set up correctly.
Finally, we need to create a log stream before we can send events. A log stream is like a project that stores all your ingested logs. For this tutorial, we'll use a log stream named pmeta. To create a log stream, log in to your Parseable instance and use the button at the top right.
Note that:
- pmeta is automatically created and populated in a Parseable cluster (high availability setup)
- pmeta is not created in a single node setup
- if you're not interested in this data, you can set the retention to 1 day for the pmeta stream to avoid storing this data
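If you do want the 1-day retention mentioned above, it can be applied through Parseable's retention API. The sketch below only builds the request pieces and prints them; the endpoint path and payload shape (PUT /api/v1/logstream/&lt;stream&gt;/retention with a list of retention rules) follow the Parseable docs as we understand them, and the URL and credentials are placeholder values:

```python
import base64
import json

# Placeholder values -- replace with your instance URL and credentials
BASE_URL = "http://localhost:8000"
USER, PASSWORD = "admin", "admin"

# A retention rule asking Parseable to delete pmeta data older than 1 day.
# Payload shape is an assumption based on the retention API docs; verify
# against your Parseable version before use.
retention = [{"description": "delete pmeta after 1 day",
              "action": "delete",
              "duration": "1d"}]

url = f"{BASE_URL}/api/v1/logstream/pmeta/retention"
auth = base64.b64encode(f"{USER}:{PASSWORD}".encode()).decode()
headers = {"Authorization": f"Basic {auth}", "Content-Type": "application/json"}
body = json.dumps(retention)
print("PUT", url)
print(body)
```

Sending the actual request is a one-liner with your HTTP client of choice (e.g. requests.put(url, headers=headers, data=body)).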
Read more about the pmeta stream in the Parseable documentation.
Install Grafana and the Parseable plugin
Grafana helps you collect, correlate, and visualize data with beautiful dashboards. We'll connect the Parseable instance to Grafana via the Parseable Grafana data source. This plugin allows you to query Parseable data using SQL and visualize it in Grafana.
If you want to self-host Grafana, you can host it on a dedicated cloud instance or locally, depending on your requirements. Follow the official Grafana installation guide for more information.
Once the Grafana instance is set up, let's quickly install the Parseable plugin and connect our Parseable instance to Grafana.
Log in to your Grafana instance and navigate to the Administration section in the left-hand side menu.
Click on Plugins and data.
Open the Plugins page and search for Parseable.
Install the plugin and then click on Add new data source.
From the data source page, fill in the following details:
In the URL field, enter the URL of your Parseable query instance. For example, https://demo.parseable.com:443.
Under the Auth section, enable Basic Auth.
In the Basic Auth Details section, enter your Parseable username and password.
Finally, click on Save & Test to verify the connection.
Setting up the Grafana dashboard
We'll now use the Parseable data source to query data from the Parseable meta stream (pmeta).
Navigate to the Dashboard section, click on New, and select Import Dashboard.
Enter the Dashboard ID 21472 and click on Load.
Ensure the data source is set to Parseable-DataSource by selecting it from the dropdown menu.
Once done, click on Import. It should take a few seconds to load, and then the dashboard is created.
We query data from Parseable using SQL. To learn more about querying data in Parseable, you can refer to our documentation.
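Outside Grafana, the same SQL can be sent straight to Parseable's query API. The sketch below only constructs the request without sending it; the endpoint shape (POST /api/v1/query with query, startTime, and endTime fields) follows the Parseable docs as we understand them, and the URL, credentials, and relative time values are placeholder assumptions:

```python
import base64
import json
import urllib.request

# Placeholder values -- replace with your query node URL and credentials
BASE_URL = "http://localhost:8000"
USER, PASSWORD = "admin", "admin"

# Per-ingestor totals from the pmeta stream; column names come from the
# sample event shown earlier in this post
payload = {
    "query": ("SELECT address, parseable_events_ingested, "
              "parseable_storage_size_data FROM pmeta"),
    "startTime": "1h",   # relative time range; format is an assumption
    "endTime": "now",
}

req = urllib.request.Request(
    f"{BASE_URL}/api/v1/query",
    data=json.dumps(payload).encode(),
    headers={
        "Authorization": "Basic "
        + base64.b64encode(f"{USER}:{PASSWORD}".encode()).decode(),
        "Content-Type": "application/json",
    },
    method="POST",
)
# Sending is left to you: urllib.request.urlopen(req) returns JSON rows
print(req.get_method(), req.full_url)
```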
Summary
Now you've learned how to create a Grafana dashboard using Parseable's pmeta stream.
This dashboard provides crucial insights into your Parseable instance's performance, and we encourage you to customize this dashboard further to fit your specific needs.
🏃🏽♀️ To see Parseable in action, watch this YouTube video. Get started with Parseable in just a single command.
💬 For technical questions or to share your implementations, join our developer community on Slack.
📝 Read more from the Parseable blog.
Ready to enhance your observability? Start using Parseable and Grafana today to unlock the full potential of your log data.