How We Used SingleStore Job Service to Streamline Our Monitoring + Telemetry

We recently refactored an internal monitoring service to run entirely on SingleStore Job Service (currently in preview).

We were able to aggregate data from more than 50 SingleStore deployments, run Python-based transformations and collaborate in a familiar notebook environment. Here’s how we did it, and what we learned.

Manual + distributed monitoring: A solution before SingleStore Job Service

We developed internal monitoring systems to gauge the performance and reliability of our distributed database deployments. Naturally, this service is distributed across multiple servers.

A specific need arose for monitoring at the aggregate level: we wanted to assess the overall reliability of our distributed service across all deployments, and to understand the clients and apps our customers use to connect to SingleStore. To meet this requirement, we had to consolidate data from all 50+ servers and run analytical queries against the combined result.
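
For a concrete (hypothetical) example of the kind of aggregate-level question we wanted to answer, a query like the following over a consolidated table would do. The `connection_events` schema and credentials here are illustrative, not our actual setup:

```python
# Hypothetical aggregate-level query, assuming telemetry from every
# deployment has been consolidated into one `connection_events` table.
# Schema and credentials are illustrative only.
import singlestoredb as s2

conn = s2.connect("admin:password@central-host:3306/telemetry")
cur = conn.cursor()
cur.execute("""
    SELECT client_app,
           COUNT(DISTINCT deployment_id) AS deployments,
           COUNT(*) AS connections
    FROM connection_events
    GROUP BY client_app
    ORDER BY connections DESC
""")
# One row per client/app, with how many deployments it appears on.
for client_app, deployments, connections in cur.fetchall():
    print(client_app, deployments, connections)
conn.close()
```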

We wanted to consolidate this distributed data into a single SingleStore database for our queries, harnessing SingleStore’s OLAP performance. Initially, we built a backend service in Golang for each of the 50+ source servers, which meant independently managing deployments, service lifecycle and monitoring for every one of them.

We also considered an external data movement tool, but managing a dynamic list of 50+ databases (updating the host list, storing credentials securely) isn’t straightforward either.

Aggregating data into a single database with SingleStore Job Service

With SingleStore Job Service and Notebooks, we can do this aggregation of databases right within our SingleStore org.
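
Here is a minimal sketch of what that aggregation looks like in a notebook, assuming the `singlestoredb` Python client, an illustrative host list, and a hypothetical `connection_events` schema on both sides:

```python
# Minimal sketch of the fan-out/fan-in aggregation, run as a scheduled
# notebook job. Hosts, credentials and schema below are illustrative.
import singlestoredb as s2

SOURCE_HOSTS = ["deployment-01.internal", "deployment-02.internal"]  # 50+ in practice
CENTRAL_DSN = "admin:password@central-host:3306/telemetry"

central = s2.connect(CENTRAL_DSN)

for host in SOURCE_HOSTS:
    # Same org, so the same credentials work for every source deployment.
    src = s2.connect(f"admin:password@{host}:3306/monitoring")
    cur = src.cursor()
    cur.execute("SELECT deployment_id, client_app, connected_at FROM connection_events")
    rows = cur.fetchall()
    src.close()

    # Fan the rows into one central table for org-wide analysis.
    ins = central.cursor()
    ins.executemany(
        "INSERT INTO connection_events (deployment_id, client_app, connected_at) "
        "VALUES (%s, %s, %s)",
        rows,
    )
    central.commit()

central.close()
```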

This approach brings us several advantages:

  1. It eliminates the need to deploy and manage separate services and data integration tools. We leveraged the secure execution environment of SingleStore Notebooks + Job Service infrastructure and focused on our analysis.
  2. We could build quickly in a collaborative, familiar environment with SQL and Python. A couple of team members got together, built a prototype in a shared notebook and moved it to production by scheduling the notebook.
  3. Since all source databases belong to a single organization, a notebook deployed within that organization can access them using the same credentials consistently.
  4. The ability to incorporate data transformation logic in Python adds a layer of flexibility. While SQL still handles the aggregation, Python-written jobs let us produce 'clean data' as the output (see the sketch after this list).
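
To illustrate point 4, here is a hypothetical transformation step that canonicalizes raw client strings before they are written to the central table; the mapping rules are made up for this example, not our production logic:

```python
# Illustrative Python transformation: normalize raw client strings into
# a clean, canonical form before inserting into the central table.
CANONICAL_CLIENTS = {
    "mysql-connector-java": "JDBC",
    "mysqlnd": "PHP",
    "libmysql": "C client",
}

def clean_client(raw: str) -> str:
    """Map a raw client string like 'mysql-connector-java/8.0.33'
    to a canonical client name, defaulting to 'other'."""
    name = raw.split("/")[0].strip().lower()
    return CANONICAL_CLIENTS.get(name, "other")

# Applied to rows fetched from a source deployment before the INSERT:
rows = [("dep-01", "mysql-connector-java/8.0.33"), ("dep-02", "unknown-tool")]
clean_rows = [(dep, clean_client(client)) for dep, client in rows]
print(clean_rows)  # [('dep-01', 'JDBC'), ('dep-02', 'other')]
```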

What’s next for us

Now, with the data consolidated in a single SingleStore database, a wide range of people can use it to run SQL queries and build analytics. Integrated notebooks are especially valuable here, and we plan to build dashboards and analytics right in SingleStore.

Try SingleStore free on the cloud with our Free Shared Tier. 

