Question

My app engine instances are logging incoming requests and I'd like to be able to run manual/one-off queries for data analysis.

As an example, I would like to be able to determine how many requests are made to each endpoint within a given period. So an SQL query might look something like this:

SELECT path, COUNT(path) AS request_count
FROM request_log
WHERE time >= :start_time AND time <= :end_time
GROUP BY path;

What is the simplest and most cost effective way to achieve these kinds of queries based on logging data?

Note: Monthly log ingestion is just under 1TB.


Solution

In GCP this is a two-step process:

First, create log filters in Cloud Logging that select the request-log entries you are interested in.

Then use those filters to create log-based metrics that count the matching entries.

This won't be as flexible or seamless as a SQL query, but the filters let you select the logs you care about, and the metrics let you count them.
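As a sketch, a Logging filter for App Engine request logs within a time window might look like this (the timestamps are placeholder values):

```
resource.type="gae_app"
log_name:"request_log"
timestamp>="2024-01-01T00:00:00Z"
timestamp<"2024-02-01T00:00:00Z"
```

One caveat: a log-based metric only counts entries ingested after the metric is created, so this approach cannot answer retroactive questions about logs you have already collected.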

If you are keen on using SQL, you will need to export your log data to BigQuery. You can do this manually when you need to, or you can set up sinks that automatically route new logs matching a filter to BigQuery. Once the data is exported, you can run SQL over the logs with the BigQuery service.
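For example, the query from the question could be written against exported App Engine request logs roughly like this. The project, dataset, and wildcard table names are placeholders, and the exact schema depends on how your sink writes the logs, so verify the field names against your own dataset:

```sql
-- Count requests per path over January 2024 (placeholder dates).
-- `protoPayload.resource` holds the request path in exported App
-- Engine request_log entries; check this against your own schema.
SELECT
  protoPayload.resource AS path,
  COUNT(*) AS request_count
FROM `my_project.my_logs.appengine_googleapis_com_request_log_*`
WHERE _TABLE_SUFFIX BETWEEN '20240101' AND '20240131'
GROUP BY path
ORDER BY request_count DESC;
```

Restricting the scan with `_TABLE_SUFFIX` over date-sharded wildcard tables limits how many bytes each query reads, which matters for cost at roughly 1 TB of logs per month.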

The GCP documentation covers exporting log data to BigQuery in detail, including both manual exports and sinks.
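A sink that routes matching logs to BigQuery can be created from the console or, as a sketch, with `gcloud`. The sink, project, and dataset names below are hypothetical:

```shell
# Route App Engine request logs into a BigQuery dataset.
# Names here are placeholders; the writer identity that the sink
# creates must be granted write access to the target dataset.
gcloud logging sinks create gae-requests-to-bq \
  bigquery.googleapis.com/projects/my_project/datasets/my_logs \
  --log-filter='resource.type="gae_app"'
```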

Licensed under: CC-BY-SA with attribution