Big Query, Cloud, Data Engineering, GCP

How can I get BigQuery cost per query per user?

How Can I monitor the costs breakdown in GCP BigQuery per user per query?

In order to get costs per job, you can use BigQuery Audit Logs [1]. You can create a BigQuery logging sink [2] and then query the resulting table for cost breakdowns, for example by creating a view on top of it, as done in these examples [3].
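As a minimal sketch of the "view on top" step, assuming an on-demand price of $5 per TiB billed and hypothetical project, dataset, and sink table names (replace them with your own):

```sql
-- Sketch only: all project/dataset/table names below are placeholders.
CREATE OR REPLACE VIEW `your-project.monitoring.query_costs` AS
SELECT
  protopayload_auditlog.authenticationInfo.principalEmail AS user,
  CAST(protopayload_auditlog.servicedata_v1_bigquery.jobCompletedEvent.job.jobStatistics.startTime AS DATE) AS date,
  protopayload_auditlog.servicedata_v1_bigquery.jobCompletedEvent.job.jobConfiguration.query.query AS query,
  -- assumed on-demand pricing: $5 per TiB billed
  protopayload_auditlog.servicedata_v1_bigquery.jobCompletedEvent.job.jobStatistics.totalBilledBytes / POW(1024, 4) * 5 AS cost_usd
FROM `your-project.your_dataset.your_sink_table`;
```

Once the view exists, dashboards and ad hoc queries can select from it directly instead of repeating the nested field paths.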

How to parse the query logs data from StackDriver?

After you have created the logging sink, you will get an unpartitioned table called:


The table is highly nested; here is a quick snippet to parse out costly SELECT queries:

SELECT
  resource.labels.project_id AS project_id,
  protopayload_auditlog.authenticationInfo.principalEmail AS user,
  protopayload_auditlog.servicedata_v1_bigquery.jobCompletedEvent.job.jobStatistics.startTime AS startTime,
  CAST(protopayload_auditlog.servicedata_v1_bigquery.jobCompletedEvent.job.jobStatistics.startTime AS DATE) AS date,
  protopayload_auditlog.requestMetadata.callerSuppliedUserAgent AS userAgent,
  protopayload_auditlog.servicedata_v1_bigquery.jobCompletedEvent.job.jobConfiguration.query.query AS query,
  -- on-demand pricing: $5 per TiB of billed bytes
  protopayload_auditlog.servicedata_v1_bigquery.jobCompletedEvent.job.jobStatistics.totalBilledBytes/1024/1024/1024/1024*5 AS cost
FROM `your-project.your_dataset.your_sink_table`  -- placeholder: use the sink table created above
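To turn those per-job rows into a per-user, per-day cost breakdown, you can aggregate the same fields; a minimal sketch, again with a hypothetical sink table name:

```sql
-- Sketch: daily cost per user, assuming $5 per TiB on-demand pricing.
SELECT
  protopayload_auditlog.authenticationInfo.principalEmail AS user,
  CAST(protopayload_auditlog.servicedata_v1_bigquery.jobCompletedEvent.job.jobStatistics.startTime AS DATE) AS date,
  SUM(protopayload_auditlog.servicedata_v1_bigquery.jobCompletedEvent.job.jobStatistics.totalBilledBytes) / POW(1024, 4) * 5 AS cost_usd
FROM `your-project.your_dataset.your_sink_table`  -- placeholder sink table
GROUP BY user, date
ORDER BY cost_usd DESC
```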


Note: this query is also committed to our Big Data Demystified GitHub.

Why should I monitor Costs in BigQuery?

  1. Sometimes you will find scheduled queries that no longer need to run.
  2. Sometimes the ETL costs much more than you think.
  3. To evaluate the ROI of on-demand (pay-as-you-go) pricing vs. flat-rate pricing.
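For point 1, grouping the audit-log rows by query text surfaces recurring (often scheduled) queries and what they cost in total, so stale ones stand out; a sketch, assuming the same hypothetical sink table and $5/TiB pricing:

```sql
-- Sketch: most expensive recurring queries, by total cost.
SELECT
  protopayload_auditlog.servicedata_v1_bigquery.jobCompletedEvent.job.jobConfiguration.query.query AS query,
  COUNT(*) AS runs,
  SUM(protopayload_auditlog.servicedata_v1_bigquery.jobCompletedEvent.job.jobStatistics.totalBilledBytes) / POW(1024, 4) * 5 AS total_cost_usd
FROM `your-project.your_dataset.your_sink_table`  -- placeholder sink table
GROUP BY query
ORDER BY total_cost_usd DESC
LIMIT 20
```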


I put a lot of thought into these blogs so that I could share the information in a clear and useful way. If you have any comments, thoughts, or questions, or you need someone to consult with, feel free to contact me:
