How can I monitor the cost breakdown in GCP BigQuery per user and per query?
To get the cost of jobs against a particular table, you can use BigQuery Audit Logs. Create a BigQuery logging sink, then query the resulting table to get cost breakdowns, for example by creating a view on top of it, as is done in these examples.
How do I parse the query log data from Stackdriver?
After you create the logging sink, logs are exported to date-sharded, unpartitioned tables named `cloudaudit_googleapis_com_data_access_YYYYMMDD` in your target dataset. The table is highly nested; here is a quick snippet to parse out costly SELECT queries:
select
  resource.labels.project_id as project_id,
  protopayload_auditlog.authenticationInfo.principalEmail as user,
  protopayload_auditlog.servicedata_v1_bigquery.jobCompletedEvent.job.jobStatistics.startTime as startTime,
  cast(protopayload_auditlog.servicedata_v1_bigquery.jobCompletedEvent.job.jobStatistics.startTime as date) as date,
  protopayload_auditlog.requestMetadata.callerSuppliedUserAgent as userAgent,
  protopayload_auditlog.servicedata_v1_bigquery.jobCompletedEvent.job.jobConfiguration.query.query as query,
  protopayload_auditlog.servicedata_v1_bigquery.jobCompletedEvent.job.jobStatistics.totalBilledBytes/1024/1024/1024/1024*5 as cost
from
  `MyDataSet.cloudaudit_googleapis_com_data_access_*`
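The `cost` column above converts `totalBilledBytes` to TiB and multiplies by $5, the on-demand price per TiB used in the query. A minimal Python sketch of the same arithmetic (the function name and the default price are my own; check current BigQuery pricing for your region):

```python
def query_cost_usd(total_billed_bytes: int, price_per_tib: float = 5.0) -> float:
    """Approximate on-demand cost of a query from its billed bytes.

    Mirrors the SQL expression totalBilledBytes/1024/1024/1024/1024*5:
    bytes -> TiB, then TiB * price per TiB.
    """
    tib = total_billed_bytes / 1024 ** 4
    return tib * price_per_tib

# A query that billed 2 TiB costs about $10 at on-demand rates.
print(query_cost_usd(2 * 1024 ** 4))  # → 10.0
```

Note that billing is based on bytes *billed* (which BigQuery rounds up per query), not bytes processed, which is why the audit-log field `totalBilledBytes` is the right input here.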
Note: this query is also committed to our Big Data Demystified GitHub.
Why should I monitor costs in BigQuery?
- Because sometimes you will find scheduled queries that no longer need to run.
- Sometimes the ETL costs much more than you think.
- To evaluate the ROI of on-demand (pay-as-you-go) pricing vs. flat-rate pricing.
I put a lot of thought into these blogs so I could share the information in a clear and useful way. If you have any comments, thoughts, or questions, or you need someone to consult with, feel free to contact me: