
BigQuery

20 commands

List all datasets in the current BigQuery project

bq ls

Create a new BigQuery dataset

bq mk DATASET_NAME

Display schema and metadata for a BigQuery table

bq show DATASET.TABLE

Run a BigQuery query using standard SQL (GoogleSQL)

bq query --use_legacy_sql=false 'SELECT * FROM DATASET.TABLE'

Load a CSV file from Cloud Storage into a BigQuery table using a schema file

bq load DATASET.TABLE gs://BUCKET/FILE.csv SCHEMA.json
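The schema file passed to bq load (and to bq mk --table below) is a JSON array of column definitions. A minimal sketch, with hypothetical columns id and name:

```shell
# Write a minimal BigQuery schema file (the column names here are hypothetical).
cat > SCHEMA.json <<'EOF'
[
  {"name": "id",   "type": "INTEGER", "mode": "REQUIRED"},
  {"name": "name", "type": "STRING",  "mode": "NULLABLE"}
]
EOF
```

Supported modes are NULLABLE (the default), REQUIRED, and REPEATED.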

Export a BigQuery table to a CSV file in Cloud Storage

bq extract DATASET.TABLE gs://BUCKET/export.csv

Delete a BigQuery table without prompting for confirmation

bq rm -f DATASET.TABLE

Display the first rows of a BigQuery table

bq head DATASET.TABLE

Copy a BigQuery table to a new destination table

bq cp SOURCE_DATASET.SOURCE_TABLE DEST_DATASET.DEST_TABLE

Create a BigQuery table with an explicit schema file

bq mk --table DATASET.TABLE SCHEMA.json

Estimate the bytes processed by a BigQuery query without running it

bq query --dry_run --use_legacy_sql=false 'SELECT * FROM DATASET.TABLE'
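The dry run reports how many bytes the query would scan; multiplying by the on-demand rate gives a rough cost estimate. A sketch, assuming $6.25 per TiB on-demand pricing (verify against current BigQuery pricing) and a hypothetical byte count:

```shell
# Hypothetical bytes-processed figure reported by a dry run.
BYTES=500000000000
# Convert bytes to TiB and multiply by the assumed on-demand rate.
awk -v b="$BYTES" 'BEGIN { printf "%.2f\n", b / (1024 ^ 4) * 6.25 }'   # prints 2.84
```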

Run a BigQuery query and write the results to a destination table

bq query --use_legacy_sql=false --destination_table=DATASET.RESULTS_TABLE 'SELECT * FROM DATASET.TABLE'

Load newline-delimited JSON data from Cloud Storage into BigQuery

bq load --source_format=NEWLINE_DELIMITED_JSON DATASET.TABLE gs://BUCKET/FILE.json SCHEMA.json

Create a BigQuery view with a standard SQL query

bq mk --use_legacy_sql=false --view 'SELECT id, name FROM DATASET.TABLE' DATASET.VIEW_NAME

Create a BigQuery dataset in a specific region such as the EU

bq mk --location=EU DATASET_NAME

Export a BigQuery table to compressed CSV files in Cloud Storage (the * wildcard shards output across multiple files, which is required for exports over 1 GB)

bq extract --compression=GZIP --destination_format=CSV DATASET.TABLE gs://BUCKET/export_*.csv.gz

Run a BigQuery query that fails if it would process more than 1 GB of data

bq query --use_legacy_sql=false --maximum_bytes_billed=1000000000 'SELECT * FROM DATASET.TABLE'
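The flag takes a byte count, so the 1 GB cap above is the decimal 10^9 bytes. If you want a binary gibibyte instead, the two values differ by about 7%:

```shell
echo $((1000 * 1000 * 1000))   # 1 GB  (decimal): prints 1000000000
echo $((1024 * 1024 * 1024))   # 1 GiB (binary):  prints 1073741824
```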

Set a 30-day expiry on a BigQuery table (value in seconds)

bq update --expiration=2592000 DATASET.TABLE
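The 2592000 comes from expressing 30 days in seconds; shell arithmetic makes the intent explicit:

```shell
# 30 days * 24 hours * 60 minutes * 60 seconds
echo $((30 * 24 * 60 * 60))   # prints 2592000
```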

Load Parquet files from Cloud Storage into BigQuery with schema autodetection

bq load --source_format=PARQUET --autodetect DATASET.TABLE gs://BUCKET/*.parquet

Create a BigQuery table partitioned by day using a schema file

bq mk --table --time_partitioning_type=DAY DATASET.PARTITIONED_TABLE SCHEMA.json
