BigQuery Integration
Query and manage data in Google BigQuery using the bq command-line tool
Requires Claude Code CLI
This skill integrates with BigQuery through Claude Code. Install Claude Code and add this skill to use it.
Available Actions
Run SQL Query
Execute GoogleSQL queries against BigQuery datasets
Parameters
| Name | Type | Required | Description |
|---|---|---|---|
| query | string | Required | SQL query to execute |
| destination_table | string | Optional | Save results to this table instead of stdout |
| dry_run | boolean | Optional | Validate query and estimate cost without running |
Returns
string Query results in table format or JSON
Try saying...
- "Query the sales table to find revenue by region"
- "Run an SQL analysis on customer data in BigQuery"
- "Check how many users signed up last month"
- "Get the top 10 products by sales from BigQuery"
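These requests map to `bq query` invocations. A minimal sketch, assuming an illustrative `mydataset.sales` table (names are placeholders, not part of the skill):

```shell
# Estimate cost first: --dry_run reports bytes processed without running the query
bq query --use_legacy_sql=false --dry_run \
  'SELECT region, SUM(revenue) AS total FROM mydataset.sales GROUP BY region'

# Run for real, writing results to a table instead of stdout
bq query --use_legacy_sql=false \
  --destination_table=mydataset.revenue_by_region \
  'SELECT region, SUM(revenue) AS total FROM mydataset.sales GROUP BY region'
```

`--use_legacy_sql=false` selects GoogleSQL, which is what this skill's queries use.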
List Datasets
Show all datasets in a project
Parameters
| Name | Type | Required | Description |
|---|---|---|---|
| project_id | string | Optional | Project to list datasets from (uses default if not specified) |
Returns
array List of dataset names and metadata
Try saying...
- "Show me all BigQuery datasets"
- "List datasets in my project"
- "What datasets are available?"
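Listing datasets is a plain `bq ls` call; the project ID below is a placeholder:

```shell
# List datasets in the default project
bq ls --datasets

# List datasets in a specific project
bq ls --datasets --project_id=my-project
```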
List Tables
Show all tables in a dataset
Parameters
| Name | Type | Required | Description |
|---|---|---|---|
| dataset | string | Required | Dataset name to list tables from |
Returns
array List of table names and metadata
Try saying...
- "What tables are in the analytics dataset?"
- "List all tables in customer_data"
- "Show tables in my dataset"
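The underlying call is `bq ls` with a dataset argument (dataset and project names below are illustrative):

```shell
# List tables in the analytics dataset of the default project
bq ls analytics

# Or qualify with an explicit project
bq ls my-project:analytics
```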
Show Table Schema
Display schema and metadata for a table
Parameters
| Name | Type | Required | Description |
|---|---|---|---|
| table_path | string | Required | Full table path in format PROJECT:DATASET.TABLE |
Returns
object Table schema, row count, size, and other metadata
Try saying...
- "What columns does the users table have?"
- "Show me the schema for the sales table"
- "Describe the customer_events table"
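Schema inspection uses `bq show`; the table path below is a placeholder in the PROJECT:DATASET.TABLE format the parameter expects:

```shell
# Human-readable schema, row count, and size
bq show my-project:mydataset.users

# Full table metadata as JSON
bq show --format=prettyjson my-project:mydataset.users

# Schema only
bq show --schema --format=prettyjson my-project:mydataset.users
```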
Create Dataset
Create a new dataset to organize tables
Parameters
| Name | Type | Required | Description |
|---|---|---|---|
| dataset_name | string | Required | Name for the new dataset |
| location | string | Optional | Geographic location for the dataset |
| description | string | Optional | Description of the dataset |
Returns
string Confirmation message with dataset details
Try saying...
- "Create a new dataset for marketing analytics"
- "Make a dataset called customer_data in the EU region"
- "Set up a new BigQuery dataset for logs"
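Dataset creation maps to `bq mk --dataset`. A sketch covering all three parameters (names and description are illustrative):

```shell
# Create a dataset in the EU with a description
bq mk --dataset \
  --location=EU \
  --description="Customer data for marketing analytics" \
  my-project:customer_data
```

If `--location` is omitted, BigQuery uses the default location (US multi-region unless configured otherwise).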
Load Data
Import data from Cloud Storage or local files into a table
Parameters
| Name | Type | Required | Description |
|---|---|---|---|
| destination_table | string | Required | Target table in format DATASET.TABLE |
| source_path | string | Required | Cloud Storage path (gs://) or local file path |
| source_format | string | Optional | Format of the source data |
| autodetect | boolean | Optional | Automatically detect schema from the data |
Returns
string Load job status and row count
Try saying...
- "Load this CSV file into BigQuery"
- "Import data from Cloud Storage into the sales table"
- "Upload this JSON file to the events table"
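Loads run through `bq load`; both Cloud Storage URIs and local paths work as the source. The bucket, file, and table names below are placeholders:

```shell
# Load a CSV from Cloud Storage, auto-detecting the schema
bq load --source_format=CSV --autodetect \
  mydataset.sales gs://my-bucket/sales.csv

# Load newline-delimited JSON from a local file
bq load --source_format=NEWLINE_DELIMITED_JSON --autodetect \
  mydataset.events ./events.json
```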
Extract Data
Export table data to Cloud Storage
Parameters
| Name | Type | Required | Description |
|---|---|---|---|
| source_table | string | Required | Table to export in format DATASET.TABLE |
| destination_uri | string | Required | Cloud Storage URI for the export |
| destination_format | string | Optional | Export format |
Returns
string Extract job status and file location
Try saying...
- "Export this table to Cloud Storage as CSV"
- "Download BigQuery table data to a bucket"
- "Extract the results table to JSON format"
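Exports map to `bq extract`. A sketch with placeholder table and bucket names; the `*` wildcard lets BigQuery shard output for tables larger than 1 GB:

```shell
# Export a table to Cloud Storage as CSV (the default format)
bq extract mydataset.results gs://my-bucket/results-*.csv

# Export as newline-delimited JSON
bq extract --destination_format=NEWLINE_DELIMITED_JSON \
  mydataset.results gs://my-bucket/results-*.json
```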
Delete Table
Remove a table from a dataset
Parameters
| Name | Type | Required | Description |
|---|---|---|---|
| table_path | string | Required | Table to delete in format DATASET.TABLE |
Returns
string Confirmation of deletion
Try saying...
- "Delete the temporary results table"
- "Remove the old_data table from BigQuery"
- "Drop the test_table"
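Deletion maps to `bq rm -t` (table names below are illustrative). As noted under Security & Access, this takes effect immediately:

```shell
# Delete a table (prompts for confirmation)
bq rm -t mydataset.test_table

# Skip the confirmation prompt (use with care)
bq rm -t -f mydataset.old_data
```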
Getting Started
1. Install the Google Cloud SDK: `brew install google-cloud-sdk` (macOS), or see the Google Cloud docs for Linux
2. Run `gcloud auth login`
3. Follow the browser prompts to authenticate with your Google account
4. Optionally, set a default project: `gcloud config set project YOUR_PROJECT_ID`
Verify Setup
```shell
bq ls --datasets
```
Success: lists the datasets in your project without errors
Security & Access
Access Scope
The bq CLI can access all BigQuery resources your Google account has permissions for, including running queries (which may incur costs), reading data, and modifying/deleting datasets and tables.
- Queries may process large amounts of data and incur significant costs
- Always use `--dry_run` to estimate costs before running expensive queries
- The CLI can read all data in tables your account has access to
- Deletions take effect immediately; a dropped table can only be recovered within its dataset's time-travel window (seven days by default)
- Be cautious with wildcards and production datasets
Limitations
- Queries have a maximum execution time of 6 hours
- Maximum row size is 100 MB
- Some operations incur costs based on data processed
- Interactive queries are subject to concurrent query limits
Quick Reference
| | |
|---|---|
| Type | CLI |
| Auth | CLI Authentication |
| Setup | Moderate Setup |
| Tools Required | Bash |
| CLI Dependencies | `bq`, `gcloud` |