Google Cloud BigQuery

Getting Started

To use Google Cloud BigQuery as a data source, you will need to install the google-cloud-bigquery Python package, along with db-dtypes (required to convert query results to DataFrames). You can install both using pip:

pip install google-cloud-bigquery db-dtypes


Service Account Key File

To authenticate with Google Cloud BigQuery, you will need a service account and its key file. You can create the service account and download the key file by following the instructions here.

Once you have downloaded the key file, you can authenticate with Google Cloud BigQuery by setting the GOOGLE_APPLICATION_CREDENTIALS environment variable to the path of the key file:

export GOOGLE_APPLICATION_CREDENTIALS=/path/to/key/file.json

Reading Data

To read data from Google Cloud BigQuery, you will need to create a bigquery.Client object. You can then use this object to list datasets and tables and to read table rows.

# Cell 1 - Load libraries
import marimo as mo
from google.cloud import bigquery

# Cell 2 - Load datasets
client = bigquery.Client()
datasets = list(client.list_datasets())

# Cell 3 - Select dataset
selected_dataset = mo.ui.dropdown(
    label="Select dataset", options=[d.dataset_id for d in datasets]
)
selected_dataset

# Cell 4 - Load tables
dataset = client.dataset(selected_dataset.value)
tables = list(client.list_tables(dataset))
selected_table = mo.ui.dropdown(
    label="Select table", options=[t.table_id for t in tables]
)
selected_table

# Cell 5 - Load table data
results = client.list_rows(dataset.table(selected_table.value), max_results=10)
mo.ui.table(results.to_dataframe(), selection=None)


Check out our full example using Google Cloud BigQuery here.

Or run it yourself:

marimo run