Terraform Remote Backend

We want to store the Terraform state files in the GCP project itself. This has several advantages:

  • The state files are stored in a central location, making it easier for multiple users to collaborate on the same project.
  • The state files are stored securely in GCP, reducing the risk of data loss.
  • A single person can work on multiple deployments at the same time from one codebase, switching between them by updating the project ID environment variable.

Note: If you don’t want to store the state file in the backend, you can remove the following code from the main.tf file in the repository:

terraform {
  backend "gcs" {
  }
}

or replace it with the following code:

terraform {
  backend "local" {
  }
}

Configuration and Deployment

We use Terraform to set up the backend, and we host the Terraform state inside the project itself by using a remote backend. The bucket name is passed on the command line because we want to include the project ID in it; since project IDs are globally unique, the storage bucket name will be as well.

Cloud Function Backend

First create a file that will contain the LOOKER_AUTH_TOKEN and place it at the root of the repository. This will be used locally by the Cloud Function, as well as by the extension framework app. The value of this token will be uploaded to the GCP project as a secret to be used by the Cloud Function.

If you are in /explore-assistant-backend, cd back to the repository root (i.e. cd ..) and run the following command:

openssl rand -base64 32 > .vertex_cf_auth_token

Then, from the /explore-assistant-backend directory, run the following to deploy the Cloud Function backend:

cd terraform 
export TF_VAR_project_id=XXX
export TF_VAR_use_bigquery_backend=0
export TF_VAR_use_cloud_function_backend=1
export TF_VAR_looker_auth_token=$(cat ../../.vertex_cf_auth_token)
gsutil mb -p $TF_VAR_project_id gs://${TF_VAR_project_id}-terraform-state/
terraform init -backend-config="bucket=${TF_VAR_project_id}-terraform-state"
terraform plan
terraform apply
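
For a quick sanity check that the function deployed, you can look up its URL. This is a hypothetical lookup: the function name explore-assistant-api and the region us-central1 are assumptions, so check the Terraform output or the GCP console for the actual values.

# Assumed name/region; substitute the values from your deployment.
gcloud functions describe explore-assistant-api \
  --gen2 --region=us-central1 \
  --format='value(serviceConfig.uri)'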

BigQuery Backend

To deploy the BigQuery backend:

cd terraform 
export TF_VAR_project_id=XXX
export TF_VAR_use_bigquery_backend=1
export TF_VAR_use_cloud_function_backend=0
gsutil mb -p $TF_VAR_project_id gs://${TF_VAR_project_id}-terraform-state/
terraform init -backend-config="bucket=${TF_VAR_project_id}-terraform-state"
terraform plan
terraform apply

You will have to wait 1-2 minutes for the APIs to be enabled. You will also have to wait a couple of minutes for the service account for the BigQuery connection to appear.
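
While you wait, you can watch for the relevant APIs to appear in the project's enabled services. A minimal sketch using gcloud (the grep pattern is an assumption; adjust it to the services you care about):

# Lists enabled services and filters for the Vertex AI and BigQuery APIs.
gcloud services list --enabled --project="$TF_VAR_project_id" \
  | grep -E 'aiplatform|bigquery'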

If you use the defaults, you can test whether everything is working by running:

    SELECT ml_generate_text_llm_result AS generated_content
    FROM ML.GENERATE_TEXT(
      MODEL `explore_assistant.explore_assistant_llm`,
      (SELECT "hi" AS prompt),
      STRUCT(
        0.05 AS temperature,
        1024 AS max_output_tokens,
        0.98 AS top_p,
        TRUE AS flatten_json_output,
        1 AS top_k
      )
    )

Also, as part of the BigQuery backend setup, we create the Service Account that can be used to connect Looker to the BigQuery dataset, to fetch the examples and use the model. You can follow the instructions for creating the connection in Looker here (https://cloud.google.com/looker/docs/db-config-google-bigquery#authentication_with_bigquery_service_accounts). You should be able to pick up the instructions at step 5.
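
If you need to locate that Service Account, you can list the project's service accounts. The exact account name is defined by the Terraform configuration, so confirm it against the Terraform output or the GCP console:

# Shows all service accounts in the project, including the one created
# for the Looker-to-BigQuery connection.
gcloud iam service-accounts list --project="$TF_VAR_project_id"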

Deployment Notes

  • Changes to the code in explore-assistant-cloud-function will produce a zip file with a new hash. This hash is added to the Cloud Function's environment variables, and a new hash triggers a redeployment of the Cloud Function, as sketched below.
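
A minimal sketch of that check, run by hand (this assumes the function source lives in explore-assistant-cloud-function at the repository root; Terraform computes an equivalent hash of the zipped source):

# Zip the function source and hash it; the hash changes whenever the code does.
cd explore-assistant-cloud-function
zip -q -r /tmp/function.zip .
sha256sum /tmp/function.zip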

Resources Created

  • Google Cloud Functions or Cloud Run services, based on the selected backend.
  • Google BigQuery dataset and table to store the examples.
  • Google BigQuery connection and Gemini Pro model, if using the BigQuery backend.
  • Necessary IAM roles and permissions for the Looker Explore Assistant to operate.
  • Storage buckets for deploying cloud functions or storing data.
  • Artifact Registry for storing Docker images, if required.

Cleaning Up

To remove all resources created by this Terraform configuration, run:

terraform destroy

Note: This will delete all resources and data. Ensure you have backups if needed.
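
The state bucket itself was created with gsutil rather than Terraform, so terraform destroy will not remove it. To delete it and the state files it contains:

gsutil rm -r gs://${TF_VAR_project_id}-terraform-state/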