Overview

This Terraform configuration establishes a backend for the Looker Explore Assistant on Google Cloud Platform (GCP), enabling interaction with the Vertex AI Gemini Pro model. The setup supports two options: a Cloud Function backend and a BigQuery backend, each acting as a proxy/relay for running content through the model.

The Explore Assistant also uses a set of examples to improve the quality of its answers. We store those examples in BigQuery. Please see the comparisons below when deciding which deployment approach to use.

What backend should I use?

Below are the reasons for and tradeoffs of each deployment approach, to help you scope the right backend based on your preferences and existing setup.

Regardless of Backend:

  • Any Looker database connection can be used for fetching the data returned from the natural language query URL
  • Both implement the same API: no Looker credentials are stored in the backends, and the arguments are the same (i.e. model parameters and a prompt)
  • By default, both approaches fetch examples from a BigQuery table for simplicity. For Cloud Functions, you can modify this React Hook and change the connection_name on line 18 to point to the non-BigQuery database connection in Looker that houses your example prompts/training data.
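Since both backends accept the same arguments, a request to either can be thought of as a prompt plus a set of model parameters. A minimal sketch in Python follows; the field names ("contents", "parameters") are illustrative placeholders, not the exact schema used by the Explore Assistant repo.

```python
import json


def build_request(prompt: str, temperature: float = 0.2,
                  max_output_tokens: int = 1024) -> str:
    """Serialize a prompt plus model parameters into a JSON payload.

    The structure here is a hypothetical example of the shared
    prompt-plus-parameters shape both backends expect.
    """
    payload = {
        "contents": prompt,
        "parameters": {
            "temperature": temperature,
            "max_output_tokens": max_output_tokens,
        },
    }
    return json.dumps(payload)


body = build_request("Show me total sales by region for the last 30 days")
print(body)
```

Because no Looker credentials travel with the request, the same payload can be sent to either backend without change.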

For Cloud Function/Run:

  • Generally speaking, this approach is recommended for those who want more development control over the backend
  • Your programming language of choice can be used
  • Custom workflows are supported, such as using custom models, combining models to improve results, or fetching from external datastores
  • An HTTPS endpoint will be made available that can be leveraged outside of Looker (i.e. by external applications with a custom web app)
  • The endpoint needs to be public for Looker to reach it. (Note: the repo implements a signature on the request for security; otherwise, putting the endpoint behind a Load Balancer or API Proxy is recommended. Keep in mind, however, that Looker Extensions, when not embedded, are only accessible to authenticated Looker users.)
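To illustrate the request-signature idea mentioned above, here is a minimal sketch assuming an HMAC-SHA256 scheme with a shared secret. The secret name and scheme are assumptions for illustration; check the repo for the actual mechanism it implements.

```python
import hashlib
import hmac

# Hypothetical shared secret, e.g. provisioned by Terraform and known
# to both the extension and the Cloud Function.
SECRET = b"shared-secret-from-terraform"


def sign(body: bytes, secret: bytes = SECRET) -> str:
    """Client side: return a hex digest sent alongside the request body."""
    return hmac.new(secret, body, hashlib.sha256).hexdigest()


def verify(body: bytes, signature: str, secret: bytes = SECRET) -> bool:
    """Server side: recompute the digest and compare in constant time."""
    return hmac.compare_digest(sign(body, secret), signature)


body = b'{"contents": "hello"}'
sig = sign(body)
print(verify(body, sig))          # a valid signature verifies
print(verify(b"tampered", sig))   # a tampered body does not
```

A scheme like this lets the endpoint stay public while rejecting requests that don't originate from a holder of the secret, which is why a Load Balancer or API Proxy becomes optional rather than required.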

For BigQuery:

  • Generally speaking, this approach will be easier for users already familiar with Looker
  • Invoking the LLM with custom prompts is all done through SQL.
  • BigQuery’s service account or user OAuth authentication can be used
  • BigQuery, however, will serve only as a pass-through to the Vertex AI API
  • Looker & BigQuery query limits will apply to this approach
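For the SQL-only invocation path, one plausible shape is BigQuery ML's ML.GENERATE_TEXT function over a remote Vertex AI model. The sketch below builds such a statement in Python; the dataset and model names are placeholders, not the resources created by this Terraform configuration, and the exact query the repo generates may differ.

```python
def generate_text_sql(prompt: str,
                      model: str = "my_dataset.gemini_model",
                      temperature: float = 0.2) -> str:
    """Build a BigQuery ML.GENERATE_TEXT statement for a single prompt.

    `model` is a hypothetical remote model pointing at Vertex AI;
    a Looker connection with access to the dataset could run this SQL.
    """
    escaped = prompt.replace("'", "\\'")  # naive quoting for illustration
    return (
        "SELECT ml_generate_text_llm_result\n"
        "FROM ML.GENERATE_TEXT(\n"
        f"  MODEL `{model}`,\n"
        f"  (SELECT '{escaped}' AS prompt),\n"
        f"  STRUCT({temperature} AS temperature, TRUE AS flatten_json_output)\n"
        ")"
    )


sql = generate_text_sql("Summarize sales by region")
print(sql)
```

Because the whole round trip is a SQL query, both Looker's query timeouts and BigQuery's quotas apply to each prompt, which is the query-limit tradeoff noted above.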

Prerequisites

  • Terraform installed on your machine.
  • Access to a GCP account with permission to create and manage resources.
  • A GCP project where the resources will be deployed.

Support

For issues, questions, or contributions, please open an issue in the GitHub repository where this configuration is hosted.