Lab 12

Goal

Get experience configuring a cloud environment


  • You'll be pairing with a new person.
  • You'll work in the Console.
  • There are a lot of things that can go wrong. No stress, this is a learning experience.
  • Take your time. It's better to read outputs carefully, understand what's going on, and explore than to rush through.

Part 1: IAM for users

You'll play both User A and User B, to see things from both sides. When a step says User A:, use a browser logged in as that person, and likewise for User B. It may be easiest to switch between laptops.

  1. User A: Project setup
    1. Create a new Project.
      • Project name: Something like <user1>-<user2>-lab-12
      • Organization: afeld.me
      • Location: SIPA Advanced Computing
    2. Change the Billing Account to User A's Billing Account for Education.
    3. Grant User B BigQuery Admin at the Project level.
  2. User B: BigQuery setup
    1. Switch to the Project created above.
    2. Create a table in BigQuery using sample data from Lab 8 or any other CSV. (A programmatic sketch of this step appears after this list.)
  3. User A: Revoke access.
  4. User B: Refresh the page. You should get an access error.*
  5. Explain to each other what's been done so far.

*Not seeing access revoked immediately? This is because "The IAM API is eventually consistent."
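
If you'd like to see what step 2.2 looks like outside the Console, here is a minimal sketch using the google-cloud-bigquery Python client. It isn't part of the lab; the project ID, dataset name, and CSV filename are placeholders.

```python
# Sketch of loading a CSV into a new BigQuery table programmatically.
# Assumes google-cloud-bigquery is installed and you're authenticated as User B.
from google.cloud import bigquery

client = bigquery.Client(project="user1-user2-lab-12")  # placeholder project ID

# Create a dataset to hold the table (no-op if it already exists).
client.create_dataset("lab_12_data", exists_ok=True)

# Configure the load job to read CSV and infer the schema automatically.
job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,  # skip the header row
    autodetect=True,
)

with open("sample.csv", "rb") as f:  # placeholder filename
    job = client.load_table_from_file(
        f, "user1-user2-lab-12.lab_12_data.sample", job_config=job_config
    )

job.result()  # wait for the load job to finish
print(f"Loaded {job.output_rows} rows")
```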


Part 2: IAM for services

We're going to deploy a Cloud Run Function that lists our BigQuery datasets.


Set up the Hello World Cloud Run Function

  1. Create a Function using the "inline editor". Set the following and leave the other defaults:
    • Service name: lab-12
    • Region: us-central1
    • Runtime: Python 3.12
    • Uncheck Use Cloud IAM to authenticate incoming requests.
    • Under Container(s), Volumes, Networking, Security -> Security, set the Service account to Default compute service account.
  2. It will give you some Hello World code. Click Save and redeploy.
  3. Deployment might take a few minutes. When it's done, try visiting the URL.
  4. View the Logs.
  5. Refresh the live URL a bunch of times.
  6. View the Metrics.
  7. Explain to each other what happened in this section.
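
For reference, the Hello World code the inline editor pre-fills looks roughly like the sketch below (the exact template Google generates may differ): it reads an optional name from the request and returns a greeting.

```python
# Approximation of the pre-filled Hello World Function; the real template
# may differ slightly.
import functions_framework


@functions_framework.http
def hello_http(request):
    """HTTP Cloud Run Function that greets the caller."""
    request_json = request.get_json(silent=True)
    request_args = request.args

    # Use the "name" parameter from the JSON body or query string, if present.
    if request_json and "name" in request_json:
        name = request_json["name"]
    elif request_args and "name" in request_args:
        name = request_args["name"]
    else:
        name = "World"
    return f"Hello {name}!"
```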

Custom Function

  1. Edit the source and replace the code with a function that lists BigQuery datasets (a rough sketch appears after this list).
  2. Click Save and redeploy.
  3. Deployment might take a minute. When it's done, try visiting the URL. It should show an empty list ([]), since the Function's service account doesn't have access to any BigQuery datasets yet.
  4. Grant BigQuery Data Viewer to the default compute service account (<project-number>-compute@developer.gserviceaccount.com).
  5. Refresh the Function URL. You should see your dataset listed.
  6. Explain to each other what happened in this section.
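
Here is a rough sketch of what the replacement Function might look like: it lists the BigQuery datasets visible to the Function's service account and returns them as JSON. The entry point name is just an example, and it assumes functions-framework and google-cloud-bigquery are in requirements.txt.

```python
# Sketch of a Function that lists BigQuery datasets. The function name
# ("list_datasets") is an example; set it to match the Function's entry point.
import json

import functions_framework
from google.cloud import bigquery


@functions_framework.http
def list_datasets(request):
    # The client authenticates as the Function's runtime service account,
    # so no key file is needed.
    client = bigquery.Client()
    datasets = [ds.dataset_id for ds in client.list_datasets()]
    return json.dumps(datasets), 200, {"Content-Type": "application/json"}
```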

Unlike when we set up Streamlit to talk to BigQuery, no key was needed here. Workloads within Google Cloud (Functions, in this case) run as a service account, and roles can be granted to that service account directly.
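
To make the contrast concrete, here's a sketch of the two authentication patterns (the key filename and project ID are placeholders, and the Streamlit lab may have loaded its credentials differently):

```python
from google.cloud import bigquery
from google.oauth2 import service_account

# Outside Google Cloud (e.g., the Streamlit app): credentials come from a key file.
creds = service_account.Credentials.from_service_account_file("key.json")  # placeholder path
external_client = bigquery.Client(credentials=creds, project="your-project-id")

# Inside Google Cloud (e.g., this Function): the runtime's service account is
# picked up automatically via Application Default Credentials, so no key is needed.
internal_client = bigquery.Client()
```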


Delete the Project to avoid charges.


Nothing needs to be submitted for this Lab.