2024 iThome 鐵人賽 DAY 22
  • Create a Service Account for the Cloud Function

    # Create a service account for the Cloud Function to use
    resource "google_service_account" "function_account" {
      account_id   = "cloud-function-gke-trigger"
      display_name = "Cloud Function GKE Trigger"
    }
    
    # Grant the service account the necessary permissions
    resource "google_project_iam_member" "function_account_roles" {
      project = var.project_id
      role    = "roles/container.developer"
      member  = "serviceAccount:${google_service_account.function_account.email}"
    }
    

    Download the corresponding key from the console so the Cloud Function can be developed and tested locally; a minimal local check of the key is sketched below.
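
    A minimal sketch for checking the downloaded key locally; the key path key.json is a hypothetical placeholder, not a value from this setup:

    import os

    from google.auth import default as google_auth_default

    # Point the Google client libraries at the key downloaded from the console.
    # "key.json" is a hypothetical path; use wherever the key was saved.
    os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = "key.json"

    # google.auth.default() reads GOOGLE_APPLICATION_CREDENTIALS and returns
    # the service account credentials plus the project they belong to.
    credentials, project_id = google_auth_default()
    print(credentials.service_account_email)
    print(project_id)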

  • Write the Python code that will trigger the job. Because the cluster is private, it is hard to develop and test fully on a local machine, so for now we only verify that the service account can fetch the cluster config (a sketch of the eventual trigger step follows the code below).

    import os
    from dotenv import load_dotenv
    from google.cloud import container_v1
    # client/config are unused for now; they come into play once the job-trigger step is added
    from kubernetes import client, config

    # Load the environment variables from the .env file
    load_dotenv()

    def trigger_gke_job(request=None):
        # Read the environment variables
        project_id = os.getenv('GOOGLE_CLOUD_PROJECT')
        region = os.getenv('GKE_REGION')
        cluster_name = os.getenv('GKE_CLUSTER')

        # print(os.getenv("GOOGLE_APPLICATION_CREDENTIALS"))

        # Authenticate and fetch the GKE cluster details
        container_client = container_v1.ClusterManagerClient()
        cluster_full_name = f"projects/{project_id}/locations/{region}/clusters/{cluster_name}"
        response = container_client.get_cluster(name=cluster_full_name)
        print(response)

        # An HTTP-triggered function must return a response body
        return "OK"
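
    Once the config fetch above works, the trigger step itself would build a Kubernetes client from the returned cluster object and create a Job. A minimal sketch under that assumption; make_batch_api, create_job, and the job name/image are hypothetical names for illustration:

    import base64
    import tempfile

    import google.auth
    import google.auth.transport.requests
    from kubernetes import client

    def make_batch_api(cluster):
        # Exchange the service account credentials for a bearer token
        credentials, _ = google.auth.default(
            scopes=["https://www.googleapis.com/auth/cloud-platform"]
        )
        credentials.refresh(google.auth.transport.requests.Request())

        # Point the Kubernetes client at the GKE endpoint, trusting its CA cert
        configuration = client.Configuration()
        configuration.host = f"https://{cluster.endpoint}"
        configuration.api_key = {"authorization": f"Bearer {credentials.token}"}
        ca_file = tempfile.NamedTemporaryFile(delete=False, suffix=".crt")
        ca_file.write(base64.b64decode(cluster.master_auth.cluster_ca_certificate))
        ca_file.close()
        configuration.ssl_ca_cert = ca_file.name
        return client.BatchV1Api(client.ApiClient(configuration))

    def create_job(batch_api, name="hello-job", image="busybox"):
        # Build a one-off Job with a single container and submit it
        job = client.V1Job(
            metadata=client.V1ObjectMeta(name=name),
            spec=client.V1JobSpec(
                template=client.V1PodTemplateSpec(
                    spec=client.V1PodSpec(
                        restart_policy="Never",
                        containers=[client.V1Container(name=name, image=image)],
                    )
                )
            ),
        )
        batch_api.create_namespaced_job(namespace="default", body=job)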
    
  • Deploy the Cloud Function (a note on the packaged dependencies follows the Terraform below)

    
    # Create a Cloud Storage bucket to store the function code
    resource "google_storage_bucket" "function_bucket" {
      name     = "xxx-gke-job-trigger-function"
      location = "ASIA-EAST1"
    }
    
    # Package the function code and upload it to the bucket
    data "archive_file" "function_zip" {
      type        = "zip"
      output_path = "${path.module}/function.zip"
      source_dir  = "${path.module}/function_code"
    }
    
    resource "google_storage_bucket_object" "function_code" {
      name   = "function-${data.archive_file.function_zip.output_md5}.zip"
      bucket = google_storage_bucket.function_bucket.name
      source = data.archive_file.function_zip.output_path
    }
    
    # Deploy the Cloud Function
    resource "google_cloudfunctions_function" "gke_job_trigger" {
      name        = "xxx-gke-job-trigger"
      description = "Triggers a job in xxx GKE cluster"
      runtime     = "python39"

      available_memory_mb   = 256
      source_archive_bucket = google_storage_bucket.function_bucket.name
      source_archive_object = google_storage_bucket_object.function_code.name
      trigger_http          = true
      entry_point           = "trigger_gke_job"
      service_account_email = google_service_account.function_account.email

      # Route egress through the VPC connector (defined in the next step)
      # so the function can reach the private GKE control plane
      vpc_connector = google_vpc_access_connector.connector.id

      environment_variables = {
        # Newer runtimes do not set GOOGLE_CLOUD_PROJECT automatically and the
        # function code reads it, so set it explicitly from the existing variable
        GOOGLE_CLOUD_PROJECT = var.project_id
        GKE_CLUSTER          = "portal-uat-asia-east1-gke"
        GKE_REGION           = "asia-east1"
      }
    }
    
    # Allow unauthenticated invocation (optional, depending on your security requirements)
    resource "google_cloudfunctions_function_iam_member" "invoker" {
      project        = google_cloudfunctions_function.gke_job_trigger.project
      region         = google_cloudfunctions_function.gke_job_trigger.region
      cloud_function = google_cloudfunctions_function.gke_job_trigger.name
    
      role   = "roles/cloudfunctions.invoker"
      member = "allUsers"
    }
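
    One thing the archive step above assumes is that function_code/ also contains a requirements.txt, so the build can install the imports used earlier; a minimal sketch (leaving the versions unpinned is an assumption):

    # function_code/requirements.txt
    google-cloud-container
    kubernetes
    python-dotenv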
    
  • Because a Cloud Function has no concept of living in a VPC, an additional VPC connector has to be provided; the vpc_connector argument added to the function resource above points at it

    
    # Create the VPC connector
    resource "google_vpc_access_connector" "connector" {
      name          = "vpc-con"
      ip_cidr_range = "10.50.0.0/28"
      network       = "portal-vpc"
      region        = "asia-east1"

      # Minimum throughput: 200 Mbps (the lowest allowed value)
      min_throughput = 200

      # Maximum throughput: 1000 Mbps (the highest allowed value);
      # adjust as needed, but it must be a multiple of 100 and not below min_throughput
      max_throughput = 1000
    }
    
  • Trigger the Cloud Function from the console to see how it behaves (a quick HTTP smoke test is sketched below as well)
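
    For a quick check outside the console, the HTTP endpoint can also be hit directly. The URL below is a hypothetical placeholder for the trigger URL shown on the function's detail page; no Authorization header is needed because allUsers was granted the invoker role above:

    import requests

    # Hypothetical trigger URL; copy the real one from the function's detail page
    FUNCTION_URL = "https://asia-east1-<project-id>.cloudfunctions.net/xxx-gke-job-trigger"

    resp = requests.get(FUNCTION_URL, timeout=60)
    print(resp.status_code, resp.text)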

