Monday, August 30, 2021

GCP Short lab: launch an instance, Startup script & test logging via Cloud Shell

Intro

GCP Cloud Shell not only lets you execute and automate tasks around your cloud resources, it can also help you understand what happens behind the scenes when, for example, a VM is provisioned. It's also a good way to prepare for the GCP Cloud Engineer certification. Although it normally takes an ACloudGuru subscription to follow the course this quick lab comes from, they made the startup script freely available in their GitHub. That's all we need to demonstrate how to complete the task using gcloud commands, which was not covered in the original course.

Here’s a direct link to the shell script in their GitHub repo: gcp-cloud-engineer/compute-labs/worker-startup-script.sh

I. Lab purpose

In this exercise, we want to learn what a Compute Engine instance requires in order to run a script at launch that installs a logging agent, runs a stress test, ships the syslog events to Cloud Logging, and writes a status feedback file into a bucket. You'll see that underlying service accounts and permission scopes are what allow a machine to interact with other cloud services and APIs to get the job done. To sum it up:

  • We want a new project,
  • Launch a new GCE instance that runs a specific startup script,
  • Have the system logs shipped to Stackdriver (Cloud Logging),
  • Have a new GCS bucket for the resulting log file,
  • Have the log file (status feedback) appear in that new bucket after the instance finishes starting up,
  • All of this with no SSH access to the instance itself; the VM should handle everything on its own.

Google CLI setup

I used Cloud Shell to complete this lab. However, you can also run gcloud commands from your workstation via the Google Cloud SDK.

Main steps

  • Retrieve the Startup script
  • Create a new project
  • Create logs destination bucket
  • Enable GCE API
  • Create new GCE instance
    • Enable Scope to write to GCS
    • Set startup script
    • Set metadata to point to logs destination bucket
  • Monitoring Progress
    • Check Stackdriver logs
    • Check CPU graph
    • Check logs bucket



1. Create a New Project


  • From your Cloud Shell terminal.
    $ gcloud projects create gcs-gce-project-lab --name="GCS & GCE LAB" --labels=type=lab
    $ gcloud config set project gcs-gce-project-lab
    Link the new project with a billing account
  • There are two ways to link a project to a billing account: through the gcloud alpha command or through the gcloud beta command.
  • $ gcloud beta billing accounts list

    ACCOUNT_ID            NAME                      OPEN  MASTER_ACCOUNT_ID
    --------------------  ------------------------  ----  -----------------
    0X0X0X-0X0X0X-0X0X0X  Brokedba Billing account  True

    ** link the project to a billing account **

    $ gcloud alpha billing accounts projects link gcs-gce-project-lab \
        --billing-account=0X0X0X-0X0X0X-0X0X0X

    ** OR **
    $ gcloud beta billing projects link gcs-gce-project-lab --billing-account=0X0X0X-0X0X0X-0X0X0X
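
    ** Optional check: confirm the project is now linked (this assumes the gcloud beta billing component is available in your Cloud Shell) **
    $ gcloud beta billing projects describe gcs-gce-project-lab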


2. Enable GCE API

  • Unlike on other cloud platforms, most of the APIs are disabled by default when a project is created.
  • In our case, we need to enable the GCE APIs in order to create and launch VMs.
  • GCE APIs

    $ gcloud services enable compute.googleapis.com
    $ gcloud services enable computescanning.googleapis.com
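
  • Optionally, you can confirm the APIs are active by listing the enabled services:

    $ gcloud services list --enabled --project gcs-gce-project-lab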



3. Set default region/zone

  • This can be done at the current gcloud configuration level or at the project level
  • Active Config in Cloud Shell

    $ gcloud config set compute/region us-east1
    $ gcloud config set compute/zone us-east1-b

    Project level

    $ gcloud compute project-info add-metadata --metadata google-compute-default-region=us-east1,google-compute-default-zone=us-east1-b --project gcs-gce-project-lab
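
    ** Optional check: verify which defaults are now in effect **
    $ gcloud config get-value compute/zone
    $ gcloud compute project-info describe --project gcs-gce-project-lab | grep -A1 google-compute-default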



4. Service account

  • In GCP, a default Compute Engine service account is created for each project, and all future VMs can use it to interact with the rest of the platform through IAM permissions and scopes. We'll use it and grant our VM the extra scopes it needs.
  • The FORMAT is: PROJECT_NUMBER-compute@developer.gserviceaccount.com
  • We just run the commands below to retrieve the service account name:
  • PROJECT number

    $ gcloud projects describe gcs-gce-project-lab | grep projectNumber

    projectNumber: '521829558627'

    Derived Service Account name

    => 521829558627-compute@developer.gserviceaccount.com
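
    ** Optional check: the derived name can also be confirmed by listing the project's service accounts **
    $ gcloud iam service-accounts list --project gcs-gce-project-lab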





5. Download the Startup script from GitHub

  • This script is responsible for updating Linux packages, installing a logging agent, running a stress test, and copying the resulting syslog output into a GCS bucket. `lab-logs-bucket` is the metadata key pointing to the bucket we'll be creating.

  • But first, we need to download it locally so we can edit it and call it when creating the compute instance later on.
  • Startup script download

    $ wget https://raw.githubusercontent.com/ACloudGuru/gcp-cloud-engineer/master/compute-labs/worker-startup-script.sh
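
    In a nutshell, the script does something along these lines. This is only an illustrative sketch, not the actual ACloudGuru script; the logging-agent install URL shown is the one documented by Google and is worth double-checking before use:

    #!/bin/bash
    # Startup scripts run as root, so no sudo is needed.
    # Read the destination bucket from the custom metadata key (see the VM metadata note below)
    LOG_BUCKET=$(curl -s -H "Metadata-Flavor: Google" \
      "http://metadata.google.internal/computeMetadata/v1/instance/attributes/lab-logs-bucket")
    # Update packages and install the stress tool
    apt-get update && apt-get install -y stress
    # Install the Cloud Logging agent so syslog events get shipped to Stackdriver
    curl -sSO https://dl.google.com/cloudagents/add-logging-agent-repo.sh
    bash add-logging-agent-repo.sh --also-install
    # Generate CPU load for a couple of minutes
    stress --cpu 1 --timeout 120
    # Copy the syslog into the logs bucket as the status/feedback file
    gsutil cp /var/log/syslog "${LOG_BUCKET}/"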

    VM metadata

    Every compute instance stores its metadata on a metadata server. Your VM automatically has access to the metadata server API without any additional authorization. Metadata is stored as key:value pairs and there are two types: default and custom. In our example, the bucket name `gs://gcs-gce-bucket` is stored under the custom metadata key `lab-logs-bucket`, which our script queries during startup.
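
    For example, from inside the VM, both default and custom keys can be read with a plain HTTP call to the metadata server:

    $ curl -H "Metadata-Flavor: Google" "http://metadata.google.internal/computeMetadata/v1/instance/name"
    $ curl -H "Metadata-Flavor: Google" "http://metadata.google.internal/computeMetadata/v1/instance/attributes/lab-logs-bucket"
    gs://gcs-gce-bucket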

6. Create the GCS bucket

  • The bucket name must match the value we'll set in the `lab-logs-bucket` metadata key that worker-startup-script.sh reads. Choose a globally unique name.
  • logs bucket

    $ gsutil mb -l us-east1 -p gcs-gce-project-lab gs://gcs-gce-bucket
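
    ** Optional check: verify the bucket and its location **
    $ gsutil ls -L -b gs://gcs-gce-bucket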


7. Create GCE instance and run Startup script

  • We are finally ready to give this test a go and monitor the progress of each task, including the metadata and the startup script.
  • The command below is wrapped with backslashes for readability; you can also run it as a single line.
  • Instance creation

    $ gcloud compute instances create gcs-gce-vm \
        --metadata lab-logs-bucket=gs://gcs-gce-bucket \
        --metadata-from-file startup-script=./worker-startup-script.sh \
        --machine-type=f1-micro \
        --image-family debian-10 --image-project debian-cloud \
        --service-account 521829558627-compute@developer.gserviceaccount.com \
        --scopes storage-rw,logging-write,monitoring-write,pubsub,service-management,service-control,trace

    NAME        ZONE        MACHINE_TYPE  INTERNAL_IP  EXTERNAL_IP   STATUS
    ----------  ----------  ------------  -----------  ------------  -------
    gcs-gce-vm  us-east1-b  f1-micro      10.100.0.1   34.72.95.120  RUNNING
    Notice the write scope on GCS (storage-rw), which will allow our VM to write the log file into the logs bucket.
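
  • While the instance boots, you can follow the startup script output from Cloud Shell without SSH, for example via the serial console log:

    $ gcloud compute instances get-serial-port-output gcs-gce-vm --zone=us-east1-b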

Final results

  • System logs available in Stackdriver Logs

  • GCS bucket created and the log file appears in the bucket after the instance finishes starting up

  • No SSH access needed to the instance
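
  • Each of these results can also be confirmed from Cloud Shell; the logging filter below is just one way of querying the instance's logs:

    $ gcloud logging read 'resource.type="gce_instance"' --project gcs-gce-project-lab --limit 10
    $ gsutil ls gs://gcs-gce-bucket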





CONCLUSION 

    • We learned that we can automate startup and shutdown scripts without ever needing to SSH into the instance
    • We learned more about the scopes a VM needs in order to interact with Google Cloud Storage through its service account
    • Using default or custom service accounts can efficiently streamline the tasks performed by VMs or services without human intervention
    • Feel free to try the lab yourself, and remember to change the name of the bucket since bucket names are globally unique

Thanks for reading!
