
Associate-Cloud-Engineer Google Cloud Certified - Associate Cloud Engineer Questions and Answers

Questions 4

You need to create a custom VPC with a single subnet. The subnet’s range must be as large as possible. Which range should you use?

Options:

A.

0.0.0.0/0

B.

10.0.0.0/8

C.

172.16.0.0/12

D.

192.168.0.0/16
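
For reference, a custom-mode VPC with a single subnet could be created with commands along these lines; the network name, subnet name, and region are placeholders, and 10.0.0.0/8 is the largest of the private ranges listed:

gcloud compute networks create my-custom-vpc --subnet-mode=custom
gcloud compute networks subnets create my-large-subnet --network=my-custom-vpc --region=us-central1 --range=10.0.0.0/8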

Questions 5

Your application is running on Google Cloud in a managed instance group (MIG). You see errors in Cloud Logging for one VM indicating that one of its processes is not responsive. You want to replace this VM in the MIG quickly. What should you do?

Options:

A.

Select the MIG from the Compute Engine console and, in the menu, select Replace VMs.

B.

Use the gcloud compute instance-groups managed recreate-instances command to recreate the VM.

C.

Use the gcloud compute instances update command with a REFRESH action for the VM.

D.

Update and apply the instance template of the MIG.
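
For context, the recreate-instances command mentioned in option B is invoked against the MIG and the specific instance; the group name, instance name, and zone below are placeholders:

gcloud compute instance-groups managed recreate-instances my-mig --instances=my-vm-1 --zone=us-central1-a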

Questions 6

Your company publishes large files on an Apache web server that runs on a Compute Engine instance. The Apache web server is not the only application running in the project. You want to receive an email when the egress network costs for the server exceed 100 dollars for the current month as measured by Google Cloud Platform (GCP). What should you do?

Options:

A.

Set up a budget alert on the project with an amount of 100 dollars, a threshold of 100%, and notification type of “email.”

B.

Set up a budget alert on the billing account with an amount of 100 dollars, a threshold of 100%, and notification type of “email.”

C.

Export the billing data to BigQuery. Create a Cloud Function that uses BigQuery to sum the egress network costs of the exported billing data for the Apache web server for the current month and sends an email if it is over 100 dollars. Schedule the Cloud Function using Cloud Scheduler to run hourly.

D.

Use the Stackdriver Logging Agent to export the Apache web server logs to Stackdriver Logging. Create a Cloud Function that uses BigQuery to parse the HTTP response log data in Stackdriver for the current month and sends an email if the size of all HTTP responses, multiplied by current GCP egress prices, totals over 100 dollars. Schedule the Cloud Function using Cloud Scheduler to run hourly.
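
For reference, a budget with an email notification at 100% of a 100-dollar amount is created against a billing account; the sketch below uses placeholder IDs, and the exact flags can vary between gcloud releases:

gcloud billing budgets create --billing-account=000000-AAAAAA-BBBBBB --display-name="egress-budget" --budget-amount=100USD --threshold-rule=percent=1.0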

Questions 7

You created a cluster.yaml file containing:

resources:
- name: cluster
  type: container.v1.cluster
  properties:
    zone: europe-west1-b
    cluster:
      description: My GCP ACE cluster
      initialNodeCount: 2

You want to use Cloud Deployment Manager to create this cluster in GKE. What should you do?

Options:

A.

gcloud deployment-manager deployments create my-gcp-ace-cluster --config cluster.yaml

B.

gcloud deployment-manager deployments create my-gcp-ace-cluster --type container.v1.cluster --config cluster.yaml

C.

gcloud deployment-manager deployments apply my-gcp-ace-cluster --type container.v1.cluster --config cluster.yaml

D.

gcloud deployment-manager deployments apply my-gcp-ace-cluster --config cluster.yaml

Questions 8

You have an application that receives SSL-encrypted TCP traffic on port 443. Clients for this application are located all over the world. You want to minimize latency for the clients. Which load balancing option should you use?

Options:

A.

HTTPS Load Balancer

B.

Network Load Balancer

C.

SSL Proxy Load Balancer

D.

Internal TCP/UDP Load Balancer. Add a firewall rule allowing ingress traffic from 0.0.0.0/0 on the target instances.

Questions 9

Your company stores data from multiple sources that have different data storage requirements. These data include:

1. Customer data that is structured and read with complex queries

2. Historical log data that is large in volume and accessed infrequently

3. Real-time sensor data with high-velocity writes, which needs to be available for analysis but can tolerate some data loss

You need to design the most cost-effective storage solution that fulfills all data storage requirements. What should you do?

Options:

A.

Use Spanner for all data.

B.

Use Cloud SQL for customer data, Cloud Storage (Coldline) for historical logs, and BigQuery for sensor data.

C.

Use Cloud SQL for customer data, Cloud Storage (Archive) for historical logs, and Bigtable for sensor data.

D.

Use Firestore for customer data, Cloud Storage (Nearline) for historical logs, and Bigtable for sensor data.

Questions 10

You have created an application that is packaged into a Docker image. You want to deploy the Docker image as a workload on Google Kubernetes Engine. What should you do?

Options:

A.

Upload the image to Cloud Storage and create a Kubernetes Service referencing the image.

B.

Upload the image to Cloud Storage and create a Kubernetes Deployment referencing the image.

C.

Upload the image to Container Registry and create a Kubernetes Service referencing the image.

D.

Upload the image to Container Registry and create a Kubernetes Deployment referencing the image.
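
As an illustration of the Container Registry workflow, assuming a locally built image and a project named my-project (both placeholders):

docker tag my-app gcr.io/my-project/my-app:v1
docker push gcr.io/my-project/my-app:v1
kubectl create deployment my-app --image=gcr.io/my-project/my-app:v1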

Questions 11

Your web application is hosted on Cloud Run and needs to query a Cloud SQL database. Every morning during a traffic spike, you notice API quota errors in Cloud SQL logs. The project has already reached the maximum API quota. You want to make a configuration change to mitigate the issue. What should you do?

Options:

A.

Modify the minimum number of Cloud Run instances.

B.

Set a minimum concurrent requests environment variable for the application.

C.

Modify the maximum number of Cloud Run instances.

D.

Use traffic splitting.

Questions 12

You created a Kubernetes deployment by running kubectl run nginx --image=nginx --labels=app=prod. Your Kubernetes cluster is also used by a number of other deployments. How can you find the identifier of the pods for this nginx deployment?

Options:

A.

kubectl get deployments --output=pods

B.

gcloud get pods --selector="app=prod"

C.

kubectl get pods -l "app=prod"

D.

gcloud list gke-deployments -filter={pod }
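
For reference, kubectl filters Pods by label with the -l (or --selector) flag, using the label applied in the question:

kubectl get pods -l app=prod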

Questions 13

All development (dev) teams in your organization are located in the United States. Each dev team has its own Google Cloud project. You want to restrict access so that each dev team can only create cloud resources in the United States (US). What should you do?

Options:

A.

Create a folder to contain all the dev projects. Create an organization policy to limit resources in US locations.

B.

Create an organization to contain all the dev projects. Create an Identity and Access Management (IAM) policy to limit the resources in US regions.

C.

Create an Identity and Access Management

D.

Create an Identity and Access Management (IAM) policy to restrict the resource locations in all dev projects. Apply the policy to all dev roles.

Questions 14

Your coworker has helped you set up several configurations for gcloud. You've noticed that you're running commands against the wrong project. Being new to the company, you haven't yet memorized any of the projects. With the fewest steps possible, what's the fastest way to switch to the correct configuration?

Options:

A.

Run gcloud configurations list followed by gcloud configurations activate .

B.

Run gcloud config list followed by gcloud config activate.

C.

Run gcloud config configurations list followed by gcloud config configurations activate.

D.

Re-authenticate with the gcloud auth login command and select the correct configurations on login.
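
As a quick reference, named gcloud configurations are managed with the gcloud config configurations command group; the configuration name below is a placeholder:

gcloud config configurations list
gcloud config configurations activate my-other-config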

Questions 15

You are hosting an application from Compute Engine virtual machines (VMs) in us-central1-a. You want to adjust your design to support the failure of a single Compute Engine zone, eliminate downtime, and minimize cost. What should you do?

Options:

A.

Create Compute Engine resources in us-central1-b. Balance the load across both us-central1-a and us-central1-b.

B.

Create a Managed Instance Group and specify us-central1-a as the zone. Configure the Health Check with a short Health Interval.

C.

Create an HTTP(S) Load Balancer. Create one or more global forwarding rules to direct traffic to your VMs.

D.

Perform regular backups of your application. Create a Cloud Monitoring Alert and be notified if your application becomes unavailable. Restore from backups when notified.

Questions 16

You need to extract text from audio files by using the Speech-to-Text API. The audio files are pushed to a Cloud Storage bucket. You need to implement a fully managed, serverless compute solution that requires authentication and aligns with Google-recommended practices. You want to automate the call to the API by submitting each file to the API as the audio file arrives in the bucket. What should you do?

Options:

A.

Run a Kubernetes job to scan the bucket regularly for incoming files, and call the Speech-to-Text API for each unprocessed file.

B.

Create an App Engine standard environment triggered by Cloud Storage bucket events to submit the file URI to the Google Speech-to-Text API.

C.

Run a Python script by using a Linux cron job in Compute Engine to scan the bucket regularly for incoming files, and call the Speech-to-Text API for each unprocessed file.

D.

Create a Cloud Function triggered by Cloud Storage bucket events to submit the file URI to the Google Speech-to-Text API.
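
For context, a Cloud Function triggered by new objects in a bucket can be deployed roughly as follows; the function name, runtime, entry point, and bucket name are placeholders:

gcloud functions deploy transcribe-audio --runtime=python310 --entry-point=transcribe --trigger-bucket=my-audio-bucket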

Questions 17

Your company set up a complex organizational structure on Google Cloud Platform. The structure includes hundreds of folders and projects. Only a few team members should be able to view the hierarchical structure. You need to assign minimum permissions to these team members, and you want to follow Google-recommended practices. What should you do?

Options:

A.

Add the users to roles/browser role.

B.

Add the users to roles/iam.roleViewer role.

C.

Add the users to a group, and add this group to roles/browser role.

D.

Add the users to a group, and add this group to roles/iam.roleViewer role.
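
For reference, granting a role to a group at the organization level typically looks like the command below; the organization ID and group address are placeholders:

gcloud organizations add-iam-policy-binding 123456789012 --member="group:structure-viewers@example.com" --role="roles/browser"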

Questions 18

You create a Deployment with 2 replicas in a Google Kubernetes Engine cluster that has a single preemptible node pool. After a few minutes, you use kubectl to examine the status of your Pods and observe that one of them is still in Pending status:

(Exhibit: kubectl output showing one Pod in Pending status; image not reproduced.)

What is the most likely cause?

Options:

A.

The pending Pod's resource requests are too large to fit on a single node of the cluster.

B.

Too many Pods are already running in the cluster, and there are not enough resources left to schedule the pending Pod.

C.

The node pool is configured with a service account that does not have permission to pull the container image used by the pending Pod.

D.

The pending Pod was originally scheduled on a node that has been preempted between the creation of the Deployment and your verification of the Pods’ status. It is currently being rescheduled on a new node.

Questions 19

Your organization needs to grant users access to query datasets in BigQuery but prevent them from accidentally deleting the datasets. You want a solution that follows Google-recommended practices. What should you do?

Options:

A.

Add users to the roles/bigquery.user role only, instead of roles/bigquery.dataOwner.

B.

Add users to the roles/bigquery.dataEditor role only, instead of roles/bigquery.dataOwner.

C.

Create a custom role by removing delete permissions, and add users to that role only.

D.

Create a custom role by removing delete permissions. Add users to a group, and then add the group to the custom role.

Questions 20

You are migrating your on-premises workload to Google Cloud. Your company is implementing its Cloud Billing configuration and requires access to a granular breakdown of its Google Cloud costs. You need to ensure that the Cloud Billing datasets are available in BigQuery so you can conduct a detailed analysis of costs. What should you do?

Options:

A.

Enable the BigQuery API and ensure that the BigQuery User IAM role is selected. Change the BigQuery dataset to select a data location.

B.

Create a Cloud Billing account. Enable the BigQuery Data Transfer Service API to export pricing data.

C.

Enable Cloud Billing data export to BigQuery when you create a Cloud Billing account.

D.

Enable Cloud Billing on the project and link a Cloud Billing account. Then view the billing data table in the BigQuery dataset.

Questions 21

Your auditor wants to view your organization's use of data in Google Cloud. The auditor is most interested in auditing who accessed data in Cloud Storage buckets. You need to help the auditor access the data they need. What should you do?

Options:

A.

Assign the appropriate permissions, and then use Cloud Monitoring to review metrics

B.

Use the export logs API to provide the Admin Activity Audit Logs in the format they want

C.

Turn on Data Access Logs for the buckets they want to audit, and then build a query in the log viewer that filters on Cloud Storage

D.

Assign the appropriate permissions, and then create a Data Studio report on Admin Activity Audit Logs
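
As an illustration, once Data Access audit logs are enabled for Cloud Storage, they can be queried from the command line roughly as follows; the project ID is a placeholder and the filter is a sketch of the kind used in the log viewer:

gcloud logging read 'logName:"cloudaudit.googleapis.com%2Fdata_access" AND resource.type="gcs_bucket"' --project=my-project --limit=20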

Questions 22

You have deployed an application on a Compute Engine instance. An external consultant needs to access the Linux-based instance. The consultant is connected to your corporate network through a VPN connection, but the consultant has no Google account. What should you do?

Options:

A.

Instruct the external consultant to use the gcloud compute ssh command line tool by using Identity-Aware Proxy to access the instance.

B.

Instruct the external consultant to use the gcloud compute ssh command line tool by using the public IP address of the instance to access it.

C.

Instruct the external consultant to generate an SSH key pair, and request the public key from the consultant. Add the public key to the instance yourself, and have the consultant access the instance through SSH with their private key.

D.

Instruct the external consultant to generate an SSH key pair, and request the private key from the consultant. Add the private key to the instance yourself, and have the consultant access the instance through SSH with their public key.

Questions 23

Your company is moving its entire workload to Compute Engine. Some servers should be accessible through the Internet, and other servers should only be accessible over the internal network. All servers need to be able to talk to each other over specific ports and protocols. The current on-premises network relies on a demilitarized zone (DMZ) for the public servers and a Local Area Network (LAN) for the private servers. You need to design the networking infrastructure on Google Cloud to match these requirements. What should you do?

Options:

A.

1. Create a single VPC with a subnet for the DMZ and a subnet for the LAN. 2. Set up firewall rules to open up relevant traffic between the DMZ and the LAN subnets, and another firewall rule to allow public ingress traffic for the DMZ.

B.

1. Create a single VPC with a subnet for the DMZ and a subnet for the LAN. 2. Set up firewall rules to open up relevant traffic between the DMZ and the LAN subnets, and another firewall rule to allow public egress traffic for the DMZ.

C.

1. Create a VPC with a subnet for the DMZ and another VPC with a subnet for the LAN. 2. Set up firewall rules to open up relevant traffic between the DMZ and the LAN subnets, and another firewall rule to allow public ingress traffic for the DMZ.

D.

1. Create a VPC with a subnet for the DMZ and another VPC with a subnet for the LAN. 2. Set up firewall rules to open up relevant traffic between the DMZ and the LAN subnets, and another firewall rule to allow public egress traffic for the DMZ.

Questions 24

You need to produce a list of the enabled Google Cloud Platform APIs for a GCP project using the gcloud command line in the Cloud Shell. The project name is my-project. What should you do?

Options:

A.

Run gcloud projects list to get the project ID, and then run gcloud services list --project .

B.

Run gcloud init to set the current project to my-project, and then run gcloud services list --available.

C.

Run gcloud info to view the account value, and then run gcloud services list --account .

D.

Run gcloud projects describe to verify the project value, and then run gcloud services list --available.
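
For reference, the two-step flow of listing projects and then listing the services enabled in one of them looks like this; my-project-id is a placeholder for the ID returned by the first command:

gcloud projects list
gcloud services list --enabled --project=my-project-id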

Questions 25

You are deploying an application to Google Kubernetes Engine (GKE) that needs to call an external third-party API. You need to provide the external API vendor with a list of IP addresses for their firewall to allow traffic from your application. You want to follow Google-recommended practices and avoid any risk of interrupting traffic to the API due to IP address changes. What should you do?

Options:

A.

Configure your GKE cluster with one node, and set the node to have a static external IP address. Ensure that the GKE cluster autoscaler is off. Send the external IP address of the node to the vendor to be added to the allowlist.

B.

Configure your GKE cluster with private nodes. Configure a Cloud NAT instance with static IP addresses. Provide these IP addresses to the vendor to be added to the allowlist.

C.

Configure your GKE cluster with public nodes. Write a Cloud Function that pulls the public IP addresses of each node in the cluster. Trigger the function to run every day with Cloud Scheduler. Send the list to the vendor by email every day.

D.

Configure your GKE cluster with private nodes. Configure a Cloud NAT instance with dynamic IP addresses. Provide these IP addresses to the vendor to be added to the allowlist.

Questions 26

During a recent audit of your existing Google Cloud resources, you discovered several users with email addresses outside of your Google Workspace domain.

You want to ensure that your resources are only shared with users whose email addresses match your domain. You need to remove any mismatched users, and you want to avoid having to audit your resources to identify mismatched users. What should you do?

Options:

A.

Create a Cloud Scheduler task to regularly scan your projects and delete mismatched users.

B.

Create a Cloud Scheduler task to regularly scan your resources and delete mismatched users.

C.

Set an organizational policy constraint to limit identities by domain to automatically remove mismatched users.

D.

Set an organizational policy constraint to limit identities by domain, and then retroactively remove the existing mismatched users.
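
For context, the constraint referenced in options C and D is constraints/iam.allowedPolicyMemberDomains; setting it could look roughly like the command below (the organization ID and Workspace customer ID are placeholders), and any existing mismatched members still have to be removed separately:

gcloud resource-manager org-policies allow constraints/iam.allowedPolicyMemberDomains C01234567 --organization=123456789012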

Questions 27

You are developing a new application and are looking for a Jenkins installation to build and deploy your source code. You want to automate the installation as quickly and easily as possible. What should you do?

Options:

A.

Deploy Jenkins through the Google Cloud Marketplace.

B.

Create a new Compute Engine instance. Run the Jenkins executable.

C.

Create a new Kubernetes Engine cluster. Create a deployment for the Jenkins image.

D.

Create an instance template with the Jenkins executable. Create a managed instance group with this template.

Questions 28

You need to update a deployment in Deployment Manager without any resource downtime in the deployment. Which command should you use?

Options:

A.

gcloud deployment-manager deployments create --config

B.

gcloud deployment-manager deployments update --config

C.

gcloud deployment-manager resources create --config

D.

gcloud deployment-manager resources update --config

Questions 29

You have a managed instance group comprised of preemptible VMs. All of the VMs keep deleting and recreating themselves every minute. What is a possible cause of this behavior?

Options:

A.

Your zonal capacity is limited, causing all preemptible VMs to be shut down to recover capacity. Try deploying your group to another zone.

B.

You have hit your instance quota for the region.

C.

Your managed instance group's VMs are toggled to only last 1 minute in preemptible settings.

D.

Your managed instance group's health check is repeatedly failing, due to either a misconfigured health check or misconfigured firewall rules not allowing the health check to access the instance.

Questions 30

For analysis purposes, you need to send all the logs from all of your Compute Engine instances to a BigQuery dataset called platform-logs. You have already installed the Stackdriver Logging agent on all the instances. You want to minimize cost. What should you do?

Options:

A.

1. Give the BigQuery Data Editor role on the platform-logs dataset to the service accounts used by your instances. 2. Update your instances' metadata to add the following value: logs-destination: bq://platform-logs.

B.

1. In Stackdriver Logging, create a logs export with a Cloud Pub/Sub topic called logs as a sink. 2. Create a Cloud Function that is triggered by messages in the logs topic. 3. Configure that Cloud Function to drop logs that are not from Compute Engine and to insert Compute Engine logs in the platform-logs dataset.

C.

1. In Stackdriver Logging, create a filter to view only Compute Engine logs. 2. Click Create Export. 3. Choose BigQuery as Sink Service, and the platform-logs dataset as Sink Destination.

D.

1. Create a Cloud Function that has the BigQuery User role on the platform-logs dataset. 2. Configure this Cloud Function to create a BigQuery job that executes this query: INSERT INTO dataset.platform-logs (timestamp, log) SELECT timestamp, log FROM compute.logs WHERE timestamp > DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY) 3. Use Cloud Scheduler to trigger this Cloud Function once a day.
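
For reference, a Cloud Logging export (sink) to BigQuery filtered to Compute Engine logs can be created along these lines; the project ID is a placeholder, and an underscore is used in the dataset name because BigQuery dataset IDs cannot contain hyphens:

gcloud logging sinks create platform-logs-sink bigquery.googleapis.com/projects/my-project/datasets/platform_logs --log-filter='resource.type="gce_instance"'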

Questions 31

You have an application running in Google Kubernetes Engine (GKE) with cluster autoscaling enabled. The application exposes a TCP endpoint. There are several replicas of this application. You have a Compute Engine instance in the same region, but in another Virtual Private Cloud (VPC), called gce-network, that has no overlapping IP ranges with the first VPC. This instance needs to connect to the application on GKE. You want to minimize effort. What should you do?

Options:

A.

1. In GKE, create a Service of type LoadBalancer that uses the application's Pods as backend. 2. Set the service's externalTrafficPolicy to Cluster. 3. Configure the Compute Engine instance to use the address of the load balancer that has been created.

B.

1. In GKE, create a Service of type NodePort that uses the application's Pods as backend. 2. Create a Compute Engine instance called proxy with 2 network interfaces, one in each VPC. 3. Use iptables on this instance to forward traffic from gce-network to the GKE nodes. 4. Configure the Compute Engine instance to use the address of proxy in gce-network as the endpoint.

C.

1. In GKE, create a Service of type LoadBalancer that uses the application's Pods as backend. 2. Add an annotation to this service: cloud.google.com/load-balancer-type: Internal. 3. Peer the two VPCs together. 4. Configure the Compute Engine instance to use the address of the load balancer that has been created.

D.

1. In GKE, create a Service of type LoadBalancer that uses the application's Pods as backend. 2. Add a Cloud Armor Security Policy to the load balancer that whitelists the internal IPs of the MIG's instances. 3. Configure the Compute Engine instance to use the address of the load balancer that has been created.

Questions 32

Your team is building a website that handles votes from a large user population. The incoming votes will arrive at various rates. You want to optimize the storage and processing of the votes. What should you do?

Options:

A.

Save the incoming votes to Firestore. Use Cloud Scheduler to trigger a Cloud Functions instance to periodically process the votes.

B.

Use a dedicated instance to process the incoming votes. Send the votes directly to this instance.

C.

Save the incoming votes to a JSON file on Cloud Storage. Process the votes in a batch at the end of the day.

D.

Save the incoming votes to Pub/Sub. Use the Pub/Sub topic to trigger a Cloud Functions instance to process the votes.

Questions 33

A team of data scientists infrequently needs to use a Google Kubernetes Engine (GKE) cluster that you manage. They require GPUs for some long-running, non-restartable jobs. You want to minimize cost. What should you do?

Options:

A.

Enable node auto-provisioning on the GKE cluster.

B.

Create a VerticalPodAutoscaler for those workloads.

C.

Create a node pool with preemptible VMs and GPUs attached to those VMs.

D.

Create a node pool of instances with GPUs, and enable autoscaling on this node pool with a minimum size of 1.
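
For illustration only, a separate GPU node pool combining the preemptible and autoscaling ideas from options C and D could be added roughly like this; the cluster name, zone, GPU type, and sizing are placeholders:

gcloud container node-pools create gpu-pool --cluster=my-cluster --zone=us-central1-a --preemptible --accelerator=type=nvidia-tesla-t4,count=1 --enable-autoscaling --num-nodes=1 --min-nodes=0 --max-nodes=3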

Questions 34

You are configuring service accounts for an application that spans multiple projects. Virtual machines (VMs) running in the web-applications project need access to BigQuery datasets in crm-databases-proj. You want to follow Google-recommended practices to give access to the service account in the web-applications project. What should you do?

Options:

A.

Give “project owner” for web-applications appropriate roles to crm-databases-proj

B.

Give “project owner” role to crm-databases-proj and the web-applications project.

C.

Give “project owner” role to crm-databases-proj and bigquery.dataViewer role to web-applications.

D.

Give bigquery.dataViewer role to crm-databases-proj and appropriate roles to web-applications.
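
As a sketch of the cross-project grant in option D, assuming a service account from the web-applications project (the account name is a placeholder):

gcloud projects add-iam-policy-binding crm-databases-proj --member="serviceAccount:web-app-sa@web-applications.iam.gserviceaccount.com" --role="roles/bigquery.dataViewer"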

Questions 35

You just installed the Google Cloud CLI on your new corporate laptop. You need to list the existing instances of your company on Google Cloud. What must you do before you run the gcloud compute instances list command?

Choose 2 answers

Options:

A.

Run gcloud auth login, enter your login credentials in the dialog window, and paste the received login token to gcloud CLI.

B.

Create a Google Cloud service account, and download the service account key. Place the key file in a folder on your machine where gcloud CLI can find it.

C.

Download your Cloud Identity user account key. Place the key file in a folder on your machine where gcloud CLI can find it.

D.

Run gcloud config set compute/zone $my_zone to set the default zone for gcloud CLI.

E.

Run gcloud config set project $my_project to set the default project for gcloud CLI.
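
For reference, the typical first-run steps on a fresh installation are shown below; my-project is a placeholder for your project ID:

gcloud auth login
gcloud config set project my-project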

Questions 36

You have successfully created a development environment in a project for an application. This application uses Compute Engine and Cloud SQL. Now, you need to create a production environment for this application.

The security team has forbidden the existence of network routes between these 2 environments, and asks you to follow Google-recommended practices. What should you do?

Options:

A.

Create a new project, enable the Compute Engine and Cloud SQL APIs in that project, and replicate the setup you have created in the development environment.

B.

Create a new production subnet in the existing VPC and a new production Cloud SQL instance in your existing project, and deploy your application using those resources.

C.

Create a new project, modify your existing VPC to be a Shared VPC, share that VPC with your new project, and replicate the setup you have in the development environment in that new project, in the Shared VPC.

D.

Ask the security team to grant you the Project Editor role in an existing production project used by another division of your company. Once they grant you that role, replicate the setup you have in the development environment in that project.

Questions 37

You need to manage a Cloud Spanner Instance for best query performance. Your instance in production runs in a single Google Cloud region. You need to improve performance in the shortest amount of time. You want to follow Google best practices for service configuration. What should you do?

Options:

A.

Create an alert in Cloud Monitoring to alert when the percentage of high priority CPU utilization reaches 45%. If you exceed this threshold, add nodes to your instance.

B.

Create an alert in Cloud Monitoring to alert when the percentage of high priority CPU utilization reaches 45%. Use database query statistics to identify queries that result in high CPU usage, and then rewrite those queries to optimize their resource usage.

C.

Create an alert in Cloud Monitoring to alert when the percentage of high priority CPU utilization reaches 65%. If you exceed this threshold, add nodes to your instance.

D.

Create an alert in Cloud Monitoring to alert when the percentage of high priority CPU utilization reaches 65%. Use database query statistics to identify queries that result in high CPU usage, and then rewrite those queries to optimize their resource usage.

Questions 38

You need to immediately change the storage class of an existing Google Cloud bucket. You need to reduce service cost for infrequently accessed files stored in that bucket and for all files that will be added to that bucket in the future. What should you do?

Options:

A.

Use gsutil to rewrite the storage class for the bucket. Change the default storage class for the bucket

B.

Use gsutil to rewrite the storage class for the bucket. Set up Object Lifecycle management on the bucket

C.

Create a new bucket and change the default storage class for the bucket. Set up Object Lifecycle management on the bucket

D.

Create a new bucket and change the default storage class for the bucket. Import the files from the previous bucket into the new bucket

Questions 39

You are managing the security configuration of your company's Google Cloud organization. The Operations team needs specific permissions on both a Google Kubernetes Engine (GKE) cluster and a Cloud SQL instance. Two predefined Identity and Access Management (IAM) roles exist that contain a subset of the permissions needed by the team. You need to configure the necessary IAM permissions for this team while following Google-recommended practices. What should you do?

Options:

A.

Grant the team the two predefined IAM roles.

B.

Create a custom IAM role that combines the permissions from the two relevant predefined roles.

C.

Create a custom IAM role that includes only the required permissions from the predefined roles.

D.

Grant the team the IAM roles of Kubernetes Engine Admin and Cloud SQL Admin.

Questions 40

You have an application running inside a Compute Engine instance. You want to provide the application with secure access to a BigQuery dataset. You must ensure that credentials are only valid for a short period of time, and your application will only have access to the intended BigQuery dataset. You want to follow Google-recommended practices and minimize your operational costs. What should you do?

Options:

A.

Attach a custom service account to the instance, and grant the service account the BigQuery Data Viewer IAM role on the project.

B.

Attach a new service account to the instance every hour, and grant the service account the BigQuery Data Viewer IAM role on the dataset.

C.

Attach a custom service account to the instance, and grant the service account the BigQuery Data Viewer IAM role on the dataset.

D.

Attach a new service account to the instance every hour, and grant the service account the BigQuery Data Viewer IAM role on the project.

Questions 41

You have deployed multiple Linux instances on Compute Engine. You plan on adding more instances in the coming weeks. You want to be able to access all of these instances through your SSH client over the Internet without having to configure specific access on the existing and new instances. You do not want the Compute Engine instances to have a public IP. What should you do?

Options:

A.

Configure Cloud Identity-Aware Proxy for HTTPS resources.

B.

Configure Cloud Identity-Aware Proxy for SSH and TCP resources.

C.

Create an SSH keypair and store the public key as a project-wide SSH Key

D.

Create an SSH keypair and store the private key as a project-wide SSH Key

Questions 42

You are planning to migrate a database and a backend application to a Standard Google Kubernetes Engine (GKE) cluster. You need to prevent data loss and make sure there are enough nodes available for your backend application based on the demands of your workloads. You want to follow Google-recommended practices and minimize the amount of manual work required. What should you do?

Options:

A.

Run your database as a StatefulSet. Configure cluster autoscaling to handle changes in the demands of your workloads.

B.

Run your database as a single Pod. Run the resize command when you notice changes in the demands of your workloads.

C.

Run your database as a Deployment. Configure cluster autoscaling to handle changes in the demands of your workloads.

D.

Run your database as a DaemonSet. Run the resize command when you notice changes in the demands of your workloads.

Questions 43

Your organization has decided to deploy all its compute workloads to Kubernetes on Google Cloud and two other cloud providers. You want to build an infrastructure-as-code solution to automate the provisioning process for all cloud resources. What should you do?

Options:

A.

Build the solution by using YAML manifests, and provision the resources.

B.

Build the solution by using Terraform, and provision the resources.

C.

Build the solution by using Python and the cloud SDKs from all providers to provision the resources.

D.

Build the solution by using Config Connector, and provision the resources.

Questions 44

You created several resources in multiple Google Cloud projects. All projects are linked to different billing accounts. To better estimate future charges, you want to have a single visual representation of all costs incurred. You want to include new cost data as soon as possible. What should you do?

Options:

A.

Configure Billing Data Export to BigQuery and visualize the data in Data Studio.

B.

Visit the Cost Table page to get a CSV export and visualize it using Data Studio.

C.

Fill all resources in the Pricing Calculator to get an estimate of the monthly cost.

D.

Use the Reports view in the Cloud Billing Console to view the desired cost information.

Questions 45

You are building an application that processes data files uploaded from thousands of suppliers. Your primary goals for the application are data security and the expiration of aged data. You need to design the application to:

• Restrict access so that suppliers can access only their own data.

• Give suppliers write access to data only for 30 minutes.

• Delete data that is over 45 days old.

You have a very short development cycle, and you need to make sure that the application requires minimal maintenance. Which two strategies should you use? (Choose two.)

Options:

A.

Build a lifecycle policy to delete Cloud Storage objects after 45 days.

B.

Use signed URLs to allow suppliers limited time access to store their objects.

C.

Set up an SFTP server for your application, and create a separate user for each supplier.

D.

Build a Cloud function that triggers a timer of 45 days to delete objects that have expired.

E.

Develop a script that loops through all Cloud Storage buckets and deletes any buckets that are older than 45 days.
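
As a sketch of the Cloud Storage approach from options A and B, assuming a placeholder bucket, a lifecycle configuration file, and a service account key file:

# lifecycle.json contains: {"rule": [{"action": {"type": "Delete"}, "condition": {"age": 45}}]}
gsutil lifecycle set lifecycle.json gs://supplier-data
# Signed URL valid for 30 minutes that lets one supplier upload (PUT) a single object
gsutil signurl -m PUT -d 30m sa-key.json gs://supplier-data/supplier-123/upload.csv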

Questions 46

You are managing a project for the Business Intelligence (BI) department in your company. A data pipeline ingests data into BigQuery via streaming. You want the users in the BI department to be able to run the custom SQL queries against the latest data in BigQuery. What should you do?

Options:

A.

Create a Data Studio dashboard that uses the related BigQuery tables as a source and give the BI team view access to the Data Studio dashboard.

B.

Create a Service Account for the BI team and distribute a new private key to each member of the BI team.

C.

Use Cloud Scheduler to schedule a batch Dataflow job to copy the data from BigQuery to the BI team's internal data warehouse.

D.

Assign the IAM role of BigQuery User to a Google Group that contains the members of the BI team.

Questions 47

You are developing an internet of things (IoT) application that captures sensor data from multiple devices that have already been set up. You need to identify the global data storage product your company should use to store this data. You must ensure that the storage solution you choose meets your requirements of sub-millisecond latency. What should you do?

Options:

A.

Store the IoT data in Spanner. Use caches to speed up the process and avoid latencies.

B.

Store the IoT data in Bigtable.

C.

Capture IoT data in BigQuery datasets.

D.

Store the IoT data in Cloud Storage. Implement caching by using Cloud CDN.

Questions 48

You are using multiple configurations for gcloud. You want to review the configured Kubernetes Engine cluster of an inactive configuration using the fewest possible steps. What should you do?

Options:

A.

Use gcloud config configurations describe to review the output.

B.

Use gcloud config configurations activate and gcloud config list to review the output.

C.

Use kubectl config get-contexts to review the output.

D.

Use kubectl config use-context and kubectl config view to review the output.

Questions 49

You have deployed an application on a single Compute Engine instance. The application writes logs to disk. Users start reporting errors with the application. You want to diagnose the problem. What should you do?

Options:

A.

Navigate to Cloud Logging and view the application logs.

B.

Connect to the instance’s serial console and read the application logs.

C.

Configure a Health Check on the instance and set a Low Healthy Threshold value.

D.

Install and configure the Cloud Logging Agent and view the logs from Cloud Logging.

Questions 50

You have a Linux VM that must connect to Cloud SQL. You created a service account with the appropriate access rights. You want to make sure that the VM uses this service account instead of the default Compute Engine service account. What should you do?

Options:

A.

When creating the VM via the web console, specify the service account under the ‘Identity and API Access’ section.

B.

Download a JSON Private Key for the service account. On the Project Metadata, add that JSON as the value for the key compute-engine-service-account.

C.

Download a JSON Private Key for the service account. On the Custom Metadata of the VM, add that JSON as the value for the key compute-engine-service-account.

D.

Download a JSON Private Key for the service account. After creating the VM, ssh into the VM and save the JSON under ~/.gcloud/compute-engine-service-account.json.

Questions 51

Your company uses a large number of Google Cloud services centralized in a single project. All teams have specific projects for testing and development. The DevOps team needs access to all of the production services in order to perform their job. You want to prevent Google Cloud product changes from broadening their permissions in the future. You want to follow Google-recommended practices. What should you do?

Options:

A.

Grant all members of the DevOps team the role of Project Editor on the organization level.

B.

Grant all members of the DevOps team the role of Project Editor on the production project.

C.

Create a custom role that combines the required permissions. Grant the DevOps team the custom role on the production project.

D.

Create a custom role that combines the required permissions. Grant the DevOps team the custom role on the organization level.

Questions 52

You have a website hosted on App Engine standard environment. You want 1% of your users to see a new test version of the website. You want to minimize complexity. What should you do?

Options:

A.

Deploy the new version in the same application and use the --migrate option.

B.

Deploy the new version in the same application and use the --splits option to give a weight of 99 to the current version and a weight of 1 to the new version.

C.

Create a new App Engine application in the same project. Deploy the new version in that application. Use the App Engine library to proxy 1% of the requests to the new version.

D.

Create a new App Engine application in the same project. Deploy the new version in that application. Configure your network load balancer to send 1% of the traffic to that new application.

Questions 53

Your company’s infrastructure is on-premises, but all machines are running at maximum capacity. You want to burst to Google Cloud. The workloads on Google Cloud must be able to directly communicate to the workloads on-premises using a private IP range. What should you do?

Options:

A.

In Google Cloud, configure the VPC as a host for Shared VPC.

B.

In Google Cloud, configure the VPC for VPC Network Peering.

C.

Create bastion hosts both in your on-premises environment and on Google Cloud. Configure both as proxy servers using their public IP addresses.

D.

Set up Cloud VPN between the infrastructure on-premises and Google Cloud.

Questions 54

You need to manage multiple Google Cloud Platform (GCP) projects in the fewest steps possible. You want to configure the Google Cloud SDK command line interface (CLI) so that you can easily manage multiple GCP projects. What should you do?

Options:

A.

1. Create a configuration for each project you need to manage. 2. Activate the appropriate configuration when you work with each of your assigned GCP projects.

B.

1. Create a configuration for each project you need to manage. 2. Use gcloud init to update the configuration values when you need to work with a non-default project.

C.

1. Use the default configuration for one project you need to manage. 2. Activate the appropriate configuration when you work with each of your assigned GCP projects.

D.

1. Use the default configuration for one project you need to manage. 2. Use gcloud init to update the configuration values when you need to work with a non-default project.

Questions 55

You have been asked to migrate a Docker application from your datacenter to the cloud. Your solution architect has suggested uploading Docker images to GCR in one project and running the application in a GKE cluster in a separate project. You want to store images in the project img-278322 and run the application in the project prod-278986. You want to tag the image as acme_track_n_trace:v1. You want to follow Google-recommended practices. What should you do?

Options:

A.

Run gcloud builds submit --tag gcr.io/img-278322/acme_track_n_trace

B.

Run gcloud builds submit --tag gcr.io/img-278322/acme_track_n_trace:v1

C.

Run gcloud builds submit --tag gcr.io/prod-278986/acme_track_n_trace

D.

Run gcloud builds submit --tag gcr.io/prod-278986/acme_track_n_trace:v1

Questions 56

You are building a data lake on Google Cloud for your Internet of Things (IoT) application. The IoT application has millions of sensors that are constantly streaming structured and unstructured data to your backend in the cloud. You want to build a highly available and resilient architecture based on Google-recommended practices. What should you do?

Options:

A.

Stream data to Pub/Sub, and use Dataflow to send data to Cloud Storage

B.

Stream data to Pub/Sub, and use Storage Transfer Service to send data to BigQuery.

C.

Stream data to Dataflow, and use Storage Transfer Service to send data to BigQuery.

D.

Stream data to Dataflow, and use Dataprep by Trifacta to send data to Bigtable.

Questions 57

Your company developed an application to deploy on Google Kubernetes Engine. Certain parts of the application are not fault-tolerant and are allowed to have downtime. Other parts of the application are critical and must always be available. You need to configure a Google Kubernetes Engine cluster while optimizing for cost. What should you do?

Options:

A.

Create a cluster with a single node-pool by using standard VMs. Label the fault-tolerant Deployments as spot-true.

B.

Create a cluster with a single node-pool by using Spot VMs. Label the critical Deployments as spot-false.

C.

Create a cluster with both a Spot VM node pool and a node pool by using standard VMs. Deploy the critical deployments on the Spot VM node pool and the fault-tolerant deployments on the node pool by using standard VMs.

D.

Create a cluster with both a Spot VM node pool and a node pool by using standard VMs. Deploy the critical deployments on the node pool by using standard VMs and the fault-tolerant deployments on the Spot VM node pool.

Questions 58

Your company runs its Linux workloads on Compute Engine instances. Your company will be working with a new operations partner that does not use Google Accounts. You need to grant access to the instances to your operations partner so they can maintain the installed tooling. What should you do?

Options:

A.

Enable Cloud IAP for the Compute Engine instances, and add the operations partner as a Cloud IAP Tunnel User.

B.

Tag all the instances with the same network tag. Create a firewall rule in the VPC to grant TCP access on port 22 for traffic from the operations partner to instances with the network tag.

C.

Set up Cloud VPN between your Google Cloud VPC and the internal network of the operations partner.

D.

Ask the operations partner to generate SSH key pairs, and add the public keys to the VM instances.

Questions 59

You want to find out when users were added to Cloud Spanner Identity Access Management (IAM) roles on your Google Cloud Platform (GCP) project. What should you do in the GCP Console?

Options:

A.

Open the Cloud Spanner console to review configurations.

B.

Open the IAM & admin console to review IAM policies for Cloud Spanner roles.

C.

Go to the Stackdriver Monitoring console and review information for Cloud Spanner.

D.

Go to the Stackdriver Logging console, review admin activity logs, and filter them for Cloud Spanner IAM roles.

Questions 60

Your company uses Cloud Storage to store application backup files for disaster recovery purposes. You want to follow Google’s recommended practices. Which storage option should you use?

Options:

A.

Multi-Regional Storage

B.

Regional Storage

C.

Nearline Storage

D.

Coldline Storage

Questions 61

You need to host an application on a Compute Engine instance in a project shared with other teams. You want to prevent the other teams from accidentally causing downtime on that application. Which feature should you use?

Options:

A.

Use a Shielded VM.

B.

Use a Preemptible VM.

C.

Use a sole-tenant node.

D.

Enable deletion protection on the instance.

Questions 62

Your company runs a variety of applications and workloads on Google Cloud, and you are responsible for managing cloud costs. You need to identify a solution that enables you to perform detailed cost analysis. You also must be able to visualize the cost data in multiple ways on the same dashboard. What should you do?

Options:

A.

Use the cost breakdown report with the available filters from Cloud Billing to visualize the data

B.

Enable the Cloud Billing export to BigQuery, and use Looker Studio to visualize the data

C.

Run queries in Cloud Monitoring. Create dashboards to visualize the billing metrics

D.

Enable Cloud Monitoring metrics export to BigQuery and use Looker to visualize the data

Questions 63

You are using Container Registry to centrally store your company’s container images in a separate project. In another project, you want to create a Google Kubernetes Engine (GKE) cluster. You want to ensure that Kubernetes can download images from Container Registry. What should you do?

Options:

A.

In the project where the images are stored, grant the Storage Object Viewer IAM role to the service account used by the Kubernetes nodes.

B.

When you create the GKE cluster, choose the Allow full access to all Cloud APIs option under ‘Access scopes’.

C.

Create a service account, and give it access to Cloud Storage. Create a P12 key for this service account and use it as an imagePullSecrets in Kubernetes.

D.

Configure the ACLs on each image in Cloud Storage to give read-only access to the default Compute Engine service account.

Questions 64

You have one project called proj-sa where you manage all your service accounts. You want to be able to use a service account from this project to take snapshots of VMs running in another project called proj-vm. What should you do?

Options:

A.

Download the private key from the service account, and add it to each VM's custom metadata.

B.

Download the private key from the service account, and add the private key to each VM’s SSH keys.

C.

Grant the service account the IAM Role of Compute Storage Admin in the project called proj-vm.

D.

When creating the VMs, set the service account’s API scope for Compute Engine to read/write.

Questions 65

You have created a code snippet that should be triggered whenever a new file is uploaded to a Cloud Storage bucket. You want to deploy this code snippet. What should you do?

Options:

A.

Use App Engine and configure Cloud Scheduler to trigger the application using Pub/Sub.

B.

Use Cloud Functions and configure the bucket as a trigger resource.

C.

Use Google Kubernetes Engine and configure a CronJob to trigger the application using Pub/Sub.

D.

Use Dataflow as a batch job, and configure the bucket as a data source.

Questions 66

You are the team lead of a group of 10 developers. You provided each developer with an individual Google Cloud Project that they can use as their personal sandbox to experiment with different Google Cloud solutions. You want to be notified if any of the developers are spending above $500 per month on their sandbox environment. What should you do?

Options:

A.

Create a single budget for all projects and configure budget alerts on this budget.

B.

Create a separate billing account per sandbox project and enable BigQuery billing exports. Create a Data Studio dashboard to plot the spending per billing account.

C.

Create a budget per project and configure budget alerts on all of these budgets.

D.

Create a single billing account for all sandbox projects and enable BigQuery billing exports. Create a Data Studio dashboard to plot the spending per project.

Questions 67

You are running a data warehouse on BigQuery. A partner company is offering a recommendation engine based on the data in your data warehouse. The partner company is also running their application on Google Cloud. They manage the resources in their own project, but they need access to the BigQuery dataset in your project. You want to provide the partner company with access to the dataset. What should you do?

Options:

A.

Create a Service Account in your own project, and grant this Service Account access to BigQuery in your project

B.

Create a Service Account in your own project, and ask the partner to grant this Service Account access to BigQuery in their project

C.

Ask the partner to create a Service Account in their project, and have them give the Service Account access to BigQuery in their project

D.

Ask the partner to create a Service Account in their project, and grant their Service Account access to the BigQuery dataset in your project

Questions 68

You have a virtual machine that is currently configured with 2 vCPUs and 4 GB of memory. It is running out of memory. You want to upgrade the virtual machine to have 8 GB of memory. What should you do?

Options:

A.

Rely on live migration to move the workload to a machine with more memory.

B.

Use gcloud to add metadata to the VM. Set the key to required-memory-size and the value to 8 GB.

C.

Stop the VM, change the machine type to n1-standard-8, and start the VM.

D.

Stop the VM, increase the memory to 8 GB, and start the VM.
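
For reference, a machine type change requires the instance to be stopped first; the instance name, zone, and target machine type below are placeholders (e2-standard-2 provides 2 vCPUs and 8 GB of memory):

gcloud compute instances stop my-vm --zone=us-central1-a
gcloud compute instances set-machine-type my-vm --zone=us-central1-a --machine-type=e2-standard-2
gcloud compute instances start my-vm --zone=us-central1-a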

Questions 69

You need to assign a Cloud Identity and Access Management (Cloud IAM) role to an external auditor. The auditor needs to have permissions to review your Google Cloud Platform (GCP) Audit Logs and also to review your Data Access logs. What should you do?

Options:

A.

Assign the auditor the IAM role roles/logging.privateLogViewer. Perform the export of logs to Cloud Storage.

B.

Assign the auditor the IAM role roles/logging.privateLogViewer. Direct the auditor to also review the logs for changes to Cloud IAM policy.

C.

Assign the auditor’s IAM user to a custom role that has the logging.privateLogEntries.list permission. Perform the export of logs to Cloud Storage.

D.

Assign the auditor’s IAM user to a custom role that has the logging.privateLogEntries.list permission. Direct the auditor to also review the logs for changes to Cloud IAM policy.

Questions 70

You have developed a containerized web application that will serve internal colleagues during business hours. You want to ensure that no costs are incurred outside of the hours the application is used. You have just created a new Google Cloud project and want to deploy the application. What should you do?

Options:

A.

Deploy the container on Cloud Run for Anthos, and set the minimum number of instances to zero

B.

Deploy the container on Cloud Run (fully managed), and set the minimum number of instances to zero.

C.

Deploy the container on App Engine flexible environment with autoscaling, and set the value of min_instances to zero in the app.yaml

D.

Deploy the container on App Engine flexible environment with manual scaling, and set the value of instances to zero in the app.yaml
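
For context, a fully managed Cloud Run service that scales to zero when idle can be deployed roughly as follows; the service name, image, and region are placeholders, and zero is also the default minimum:

gcloud run deploy internal-web --image=gcr.io/my-project/internal-web:v1 --region=us-central1 --min-instances=0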

Questions 71

Your company has embraced a hybrid cloud strategy where some of the applications are deployed on Google Cloud. A Virtual Private Network (VPN) tunnel connects your Virtual Private Cloud (VPC) in Google Cloud with your company's on-premises network. Multiple applications in Google Cloud need to connect to an on-premises database server, and you want to avoid having to change the IP configuration in all of your applications when the IP of the database changes.

What should you do?

Options:

A.

Configure Cloud NAT for all subnets of your VPC to be used when egressing from the VM instances.

B.

Create a private zone on Cloud DNS, and configure the applications with the DNS name.

C.

Configure the IP of the database as custom metadata for each instance, and query the metadata server.

D.

Query the Compute Engine internal DNS from the applications to retrieve the IP of the database.
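
As a sketch of the private DNS zone approach in option B, with placeholder zone, domain, VPC, and database IP; applications then connect to the DNS name instead of a hard-coded address:

gcloud dns managed-zones create corp-zone --description="Private zone for on-prem services" --dns-name="corp.internal." --visibility=private --networks=my-vpc
gcloud dns record-sets create db.corp.internal. --zone=corp-zone --type=A --ttl=300 --rrdatas=10.10.0.5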

Questions 72

You are monitoring an application and receive user feedback that a specific error is spiking. You notice that the error is caused by a Service Account having insufficient permissions. You are able to solve the problem but want to be notified if the problem recurs. What should you do?

Options:

A.

In the Log Viewer, filter the logs on severity 'Error' and the name of the Service Account.

B.

Create a sink to BigQuery to export all the logs. Create a Data Studio dashboard on the exported logs.

C.

Create a custom log-based metric for the specific error to be used in an Alerting Policy.

D.

Grant Project Owner access to the Service Account.
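
For reference, a log-based metric scoped to the specific error can be created roughly as follows and then used as the condition of an alerting policy; the metric name and filter are placeholders:

gcloud logging metrics create sa-permission-errors --description="Permission errors for the service account" --log-filter='severity>=ERROR AND protoPayload.authenticationInfo.principalEmail="my-sa@my-project.iam.gserviceaccount.com"'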

Questions 73

Your team is running an on-premises ecommerce application. The application contains a complex set of microservices written in Python, and each microservice is running on Docker containers. Configurations are injected by using environment variables. You need to deploy your current application to a serverless Google Cloud cloud solution. What should you do?

Options:

A.

Use your existing CI/CD pipeline. Use the generated Docker images and deploy them to Cloud Run. Update the configurations and the required endpoints.

B.

Use your existing continuous integration and delivery (CI/CD) pipeline. Use the generated Docker images and deploy them to Cloud Function. Use the same configuration as on-premises.

C.

Use the existing codebase and deploy each service as a separate Cloud Function. Update the configurations and the required endpoints.

D.

Use your existing codebase and deploy each service as a separate Cloud Run service. Use the same configurations as on-premises.

Questions 74

You need to grant access for three users so that they can view and edit table data on a Cloud Spanner instance. What should you do?

Options:

A.

Run gcloud iam roles describe roles/spanner.databaseUser. Add the users to the role.

B.

Run gcloud iam roles describe roles/spanner.databaseUser. Add the users to a new group. Add the group to the role.

C.

Run gcloud iam roles describe roles/spanner.viewer --project my-project. Add the users to the role.

D.

Run gcloud iam roles describe roles/spanner.viewer --project my-project. Add the users to a new group. Add the group to the role.

Questions 75

You need to create a Compute Engine instance in a new project that doesn’t exist yet. What should you do?

Options:

A.

Using the Cloud SDK, create a new project, enable the Compute Engine API in that project, and then create the instance specifying your new project.

B.

Enable the Compute Engine API in the Cloud Console, use the Cloud SDK to create the instance, and then use the --project flag to specify a new project.

C.

Using the Cloud SDK, create the new instance, and use the --project flag to specify the new project. Answer yes when prompted by the Cloud SDK to enable the Compute Engine API.

D.

Enable the Compute Engine API in the Cloud Console. Go to the Compute Engine section of the Console to create a new instance, and look for the Create In A New Project option in the creation form.
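
For reference, the Cloud SDK flow could look roughly like this (the project ID, zone, and instance name are placeholders; a billing account typically also has to be linked to the new project before resources can be created):

$ gcloud projects create my-new-project
$ gcloud services enable compute.googleapis.com --project=my-new-project
$ gcloud compute instances create web-1 \
    --project=my-new-project --zone=us-central1-a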

Buy Now
Questions 76

You need to reduce GCP service costs for a division of your company using the fewest possible steps. You need to turn off all configured services in an existing GCP project. What should you do?

Options:

A.

1. Verify that you are assigned the Project Owners IAM role for this project. 2. Locate the project in the GCP console, click Shut down and then enter the project ID.

B.

1. Verify that you are assigned the Project Owners IAM role for this project. 2. Switch to the project in the GCP console, locate the resources and delete them.

C.

1. Verify that you are assigned the Organizational Administrator IAM role for this project. 2. Locate the project in the GCP console, enter the project ID and then click Shut down.

D.

1. Verify that you are assigned the Organizational Administrators IAM role for this project. 2. Switch to the project in the GCP console, locate the resources and delete them.
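
For illustration, shutting down a whole project from the CLI is a single command (the project ID is a placeholder); its resources are scheduled for deletion, and the project can still be restored within the recovery window:

$ gcloud projects delete division-project-id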

Buy Now
Questions 77

You are assisting a new Google Cloud user who just installed the Google Cloud SDK on their VM. The server needs access to Cloud Storage. The user wants your help to create a new storage bucket. You need to make this change in multiple environments. What should you do?

Options:

A.

Use a Deployment Manager script to automate creating storage buckets in an appropriate region

B.

Use a local SSD to improve performance of the VM for the targeted workload

C.

Use the gsutil command to create a storage bucket in the same region as the VM

D.

Use a Persistent Disk SSD in the same zone as the VM to improve performance of the VM
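
A minimal example of creating a bucket in the same region as the VM (the bucket name and region are placeholders; newer SDK versions also offer gcloud storage buckets create):

$ gsutil mb -l us-central1 gs://my-team-assets-bucket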

Buy Now
Questions 78

You are responsible for a web application on Compute Engine. You want your support team to be notified automatically if users experience high latency for at least 5 minutes. You need a Google-recommended solution with no development cost. What should you do?

Options:

A.

Create an alert policy to send a notification when the HTTP response latency exceeds the specified threshold.

B.

Implement an App Engine service which invokes the Cloud Monitoring API and sends a notification in case of anomalies.

C.

Use the Cloud Monitoring dashboard to observe latency and take the necessary actions when the response latency exceeds the specified threshold.

D.

Export Cloud Monitoring metrics to BigQuery and use a Looker Studio dashboard to monitor your web application's latency.

Buy Now
Questions 79

You want to configure a solution for archiving data in a Cloud Storage bucket. The solution must be cost-effective. Data with multiple versions should be archived after 30 days. Previous versions are accessed once a month for reporting. This archive data is also occasionally updated at month-end. What should you do?

Options:

A.

Add a bucket lifecycle rule that archives data with newer versions after 30 days to Coldline Storage.

B.

Add a bucket lifecycle rule that archives data with newer versions after 30 days to Nearline Storage.

C.

Add a bucket lifecycle rule that archives data from regional storage after 30 days to Coldline Storage.

D.

Add a bucket lifecycle rule that archives data from regional storage after 30 days to Nearline Storage.
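
To illustrate the mechanics only (not which storage class is the best fit for this access pattern), a lifecycle rule that changes the class of non-current object versions after 30 days might be written as follows; the bucket name and target class are placeholders:

$ cat > lifecycle.json <<'EOF'
{
  "rule": [
    {
      "action": {"type": "SetStorageClass", "storageClass": "NEARLINE"},
      "condition": {"age": 30, "isLive": false}
    }
  ]
}
EOF
$ gsutil lifecycle set lifecycle.json gs://my-archive-bucket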

Buy Now
Questions 80

You have experimented with Google Cloud using your own credit card and expensed the costs to your company. Your company wants to streamline the billing process and charge the costs of your projects to their monthly invoice. What should you do?

Options:

A.

Grant the financial team the IAM role of "Billing Account User" on the billing account linked to your credit card.

B.

Set up BigQuery billing export and grant your financial department IAM access to query the data.

C.

Create a ticket with Google Billing Support to ask them to send the invoice to your company.

D.

Change the billing account of your projects to the billing account of your company.
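
For reference, re-parenting a project onto the company billing account can be done from the CLI (the project ID and billing account ID are placeholders):

$ gcloud billing projects link my-project \
    --billing-account=XXXXXX-XXXXXX-XXXXXX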

Buy Now
Questions 81

You need to create a copy of a custom Compute Engine virtual machine (VM) to facilitate an expected increase in application traffic due to a business acquisition. What should you do?

Options:

A.

Create a Compute Engine snapshot of your base VM. Create your images from that snapshot.

B.

Create a Compute Engine snapshot of your base VM. Create your instances from that snapshot.

C.

Create a custom Compute Engine image from a snapshot. Create your images from that image.

D.

Create a custom Compute Engine image from a snapshot. Create your instances from that image.
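
A minimal sketch of turning a snapshot into a custom image and launching new instances from it (names and zone are placeholders):

$ gcloud compute images create base-app-image \
    --source-snapshot=base-vm-snapshot
$ gcloud compute instances create app-vm-2 \
    --image=base-app-image --zone=us-central1-a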

Buy Now
Questions 82

You are building a new version of an application hosted in an App Engine environment. You want to test the new version with 1% of users before you completely switch your application over to the new version. What should you do?

Options:

A.

Deploy a new version of your application in Google Kubernetes Engine instead of App Engine and then use GCP Console to split traffic.

B.

Deploy a new version of your application in a Compute Engine instance instead of App Engine and then use GCP Console to split traffic.

C.

Deploy a new version as a separate app in App Engine. Then configure App Engine using GCP Console to split traffic between the two apps.

D.

Deploy a new version of your application in App Engine. Then go to App Engine settings in GCP Console and split traffic between the current version and newly deployed versions accordingly.
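
For illustration, deploying the new version without promoting it and then routing 1% of traffic to it could look like this (the version IDs are placeholders):

$ gcloud app deploy --version=v2 --no-promote
$ gcloud app services set-traffic default \
    --splits=v1=0.99,v2=0.01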

Buy Now
Questions 83

Your organization is a financial company that needs to store audit log files for 3 years. Your organization has hundreds of Google Cloud projects. You need to implement a cost-effective approach for log file retention. What should you do?

Options:

A.

Create an export sink that saves Cloud Audit logs to BigQuery.

B.

Create an export sink that saves Cloud Audit logs to a Coldline Storage bucket.

C.

Write a custom script that uses the Logging API to copy the logs from Stackdriver Logging to BigQuery.

D.

Export these logs to Cloud Pub/Sub and write a Cloud Dataflow pipeline to store logs to Cloud SQL.
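
As a sketch, an aggregated sink to a Coldline bucket might be set up roughly like this (bucket name, sink name, organization ID, and filter are placeholders; --organization and --include-children aggregate logs across all projects, and the sink's writer identity still needs write access to the bucket):

$ gsutil mb -c coldline -l us-central1 gs://audit-log-archive
$ gcloud logging sinks create audit-archive-sink \
    storage.googleapis.com/audit-log-archive \
    --organization=123456789012 --include-children \
    --log-filter='logName:"cloudaudit.googleapis.com"'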

Buy Now
Questions 84

You have a large 5-TB AVRO file stored in a Cloud Storage bucket. Your analysts are proficient only in SQL and need access to the data stored in this file. You want to find a cost-effective way to complete their request as soon as possible. What should you do?

Options:

A.

Load data in Cloud Datastore and run a SQL query against it.

B.

Create a BigQuery table and load data in BigQuery. Run a SQL query on this table and drop this table after you complete your request.

C.

Create external tables in BigQuery that point to Cloud Storage buckets and run a SQL query on these external tables to complete your request.

D.

Create a Hadoop cluster and copy the AVRO file to HDFS by compressing it. Load the file in a Hive table and provide access to your analysts so that they can run SQL queries.
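
For reference, an external table over the Avro file can be defined and queried with the bq tool (the bucket path, dataset, and table names are placeholders):

$ bq mkdef --source_format=AVRO "gs://my-bucket/large-file.avro" > avro_def.json
$ bq mk --external_table_definition=avro_def.json analytics.events_ext
$ bq query --use_legacy_sql=false \
    'SELECT COUNT(*) FROM analytics.events_ext'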

Buy Now
Questions 85

You have a web application deployed as a managed instance group. You have a new version of the application to gradually deploy. Your web application is currently receiving live web traffic. You want to ensure that the available capacity does not decrease during the deployment. What should you do?

Options:

A.

Perform a rolling-action start-update with maxSurge set to 0 and maxUnavailable set to 1.

B.

Perform a rolling-action start-update with maxSurge set to 1 and maxUnavailable set to 0.

C.

Create a new managed instance group with an updated instance template. Add the group to the backend service for the load balancer. When all instances in the new managed instance group are healthy, delete the old managed instance group.

D.

Create a new instance template with the new application version. Update the existing managed instance group with the new instance template. Delete the instances in the managed instance group to allow the managed instance group to recreate the instance using the new instance template.
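
A sketch of a rolling update that keeps available capacity constant (the MIG name, template name, and zone are placeholders):

$ gcloud compute instance-groups managed rolling-action start-update web-mig \
    --version=template=web-template-v2 \
    --max-surge=1 --max-unavailable=0 \
    --zone=us-central1-a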

Buy Now
Questions 86

You have a workload running on Compute Engine that is critical to your business. You want to ensure that the data on the boot disk of this workload is backed up regularly. You need to be able to restore a backup as quickly as possible in case of disaster. You also want older backups to be cleaned automatically to save on cost. You want to follow Google-recommended practices. What should you do?

Options:

A.

Create a Cloud Function to create an instance template.

B.

Create a snapshot schedule for the disk using the desired interval.

C.

Create a cron job to create a new disk from the disk using gcloud.

D.

Create a Cloud Task to create an image and export it to Cloud Storage.
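
For illustration, a snapshot schedule with automatic cleanup could be created and attached to the boot disk roughly as follows (names, region, zone, start time, and retention are placeholders):

$ gcloud compute resource-policies create snapshot-schedule daily-backup \
    --region=us-central1 \
    --daily-schedule --start-time=04:00 \
    --max-retention-days=14
$ gcloud compute disks add-resource-policies workload-boot-disk \
    --resource-policies=daily-backup --zone=us-central1-a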

Buy Now
Questions 87

You have been asked to create robust Virtual Private Network (VPN) connectivity between a new Virtual Private Cloud (VPC) and a remote site. Key requirements include dynamic routing, a shared address space of 10.19.0.1/22, and no overprovisioning of tunnels during a failover event. You want to follow Google-recommended practices to set up a high availability Cloud VPN. What should you do?

Options:

A.

Use a custom mode VPC network, configure static routes, and use active/passive routing

B.

Use an automatic mode VPC network, configure static routes, and use active/active routing

C.

Use a custom mode VPC network, use Cloud Router Border Gateway Protocol (BGP) routes, and use active/passive routing

D.

Use an automatic mode VPC network, use Cloud Router border gateway protocol (BGP) routes and configure policy-based routing
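
As a partial sketch, the custom mode network, HA VPN gateway, and Cloud Router (with a private ASN) could be created like this before tunnels and BGP peers are added; all names and the ASN are placeholders:

$ gcloud compute networks create corp-vpc --subnet-mode=custom
$ gcloud compute vpn-gateways create ha-vpn-gw \
    --network=corp-vpc --region=us-central1
$ gcloud compute routers create corp-router \
    --network=corp-vpc --region=us-central1 --asn=65001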

Buy Now
Questions 88

You assist different engineering teams in deploying their infrastructure on Google Cloud. Your company has defined certain practices required for all workloads. You need to provide the engineering teams with a solution that enables teams to deploy their infrastructure independently without having to know all implementation details of the company's required practices. What should you do?

Options:

A.

Create a service account per team, and grant the service account the Project Editor role. Ask the teams to provision their infrastructure through the Google Cloud CLI (gcloud CLI), while impersonating their dedicated service account.

B.

Provide training for all engineering teams you work with to understand the company’s required practices. Allow the engineering teams to provision the infrastructure to best meet their needs.

C.

Configure organization policies to enforce your company’s required practices. Ask the teams to provision their infrastructure by using the Google Cloud console.

D.

Write Terraform modules for each component that are compliant with the company’s required practices, and ask teams to implement their infrastructure through these modules.

Buy Now
Questions 89

You need to create an autoscaling managed instance group for an HTTPS web application. You want to make sure that unhealthy VMs are recreated. What should you do?

Options:

A.

Create a health check on port 443 and use that when creating the Managed Instance Group.

B.

Select Multi-Zone instead of Single-Zone when creating the Managed Instance Group.

C.

In the Instance Template, add the label ‘health-check’.

D.

In the Instance Template, add a startup script that sends a heartbeat to the metadata server.
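
For reference, an HTTPS health check and its attachment to the MIG for autohealing might look like this (names, request path, zone, and initial delay are placeholders):

$ gcloud compute health-checks create https web-https-check \
    --port=443 --request-path=/healthz
$ gcloud compute instance-groups managed update web-mig \
    --health-check=web-https-check --initial-delay=300 \
    --zone=us-central1-a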

Buy Now
Questions 90

A colleague handed over a Google Cloud project for you to maintain. As part of a security checkup, you want to review who has been granted the Project Owner role. What should you do?

Options:

A.

In the Google Cloud console, validate which SSH keys have been stored as project-wide keys.

B.

Navigate to Identity-Aware Proxy and check the permissions for these resources.

C.

Enable Audit logs on the IAM & admin page for all resources, and validate the results.

D.

Use the gcloud projects get-iam-policy command to view the current role assignments.
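
A sketch of listing only the members that hold roles/owner (the project ID is a placeholder):

$ gcloud projects get-iam-policy my-project \
    --flatten="bindings[].members" \
    --filter="bindings.role:roles/owner" \
    --format="value(bindings.members)"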

Buy Now
Questions 91

Your application stores files on Cloud Storage by using the Standard Storage class. The application only requires access to files created in the last 30 days. You want to automatically save costs on files that are no longer accessed by the application. What should you do?

Options:

A.

Create a retention policy on the storage bucket of 30 days, and lock the bucket by using a retention policy lock.

B.

Enable object versioning on the storage bucket and add lifecycle rules to expire non-current versions after 30 days

C.

Create an object lifecycle on the storage bucket to change the storage class to Archive Storage for objects with an age over 30 days.

D.

Create a cron job in Cloud Scheduler to call a Cloud Functions instance every day to delete files older than 30 days.

Buy Now
Questions 92

You are using Google Kubernetes Engine with autoscaling enabled to host a new application. You want to expose this new application to the public, using HTTPS on a public IP address. What should you do?

Options:

A.

Create a Kubernetes Service of type NodePort for your application, and a Kubernetes Ingress to expose this Service via a Cloud Load Balancer.

B.

Create a Kubernetes Service of type ClusterIP for your application. Configure the public DNS name of your application using the IP of this Service.

C.

Create a Kubernetes Service of type NodePort to expose the application on port 443 of each node of the Kubernetes cluster. Configure the public DNS name of your application with the IP of every node of the cluster to achieve load-balancing.

D.

Create a HAProxy pod in the cluster to load-balance the traffic to all the pods of the application. Forward the public traffic to HAProxy with an iptable rule. Configure the DNS name of your application using the public IP of the node HAProxy is running on.
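
For illustration, a NodePort Service plus an Ingress that provisions an external HTTP(S) load balancer might look roughly like this (names, ports, and selector are placeholders; serving HTTPS on the load balancer additionally requires a certificate, for example a Google-managed one):

$ kubectl apply -f - <<'EOF'
apiVersion: v1
kind: Service
metadata:
  name: web-svc
spec:
  type: NodePort
  selector:
    app: web
  ports:
  - port: 443
    targetPort: 8443
---
apiVersion: networking.k8s.io/v1
kind: Ingress
metadata:
  name: web-ingress
spec:
  defaultBackend:
    service:
      name: web-svc
      port:
        number: 443
EOF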

Buy Now
Questions 93

You have production and test workloads that you want to deploy on Compute Engine. Production VMs need to be in a different subnet than the test VMs. All the VMs must be able to reach each other over internal IP without creating additional routes. You need to set up VPC and the 2 subnets. Which configuration meets these requirements?

Options:

A.

Create a single custom VPC with 2 subnets. Create each subnet in a different region and with a different CIDR range.

B.

Create a single custom VPC with 2 subnets. Create each subnet in the same region and with the same CIDR range.

C.

Create 2 custom VPCs, each with a single subnet. Create each subnet in a different region and with a different CIDR range.

D.

Create 2 custom VPCs, each with a single subnet. Create each subnet in the same region and with the same CIDR range.
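
A minimal sketch of a single custom VPC with two subnets, one for production and one for test (network and subnet names, regions, and CIDR ranges are placeholders):

$ gcloud compute networks create app-vpc --subnet-mode=custom
$ gcloud compute networks subnets create prod-subnet \
    --network=app-vpc --region=us-central1 --range=10.0.1.0/24
$ gcloud compute networks subnets create test-subnet \
    --network=app-vpc --region=us-east1 --range=10.0.2.0/24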

Buy Now
Questions 94

You are analyzing Google Cloud Platform service costs from three separate projects. You want to use this information to create service cost estimates by service type, daily and monthly, for the next six months using standard query syntax. What should you do?

Options:

A.

Export your bill to a Cloud Storage bucket, and then import into Cloud Bigtable for analysis.

B.

Export your bill to a Cloud Storage bucket, and then import into Google Sheets for analysis.

C.

Export your transactions to a local file, and perform analysis with a desktop tool.

D.

Export your bill to a BigQuery dataset, and then write time window-based SQL queries for analysis.
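
As an illustration only, once billing export to BigQuery is enabled, a daily cost-by-service query could be run like this (the dataset and table name follow the export's usual naming pattern but are placeholders here):

$ bq query --use_legacy_sql=false '
  SELECT service.description AS service,
         DATE(usage_start_time) AS usage_day,
         SUM(cost) AS daily_cost
  FROM `my-project.billing_export.gcp_billing_export_v1_XXXXXX`
  GROUP BY service, usage_day
  ORDER BY usage_day'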

Buy Now
Questions 95

Your digital media company stores a large number of video files on-premises. Each video file ranges from 100 MB to 100 GB. You are currently storing 150 TB of video data in your on-premises network, with no room for expansion. You need to migrate all infrequently accessed video files older than one year to Cloud Storage to ensure that on-premises storage remains available for new files. You must also minimize costs and control bandwidth usage. What should you do?

Options:

A.

Create a Cloud Storage bucket. Establish an Identity and Access Management (IAM) role with write permissions to the bucket. Use the gsutil tool to directly copy files over the network to Cloud Storage.

B.

Set up a Cloud Interconnect connection between the on-premises network and Google Cloud. Establish a private endpoint for Filestore access. Transfer the data from the existing Network File System (NFS) to Filestore.

C.

Use Transfer Appliance to request an appliance. Load the data locally, and ship the appliance back to Google for ingestion into Cloud Storage.

D.

Use Storage Transfer Service to move the data from the selected on-premises file storage systems to a Cloud Storage bucket.

Buy Now
Questions 96

You created a Kubernetes deployment by running kubectl run nginx --image=nginx --replicas=1. After a few days, you decided you no longer want this deployment. You identified the pod and deleted it by running kubectl delete pod. You noticed the pod got recreated.

$ kubectl get pods

NAME                     READY   STATUS    RESTARTS   AGE
nginx-84748895c4-nqqmt   1/1     Running   0          9m41s

$ kubectl delete pod nginx-84748895c4-nqqmt

pod "nginx-84748895c4-nqqmt" deleted

$ kubectl get pods

NAME                     READY   STATUS    RESTARTS   AGE
nginx-84748895c4-k6bzl   1/1     Running   0          25s

What should you do to delete the deployment and avoid the pod being recreated?

Options:

A.

kubectl delete deployment nginx

B.

kubectl delete --deployment=nginx

C.

kubectl delete pod nginx-84748895c4-k6bzl --no-restart 2

D.

kubectl delete nginx
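
For reference, deleting the controlling Deployment object rather than its pods stops the recreation (the deployment name matches the example above):

$ kubectl get deployments
$ kubectl delete deployment nginx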

Buy Now
Questions 97

Your team wants to deploy a specific content management system (CMS) solution to Google Cloud. You need a quick and easy way to deploy and install the solution. What should you do?

Options:

A.

Search for the CMS solution in Google Cloud Marketplace. Use gcloud CLI to deploy the solution.

B.

Search for the CMS solution in Google Cloud Marketplace. Deploy the solution directly from Cloud Marketplace.

C.

Search for the CMS solution in Google Cloud Marketplace. Use Terraform and the Cloud Marketplace ID to deploy the solution with the appropriate parameters.

D.

Use the installation guide of the CMS provider. Perform the installation through your configuration management system.

Buy Now
Exam Name: Google Cloud Certified - Associate Cloud Engineer
Last Update: Aug 15, 2025
Questions: 325
