
Associate-Cloud-Engineer Google Cloud Certified - Associate Cloud Engineer Questions and Answers

Questions 4

You are building an application that stores relational data from users. Users across the globe will use this application. Your CTO is concerned about the scaling requirements because the size of the user base is unknown. You need to implement a database solution that can scale with your user growth with minimum configuration changes. Which storage solution should you use?

Options:

A.

Cloud SQL

B.

Cloud Spanner

C.

Cloud Firestore

D.

Cloud Datastore

Questions 5

You have a Linux VM that must connect to Cloud SQL. You created a service account with the appropriate access rights. You want to make sure that the VM uses this service account instead of the default Compute Engine service account. What should you do?

Options:

A.

When creating the VM via the web console, specify the service account under the ‘Identity and API Access’ section.

B.

Download a JSON Private Key for the service account. On the Project Metadata, add that JSON as the value for the key compute-engine-service-account.

C.

Download a JSON Private Key for the service account. On the Custom Metadata of the VM, add that JSON as the value for the key compute-engine-service-account.

D.

Download a JSON Private Key for the service account. After creating the VM, ssh into the VM and save the JSON under ~/.gcloud/compute-engine-service-account.json.

Questions 6

You have a managed instance group composed of preemptible VMs. All of the VMs keep deleting and recreating themselves every minute. What is a possible cause of this behavior?

Options:

A.

Your zonal capacity is limited, causing all preemptible VMs to be shut down to recover capacity. Try deploying your group to another zone.

B.

You have hit your instance quota for the region.

C.

Your managed instance group's VMs are toggled to only last 1 minute in preemptible settings.

D.

Your managed instance group's health check is repeatedly failing, due either to a misconfigured health check or to misconfigured firewall rules that do not allow the health check to access the instance.

Questions 7

You want to configure autohealing for network load balancing for a group of Compute Engine instances that run in multiple zones, using the fewest possible steps. You need to configure re-creation of VMs if they are unresponsive after 3 attempts of 10 seconds each. What should you do?

Options:

A.

Create an HTTP load balancer with a backend configuration that references an existing instance group. Set the health check to healthy (HTTP).

B.

Create an HTTP load balancer with a backend configuration that references an existing instance group. Define a balancing mode and set the maximum RPS to 10.

C.

Create a managed instance group. Set the Autohealing health check to healthy (HTTP).

D.

Create a managed instance group. Verify that the autoscaling setting is on.

Questions 8

You want to select and configure a solution for storing and archiving data on Google Cloud Platform. You need to support compliance objectives for data from one geographic location. This data is archived after 30 days and needs to be accessed annually. What should you do?

Options:

A.

Select Multi-Regional Storage. Add a bucket lifecycle rule that archives data after 30 days to Coldline Storage.

B.

Select Multi-Regional Storage. Add a bucket lifecycle rule that archives data after 30 days to Nearline Storage.

C.

Select Regional Storage. Add a bucket lifecycle rule that archives data after 30 days to Nearline Storage.

D.

Select Regional Storage. Add a bucket lifecycle rule that archives data after 30 days to Coldline Storage.

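For the archiving scenario above, the transition to a colder storage class is typically expressed as a bucket lifecycle rule. A minimal sketch, assuming a hypothetical regional bucket name and location; the lifecycle.json file name is illustrative:

    # Hypothetical regional bucket; adjust name and location to your own values.
    gsutil mb -l us-east1 -c standard gs://example-compliance-archive

    cat > lifecycle.json <<'EOF'
    {
      "rule": [
        {
          "action": {"type": "SetStorageClass", "storageClass": "COLDLINE"},
          "condition": {"age": 30}
        }
      ]
    }
    EOF

    gsutil lifecycle set lifecycle.json gs://example-compliance-archive
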
Questions 9

Your auditor wants to view your organization's use of data in Google Cloud. The auditor is most interested in auditing who accessed data in Cloud Storage buckets. You need to help the auditor access the data they need. What should you do?

Options:

A.

Assign the appropriate permissions, and then use Cloud Monitoring to review metrics

B.

Use the export logs API to provide the Admin Activity Audit Logs in the format they want

C.

Turn on Data Access logs for the buckets they want to audit, and then build a query in the log viewer that filters on Cloud Storage.

D.

Assign the appropriate permissions, and then create a Data Studio report on Admin Activity Audit Logs

Questions 10

You need to deploy an application in Google Cloud using serverless technology. You want to test a new version of the application with a small percentage of production traffic. What should you do?

Options:

A.

Deploy the application to Cloud Run. Use gradual rollouts for traffic splitting.

B.

Deploy the application to Google Kubernetes Engine. Use Anthos Service Mesh for traffic splitting.

C.

Deploy the application to Cloud Functions. Specify the version number in the function's name.

D.

Deploy the application to App Engine. For each new version, create a new service.

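As context for the Cloud Run option above, a gradual rollout is usually driven with revision tags and traffic splitting. A hedged sketch; the service name, image, and region are placeholders:

    # Deploy the new version without sending it any traffic yet (names are illustrative).
    gcloud run deploy my-service --image gcr.io/my-project/app:v2 \
        --region us-central1 --no-traffic --tag canary

    # Shift 5% of production traffic to the tagged revision.
    gcloud run services update-traffic my-service \
        --region us-central1 --to-tags canary=5
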
Questions 11

During a recent audit of your existing Google Cloud resources, you discovered several users with email addresses outside of your Google Workspace domain.

You want to ensure that your resources are only shared with users whose email addresses match your domain. You need to remove any mismatched users, and you want to avoid having to audit your resources to identify mismatched users. What should you do?

Options:

A.

Create a Cloud Scheduler task to regularly scan your projects and delete mismatched users.

B.

Create a Cloud Scheduler task to regularly scan your resources and delete mismatched users.

C.

Set an organizational policy constraint to limit identities by domain to automatically remove mismatched users.

D.

Set an organizational policy constraint to limit identities by domain, and then retroactively remove the existing mismatched users.

Questions 12

You are planning to migrate your containerized workloads to Google Kubernetes Engine (GKE). You need to determine which GKE option to use. Your solution must have high availability, minimal downtime, and the ability to promptly apply security updates to your nodes. You also want to pay only for the compute resources that your workloads use without managing nodes. You want to follow Google-recommended practices and minimize operational costs. What should you do?

Options:

A.

Configure a Standard multi-zonal GKE cluster.

B.

Configure an Autopilot GKE cluster.

C.

Configure a Standard zonal GKE cluster.

D.

Configure a Standard regional GKE cluster.

Questions 13

You are building a new version of an application hosted in an App Engine environment. You want to test the new version with 1% of users before you completely switch your application over to the new version. What should you do?

Options:

A.

Deploy a new version of your application in Google Kubernetes Engine instead of App Engine and then use GCP Console to split traffic.

B.

Deploy a new version of your application in a Compute Engine instance instead of App Engine and then use GCP Console to split traffic.

C.

Deploy a new version as a separate app in App Engine. Then configure App Engine using GCP Console to split traffic between the two apps.

D.

Deploy a new version of your application in App Engine. Then go to App Engine settings in GCP Console and split traffic between the current version and newly deployed versions accordingly.

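To illustrate the in-place App Engine approach above, traffic can be split between versions of the same service from the CLI as well as the console. A sketch under assumed version IDs:

    # Deploy the new version without promoting it.
    gcloud app deploy app.yaml --version v2 --no-promote

    # Route 1% of traffic to v2 and keep 99% on v1.
    gcloud app services set-traffic default --splits v1=0.99,v2=0.01 --split-by random
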
Questions 14

(Your digital media company stores a large number of video files on-premises. Each video file ranges from 100 MB to 100 GB. You are currently storing 150 TB of video data in your on-premises network, with no room for expansion. You need to migrate all infrequently accessed video files older than one year to Cloud Storage to ensure that on-premises storage remains available for new files. You must also minimize costs and control bandwidth usage. What should you do?)

Options:

A.

Create a Cloud Storage bucket. Establish an Identity and Access Management (IAM) role with write permissions to the bucket. Use the gsutil tool to directly copy files over the network to Cloud Storage.

B.

Set up a Cloud Interconnect connection between the on-premises network and Google Cloud. Establish a private endpoint for Filestore access. Transfer the data from the existing Network File System (NFS) to Filestore.

C.

Use Transfer Appliance to request an appliance. Load the data locally, and ship the appliance back to Google for ingestion into Cloud Storage.

D.

Use Storage Transfer Service to move the data from the selected on-premises file storage systems to a Cloud Storage bucket.

Questions 15

You are designing an application that lets users upload and share photos. You expect your application to grow really fast and you are targeting a worldwide audience. You want to delete uploaded photos after 30 days. You want to minimize costs while ensuring your application is highly available. Which GCP storage solution should you choose?

Options:

A.

Persistent SSD on VM instances.

B.

Cloud Filestore.

C.

Multiregional Cloud Storage bucket.

D.

Cloud Datastore database.

Questions 16

Your company publishes large files on an Apache web server that runs on a Compute Engine instance. The Apache web server is not the only application running in the project. You want to receive an email when the egress network costs for the server exceed 100 dollars for the current month as measured by Google Cloud Platform (GCP). What should you do?

Options:

A.

Set up a budget alert on the project with an amount of 100 dollars, a threshold of 100%, and notification type of “email.”

B.

Set up a budget alert on the billing account with an amount of 100 dollars, a threshold of 100%, and notification type of “email.”

C.

Export the billing data to BigQuery. Create a Cloud Function that uses BigQuery to sum the egress network costs of the exported billing data for the Apache web server for the current month and sends an email if it is over 100 dollars. Schedule the Cloud Function using Cloud Scheduler to run hourly.

D.

Use the Stackdriver Logging Agent to export the Apache web server logs to Stackdriver Logging. Create a Cloud Function that uses BigQuery to parse the HTTP response log data in Stackdriver for the current month and sends an email if the size of all HTTP responses, multiplied by current GCP egress prices, totals over 100 dollars. Schedule the Cloud Function using Cloud Scheduler to run hourly.

Questions 17

You need to select and configure compute resources for a set of batch processing jobs. These jobs take around 2 hours to complete and are run nightly. You want to minimize service costs. What should you do?

Options:

A.

Select Google Kubernetes Engine. Use a single-node cluster with a small instance type.

B.

Select Google Kubernetes Engine. Use a three-node cluster with micro instance types.

C.

Select Compute Engine. Use preemptible VM instances of the appropriate standard machine type.

D.

Select Compute Engine. Use VM instance types that support micro bursting.

Questions 18

Your company has developed a new application that consists of multiple microservices. You want to deploy the application to Google Kubernetes Engine (GKE), and you want to ensure that the cluster can scale as more applications are deployed in the future. You want to avoid manual intervention when each new application is deployed. What should you do?

Options:

A.

Deploy the application on GKE, and add a HorizontalPodAutoscaler to the deployment.

B.

Deploy the application on GKE, and add a VerticalPodAutoscaler to the deployment.

C.

Create a GKE cluster with autoscaling enabled on the node pool. Set a minimum and maximum for the size of the node pool.

D.

Create a separate node pool for each application, and deploy each application to its dedicated node pool.

Questions 19

You are working in a team that has developed a new application that needs to be deployed on Kubernetes. The production application is business critical and should be optimized for reliability. You need to provision a Kubernetes cluster and want to follow Google-recommended practices. What should you do?

Options:

A.

Create a GKE Autopilot cluster. Enroll the cluster in the rapid release channel.

B.

Create a GKE Autopilot cluster. Enroll the cluster in the stable release channel.

C.

Create a zonal GKE standard cluster. Enroll the cluster in the stable release channel.

D.

Create a regional GKE standard cluster. Enroll the cluster in the rapid release channel.

Questions 20

(You are deploying an application to Google Kubernetes Engine (GKE). The application needs to make API calls to a private Cloud Storage bucket. You need to configure your application Pods to authenticate to the Cloud Storage API, but your organization policy prevents the usage of service account keys. You want to follow Google-recommended practices. What should you do?)

Options:

A.

Create the GKE cluster and deploy the application. Request a security exception to create a Google service account key. Set the constraints/iam.serviceAccountKeyExpiryHours organization policy to 8 hours.

B.

Create the GKE cluster and deploy the application. Request a security exception to create a Google service account key. Set the constraints/iam.serviceAccountKeyExpiryHours organization policy to 24 hours.

C.

Create the GKE cluster with Workload Identity Federation. Configure the default node service account to access the bucket. Deploy the application into the cluster so the application can use the node service account permissions. Use Identity and Access Management (IAM) to grant the service account access to the bucket.

D.

Create the GKE cluster with Workload Identity Federation. Create a Google service account and a Kubernetes ServiceAccount, and configure both service accounts to use Workload Identity Federation. Attach the Kubernetes ServiceAccount to the application Pods and configure the Google service account to access the bucket with Identity and Access Management (IAM).

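For the Workload Identity option above, the binding between a Kubernetes ServiceAccount and a Google service account is typically wired up as follows. This is a sketch only; the project, namespace, account, and bucket names are assumptions:

    # Cluster must have Workload Identity enabled (project ID is a placeholder).
    gcloud container clusters create app-cluster \
        --workload-pool=my-project.svc.id.goog --zone us-central1-a

    kubectl create serviceaccount app-ksa --namespace default

    # Allow the Kubernetes ServiceAccount to impersonate the Google service account.
    gcloud iam service-accounts add-iam-policy-binding app-gsa@my-project.iam.gserviceaccount.com \
        --role roles/iam.workloadIdentityUser \
        --member "serviceAccount:my-project.svc.id.goog[default/app-ksa]"

    kubectl annotate serviceaccount app-ksa --namespace default \
        iam.gke.io/gcp-service-account=app-gsa@my-project.iam.gserviceaccount.com

    # Grant the Google service account read access to the private bucket.
    gsutil iam ch serviceAccount:app-gsa@my-project.iam.gserviceaccount.com:objectViewer \
        gs://example-private-bucket
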
Questions 21

Your company uses Cloud Storage to store application backup files for disaster recovery purposes. You want to follow Google’s recommended practices. Which storage option should you use?

Options:

A.

Multi-Regional Storage

B.

Regional Storage

C.

Nearline Storage

D.

Coldline Storage

Questions 22

(Your company uses a multi-cloud strategy that includes Google Cloud. You want to centralize application logs in a third-party software-as-a-service (SaaS) tool from all environments. You need to integrate logs originating from Cloud Logging, and you want to ensure the export occurs with the least amount of delay possible. What should you do?)

Options:

A.

Use a Cloud Scheduler cron job to trigger a Cloud Function that queries Cloud Logging and sends the logs to the SaaS tool.

B.

Create a Cloud Logging sink and configure Pub/Sub as the destination. Configure the SaaS tool to subscribe to the Pub/Sub topic to retrieve the logs.

C.

Create a Cloud Logging sink and configure Cloud Storage as the destination. Configure the SaaS tool to read the Cloud Storage bucket to retrieve the logs.

D.

Create a Cloud Logging sink and configure BigQuery as the destination. Configure the SaaS tool to query BigQuery to retrieve the logs.

Questions 23

You need to set up permissions for a set of Compute Engine instances to enable them to write data into a particular Cloud Storage bucket. You want to follow Google-recommended practices. What should you do?

Options:

A.

Create a service account with an access scope. Use the access scope ‘https://www.googleapis.com/auth/devstorage.write_only’.

B.

Create a service account with an access scope. Use the access scope ‘https://www.googleapis.com/auth/cloud-platform’.

C.

Create a service account and add it to the IAM role ‘storage.objectCreator’ for that bucket.

D.

Create a service account and add it to the IAM role ‘storage.objectAdmin’ for that bucket.

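For the bucket-write scenario above, a bucket-scoped IAM binding looks roughly like this. The service account and bucket names are illustrative:

    gcloud iam service-accounts create bucket-writer --display-name "Bucket writer"

    # Grant the objectCreator role on the specific bucket only.
    gsutil iam ch serviceAccount:bucket-writer@my-project.iam.gserviceaccount.com:objectCreator \
        gs://example-upload-bucket
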
Questions 24

You need to create a new billing account and then link it with an existing Google Cloud Platform project. What should you do?

Options:

A.

Verify that you are Project Billing Manager for the GCP project. Update the existing project to link it to the existing billing account.

B.

Verify that you are Project Billing Manager for the GCP project. Create a new billing account and link the new billing account to the existing project.

C.

Verify that you are Billing Administrator for the billing account. Create a new project and link the new project to the existing billing account.

D.

Verify that you are Billing Administrator for the billing account. Update the existing project to link it to the existing billing account.

Questions 25

You are the Google Cloud systems administrator for your organization. User A reports that they received an error when attempting to access the Cloud SQL database in their Google Cloud project, while User B can access the database. You need to troubleshoot the issue for User A, while following Google-recommended practices.

What should you do first?

Options:

A.

Confirm that network firewall rules are not blocking traffic for User A.

B.

Review recent configuration changes that may have caused unintended modifications to permissions.

C.

Verify that User A has the Identity and Access Management (IAM) Project Owner role assigned.

D.

Review the error message that User A received.

Questions 26

You need to assign a Cloud Identity and Access Management (Cloud IAM) role to an external auditor. The auditor needs to have permissions to review your Google Cloud Platform (GCP) Audit Logs and also to review your Data Access logs. What should you do?

Options:

A.

Assign the auditor the IAM role roles/logging.privateLogViewer. Perform the export of logs to Cloud Storage.

B.

Assign the auditor the IAM role roles/logging.privateLogViewer. Direct the auditor to also review the logs for changes to Cloud IAM policy.

C.

Assign the auditor’s IAM user to a custom role that has logging.privateLogEntries.list permission. Perform the export of logs to Cloud Storage.

D.

Assign the auditor’s IAM user to a custom role that has logging.privateLogEntries.list permission. Direct the auditor to also review the logs for changes to Cloud IAM policy.

Questions 27

You have a number of applications that have bursty workloads and are heavily dependent on topics to decouple publishing systems from consuming systems. Your company would like to go serverless to enable developers to focus on writing code without worrying about infrastructure. Your solution architect has already identified Cloud Pub/Sub as a suitable alternative for decoupling systems. You have been asked to identify a suitable GCP Serverless service that is easy to use with Cloud Pub/Sub. You want the ability to scale down to zero when there is no traffic in order to minimize costs. You want to follow Google recommended practices. What should you suggest?

Options:

A.

Cloud Run for Anthos

B.

Cloud Run

C.

App Engine Standard

D.

Cloud Functions.

Questions 28

You have created a code snippet that should be triggered whenever a new file is uploaded to a Cloud Storage bucket. You want to deploy this code snippet. What should you do?

Options:

A.

Use App Engine and configure Cloud Scheduler to trigger the application using Pub/Sub.

B.

Use Cloud Functions and configure the bucket as a trigger resource.

C.

Use Google Kubernetes Engine and configure a CronJob to trigger the application using Pub/Sub.

D.

Use Dataflow as a batch job, and configure the bucket as a data source.

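For the Cloud Functions option above, deploying with a Cloud Storage trigger is a single command. A sketch assuming a Python function and illustrative names:

    # The function runs whenever a new object is finalized in the bucket.
    gcloud functions deploy process-upload \
        --runtime python311 \
        --trigger-bucket example-upload-bucket \
        --entry-point process_upload \
        --source .
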
Questions 29

You have a large 5-TB AVRO file stored in a Cloud Storage bucket. Your analysts are proficient only in SQL and need access to the data stored in this file. You want to find a cost-effective way to complete their request as soon as possible. What should you do?

Options:

A.

Load data in Cloud Datastore and run a SQL query against it.

B.

Create a BigQuery table and load data in BigQuery. Run a SQL query on this table and drop this table after you complete your request.

C.

Create external tables in BigQuery that point to Cloud Storage buckets and run a SQL query on these external tables to complete your request.

D.

Create a Hadoop cluster and copy the AVRO file to HDFS by compressing it. Load the file in a Hive table and provide access to your analysts so that they can run SQL queries.

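For the external-table option above, BigQuery can query the AVRO file in place without loading it. A sketch with an assumed dataset name and object path:

    bq query --use_legacy_sql=false '
    CREATE EXTERNAL TABLE analytics.events_avro
    OPTIONS (
      format = "AVRO",
      uris = ["gs://example-data-bucket/events.avro"]
    )'

    bq query --use_legacy_sql=false 'SELECT COUNT(*) FROM analytics.events_avro'
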
Questions 30

Your company would like to store invoices and other financial documents in Google Cloud. You need to identify a Google-managed solution to store this information for your company. You must ensure that the documents are kept for a duration of three years. Your company's analysts need frequent access to invoices from the past six months. After six months, invoices should be archived for audit purposes only. You want to minimize costs and follow Google-recommended practices. What should you do?

Options:

A.

Use Cloud Storage with Object Lifecycle Management to change the object storage class to Coldline after six months.

B.

Use Cloud Storage with Object Lifecycle Management to change the object storage class to Standard after six months.

C.

Store your documents on Filestore and move the documents to Cloud Storage with object storage class set to Coldline after six months.

D.

Store your documents on Filestore and move the documents to Cloud Storage with object storage class set to Standard after six months.

Questions 31

You have sensitive data stored in three Cloud Storage buckets and have enabled data access logging. You want to verify activities for a particular user for these buckets, using the fewest possible steps. You need to verify the addition of metadata labels and which files have been viewed from those buckets. What should you do?

Options:

A.

Using the GCP Console, filter the Activity log to view the information.

B.

Using the GCP Console, filter the Stackdriver log to view the information.

C.

View the bucket in the Storage section of the GCP Console.

D.

Create a trace in Stackdriver to view the information.

Questions 32

You have created a new project in Google Cloud through the gcloud command line interface (CLI) and linked a billing account. You need to create a new Compute Engine instance using the CLI. You need to perform the prerequisite steps. What should you do?

Options:

A.

Create a Cloud Monitoring Workspace.

B.

Create a VPC network in the project.

C.

Enable the compute.googleapis.com API.

D.

Grant yourself the IAM role of Compute Admin.

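For reference on the question above, enabling the Compute Engine API from the CLI is a one-liner (the project ID is a placeholder):

    gcloud services enable compute.googleapis.com --project my-new-project
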
Questions 33

You are working for a startup that was officially registered as a business 6 months ago. As your customer base grows, your use of Google Cloud increases. You want to allow all engineers to create new projects without asking them for their credit card information. What should you do?

Options:

A.

Create a Billing account, associate a payment method with it, and provide all project creators with permission to associate that billing account with their projects.

B.

Grant all engineers permission to create their own billing accounts for each new project.

C.

Apply for monthly invoiced billing, and have a single invoice for the project paid by the finance team.

D.

Create a billing account, associate it with a monthly purchase order (PO), and send the PO to Google Cloud.

Questions 34

You manage three Google Cloud projects with the Cloud Monitoring API enabled. You want to follow Google-recommended practices to visualize CPU and network metrics for all three projects together. What should you do?

Options:

A.

1. Create a Cloud Monitoring dashboard. 2. Collect metrics and publish them into Pub/Sub topics. 3. Add CPU and network charts for each of the three projects.

B.

1. Create a Cloud Monitoring dashboard. 2. Select the CPU and network metrics from the three projects. 3. Add CPU and network charts for each of the three projects.

C.

1. Create a service account and apply roles/viewer on the three projects. 2. Collect metrics and publish them to the Cloud Monitoring API. 3. Add CPU and network charts for each of the three projects.

D.

1. Create a fourth Google Cloud project. 2. Create a Cloud Monitoring Workspace from the fourth project and add the other three projects.

Questions 35

You have one project called proj-sa where you manage all your service accounts. You want to be able to use a service account from this project to take snapshots of VMs running in another project called proj-vm. What should you do?

Options:

A.

Download the private key from the service account, and add it to each VMs custom metadata.

B.

Download the private key from the service account, and add the private key to each VM’s SSH keys.

C.

Grant the service account the IAM Role of Compute Storage Admin in the project called proj-vm.

D.

When creating the VMs, set the service account’s API scope for Compute Engine to read/write.

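For the cross-project scenario above, the role is granted on the target project to a service account that lives in the other project. A sketch; the service account name is an assumption:

    gcloud projects add-iam-policy-binding proj-vm \
        --member serviceAccount:snapshot-sa@proj-sa.iam.gserviceaccount.com \
        --role roles/compute.storageAdmin
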
Questions 36

You are configuring Cloud DNS. You want to create DNS records to point home.mydomain.com, mydomain.com, and www.mydomain.com to the IP address of your Google Cloud load balancer. What should you do?

Options:

A.

Create one CNAME record to point mydomain.com to the load balancer, and create two A records to point WWW and HOME to mydomain.com respectively.

B.

Create one CNAME record to point mydomain.com to the load balancer, and create two AAAA records to point WWW and HOME to mydomain.com respectively.

C.

Create one A record to point mydomain.com to the load balancer, and create two CNAME records to point WWW and HOME to mydomain.com respectively.

D.

Create one A record to point mydomain.com to the load balancer, and create two NS records to point WWW and HOME to mydomain.com respectively.

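For the Cloud DNS question above, the record types can be created as follows. A sketch; the zone name and the load balancer IP (a documentation address) are placeholders:

    gcloud dns record-sets create mydomain.com. --zone mydomain-zone \
        --type A --ttl 300 --rrdatas 203.0.113.10

    gcloud dns record-sets create www.mydomain.com. --zone mydomain-zone \
        --type CNAME --ttl 300 --rrdatas mydomain.com.

    gcloud dns record-sets create home.mydomain.com. --zone mydomain-zone \
        --type CNAME --ttl 300 --rrdatas mydomain.com.
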
Questions 37

You need to configure optimal data storage for files stored in Cloud Storage for minimal cost. The files are used in a mission-critical analytics pipeline that is used continually. The users are in Boston, MA (United States). What should you do?

Options:

A.

Configure regional storage for the region closest to the users. Configure a Nearline storage class.

B.

Configure regional storage for the region closest to the users. Configure a Standard storage class.

C.

Configure dual-regional storage for the dual region closest to the users. Configure a Nearline storage class.

D.

Configure dual-regional storage for the dual region closest to the users. Configure a Standard storage class.

Questions 38

You are building a multi-player gaming application that will store game information in a database. As the popularity of the application increases, you are concerned about delivering consistent performance. You need to ensure an optimal gaming performance for global users, without increasing the management complexity. What should you do?

Options:

A.

Use Cloud SQL database with cross-region replication to store game statistics in the EU, US, and APAC regions.

B.

Use Cloud Spanner to store user data mapped to the game statistics.

C.

Use BigQuery to store game statistics with a Redis on Memorystore instance in the front to provide global consistency.

D.

Store game statistics in a Bigtable database partitioned by username.

Questions 39

You created a Kubernetes deployment by running kubectl run nginx --image=nginx --labels=app=prod. Your Kubernetes cluster is also used by a number of other deployments. How can you find the identifier of the pods for this nginx deployment?

Options:

A.

kubectl get deployments --output=pods

B.

gcloud get pods --selector="app=prod"

C.

kubectl get pods -l "app=prod"

D.

gcloud list gke-deployments --filter={pod}

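For the deployment question above, pods carrying the label applied at creation time can be listed with a label selector:

    # Both forms are equivalent; -l is shorthand for --selector.
    kubectl get pods -l app=prod
    kubectl get pods --selector=app=prod -o wide
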
Questions 40

Your company stores data from multiple sources that have different data storage requirements. These data include:

1. Customer data that is structured and read with complex queries

2. Historical log data that is large in volume and accessed infrequently

3. Real-time sensor data with high-velocity writes, which needs to be available for analysis but can tolerate some data loss

You need to design the most cost-effective storage solution that fulfills all data storage requirements. What should you do?

Options:

A.

Use Spanner for all data.

B.

Use Cloud SQL for customer data, Cloud Storage (Coldline) for historical logs, and BigQuery for sensor data.

C.

Use Cloud SQL for customer data, Cloud Storage (Archive) for historical logs, and Bigtable for sensor data.

D.

Use Firestore for customer data, Cloud Storage (Nearline) for historical logs, and Bigtable for sensor data.

Questions 41

A company wants to build an application that stores images in a Cloud Storage bucket and wants to generate thumbnails as well as resize the images. They want to use a Google-managed service that can scale up and scale down to zero automatically with minimal effort. You have been asked to recommend a service. Which GCP service would you suggest?

Options:

A.

Google Compute Engine

B.

Google App Engine

C.

Cloud Functions

D.

Google Kubernetes Engine

Questions 42

You want to select and configure a cost-effective solution for relational data on Google Cloud Platform. You are working with a small set of operational data in one geographic location. You need to support point-in-time recovery. What should you do?

Options:

A.

Select Cloud SQL (MySQL). Verify that the enable binary logging option is selected.

B.

Select Cloud SQL (MySQL). Select the create failover replicas option.

C.

Select Cloud Spanner. Set up your instance with 2 nodes.

D.

Select Cloud Spanner. Set up your instance as multi-regional.

Questions 43

Your company wants to standardize the creation and management of multiple Google Cloud resources using Infrastructure as Code. You want to minimize the amount of repetitive code needed to manage the environment. What should you do?

Options:

A.

Create a bash script that contains all required steps as gcloud commands.

B.

Develop templates for the environment using Cloud Deployment Manager

C.

Use curl in a terminal to send a REST request to the relevant Google API for each individual resource.

D.

Use the Cloud Console interface to provision and manage all related resources

Questions 44

You need to create a custom IAM role for use with a GCP service. All permissions in the role must be suitable for production use. You also want to clearly share with your organization the status of the custom role. This will be the first version of the custom role. What should you do?

Options:

A.

Use permissions in your role that use the ‘supported’ support level for role permissions. Set the role stage to ALPHA while testing the role permissions.

B.

Use permissions in your role that use the ‘supported’ support level for role permissions. Set the role stage to BETA while testing the role permissions.

C.

Use permissions in your role that use the ‘testing’ support level for role permissions. Set the role stage to ALPHA while testing the role permissions.

D.

Use permissions in your role that use the ‘testing’ support level for role permissions. Set the role stage to BETA while testing the role permissions.

Questions 45

Your company uses Pub/Sub for event-driven workloads. You have a subscription named email-updates attached to the new-orders topic. You need to fetch and acknowledge waiting messages from this subscription. What should you do?

Options:

A.

Use the gcloud pubsub subscriptions seek email-updates command.

B.

Use the gcloud pubsub topics describe new-orders command.

C.

Use the gcloud pubsub subscriptions pull email-updates --auto-ack command.

D.

Use the gcloud pubsub topics list-subscriptions new-orders --filter="email-updates" command.

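For the Pub/Sub question above, pulling and acknowledging messages from the subscription looks like this (the message limit is illustrative):

    gcloud pubsub subscriptions pull email-updates --auto-ack --limit 10
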
Questions 46

You have a Bigtable instance that consists of three nodes that store personally identifiable information (PII) data. You need to log all read or write operations, including any metadata or configuration reads of this database table, in your company's Security Information and Event Management (SIEM) system. What should you do?

Options:

A.

• Navigate to Cloud Monitoring in the Google Cloud console, and create a custom monitoring job for the Bigtable instance to track all changes. • Create an alert by using webhook endpoints, with the SIEM endpoint as a receiver.

B.

• Navigate to the Audit Logs page in the Google Cloud console, and enable Data Read, Data Write, and Admin Read logs for the Bigtable instance. • Create a Pub/Sub topic as a Cloud Logging sink destination, and add your SIEM as a subscriber to the topic.

C.

• Install the Ops Agent on the Bigtable instance during configuration. • Create a service account with read permissions for the Bigtable instance. • Create a custom Dataflow job with this service account to export logs to the company's SIEM system.

D.

• Navigate to the Audit Logs page in the Google Cloud console, and enable Admin Write logs for the Bigtable instance. • Create a Cloud Functions instance to export logs from Cloud Logging to your SIEM.

Questions 47

You want to configure an SSH connection to a single Compute Engine instance for users in the dev1 group. This instance is the only resource in this particular Google Cloud Platform project that the dev1 users should be able to connect to. What should you do?

Options:

A.

Set metadata to enable-oslogin=true for the instance. Grant the dev1 group the compute.osLogin role. Direct them to use the Cloud Shell to ssh to that instance.

B.

Set metadata to enable-oslogin=true for the instance. Set the service account to no service account for that instance. Direct them to use the Cloud Shell to ssh to that instance.

C.

Enable block project wide keys for the instance. Generate an SSH key for each user in the dev1 group. Distribute the keys to dev1 users and direct them to use their third-party tools to connect.

D.

Enable block project wide keys for the instance. Generate an SSH key and associate the key with that instance. Distribute the key to dev1 users and direct them to use their third-party tools to connect.

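For the single-instance SSH scenario above, OS Login can be enabled per instance and the role granted on the instance rather than on the project. A sketch with assumed instance, zone, and group names:

    gcloud compute instances add-metadata dev-instance --zone us-central1-a \
        --metadata enable-oslogin=TRUE

    # Instance-level binding so dev1 can log in only to this VM.
    gcloud compute instances add-iam-policy-binding dev-instance --zone us-central1-a \
        --member group:dev1@example.com --role roles/compute.osLogin
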
Questions 48

Your company developed a mobile game that is deployed on Google Cloud. Gamers are connecting to the game with their personal phones over the Internet. The game sends UDP packets to update the servers about the gamers' actions while they are playing in multiplayer mode. Your game backend can scale over multiple virtual machines (VMs), and you want to expose the VMs over a single IP address. What should you do?

Options:

A.

Configure an SSL Proxy load balancer in front of the application servers.

B.

Configure an Internal UDP load balancer in front of the application servers.

C.

Configure an External HTTP(s) load balancer in front of the application servers.

D.

Configure an External Network load balancer in front of the application servers.

Questions 49

Your company runs a variety of applications and workloads on Google Cloud, and you are responsible for managing cloud costs. You need to identify a solution that enables you to perform detailed cost analysis. You also must be able to visualize the cost data in multiple ways on the same dashboard. What should you do?

Options:

A.

Use the cost breakdown report with the available filters from Cloud Billing to visualize the data.

B.

Enable the Cloud Billing export to BigQuery, and use Looker Studio to visualize the data.

C.

Run queries in Cloud Monitoring. Create dashboards to visualize the billing metrics.

D.

Enable Cloud Monitoring metrics export to BigQuery, and use Looker to visualize the data.

Questions 50

Your VMs are running in a subnet that has a subnet mask of 255.255.255.240. The current subnet has no more free IP addresses and you require an additional 10 IP addresses for new VMs. The existing and new VMs should all be able to reach each other without additional routes. What should you do?

Options:

A.

Use gcloud to expand the IP range of the current subnet.

B.

Delete the subnet, and recreate it using a wider range of IP addresses.

C.

Create a new project. Use Shared VPC to share the current network with the new project.

D.

Create a new subnet with the same starting IP but a wider range to overwrite the current subnet.

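For the subnet question above, a 255.255.255.240 mask is a /28; expanding it in place keeps the existing VMs reachable without new routes. A sketch with assumed subnet and region names (a /27 provides enough additional addresses for this case):

    gcloud compute networks subnets expand-ip-range my-subnet \
        --region us-central1 --prefix-length 27
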
Questions 51

You have been asked to set up the billing configuration for a new Google Cloud customer. Your customer wants to group resources that share common IAM policies. What should you do?

Options:

A.

Use labels to group resources that share common IAM policies

B.

Use folders to group resources that share common IAM policies

C.

Set up a proper billing account structure to group IAM policies

D.

Set up a proper project naming structure to group IAM policies

Questions 52

You are a Google Cloud organization administrator. You need to configure organization policies and log sinks on Google Cloud projects that cannot be removed by project users to comply with your company's security policies. The security policies are different for each company department. Each company department has a user with the Project Owner role assigned to their projects. What should you do?

Options:

A.

Organize projects under folders for each department. Configure both organization policies and log sinks on the folders

B.

Organize projects under folders for each department. Configure organization policies on the organization and log sinks on the folders.

C.

Use a standard naming convention for projects that includes the department name. Configure organization policies on the organization and log sinks on the projects.

D.

Use a standard naming convention for projects that includes the department name. Configure both organization policies and log sinks on the projects.

Questions 53

Your customer wants you to create a secure, publicly accessible website with autoscaling based on the compute instance CPU load. You want to enhance performance by storing static content in Cloud Storage. Which resources are needed to distribute the user traffic?

Options:

A.

A cross-region internal Application Load Balancer together with Identity-Aware Proxy to allow only HTTPS traffic.

B.

A global external Application Load Balancer with a managed SSL certificate to distribute the load and a URL map to target the requests for the static content to the Cloud Storage backend.

C.

A global external Network Load Balancer pointing to the backend instances to distribute the load evenly. The web servers will forward the request to the Cloud Storage as needed.

D.

A global external Application Load Balancer to distribute the load and a URL map to target the requests for the static content to the Cloud Storage backend. Install the HTTPS certificates on the instance.

Questions 54

You need to immediately change the storage class of an existing Google Cloud bucket. You need to reduce service cost for infrequently accessed files stored in that bucket and for all files that will be added to that bucket in the future. What should you do?

Options:

A.

Use gsutil to rewrite the storage class for the bucket. Change the default storage class for the bucket.

B.

Use gsutil to rewrite the storage class for the bucket. Set up Object Lifecycle Management on the bucket.

C.

Create a new bucket and change the default storage class for the bucket. Set up Object Lifecycle Management on the bucket.

D.

Create a new bucket and change the default storage class for the bucket. Import the files from the previous bucket into the new bucket.

Questions 55

You have an on-premises data analytics set of binaries that processes data files in memory for about 45 minutes every midnight. The sizes of those data files range from 1 gigabyte to 16 gigabytes. You want to migrate this application to Google Cloud with minimal effort and cost. What should you do?

Options:

A.

Upload the code to Cloud Functions. Use Cloud Scheduler to start the application.

B.

Create a container for the set of binaries. Use Cloud Scheduler to start a Cloud Run job for the container.

C.

Create a container for the set of binaries. Deploy the container to Google Kubernetes Engine (GKE) and use the Kubernetes scheduler to start the application.

D.

Lift and shift to a VM on Compute Engine. Use an instance schedule to start and stop the instance.

Questions 56

You are creating an application that will run on Google Kubernetes Engine. You have identified MongoDB as the most suitable database system for your application and want to deploy a managed MongoDB environment that provides a support SLA. What should you do?

Options:

A.

Create a Cloud Bigtable cluster and use the HBase API

B.

Deploy MongoDB Atlas from the Google Cloud Marketplace.

C.

Download a MongoDB installation package and run it on Compute Engine instances

D.

Download a MongoDB installation package, and run it on a Managed Instance Group

Questions 57

You are running an application on multiple virtual machines within a managed instance group and have autoscaling enabled. The autoscaling policy is configured so that additional instances are added to the group if the CPU utilization of instances goes above 80%. VMs are added until the instance group reaches its maximum limit of five VMs or until CPU utilization of instances lowers to 80%. The initial delay for HTTP health checks against the instances is set to 30 seconds. The virtual machine instances take around three minutes to become available for users. You observe that when the instance group autoscales, it adds more instances than necessary to support the levels of end-user traffic. You want to properly maintain instance group sizes when autoscaling. What should you do?

Options:

A.

Set the maximum number of instances to 1.

B.

Decrease the maximum number of instances to 3.

C.

Use a TCP health check instead of an HTTP health check.

D.

Increase the initial delay of the HTTP health check to 200 seconds.

Questions 58

You are writing a shell script that includes a few gcloud CLI commands to access some Google Cloud resources. You want to test the script in your local development environment with a service account in the most secure way. What should you do?

Options:

A.

Download the service account key file and save it in a secure location. Set the GOOGLE_APPLICATION_CREDENTIALS environment variable to the key file.

B.

Enable service account impersonation, and use the gcloud config set auth/impersonate_service_account command to use it by default.

C.

Generate an ID token for the service account. Use the token with the gcloud CLI commands.

D.

Download the service account key file, and use it to generate an access token. Use the token with the gcloud CLI commands.

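For the local-testing question above, service account impersonation avoids downloading any key. A sketch; the service account email is a placeholder, and your user needs roles/iam.serviceAccountTokenCreator on that service account:

    gcloud config set auth/impersonate_service_account \
        script-runner@my-project.iam.gserviceaccount.com

    # Alternatively, impersonate for a single command only.
    gcloud storage ls --impersonate-service-account=script-runner@my-project.iam.gserviceaccount.com
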
Questions 59

You need to manage a Cloud Spanner Instance for best query performance. Your instance in production runs in a single Google Cloud region. You need to improve performance in the shortest amount of time. You want to follow Google best practices for service configuration. What should you do?

Options:

A.

Create an alert in Cloud Monitoring to alert when the percentage of high-priority CPU utilization reaches 45%. If you exceed this threshold, add nodes to your instance.

B.

Create an alert in Cloud Monitoring to alert when the percentage of high-priority CPU utilization reaches 45%. Use database query statistics to identify queries that result in high CPU usage, and then rewrite those queries to optimize their resource usage.

C.

Create an alert in Cloud Monitoring to alert when the percentage of high-priority CPU utilization reaches 65%. If you exceed this threshold, add nodes to your instance.

D.

Create an alert in Cloud Monitoring to alert when the percentage of high-priority CPU utilization reaches 65%. Use database query statistics to identify queries that result in high CPU usage, and then rewrite those queries to optimize their resource usage.

Questions 60

You deployed an App Engine application using gcloud app deploy, but it did not deploy to the intended project. You want to find out why this happened and where the application deployed. What should you do?

Options:

A.

Check the app.yaml file for your application and check project settings.

B.

Check the web-application.xml file for your application and check project settings.

C.

Go to Deployment Manager and review settings for deployment of applications.

D.

Go to Cloud Shell and run gcloud config list to review the Google Cloud configuration used for deployment.

Questions 61

You are the project owner of a GCP project and want to delegate control to colleagues to manage buckets and files in Cloud Storage. You want to follow Google-recommended practices. Which IAM roles should you grant your colleagues?

Options:

A.

Project Editor

B.

Storage Admin

C.

Storage Object Admin

D.

Storage Object Creator

Questions 62

You need to track and verify modifications to a set of Google Compute Engine instances in your Google Cloud project. In particular, you want to verify OS system patching events on your virtual machines (VMs). What should you do?

Options:

A.

Review the Compute Engine activity logs. Select and review the Admin Event logs.

B.

Review the Compute Engine activity logs. Select and review the System Event logs.

C.

Install the Cloud Logging agent. In Cloud Logging, review the Compute Engine syslog logs.

D.

Install the Cloud Logging agent. In Cloud Logging, review the Compute Engine operation logs.

Questions 63

Your management has asked an external auditor to review all the resources in a specific project. The security team has enabled the Organization Policy called Domain Restricted Sharing on the organization node by specifying only your Cloud Identity domain. You want the auditor to only be able to view, but not modify, the resources in that project. What should you do?

Options:

A.

Ask the auditor for their Google account, and give them the Viewer role on the project.

B.

Ask the auditor for their Google account, and give them the Security Reviewer role on the project.

C.

Create a temporary account for the auditor in Cloud Identity, and give that account the Viewer role on the project.

D.

Create a temporary account for the auditor in Cloud Identity, and give that account the Security Reviewer role on the project.

Questions 64

You are building an application that processes data files uploaded from thousands of suppliers. Your primary goals for the application are data security and the expiration of aged data. You need to design the application to:

•Restrict access so that suppliers can access only their own data.

•Give suppliers write access to data only for 30 minutes.

•Delete data that is over 45 days old.

You have a very short development cycle, and you need to make sure that the application requires minimal maintenance. Which two strategies should you use? (Choose two.)

Options:

A.

Build a lifecycle policy to delete Cloud Storage objects after 45 days.

B.

Use signed URLs to allow suppliers limited time access to store their objects.

C.

Set up an SFTP server for your application, and create a separate user for each supplier.

D.

Build a Cloud function that triggers a timer of 45 days to delete objects that have expired.

E.

Develop a script that loops through all Cloud Storage buckets and deletes any buckets that are older than 45 days.

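For the two Cloud Storage strategies above, both pieces are short. A sketch only; the bucket, object path, and key file are placeholders, and gsutil signurl assumes a service account key is available locally:

    # 30-minute signed URL that allows a supplier to upload one object.
    gsutil signurl -m PUT -d 30m supplier-sa-key.json \
        gs://example-supplier-data/supplier-123/upload.csv

    # Delete objects older than 45 days.
    cat > lifecycle.json <<'EOF'
    {"rule": [{"action": {"type": "Delete"}, "condition": {"age": 45}}]}
    EOF
    gsutil lifecycle set lifecycle.json gs://example-supplier-data
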
Questions 65

You recently discovered that your developers are using many service account keys during their development process. While you work on a long term improvement, you need to quickly implement a process to enforce short-lived service account credentials in your company. You have the following requirements:

• All service accounts that require a key should be created in a centralized project called pj-sa.

• Service account keys should only be valid for one day.

You need a Google-recommended solution that minimizes cost. What should you do?

Options:

A.

Implement a Cloud Run job to rotate all service account keys periodically in pj-sa. Enforce an org policy to deny service account key creation with an exception to pj-sa.

B.

Implement a Kubernetes CronJob to rotate all service account keys periodically. Disable attachment of service accounts to resources in all projects with an exception to pj-sa.

C.

Enforce an org policy constraint allowing the lifetime of service account keys to be 24 hours. Enforce an org policy constraint denying service account key creation with an exception on pj-sa.

D.

Enforce a DENY org policy constraint over the lifetime of service account keys for 24 hours. Disable attachment of service accounts to resources in all projects with an exception to pj-sa.

Questions 66

You are managing several Google Cloud Platform (GCP) projects and need access to all logs for the past 60 days. You want to be able to explore and quickly analyze the log contents. You want to follow Google-recommended practices to obtain the combined logs for all projects. What should you do?

Options:

A.

Navigate to Stackdriver Logging and select resource.labels.project_id="*"

B.

Create a Stackdriver Logging Export with a Sink destination to a BigQuery dataset. Configure the table expiration to 60 days.

C.

Create a Stackdriver Logging Export with a Sink destination to Cloud Storage. Create a lifecycle rule to delete objects after 60 days.

D.

Configure a Cloud Scheduler job to read from Stackdriver and store the logs in BigQuery. Configure the table expiration to 60 days.

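For the log-export question above, a sink per project (or an aggregated sink at the folder or organization level) routes logs into a BigQuery dataset whose tables can expire automatically. A sketch with assumed project and dataset names:

    gcloud logging sinks create all-logs \
        bigquery.googleapis.com/projects/central-logging/datasets/combined_logs
    # Remember to grant the sink's writer identity access to the dataset.

    # 60 days expressed in seconds for the default table expiration.
    bq update --default_table_expiration 5184000 central-logging:combined_logs
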
Questions 67

You need to host an application on a Compute Engine instance in a project shared with other teams. You want to prevent the other teams from accidentally causing downtime on that application. Which feature should you use?

Options:

A.

Use a Shielded VM.

B.

Use a Preemptible VM.

C.

Use a sole-tenant node.

D.

Enable deletion protection on the instance.

Questions 68

Your organization has user identities in Active Directory. Your organization wants to use Active Directory as their source of truth for identities. Your organization wants to have full control over the Google accounts used by employees for all Google services, including your Google Cloud Platform (GCP) organization. What should you do?

Options:

A.

Use Google Cloud Directory Sync (GCDS) to synchronize users into Cloud Identity.

B.

Use the cloud Identity APIs and write a script to synchronize users to Cloud Identity.

C.

Export users from Active Directory as a CSV and import them to Cloud Identity via the Admin Console.

D.

Ask each employee to create a Google account using self signup. Require that each employee use their company email address and password.

Questions 69

Your customer has implemented a solution that uses Cloud Spanner and notices some read latency-related performance issues on one table. This table is accessed only by their users using a primary key. The table schema is shown below.

[Image: table schema referenced above, not reproduced here]

You want to resolve the issue. What should you do?

[Image: answer options A-D for this question, shown as an image]

Options:

A.

Option A

B.

Option B

C.

Option C

D.

Option D

Questions 70

You have several hundred microservice applications running in a Google Kubernetes Engine (GKE) cluster. Each microservice is a deployment with resource limits configured for each container in the deployment. You've observed that the resource limits for memory and CPU are not appropriately set for many of the microservices. You want to ensure that each microservice has right sized limits for memory and CPU. What should you do?

Options:

A.

Modify the cluster's node pool machine type and choose a machine type with more memory and CPU.

B.

Configure a Horizontal Pod Autoscaler for each microservice.

C.

Configure GKE cluster autoscaling.

D.

Configure a Vertical Pod Autoscaler for each microservice.

Questions 71

You have a Compute Engine instance hosting a production application. You want to receive an email if the instance consumes more than 90% of its CPU resources for more than 15 minutes. You want to use Google services. What should you do?

Options:

A.

1. Create a consumer Gmail account. 2. Write a script that monitors the CPU usage. 3. When the CPU usage exceeds the threshold, have that script send an email using the Gmail account and smtp.gmail.com on port 25 as the SMTP server.

B.

1. Create a Stackdriver Workspace, and associate your Google Cloud Platform (GCP) project with it. 2. Create an Alerting Policy in Stackdriver that uses the threshold as a trigger condition. 3. Configure your email address in the notification channel.

C.

1. Create a Stackdriver Workspace, and associate your GCP project with it. 2. Write a script that monitors the CPU usage and sends it as a custom metric to Stackdriver. 3. Create an uptime check for the instance in Stackdriver.

D.

1. In Stackdriver Logging, create a logs-based metric to extract the CPU usage by using this regular expression: CPU Usage: ([0-9]{1,3})%. 2. In Stackdriver Monitoring, create an Alerting Policy based on this metric. 3. Configure your email address in the notification channel.

Questions 72

You are developing a new application and are looking for a Jenkins installation to build and deploy your source code. You want to automate the installation as quickly and easily as possible. What should you do?

Options:

A.

Deploy Jenkins through the Google Cloud Marketplace.

B.

Create a new Compute Engine instance. Run the Jenkins executable.

C.

Create a new Kubernetes Engine cluster. Create a deployment for the Jenkins image.

D.

Create an instance template with the Jenkins executable. Create a managed instance group with this template.

Questions 73

(You need to migrate multiple PostgreSQL databases from your on-premises data center to Google Cloud. You want to significantly improve the performance of your databases while minimizing changes to your data schema and application code. You expect to exceed 150 TB of data per geographical region. You want to follow Google-recommended practices and minimize your operational costs. What should you do?)

Options:

A.

Migrate your data to AlloyDB.

B.

Migrate your data to Spanner.

C.

Migrate your data to Firebase.

D.

Migrate your data to Bigtable.

Questions 74

You need to update a deployment in Deployment Manager without any resource downtime in the deployment. Which command should you use?

Options:

A.

gcloud deployment-manager deployments create --config

B.

gcloud deployment-manager deployments update --config

C.

gcloud deployment-manager resources create --config

D.

gcloud deployment-manager resources update --config

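For the Deployment Manager question above, an in-place update against an existing deployment looks like this (deployment and config file names are illustrative):

    # Add --preview to stage the change before committing it.
    gcloud deployment-manager deployments update my-deployment --config config.yaml
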
Questions 75

Your company is moving from an on-premises environment to Google Cloud Platform (GCP). You have multiple development teams that use Cassandra environments as backend databases. They all need a development environment that is isolated from other Cassandra instances. You want to move to GCP quickly and with minimal support effort. What should you do?

Options:

A.

1. Build an instruction guide to install Cassandra on GCP. 2. Make the instruction guide accessible to your developers.

B.

1. Advise your developers to go to Cloud Marketplace. 2. Ask the developers to launch a Cassandra image for their development work.

C.

1. Build a Cassandra Compute Engine instance and take a snapshot of it. 2. Use the snapshot to create instances for your developers.

D.

1. Build a Cassandra Compute Engine instance and take a snapshot of it. 2. Upload the snapshot to Cloud Storage and make it accessible to your developers. 3. Build instructions to create a Compute Engine instance from the snapshot so that developers can do it themselves.

Questions 76

Your organization uses G Suite for communication and collaboration. All users in your organization have a G Suite account. You want to grant some G Suite users access to your Cloud Platform project. What should you do?

Options:

A.

Enable Cloud Identity in the GCP Console for your domain.

B.

Grant them the required IAM roles using their G Suite email address.

C.

Create a CSV sheet with all users’ email addresses. Use the gcloud command line tool to convert them into Google Cloud Platform accounts.

D.

In the G Suite console, add the users to a special group called cloud-console-users@yourdomain.com. Rely on the default behavior of the Cloud Platform to grant users access if they are members of this group.

Buy Now
Questions 77

You have deployed an application on a Compute Engine instance. An external consultant needs to access the Linux-based instance. The consultant is connected to your corporate network through a VPN connection, but the consultant has no Google account. What should you do?

Options:

A.

Instruct the external consultant to use the gcloud compute ssh command line tool by using Identity-Aware Proxy to access the instance.

B.

Instruct the external consultant to use the gcloud compute ssh command line tool by using the public IP address of the instance to access it.

C.

Instruct the external consultant to generate an SSH key pair, and request the public key from the consultant.Add the public key to the instance yourself, and have the consultant access the instance through SSH with their private key.

D.

Instruct the external consultant to generate an SSH key pair, and request the private key from the consultant.Add the private key to the instance yourself, and have the consultant access the instance through SSH with their public key.

Buy Now
Questions 78

You are running out of primary internal IP addresses in a subnet for a custom mode VPC. The subnet has the IP range 10.0.0.0/20, and the IP addresses are primarily used by virtual machines in the project. You need to provide more IP addresses for the virtual machines. What should you do?

Options:

A.

Change the subnet IP range from 10.0.0.0/20 to 10.0.0.0/22.

B.

Change the subnet IP range from 10.0.0.0/20 to 10.0.0.0/18.

C.

Add a secondary IP range 10.1.0.0/20 to the subnet.

D.

Convert the subnet IP range from IPv4 to IPv6.
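
For reference, a minimal sketch of widening a subnet's primary range with gcloud, as described in option B; the subnet name and region below are hypothetical:

# Expand the subnet's primary IPv4 range from /20 to /18 (ranges can only be expanded, never shrunk)
gcloud compute networks subnets expand-ip-range my-subnet \
    --region=us-central1 \
    --prefix-length=18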

Buy Now
Questions 79

Your company wants to provide engineers with access to explore Google Cloud freely in a sandbox environment. The total budget for testing across your organization is $1,000. You need to ensure that engineers are notified when the budget is about to be reached. You want to automate your solution as much as possible. What should you do?

Options:

A.

Configure a billing data export to a BigQuery dataset on the Cloud Billing account. Create a dashboard for all costs related to the sandbox experiments. Share the dashboard with all engineers.

B.

Create a Google Cloud Folder and ensure that all sandbox projects are located under that Folder. Create a Budget Alert for $1,000 and scope it to the Folder. Configure an email alert to billing administrators and users once the budget is 90% reached.

C.

Create a separate Cloud Billing account for all sandbox projects. Link a credit card with a limit of $1,000 to this billing account. Ensure all sandbox projects are linked to this new Cloud Billing account.

D.

Create an email template reminding people to regularly check their Google Cloud spend. Create a Cloud Run function that sends the email to all the project owners. Create a daily job in Cloud Scheduler that triggers the Cloud Run function. Deploy this solution for each sandbox.

Buy Now
Questions 80

Your coworker has helped you set up several configurations for gcloud. You've noticed that you're running commands against the wrong project. Being new to the company, you haven't yet memorized any of the projects. With the fewest steps possible, what's the fastest way to switch to the correct configuration?

Options:

A.

Run gcloud configurations list followed by gcloud configurations activate .

B.

Run gcloud config list followed by gcloud config activate.

C.

Run gcloud config configurations list followed by gcloud config configurations activate.

D.

Re-authenticate with the gcloud auth login command and select the correct configurations on login.
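
For reference, a minimal sketch of listing and switching gcloud configurations; the configuration name below is hypothetical:

# Show all named configurations and which one is active
gcloud config configurations list
# Switch to the configuration tied to the correct project
gcloud config configurations activate my-config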

Buy Now
Questions 81

Your team is building a website that handles votes from a large user population. The incoming votes will arrive at various rates. You want to optimize the storage and processing of the votes. What should you do?

Options:

A.

Save the incoming votes to Firestore. Use Cloud Scheduler to trigger a Cloud Functions instance to periodically process the votes.

B.

Use a dedicated instance to process the incoming votes. Send the votes directly to this instance.

C.

Save the incoming votes to a JSON file on Cloud Storage. Process the votes in a batch at the end of the day.

D.

Save the incoming votes to Pub/Sub. Use the Pub/Sub topic to trigger a Cloud Functions instance to process the votes.

Buy Now
Questions 82

You have an object in a Cloud Storage bucket that you want to share with an external company. The object contains sensitive data. You want access to the content to be removed after four hours. The external company does not have a Google account to which you can grant specific user-based access privileges. You want to use the most secure method that requires the fewest steps. What should you do?

Options:

A.

Create a signed URL with a four-hour expiration and share the URL with the company.

B.

Set object access to ‘public’ and use object lifecycle management to remove the object after four hours.

C.

Configure the storage bucket as a static website and furnish the object’s URL to the company. Delete the object from the storage bucket after four hours.

D.

Create a new Cloud Storage bucket specifically for the external company to access. Copy the object to that bucket. Delete the bucket after four hours have passed.
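
For reference, a minimal sketch of generating a time-limited signed URL as referenced in option A, assuming a hypothetical service account key file, bucket, and object:

# Create a signed URL that expires after four hours
gsutil signurl -d 4h sa-private-key.json gs://my-bucket/sensitive-report.csv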

Buy Now
Questions 83

You have downloaded and installed the gcloud command line interface (CLI) and have authenticated with your Google Account. Most of your Compute Engine instances in your project run in the europe-west1-d zone. You want to avoid having to specify this zone with each CLI command when managing these instances. What should you do?

Options:

A.

Set the europe-west1-d zone as the default zone using the gcloud config subcommand.

B.

In the Settings page for Compute Engine under Default location, set the zone to europe-west1-d.

C.

In the CLI installation directory, create a file called default.conf containing zone=europe-west1-d.

D.

Create a Metadata entry on the Compute Engine page with key compute/zone and value europe-west1-d.
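
For reference, a minimal sketch of setting a default zone with the gcloud config subcommand mentioned in option A:

# Make europe-west1-d the default zone for subsequent gcloud compute commands
gcloud config set compute/zone europe-west1-d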

Buy Now
Questions 84

You are creating a Google Kubernetes Engine (GKE) cluster with a cluster autoscaler feature enabled. You need to make sure that each node of the cluster will run a monitoring pod that sends container metrics to a third-party monitoring solution. What should you do?

Options:

A.

Deploy the monitoring pod in a StatefulSet object.

B.

Deploy the monitoring pod in a DaemonSet object.

C.

Reference the monitoring pod in a Deployment object.

D.

Reference the monitoring pod in a cluster initializer at the GKE cluster creation time.
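
For reference, a minimal sketch of a DaemonSet, the object type named in option B, which schedules one copy of the monitoring pod on every node, including nodes added later by the cluster autoscaler; the names and image below are hypothetical:

kubectl apply -f - <<EOF
apiVersion: apps/v1
kind: DaemonSet
metadata:
  name: monitoring-agent
spec:
  selector:
    matchLabels:
      app: monitoring-agent
  template:
    metadata:
      labels:
        app: monitoring-agent
    spec:
      containers:
      - name: agent
        image: example.com/monitoring-agent:latest   # placeholder image
EOF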

Buy Now
Questions 85

You received a JSON file that contained a private key of a Service Account in order to get access to several resources in a Google Cloud project. You downloaded and installed the Cloud SDK and want to use this private key for authentication and authorization when performing gcloud commands. What should you do?

Options:

A.

Use the command gcloud auth login and point it to the private key

B.

Use the command gcloud auth activate-service-account and point it to the private key

C.

Place the private key file in the installation directory of the Cloud SDK and rename it to "credentials.json"

D.

Place the private key file in your home directory and rename it to "GOOGLE_APPLICATION_CREDENTIALS".
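
For reference, a minimal sketch of the command referenced in option B; the key file path below is hypothetical:

# Authorize gcloud with the service account's JSON private key
gcloud auth activate-service-account --key-file=/path/to/key.json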

Buy Now
Questions 86

You significantly changed a complex Deployment Manager template and want to confirm that the dependencies of all defined resources are properly met before committing it to the project. You want the most rapid feedback on your changes. What should you do?

Options:

A.

Use granular logging statements within a Deployment Manager template authored in Python.

B.

Monitor activity of the Deployment Manager execution on the Stackdriver Logging page of the GCP Console.

C.

Execute the Deployment Manager template against a separate project with the same configuration, and monitor for failures.

D.

Execute the Deployment Manager template using the --preview option in the same project, and observe the state of interdependent resources.
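
For reference, a minimal sketch of a preview run as described in option D, assuming a hypothetical deployment named my-deployment and config file config.yaml:

# Stage the changes without creating or modifying resources, then inspect the previewed state
gcloud deployment-manager deployments update my-deployment \
    --config config.yaml \
    --preview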

Buy Now
Questions 87

You will have several applications running on different Compute Engine instances in the same project. You want to specify at a more granular level the service account each instance uses when calling Google Cloud APIs. What should you do?

Options:

A.

When creating the instances, specify a Service Account for each instance

B.

When creating the instances, assign the name of each Service Account as instance metadata

C.

After starting the instances, use gcloud compute instances update to specify a Service Account for each instance

D.

After starting the instances, use gcloud compute instances update to assign the name of the relevant Service Account as instance metadata
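
For reference, a minimal sketch of attaching a dedicated service account at instance creation time, as described in option A; all names below are hypothetical:

# Create an instance that runs as its own service account
gcloud compute instances create app-1 \
    --zone=us-central1-a \
    --service-account=app1-sa@my-project.iam.gserviceaccount.com \
    --scopes=cloud-platform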

Buy Now
Questions 88

You have production and test workloads that you want to deploy on Compute Engine. Production VMs need to be in a different subnet than the test VMs. All the VMs must be able to reach each other over internal IP without creating additional routes. You need to set up VPC and the 2 subnets. Which configuration meets these requirements?

Options:

A.

Create a single custom VPC with 2 subnets. Create each subnet in a different region and with a different CIDR range.

B.

Create a single custom VPC with 2 subnets. Create each subnet in the same region and with the same CIDR range.

C.

Create 2 custom VPCs, each with a single subnet. Create each subnet in a different region and with a different CIDR range.

D.

Create 2 custom VPCs, each with a single subnet. Create each subnet in the same region and with the same CIDR range.
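
For reference, a minimal sketch of the single custom VPC with two subnets referenced in options A and B; the VPC name, regions, and CIDR ranges below are hypothetical (subnets in one VPC must use non-overlapping ranges):

# One custom-mode VPC
gcloud compute networks create my-vpc --subnet-mode=custom
# Two subnets with non-overlapping ranges
gcloud compute networks subnets create prod-subnet \
    --network=my-vpc --region=us-central1 --range=10.0.1.0/24
gcloud compute networks subnets create test-subnet \
    --network=my-vpc --region=us-east1 --range=10.0.2.0/24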

Buy Now
Questions 89

You are running multiple VPC-native Google Kubernetes Engine clusters in the same subnet. The IPs available for the nodes are exhausted, and you want to ensure that the clusters can grow in nodes when needed. What should you do?

Options:

A.

Create a new subnet in the same region as the subnet being used.

B.

Add an alias IP range to the subnet used by the GKE clusters.

C.

Create a new VPC, and set up VPC peering with the existing VPC.

D.

Expand the CIDR range of the relevant subnet for the cluster.

Buy Now
Questions 90

You are building a product on top of Google Kubernetes Engine (GKE). You have a single GKE cluster. For each of your customers, a Pod is running in that cluster, and your customers can run arbitrary code inside their Pod. You want to maximize the isolation between your customers’ Pods. What should you do?

Options:

A.

Use Binary Authorization and whitelist only the container images used by your customers’ Pods.

B.

Use the Container Analysis API to detect vulnerabilities in the containers used by your customers’ Pods.

C.

Create a GKE node pool with a sandbox type configured to gvisor. Add the parameter runtimeClassName: gvisor to the specification of your customers’ Pods.

D.

Use the cos_containerd image for your GKE nodes. Add a nodeSelector with the value cloud.google.com/gke-os-distribution: cos_containerd to the specification of your customers’ Pods.
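
For reference, a minimal sketch of the gVisor sandbox setup referenced in option C; the cluster, node pool, Pod, and image names below are hypothetical:

# Node pool whose nodes run the gVisor sandbox
gcloud container node-pools create sandbox-pool \
    --cluster=my-cluster \
    --zone=us-central1-a \
    --sandbox=type=gvisor

# Customer Pod scheduled onto the sandboxed runtime
kubectl apply -f - <<EOF
apiVersion: v1
kind: Pod
metadata:
  name: customer-pod
spec:
  runtimeClassName: gvisor
  containers:
  - name: app
    image: example.com/customer-app:latest   # placeholder image
EOF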

Buy Now
Questions 91

You are storing sensitive information in a Cloud Storage bucket. For legal reasons, you need to be able to record all requests that read any of the stored data. You want to make sure you comply with these requirements. What should you do?

Options:

A.

Enable the Identity Aware Proxy API on the project.

B.

Scan the bucket using the Data Loss Prevention API.

C.

Allow only a single Service Account access to read the data.

D.

Enable Data Access audit logs for the Cloud Storage API.

Buy Now
Questions 92

You are building an archival solution for your data warehouse and have selected Cloud Storage to archive your data. Your users need to be able to access this archived data once a quarter for some regulatory requirements. You want to select a cost-efficient option. Which storage option should you use?

Options:

A.

Coldline Storage

B.

Nearline Storage

C.

Regional Storage

D.

Multi-Regional Storage

Buy Now
Questions 93

You need to enable traffic between multiple groups of Compute Engine instances that are currently running in two different GCP projects. Each group of Compute Engine instances is running in its own VPC. What should you do?

Options:

A.

Verify that both projects are in a GCP Organization. Create a new VPC and add all instances.

B.

Verify that both projects are in a GCP Organization. Share the VPC from one project and request that the Compute Engine instances in the other project use this shared VPC.

C.

Verify that you are the Project Administrator of both projects. Create two new VPCs and add all instances.

D.

Verify that you are the Project Administrator of both projects. Create a new VPC and add all instances.
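
For reference, a minimal sketch of the Shared VPC setup referenced in option B; the host and service project IDs below are hypothetical:

# Enable Shared VPC on the host project, then attach the service project
gcloud compute shared-vpc enable host-project-id
gcloud compute shared-vpc associated-projects add service-project-id \
    --host-project=host-project-id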

Buy Now
Questions 94

(Your company’s developers use an automation that you recently built to provision Linux VMs in Compute Engine within a Google Cloud project to perform various tasks. You need to manage the Linux account lifecycle and access for these users. You want to follow Google-recommended practices to simplify access management while minimizing operational costs. What should you do?)

Options:

A.

Enable OS Login for all VMs. Use IAM roles to grant user permissions.

B.

Enable OS Login for all VMs. Write custom startup scripts to update user permissions.

C.

Require your developers to create public SSH keys. Make the owner of the public key the root user.

D.

Require your developers to create public SSH keys. Write custom startup scripts to update user permissions.
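
For reference, a minimal sketch of enabling OS Login and granting access through IAM, as referenced in option A; the project ID and user below are hypothetical:

# Turn on OS Login for all VMs in the project
gcloud compute project-info add-metadata --metadata enable-oslogin=TRUE
# Grant a developer SSH access via an IAM role
gcloud projects add-iam-policy-binding my-project \
    --member="user:dev@example.com" \
    --role="roles/compute.osLogin"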

Buy Now
Questions 95

You are assigned to maintain a Google Kubernetes Engine (GKE) cluster named dev that was deployed on Google Cloud. You want to manage the GKE configuration using the command line interface (CLI). You have just downloaded and installed the Cloud SDK. You want to ensure that future CLI commands by default address this specific cluster. What should you do?

Options:

A.

Use the command gcloud config set container/cluster dev.

B.

Use the command gcloud container clusters update dev.

C.

Create a file called gke.default in the ~/.gcloud folder that contains the cluster name.

D.

Create a file called defaults.json in the ~/.gcloud folder that contains the cluster name.
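
For reference, a minimal sketch of the property referenced in option A:

# Make the dev cluster the default for subsequent gcloud container commands
gcloud config set container/cluster dev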

Buy Now
Questions 96

You deployed an LDAP server on Compute Engine that is reachable via TLS through port 636 using UDP. You want to make sure it is reachable by clients over that port. What should you do?

Options:

A.

Add the network tag allow-udp-636 to the VM instance running the LDAP server.

B.

Create a route called allow-udp-636 and set the next hop to be the VM instance running the LDAP server.

C.

Add a network tag of your choice to the instance. Create a firewall rule to allow ingress on UDP port 636 for that network tag.

D.

Add a network tag of your choice to the instance running the LDAP server. Create a firewall rule to allow egress on UDP port 636 for that network tag.
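
For reference, a minimal sketch of tagging the instance and opening UDP 636 with a firewall rule, as described in option C; the instance name, zone, tag, and network below are hypothetical:

# Tag the LDAP server instance
gcloud compute instances add-tags ldap-server --zone=us-central1-a --tags=ldap
# Allow ingress UDP 636 to instances carrying that tag
gcloud compute firewall-rules create allow-ldap-udp-636 \
    --network=default \
    --direction=INGRESS \
    --action=ALLOW \
    --rules=udp:636 \
    --target-tags=ldap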

Buy Now
Questions 97

Your company wants to migrate your data from an on-premises relational database to Google Cloud. Your current database can no longer scale with respect to the growth of your users and you expect the number of users to rapidly grow. You need to choose a relational database that allows you to globally scale while minimizing your management and administration efforts. You also want to follow Google-recommended practices. What should you do?

Options:

A.

Use Cloud SQL.

B.

Use Filestore.

C.

Use Spanner.

D.

Use BigQuery.

Buy Now
Questions 98

You are asked to set up application performance monitoring on Google Cloud projects A, B, and C as a single pane of glass. You want to monitor CPU, memory, and disk. What should you do?

Options:

A.

Enable API and then share charts from project A, B, and C.

B.

Enable API and then give the metrics.reader role to projects A, B, and C.

C.

Enable API and then use default dashboards to view all projects in sequence.

D.

Enable API, create a workspace under project A, and then add project B and C.

Buy Now
Questions 99

You are working for a hospital that stores its medical images in an on-premises data room. The hospital wants to use Cloud Storage for archival storage of these images. The hospital wants an automated process to upload any new medical images to Cloud Storage. You need to design and implement a solution. What should you do?

Options:

A.

Deploy a Dataflow job from the batch template "Datastore to Cloud Storage". Schedule the batch job on the desired interval.

B.

In the Cloud Console, go to Cloud Storage. Upload the relevant images to the appropriate bucket.

C.

Create a script that uses the gsutil command line interface to synchronize the on-premises storage with Cloud Storage. Schedule the script as a cron job.

D.

Create a Pub/Sub topic, and enable a Cloud Storage trigger for the Pub/Sub topic. Create an application that sends all medical images to the Pub/Sub topic.
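
For reference, a minimal sketch of the gsutil synchronization approach described in option C; the local path, bucket, and script name below are hypothetical:

# Mirror the on-premises image directory into Cloud Storage
gsutil -m rsync -r /data/medical-images gs://hospital-archive-bucket
# Example crontab entry running a wrapper script nightly at 01:00
0 1 * * * /usr/local/bin/sync-images.sh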

Buy Now
Questions 100

You’ve deployed a microservice called myapp1 to a Google Kubernetes Engine cluster using the YAML file specified below:

[Image: Associate-Cloud-Engineer Question 100 - YAML manifest for myapp1 in which the database password appears as a plain-text DB_PASSWORD environment variable]

You need to refactor this configuration so that the database password is not stored in plain text. You want to follow Google-recommended practices. What should you do?

Options:

A.

Store the database password inside the Docker image of the container, not in the YAML file.

B.

Store the database password inside a Secret object. Modify the YAML file to populate the DB_PASSWORD environment variable from the Secret.

C.

Store the database password inside a ConfigMap object. Modify the YAML file to populate the DB_PASSWORD environment variable from the ConfigMap.

D.

Store the database password in a file inside a Kubernetes persistent volume, and use a persistent volume claim to mount the volume to the container.
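
The original manifest is only shown in the image above, so the following is a generic sketch of the Secret-based approach referenced in option B rather than the exam's exact YAML; the Secret name, key, and image below are hypothetical, while myapp1 and DB_PASSWORD come from the question:

# Create the Secret holding the database password
kubectl create secret generic myapp1-secrets --from-literal=db-password='REPLACE_ME'

# Populate DB_PASSWORD from the Secret instead of a plain-text value
kubectl apply -f - <<EOF
apiVersion: apps/v1
kind: Deployment
metadata:
  name: myapp1
spec:
  replicas: 1
  selector:
    matchLabels:
      app: myapp1
  template:
    metadata:
      labels:
        app: myapp1
    spec:
      containers:
      - name: myapp1
        image: example.com/myapp1:latest   # placeholder image
        env:
        - name: DB_PASSWORD
          valueFrom:
            secretKeyRef:
              name: myapp1-secrets
              key: db-password
EOF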

Buy Now
Questions 101

Your company implemented BigQuery as an enterprise data warehouse. Users from multiple business units run queries on this data warehouse. However, you notice that query costs for BigQuery are very high, and you need to control costs. Which two methods should you use? (Choose two.)

Options:

A.

Split the users from business units to multiple projects.

B.

Apply a user- or project-level custom query quota for BigQuery data warehouse.

C.

Create separate copies of your BigQuery data warehouse for each business unit.

D.

Split your BigQuery data warehouse into multiple data warehouses for each business unit.

E.

Change your BigQuery query model from on-demand to flat rate. Apply the appropriate number of slots to each Project.

Buy Now
Questions 102

An employee was terminated, but their access to Google Cloud Platform (GCP) was not removed until 2 weeks later. You need to find out whether this employee accessed any sensitive customer information after their termination. What should you do?

Options:

A.

View System Event Logs in Stackdriver. Search for the user’s email as the principal.

B.

View System Event Logs in Stackdriver. Search for the service account associated with the user.

C.

View Data Access audit logs in Stackdriver. Search for the user’s email as the principal.

D.

View the Admin Activity log in Stackdriver. Search for the service account associated with the user.
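
For reference, a minimal sketch of querying Data Access audit logs for a specific principal, as referenced in option C; the project ID and email address below are hypothetical:

# Read Data Access audit log entries recorded for the former employee
gcloud logging read \
  'logName="projects/my-project/logs/cloudaudit.googleapis.com%2Fdata_access" AND protoPayload.authenticationInfo.principalEmail="former.employee@example.com"' \
  --project=my-project --limit=100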

Buy Now
Exam Name: Google Cloud Certified - Associate Cloud Engineer
Last Update: Nov 19, 2025
Questions: 343
