Question 211
You need to configure a Deployment on Google Kubernetes Engine (GKE). You want to include a check that verifies that the containers can connect to the database. If the Pod fails to connect, you want a script in the container to run to perform a graceful shutdown. How should you configure the Deployment?
A. Create two jobs: one that checks whether the container can connect to the database, and another that runs the shutdown script if the Pod is failing.
B. Create the Deployment with a livenessProbe for the container that will fail if the container can't connect to the database. Configure a PreStop lifecycle handler that runs the shutdown script if the container is failing.
C. Create the Deployment with a PostStart lifecycle handler that checks the service availability. Configure a PreStop lifecycle handler that runs the shutdown script if the container is failing.
D. Create the Deployment with an initContainer that checks the service availability. Configure a PreStop lifecycle handler that runs the shutdown script if the Pod is failing.
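For reference, a livenessProbe combined with a PreStop lifecycle handler (the mechanism described in option B) could be sketched as follows. The image name, script paths, and probe timings are all placeholders:

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: api
spec:
  replicas: 2
  selector:
    matchLabels:
      app: api
  template:
    metadata:
      labels:
        app: api
    spec:
      containers:
        - name: api
          image: us-docker.pkg.dev/my-project/my-repo/api:latest  # placeholder
          livenessProbe:          # kubelet restarts the container if this fails
            exec:
              command: ["/bin/sh", "-c", "/opt/check-db-connection.sh"]
            initialDelaySeconds: 10
            periodSeconds: 15
          lifecycle:
            preStop:              # runs before the container is terminated
              exec:
                command: ["/bin/sh", "-c", "/opt/graceful-shutdown.sh"]
```

When the liveness probe fails repeatedly, the kubelet kills the container; the preStop handler gives the shutdown script a chance to run before termination.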
Question 212
You are responsible for deploying a new API. That API will have three different URL paths:
• https://yourcompany.com/students
• https://yourcompany.com/teachers
• https://yourcompany.com/classes
You need to configure each API URL path to invoke a different function in your code. What should you do?
A. Create one Cloud Function as a backend service exposed using an HTTPS load balancer.
B. Create three Cloud Functions exposed directly.
C. Create one Cloud Function exposed directly.
D. Create three Cloud Functions as three backend services exposed using an HTTPS load balancer.
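The URL-map mechanism behind option D can be sketched with the gcloud CLI. This assumes three already-deployed functions (`students-fn`, `teachers-fn`, `classes-fn` are hypothetical names); one serverless NEG and backend service is shown, with the other two following the same pattern:

```shell
# Wrap one function in a serverless network endpoint group.
gcloud compute network-endpoint-groups create students-neg \
    --region=us-central1 \
    --network-endpoint-type=serverless \
    --cloud-function-name=students-fn

gcloud compute backend-services create students-bs --global
gcloud compute backend-services add-backend students-bs --global \
    --network-endpoint-group=students-neg \
    --network-endpoint-group-region=us-central1

# Route each URL path to its backend service (repeat the NEG/backend
# steps above for teachers-bs and classes-bs first).
gcloud compute url-maps create api-map --default-service=students-bs
gcloud compute url-maps add-path-matcher api-map \
    --path-matcher-name=api-paths \
    --default-service=students-bs \
    --path-rules="/students=students-bs,/teachers=teachers-bs,/classes=classes-bs"
```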
Question 213
You are deploying a microservices application to Google Kubernetes Engine (GKE). The application will receive daily updates. You expect to deploy a large number of distinct containers that will run on the Linux operating system (OS). You want to be alerted to any known OS vulnerabilities in the new containers. You want to follow Google-recommended best practices. What should you do?
A. Use the gcloud CLI to call Container Analysis to scan new container images. Review the vulnerability results before each deployment.
B. Enable Container Analysis, and upload new container images to Artifact Registry. Review the vulnerability results before each deployment.
C. Enable Container Analysis, and upload new container images to Artifact Registry. Review the critical vulnerability results before each deployment.
D. Use the Container Analysis REST API to call Container Analysis to scan new container images. Review the vulnerability results before each deployment.
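The Container Analysis workflow referenced in options B and C could look like the following sketch; the project, repository, and image names are placeholders:

```shell
# Enable automatic on-push vulnerability scanning.
gcloud services enable containerscanning.googleapis.com

# Push the image to Artifact Registry; scanning runs automatically.
docker push us-central1-docker.pkg.dev/my-project/my-repo/app:v2

# Review the vulnerability findings before deploying.
gcloud artifacts docker images describe \
    us-central1-docker.pkg.dev/my-project/my-repo/app:v2 \
    --show-package-vulnerability
```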
Question 214
You are a developer at a large organization. You have an application written in Go running in a production Google Kubernetes Engine (GKE) cluster. You need to add a new feature that requires access to BigQuery. You want to grant BigQuery access to your GKE cluster following Google-recommended best practices. What should you do?
A. Create a Google service account with BigQuery access. Add the JSON key to Secret Manager, and use the Go client library to access the JSON key.
B. Create a Google service account with BigQuery access. Add the Google service account JSON key as a Kubernetes secret, and configure the application to use this secret.
C. Create a Google service account with BigQuery access. Add the Google service account JSON key to Secret Manager, and use an init container to access the secret for the application to use.
D. Create a Google service account and a Kubernetes service account. Configure Workload Identity on the GKE cluster, and reference the Kubernetes service account on the application Deployment.
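The Workload Identity wiring described in option D can be sketched as below; all cluster, project, namespace, and account names are placeholders:

```shell
# 1. Enable Workload Identity on the cluster.
gcloud container clusters update my-cluster --region=us-central1 \
    --workload-pool=my-project.svc.id.goog

# 2. Create the Google service account and grant it BigQuery access.
gcloud iam service-accounts create bq-reader
gcloud projects add-iam-policy-binding my-project \
    --member="serviceAccount:bq-reader@my-project.iam.gserviceaccount.com" \
    --role="roles/bigquery.dataViewer"

# 3. Allow the Kubernetes service account to impersonate it.
gcloud iam service-accounts add-iam-policy-binding \
    bq-reader@my-project.iam.gserviceaccount.com \
    --role="roles/iam.workloadIdentityUser" \
    --member="serviceAccount:my-project.svc.id.goog[default/app-ksa]"

# 4. Annotate the Kubernetes service account referenced by the Deployment.
kubectl annotate serviceaccount app-ksa --namespace default \
    iam.gke.io/gcp-service-account=bq-reader@my-project.iam.gserviceaccount.com
```

With this in place, the Go client libraries pick up the Google service account's credentials automatically, with no JSON key to store or rotate.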
Question 215
You have an application written in Python running in production on Cloud Run. Your application needs to read/write data stored in a Cloud Storage bucket in the same project. You want to grant access to your application following the principle of least privilege. What should you do?
A. Create a user-managed service account with a custom Identity and Access Management (IAM) role.
B. Create a user-managed service account with the Storage Admin Identity and Access Management (IAM) role.
C. Create a user-managed service account with the Project Editor Identity and Access Management (IAM) role.
D. Use the default service account linked to the Cloud Run revision in production.
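A least-privilege setup along the lines of option A might be sketched as follows (all names are placeholders, and the permission list should be trimmed to what the application actually calls):

```shell
# A custom role holding only the object permissions the app needs.
gcloud iam roles create bucketReadWrite --project=my-project \
    --title="Bucket read/write" \
    --permissions=storage.objects.get,storage.objects.list,storage.objects.create

gcloud iam service-accounts create run-app-sa

# Bind the custom role at the bucket level, not the project level.
gcloud storage buckets add-iam-policy-binding gs://my-bucket \
    --member="serviceAccount:run-app-sa@my-project.iam.gserviceaccount.com" \
    --role="projects/my-project/roles/bucketReadWrite"

# Attach the service account to the Cloud Run service.
gcloud run services update my-service --region=us-central1 \
    --service-account=run-app-sa@my-project.iam.gserviceaccount.com
```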
Question 216
Your team is developing unit tests for Cloud Function code. The code is stored in a Cloud Source Repositories repository. You are responsible for implementing the tests. Only a specific service account has the necessary permissions to deploy the code to Cloud Functions. You want to ensure that the code cannot be deployed without first passing the tests. How should you configure the unit testing process?
A. Configure Cloud Build to deploy the Cloud Function. If the code passes the tests, a deployment approval is sent to you.
B. Configure Cloud Build to deploy the Cloud Function, using the specific service account as the build agent. Run the unit tests after successful deployment.
C. Configure Cloud Build to run the unit tests. If the code passes the tests, the developer deploys the Cloud Function.
D. Configure Cloud Build to run the unit tests, using the specific service account as the build agent. If the code passes the tests, Cloud Build deploys the Cloud Function.
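A test-gated pipeline of the kind option D describes could be sketched in a cloudbuild.yaml like this. The builder images, function name, and service account are placeholders, and a custom build service account requires an explicit logging option:

```yaml
steps:
  # Run the unit tests first; the build stops here if they fail.
  - name: 'python:3.11'
    entrypoint: 'bash'
    args: ['-c', 'pip install -r requirements.txt && python -m pytest tests/']
  # This step only runs if the previous step succeeded.
  - name: 'gcr.io/google.com/cloudsdktool/cloud-sdk'
    entrypoint: 'gcloud'
    args: ['functions', 'deploy', 'my-function',
           '--runtime=python311', '--trigger-http', '--region=us-central1']
serviceAccount: 'projects/my-project/serviceAccounts/deployer@my-project.iam.gserviceaccount.com'
options:
  logging: CLOUD_LOGGING_ONLY   # required when a custom service account is set
```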
Question 217
Your team detected a spike of errors in an application running on Cloud Run in your production project. The application is configured to read messages from Pub/Sub topic A, process the messages, and write the messages to topic B. You want to conduct tests to identify the cause of the errors. You can use a set of mock messages for testing. What should you do?
A. Deploy the Pub/Sub and Cloud Run emulators on your local machine. Deploy the application locally, and change the logging level in the application to DEBUG or INFO. Write mock messages to topic A, and then analyze the logs.
B. Use the gcloud CLI to write mock messages to topic A. Change the logging level in the application to DEBUG or INFO, and then analyze the logs.
C. Deploy the Pub/Sub emulator on your local machine. Point the production application to your local Pub/Sub topics. Write mock messages to topic A, and then analyze the logs.
D. Use the Google Cloud console to write mock messages to topic A. Change the logging level in the application to DEBUG or INFO, and then analyze the logs.
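Publishing mock messages and raising the log level, as options B and D describe, might look like this. The topic, service, and `LOG_LEVEL` variable are placeholders; the application is assumed to read its log level from the environment:

```shell
# Publish a mock message to topic A.
gcloud pubsub topics publish topic-a \
    --message='{"orderId": "test-123", "status": "mock"}'

# Raise the service's log level without redeploying code.
gcloud run services update my-service --region=us-central1 \
    --update-env-vars=LOG_LEVEL=DEBUG

# Inspect the resulting logs.
gcloud logging read 'resource.type="cloud_run_revision"' --limit=50
```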
Question 218
You are developing a Java Web Server that needs to interact with Google Cloud services via the Google Cloud API on the user's behalf. Users should be able to authenticate to the Google Cloud API using their Google Cloud identities. Which workflow should you implement in your web application?
A. 1. When a user arrives at your application, prompt them for their Google username and password.
2. Store an SHA password hash in your application's database along with the user's username.
3. The application authenticates to the Google Cloud API using HTTPS requests with the user's username and password hash in the Authorization request header.
B. 1. When a user arrives at your application, prompt them for their Google username and password.
2. Forward the user's username and password in an HTTPS request to the Google Cloud authorization server, and request an access token.
3. The Google server validates the user's credentials and returns an access token to the application.
4. The application uses the access token to call the Google Cloud API.
C. 1. When a user arrives at your application, route them to a Google Cloud consent screen with a list of requested permissions that prompts the user to sign in with SSO to their Google Account.
2. After the user signs in and provides consent, your application receives an authorization code from a Google server.
3. The Google server returns the authorization code to the user, which is stored in the browser's cookies.
4. The user authenticates to the Google Cloud API using the authorization code in the cookie.
D. 1. When a user arrives at your application, route them to a Google Cloud consent screen with a list of requested permissions that prompts the user to sign in with SSO to their Google Account.
2. After the user signs in and provides consent, your application receives an authorization code from a Google server.
3. The application requests a Google Server to exchange the authorization code with an access token.
4. The Google server responds with the access token that is used by the application to call the Google Cloud API.
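The first step of the authorization-code flow described in option D is redirecting the user to Google's consent screen. A minimal sketch of building that URL is below; the client ID and redirect URI are hypothetical, and a real application would also send a `state` parameter to prevent CSRF:

```python
from urllib.parse import urlencode

AUTH_ENDPOINT = "https://accounts.google.com/o/oauth2/v2/auth"

def build_consent_url(client_id: str, redirect_uri: str, scopes: list) -> str:
    """Build the Google OAuth 2.0 consent-screen URL (authorization code flow)."""
    params = {
        "client_id": client_id,
        "redirect_uri": redirect_uri,
        "response_type": "code",       # ask for an authorization code
        "scope": " ".join(scopes),
        "access_type": "offline",      # also request a refresh token
    }
    return f"{AUTH_ENDPOINT}?{urlencode(params)}"

url = build_consent_url(
    "my-client-id.apps.googleusercontent.com",       # hypothetical client ID
    "https://yourcompany.com/oauth2/callback",       # hypothetical redirect URI
    ["https://www.googleapis.com/auth/cloud-platform"],
)
```

After the user consents, Google redirects back to the `redirect_uri` with a `code` query parameter, which the server exchanges for an access token over a back-channel request.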
Question 219
You recently developed a new application. You want to deploy the application on Cloud Run without a Dockerfile. Your organization requires that all container images are pushed to a centrally managed container repository. How should you build your container using Google Cloud services? (Choose two.)
A. Push your source code to Artifact Registry.
B. Submit a Cloud Build job to push the image.
C. Use the pack build command with pack CLI.
D. Include the --source flag with the gcloud run deploy CLI command.
E. Include the --platform=kubernetes flag with the gcloud run deploy CLI command.
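The two Dockerfile-less build paths the options refer to could be sketched as follows; the service, project, and repository names are placeholders:

```shell
# Cloud Build containerizes the source with Google Cloud's buildpacks,
# pushes the image, and deploys it in one command.
gcloud run deploy my-service --source . --region=us-central1

# Alternatively, build locally with the pack CLI and the Google builder,
# then push to the central Artifact Registry repository.
pack build us-central1-docker.pkg.dev/my-project/my-repo/my-service \
    --builder gcr.io/buildpacks/builder
docker push us-central1-docker.pkg.dev/my-project/my-repo/my-service
```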
Question 220
You work for an organization that manages an online ecommerce website. Your company plans to expand worldwide; however, the store currently serves only one region. You need to select a SQL database and configure a schema that will scale as your organization grows. You want to create a table that stores all customer transactions and ensure that the customer (CustomerId) and the transaction (TransactionId) are unique. What should you do?
A. Create a Cloud SQL table that has TransactionId and CustomerId configured as primary keys. Use an incremental number for the TransactionId.
B. Create a Cloud SQL table that has TransactionId and CustomerId configured as primary keys. Use a random string (UUID) for the TransactionId.
C. Create a Cloud Spanner table that has TransactionId and CustomerId configured as primary keys. Use a random string (UUID) for the TransactionId.
D. Create a Cloud Spanner table that has TransactionId and CustomerId configured as primary keys. Use an incremental number for the TransactionId.
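The key-design trade-off behind the UUID options can be seen directly: random UUIDv4 keys spread consecutive inserts across the primary-key range, whereas monotonically increasing IDs pile writes onto one end of a range-sharded key space such as Cloud Spanner's. A minimal illustration:

```python
import uuid

def new_transaction_id() -> str:
    """Generate a random (UUIDv4) primary key.

    Random keys spread consecutive inserts across the key space, avoiding
    the write hotspot that a monotonically increasing TransactionId would
    create on a range-sharded database.
    """
    return str(uuid.uuid4())

ids = [new_transaction_id() for _ in range(1000)]
# The leading characters are spread over the hex alphabet, so successive
# inserts do not cluster at one end of the key range.
leading = {i[0] for i in ids}
```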