Win IT Exam with Last Dumps 2025


Google Professional-Cloud-Developer Exam

Page 36/38
Viewing Questions 351-360 of 375 Questions

Question 351
You have a Cloud Run service that needs to connect to a Cloud SQL instance in a different project. You provisioned the Cloud Run service account with the Cloud SQL Client IAM role on the project that is hosting Cloud SQL. However, when you test the connection, the connection fails. You want to fix the connection failure while following Google-recommended practices. What should you do?
A. Add the cloudsql.instances.connect IAM permission to the Cloud Run service account.
B. Request additional API quota for Cloud SQL Auth Proxy.
C. Enable the Cloud SQL Admin API in both projects.
D. Migrate the Cloud SQL instance into the same project as the Cloud Run service.
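If option C were chosen, enabling the API is a one-line command per project. A minimal sketch, assuming placeholder project IDs (the question does not name the projects):

```shell
# Hypothetical project IDs. The Cloud SQL Admin API (sqladmin.googleapis.com)
# is required in each project involved in the cross-project connection.
gcloud services enable sqladmin.googleapis.com --project=run-project-id
gcloud services enable sqladmin.googleapis.com --project=sql-project-id
```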

Question 352
You developed a Python script that retrieves information from files that are uploaded to Cloud Storage and writes the information to Bigtable. You have completed testing on your local environment and created the python-script service account with the Bigtable User IAM role. You want to deploy the code with the appropriate authentication while following Google-recommended practices. What should you do?
A. 1. Deploy your code to Cloud Functions. Create a Cloud Storage trigger.
2. Configure IAM binding for authentication.
B. 1. Deploy your code to Cloud Functions. Create a Cloud Storage trigger.
2. Create a service account key for authentication.
C. 1. Deploy your image to Cloud Run. Create a trigger in Cloud Scheduler that triggers the service every minute.
2. Configure IAM binding for authentication.
D. 1. Deploy your image to Cloud Run. Create a trigger in Cloud Scheduler that triggers the service every minute.
2. Create a service account key for authentication.

Question 353
You are a developer at a large organization. Your team uses Git for source code management (SCM). You want to ensure that your team follows Google-recommended best practices to manage code to drive higher rates of software delivery. Which SCM process should your team use?
A. Each developer commits their code to the main branch before each product release, conducts testing, and rolls back if integration issues are detected.
B. Each group of developers copies the repository, commits their changes to their repository, and merges their code into the main repository before each product release.
C. Each developer creates a branch for their own work, commits their changes to their branch, and merges their code into the main branch daily.
D. Each group of developers creates a feature branch from the main branch for their work, commits their changes to their branch, and merges their code into the main branch before each major release.
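The short-lived-branch workflow described in option C can be sketched with plain Git commands; branch and commit names here are illustrative only:

```shell
# Create a short-lived branch for a small unit of work:
git checkout -b feature/add-checkout-button
git add . && git commit -m "Add checkout button"
git push origin feature/add-checkout-button

# Merge into main at least daily (typically via a reviewed pull request),
# then delete the branch:
git checkout main && git pull
git merge --no-ff feature/add-checkout-button
git push origin main
git branch -d feature/add-checkout-button
```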

Question 354
You work for an ecommerce company. You are developing a new application with the following requirements:
• The application must have access to the most up-to-date data at all times.
• Due to company policy, data older than 30 days must be automatically deleted.
You need to determine which service should host the database, and how to configure the data deletion. You want to use the most efficient solution. What should you do?
A. Configure Spanner to host the database. Use Data Catalog to delete data older than 30 days.
B. Configure Spanner to host the database. Create a time-to-live policy that deletes data older than 30 days.
C. Configure Bigtable to host the database. Create a time-to-live policy that deletes data older than 30 days.
D. Configure Bigtable to host the database. Create a garbage collection policy in Bigtable that deletes data older than 30 days.
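Option D's garbage collection policy can be set with the `cbt` CLI. A sketch, assuming placeholder project, instance, table, and column family names:

```shell
# Create a column family, then attach an age-based garbage collection
# policy so cells older than 30 days are deleted automatically.
cbt -project=PROJECT_ID -instance=INSTANCE_ID \
  createfamily readings temperature
cbt -project=PROJECT_ID -instance=INSTANCE_ID \
  setgcpolicy readings temperature maxage=30d
```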

Question 355
Your team is responsible for developing multiple microservices. These microservices are deployed in Cloud Run and connected to a Cloud SQL instance. You typically conduct tests in a local environment prior to deploying new features. However, the external IP was recently removed from your Cloud SQL instance, and you are unable to perform the tests. You need to connect to the database to conduct tests with the most updated data. You want to follow Google-recommended practices. What should you do?
A. Export the data from the database to a Cloud Storage bucket. Create a database on your computer and import the data.
B. Create a Cloud VPN tunnel from your computer to your Google Cloud project, and connect to the Cloud SQL instance.
C. Add your IP as an authorized network on the Cloud SQL instance.
D. Create a VM in the same VPC as the Cloud SQL instance. Connect to the VM by using Identity-Aware Proxy for TCP forwarding. Install and configure the Cloud SQL Auth Proxy.
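Option D combines IAP TCP forwarding with the Cloud SQL Auth Proxy. A sketch, assuming a placeholder bastion VM, zone, and instance connection name:

```shell
# SSH to a VM in the Cloud SQL instance's VPC through IAP TCP forwarding
# (no external IP on the VM is required):
gcloud compute ssh bastion-vm --zone=us-central1-a --tunnel-through-iap

# On the VM, run the Cloud SQL Auth Proxy against the private IP:
./cloud-sql-proxy --private-ip PROJECT_ID:us-central1:sql-instance
```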


Question 356
You have an application running on Cloud Run that receives a large volume of traffic. You need to deploy a new version of the application. You want your deployment process to minimize the risk of downtime while following Google-recommended practices. What should you do?
A. Use Cloud Run emulator to test changes locally before deploying the new version of the application to the production Cloud Run service.
B. Use Cloud Build to create a pipeline, and configure a test stage before the deployment stage. When all tests pass, deploy the application to Cloud Run, and direct 100% of users to this new version of the application. Roll back if any issues are detected.
C. Use Cloud Load Balancing to route a percentage of production traffic to a separate Cloud Run service running the new version of the application. If performance meets expectations, gradually increase the percentage of users until the new Cloud Run service reaches 100%.
D. Use traffic splitting to have a small percentage of users test out new features on the new revision of the application on the production Cloud Run service. If performance meets expectations, gradually increase the percentage of users until it reaches 100%.
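The built-in traffic splitting described in option D can be driven entirely from `gcloud`; service, tag, and image names below are placeholders:

```shell
# Deploy the new revision without routing any traffic to it:
gcloud run deploy my-service --image=IMAGE_URL --no-traffic --tag=canary

# Send 5% of traffic to the tagged revision:
gcloud run services update-traffic my-service --to-tags=canary=5

# If it performs well, route all traffic to the latest revision:
gcloud run services update-traffic my-service --to-latest
```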

Question 357
You are deploying a containerized application to GKE. You have set up a build pipeline by using Cloud Build that builds a Java application and pushes the application container image to Artifact Registry. Your build pipeline executes multiple sequential steps that reference Docker container images with the same layers.
You notice that the Cloud Build pipeline runs are taking longer than expected to complete. How should you optimize the Docker image build process?
A. Add the --squash parameter to the Docker build steps to combine newly built layers into a single layer.
B. Configure Cloud Build to use a private pool in your VPC for pipeline executions.
C. Specify the cached image by adding the --cache-from argument in your build config file with the image as a cache source.
D. Store container artifacts on Cloud Storage. Configure Cloud CDN on the Cloud Storage bucket to enable caching on edge locations.
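Layer caching as in option C can be sketched as a plain Docker invocation (in Cloud Build this would go in the build config's `args`); image paths are placeholders:

```shell
# Pull the previously built image so its layers are available locally;
# ignore the error on the very first build when no image exists yet:
docker pull us-docker.pkg.dev/PROJECT_ID/repo/app:latest || true

# Reuse those layers as a cache source for the new build:
docker build \
  --cache-from us-docker.pkg.dev/PROJECT_ID/repo/app:latest \
  -t us-docker.pkg.dev/PROJECT_ID/repo/app:latest .
```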

Question 358
You are developing a dashboard that aggregates temperature readings from thousands of IoT devices monitoring a city's ambient temperature. You expect a large amount of viewing traffic resulting in a large amount of data egress once the dashboard is live. The dashboard temperature display data doesn't need to be real-time and can tolerate a few seconds of lag. You decide to deploy Memorystore for Redis as the storage backend. You want to ensure that the dashboard will be highly available. How should you configure the service in Memorystore for Redis?
A. Update Memorystore for Redis to the latest version.
B. Configure Memorystore to use read replicas.
C. Use Private Service Access to enable low-latency network throughput.
D. Set up Serverless VPC Access to avoid receiving traffic over the internet.
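Read replicas as in option B are enabled at instance creation on the Standard tier. A sketch with placeholder instance name, size, and region:

```shell
# Standard-tier Memorystore for Redis instance with two read replicas,
# spreading read traffic from the dashboard across replicas.
gcloud redis instances create dashboard-cache \
  --size=5 \
  --region=us-central1 \
  --tier=standard \
  --read-replicas-mode=READ_REPLICAS_ENABLED \
  --replica-count=2
```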

Question 359
You are responsible for developing a new ecommerce application that is running on Cloud Run. You need to connect your application to a Cloud SQL database that is in a separate project. This project is on an isolated network dedicated to multiple databases without a public IP. You need to connect your application to this database. What should you do?
A. Create a Private Service Connect endpoint on your network. Create a Serverless VPC Access connector on your project. Use Cloud SQL Language Connectors to create an internal connection.
B. Configure VPC Network Peering between both networks. In Cloud Run, create a Cloud SQL connection that uses the internal IP. Use Cloud SQL Language Connectors to interact with the database.
C. Configure private services access on your project. In Cloud Run, create a Cloud SQL connection. Use Cloud SQL Language Connectors to interact with the database.
D. Create a subnet on your VPC. Create a Serverless VPC Access connector on your project using the new subnet. In Cloud Run, create a Cloud SQL connection. Use Cloud SQL Language Connectors to interact with the database.
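The connector-based approach in option D could be sketched as follows; the connector, subnet, service, and region names are placeholders:

```shell
# Create a Serverless VPC Access connector backed by the new subnet:
gcloud compute networks vpc-access connectors create sql-connector \
  --region=us-central1 \
  --subnet=connector-subnet

# Attach the connector to the Cloud Run service so it can reach the
# database's private IP:
gcloud run services update my-service \
  --vpc-connector=sql-connector --region=us-central1
```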

Question 360
You are deploying new workloads on a GKE Autopilot mode cluster. You need to ensure that the Pods are scheduled on nodes that use Arm architecture CPUs. The cluster currently has no Arm-based CPUs. You want to minimize cluster operations. How should you ensure that the workloads on the GKE cluster will use Arm-based CPU nodes?
A. Apply Pod tolerations to request GKE to avoid scheduling Pods on nodes that do not have Arm-based CPUs.
B. Apply node taints on node pools to tell GKE to only schedule your workloads on Arm-based CPU nodes.
C. Deploy a new cluster in GKE Standard mode, and set up a node pool with Arm-based CPU nodes. Use nodeSelector in your manifest to ensure that Pods are scheduled in this node pool.
D. Request the Scale-out compute class and the arm64 architecture in your manifest.
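Requesting a compute class and architecture in an Autopilot manifest, as option D describes, is done with `nodeSelector` labels. A sketch with a placeholder Deployment and image:

```shell
# Apply a Deployment that asks GKE Autopilot for Scale-Out, arm64 nodes;
# Autopilot provisions matching nodes automatically.
kubectl apply -f - <<'EOF'
apiVersion: apps/v1
kind: Deployment
metadata:
  name: arm-workload
spec:
  replicas: 1
  selector:
    matchLabels:
      app: arm-workload
  template:
    metadata:
      labels:
        app: arm-workload
    spec:
      nodeSelector:
        cloud.google.com/compute-class: Scale-Out
        kubernetes.io/arch: arm64
      containers:
      - name: app
        image: IMAGE_URL
EOF
```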


