
Good news: the Professional-Cloud-Architect (Google Certified Professional - Cloud Architect, GCP) question pool is now stable and up to date.

Professional-Cloud-Architect Practice Exam Questions and Answers

Google Certified Professional - Cloud Architect (GCP)

Last Update 3 days ago
Total Questions : 275

The Google Certified Professional - Cloud Architect (GCP) question pool is now stable; the latest exam questions were added 3 days ago. Incorporating Professional-Cloud-Architect practice exam questions into your study plan is more than just a preparation strategy.

Professional-Cloud-Architect exam questions often include scenarios and problem-solving exercises that mirror real-world challenges. Working through Professional-Cloud-Architect dumps allows you to practice pacing yourself, ensuring that you can complete the full Google Certified Professional - Cloud Architect (GCP) practice test within the allotted time.

Professional-Cloud-Architect PDF (Printable): $50 (list price $124.99)

Professional-Cloud-Architect Testing Engine: $58 (list price $144.99)

Professional-Cloud-Architect PDF + Testing Engine: $72.80 (list price $181.99)
Question # 1

For this question, refer to the EHR Healthcare case study. In the past, configuration errors put public IP addresses on backend servers that should not have been accessible from the Internet. You need to ensure that no one can put external IP addresses on backend Compute Engine instances and that external IP addresses can only be configured on frontend Compute Engine instances. What should you do?

Options:

A.  

Create an Organizational Policy with a constraint to allow external IP addresses only on the frontend Compute Engine instances.

B.  

Revoke the compute.networkAdmin role from all users in the project with front end instances.

C.  

Create an Identity and Access Management (IAM) policy that maps the IT staff to the compute.networkAdmin role for the organization.

D.  

Create a custom Identity and Access Management (IAM) role named GCE_FRONTEND with the compute.addresses.create permission.

Discussion 0
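The Organizational Policy approach in option A can be sketched with the `constraints/compute.vmExternalIpAccess` list constraint, which restricts external IPs to an allowlist of instances. This is a hedged illustration only; the project, zone, and instance names below are placeholders, not part of the case study.

```shell
# Sketch: allow external IP addresses only on designated frontend instances.
# Project ("ehr-prod"), zones, and instance names are hypothetical.
cat > allow-frontend-external-ip.yaml <<'EOF'
constraint: constraints/compute.vmExternalIpAccess
listPolicy:
  allowedValues:
    - projects/ehr-prod/zones/us-central1-a/instances/frontend-1
    - projects/ehr-prod/zones/us-central1-b/instances/frontend-2
EOF

# Apply the policy at the project level (it can also be set at the
# folder or organization level).
gcloud resource-manager org-policies set-policy \
  allow-frontend-external-ip.yaml --project=ehr-prod
```

Backend instances are simply omitted from `allowedValues`, so no user can assign them an external IP regardless of their IAM roles.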
Question # 2

For this question, refer to the EHR Healthcare case study. EHR has a single Dedicated Interconnect connection between their primary data center and Google's network. This connection satisfies EHR's network and security policies:

• On-premises servers without public IP addresses need to connect to cloud resources without public IP addresses

• Traffic flows from production network management servers to Compute Engine virtual machines should never traverse the public internet

You need to upgrade the EHR connection to comply with their requirements. The new connection design must support business-critical needs and meet the same network and security policy requirements. What should you do?

Options:

A.  

Add a new Dedicated Interconnect connection

B.  

Upgrade the bandwidth on the Dedicated Interconnect connection to 100 G

C.  

Add three new Cloud VPN connections

D.  

Add a new Carrier Peering connection

Discussion 0
Question # 3

For this question, refer to the EHR Healthcare case study. You need to define the technical architecture for hybrid connectivity between EHR's on-premises systems and Google Cloud. You want to follow Google's recommended practices for production-level applications. Considering the EHR Healthcare business and technical requirements, what should you do?

Options:

A.  

Configure two Partner Interconnect connections in one metro (City), and make sure the Interconnect connections are placed in different metro zones.

B.  

Configure two VPN connections from on-premises to Google Cloud, and make sure the VPN devices on-premises are in separate racks.

C.  

Configure Direct Peering between EHR Healthcare and Google Cloud, and make sure you are peering at least two Google locations.

D.  

Configure two Dedicated Interconnect connections in one metro (City) and two connections in another metro, and make sure the Interconnect connections are placed in different metro zones.

Discussion 0
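The topology in option D (two Dedicated Interconnect connections in each of two metros, in different edge availability domains) can be sketched with `gcloud`. This is a hedged illustration under assumed names; the location identifiers and customer name below are placeholders.

```shell
# Sketch: four Dedicated Interconnect connections, two per metro, each in a
# different edge availability domain. Location names are hypothetical examples.
for loc in iad-zone1-1 iad-zone2-1 ord-zone1-1 ord-zone2-1; do
  gcloud compute interconnects create "ehr-ic-${loc}" \
    --customer-name="EHR Healthcare" \
    --interconnect-type=DEDICATED \
    --link-type=LINK_TYPE_ETHERNET_10G_LR \
    --requested-link-count=1 \
    --location="${loc}"
done
```

Spreading connections across two metros and two availability domains per metro is what qualifies the design for Google's 99.99% availability recommendation for production workloads.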
Question # 4

For this question, refer to the Dress4Win case study. You are responsible for the security of data stored in

Cloud Storage for your company, Dress4Win. You have already created a set of Google Groups and assigned the appropriate users to those groups. You should use Google best practices and implement the simplest design to meet the requirements.

Considering Dress4Win’s business and technical requirements, what should you do?

Options:

A.  

Assign custom IAM roles to the Google Groups you created in order to enforce security requirements.

Encrypt data with a customer-supplied encryption key when storing files in Cloud Storage.

B.  

Assign custom IAM roles to the Google Groups you created in order to enforce security requirements.

Enable default storage encryption before storing files in Cloud Storage.

C.  

Assign predefined IAM roles to the Google Groups you created in order to enforce security requirements.

Utilize Google’s default encryption at rest when storing files in Cloud Storage.

D.  

Assign predefined IAM roles to the Google Groups you created in order to enforce security requirements. Ensure that the default Cloud KMS key is set before storing files in Cloud Storage.

Discussion 0
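The approach in option C (predefined roles plus Google's default encryption at rest) can be sketched with `gsutil iam ch`; default encryption requires no configuration at all. The group and bucket names below are hypothetical placeholders.

```shell
# Sketch: bind predefined Cloud Storage roles to existing Google Groups.
# Group addresses and the bucket name are placeholders.
gsutil iam ch group:storage-admins@dress4win.com:roles/storage.admin \
  gs://dress4win-assets
gsutil iam ch group:storage-viewers@dress4win.com:roles/storage.objectViewer \
  gs://dress4win-assets
# No encryption step is needed: Cloud Storage encrypts all objects at rest
# by default with Google-managed keys.
```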
Question # 5

For this question, refer to the Dress4Win case study. Dress4Win is expected to grow to 10 times its size in 1 year with a corresponding growth in data and traffic that mirrors the existing patterns of usage. The CIO has set the target of migrating production infrastructure to the cloud within the next 6 months. How will you configure the solution to scale for this growth without making major application changes and still maximize the ROI?

Options:

A.  

Migrate the web application layer to App Engine, and MySQL to Cloud Datastore, and NAS to Cloud Storage. Deploy RabbitMQ, and deploy Hadoop servers using Deployment Manager.

B.  

Migrate RabbitMQ to Cloud Pub/Sub, Hadoop to BigQuery, and NAS to Compute Engine with Persistent Disk storage. Deploy Tomcat, and deploy Nginx using Deployment Manager.

C.  

Implement managed instance groups for Tomcat and Nginx. Migrate MySQL to Cloud SQL, RabbitMQ to Cloud Pub/Sub, Hadoop to Cloud Dataproc, and NAS to Compute Engine with Persistent Disk storage.

D.  

Implement managed instance groups for the Tomcat and Nginx. Migrate MySQL to Cloud SQL, RabbitMQ to Cloud Pub/Sub, Hadoop to Cloud Dataproc, and NAS to Cloud Storage.

Discussion 0
Question # 6

For this question, refer to the Dress4Win case study. To be legally compliant during an audit, Dress4Win must be able to give insights in all administrative actions that modify the configuration or metadata of resources on Google Cloud.

What should you do?

Options:

A.  

Use Stackdriver Trace to create a trace list analysis.

B.  

Use Stackdriver Monitoring to create a dashboard on the project’s activity.

C.  

Enable Cloud Identity-Aware Proxy in all projects, and add the group of Administrators as a member.

D.  

Use the Activity page in the GCP Console and Stackdriver Logging to provide the required insight.

Discussion 0
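The Admin Activity audit log behind option D records every administrative action that modifies resource configuration or metadata, and it is enabled by default. A hedged sketch of querying it from the command line, with a placeholder project ID:

```shell
# Sketch: read recent Admin Activity audit log entries for a project.
# The project ID "dress4win-prod" is a placeholder.
gcloud logging read \
  'logName:"cloudaudit.googleapis.com%2Factivity"' \
  --project=dress4win-prod --limit=20 --format=json
```

The same entries back the Activity page in the GCP Console, which is why that combination satisfies the audit requirement without extra setup.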
Question # 7

For this question, refer to the Dress4Win case study. Which of the compute services should be migrated as-is and would still be an optimized architecture for performance in the cloud?

Options:

A.  

Web applications deployed using App Engine standard environment

B.  

RabbitMQ deployed using an unmanaged instance group

C.  

Hadoop/Spark deployed using Cloud Dataproc Regional in High Availability mode

D.  

Jenkins, monitoring, bastion hosts, security scanners services deployed on custom machine types

Discussion 0
Question # 8

For this question, refer to the TerramEarth case study.

TerramEarth has equipped unconnected trucks with servers and sensors to collect telemetry data. Next year, they want to use the data to train machine learning models. They want to store this data in the cloud while reducing costs. What should they do?

Options:

A.  

Have the vehicle's computer compress the data in hourly snapshots, and store it in a Google Cloud Storage (GCS) Nearline bucket.

B.  

Push the telemetry data in real time to a streaming Dataflow job that compresses the data, and store it in Google BigQuery.

C.  

Push the telemetry data in real time to a streaming Dataflow job that compresses the data, and store it in Cloud Bigtable.

D.  

Have the vehicle's computer compress the data in hourly snapshots, and store it in a GCS Coldline bucket.

Discussion 0
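The Coldline pattern in option D can be sketched in two commands: create a Coldline bucket, then upload the hourly compressed snapshots. Coldline minimizes storage cost for data that will only be read next year for model training. The bucket name, location, and file name below are placeholders.

```shell
# Sketch: a Coldline bucket for rarely-accessed telemetry archives.
# Bucket name and location are hypothetical.
gsutil mb -c coldline -l us-central1 gs://terramearth-telemetry
# Upload an hourly compressed snapshot produced on the vehicle.
gsutil cp telemetry-2024-01-01T00.tar.gz gs://terramearth-telemetry/
```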
Question # 9

For this question, refer to the TerramEarth case study.

To speed up data retrieval, more vehicles will be upgraded to cellular connections and be able to transmit data to the ETL process. The current FTP process is error-prone and restarts the data transfer from the start of the file when connections fail, which happens often. You want to improve the reliability of the solution and minimize data transfer time on the cellular connections. What should you do?

Options:

A.  

Use one Google Container Engine cluster of FTP servers. Save the data to a Multi-Regional bucket. Run the ETL process using data in the bucket.

B.  

Use multiple Google Container Engine clusters running FTP servers located in different regions. Save the data to Multi-Regional buckets in us, eu, and asia. Run the ETL process using the data in the bucket.

C.  

Directly transfer the files to different Google Cloud Multi-Regional Storage bucket locations in us, eu, and asia using Google APIs over HTTP(S). Run the ETL process using the data in the bucket.

D.  

Directly transfer the files to a different Google Cloud Regional Storage bucket location in us, eu, and asia using Google APIs over HTTP(S). Run the ETL process to retrieve the data from each Regional bucket.

Discussion 0
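The direct-to-GCS transfer in option C works well over flaky cellular links because uploads through the JSON API are resumable: an interrupted transfer continues from the last committed byte instead of restarting the whole file, as the FTP process does. A hedged sketch with `gsutil` and placeholder names:

```shell
# Sketch: upload a telemetry file directly to a Multi-Regional bucket over
# HTTP(S). Setting resumable_threshold=0 forces resumable uploads even for
# small files. Bucket and file names are placeholders.
gsutil -o GSUtil:resumable_threshold=0 \
  cp vehicle-telemetry.dat gs://terramearth-upload-us/
```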
Question # 10

For this question, refer to the TerramEarth case study.

TerramEarth's CTO wants to use the raw data from connected vehicles to help identify approximately when a vehicle in the field will fail, so the development team can focus their failure analysis. You want to allow analysts to centrally query the vehicle data. Which architecture should you recommend?

The answer choices A through D are architecture diagrams, which are not reproduced here.

Options:

A.  

Option A

B.  

Option B

C.  

Option C

D.  

Option D

Discussion 0