getcertified4sure.com

Professional-Cloud-Architect Exam

A Review Of 100% Correct Professional-Cloud-Architect Latest Exam




We provide real Professional-Cloud-Architect exam questions and answers braindumps in two formats: downloadable PDF and practice tests. Pass the Google Professional-Cloud-Architect exam quickly and easily. The PDF version is available for reading and printing, so you can print it and practice as many times as you like. With the help of our Google Professional-Cloud-Architect PDF and VCE products and material, you can easily pass the Professional-Cloud-Architect exam.

Online Professional-Cloud-Architect free questions and answers of New Version:

NEW QUESTION 1

Your company is building a new architecture to support its data-centric business focus. You are responsible for setting up the network. Your company’s mobile and web-facing applications will be deployed on-premises, and all data analysis will be conducted in GCP. The plan is to process and load 7 years of archived .csv files totaling 900 TB of data and then continue loading 10 TB of data daily. You currently have an existing 100-Mbps internet connection.
What actions will meet your company’s needs?

  • A. Compress and upload both archived files and files uploaded daily using the gsutil -m option.
  • B. Lease a Transfer Appliance, upload archived files to it, and send it to Google to transfer the archived data to Cloud Storage. Establish a connection with Google using a Dedicated Interconnect or Direct Peering connection and use it to upload files daily.
  • C. Lease a Transfer Appliance, upload archived files to it, and send it to Google to transfer the archived data to Cloud Storage. Establish one Cloud VPN Tunnel to VPC networks over the public internet, and compress and upload files daily using the gsutil -m option.
  • D. Lease a Transfer Appliance, upload archived files to it, and send it to Google to transfer the archived data to Cloud Storage. Establish a Cloud VPN Tunnel to VPC networks over the public internet, and compress and upload files daily.

Answer: B

Explanation:
https://cloud.google.com/interconnect/docs/how-to/direct-peering
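Once the dedicated link is in place, the daily upload half of the chosen approach might look like the following hedged sketch (bucket name and file paths are placeholders):

```shell
# Compress the day's CSV exports, then use gsutil's -m flag for
# parallel (multi-threaded) uploads over the Dedicated Interconnect.
gzip /exports/daily/*.csv
gsutil -m cp -r /exports/daily gs://example-archive-bucket/daily/$(date +%F)/
```

The archived 900 TB goes on the Transfer Appliance instead, since even a dedicated link would take a long time to move that backlog.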

NEW QUESTION 2

Your customer support tool logs all email and chat conversations to Cloud Bigtable for retention and analysis. What is the recommended approach for sanitizing this data of personally identifiable information or payment card information before initial storage?

  • A. Hash all data using SHA256
  • B. Encrypt all data using elliptic curve cryptography
  • C. De-identify the data with the Cloud Data Loss Prevention API
  • D. Use regular expressions to find and redact phone numbers, email addresses, and credit card numbers

Answer: C

Explanation:
De-identifying the data with the Cloud Data Loss Prevention API is Google’s recommended approach for sanitizing PII and payment card data before storage: hashing or encrypting everything destroys the data’s analytic value, and hand-written regexes are error-prone.
Reference: https://cloud.google.com/solutions/pci-dss-compliance-in-gcp

NEW QUESTION 3

Your company has multiple on-premises systems that serve as sources for reporting. The data has not been maintained well and has become degraded over time. You want to use Google-recommended practices to detect anomalies in your company data. What should you do?

  • A. Upload your files into Cloud Storage. Use Cloud Datalab to explore and clean your data.
  • B. Upload your files into Cloud Storage. Use Cloud Dataprep to explore and clean your data.
  • C. Connect Cloud Datalab to your on-premises systems. Use Cloud Datalab to explore and clean your data.
  • D. Connect Cloud Dataprep to your on-premises systems. Use Cloud Dataprep to explore and clean your data.

Answer: B

Explanation:
https://cloud.google.com/dataprep/

NEW QUESTION 4

Your company has successfully migrated to the cloud and wants to analyze their data stream to optimize operations. They do not have any existing code for this analysis, so they are exploring all their options. These options include a mix of batch and stream processing, as they are running some hourly jobs and live-processing some data as it comes in. Which technology should they use for this?

  • A. Google Cloud Dataproc
  • B. Google Cloud Dataflow
  • C. Google Container Engine with Bigtable
  • D. Google Compute Engine with Google BigQuery

Answer: B

Explanation:
Dataflow handles both batch and stream processing.
Cloud Dataflow is a fully managed service for transforming and enriching data in stream (real-time) and batch (historical) modes with equal reliability and expressiveness; no complex workarounds or compromises are needed.
References: https://cloud.google.com/dataflow/

NEW QUESTION 5

One of the developers on your team deployed their application in Google Container Engine with the Dockerfile below. They report that their application deployments are taking too long.
[Exhibit: Dockerfile omitted]
You want to optimize this Dockerfile for faster deployment times without adversely affecting the app’s functionality.
Which two actions should you take? Choose 2 answers.

  • A. Remove Python after running pip.
  • B. Remove dependencies from requirements.txt.
  • C. Use a slimmed-down base image like Alpine linux.
  • D. Use larger machine types for your Google Container Engine node pools.
  • E. Copy the source after the package dependencies (Python and pip) are installed.

Answer: CE

Explanation:
Deployment speed can be improved by limiting the size of the uploaded app, limiting the complexity of the build performed by the Dockerfile, and ensuring a fast and reliable internet connection.
Note: Alpine Linux is built around musl libc and busybox. This makes it smaller and more resource efficient than traditional GNU/Linux distributions. A container requires no more than 8 MB and a minimal installation to disk requires around 130 MB of storage. Not only do you get a fully-fledged Linux environment but a large selection of packages from the repository.
References: https://groups.google.com/forum/#!topic/google-appengine/hZMEkmmObDU https://www.alpinelinux.org/about/
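Since the original Dockerfile exhibit is not reproduced here, the following is only a hedged sketch of what the two chosen optimizations typically look like (base image, file names, and entry point are illustrative):

```dockerfile
# Slim base image (answer C): far smaller than a full distro image
FROM python:3-alpine

WORKDIR /app

# Install dependencies first: this layer is cached and only rebuilt
# when requirements.txt changes, not on every source-code edit
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy the frequently changing source last (answer E), so code changes
# don't invalidate the dependency layer
COPY . .

CMD ["python", "main.py"]
```

Ordering the COPY of source code after dependency installation is what lets Docker's layer cache skip the slow pip step on most builds.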

NEW QUESTION 6

For this question, refer to the Mountkirk Games case study.
Mountkirk Games needs to create a repeatable and configurable mechanism for deploying isolated application environments. Developers and testers can access each other's environments and resources, but they cannot access staging or production resources. The staging environment needs access to some services from production.
What should you do to isolate development environments from staging and production?

  • A. Create a project for development and test and another for staging and production.
  • B. Create a network for development and test and another for staging and production.
  • C. Create one subnetwork for development and another for staging and production.
  • D. Create one project for development, a second for staging and a third for production.

Answer: D

NEW QUESTION 7

You are designing a large distributed application with 30 microservices. Each of your distributed microservices needs to connect to a database back-end. You want to store the credentials securely. Where should you store the credentials?

  • A. In the source code
  • B. In an environment variable
  • C. In a secret management system
  • D. In a config file that has restricted access through ACLs

Answer: C

Explanation:
https://cloud.google.com/docs/authentication/production#providing_credentials_to_your_application
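On GCP the secret management system would typically be Secret Manager. A hedged sketch of the workflow (secret, project, and service-account names are placeholders):

```shell
# Store the database password once, centrally
printf 'db-password-value' | gcloud secrets create orders-db-password --data-file=-

# Grant each microservice's service account read access to the secret
gcloud secrets add-iam-policy-binding orders-db-password \
  --member="serviceAccount:orders-svc@my-project.iam.gserviceaccount.com" \
  --role="roles/secretmanager.secretAccessor"

# A service fetches the current value at startup
gcloud secrets versions access latest --secret=orders-db-password
```

This keeps credentials out of source code, environment variables, and config files, and lets you rotate them without redeploying 30 services.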

NEW QUESTION 8

For this question, refer to the Mountkirk Games case study.
Mountkirk Games' gaming servers are not automatically scaling properly. Last month, they rolled out a new feature, which suddenly became very popular. A record number of users are trying to use the service, but many of them are getting 503 errors and very slow response times. What should they investigate first?

  • A. Verify that the database is online.
  • B. Verify that the project quota hasn't been exceeded.
  • C. Verify that the new feature code did not introduce any performance bugs.
  • D. Verify that the load-testing team is not running their tool against production.

Answer: B

Explanation:
503 is the Service Unavailable error. If the database were down, every request would fail rather than only some, and a performance bug in the feature code would likely have surfaced before the traffic spike. When a sudden record load coincides with 503s and slow responses, the most likely first culprit is an exceeded project quota (for example, an instance or API limit), so verify that first. https://cloud.google.com/docs/quota#capping_usage

NEW QUESTION 9

Your company has decided to build a backup replica of their on-premises user authentication PostgreSQL database on Google Cloud Platform. The database is 4 TB, and large updates are frequent. Replication requires private address space communication. Which networking approach should you use?

  • A. Google Cloud Dedicated Interconnect
  • B. Google Cloud VPN connected to the data center network
  • C. A NAT and TLS translation gateway installed on-premises
  • D. A Google Compute Engine instance with a VPN server installed connected to the data center network

Answer: A

Explanation:
https://cloud.google.com/docs/enterprise/best-practices-for-enterprise-organizations
Google Cloud Dedicated Interconnect provides direct physical connections and RFC 1918 communication between your on-premises network and Google’s network. Dedicated Interconnect enables you to transfer large amounts of data between networks, which can be more cost effective than purchasing additional bandwidth over the public Internet or using VPN tunnels.
Benefits:
  • Traffic between your on-premises network and your VPC network doesn’t traverse the public Internet. Traffic traverses a dedicated connection with fewer hops, meaning there are fewer points of failure where traffic might get dropped or disrupted.
  • Your VPC network’s internal (RFC 1918) IP addresses are directly accessible from your on-premises network. You don’t need to use a NAT device or VPN tunnel to reach internal IP addresses. Currently, you can only reach internal IP addresses over a dedicated connection. To reach Google external IP addresses, you must use a separate connection.
  • You can scale your connection to Google based on your needs. Connection capacity is delivered over one or more 10-Gbps Ethernet connections, with a maximum of eight connections (80 Gbps total per interconnect).
  • The cost of egress traffic from your VPC network to your on-premises network is reduced. A dedicated connection is generally the least expensive method if you have a high volume of traffic to and from Google’s network.
References: https://cloud.google.com/interconnect/docs/details/dedicated

NEW QUESTION 10

Your web application has several VM instances running within a VPC. You want to restrict communications between instances to only the paths and ports you authorize, but you don’t want to rely on static IP addresses or subnets because the app can autoscale. How should you restrict communications?

  • A. Use separate VPCs to restrict traffic
  • B. Use firewall rules based on network tags attached to the compute instances
  • C. Use Cloud DNS and only allow connections from authorized hostnames
  • D. Use service accounts and configure the web application so that particular service accounts have access

Answer: B
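A hedged sketch of the tag-based approach (network, rule, tag, and instance names are placeholders): tags travel with instances, so the rule keeps working as the group autoscales.

```shell
# Allow only web-tier VMs to reach API-tier VMs on port 8080,
# independent of the instances' (ephemeral) IP addresses
gcloud compute firewall-rules create allow-web-to-api \
  --network=my-vpc \
  --allow=tcp:8080 \
  --source-tags=web-tier \
  --target-tags=api-tier

# Tags are attached to instances (or to the instance template used by
# the autoscaled managed instance group)
gcloud compute instances add-tags api-vm-1 --tags=api-tier --zone=us-central1-a
```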

NEW QUESTION 11

You want your Google Kubernetes Engine cluster to automatically add or remove nodes based on CPU load. What should you do?

  • A. Configure a HorizontalPodAutoscaler with a target CPU usage. Enable the Cluster Autoscaler from the GCP Console.
  • B. Configure a HorizontalPodAutoscaler with a target CPU usage. Enable autoscaling on the managed instance group for the cluster using the gcloud command.
  • C. Create a deployment and set the maxUnavailable and maxSurge properties. Enable the Cluster Autoscaler using the gcloud command.
  • D. Create a deployment and set the maxUnavailable and maxSurge properties. Enable autoscaling on the cluster managed instance group from the GCP Console.

Answer: A

Explanation:
The HorizontalPodAutoscaler scales pods on CPU usage, and the Cluster Autoscaler then adds nodes when pods become unschedulable and removes idle nodes. Enabling Compute Engine autoscaling directly on a GKE node pool's managed instance group is not supported, and maxUnavailable/maxSurge control rolling updates, not scaling.
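A hedged sketch of the HorizontalPodAutoscaler plus Cluster Autoscaler combination (cluster, deployment, and node-pool names and sizes are placeholders):

```shell
# Scale pods on CPU: more load -> more replicas -> pending pods
kubectl autoscale deployment game-frontend --cpu-percent=60 --min=3 --max=30

# The Cluster Autoscaler (also available in the console) adds nodes to
# schedule those pending pods and removes nodes when they sit idle
gcloud container clusters update my-cluster \
  --enable-autoscaling --min-nodes=3 --max-nodes=10 \
  --node-pool=default-pool --zone=us-central1-a
```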

NEW QUESTION 12

Your customer wants to capture multiple GBs of aggregate real-time key performance indicators (KPIs) from their game servers running on Google Cloud Platform and monitor the KPIs with low latency. How should they capture the KPIs?

  • A. Store time-series data from the game servers in Google Bigtable, and view it using Google Data Studio.
  • B. Output custom metrics to Stackdriver from the game servers, and create a Dashboard in Stackdriver Monitoring Console to view them.
  • C. Schedule BigQuery load jobs to ingest analytics files uploaded to Cloud Storage every ten minutes, and visualize the results in Google Data Studio.
  • D. Insert the KPIs into Cloud Datastore entities, and run ad hoc analysis and visualizations of them in Cloud Datalab.

Answer: B

Explanation:
Writing custom metrics to Stackdriver from the game servers and viewing them on a Stackdriver Monitoring dashboard gives the low-latency, real-time view the customer wants; the Cloud Storage/BigQuery and Datastore options introduce batch delays or ad hoc tooling.
https://cloud.google.com/monitoring/api/v3/metrics-details#metric-kinds

NEW QUESTION 13

You are designing an application for use only during business hours. For the minimum viable product release, you’d like to use a managed product that automatically “scales to zero” so you don’t incur costs when there is no activity.
Which primary compute resource should you choose?

  • A. Cloud Functions
  • B. Compute Engine
  • C. Kubernetes Engine
  • D. App Engine flexible environment

Answer: A

Explanation:
https://cloud.google.com/serverless-options

NEW QUESTION 14

You write a Python script to connect to Google BigQuery from a Google Compute Engine virtual machine. The script is printing errors that it cannot connect to BigQuery. What should you do to fix the script?

  • A. Install the latest BigQuery API client library for Python
  • B. Run your script on a new virtual machine with the BigQuery access scope enabled
  • C. Create a new service account with BigQuery access and execute your script with that user
  • D. Install the bq component for gcloud with the command gcloud components install bq.

Answer: B

Explanation:
The error is most likely caused by an access-scope issue. A new instance runs as the Compute Engine default service account, but most API scopes, including BigQuery, are not enabled by default. You can stop the existing instance, edit its access scopes to include BigQuery, and restart it; running the script on a new virtual machine created with the BigQuery access scope enabled works just as well.
https://cloud.google.com/compute/docs/access/service-accounts
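The scope fix can be sketched with gcloud (instance name and zone are placeholders):

```shell
# Create the replacement VM with the BigQuery access scope enabled.
# (Alternatively: stop the existing VM, edit its access scopes, restart.)
gcloud compute instances create bq-script-vm \
  --zone=us-central1-a \
  --scopes=https://www.googleapis.com/auth/bigquery
```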

NEW QUESTION 15

Your company places a high value on being responsive and meeting customer needs quickly. Their primary business objectives are release speed and agility. You want to reduce the chance of security errors being accidentally introduced. Which two actions can you take? Choose 2 answers

  • A. Ensure every code check-in is peer reviewed by a security SME.
  • B. Use source code security analyzers as part of the CI/CD pipeline.
  • C. Ensure you have stubs to unit test all interfaces between components.
  • D. Enable code signing and a trusted binary repository integrated with your CI/CD pipeline.
  • E. Run a vulnerability security scanner as part of your continuous-integration /continuous-delivery (CI/CD) pipeline.

Answer: BE

Explanation:
https://docs.microsoft.com/en-us/vsts/articles/security-validation-cicd-pipeline?view=vsts

NEW QUESTION 16

You need to set up Microsoft SQL Server on GCP. Management requires that there’s no downtime in case of a data center outage in any of the zones within a GCP region. What should you do?

  • A. Configure a Cloud SQL instance with high availability enabled.
  • B. Configure a Cloud Spanner instance with a regional instance configuration.
  • C. Set up SQL Server on Compute Engine, using Always On Availability Groups using Windows Failover Clustering. Place nodes in different subnets.
  • D. Set up SQL Server Always On Availability Groups using Windows Failover Clustering. Place nodes in different zones.

Answer: A

Explanation:
https://cloud.google.com/sql/docs/sqlserver/configure-ha
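A hedged sketch of creating a zone-redundant Cloud SQL for SQL Server instance (name, version, region, and sizing are placeholders):

```shell
# REGIONAL availability places the primary and a standby in different
# zones of the region, with automatic failover between them
gcloud sql instances create sqlserver-ha \
  --database-version=SQLSERVER_2017_STANDARD \
  --availability-type=REGIONAL \
  --region=us-central1 \
  --root-password=CHANGE_ME \
  --cpu=4 --memory=26GB
```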

NEW QUESTION 17

You need to develop procedures to test a disaster plan for a mission-critical application. You want to use Google-recommended practices and native capabilities within GCP.
What should you do?

  • A. Use Deployment Manager to automate service provisioning. Use Activity Logs to monitor and debug your tests.
  • B. Use Deployment Manager to automate service provisioning. Use Stackdriver to monitor and debug your tests.
  • C. Use gcloud scripts to automate service provisioning. Use Activity Logs to monitor and debug your tests.
  • D. Use automated scripts to automate service provisioning. Use Activity Logs to monitor and debug your tests.

Answer: B

Explanation:
https://cloud.google.com/solutions/dr-scenarios-planning-guide

NEW QUESTION 18

You are designing a mobile chat application. You want to ensure people cannot spoof chat messages by proving that a message was sent by a specific user.
What should you do?

  • A. Tag messages client side with the originating user identifier and the destination user.
  • B. Encrypt the message client side using block-based encryption with a shared key.
  • C. Use public key infrastructure (PKI) to encrypt the message client side using the originating user's private key.
  • D. Use a trusted certificate authority to enable SSL connectivity between the client application and the server.

Answer: C

NEW QUESTION 19

One of your primary business objectives is being able to trust the data stored in your application. You want to log all changes to the application data. How can you design your logging system to verify authenticity of your logs?

  • A. Write the log concurrently in the cloud and on premises.
  • B. Use a SQL database and limit who can modify the log table.
  • C. Digitally sign each timestamp and log entry and store the signature.
  • D. Create a JSON dump of each log entry and store it in Google Cloud Storage.

Answer: C

Explanation:
https://cloud.google.com/storage/docs/access-logs
References: https://cloud.google.com/logging/docs/reference/tools/gcloud-logging
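A minimal sketch of signing each log entry, using an HMAC for brevity (key handling is elided; in practice the signing key would live in a key management service, and an asymmetric signature could be used instead):

```python
import hashlib
import hmac
import json
import time

SIGNING_KEY = b"example-secret-key"  # placeholder; keep real keys in a KMS

def sign_entry(entry: dict) -> dict:
    """Attach a timestamp and an HMAC-SHA256 signature to a log entry."""
    entry = {**entry, "ts": entry.get("ts", time.time())}
    payload = json.dumps(entry, sort_keys=True).encode()
    entry["sig"] = hmac.new(SIGNING_KEY, payload, hashlib.sha256).hexdigest()
    return entry

def verify_entry(entry: dict) -> bool:
    """Recompute the signature over everything except 'sig' and compare."""
    body = {k: v for k, v in entry.items() if k != "sig"}
    payload = json.dumps(body, sort_keys=True).encode()
    expected = hmac.new(SIGNING_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, entry.get("sig", ""))

signed = sign_entry({"user": "alice", "change": "email updated"})
assert verify_entry(signed)            # untouched entry verifies
signed["change"] = "role escalated"    # tampering breaks the signature
assert not verify_entry(signed)
```

Because each entry carries its own signature, an auditor can verify authenticity entry by entry without trusting whoever holds the log storage.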

NEW QUESTION 20

You are deploying a PHP App Engine Standard service with SQL as the backend. You want to minimize the number of queries to the database.
What should you do?

  • A. Set the memcache service level to dedicated. Create a key from the hash of the query, and return database values from memcache before issuing a query to Cloud SQL.
  • B. Set the memcache service level to dedicated. Create a cron task that runs every minute to populate the cache with keys containing query results.
  • C. Set the memcache service level to shared. Create a cron task that runs every minute to save all expected queries to a key called “cached-queries”.
  • D. Set the memcache service level to shared. Create a key called “cached-queries”, and return database values from the key before using a query to Cloud SQL.

Answer: A

Explanation:
https://cloud.google.com/appengine/docs/standard/php/memcache/using
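The key-from-the-hash-of-the-query pattern in answer A can be sketched as follows; this is a language-neutral illustration with a plain dict standing in for memcache (the actual App Engine service would use the PHP memcache client):

```python
import hashlib

cache = {}  # stand-in for memcache; real code would call the memcache API

def run_query(sql, params, db_lookup):
    """Return cached results when possible; otherwise query and cache."""
    # Cache key = hash of the normalized query text plus its parameters
    raw = repr((sql.strip().lower(), params)).encode()
    key = hashlib.sha256(raw).hexdigest()
    if key in cache:
        return cache[key]
    result = db_lookup(sql, params)   # only hit Cloud SQL on a cache miss
    cache[key] = result
    return result

calls = []
def fake_db(sql, params):
    calls.append(sql)
    return [("row1",)]

run_query("SELECT * FROM t WHERE id=%s", (1,), fake_db)
run_query("select * from t where id=%s", (1,), fake_db)  # normalized: cache hit
assert len(calls) == 1  # the database was queried only once
```

Real code would also set an expiry on each key so stale results age out after the underlying rows change.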

NEW QUESTION 21
......

Recommend!! Get the Full Professional-Cloud-Architect dumps in VCE and PDF From Downloadfreepdf.net, Welcome to Download: https://www.downloadfreepdf.net/Professional-Cloud-Architect-pdf-download.html (New 170 Q&As Version)