getcertified4sure.com

SAA-C03 Exam

A Review of Free SAA-C03 Questions




Accurate SAA-C03 exam materials and practice tests for the Amazon Web Services certification. Real success guaranteed with updated SAA-C03 PDF and VCE dump materials. 100% pass the AWS Certified Solutions Architect - Associate (SAA-C03) exam today!

Here are free SAA-C03 dump questions for you:

NEW QUESTION 1
A company wants to run a gaming application on Amazon EC2 instances that are part of an Auto Scaling group in the AWS Cloud. The application will transmit data by using UDP packets. The company wants to ensure that the application can scale out and in as traffic increases and decreases.
What should a solutions architect do to meet these requirements?

  • A. Attach a Network Load Balancer to the Auto Scaling group.
  • B. Attach an Application Load Balancer to the Auto Scaling group.
  • C. Deploy an Amazon Route 53 record set with a weighted policy to route traffic appropriately.
  • D. Deploy a NAT instance that is configured with port forwarding to the EC2 instances in the Auto Scaling group.

Answer: A

Explanation:
Application Load Balancers support only HTTP and HTTPS. A Network Load Balancer operates at layer 4 and supports TCP and UDP listeners, so it is the correct choice for a UDP-based gaming application behind an Auto Scaling group.

NEW QUESTION 2
A company is running a high performance computing (HPC) workload on AWS across many Linux-based Amazon EC2 instances. The company needs a shared storage system that is capable of sub-millisecond latencies, hundreds of Gbps of throughput, and millions of IOPS. Users will store millions of small files.
Which solution meets these requirements?

  • A. Create an Amazon Elastic File System (Amazon EFS) file system. Mount the file system on each of the EC2 instances.
  • B. Create an Amazon S3 bucket. Mount the S3 bucket on each of the EC2 instances.
  • C. Ensure that the EC2 instances are Amazon Elastic Block Store (Amazon EBS) optimized. Mount Provisioned IOPS SSD (io2) EBS volumes with Multi-Attach on each instance.
  • D. Create an Amazon FSx for Lustre file system. Mount the file system on each of the EC2 instances.

Answer: D

Explanation:
Amazon FSx for Lustre is purpose-built for HPC workloads and delivers sub-millisecond latencies, up to hundreds of Gbps of throughput, and millions of IOPS as a shared POSIX file system.

NEW QUESTION 3
A solutions architect is designing the cloud architecture for a new application being deployed on AWS. The process should run in parallel while adding and removing application nodes as needed based on the number of jobs to be processed. The processor application is stateless. The solutions architect must ensure that the application is loosely coupled and the job items are durably stored.
Which design should the solutions architect use?

  • A. Create an Amazon SNS topic to send the jobs that need to be processed. Create an Amazon Machine Image (AMI) that consists of the processor application. Create a launch configuration that uses the AMI. Create an Auto Scaling group using the launch configuration. Set the scaling policy for the Auto Scaling group to add and remove nodes based on CPU usage.
  • B. Create an Amazon SQS queue to hold the jobs that need to be processed. Create an Amazon Machine Image (AMI) that consists of the processor application. Create a launch configuration that uses the AMI. Create an Auto Scaling group using the launch configuration. Set the scaling policy for the Auto Scaling group to add and remove nodes based on network usage.
  • C. Create an Amazon SQS queue to hold the jobs that need to be processed. Create an Amazon Machine Image (AMI) that consists of the processor application. Create a launch template that uses the AMI. Create an Auto Scaling group using the launch template. Set the scaling policy for the Auto Scaling group to add and remove nodes based on the number of items in the SQS queue.
  • D. Create an Amazon SNS topic to send the jobs that need to be processed. Create an Amazon Machine Image (AMI) that consists of the processor application. Create a launch template that uses the AMI. Create an Auto Scaling group using the launch template. Set the scaling policy for the Auto Scaling group to add and remove nodes based on the number of messages published to the SNS topic.

Answer: C

Explanation:
"Create an Amazon SQS queue to hold the jobs that need to be processed. Create an Amazon EC2 Auto Scaling group for the compute application. Set the scaling policy for the Auto Scaling group to add and remove nodes based on the number of items in the SQS queue."
In this case we need a durable and loosely coupled solution for storing jobs. Amazon SQS is ideal for this use case, and the Auto Scaling group can be configured to scale dynamically based on the number of jobs waiting in the queue. To configure this scaling, use the backlog-per-instance metric, with the target value being the acceptable backlog per instance to maintain. To calculate the backlog per instance, start with the ApproximateNumberOfMessages queue attribute to determine the length of the SQS queue, then divide by the number of running instances.
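The backlog-per-instance calculation above can be sketched in a few lines of Python. The queue length and instance count below are illustrative values, and the acceptable backlog per instance (your latency target divided by the average processing time per message) is an assumption you would tune for your workload:

```python
import math

def desired_capacity(approx_messages: int, running_instances: int,
                     acceptable_backlog_per_instance: float) -> int:
    """Target-tracking style calculation: scale so that the backlog
    per instance stays at or below the acceptable value.

    approx_messages corresponds to the SQS ApproximateNumberOfMessages
    queue attribute mentioned in the explanation above.
    """
    backlog_per_instance = approx_messages / running_instances
    if backlog_per_instance <= acceptable_backlog_per_instance:
        return running_instances  # already within target, no scale-out
    # Round up so the backlog target is never exceeded.
    return math.ceil(approx_messages / acceptable_backlog_per_instance)

# Example: 1,000 queued jobs, 4 instances, each instance may lag 100 jobs.
print(desired_capacity(1000, 4, 100))  # -> 10
```

In a real deployment you would publish the backlog-per-instance value as a custom CloudWatch metric and let a target tracking scaling policy drive the Auto Scaling group toward it.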

NEW QUESTION 4
A company's ecommerce website has unpredictable traffic and uses AWS Lambda functions to directly access a private Amazon RDS for PostgreSQL DB instance. The company wants to maintain predictable database performance and ensure that the Lambda invocations do not overload the database with too many connections.
What should a solutions architect do to meet these requirements?

  • A. Point the client driver at an RDS custom endpoint. Deploy the Lambda functions inside a VPC.
  • B. Point the client driver at an RDS Proxy endpoint. Deploy the Lambda functions inside a VPC.
  • C. Point the client driver at an RDS custom endpoint. Deploy the Lambda functions outside a VPC.
  • D. Point the client driver at an RDS Proxy endpoint. Deploy the Lambda functions outside a VPC.

Answer: B

Explanation:
RDS Proxy pools and shares database connections, protecting the database from being overwhelmed by many short-lived Lambda connections. The Lambda functions must run inside the VPC to reach the private DB instance.

NEW QUESTION 5
A solutions architect is using an AWS CloudFormation template to deploy a three-tier web application. The web application consists of a web tier and an application tier that stores and retrieves user data in Amazon DynamoDB tables. The web and application tiers are hosted on Amazon EC2 instances, and the database tier is not publicly accessible. The application EC2 instances need to access the DynamoDB tables without exposing API credentials in the template.
What should the solution architect do to meet the requirements?

  • A. Create an IAM role to read the DynamoDB tables. Associate the role with the application instances by referencing an instance profile.
  • B. Create an IAM role that has the required permissions to read and write from the DynamoDB tables. Add the role to the EC2 instance profile, and associate the instance profile with the application instances.
  • C. Use the parameters section in the AWS CloudFormation template to have the user input the access and secret keys from an already-created IAM user that has the required permissions to read and write from the DynamoDB tables.
  • D. Create an IAM user in the AWS CloudFormation template that has the required permissions to read and write from the DynamoDB tables. Use the GetAtt function to retrieve the access and secret keys, and pass them to the application instances through user data.

Answer: B

Explanation:
An IAM role attached through an instance profile supplies temporary credentials to the instances automatically, so no access keys appear in the CloudFormation template. The role must allow both read and write access to the DynamoDB tables.
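A role of the kind this question describes could carry a policy along the following lines. This is a minimal sketch, not a vetted production policy, and the account ID, region, and table name in the ARN are placeholders:

```python
import json

# Hypothetical table ARN -- substitute your account, region, and table name.
TABLE_ARN = "arn:aws:dynamodb:us-east-1:123456789012:table/UserData"

# Minimal read/write policy for the application tier.
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "dynamodb:GetItem",
                "dynamodb:Query",
                "dynamodb:PutItem",
                "dynamodb:UpdateItem",
                "dynamodb:DeleteItem",
            ],
            "Resource": TABLE_ARN,
        }
    ],
}

print(json.dumps(policy, indent=2))
```

In CloudFormation, this policy document would sit inside an AWS::IAM::Role, with an AWS::IAM::InstanceProfile referencing the role; no access keys ever appear in the template.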

NEW QUESTION 6
A gaming company wants to launch a new internet-facing application in multiple AWS Regions. The application will use the TCP and UDP protocols for communication. The company needs to provide high availability and minimum latency for global users.
Which combination of actions should a solutions architect take to meet these requirements? (Select TWO.)

  • A. Create internal Network Load Balancers in front of the application in each Region.
  • B. Create external Application Load Balancers in front of the application in each Region.
  • C. Create an AWS Global Accelerator accelerator to route traffic to the load balancers in each Region.
  • D. Configure Amazon Route 53 to use a geolocation routing policy to distribute the traffic.
  • E. Configure Amazon CloudFront to handle the traffic and route requests to the application in each Region.

Answer: AC

NEW QUESTION 7
A company has created an image analysis application in which users can upload photos and add photo frames to their images. The users upload images and metadata to indicate which photo frames they want to add to their images. The application uses a single Amazon EC2 instance and Amazon DynamoDB to store the metadata.
The application is becoming more popular, and the number of users is increasing. The company expects the number of concurrent users to vary significantly depending on the time of day and day of week. The company must ensure that the application can scale to meet the needs of the growing user base.
Which solution meets these requirements?

  • A. Use AWS Lambda to process the photos. Store the photos and metadata in DynamoDB.
  • B. Use Amazon Kinesis Data Firehose to process the photos and to store the photos and metadata.
  • C. Use AWS Lambda to process the photos. Store the photos in Amazon S3. Retain DynamoDB to store the metadata.
  • D. Increase the number of EC2 instances to three. Use Provisioned IOPS SSD (io2) Amazon Elastic Block Store (Amazon EBS) volumes to store the photos and metadata.

Answer: C

Explanation:
Lambda scales automatically with the number of concurrent users, Amazon S3 is suited to storing image objects (DynamoDB items are limited to 400 KB), and DynamoDB continues to hold the metadata.

NEW QUESTION 8
A company wants to migrate its existing on-premises monolithic application to AWS.
The company wants to keep as much of the front- end code and the backend code as possible. However, the company wants to break the application into smaller applications. A different team will manage each application. The company needs a highly scalable solution that minimizes operational overhead.
Which solution will meet these requirements?

  • A. Host the application on AWS Lambda. Integrate the application with Amazon API Gateway.
  • B. Host the application with AWS Amplify. Connect the application to an Amazon API Gateway API that is integrated with AWS Lambda.
  • C. Host the application on Amazon EC2 instances. Set up an Application Load Balancer with EC2 instances in an Auto Scaling group as targets.
  • D. Host the application on Amazon Elastic Container Service (Amazon ECS). Set up an Application Load Balancer with Amazon ECS as the target.

Answer: B

Explanation:
AWS Amplify hosts the front-end code, while API Gateway with AWS Lambda lets each team manage its own small backend service; all are fully managed, highly scalable, and minimize operational overhead.

NEW QUESTION 9
A company is experiencing sudden increases in demand. The company needs to provision large Amazon EC2 instances from an Amazon Machine Image (AMI). The instances will run in an Auto Scaling group. The company needs a solution that provides minimum initialization latency to meet the demand.
Which solution meets these requirements?

  • A. Use the aws ec2 register-image command to create an AMI from a snapshot. Use AWS Step Functions to replace the AMI in the Auto Scaling group.
  • B. Enable Amazon Elastic Block Store (Amazon EBS) fast snapshot restore on a snapshot. Provision an AMI by using the snapshot. Replace the AMI in the Auto Scaling group with the new AMI.
  • C. Enable AMI creation and define lifecycle rules in Amazon Data Lifecycle Manager (Amazon DLM). Create an AWS Lambda function that modifies the AMI in the Auto Scaling group.
  • D. Use Amazon EventBridge (Amazon CloudWatch Events) to invoke AWS Backup lifecycle policies that provision AMIs. Configure Auto Scaling group capacity limits as an event source in EventBridge (CloudWatch Events).

Answer: B

NEW QUESTION 10
A company has an application with a REST-based interface that allows data to be received in near-real time from a third-party vendor. Once received, the application processes and stores the data for further analysis. The application is running on Amazon EC2 instances.
The third-party vendor has received many 503 Service Unavailable errors when sending data to the application. When the data volume spikes, the compute capacity reaches its maximum limit and the application is unable to process all requests.
Which design should a solutions architect recommend to provide a more scalable solution?

  • A. Use Amazon Kinesis Data Streams to ingest the data. Process the data using AWS Lambda functions.
  • B. Use Amazon API Gateway on top of the existing application. Create a usage plan with a quota limit for the third-party vendor.
  • C. Use Amazon Simple Notification Service (Amazon SNS) to ingest the data. Put the EC2 instances in an Auto Scaling group behind an Application Load Balancer.
  • D. Repackage the application as a container. Deploy the application using Amazon Elastic Container Service (Amazon ECS) using the EC2 launch type with an Auto Scaling group.

Answer: A
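The Kinesis-plus-Lambda design in option A decouples ingestion from processing: the stream absorbs spikes while Lambda scales out with the shard count. A minimal handler might look like the following sketch; assuming the vendor's record payloads are UTF-8 JSON (the actual format is not stated in the question):

```python
import base64
import json

def handler(event, context):
    """Process a batch of Kinesis records delivered to AWS Lambda.

    Kinesis event records carry base64-encoded payloads under
    event["Records"][i]["kinesis"]["data"].
    """
    results = []
    for record in event["Records"]:
        payload = base64.b64decode(record["kinesis"]["data"])
        results.append(json.loads(payload))  # assumes JSON payloads
    # Store or forward the decoded items for further analysis here.
    return {"processed": len(results)}

# Local smoke test with a fabricated event.
fake_event = {
    "Records": [
        {"kinesis": {"data": base64.b64encode(b'{"id": 1}').decode()}}
    ]
}
print(handler(fake_event, None))  # -> {'processed': 1}
```

Because the stream buffers data durably for at least 24 hours, a burst that exceeds processing capacity produces backlog rather than 503 errors.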

NEW QUESTION 11
A company uses NFS to store large video files in on-premises network-attached storage. Each video file ranges in size from 1 MB to 500 GB. The total storage is 70 TB and is no longer growing. The company decides to migrate the video files to Amazon S3. The company must migrate the video files as soon as possible while using the least possible network bandwidth.
Which solution will meet these requirements?

  • A. Create an S3 bucket. Create an IAM role that has permissions to write to the S3 bucket. Use the AWS CLI to copy all files locally to the S3 bucket.
  • B. Create an AWS Snowball Edge job. Receive a Snowball Edge device on premises. Use the Snowball Edge client to transfer data to the device. Return the device so that AWS can import the data into Amazon S3.
  • C. Deploy an S3 File Gateway on premises. Create a public service endpoint to connect to the S3 File Gateway. Create an S3 bucket. Create a new NFS file share on the S3 File Gateway. Point the new file share to the S3 bucket. Transfer the data from the existing NFS file share to the S3 File Gateway.
  • D. Set up an AWS Direct Connect connection between the on-premises network and AWS. Deploy an S3 File Gateway on premises. Create a public virtual interface (VIF) to connect to the S3 File Gateway. Create an S3 bucket. Create a new NFS file share on the S3 File Gateway. Point the new file share to the S3 bucket. Transfer the data from the existing NFS file share to the S3 File Gateway.

Answer: B

Explanation:
An AWS Snowball Edge device moves the 70 TB offline, so the migration uses the least possible network bandwidth.

NEW QUESTION 12
A company is building an ecommerce application and needs to store sensitive customer information. The company needs to give customers the ability to complete purchase transactions on the website. The company also needs to ensure that sensitive customer data is protected, even from database administrators.
Which solution meets these requirements?

  • A. Store sensitive data in an Amazon Elastic Block Store (Amazon EBS) volume. Use EBS encryption to encrypt the data. Use an IAM instance role to restrict access.
  • B. Store sensitive data in Amazon RDS for MySQL. Use AWS Key Management Service (AWS KMS) client-side encryption to encrypt the data.
  • C. Store sensitive data in Amazon S3. Use AWS Key Management Service (AWS KMS) server-side encryption to encrypt the data. Use S3 bucket policies to restrict access.
  • D. Store sensitive data in Amazon FSx for Windows Server. Mount the file share on application servers. Use Windows file permissions to restrict access.

Answer: B

Explanation:
Client-side encryption encrypts the data before it reaches the database, so even database administrators cannot read the sensitive values. Server-side or volume-level encryption still exposes plaintext to anyone with database access.

NEW QUESTION 13
A company that recently started using AWS establishes a Site-to-Site VPN between its on-premises data center and AWS. The company’s security mandate states that traffic originating from on premises should stay within the company’s private IP space when communicating with an Amazon Elastic Container Service (Amazon ECS) cluster that is hosting a sample web application.
Which solution meets this requirement?

  • A. Configure a gateway endpoint for Amazon ECS. Modify the route table to include an entry pointing to the ECS cluster.
  • B. Create a Network Load Balancer and AWS PrivateLink endpoint for Amazon ECS in the same VPC that is hosting the ECS cluster.
  • C. Create a Network Load Balancer in one VPC and an AWS PrivateLink endpoint for Amazon ECS in another VPC. Connect the two by using VPC peering.
  • D. Configure an Amazon Route 53 record with Amazon ECS as the target. Apply a server certificate to Route 53 from AWS Certificate Manager (ACM) for SSL offloading.

Answer: B

Explanation:
Gateway endpoints exist only for Amazon S3 and Amazon DynamoDB, so option A is not possible. A Network Load Balancer with an AWS PrivateLink endpoint in the VPC that hosts the ECS cluster keeps the traffic on private IP addresses end to end.

NEW QUESTION 14
A company wants to migrate its on-premises application to AWS. The application produces output files that vary in size from tens of gigabytes to hundreds of terabytes. The application data must be stored in a standard file system structure. The company wants a solution that scales automatically, is highly available, and requires minimum operational overhead.
Which solution will meet these requirements?

  • A. Migrate the application to run as containers on Amazon Elastic Container Service (Amazon ECS). Use Amazon S3 for storage.
  • B. Migrate the application to run as containers on Amazon Elastic Kubernetes Service (Amazon EKS). Use Amazon Elastic Block Store (Amazon EBS) for storage.
  • C. Migrate the application to Amazon EC2 instances in a Multi-AZ Auto Scaling group. Use Amazon Elastic File System (Amazon EFS) for storage.
  • D. Migrate the application to Amazon EC2 instances in a Multi-AZ Auto Scaling group. Use Amazon Elastic Block Store (Amazon EBS) for storage.

Answer: C

NEW QUESTION 15
A company is storing sensitive user information in an Amazon S3 bucket. The company wants to provide secure access to this bucket from the application tier running on Amazon EC2 instances inside a VPC.
Which combination of steps should a solutions architect take to accomplish this? (Select TWO.)

  • A. Configure a VPC gateway endpoint for Amazon S3 within the VPC.
  • B. Create a bucket policy to make the objects in the S3 bucket public.
  • C. Create a bucket policy that limits access to only the application tier running in the VPC.
  • D. Create an IAM user with an S3 access policy and copy the IAM credentials to the EC2 instance.
  • E. Create a NAT instance and have the EC2 instances use the NAT instance to access the S3 bucket.

Answer: AC

Explanation:
A gateway endpoint keeps S3 traffic inside the AWS network, and a bucket policy restricted to the VPC limits access to the application tier. Making objects public or copying long-lived IAM credentials onto instances would weaken security rather than provide secure access.

NEW QUESTION 16
A company uses a legacy application to produce data in CSV format. The legacy application stores the output data in Amazon S3. The company is deploying a new commercial off-the-shelf (COTS) application that can perform complex SQL queries to analyze data that is stored in Amazon Redshift and Amazon S3 only. However, the COTS application cannot process the .csv files that the legacy application produces. The company cannot update the legacy application to produce data in another format. The company needs to implement a solution so that the COTS application can use the data that the legacy application produces.
Which solution will meet these requirements with the LEAST operational overhead?

  • A. Create an AWS Glue extract, transform, and load (ETL) job that runs on a schedule. Configure the ETL job to process the .csv files and store the processed data in Amazon Redshift.
  • B. Develop a Python script that runs on Amazon EC2 instances to convert the .csv files to .sql files. Invoke the Python script on a cron schedule to store the output files in Amazon S3.
  • C. Create an AWS Lambda function and an Amazon DynamoDB table. Use an S3 event to invoke the Lambda function. Configure the Lambda function to perform an extract, transform, and load (ETL) job to process the .csv files and store the processed data in the DynamoDB table.
  • D. Use Amazon EventBridge (Amazon CloudWatch Events) to launch an Amazon EMR cluster on a weekly schedule. Configure the EMR cluster to perform an extract, transform, and load (ETL) job to process the .csv files and store the processed data in an Amazon Redshift table.

Answer: A

Explanation:
The COTS application can query only Amazon Redshift and Amazon S3, so loading the processed data into Redshift with a fully managed AWS Glue ETL job meets the requirement with the least operational overhead. Option C stores the data in DynamoDB, which the COTS application cannot query.

NEW QUESTION 17
An image hosting company uploads its large assets to Amazon S3 Standard buckets. The company uses multipart upload in parallel by using S3 APIs and overwrites if the same object is uploaded again. For the first 30 days after upload, the objects will be accessed frequently. The objects will be used less frequently after 30 days, but the access patterns for each object will be inconsistent. The company must optimize its S3 storage costs while maintaining high availability and resiliency of stored assets.
Which combination of actions should a solutions architect recommend to meet these requirements? (Select TWO.)

  • A. Move assets to S3 Intelligent-Tiering after 30 days.
  • B. Configure an S3 Lifecycle policy to clean up incomplete multipart uploads.
  • C. Configure an S3 Lifecycle policy to clean up expired object delete markers.
  • D. Move assets to S3 Standard-Infrequent Access (S3 Standard-IA) after 30 days.
  • E. Move assets to S3 One Zone-Infrequent Access (S3 One Zone-IA) after 30 days.

Answer: AB

Explanation:
S3 Intelligent-Tiering fits objects whose access patterns become inconsistent after 30 days, and a lifecycle rule that aborts incomplete multipart uploads removes orphaned parts that would otherwise keep accruing storage charges.
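For reference, the Intelligent-Tiering transition (option A) and the incomplete-multipart cleanup (option B) can be expressed in a single S3 lifecycle configuration. The dictionary below follows the shape that boto3's put_bucket_lifecycle_configuration accepts; the rule IDs and the 7-day abort window are arbitrary choices, and the API call itself is only shown in a comment:

```python
lifecycle_configuration = {
    "Rules": [
        {
            "ID": "tier-after-30-days",          # arbitrary rule name
            "Status": "Enabled",
            "Filter": {"Prefix": ""},            # apply to all objects
            # Move assets to Intelligent-Tiering after 30 days.
            "Transitions": [
                {"Days": 30, "StorageClass": "INTELLIGENT_TIERING"}
            ],
        },
        {
            "ID": "abort-incomplete-multipart",  # arbitrary rule name
            "Status": "Enabled",
            "Filter": {"Prefix": ""},
            # Clean up parts of multipart uploads that never complete.
            "AbortIncompleteMultipartUpload": {"DaysAfterInitiation": 7},
        },
    ]
}

# With boto3 this would be applied as:
# s3.put_bucket_lifecycle_configuration(
#     Bucket="my-bucket", LifecycleConfiguration=lifecycle_configuration)
print(len(lifecycle_configuration["Rules"]))  # -> 2
```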

NEW QUESTION 18
......

Thanks for reading the newest SAA-C03 exam dumps! We recommend you to try the PREMIUM Dumpscollection.com SAA-C03 dumps in VCE and PDF here: https://www.dumpscollection.net/dumps/SAA-C03/ (0 Q&As Dumps)