Free Amazon AWS-DEA-C01 Exam Questions

Become Amazon Certified with updated AWS-DEA-C01 exam questions and correct answers

Total 582 Questions | Updated On: Dec 17, 2025
Question 1

Files from multiple data sources arrive in an Amazon S3 bucket on a regular basis. A data engineer wants to ingest new files into Amazon Redshift in near real time when the new files arrive in the S3 bucket. Which solution will meet these requirements?


Answer: D
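Note: the answer choices are not reproduced above, so the sketch below is not the graded option. It illustrates one common event-driven pattern for near-real-time S3-to-Redshift ingestion: an S3 event notification invokes an AWS Lambda function that runs a COPY statement through the Amazon Redshift Data API. All bucket, table, workgroup, and role names are placeholders.

```python
# Hypothetical Lambda handler: an S3 "ObjectCreated" event notification triggers
# a Redshift COPY of the newly arrived file via the Redshift Data API.
import boto3
import urllib.parse

redshift_data = boto3.client("redshift-data")

# Placeholder identifiers -- replace with real values.
WORKGROUP = "example-workgroup"        # Redshift Serverless workgroup
DATABASE = "dev"
TARGET_TABLE = "staging.incoming_files"
COPY_ROLE_ARN = "arn:aws:iam::123456789012:role/example-redshift-copy-role"

def lambda_handler(event, context):
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])

        # COPY only the object that just arrived, giving near-real-time ingestion.
        sql = (
            f"COPY {TARGET_TABLE} "
            f"FROM 's3://{bucket}/{key}' "
            f"IAM_ROLE '{COPY_ROLE_ARN}' "
            f"FORMAT AS JSON 'auto';"
        )
        redshift_data.execute_statement(
            WorkgroupName=WORKGROUP,
            Database=DATABASE,
            Sql=sql,
        )
```

Amazon Redshift auto-copy jobs, which watch an S3 prefix and load new files continuously, are another way to get the same effect without custom code.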
Question 2

A sales company uses AWS Glue ETL to collect, process, and ingest data into an Amazon S3 bucket. The AWS Glue pipeline creates a new file in the S3 bucket every hour. File sizes vary from 200 KB to 300 KB. The company wants to build a sales prediction model by using data from the previous 5 years. The historic data includes 44,000 files. The company builds a second AWS Glue ETL pipeline by using the smallest worker type. The second pipeline retrieves the historic files from the S3 bucket and processes the files for downstream analysis. The company notices significant performance issues with the second ETL pipeline. The company needs to improve the performance of the second pipeline. Which solution will meet this requirement MOST cost-effectively?


Answer: D
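For context on the small-files problem this question describes, the sketch below shows AWS Glue's file-grouping read options (groupFiles and groupSize), a commonly used way to cut the per-file overhead of reading tens of thousands of small objects. It is illustrative only, not the graded option; the S3 path and the JSON format are assumptions.

```python
# Hypothetical AWS Glue job script: read ~44,000 small files with file grouping
# so that many small objects are combined into fewer, larger read tasks.
import sys
from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Placeholder path and format -- replace with the real historic-data prefix.
history = glue_context.create_dynamic_frame.from_options(
    connection_type="s3",
    connection_options={
        "paths": ["s3://example-sales-bucket/hourly/"],
        "recurse": True,
        # Group many small files into ~128 MB read groups (values are strings).
        "groupFiles": "inPartition",
        "groupSize": "134217728",
    },
    format="json",
)

# ...downstream transformations and writes for the prediction model go here...
job.commit()
```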
Question 3

A company needs to partition the Amazon S3 storage that the company uses for a data lake. The partitioning will be based on the S3 object key path, in the following format:

s3://bucket/prefix/year=2023/month=01/day=01

A data engineer must ensure that the AWS Glue Data Catalog synchronizes with the S3 storage when the company adds new partitions to the bucket.

Which solution will meet these requirements with the LEAST latency?


Answer: B
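One low-latency pattern for keeping the Data Catalog in step with new prefixes (again, not necessarily the graded option) is an S3 event notification that invokes a Lambda function, which registers the partition directly through the AWS Glue API. The database, table, and bucket names below are placeholders; the handler reuses the table's storage descriptor for the new partition.

```python
# Hypothetical Lambda handler: when a new object lands under a
# year=/month=/day= prefix, register that partition in the Glue Data Catalog.
import copy
import re
import boto3

glue = boto3.client("glue")

DATABASE = "example_db"      # placeholder Data Catalog database
TABLE = "example_table"      # placeholder Data Catalog table

PARTITION_RE = re.compile(r"year=(\d{4})/month=(\d{2})/day=(\d{2})")

def lambda_handler(event, context):
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        match = PARTITION_RE.search(key)
        if not match:
            continue
        year, month, day = match.groups()

        # Reuse the table's storage descriptor, pointed at the new prefix.
        table = glue.get_table(DatabaseName=DATABASE, Name=TABLE)["Table"]
        sd = copy.deepcopy(table["StorageDescriptor"])
        sd["Location"] = f"s3://{bucket}/prefix/year={year}/month={month}/day={day}/"

        try:
            glue.create_partition(
                DatabaseName=DATABASE,
                TableName=TABLE,
                PartitionInput={"Values": [year, month, day],
                                "StorageDescriptor": sd},
            )
        except glue.exceptions.AlreadyExistsException:
            pass  # Partition is already registered; nothing to do.
```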
Question 4

A company is integrating a business intelligence (BI) tool with its data warehouse, which is hosted on Microsoft SQL Server. The BI team requires regular data extracts to be transformed and stored in Amazon S3 for further analysis. The BI team needs a solution to manage this ETL process efficiently and at low cost.

Which AWS service or feature is the most cost-effective for orchestrating an ETL pipeline that extracts data from Microsoft SQL Server, transforms it, and loads it into Amazon S3?


Answer: D
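For reference, the sketch below shows the general shape of such an extract: a PySpark job (runnable on AWS Glue or Amazon EMR) reads the source table from SQL Server over JDBC and writes it to S3 as Parquet. The JDBC URL, table name, credentials, and S3 path are placeholders; in practice the credentials would come from AWS Secrets Manager or a Glue connection rather than being hard-coded.

```python
# Hypothetical PySpark job: extract a SQL Server table over JDBC and land it in
# S3 as Parquet for downstream analysis. Assumes the SQL Server JDBC driver is
# available on the cluster and that "s3://" paths resolve (as on Glue/EMR).
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("sqlserver-to-s3-extract").getOrCreate()

# Placeholder connection details -- replace with real values.
jdbc_url = "jdbc:sqlserver://example-host:1433;databaseName=SalesDB"
source_table = "dbo.DailySales"

extract = (
    spark.read.format("jdbc")
    .option("url", jdbc_url)
    .option("dbtable", source_table)
    .option("user", "example_user")
    .option("password", "example_password")
    .option("driver", "com.microsoft.sqlserver.jdbc.SQLServerDriver")
    .load()
)

# Transformations would go here; the result is written to S3 in Parquet.
(extract
    .write.mode("overwrite")
    .parquet("s3://example-analytics-bucket/extracts/daily_sales/"))
```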
Question 5

A company receives test results from testing facilities that are located around the world. The company stores the test results in millions of 1 KB JSON files in an Amazon S3 bucket. A data engineer needs to process the files, convert them into Apache Parquet format, and load them into Amazon Redshift tables. The data engineer uses AWS Glue to process the files, AWS Step Functions to orchestrate the processes, and Amazon EventBridge to schedule jobs. The company recently added more testing facilities. The time required to process files is increasing. The data engineer must reduce the data processing time.

Which solution will MOST reduce the data processing time?


Answer: B
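The bottleneck in this scenario is per-file overhead from millions of 1 KB objects. The grouped-read options shown after Question 2 apply here as well; the fragment below sketches the write side, compacting the data into a small number of larger Parquet files before the Redshift load. The prefixes and the coalesce target are placeholders, and this is illustrative rather than the graded option.

```python
# Hypothetical PySpark fragment for the existing AWS Glue job: compact the tiny
# JSON test-result records into a few larger Parquet files so the downstream
# Redshift COPY reads a handful of columnar files instead of millions of objects.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("test-results-compaction").getOrCreate()

# In practice, read the raw prefix with Glue file grouping (see the sketch after
# Question 2); a plain JSON read is shown here only to keep the fragment short.
raw = spark.read.json("s3://example-test-results/raw/")   # placeholder prefix

(raw
    .coalesce(32)                        # arbitrary target: a few large files
    .write.mode("overwrite")
    .parquet("s3://example-test-results/curated/parquet/"))  # placeholder prefix
```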

