

Free Amazon AWS-DEA-C01 Exam Questions

Become Amazon Certified with updated AWS-DEA-C01 exam questions and correct answers

Page:    1 / 110      
Total 546 Questions | Updated On: Nov 26, 2025
Question 1

A sales company uses AWS Glue ETL to collect, process, and ingest data into an Amazon S3 bucket. The AWS Glue pipeline creates a new file in the S3 bucket every hour. File sizes vary from 200 KB to 300 KB. The company wants to build a sales prediction model by using data from the previous 5 years. The historic data includes 44,000 files. The company builds a second AWS Glue ETL pipeline by using the smallest worker type. The second pipeline retrieves the historic files from the S3 bucket and processes the files for downstream analysis. The company notices significant performance issues with the second ETL pipeline. The company needs to improve the performance of the second pipeline. Which solution will meet this requirement MOST cost-effectively?


Answer: D
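The answer options are not reproduced on this page, but performance problems when a Glue job reads tens of thousands of small objects (here, ~44,000 files of 200–300 KB) are typically addressed by grouping files so that each Spark task reads a batch of objects instead of a single one; AWS Glue exposes this through the `groupFiles` and `groupSize` connection options. The packing behavior can be sketched locally (function and parameter names are illustrative, not Glue's actual implementation):

```python
def group_files(file_sizes_kb, target_group_kb=128 * 1024):
    """Pack small files into groups of roughly a target size,
    mimicking the effect of Glue's groupFiles/groupSize options."""
    groups, current, current_size = [], [], 0
    for size in file_sizes_kb:
        # Close the current group if adding this file would exceed the target.
        if current and current_size + size > target_group_kb:
            groups.append(current)
            current, current_size = [], 0
        current.append(size)
        current_size += size
    if current:
        groups.append(current)
    return groups

# 44,000 files of ~250 KB each collapse into a few dozen read tasks.
groups = group_files([250] * 44_000)
print(len(groups))  # → 84 groups instead of 44,000 per-file tasks
```

Fewer, larger read units reduce per-task scheduling and S3 request overhead, which is usually cheaper than scaling up the worker type.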
Question 2

A Data Engineering Team is designing a system to manage a series of dependent data processing and transformation jobs. These jobs involve various AWS services, including AWS Lambda for data manipulation, Amazon S3 for data storage, and AWS Glue for ETL operations. The team needs a robust solution to orchestrate these jobs, ensuring that they execute in a specific sequence and with conditional error handling.

Which AWS service should the team use for the most effective orchestration of these data processing workflows?


Answer: B
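The answer choices are not shown here, but for orchestrating dependent Lambda, S3, and Glue steps in a fixed sequence with conditional error handling, AWS Step Functions is the usual fit. A minimal Amazon States Language sketch (all resource and state names hypothetical):

```json
{
  "Comment": "Hypothetical pipeline: Lambda transform, then Glue ETL, with error handling",
  "StartAt": "TransformData",
  "States": {
    "TransformData": {
      "Type": "Task",
      "Resource": "arn:aws:states:::lambda:invoke",
      "Parameters": { "FunctionName": "transform-fn" },
      "Catch": [ { "ErrorEquals": ["States.ALL"], "Next": "HandleFailure" } ],
      "Next": "RunGlueJob"
    },
    "RunGlueJob": {
      "Type": "Task",
      "Resource": "arn:aws:states:::glue:startJobRun.sync",
      "Parameters": { "JobName": "etl-job" },
      "Catch": [ { "ErrorEquals": ["States.ALL"], "Next": "HandleFailure" } ],
      "End": true
    },
    "HandleFailure": { "Type": "Fail", "Cause": "Pipeline step failed" }
  }
}
```

The `.sync` integration pattern makes the state machine wait for the Glue job to finish before moving on, and `Catch` blocks provide the conditional error handling the question asks about.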
Question 3

A company has a JSON file that contains personally identifiable information (PII) data and non-PII data. The company needs to make the data available for querying and analysis. The non-PII data must be available to everyone in the company. The PII data must be available only to a limited group of employees. Which solution will meet these requirements with the LEAST operational overhead?


Answer: C
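Without the answer options it is not possible to say which choice C was, but column-level access control over a shared dataset is commonly handled with AWS Lake Formation, which can grant everyone access to the non-PII columns while restricting the PII columns to a smaller group. A hedged CLI sketch (account ID, role, database, table, and column names are all hypothetical):

```shell
# Grant the all-employees role SELECT on every column except the PII ones
aws lakeformation grant-permissions \
  --principal DataLakePrincipalIdentifier=arn:aws:iam::123456789012:role/AllEmployees \
  --permissions "SELECT" \
  --resource '{"TableWithColumns": {
      "DatabaseName": "sales_db",
      "Name": "customers",
      "ColumnWildcard": {"ExcludedColumnNames": ["ssn", "email"]}}}'
```

A second grant without the exclusion list would give the restricted group access to the full table, with no separate copies of the data to maintain.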
Question 4

Files from multiple data sources arrive in an Amazon S3 bucket on a regular basis. A data engineer wants to ingest new files into Amazon Redshift in near real time when the new files arrive in the S3 bucket. Which solution will meet these requirements?


Answer: D
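The options are not listed here, but near-real-time ingestion of newly arriving S3 files into Amazon Redshift is typically done with a Redshift auto-copy job, which watches an S3 prefix and loads new objects as they land. A sketch of the SQL (table, bucket, and role names hypothetical):

```sql
-- Hypothetical auto-copy job: loads each new file under the prefix as it arrives
COPY sales_staging
FROM 's3://example-bucket/incoming/'
IAM_ROLE 'arn:aws:iam::123456789012:role/RedshiftCopyRole'
FORMAT AS CSV
JOB CREATE ingest_sales_job
AUTO ON;
```

This avoids building a custom S3-event-to-Lambda pipeline: Redshift tracks which files have already been loaded and runs the COPY automatically.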

© Copyright DumpsCertify 2025. All Rights Reserved
