Free Amazon DAS-C01 Exam Questions

Become Amazon Certified with updated DAS-C01 exam questions and correct answers

Total 210 Questions | Updated On: Nov 11, 2025
Question 1

An IoT company is collecting data from multiple sensors and is streaming the data to Amazon Managed Streaming for Apache Kafka (Amazon MSK). Each sensor type has its own topic, and each topic has the same number of partitions. The company is planning to turn on more sensors. However, the company wants to evaluate which sensor types are producing the most data so that the company can scale accordingly. The company needs to know which sensor types have the largest values for the following metrics: BytesInPerSec and MessagesInPerSec. Which level of monitoring for Amazon MSK will meet these requirements?


Answer: B
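For illustration only, here is a minimal boto3 sketch of raising the MSK monitoring level to one that publishes BytesInPerSec and MessagesInPerSec per topic; the cluster ARN is a placeholder:

import boto3

kafka = boto3.client("kafka", region_name="us-east-1")

# Placeholder cluster ARN; describe_cluster supplies the CurrentVersion
# string that update_monitoring requires.
cluster_arn = "arn:aws:kafka:us-east-1:111122223333:cluster/sensor-cluster/EXAMPLE-UUID"
current_version = kafka.describe_cluster(ClusterArn=cluster_arn)["ClusterInfo"]["CurrentVersion"]

# PER_TOPIC_PER_BROKER publishes BytesInPerSec and MessagesInPerSec per
# topic, which lets each sensor type's throughput be compared directly.
kafka.update_monitoring(
    ClusterArn=cluster_arn,
    CurrentVersion=current_version,
    EnhancedMonitoring="PER_TOPIC_PER_BROKER",
)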
Question 2

A manufacturing company is storing data from its operational systems in Amazon S3. The company's business analysts need to perform one-time queries of the data in Amazon S3 with Amazon Athena. The company needs to access the Athena service from the on-premises network by using a JDBC connection. The company has created a VPC. Security policies mandate that requests to AWS services cannot traverse the internet. Which combination of steps should a data analytics specialist take to meet these requirements? (Select TWO.) 


Answer: A,D
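As an illustration of the private-connectivity piece, here is a minimal boto3 sketch that creates an interface VPC endpoint (AWS PrivateLink) for Athena so JDBC requests stay on the AWS network; the VPC, subnet, and security group IDs are placeholders:

import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

# Placeholder VPC, subnet, and security group IDs. The interface endpoint
# keeps Athena JDBC traffic off the internet once the on-premises network
# reaches the VPC over AWS Direct Connect or a VPN connection.
response = ec2.create_vpc_endpoint(
    VpcEndpointType="Interface",
    VpcId="vpc-0123456789abcdef0",
    ServiceName="com.amazonaws.us-east-1.athena",
    SubnetIds=["subnet-0123456789abcdef0"],
    SecurityGroupIds=["sg-0123456789abcdef0"],
    PrivateDnsEnabled=True,
)
print(response["VpcEndpoint"]["VpcEndpointId"])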
Question 3

A manufacturing company has many IoT devices in different facilities across the world. The company is using Amazon Kinesis Data Streams to collect the data from the devices.
The company's operations team has started to observe many WriteThroughputExceeded exceptions. The operations team determines that the reason is the number of records that are being written to certain shards. The data contains device ID, capture date, measurement type, measurement value, and facility ID. The facility ID is used as the partition key.
Which action will resolve this issue?


Answer: B
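To illustrate how a higher-cardinality partition key spreads writes across shards, here is a minimal boto3 sketch that keys each record on the device ID instead of the facility ID; the stream name and record values are placeholders:

import json
import boto3

kinesis = boto3.client("kinesis", region_name="us-east-1")

# Placeholder record; the fields mirror those named in the question.
record = {
    "device_id": "sensor-00042",
    "capture_date": "2025-01-15T08:30:00Z",
    "measurement_type": "temperature",
    "measurement_value": 21.7,
    "facility_id": "facility-eu-01",
}

# Keying on the high-cardinality device ID (instead of the facility ID)
# distributes writes across shards and avoids hot-shard throttling.
kinesis.put_record(
    StreamName="iot-measurements",
    Data=json.dumps(record).encode("utf-8"),
    PartitionKey=record["device_id"],
)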
Question 4

An operations team notices that a few AWS Glue jobs for a given ETL application are failing. The AWS Glue jobs read a large number of small JSON files from an Amazon S3 bucket and write the data to a different S3 bucket in Apache Parquet format with no major transformations. Upon initial investigation, a data engineer notices the following error message in the History tab on the AWS Glue console: "Command Failed with Exit Code 1."
Upon further investigation, the data engineer notices that the driver memory profile of the failed jobs crosses the safe threshold of 50% usage quickly and reaches 90-95% soon after. The average memory usage across all executors continues to be less than 4%.
The data engineer also notices the following error while examining the related Amazon CloudWatch Logs.
What should the data engineer do to solve the failure in the MOST cost-effective way?


Answer: B
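To illustrate how S3 file grouping reduces the per-file overhead that exhausts the Glue driver, here is a minimal Glue PySpark sketch; the bucket paths are placeholders:

from awsglue.context import GlueContext
from pyspark.context import SparkContext

glue_context = GlueContext(SparkContext.getOrCreate())

# groupFiles/groupSize coalesce the many small JSON objects into larger
# read groups, so the driver tracks far fewer tasks and stays within memory.
dyf = glue_context.create_dynamic_frame.from_options(
    connection_type="s3",
    connection_options={
        "paths": ["s3://source-bucket/events/"],
        "recurse": True,
        "groupFiles": "inPartition",
        "groupSize": "134217728",  # target roughly 128 MB per read group
    },
    format="json",
)

# Write the grouped data back out as Parquet with no further transformation.
glue_context.write_dynamic_frame.from_options(
    frame=dyf,
    connection_type="s3",
    connection_options={"path": "s3://target-bucket/events-parquet/"},
    format="parquet",
)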
Question 5

A company wants to collect and process event data from different departments in near-real time. Before storing the data in Amazon S3, the company needs to clean the data by standardizing the format of the address and timestamp columns. The data varies in size based on the overall load at any given point in time. A single data record can be 100 KB-10 MB.
How should a data analytics specialist design the solution for data ingestion?


Answer: B
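To illustrate the cleaning step itself, here is a minimal Python sketch that standardizes a timestamp to UTC ISO 8601 and normalizes address whitespace and casing before a record is stored; the field names and incoming formats are assumptions, and the surrounding ingestion service is left open:

from datetime import datetime, timezone

def clean_record(record: dict) -> dict:
    # Try a few assumed incoming timestamp formats and normalize to UTC ISO 8601.
    raw_ts = record["timestamp"]
    for fmt in ("%Y-%m-%d %H:%M:%S", "%m/%d/%Y %H:%M", "%Y-%m-%dT%H:%M:%S%z"):
        try:
            parsed = datetime.strptime(raw_ts, fmt)
            break
        except ValueError:
            continue
    else:
        raise ValueError(f"Unrecognized timestamp format: {raw_ts!r}")
    if parsed.tzinfo is None:
        parsed = parsed.replace(tzinfo=timezone.utc)
    record["timestamp"] = parsed.astimezone(timezone.utc).isoformat()

    # Collapse repeated whitespace and apply consistent casing to the address.
    record["address"] = " ".join(record["address"].split()).title()
    return record

print(clean_record({"timestamp": "01/15/2025 08:30", "address": " 410  terry ave  n,  seattle "}))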

