Practice Exam

Question 28 of 75

Ingest Large Files with Event Triggering

You work for MDFT Pro, a well-known training agency that receives daily video recordings of training sessions from multiple campuses. Claire, a Data Engineer at the agency, manages a Fabric lakehouse called TrainingVideos that stores these recordings for later transcription and analysis. Each campus uploads one large video file (approximately 500 GB) to an Azure Data Lake Storage account at the end of each business day. These files must be ingested into the lakehouse without any transformation, to preserve the original quality and metadata. The ingestion should be triggered automatically whenever a new file appears in the storage account, so that the delay between upload and availability in the lakehouse is minimal. The solution must also provide optimal throughput for these large files, minimizing the time each ingestion operation takes.

Which type of Fabric item should Claire use to ingest the data?

Choose the correct answer from the options below.
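
For orientation, here is a minimal sketch of the event-driven flow the scenario describes: reacting to an Azure Event Grid BlobCreated event and landing the file, unchanged, in the lakehouse Files area through OneLake's ADLS Gen2-compatible endpoint. This is an illustration only, not the graded answer; the workspace name, the function name ingest_on_blob_created, and the assumption that the trigger payload is an Event Grid event are all hypothetical, and the client-side byte loop stands in for the parallelized, service-side transfer a pipeline Copy activity would actually perform.

```python
# Sketch, assuming an Event Grid "Microsoft.Storage.BlobCreated" payload and
# access to the lakehouse via OneLake's ADLS Gen2-compatible DFS endpoint.
from urllib.parse import urlparse

from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobClient
from azure.storage.filedatalake import DataLakeServiceClient

ONELAKE_URL = "https://onelake.dfs.fabric.microsoft.com"  # OneLake DFS endpoint
WORKSPACE = "MDFTPro"         # hypothetical Fabric workspace name
LAKEHOUSE = "TrainingVideos"  # lakehouse named in the scenario


def ingest_on_blob_created(event: dict) -> None:
    """Copy a newly uploaded video into the lakehouse Files area, unchanged."""
    # Event Grid BlobCreated events carry the source blob URL in data.url.
    source_url = event["data"]["url"]
    blob_name = urlparse(source_url).path.rsplit("/", 1)[-1]

    credential = DefaultAzureCredential()

    # Read the source blob and write it into OneLake. A pipeline Copy activity
    # would perform a parallelized, service-side transfer instead; this
    # client-side stream only illustrates the event-to-lakehouse flow.
    source = BlobClient.from_blob_url(source_url, credential=credential)
    service = DataLakeServiceClient(account_url=ONELAKE_URL, credential=credential)
    file_client = service.get_file_system_client(WORKSPACE).get_file_client(
        f"{LAKEHOUSE}.Lakehouse/Files/incoming/{blob_name}"
    )

    stream = source.download_blob()
    file_client.upload_data(stream.chunks(), overwrite=True)
```

Note the design signals in the scenario that the sketch preserves: no transformation is applied (the bytes are moved as-is), and the work starts only when a file-created event arrives rather than on a fixed schedule.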


Learn more about event-driven pipelines: Event-Driven Data Pipelines