What is Azure Data Factory?
A cloud-based ETL/ELT service for orchestration and data integration.
What is a pipeline?
A logical container for one or more activities.
What is an activity?
A single step/task inside a pipeline.
What is a dataset?
A metadata reference to data (table, folder, or file).
What is a linked service?
A secure connection definition for a data store or compute resource.
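As a sketch, a linked service is declared as a JSON document; the name below is hypothetical and the connection string is a placeholder (in practice it would be a secure string or an Azure Key Vault reference):

```json
{
  "name": "ExampleBlobStorage",
  "properties": {
    "type": "AzureBlobStorage",
    "typeProperties": {
      "connectionString": "<secure string or Key Vault reference>"
    }
  }
}
```

A dataset then points at this linked service by name through a `LinkedServiceReference`.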
What is an integration runtime?
Compute infrastructure used to execute activities.
What are the 3 IR types?
Azure IR, Self-hosted IR, and Azure-SSIS IR.
What is the purpose of triggers?
Automated pipeline execution.
What are the 3 trigger types?
Schedule, Tumbling Window, Event.
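As an illustration, a schedule trigger definition might look like the sketch below; the trigger name, start time, and pipeline name are assumptions:

```json
{
  "name": "DailyTrigger",
  "properties": {
    "type": "ScheduleTrigger",
    "typeProperties": {
      "recurrence": {
        "frequency": "Day",
        "interval": 1,
        "startTime": "2024-01-01T00:00:00Z"
      }
    },
    "pipelines": [
      {
        "pipelineReference": {
          "referenceName": "ExamplePipeline",
          "type": "PipelineReference"
        }
      }
    ]
  }
}
```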
What is control flow?
Pipeline logic: branching, looping, and conditional execution.
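For illustration, one control-flow construct is the ForEach activity; the sketch below assumes a hypothetical `fileList` pipeline parameter, and the inner Copy activity is abbreviated:

```json
{
  "name": "ForEachFile",
  "type": "ForEach",
  "typeProperties": {
    "items": {
      "value": "@pipeline().parameters.fileList",
      "type": "Expression"
    },
    "activities": [
      { "name": "CopyOneFile", "type": "Copy" }
    ]
  }
}
```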
What is a pipeline run?
An execution instance of a pipeline.
What is ARM deployment?
Infrastructure-as-code deployment of ADF resources via ARM templates.
What is Git mode in ADF?
Development mode enabling version control.
What is live mode?
The published, production state of the factory; the service runs pipelines from here, not from Git branches.
Where are pipeline definitions stored?
As JSON definitions (in the service, and in the Git repository when Git mode is enabled).
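A minimal pipeline definition, as a sketch, assuming two pre-existing datasets named SourceDataset and SinkDataset:

```json
{
  "name": "ExampleCopyPipeline",
  "properties": {
    "activities": [
      {
        "name": "CopyData",
        "type": "Copy",
        "inputs": [
          { "referenceName": "SourceDataset", "type": "DatasetReference" }
        ],
        "outputs": [
          { "referenceName": "SinkDataset", "type": "DatasetReference" }
        ],
        "typeProperties": {
          "source": { "type": "DelimitedTextSource" },
          "sink": { "type": "DelimitedTextSink" }
        }
      }
    ]
  }
}
```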
Does ADF store data?
No - it only orchestrates and moves it.
What is PolyBase used for in ADF?
High-throughput loading into Azure Synapse Analytics (formerly SQL Data Warehouse).
Can ADF call Databricks notebooks?
Yes, via the Databricks Notebook Activity.
Can ADF call Azure Functions?
Yes, via Azure Function Activity.
Can ADF call REST APIs?
Yes, via Web Activity.
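A hedged Web Activity sketch; the URL is a placeholder and the body shape is an assumption:

```json
{
  "name": "CallRestApi",
  "type": "WebActivity",
  "typeProperties": {
    "url": "https://example.com/api/trigger",
    "method": "POST",
    "body": { "runId": "@pipeline().RunId" }
  }
}
```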
Can ADF generate files?
Yes, via Copy Activity sink or Data Flow.
What is a global parameter?
A factory-level constant that can be referenced across all pipelines.
What is dynamic content?
Expressions evaluated at runtime, built from pipeline parameters, system variables, and built-in functions.
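Two example expressions, as a sketch; `concat`, `formatDateTime`, and `utcnow` are ADF expression-language functions, while the parameter name and folder layout are assumptions:

```
@pipeline().parameters.tableName
@concat('raw/', formatDateTime(utcnow(), 'yyyy/MM/dd'), '/data.csv')
```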
Does ADF support schema drift?
Yes - especially in Data Flows.