Question

This article on data ingestion says that there are two types of data ingestion:

Data can be streamed in real time or ingested in batches.

What type of data ingestion do Azure Data Factory's pipelines use when moving data from on-premises/cloud source services to Azure services (Azure Blob Storage, Azure Data Lake Store, etc.)?

While going through the documentation, I found the following:

ADF provides the services and tooling to compose and integrate data, build data pipelines, and monitor their status in real time

Does that mean that data is streamed in real time?


Solution

According to the following picture from the documentation, Azure Data Factory uses batch ingestion.

[Figure: Azure Data Factory documentation diagram showing the batch ingest step]

You could also use triggers to move data incrementally, but that is still not real time; there is a frequency limitation on how often a trigger can fire, so each run is a discrete batch.
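As a rough illustration of that trigger-based incremental approach, here is a minimal sketch using the azure-mgmt-datafactory Python SDK. It is not an official sample: the subscription, resource group, factory, and pipeline names are placeholders, and exact model/method names (for example begin_start vs. start) can differ between SDK versions.

```python
# Sketch: attach a schedule trigger to an existing copy pipeline so it runs
# every 15 minutes. Placeholder names throughout; verify model/method names
# against the SDK version you have installed.
from datetime import datetime, timezone

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    PipelineReference,
    ScheduleTrigger,
    ScheduleTriggerRecurrence,
    TriggerPipelineReference,
    TriggerResource,
)

subscription_id = "<subscription-id>"    # placeholder
resource_group = "<resource-group>"      # placeholder
factory_name = "<data-factory-name>"     # placeholder
pipeline_name = "<copy-pipeline-name>"   # placeholder

client = DataFactoryManagementClient(DefaultAzureCredential(), subscription_id)

# Recur every 15 minutes. Each firing launches one discrete pipeline run,
# so this is incremental batch ingestion, not real-time streaming; the
# recurrence interval is the "frequency limitation" mentioned above.
recurrence = ScheduleTriggerRecurrence(
    frequency="Minute",
    interval=15,
    start_time=datetime(2024, 1, 1, tzinfo=timezone.utc),
    time_zone="UTC",
)

trigger = TriggerResource(
    properties=ScheduleTrigger(
        recurrence=recurrence,
        pipelines=[
            TriggerPipelineReference(
                pipeline_reference=PipelineReference(reference_name=pipeline_name)
            )
        ],
    )
)

# Create the trigger, then start it (older SDK versions expose start()
# instead of begin_start()).
client.triggers.create_or_update(
    resource_group, factory_name, "incremental-copy-trigger", trigger
)
client.triggers.begin_start(
    resource_group, factory_name, "incremental-copy-trigger"
).result()
```

Whichever interval you pick, every trigger firing still produces a separate pipeline run, so the data arrives in batches rather than as a continuous stream.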

Licensed under: CC-BY-SA with attribution