Azure Data Factory supports compressing and decompressing data during copy. When you specify the compression property in an input dataset, the copy activity reads the compressed data from the source and decompresses it; and when you specify the property in an output dataset, the copy activity compresses the data and then writes it to the sink. Here are a few sample scenarios:

- Read GZip-compressed data from an Azure blob, decompress it, and write the result data to an Azure SQL database. You define the input Azure Blob dataset with the compression type property as GZip.
- Read data from a plain-text file in an on-premises File System, compress it using the GZip format, and write the compressed data to an Azure blob. You define the output Azure Blob dataset with the compression type property as GZip.
- Read a .zip file from an FTP server, decompress it to get the files inside, and land those files in Azure Data Lake Store. You define an input FTP dataset with the compression type property as ZipDeflate.
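As a rough sketch of the first scenario, an input Azure Blob dataset with GZip compression might look like the following (dataset, linked service, and folder names here are placeholders, not values from the original post):

```json
{
  "name": "InputBlobDataset",
  "properties": {
    "type": "AzureBlob",
    "linkedServiceName": "AzureStorageLinkedService",
    "typeProperties": {
      "folderPath": "mycontainer/inputdata/",
      "format": { "type": "TextFormat" },
      "compression": {
        "type": "GZip",
        "level": "Optimal"
      }
    }
  }
}
```

Because the `compression` element appears on the input dataset, the copy activity decompresses the blobs as it reads them; placing the same element on an output dataset instead would make the activity compress the data before writing it to the sink.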