Bulk loading data to Azure SQL

In this tutorial, we will learn how to bulk load data into Azure SQL and walk through several bulk insert techniques with examples.

Bulk import and export to Azure SQL database

Bulk importing or bulk loading means importing a large amount of data from a file into a database. If you encounter the need to import a large amount of data from a file into an Azure SQL database, you can use various techniques.

There are four major ways of doing it:

  1. Using the bcp utility
  2. BULK INSERT statement
  3. INSERT … SELECT * FROM OPENROWSET() statement
  4. Using the SQL Server Import and Export Wizard

I will explain each of these techniques in the sections below with the help of examples.

Bulk export using BCP utility

Before bulk importing the data, let’s also understand bulk exporting it into a file. We will use the BCP utility to export data from an Azure SQL database to a local file.

Open the command prompt, make changes according to your details in the below command, and then execute it.

bcp dbo.Department out C:\Users\Blades\Documents\Customers.txt -c -t -S mysql1000.database.windows.net -d DemoDatabase -G -U Blades@fewlines4bijuhotmail.onmicrosoft.com -P Test@123
  • In the above command, we first specify the name of the table whose data we want to export.
  • The out keyword specifies that we are exporting data to a data file, and the path to that file follows it.
  • The -c flag exports the data in character (text) format, and the -t flag sets the field terminator (a tab character by default). Note that -t does not mean a trusted connection; that is the uppercase -T flag, which cannot be combined with -G, -U, and -P.
  • The -S flag specifies the server name, mysql1000.database.windows.net.
  • The -G flag specifies that Azure Active Directory authentication is used for authenticating to the database.
  • The -U and -P flags specify the Azure Active Directory username and the password, respectively.
  • The -d flag specifies the database name to connect to.

Once the command executes successfully, you will see output like the image below:

[Image: bcp export in Azure SQL database]

Now in your specified path, you can see that a file is created and the data is copied into the file.

[Image: bcp export in Azure SQL database]

This way, you can export data into a file using the BCP utility.

Also, check: Backup Azure database to local SQL Server

Bulk import using BCP utility

Now we will use the BCP utility to bulk load data into an Azure SQL database table.

Firstly, I will delete all the rows from the dbo.Department table using the DELETE statement as below:

DELETE FROM [dbo].[Department]
[Image: bcp import example]

You can see that the table is now empty. Now I will use the BCP utility to bulk import data into this table. Look at the command below:

bcp dbo.Department in C:\Users\Blades\Documents\Customers.txt -c -t -S mysql1000.database.windows.net -d DemoDatabase -G -U Blades@fewlines4bijuhotmail.onmicrosoft.com -P Test@123

The above command is almost the same as the one we used to export the data. The only difference is that the out keyword is changed to in, specifying that this is an import operation.

[Image: bcp import example]

Now let me verify the data in my Azure SQL database. I will fetch all the records from the dbo.Department table.

[Image: bcp import example]

You can see that all the records from the text file are imported into the table in my Azure SQL database.
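For instance, you can run a quick T-SQL sanity check against the table (the table name is from this example) to confirm the import:

```sql
-- Compare the count with the number of rows bcp reported as copied
SELECT COUNT(*) AS TotalRows FROM dbo.Department;

-- Eyeball a few of the imported rows
SELECT TOP (5) * FROM dbo.Department;
```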

This is how you can use the BCP utility to bulk load data into an Azure SQL database.

Also, read: How to rename a database in Azure SQL

Bulk insert CSV into Azure SQL database

In this section, you will learn how to use the SQL Server Import and Export Wizard to bulk insert a CSV file into an Azure SQL database.

I have created a sample CSV file for this demonstration purpose. This CSV file stores data of random persons.

[Image: bulk insert data to Azure SQL]

I will use this CSV file to create a new table in my Azure SQL database. The steps involved in the process are below.

  • Run the SQL Server Import and Export Wizard from the Start menu or SQL Server Management Studio.
  • Select the Flat File Source option from the drop-down menu in the data source options.
  • Click on Browse and select your CSV file. Click on Next.
[Image: bulk loading data to Azure SQL]
  • Now, you will see a preview of your CSV file. Since you are selecting a CSV file as the source file, the Column Delimiter option must be set to Comma. Once you have reviewed it, click on Next.
  • Now you have to enter the destination details, i.e., the database details. Set the Destination option to .NET Framework Data Provider for SQL Server.
  • In the ConnectionString column, enter the connection string you use to connect to your database.
  • The string below is a sample connection string. Enter your database details in this string and paste it into the ConnectionString column. Click on Next.
Server=tcp:<server name>.database.windows.net,1433;Initial Catalog=<database name>;Persist Security Info=False;User ID=<database user name>;Password=<password>;MultipleActiveResultSets=False;Encrypt=True;TrustServerCertificate=False;Connection Timeout=30;
[Image: bulk insert CSV into Azure SQL database]
  • Here, you will be asked to select the tables from the CSV file that you want to copy. You will also see the destination table name. You can change the name of this destination table if you want. Otherwise, click on Next.
[Image: bulk insert CSV Azure]
  • After this step, review the data type mapping of the columns and then click Next.
  • You will now be asked to save the SSIS package so that you can run it again or use it in automation. Click on Next and then click on Finish.
  • You will see an output like in the below screen if the process successfully completes.
[Image: bulk insert CSV Azure SQL]

Now you can verify in the database that the table has been created and the data has been imported.

[Image: Azure SQL database bulk insert CSV]

This is how you can bulk insert data from a CSV file into an Azure SQL database.

Read: How to Connect to Azure SQL database

Bulk insert from Azure blob storage to Azure SQL database

If you want to use the BULK INSERT statement to insert data into an Azure SQL database, you must store your data in an Azure blob container. To do this, you need to create a storage account.

If you don’t know how to create a storage account in Azure, read our article “Backup and restore SQL Server to Azure blob storage.”

The first step is to upload your file to the Azure blob container. For this example, I will use the same CSV file that I created in the above section. I will also delete all the data from the dbo.TestData table.

We do not have to drop the table because we need its structure to insert data into it.
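As a side note, TRUNCATE TABLE is usually a faster way to empty a large table than DELETE, assuming you have ALTER permission on the table and no foreign keys reference it:

```sql
-- Empties dbo.TestData while keeping its structure; minimally logged,
-- unlike DELETE, but requires ALTER permission and no referencing foreign keys
TRUNCATE TABLE dbo.TestData;
```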

Now follow the below steps to bulk insert the data from the CSV file into the Azure SQL database table:

  • Open the Query Editor window of the Azure SQL database.
  • Create a master key for encryption purposes using the below query:
CREATE MASTER KEY ENCRYPTION BY PASSWORD = 'Your Password';
GO
  • The next step is to create a database scoped credential and an external data source that points to your blob container. You can skip the credential if you have enabled public access to your blob container. Execute the below commands:
CREATE DATABASE SCOPED CREDENTIAL MyAzureCredential
 WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
 SECRET = '******srt=sco&sp=rwac&se=2017-02-01T00:55:34Z&st=2016-12-29T16:55:34Z***************';
CREATE EXTERNAL DATA SOURCE MyAzureStorage
WITH ( TYPE = BLOB_STORAGE,
          LOCATION = 'https://groupazurestorage.blob.core.windows.net/sqlserverbackupdb'
          , CREDENTIAL= MyAzureCredential
);
  • In the above command, you must use your Azure blob container’s Shared Access Signature (SAS) token as the SECRET.
  • To generate a SAS token, open your Azure blob container and click on Shared access tokens. Enter the details, such as the type of permission and the period for which the token will be valid, then click on Generate SAS token and URL.
[Image: bulk loading data to Azure SQL]
  • Paste this SAS token (without the leading question mark) into the SECRET clause of the above CREATE DATABASE SCOPED CREDENTIAL command.
  • Finally, execute the BULK INSERT command to import the data from the data file into the Azure SQL database table.
BULK INSERT [dbo].[TestData]
FROM 'TestData.csv'
WITH (DATA_SOURCE = 'MyAzureStorage',
FORMAT = 'CSV');
[Image: bulk insert Azure blob storage]
  • Now you can verify in the table that the rows are successfully inserted.
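As a sketch, here is the same BULK INSERT with two more commonly used options: FIRSTROW to skip a header row and FIELDQUOTE to handle quoted values. The option names come from the BULK INSERT syntax; whether your file has a header row or quoted fields is an assumption about your data:

```sql
BULK INSERT [dbo].[TestData]
FROM 'TestData.csv'
WITH (
    DATA_SOURCE = 'MyAzureStorage',  -- external data source created above
    FORMAT = 'CSV',
    FIRSTROW = 2,                    -- skip a header row, if your file has one
    FIELDQUOTE = '"'                 -- character used to quote fields (double quote is the default)
);
```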

This is how you can use the BULK INSERT statement to insert data into an Azure SQL database in bulk.

Note: You do not need to create a database scoped credential if you have enabled public access to your Azure blob container.

Read: Pause and resume Azure SQL database

Azure SQL database bulk insert JSON

This section will teach you how to bulk insert data from a JSON file into an Azure SQL database table.

For this example, I will use the OPENROWSET function to read data from a JSON file and insert it into an Azure SQL database table.

I have created a sample JSON file containing data for an order details table named OrderDetails.

[Image: Azure SQL database bulk insert JSON]
  • Firstly, upload this file to an Azure blob container.
  • If you have not enabled public access to your Azure blob container, create a database scoped credential that the database will use to connect to it.
  • Then, create an external data source that points to the Azure blob container.
  • These steps were explained in the previous section; follow them to create the credential and the external data source.

Once you have completed the above steps, you must create a table in the database into which you will insert this data. In my example, I created a table named dbo.OrderDetails.
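As a sketch, such a table could be created like this, with column names and types mirroring the OPENJSON column mapping used in this example:

```sql
-- Destination table for the JSON data; columns match the
-- OPENJSON WITH clause used in this example
CREATE TABLE dbo.OrderDetails
(
    [Order Number] int,
    [Amount] real,
    [Date] datetime,
    [Customer ID] int,
    [Salesman ID] int
);
```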

  • Now execute the below T-SQL query to bulk insert data into the table:
DECLARE @json varchar(max);

SELECT @json = BulkColumn
FROM OPENROWSET(
    BULK 'OrderDetails.json',
    DATA_SOURCE = 'MyAzureStorage',
    SINGLE_CLOB
) AS DataFile;

INSERT INTO dbo.OrderDetails
SELECT * FROM OPENJSON(@json)
WITH
(
    [Order Number] int,
    [Amount] real,
    [Date] datetime,
    [Customer ID] int,
    [Salesman ID] int
);
  • In the above code, firstly, I have declared a variable where I will store my JSON data.
  • Then I used the SELECT statement with the OPENROWSET() function to fetch the JSON data from the file stored in the Azure blob container.
  • Finally, I inserted this data into the dbo.OrderDetails table.
  • I have used the OPENJSON table-valued function to read and format the JSON data.
  • I have defined the columns in which the values inside the JSON data will be stored along with their data types.
  • Ensure these columns and their data types match your columns and their data types in the Azure SQL database table.
[Image: Azure SQL bulk insert JSON]
  • The number of rows affected is shown. This means the data was successfully inserted into the table.
  • You can verify in the database that the data is stored in the table.
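The same import can also be written as a single statement, without the intermediate variable, by applying OPENJSON directly to the column returned by OPENROWSET. This is a sketch using the same file, data source, and table names as in this example:

```sql
-- Single-statement variant: read the file and shred the JSON in one go
INSERT INTO dbo.OrderDetails
SELECT d.*
FROM OPENROWSET(
         BULK 'OrderDetails.json',
         DATA_SOURCE = 'MyAzureStorage',
         SINGLE_CLOB
     ) AS DataFile
CROSS APPLY OPENJSON(DataFile.BulkColumn)
WITH
(
    [Order Number] int,
    [Amount] real,
    [Date] datetime,
    [Customer ID] int,
    [Salesman ID] int
) AS d;
```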

Hence, you can import or bulk insert data stored in a JSON file into an Azure SQL database.

Read: Reset Password Azure SQL database

Bulk insert using Azure Data Studio

This section will show you an example of a bulk insert operation using Azure Data Studio. I will use the BULK INSERT statement to insert data into an Azure SQL database in bulk.

In this example, I will insert data from a CSV file stored in an Azure blob container. The CSV file looks like this:

[Image: bulk insert Azure Data Studio]
  • The first step is to open the Azure Data Studio and connect to your Azure SQL database.
  • Create a master key in the database if you have not created it earlier by executing the below T-SQL query in Azure Data Studio:
CREATE MASTER KEY ENCRYPTION BY PASSWORD = 'Your Password';
GO
  • If you have configured the Azure blob container to accept only private access, then create a database-scoped credential. Otherwise, if the access is set to public, you can ignore this step.
CREATE DATABASE SCOPED CREDENTIAL MyAzureCredential
 WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
 SECRET = '******srt=sco&sp=rwac&se=2017-02-01T00:55:34Z&st=2016-12-29T16:55:34Z***************';
  • In the above code, SECRET is the SAS token you can generate from your Azure blob container.
  • Then create an external data source by executing the below T-SQL query:
CREATE EXTERNAL DATA SOURCE MyAzureStorage
WITH ( TYPE = BLOB_STORAGE,
          LOCATION = 'https://groupazurestorage.blob.core.windows.net/sqlserverbackupdb'
          , CREDENTIAL= MyAzureCredential
);
  • In the above code snippet, LOCATION is the URL of your Azure blob container.
  • Finally, execute the BULK INSERT command to insert data into the table from the CSV file.
BULK INSERT [dbo].[EmployeeDetails]
FROM 'Employees.csv'
WITH (DATA_SOURCE = 'MyAzureStorage',
FORMAT = 'CSV',
FIRSTROW = 2);
  • In the above code snippet, FIRSTROW is set to 2 because the data rows start from the second line of the CSV file, as the first line contains the column headers.
[Image: bulk insert from Azure blob storage to Azure SQL database]

The number of rows affected is 11. This means that all 11 records are inserted into the table.

Hence, you can use the BULK INSERT statement in Azure Data Studio to insert data from a file into an Azure SQL database in bulk.
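When you are done, you can remove the credential and the external data source created for the demo if you no longer need them. A sketch, using the object names from this example:

```sql
-- Drop the external data source first, since it references the credential
DROP EXTERNAL DATA SOURCE MyAzureStorage;
DROP DATABASE SCOPED CREDENTIAL MyAzureCredential;
```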

Read: Migrate SQL Server to Azure SQL database

Azure SQL bulk insert from local

Azure SQL does not support reading files from your local machine in T-SQL bulk operations such as BULK INSERT and OPENROWSET(BULK …). Therefore, you cannot carry out bulk insert operations directly from local storage.

If you try to execute a BULK INSERT statement like I have defined below, you will face an error, as shown in the screen below.

My Code:

SELECT * FROM OPENROWSET(
   BULK 'C:\Users\Blades\Documents\TestData.csv',
   SINGLE_CLOB) AS DATA;
[Image: Azure SQL bulk insert from local]

However, there is an alternative method to do it.

  • To carry out bulk insert operations, you must create an Azure storage account and upload your file to it.
  • After that, you have to create a credential in the database so that it can connect to the Azure storage. Then, you will be able to carry out bulk insert operations.
  • I have explained this method in one of the sections above with an example.
  • Alternatively, the bcp utility can import a local file directly (as shown in the bcp import section above), because bcp runs on your client machine and sends the rows to the server.

Also, take a look at some more Azure SQL tutorials.

This is how you can bulk load data into Azure SQL in various ways.
