Bulk loading data to Azure SQL

In this tutorial, we will learn about bulk loading data to Azure SQL and look at various bulk insert techniques along with examples.

  • Bulk import and export to Azure SQL database
  • Bulk export using BCP utility
  • Bulk import using BCP utility
  • Bulk insert CSV into Azure SQL database
  • Bulk insert from Azure blob storage to Azure SQL database
  • Azure SQL database bulk insert JSON
  • Bulk insert Azure data studio
  • Azure SQL bulk insert from local

Bulk import and export to Azure SQL database

Bulk importing, or bulk loading, means importing a large amount of data from a file into a database. If you need to load a large amount of data from a file into an Azure SQL database, there are several techniques you can use.

There are four main ways of doing it:

  1. Using the bcp utility
  2. BULK INSERT statement
  3. INSERT … SELECT * FROM OPENROWSET() statement
  4. Using the SQL Server Import and Export Wizard

I will explain each of these techniques in the sections below with the help of examples.

Bulk export using BCP utility

Before bulk importing the data, let us also understand bulk exporting the data into a file. We will use the BCP utility to bulk export data from an Azure SQL database to a local file.

Open the command prompt, adjust the below command according to your details, and then execute it.

bcp dbo.Department out C:\Users\Blades\Documents\Customers.txt -c -t -S mysql1000.database.windows.net -d DemoDatabase -G -U Blades@fewlines4bijuhotmail.onmicrosoft.com -P Test@123
  • In the above command, we first specify the name of the table whose data we want to export.
  • The out flag specifies that we are exporting the data to a data file, and the path to the data file follows the out flag.
  • The -c flag copies the data in character format, and the -t flag sets the field terminator.
  • The -S flag specifies the server name, mysql1000.database.windows.net in this example.
  • The -G flag specifies that Azure Active Directory authentication is used to authenticate to the database.
  • The -U and -P flags specify the Active Directory username and password respectively.
  • The -d flag specifies the database to connect to.

Once the command executes successfully, you will see output like the image below:

Exporting data to a file using the BCP utility

Now in your specified path, you can see that a file is created and the data is copied into the file.

Data exported into a text file

In this way, you can use the BCP utility to export data into a file.
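If you want the exported file to be comma-separated instead, you can pass a field terminator to the -t flag. The command below is only a sketch based on the example above (the output file name Department.csv is my own choice); adjust the path, server, database, and credentials to your environment.

bcp dbo.Department out C:\Users\Blades\Documents\Department.csv -c -t, -S mysql1000.database.windows.net -d DemoDatabase -G -U Blades@fewlines4bijuhotmail.onmicrosoft.com -P Test@123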

Also, check: Backup Azure database to local SQL Server

Bulk import using BCP utility

Now we will use the BCP utility to bulk load data into an Azure SQL database table.

Firstly, I will delete all the rows from the dbo.Department table using the DELETE statement as below:

DELETE FROM [dbo].[Department]
Deleting all the rows of the table

You can see that the table is now empty. Now I will use the BCP utility to bulk import data into this table. Look at the command below:

bcp dbo.Department in C:\Users\Blades\Documents\Customers.txt -c -t -S mysql1000.database.windows.net -d DemoDatabase -G -U Blades@fewlines4bijuhotmail.onmicrosoft.com -P Test@123

The above command is almost the same as the one we used to export the data. The only difference is that the out flag is replaced with the in flag to specify an import operation.

Importing data into the table using BCP utility

Now let me verify the data in my Azure SQL database. I will fetch all the records of the dbo.Department table.
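The verification query is a plain SELECT over the table:

-- Fetch all rows imported by the bcp command
SELECT * FROM [dbo].[Department];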

Data imported into the Azure SQL database table

You can see that all the records from the text file are imported into the table in my Azure SQL database.

Thus, you have learned how to bulk load data into an Azure SQL database using the BCP utility.

Also, read: How to rename a database in Azure SQL

Bulk insert CSV into Azure SQL database

In this section, you will learn how to use the SQL Server Import and Export Wizard to bulk insert data from a CSV file into an Azure SQL database.

I have created a sample CSV file for this demonstration purpose. This CSV file stores data of random persons.

Sample CSV file

I will create a new table in my Azure SQL database using this CSV file. Below are the steps involved in the process.

  • Run the SQL Server Import and Export Wizard from the Start menu or from SQL Server Management Studio.
  • In the data source options, select the Flat File Source option from the drop-down menu.
  • Click on Browse and select your CSV file. Click on Next.
Select data source
  • Now you will see a preview of your CSV file. Since the source is a CSV file, the Column Delimiter option must be set to Comma. Once reviewed, click on Next.
  • Now you have to enter the destination details, i.e. the database details. Set the Destination option to .NET Framework Data Provider for SqlServer.
  • In the ConnectionString column, you have to enter the connection string that you use to connect to your database.
  • The below string is a sample connection string. Enter your database details in this string and paste this string into the ConnectionString column. Click on Next.
Server=tcp:<server name>.database.windows.net,1433;Initial Catalog=<database name>;Persist Security Info=False;User ID=<database user name>;Password=<password>;MultipleActiveResultSets=False;Encrypt=True;TrustServerCertificate=False;Connection Timeout=30;
Select a destination
  • Here you will be asked to select the tables from the CSV file that you want to copy. You will also see the destination table name here. You can change this destination table name if you want. Otherwise, click on Next.
Selecting the tables to copy
  • After this step, review the data type mapping of the columns and then click Next.
  • Now you will be asked to save the SSIS package so that you can run it again or use it in automation. Click on Next and then click on Finish.
  • You will see an output like the below screen if the process completes successfully.
Output on successful execution

Now you can verify in the database that the table is created and the data is also imported.
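For a quick check, you can query the new table directly. In my case the wizard created the table as dbo.TestData (the same table referenced in the next section); adjust the name if you changed it in the wizard:

-- Preview a few rows of the table created by the Import and Export Wizard
SELECT TOP (10) * FROM dbo.TestData;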

Data imported in the Azure SQL database

Thus, you have learned how to bulk insert data from a CSV file into an Azure SQL database.

Read: How to Connect to Azure SQL database

Bulk insert from Azure blob storage to Azure SQL database

If you want to use the BULK INSERT statement to insert data into an Azure SQL database, you must have your data stored in an Azure blob container. For this, you need to create a storage account.

If you don’t know how to create a storage account in Azure, you can read our article Backup and restore SQL Server to Azure blob storage.

The first thing is to upload your file to the Azure blob container. For this example, I will be using the same CSV file that I created in the above section. Also, I will delete all the data from the dbo.TestData table.

We do not have to drop the table because we need the structure of the table in order to insert data into it.
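The cleanup statement follows the same pattern used earlier for dbo.Department:

-- Remove all rows but keep the table structure for the upcoming BULK INSERT
DELETE FROM [dbo].[TestData];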

Now follow the below steps to bulk insert the data from the CSV file into the Azure SQL database table:

  • Open the Query Editor window of the Azure SQL database.
  • Create a Master key for encryption purposes using the below query:
CREATE MASTER KEY ENCRYPTION BY PASSWORD = 'Your Password';
GO
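If you are not sure whether the database already has a master key, you can make the statement conditional. This is only a sketch; reuse your own password:

-- Create the master key only if one does not exist yet
IF NOT EXISTS (SELECT * FROM sys.symmetric_keys WHERE name = '##MS_DatabaseMasterKey##')
BEGIN
    CREATE MASTER KEY ENCRYPTION BY PASSWORD = 'Your Password';
END
GO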
  • The next step is to create a database scoped credential and an external data source. If you have enabled public access for your blob container, you can skip the credential and drop the CREDENTIAL option from the data source; otherwise, execute the below commands:
CREATE DATABASE SCOPED CREDENTIAL MyAzureCredential
 WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
 SECRET = '******srt=sco&sp=rwac&se=2017-02-01T00:55:34Z&st=2016-12-29T16:55:34Z***************';

CREATE EXTERNAL DATA SOURCE MyAzureStorage
WITH ( TYPE = BLOB_STORAGE,
       LOCATION = 'https://groupazurestorage.blob.core.windows.net/sqlserverbackupdb',
       CREDENTIAL = MyAzureCredential
);
  • In the above command, you have to use the Shared Access Signature (SAS) token of your Azure blob container as the SECRET, without the leading question mark.
  • To generate a SAS token, open your Azure blob container and click on Shared access tokens. Enter the details such as the type of permission and the time period for which the token will be valid, then click on Generate SAS token and URL.
Generate SAS token
  • Paste this SAS token into the SECRET option of the CREATE DATABASE SCOPED CREDENTIAL command above.
  • Finally, execute the BULK INSERT command to import the data from the data file into the Azure SQL database table.
BULK INSERT [dbo].[TestData]
FROM 'TestData.csv'
WITH (DATA_SOURCE = 'MyAzureStorage',
FORMAT = 'CSV');
Bulk insert data into Azure SQL database
  • Now you can verify in the table that the rows are successfully inserted.

Thus, you have learned how to use the BULK INSERT statement to bulk insert data into an Azure SQL database.

Note: You do not need to create a database scoped credential if you have enabled public access to your Azure blob container.

Read: Pause and resume Azure SQL database

Azure SQL database bulk insert JSON

In this section, you will learn how you can bulk insert data from a JSON file into an Azure SQL database table.

For this example, I will be using the OPENROWSET function to read data from a JSON file and insert it into an Azure SQL database table.

I have created a sample JSON file, OrderDetails.json, that contains order details data.

Sample JSON file
  • Firstly, I have to upload this file into an Azure blob container.
  • Also, if you have not enabled public access to your Azure blob container, you have to create a credential in your database that you will use to connect to your Azure blob container.
  • Secondly, you have to create an external data source i.e. you have to configure the Azure blob container as an external data source.
  • These steps are explained in the previous section. You can follow them to create credentials and an external data source.

Once you have completed the above steps, you have to create a table in the database in which you will insert this data. I have created a table named dbo.OrderDetails in my example.
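Here is a sketch of a possible definition for that table, based on the columns used in the OPENJSON query below; adjust the data types and sizes to your own data:

-- Destination table for the JSON data; columns match the OPENJSON WITH clause below
CREATE TABLE dbo.OrderDetails
(
    [Order Number] int,
    [Amount]       real,
    [Date]         datetime,
    [Customer ID]  int,
    [Salesman ID]  int
);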

  • Now execute the below T-SQL query to bulk insert data into the table:
DECLARE @json varchar(max);

SELECT @json = BulkColumn
FROM OPENROWSET(
    BULK 'OrderDetails.json',
    DATA_SOURCE = 'MyAzureStorage',
    SINGLE_CLOB
) AS DataFile;

INSERT INTO dbo.OrderDetails
SELECT * FROM OPENJSON(@json)
WITH
(
    [Order Number] int,
    [Amount] real,
    [Date] DATETIME,
    [Customer ID] int,
    [Salesman ID] int
);
  • In the above code, I first declare a variable in which the JSON data will be stored.
  • Then I use a SELECT statement with the OPENROWSET() function to fetch the JSON data from the file stored in the Azure blob container.
  • Finally, I insert this data into the dbo.OrderDetails table.
  • The OPENJSON table-valued function reads and formats the JSON data.
  • The WITH clause defines the columns into which the values from the JSON data will be stored, along with their data types.
  • Make sure these columns and their data types match the columns and data types of your Azure SQL database table.
Bulk insert JSON data into an Azure SQL database
  • You can see the number of rows affected. This means the data was successfully inserted into the table.
  • You can verify in the database that the data is stored in the table.

Hence, in this way you can import or bulk insert data stored in a JSON file into an Azure SQL database.

Read: Reset Password Azure SQL database

Bulk insert Azure data studio

In this section, I will show you an example of a bulk insert operation using Azure Data Studio. I will use the BULK INSERT statement to bulk insert data into an Azure SQL database.

In this example, I will insert data from a CSV file stored in an Azure blob container. The CSV file will look like this:

The sample CSV file
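Because BULK INSERT loads into an existing table, the destination table dbo.EmployeeDetails must already exist and its columns must line up with the columns in the CSV file. The definition below is only an illustrative sketch; the column names are assumptions, so replace them with the actual headers of your Employees.csv file:

-- Hypothetical destination table; replace the columns with the ones in your CSV file
CREATE TABLE [dbo].[EmployeeDetails]
(
    [EmployeeID] int,
    [Name]       varchar(100),
    [Department] varchar(100)
);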
  • The first step is to open the Azure Data Studio and connect to your Azure SQL database.
  • Create a master key in the database if you have not created it earlier by executing the below T-SQL query in Azure Data Studio:
CREATE MASTER KEY ENCRYPTION BY PASSWORD = 'Your Password';
GO
  • Then create a database scoped credential if you have configured the Azure blob container to accept only private access. Otherwise, if the access is set to public access, you can ignore this step.
CREATE DATABASE SCOPED CREDENTIAL MyAzureCredential
 WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
 SECRET = '******srt=sco&sp=rwac&se=2017-02-01T00:55:34Z&st=2016-12-29T16:55:34Z***************';
  • In the above code, SECRET is the SAS token that you can generate from your Azure blob container.
  • Then create an external data source by executing the below query:
CREATE EXTERNAL DATA SOURCE MyAzureStorage
WITH ( TYPE = BLOB_STORAGE,
       LOCATION = 'https://groupazurestorage.blob.core.windows.net/sqlserverbackupdb',
       CREDENTIAL = MyAzureCredential
);
  • In the above code snippet, LOCATION is the URL of your Azure blob container.
  • Finally, execute the BULK INSERT command to insert data into the table from the CSV file.
BULK INSERT [dbo].[EmployeeDetails]
FROM 'Employees.csv'
WITH (DATA_SOURCE = 'MyAzureStorage',
FORMAT = 'CSV',
FIRSTROW = 2);
  • In the above code snippet, FIRSTROW is set to 2 because the data rows start from the second line of the CSV file; the first line contains the column headers.
Bulk insert data into an Azure SQL database

You can see that the number of rows affected is 11. This means that all 11 records are inserted into the table.

Hence, in this way you can use the BULK INSERT statement in Azure Data Studio to bulk insert data from a file into an Azure SQL database.

Read: Migrate SQL Server to Azure SQL database

Azure SQL bulk insert from local

Azure SQL does not support reading files from local storage for bulk operations: statements such as BULK INSERT and OPENROWSET run on the database server, which has no access to your local machine. Therefore, you cannot carry out bulk insert operations directly from local storage.

If you try to read a local file, as in the OPENROWSET query below, you will face an error as shown in the below screen.

My Code:

SELECT * FROM OPENROWSET(
   BULK 'C:\Users\Blades\Documents\TestData.csv',
   SINGLE_CLOB) AS DATA;
Error Message if you try to read a local file from Azure SQL database

However, there is an alternative method to do it.

  • For carrying out bulk insert operations, you have to create a storage account in Azure. Then you have to upload your file into the Azure storage.
  • After that, you have to create a credential in the database so that the database can connect to the Azure storage. Then you will be able to carry out bulk insert operations.
  • I have explained this method in one of the sections above with an example.
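Also note that the bcp utility covered earlier runs on your own machine, not on the database server, so it can read a local file directly and push the rows into Azure SQL. Below is a sketch that reuses the connection details from the bcp section and assumes a comma-separated file with a header row:

bcp dbo.TestData in C:\Users\Blades\Documents\TestData.csv -c -t, -F 2 -S mysql1000.database.windows.net -d DemoDatabase -G -U Blades@fewlines4bijuhotmail.onmicrosoft.com -P Test@123

Here -t, sets the comma as the field terminator and -F 2 makes bcp start from the second row, skipping the header.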

Also, take a look at some more Azure SQL tutorials.

Thus, you have learned various ways of bulk loading data to Azure SQL. We covered the following examples:

  • Bulk import and export to Azure SQL database
  • Bulk export using BCP utility
  • Bulk import using BCP utility
  • Bulk insert CSV into Azure SQL database
  • Bulk insert from Azure blob storage to Azure SQL database
  • Azure SQL database bulk insert JSON
  • Bulk insert Azure data studio
  • Azure SQL bulk insert from local