How to Upload a Txt File to Azure in SSMS

Introduction

BULK INSERT is a popular method to import data from a local file to SQL Server. This feature is at the moment supported in SQL Server on-premises.

However, there is a new feature that is supported only in SQL Server 2017 on-premises. This feature allows importing data from a file stored in an Azure storage account to SQL Server on-premises using BULK INSERT. This feature will be supported in Azure SQL versions in the future.

In this article, we will show two examples. The first example will show how to use the traditional BULK INSERT statement to import data from a local CSV file to SQL Server, and the second example will show how to import data from a CSV file stored in Azure to SQL Server on-premises.

If you are new to the Azure world, don't worry, as we will include step-by-step instructions to guide you until the end. If you have experience in Azure and SQL Server, but you do not know much about this particular new feature, this article may also be helpful.

Azure is growing each day and SQL Server is improving the features to connect SQL Server on-premises to Azure. BULK INSERT is a powerful tool to import data because it is fast and it can easily be combined with T-SQL code.

Requirements

  1. SQL Server 2017 installed. If you have SQL Server 2016 or older you will be able to follow the first example only
  2. An Azure account

Get started

How to import data from a local file to SQL Server on-premises

In this first example, we will create a CSV file with customer data and then we will import the CSV file to a SQL Server table using BULK INSERT.

First, we will create a file named mycustomers.csv with the following data:

1,Peter,Jackson,pjackson@hotmail.com
2,Jason,Smith,jsmith@gmail.com
3,Joe,Raasi,jraasi@hotmail.com

Then we will create a SQL Server table where we will load the data:
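
A minimal sketch of such a table, assuming an integer id plus first name, last name and email columns to match the CSV (the exact column names and sizes are illustrative):

CREATE TABLE dbo.listcustomer
(
    id INT PRIMARY KEY,
    firstname VARCHAR(60),
    lastname  VARCHAR(60),
    email     VARCHAR(100)
);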

We will load the data using the BULK INSERT statement:
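
A sketch of the statement, assuming the file was saved at a hypothetical local path c:\sql\mycustomers.csv:

BULK INSERT dbo.listcustomer
FROM 'c:\sql\mycustomers.csv'      -- hypothetical path; use the folder where you saved the file
WITH
(
    FIELDTERMINATOR = ',',         -- fields are separated by commas
    ROWTERMINATOR   = '\n'         -- each row ends with a new line
);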

The BULK INSERT statement will import the data from the mycustomers.csv file to the table listcustomer. The field terminator in this file is a comma. The row terminator is a new line (\n).

If everything is OK, the table will be populated. You can run this query to verify:
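
For example, assuming the illustrative column names used above:

SELECT id, firstname, lastname, email
FROM dbo.listcustomer;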

The result displayed is the following:

How to import data from a file in an Azure storage account to SQL Server on-premises

The first example can be run in SQL Server 2017 or older versions. The second example requires SQL Server 2017 and it is a new feature.

We will load the CSV file to an Azure storage account and then we will load the data to SQL Server 2017 on-premises.

Open the Azure Portal, go to more services (>) and click on Storage Accounts (you can work with the classic or the new one):

Press +Add to create a new Storage account:

Specify a name and a deployment model. In this example, we will use the classic deployment and standard performance. Press Create:

It will take some minutes to create the storage account. Click the storage account:

Go to Overview and click on Blobs:

Click +Container:

Specify a name for the container and press OK:

Press Upload to load the file in the container:

Upload the file mycustomers created in the first example with the CSV data:

We uploaded the data to a container in an Azure storage account. Now, open SSMS on a local machine and connect to a local SQL Server.

We will first create a master key:
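
A minimal sketch; the password is only a placeholder and should be replaced with a strong one of your own:

CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<StrongPassword123!>';  -- placeholder password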

The master key is a symmetric key used to protect certificates, private keys and asymmetric keys.

In the next step, we will create a database credential to access the Azure Storage. The credential name is azurecred:
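
A sketch of the credential, assuming IDENTITY = 'SHARED ACCESS SIGNATURE' (the identity type used for Azure Blob Storage access); the SECRET is a placeholder to be replaced with the value obtained in the next step:

CREATE DATABASE SCOPED CREDENTIAL azurecred
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',                -- identity type for Azure Blob Storage
     SECRET   = '<key copied from the Azure Portal>';     -- placeholder; see the next step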

The secret password can be obtained from the Azure Portal > Storage Account in the Access keys section. You can use the primary or secondary key:

The next step is to create an external data source. The external data source can be used to access Hadoop or, in this case, an Azure account. The name of this data source is customer. The type is blob storage. We will use the credential just created:
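
A sketch of the data source; the LOCATION is shown as a placeholder URL that must be replaced with the value taken from the container properties (next step):

CREATE EXTERNAL DATA SOURCE customer
WITH
(
    TYPE = BLOB_STORAGE,
    LOCATION = 'https://<storageaccount>.blob.core.windows.net/<container>',  -- placeholder; copy from the portal
    CREDENTIAL = azurecred                                                    -- the credential created above
);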

The location can be obtained in Azure Portal > Blob Storage Account > Container properties:

If everything is OK, you will be able to see the external data source in SSMS:

We will create a table named listcustomerAzure to store the data:
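
A minimal sketch, reusing the same illustrative columns as the local example:

CREATE TABLE dbo.listcustomerAzure
(
    id INT PRIMARY KEY,
    firstname VARCHAR(60),
    lastname  VARCHAR(60),
    email     VARCHAR(100)
);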

We will now use BULK INSERT to insert data into the listcustomerAzure table from the file mycustomers.csv stored in Azure. We will invoke the external data source just created:
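
A sketch of the statement, assuming the LOCATION of the data source points at the container, so the file name alone is enough:

BULK INSERT dbo.listcustomerAzure
FROM 'mycustomers.csv'               -- path relative to the LOCATION of the external data source
WITH
(
    DATA_SOURCE     = 'customer',    -- the external data source created above
    FIELDTERMINATOR = ',',
    ROWTERMINATOR   = '\n'
);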

A common problem is the error message 'Cannot bulk load'. The error message says that you do not have file access rights:

This problem can be solved by modifying the access to the container. Go to Azure, open the Azure storage account and select the container. Click the option Access policy:

Change the Public access level to Blob (anonymous read access for blobs only) and save the configuration:

Run the BULK INSERT statement again. If everything is OK now, you will be able to access the data:
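
For example, assuming the illustrative column names used above:

SELECT id, firstname, lastname, email
FROM dbo.listcustomerAzure;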

Conclusions

BULK INSERT is a very fast option to load massive data. It is a popular tool for old versions of SQL Server and new ones. SQL Server 2017 supports the ability to run BULK INSERT statements to load data from Azure storage accounts to SQL Server on-premises.

To import data from an Azure storage account, you need to create a master key and then create a credential with a key to the Azure storage account. Finally, you create an external data source with that credential. Once the external data source is created, you can use BULK INSERT. You may need to change the access policies of the container.

Note that a Windows account is not used to connect to Azure. That is the most important difference between a local BULK INSERT and a BULK INSERT to an Azure account. In a local BULK INSERT operation, the local SQL login must have permissions to the external file. In contrast, when you BULK INSERT an Azure file, the credentials are used, and the Windows local login permissions are irrelevant.

BULK INSERT is not supported in Azure SQL Data Warehouse or Parallel Data Warehouse, and the option to import files stored in Azure is at the moment only supported in SQL Server on-premises and not yet in Azure SQL databases.


Author: Daniel Calbimonte


Source: https://www.sqlshack.com/use-bulk-insert-import-data-locally-azure/
