This article explains the process of bulk transferring data from Azure Blob storage to a SQL Server database, with brief reference to T-SQL.
When working with SQL Server databases, one of the most common problems we face is insufficient space for storing data. Although SQL users have the option of using the Azure cloud to store data, that can also prove insufficient when working with Big Data. So one common solution many SQL users opt for is Azure Blob storage for storing SQL Server data. However, once you use Azure Blob for storing data, you also need to know how to transfer data from Azure Blob back into a SQL Server database. This is one of those processes that has to be completed in multiple stages, which are explained below.
Signing in to Azure using PowerShell
- Before you can use PowerShell for transferring data, you first need a valid Azure subscription that gives you access to the service. Once you have subscribed, you will have to enter valid login credentials to proceed.
- There may be multiple subscriptions attached to an account, so make sure you select the correct one.
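The sign-in steps above can be sketched with the Az PowerShell module (the subscription name shown is a placeholder):

```powershell
# Sign in interactively with your Azure credentials
# (requires the Az module: Install-Module Az)
Connect-AzAccount

# List every subscription attached to the account
Get-AzSubscription

# Select the subscription to work with
# ("My Subscription" is a placeholder; use your own subscription name or ID)
Set-AzContext -Subscription "My Subscription"
```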
Creating and Loading a Blob Container
- Whatever data you have on Azure is stored in a resource group. So you need a resource group, a storage account, and a blob container, all named appropriately. This is one of the prerequisites for the PowerShell cmdlets to work effectively.
- Every time you add files to blob storage, you will also have to update the packing list. This is the file that lists the names of all the files that are to be loaded into the Azure SQL database.
- After creating the blob container, you will need a PowerShell script to load the given data into the blob.
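A minimal sketch of those steps, again with the Az module; every name, path, and region below is an assumption to be replaced with your own:

```powershell
# Placeholder names - substitute your own
$rg        = "MyResourceGroup"
$account   = "mystorageacct"
$container = "mycontainer"

# Create the resource group, storage account, and blob container
New-AzResourceGroup -Name $rg -Location "East US"
New-AzStorageAccount -ResourceGroupName $rg -Name $account `
    -Location "East US" -SkuName Standard_LRS
$ctx = (Get-AzStorageAccount -ResourceGroupName $rg -Name $account).Context
New-AzStorageContainer -Name $container -Context $ctx

# Upload the data files, plus the packing list that names them
Get-ChildItem "C:\data\*.csv" | ForEach-Object {
    Set-AzStorageBlobContent -File $_.FullName -Container $container -Context $ctx
}
Set-AzStorageBlobContent -File "C:\data\packing-list.txt" `
    -Container $container -Context $ctx
```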
Creating and Defining Azure SQL Database
- In the resource group you created on Azure, you also need to create a SQL Server database.
- It is now time to define the created Azure database; for this you need to open SQL Server Management Studio (SSMS).
- Connecting to the server is important because this is where you will be running T-SQL scripts.
- Select Database Engine as the server type, and enter the name of the Azure SQL server database you created.
- For the security option, select SQL Server authentication and enter the login details of the server administrator.
- You will need to create two schemas, one named Active and the other Stage. The Active schema will store the combined data; the Stage schema will hold the intermediate results and auditing information.
- Now execute the T-SQL script; you will then see three new tables in the Object Explorer window.
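The schema setup can be sketched in T-SQL as follows; the table names and columns are hypothetical, chosen only to illustrate the Active/Stage split described above:

```sql
CREATE SCHEMA Active;
GO
CREATE SCHEMA Stage;
GO

-- Staging table that receives raw rows from BULK INSERT
CREATE TABLE Stage.CustomerData (
    CustomerId   INT,
    CustomerName NVARCHAR(100)
);
GO

-- Auditing table recording each load
CREATE TABLE Stage.AuditLog (
    LoadId    INT IDENTITY(1,1) PRIMARY KEY,
    FileName  NVARCHAR(260),
    LoadedOn  DATETIME2 DEFAULT SYSUTCDATETIME()
);
GO

-- Final table holding the combined data
CREATE TABLE Active.CustomerData (
    CustomerId   INT PRIMARY KEY,
    CustomerName NVARCHAR(100)
);
GO
```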
Defining External Sources of Data
- To ensure that T-SQL commands like BULK INSERT can access the data in the Azure blob, it is important for you to define external data sources.
- BULK INSERT can read CSV files stored in the Azure blob, so make sure your data files are created in this format.
- If the first bulk insert attempt fails, truncate the staging table and then restart the process.
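Putting these points together, a hedged T-SQL sketch might look like this. The credential, data source, account, container, and file names are all assumptions; the SAS token must be generated for your own storage account, and a database master key must already exist:

```sql
-- Credential granting SQL Server access to the blob container
-- (paste your SAS token without the leading '?')
CREATE DATABASE SCOPED CREDENTIAL BlobCredential
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
     SECRET = '<your-sas-token>';

-- External data source pointing at the container (placeholder URL)
CREATE EXTERNAL DATA SOURCE MyBlobStore
WITH (
    TYPE = BLOB_STORAGE,
    LOCATION = 'https://mystorageacct.blob.core.windows.net/mycontainer',
    CREDENTIAL = BlobCredential
);

-- Load one CSV file from the container into the staging table;
-- FIRSTROW = 2 assumes the file has a header row
BULK INSERT Stage.CustomerData
FROM 'customers.csv'
WITH (DATA_SOURCE = 'MyBlobStore', FORMAT = 'CSV', FIRSTROW = 2);

-- If the load fails partway, clear the staging table before retrying
TRUNCATE TABLE Stage.CustomerData;
```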
Bulk insert and data transfer operations can at times cause SQL errors. If you encounter any such issue, immediately get hold of a powerful tool that can repair SQL databases.
Victor Simon is a data recovery expert at DataNumen, Inc., the world leader in data recovery technologies, including mdb repair and sql recovery software products. For more information visit https://www.datanumen.com/