When moving data around the enterprise, ideally we want to do it using APIs and controlled interfaces. Sometimes this is not possible: you might have applications where APIs are not available, or where the data you need is simply not exportable any other way than a CSV export from the application front end.
Pulling data into a central location can open possibilities for your business such as reconciliation, reporting and visualisation across applications.
In this week's blog post we will show you that it is possible to bulk insert a CSV file using Microsoft Forms, Power Automate and some simple stored procedures.
1. First we will create a very basic Form. Today we will have only one question: a file upload question with the allowed file type set to Excel.
2. Now that we have the form, we need to create the scripts which will take the data and bulk insert it into SQL Server. We will be loading the file into Azure Blob Storage, so we need to set up a few things on our SQL Server to connect.
2a. Create a master key:
CREATE MASTER KEY ENCRYPTION BY PASSWORD='Password here';
2b. Create a database scoped credential; the IDENTITY must be SHARED ACCESS SIGNATURE. You will need to generate a SAS token for the blob storage account. Make sure that your SAS token does not have a leading ?, that it has at minimum read permission on the object to be loaded, and that the expiration period is valid.
CREATE DATABASE SCOPED CREDENTIAL azurecred
WITH IDENTITY = 'SHARED ACCESS SIGNATURE', SECRET = 'sv=2015-12-11&ss=b&srt=sco&sp=rwac&se=2017-02-01T00:55:34Z&st=2016-12-29T16:55:34Z&spr=https&sig=copyFromAzurePortal';
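Because the leading ? is an easy mistake to make when copying the token from the Azure portal, a small helper like this (a hypothetical Python sketch, not part of the flow itself) can sanitise the token before you paste it into the SECRET:

```python
def clean_sas_token(token: str) -> str:
    """Remove the leading '?' that the Azure portal includes when you
    copy a SAS token, so the value is safe to use as a credential SECRET."""
    return token[1:] if token.startswith("?") else token

# Dummy token copied "as-is" from the portal, with the leading '?'
raw = "?sv=2015-12-11&ss=b&srt=sco&sp=rwac&sig=copyFromAzurePortal"
secret = clean_sas_token(raw)
```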
2c. Now it is time to set up an external data source:
CREATE EXTERNAL DATA SOURCE csvdata
WITH ( TYPE = BLOB_STORAGE, LOCATION = 'https://myazureblobstorage.blob.core.windows.net', CREDENTIAL= azurecred);
Once the statement has run successfully, you will see the csvdata source under External Resources.
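If you prefer to check from T-SQL rather than the object explorer, the sys.external_data_sources catalog view lists it as well:

```sql
SELECT name, location, type_desc
FROM sys.external_data_sources;
```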
3. Let's create the table we will put the data in:
CREATE TABLE [dbo].[CSV_Upload](
[ID] int NULL,
[Stuffs] [varchar](40) NULL,
[things] [varchar](40) NULL
) ON [PRIMARY]
GO
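For reference, a CSV that fits this table might look like the following (hypothetical sample data; the header row is skipped by the FIRSTROW = 2 option in the stored procedure):

```
ID,Stuffs,things
1,apple,red
2,banana,yellow
```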
4. We have the scaffolding done and it's time to create the stored procedure which will do the work. We make the file name a parameter so that we can reuse the procedure for other files. If you need to do the same for other sinks, feel free to make the sink a parameter as well.
CREATE PROCEDURE [dbo].[usp_bulk_insert] @file nvarchar(500)
AS
BEGIN
DECLARE @bulkinsert NVARCHAR(2000);
SET @bulkinsert = N'BULK INSERT CSV_Upload FROM ''' + @file + N''' WITH (FIRSTROW = 2, FIELDTERMINATOR = '','', DATA_SOURCE = ''csvdata'', FORMAT = ''CSV'', ROWTERMINATOR = ''0x0a'')';
EXEC sp_executesql @bulkinsert;
END
GO
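To test the procedure directly before wiring up the flow, you can call it with a path relative to the external data source location (the container and file names here are placeholders):

```sql
EXEC [dbo].[usp_bulk_insert] @file = N'mycontainer/upload.csv';
SELECT * FROM [dbo].[CSV_Upload];
```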
5. We have the source and the sink; now it is time for the orchestration. Navigate to Power Automate and create an automated cloud flow triggered by a form submission.
6. Now we need to gather the information from the form. Bear in mind that when you upload a file in a form, it is placed in OneDrive for Business. We will start by getting the details of the response so that we can retrieve the file from OneDrive.
7. We can now retrieve the file.
8. Time to upload the file to the Azure Blob Storage account we set up as an external data source.
9. Now add an action to execute the stored procedure we created earlier.
10. Now here comes the trick: we have set up a process which will consume a CSV file and bulk insert it into a SQL Server table, but the form upload only supports Excel.
Simply change the file extension from .csv to .xlsx so that the form will accept the file. DO NOT convert the file to an Excel file; only change the file extension. Upload the file and the process will pick it up, pop it into Azure Blob Storage, and SQL Server will bulk insert the data into your table. Now you are free to do whatever you want with your data, which is nicely structured in SQL Server.
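To make the rename trick concrete, this small Python sketch (file names are hypothetical) shows that changing the extension does not touch the bytes, which is why SQL Server still receives a valid CSV:

```python
import pathlib
import tempfile

with tempfile.TemporaryDirectory() as tmp:
    # Write a small CSV matching the CSV_Upload table.
    csv_path = pathlib.Path(tmp) / "upload.csv"
    csv_path.write_text("ID,Stuffs,things\n1,apple,red\n")
    original_bytes = csv_path.read_bytes()

    # Rename only: the extension changes, the contents do not.
    xlsx_path = csv_path.rename(csv_path.with_suffix(".xlsx"))
    renamed_bytes = xlsx_path.read_bytes()

identical = original_bytes == renamed_bytes  # the bytes are unchanged
```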
If you think your business might benefit from this type of thinking, reach out to us at Ava Tech; we love to talk automation.