
With the SQL Database connector, Azure Logic Apps enables us to perform operations against both cloud and on-premises SQL Server databases.
Currently, the SQL Server connector does not provide any triggers, and it supports the following operations:
- Delete a row
- Execute Stored Procedure
- Get row
- Get rows
- Insert row
- Update row
- Get Tables
The operations above are a very nice starting point when wanting to manipulate SQL Server databases using Azure Logic Apps. However, keeping the “pay as you use”-pricing model of Azure Logic Apps in mind, the SQL Database connector might become an expensive solution in specific scenarios. This blog post will focus on using the SQL Database connector to insert multiple records into an Azure SQL Database.
As a starting point, we’ll use the following JSON, which will be used as an inbound HTTP request for our Logic App:
```json
{
    "People": [
        { "FirstName": "Orlando", "LastName": "Gee" },
        { "FirstName": "Keith", "LastName": "Harris" },
        { "FirstName": "Donna", "LastName": "Carreras" },
        { "FirstName": "Janet", "LastName": "Gates" },
        { "FirstName": "Lucy", "LastName": "Harrington" }
    ]
}
```
The inbound HTTP request contains a list of multiple “People” which we want to insert into SQL using the SQL Database connector. Looking at the list of available actions, we only have the “Insert row”-action available. As a result, we would have to loop over each Person in the list and perform the inserts one by one. In the most minimal solution and based on the sample inbound request, we would have to pay for a total of 7 actions:
- Trigger
- Initialize for-each loop
- Insert Row (5x)
When working with more data, this might become a problem to keep your Logic App price-efficient. In this blog, we’ll describe an alternative method using a stored procedure and OPENJSON to insert multiple rows within one transaction using the Azure SQL Database connector.
Preparing SQL Server
Creating an Azure SQL Server Database
In this section, we will prepare an Azure SQL Server Database.
Browse to the Azure Portal (http://portal.azure.com), click the “Add new”-button and search for SQL Database
Continue creating the SQL Database by filling in all the required information, such as the desired database name, subscription, resource group, pricing tier etc.
Since we don’t yet have a SQL Database Server, we also need to provide some additional details allowing the creation of a new database server as well.
Once all required information has been provided, complete by clicking the “Create”-button.
When our SQL Database Server and database have been provisioned, we’ll continue with the SQL Preparation using SQL Server Management Studio. To find the exact connection string to use, open the database in the Azure Portal and click “Show database connection strings”. The database connection string needed for Management Studio looks something like this: yourservername.database.windows.net,1433. For credentials, use the Server admin login and password specified during creation of the database.
In some cases, opening the connection in SQL Server Management Studio can be blocked by the SQL Server firewall. To allow our IP, open the database in the Azure Portal again and select “Set server firewall” from the top menu.
In the “Firewall settings”-window, click the “Add client IP”-button and it will automatically whitelist our client IP on the SQL Server firewall. We can now open the connection using SQL Server Management Studio.
Validating the compatibility level
In order to use OPENJSON, our database needs to have a compatibility level of at least 130. We can validate this by using the “sys.databases” catalog view. In our case, we used the following T-SQL query to validate the compatibility level:
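A query along these lines does the trick (the database name “MyDatabase” is just a placeholder for your own):

```sql
-- Return the compatibility level of the database;
-- OPENJSON requires a value of 130 or higher
SELECT name, compatibility_level
FROM sys.databases
WHERE name = 'MyDatabase';
```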
In case the database has a lower compatibility-level, we can change it using the following command:
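A sketch, again with “MyDatabase” as a placeholder for your own database name:

```sql
-- Raise the compatibility level so OPENJSON becomes available
ALTER DATABASE MyDatabase
SET COMPATIBILITY_LEVEL = 130;
```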
OPENJSON
OPENJSON is a table-value function to parse JSON text and return the objects and properties from the input JSON as rows and columns. Since OPENJSON returns a set of rows, it can be used in a FROM-clause of SQL-statements, allowing us to use the result set in JOINS, CROSS APPLY, SELECTS, etc…
OPENJSON in action
We’ll start by declaring a variable to store our JSON-string. This is the same JSON-object which will be used to trigger our Logic App with.
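Based on the sample message from the beginning of this post, the declaration could look like this:

```sql
-- Store the sample JSON message in a variable
DECLARE @People NVARCHAR(MAX) = N'{
    "People": [
        { "FirstName": "Orlando", "LastName": "Gee" },
        { "FirstName": "Keith", "LastName": "Harris" },
        { "FirstName": "Donna", "LastName": "Carreras" },
        { "FirstName": "Janet", "LastName": "Gates" },
        { "FirstName": "Lucy", "LastName": "Harrington" }
    ]
}';
```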
Let’s start with a basic select using OPENJSON on our @People-variable containing the JSON expression:
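A minimal version simply passes the variable to OPENJSON:

```sql
-- Without a path argument, OPENJSON returns the
-- top-level keys of the JSON object
SELECT [key], [value], [type]
FROM OPENJSON(@People);
```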
The result returns one row with key “People”. Its value is the array of People contained in the message, and its type is 4 (array).
The following types are available:
- 0 – null
- 1 – string
- 2 – number
- 3 – boolean (true/false)
- 4 – array
- 5 – object
In our case, we don’t want the root People-array, but we want a list of People. So let’s tweak the SELECT a little further. The following statement returns us 5 People-objects. In this statement, we passed a second argument to the OPENJSON function. The ‘$.People’-argument defines the path, which references an object or an array within the JSON-string. If the path is provided as an argument, the OPENJSON-function will use the specified position to perform a search within the JSON-text and will only parse the referenced value.
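The tweaked statement, with the path as second argument, could look like this:

```sql
-- '$.People' points OPENJSON at the array inside the message,
-- returning one row per person-object (type 5 = object)
SELECT [key], [value], [type]
FROM OPENJSON(@People, '$.People');
```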
We are already a step closer to what we need, but we’re not quite there yet.
We are going to add a final addition to our T-SQL-query, known as the “with”-clause. By using the “with”-clause, we can define a list of columns and their types which we expect to see returned by the OPENJSON-function. In addition to specifying the column-name, we’ll also provide a column-path. This is useful whenever our requested column-names do not match the keys within our JSON-object: specifying a column-path instructs the OPENJSON-function where to look for the desired columns.
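A sketch of the query with the “with”-clause (in this simple sample the JSON keys happen to match the column names, so the column-paths mainly illustrate the mapping; the column sizes are an assumption):

```sql
-- Project the array into strongly typed columns
SELECT FirstName, LastName
FROM OPENJSON(@People, '$.People')
WITH (
    FirstName NVARCHAR(50) '$.FirstName',
    LastName  NVARCHAR(50) '$.LastName'
);
```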
Creating the table and stored procedure
Now that we’ve produced a T-SQL statement which returns us the desired dataset, we can proceed with configuring the final parts of our SQL-database.
We’ll start by creating a basic “People”-table which matches the structure of the inbound JSON-message we’ll be using to trigger our Logic App:
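A minimal sketch of such a table (the column sizes and the identity column are assumptions, not prescribed by the message):

```sql
CREATE TABLE dbo.People (
    Id        INT IDENTITY(1,1) PRIMARY KEY,
    FirstName NVARCHAR(50) NOT NULL,
    LastName  NVARCHAR(50) NOT NULL
);
```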
We’ll also create a stored procedure to perform the bulk insert operation to our “People”-table. The stored procedure will have one input parameter which will contain our JSON-string. We’ll do a basic insert into our “People”-table using the data produced by the OPENJSON-function.
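A sketch of the stored procedure (the name “InsertPeople” is an assumption; the “jsonObject”-parameter is the one we’ll configure in the Logic App later on):

```sql
CREATE PROCEDURE dbo.InsertPeople
    @jsonObject NVARCHAR(MAX)
AS
BEGIN
    -- Insert all persons in a single statement and
    -- return the generated Ids as the result set
    INSERT INTO dbo.People (FirstName, LastName)
    OUTPUT inserted.Id
    SELECT FirstName, LastName
    FROM OPENJSON(@jsonObject, '$.People')
    WITH (
        FirstName NVARCHAR(50) '$.FirstName',
        LastName  NVARCHAR(50) '$.LastName'
    );
END
```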
Designing the Logic App
All that’s left now is the creation of our Logic App. Browse to http://portal.azure.com and create a new Logic App.
Start by adding the “HTTP-request” as trigger of the Logic App by searching on Request and selecting the proper action:
We’ve used the sample JSON message as payload to generate the JSON-schema, and end up with the following “Request”-trigger.
When saving the Logic App, a new HTTP Post URL is automatically generated. This is the URL we’ll need during testing.
Let’s add a new action by clicking “New step” followed by “Add an action”.
We’ll search for the SQL Server “Execute Stored Procedure”-action. Type “SQL” in the search-pane, click on the “SQL Server”-connector, and choose the “Execute Stored Procedure”-action from the list of available SQL Actions.
Complete the configuration of your action by providing some additional information.
- Define a name for the connection
- Select the SQL Database Server and SQL Database name. Since we are using an Azure SQL Database, the servers and databases available to the logic app are listed for us.
- Provide the username and password to make the connection.
- Complete by clicking “Create”.
Finally, choose the stored procedure and define the content of the “jsonObject”-parameter.
Let’s complete the Logic App by adding the “HTTP Response” as our final action.
As Status Code we’ll be returning “200 OK” with the “Stored Procedure Result Set” as the response body.
Testing the Logic App
We’ll be using Postman to test our Logic App, executing an HTTP POST towards the Logic App endpoint. The address is the URI which was automatically generated for our trigger when saving the Logic App.
As a body, we’ll be sending our “People”-json object, this time containing 500 people.
When executing the POST-action, we receive the HTTP 200 Response, and the body contains the result set returned by the Stored Procedure containing the actual table Id.
If we check our SQL Table, we can see all 500 records have been inserted.
When validating the Logic App run itself, we can see it took 923 milliseconds to execute the entire Logic App, including inserting 500 People into SQL Server.
Let’s compare this performance against executing an “Insert row”-action for each person in our People-object by executing the following Logic App.
The Request and Response remain unchanged, however instead of executing the Stored Procedure, we’ll be looping over the “People”-array and insert a row for each person within the array:
Without any optimizations in the default configuration of this Logic App, our test failed because the timeout for the response was reached.
The Logic App did manage to insert all 500 records, but it took nearly 4 minutes to complete the action.
Conclusions
Performing batch-uploads using the SQL connector, stored procedures and OPENJSON is a fast and reliable method to perform bulk-operations towards SQL Server. For demo purposes, we’ve kept the sample message fairly simple, as we mainly wanted to demonstrate the basic concepts. However, OPENJSON does provide enough functionality to seek in complex JSON-strings, so with the power of OPENJSON and good stored procedure design, this approach offers the necessary flexibility to implement bulk operations towards a SQL Server database.