
Question

Feature Request: BULK Insert for SQL

asked on May 20, 2022

While we do not run into millions of rows that we need to insert into a SQL table, we do come across situations where we are processing thousands of values from a multi-value field or a similar scenario. Rather than having to loop through each one and perform an Insert, it would be great if we could gather all the values and then do a BULK Insert instead. Is it possible to create a Workflow activity to do this? I know that BULK Insert asks for a file to point to, so I'm not sure of the logistics of how that would work, but it would be great if we could. We have several processes that could benefit from it.


Replies

replied on May 23, 2022

I agree this would be useful as a built-in activity type. Workflow creates XML temp files for certain activities like db lookups, so it can definitely create files on the disk.

I think the tricky part would be ensuring that the file for a bulk insert is created somewhere the SQL Server can access it for the import, because that is where things are most likely to go wrong. For example, WF could place the file somewhere SQL can't reach, or the WF service account may not have permission to write the file where SQL can access it.

 

I have a couple of processes that use bulk insert via Workflow. I'll include some information about that below for anyone who might find it useful.

In my process, I'm receiving a txt flat file from another system via Import Agent, which I then insert into a SQL database using Workflow. If you're using data from another source, you'd still need to build the file, but that can be done with a script if necessary (there's a rough sketch of that below); even if you have to loop to create the file, I'd imagine that should still be faster than looping db inserts.

First, I have tokens that define the paths outside of the scripts so they're easy to maintain and I don't forget about them. The first points to the file I'll be using, and the second points to a custom folder on the SQL Server we have specifically for things like this.

The "Import Path" also includes subfolders; one is for the actual data file, and the other is for the template file I use to ensure the bulk insert reads the data properly.

When you're using a fixed-width flat file, the XML format file for the bulk insert would look something like this:

<?xml version="1.0"?>
<BCPFORMAT xmlns="http://schemas.microsoft.com/sqlserver/2004/bulkload/format" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
<RECORD>
  <FIELD ID="1" xsi:type="CharFixed" LENGTH="1"/>
  <FIELD ID="2" xsi:type="CharFixed" LENGTH="19"/>
  <FIELD ID="3" xsi:type="CharFixed" LENGTH="12"/>
  <FIELD ID="4" xsi:type="CharFixed" LENGTH="12"/>
  <FIELD ID="5" xsi:type="CharFixed" LENGTH="3"/>
</RECORD>
<ROW>
  <COLUMN SOURCE="1" NAME="Code" xsi:type="SQLCHAR" LENGTH="1"/>
  <COLUMN SOURCE="2" NAME="LastName" xsi:type="SQLCHAR" LENGTH="19"/>
  <COLUMN SOURCE="3" NAME="FirstName" xsi:type="SQLCHAR" LENGTH="12"/>
  <COLUMN SOURCE="4" NAME="MiddleName" xsi:type="SQLCHAR" LENGTH="12"/>
  <COLUMN SOURCE="5" NAME="Suffix" xsi:type="SQLCHAR" LENGTH="3"/>
</ROW>
</BCPFORMAT>
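
For comparison, if the file were delimited instead of fixed-width, the format file would use terminators rather than fixed lengths. Roughly like this, with a pipe delimiter purely as an example:

<?xml version="1.0"?>
<BCPFORMAT xmlns="http://schemas.microsoft.com/sqlserver/2004/bulkload/format" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
<RECORD>
  <FIELD ID="1" xsi:type="CharTerm" TERMINATOR="|" MAX_LENGTH="1"/>
  <FIELD ID="2" xsi:type="CharTerm" TERMINATOR="|" MAX_LENGTH="19"/>
  <FIELD ID="3" xsi:type="CharTerm" TERMINATOR="|" MAX_LENGTH="12"/>
  <FIELD ID="4" xsi:type="CharTerm" TERMINATOR="|" MAX_LENGTH="12"/>
  <FIELD ID="5" xsi:type="CharTerm" TERMINATOR="\r\n" MAX_LENGTH="3"/>
</RECORD>
<ROW>
  <COLUMN SOURCE="1" NAME="Code" xsi:type="SQLCHAR" LENGTH="1"/>
  <COLUMN SOURCE="2" NAME="LastName" xsi:type="SQLCHAR" LENGTH="19"/>
  <COLUMN SOURCE="3" NAME="FirstName" xsi:type="SQLCHAR" LENGTH="12"/>
  <COLUMN SOURCE="4" NAME="MiddleName" xsi:type="SQLCHAR" LENGTH="12"/>
  <COLUMN SOURCE="5" NAME="Suffix" xsi:type="SQLCHAR" LENGTH="3"/>
</ROW>
</BCPFORMAT>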

Next, a script activity copies those files over to the "Export Path" on the SQL Server so the Bulk Insert has access to them in the next step of the process.

// get file path
string sourcePath = GetTokenValue("Import Path").ToString();
string destinationPath = GetTokenValue("Export Path").ToString();

// get file names
string file = GetTokenValue("Data File").ToString();
string format = GetTokenValue("Format File").ToString();

// default result value
bool success = false;

try {
    // check for source directory
    if(Directory.Exists(sourcePath)){
        // create target directory if it does not exist
        if(!Directory.Exists(destinationPath)) {
            Directory.CreateDirectory(destinationPath);
        }

        // get source file paths
        string sourceFile = Path.Combine(sourcePath,"Pending",file);
        string sourceFormat = Path.Combine(sourcePath,"Templates",format);

        // check for source files
        if(File.Exists(sourceFile) && File.Exists(sourceFormat)) {
            // copy files to target path
            File.Copy(sourceFile,Path.Combine(destinationPath,file),true);
            File.Copy(sourceFormat,Path.Combine(destinationPath,format),true);

            // set result
            success = true;
        }
    }
}
finally {
    // set result token
    SetTokenValue("Export Successful",success);
}

I use a try-finally with the "Export Successful" token holding the results so I can loop around and try again if it fails for any reason.

Then, I use a custom query activity to call Bulk Insert and import the data using the XML template.

BULK INSERT [#DataImport]
FROM 'D:\SQL Job Resources\FILE_NAME.txt'
WITH
(
	FORMATFILE = 'D:\SQL Job Resources\TEMPLATE_NAME.xml',
	KEEPNULLS
)
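
From there, the staged rows can be pulled out of the temp table and into the destination table with a normal set-based insert in the same query. Something along these lines, with placeholder table and column names:

-- Placeholder example: move the bulk-inserted rows from the temp table
-- into the destination table (the [dbo].[People] name is made up)
INSERT INTO [dbo].[People] (Code, LastName, FirstName, MiddleName, Suffix)
SELECT Code, LastName, FirstName, MiddleName, Suffix
FROM [#DataImport];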

Here's how that part of the workflow looks:

At the end of the WF I have another script that cleans up the completed files.
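
That script isn't shown here, but the cleanup step can be as simple as the following sketch; all paths, file names, and the "Completed" folder are assumptions:

// Illustrative cleanup sketch (not the original script): remove the copies
// on the SQL side and archive the processed data file
string importPath = GetTokenValue("Import Path").ToString();
string exportPath = GetTokenValue("Export Path").ToString();
string file = GetTokenValue("Data File").ToString();
string format = GetTokenValue("Format File").ToString();

// delete the copies the bulk insert used on the SQL Server
File.Delete(Path.Combine(exportPath, file));
File.Delete(Path.Combine(exportPath, format));

// move the original data file from "Pending" to an assumed "Completed" folder
string completedPath = Path.Combine(importPath, "Completed");
Directory.CreateDirectory(completedPath);

string completedFile = Path.Combine(completedPath, file);
if (File.Exists(completedFile))
{
    File.Delete(completedFile);
}
File.Move(Path.Combine(importPath, "Pending", file), completedFile);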

replied on May 23, 2022

I like this! Currently I have a couple of processes where I manually gather the information in a query from one SQL database, place it in a temp table, and manually update a different SQL table in Laserfiche. It takes a very short time to do it, but it would be great if somehow I could do that with a workflow.
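
For reference, the manual pattern I described boils down to something like this in SQL (every database, table, and column name here is made up for illustration):

-- Stage the rows from the source database in a temp table...
SELECT EmployeeID, Department
INTO #Staging
FROM [SourceDB].[dbo].[Employees];

-- ...then update the target table from it in one set-based statement
UPDATE t
SET t.Department = s.Department
FROM [OtherDB].[dbo].[EmployeeInfo] AS t
INNER JOIN #Staging AS s
    ON s.EmployeeID = t.EmployeeID;

DROP TABLE #Staging;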
