
Question

Workflow - Rows to Return from Lookup - How to start on row

asked on November 7, 2023

We have these 3 options, but how do we say "Next 5000 Rows" or start on row 5001, for example?

You can have 10,000 rows in a Simple Table, but you can only insert 5,000 rows per workflow.

0 0

Answer

SELECTED ANSWER
replied on November 8, 2023

You could add a RowID field to the table and then include where RowID > 0 and RowID <= 100
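For example, assuming the source table has a numeric RowID column, the lookup's query could be shaped along these lines (illustrative only; the table and column names are placeholders):

SELECT Col1
,Col2
,Col3
FROM [Table1]
WHERE RowID > 5000 AND RowID <= 10000
ORDER BY RowID;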

1 0

Replies

replied on November 7, 2023

To do that, you will have to use a custom query with OFFSET and FETCH:

SELECT Col1
,Col2
,Col3
FROM [Table1]
ORDER BY Col1
OFFSET 100 ROWS
FETCH NEXT 100 ROWS ONLY;

 

2 0
replied on November 8, 2023

We lost the custom query activity after moving to Cloud. This could work in a view, though. Since views don't take parameters, we would need multiple views.

0 0
replied on November 7, 2023

If this is cloud, you can't skip rows. Your query needs to be able to narrow down the results.

0 0
replied on November 8, 2023

I need to narrow the results down to the maximum amount Workflow can insert into a simple table, which is 5,000.

My plan was to narrow the results down by configuring the query to return the next 5000 rows.

But since all configurations start on First, there is no way to narrow the results. It will always start at the top.

0 0
replied on November 8, 2023

Oh, maybe we could do this with a view and have a single view.
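One way a single view could cover every range is to compute a row number inside the view itself; a rough sketch, assuming SQL Server and placeholder table/column names:

CREATE VIEW dbo.Table1Numbered AS
SELECT ROW_NUMBER() OVER (ORDER BY Col1) AS RowID
,Col1
,Col2
,Col3
FROM [Table1];

The lookup criteria could then filter on a RowID range (for example RowID > 5000 and RowID <= 10000) without needing a separate view per range.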

0 0
replied on November 8, 2023

So what is the end goal here? It sounds like you'd want to iterate over the entire table every time this WF runs?

0 0
replied on November 9, 2023

We are just looking to populate a Simple Table in Cloud from an external data source.

The simple table allows for 10k records, but Workflow forces you to execute a For Each Loop and a rule call to insert 1 value at a time. It does not have an option to map external data source data to a Simple Table directly.

In this case we are using SQL Server as the source, but in some cases we also use HTTP Request to JSON or Retrieve Document Text to Lines, then pass this data to a For Each Value or Row.

On top of this, it counts each loop iteration as an activity in addition to the rule call, so 5,000 inserts is 10k activities according to WF, and that only covers half of the table's row limit.

The reason we so often copy data from external sources to simple tables is that Web Forms do not have live lookups to external data sources and Invoice Capture Vendor Lookup has no lookups to external data sources. So all the information must be copied continuously from the external data source to the simple table for these features to work.

Since vendors are being added to and removed from the primary service where they are managed (the data source), we need to wipe the entire Simple Table and re-build it consistently to keep it in sync. Imagine trying to do a differential in WF with a 10k limit on activities, including loop iterations.

Adding the RowID worked, but I ended up with 4 workflows to get the data in there, since with the where clause criteria it had to sort through 18k rows to find the 9k rows it was looking for to insert into the simple table.

What I don't understand is why I can run 4 workflows that total 40k activities but I can't run 1 workflow that has 40k activities. It is the same amount of work for the computer, but organizationally it is more confusing for the end user.

0 0
replied on November 9, 2023

To add to Chad's comment, I would like to understand these cloud limitations.  My company is implementing Smart Invoice Capture with Chad's help, and it's quite frustrating to work around these issues.  In a perfect cloud world, we should be able to:

  • Upload a vendor table (ours has about 17,000 records)
  • Sync the vendor table with on-prem data source (SQL Server table)

 

Scheduled sync jobs would greatly reduce the amount of work the servers have to do and would be significantly less complicated to maintain.

 

1 0
replied on November 9, 2023

Regarding the 17k records, the updated Lookup Table limits in the November release should help with holding more table information.

If you are also looking to sync the vendor table with your on-prem data source, you can consider utilizing the Replace Lookup Table activity to do a full replace of the table. Export your SQL Server table to a file, and you can use a combination of Import Agent + WF to perform the replace.

We are currently looking into better solutions to help with the data sync as well as lookups to external data sources. The information you have provided has been great so far; I have it noted, and I will reach out to discuss your use cases in more detail.

0 0
replied on November 14, 2023

It seems Replace Lookup Table does exactly what we are doing now: delete and replace all rows.

But it does not say how it connects to their database to get the CSV file. Import Agent only looks at a drive location as far as I know, expecting the CSV to already be created.

0 0
replied on November 14, 2023

Neither the "Replace Lookup Table" activity or Import Agent connects to the DB to export the data to CSV.  It is up to you to decide how you are going to get data into a CSV file and that file into the folder monitored by Import Agent.

 

So now the question is how to get the SQL data into a CSV file in the folder that Import Agent monitors. One way is to script it with a PowerShell script and use a bat file to execute it. Once you have the PowerShell script working and the bat file executing it, use Windows Task Scheduler to run the bat file on the schedule you want.

 

In my examples, the bat file (ExportSQLtoCSV.bat) and the PowerShell script (ExportSQLtoCSV.ps1) are in the C:\Temp folder.

 

Sample PowerShell script (ExportSQLtoCSV.ps1)

# Connection settings - replace the placeholder values with your own
$sqlinstance = "YourSQLServerInstance"
$SQLDB = "YourDBName"
$Schema = "dbo"
$TableName = "YourTableName"
$User = "SQLUserName"
$Password = "SQLUserPassword"
# Timestamp appended to the file name so each export is unique
$CurDate = Get-Date -Format "MMddyyyy_HHmmss"
# Folder that Import Agent monitors
$OutDir = "C:\Import\Agent\MonitoredFolder"
$query = "SELECT * FROM [$SQLDB].[$Schema].[$TableName];"

# Build the output path, e.g. YourTableName_11142023_083000.csv
$OutFile = "$TableName" + "_" + "$CurDate" + ".csv"
$path = "$OutDir\$OutFile"
# Invoke-Sqlcmd comes from the SqlServer (or legacy SQLPS) PowerShell module
$results = Invoke-Sqlcmd -Query $query -ServerInstance $sqlinstance -Username $User -Password $Password
# Write the result set out as a UTF-8 CSV for Import Agent to pick up
$results | Export-Csv -Path $path -NoTypeInformation -Encoding UTF8

Sample bat file (ExportSQLtoCSV.bat)

PowerShell -NoProfile -ExecutionPolicy Bypass -Command "& 'C:\Temp\ExportSQLtoCSV.ps1'"

 

1 0