I wanted to see what you guys thought about the difference between a script running in WF to get data out of a text file, vs using the ODBC connector to read a CSV file.
Which way would work better for fast performance and fewer errors?
Thanks.
Lucas,
To answer your question, I would think that opening the text file and reading and parsing the data in a Script activity would be faster than using the Query activity with an ODBC connection. I don't have metrics, but my guess is that the overhead of the ODBC connection itself would take more time than simply reading the text file into an array, list, or dictionary and then parsing it one element at a time in a Script activity. On the other hand, going the ODBC route would probably be easier, since the ODBC driver automatically parses the CSV file for you and returns formatted data elements.
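To make the comparison concrete, here is a rough sketch of what each approach might look like in C# (Script activities also support VB.NET). The file path, folder, and column handling are placeholder assumptions, not anything from the original posts:

```csharp
using System;
using System.Data.Odbc;
using System.IO;

class CsvReadSketch
{
    // Approach 1: read and split the file directly in a Script activity.
    // Naive split: fails on quoted fields that contain commas.
    static void ReadWithScript(string path)
    {
        foreach (string line in File.ReadLines(path))
        {
            string[] fields = line.Split(',');
            Console.WriteLine(string.Join(" | ", fields));
        }
    }

    // Approach 2: let the ODBC text driver parse the CSV for you.
    // Assumes the Microsoft Text Driver is installed on the Workflow server.
    static void ReadWithOdbc(string folder, string fileName)
    {
        string connStr = "Driver={Microsoft Text Driver (*.txt; *.csv)};Dbq=" + folder + ";";
        using (var conn = new OdbcConnection(connStr))
        using (var cmd = new OdbcCommand("SELECT * FROM [" + fileName + "]", conn))
        {
            conn.Open();
            using (OdbcDataReader reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                    Console.WriteLine(reader[0]); // first column of each row
            }
        }
    }

    static void Main()
    {
        ReadWithScript(@"C:\Data\input.csv");
        ReadWithOdbc(@"C:\Data", "input.csv");
    }
}
```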
That being said, using a CSV or text file as a data source for Workflow would not be my first choice. Workflow is multi-threaded, so you can potentially have multiple instances of the same workflow trying to open and read the same text or CSV file at the same time. That would be especially troublesome (and I am sure it would fail) if you were trying to update that data source while Workflow was trying to read it.
My feedback would be to use SQL as a data repository if you are looking for a more robust solution.
If you still need an answer on this a year later: one thing not mentioned in the reply above is that ODBC will lock the text file so that only one program or process can access it at a time. If something tries to update the text file at the same time the workflow tries to read it, the workflow will lock up, then fail and terminate at the step where it reads the CSV.
I've come up with a way of parsing a CSV file with a workflow without resorting to the SDK. Extract the text, use Pattern Matching with the regex .*\n to put each line of the CSV into an entry of a multi-value token, and then use a pattern match with a regex like .*,.*,.*,(.*),.*,.* to pull the value you want from each row into its own token. That example would get you the fourth column of a CSV with six columns.
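If you want to prototype that same two-step regex idea outside of the Pattern Matching activity, here is a minimal C# sketch (the file path is a placeholder, and it assumes six columns with no quoted or embedded commas):

```csharp
using System;
using System.IO;
using System.Text.RegularExpressions;

class RegexColumnSketch
{
    static void Main()
    {
        string text = File.ReadAllText(@"C:\Data\input.csv");

        // Step 1: ".*\n" yields one match per line, like the multi-value token.
        // (A file without a trailing newline will miss its last row, just as
        // the Workflow pattern would.)
        foreach (Match line in Regex.Matches(text, @".*\n"))
        {
            // Step 2: capture the fourth of six comma-separated columns.
            Match m = Regex.Match(line.Value, @".*,.*,.*,(.*),.*,.*");
            if (m.Success)
                Console.WriteLine(m.Groups[1].Value.Trim());
        }
    }
}
```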
You can then do things like update or add to a SQL table from the workflow. An external system drops the CSV into a folder monitored by Import Agent; Import Agent grabs the CSV, extracts the text, and drops it into a folder monitored by the CSV import workflow.
Hope that helps!
Hi Glen
If possible, could you upload some samples?
Thanks!
That's pretty impressive, thank you.
Although I kind of don't understand, haha. Are you creating a CSV file from SQL, or are you importing a CSV file into SQL using that technique?
My goal is to automate the extraction of a SQL table into a CSV file to give to our accounts team, which they eventually load into a third-party application.
Sorry for the confusion.
Thanks
To push from SQL to CSV, you would probably need to use a Custom Query activity. There are two requirements for the query to export to CSV: it needs access to the Microsoft.ACE.OLEDB.12.0 provider (matching the Workflow service architecture, i.e. 32- vs. 64-bit), and a CSV file with a header row must be in place before the query executes.
The requirement to have the CSV file (blank except for the header row) in place can be handled by three different approaches, but all of them need to be scripted.
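For illustration, here is a minimal sketch of one such scripted approach, creating the header-only file from a Script activity (the path and column names are placeholders chosen to match the query below):

```csharp
using System.IO;

class HeaderFileSketch
{
    static void Main()
    {
        // OPENROWSET's text driver needs the file, with its header row,
        // to exist before the Custom Query runs.
        File.WriteAllText(@"D:\FileName.csv", "Field1,Field2,Field3\r\n");
    }
}
```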
Once the CSV file is in place, you would run a Custom Query like this:
INSERT INTO OPENROWSET('Microsoft.ACE.OLEDB.12.0','Text;Database=D:\;HDR=YES;FMT=Delimited','SELECT * FROM [FileName.csv]') SELECT Field1, Field2, Field3 FROM dbo.TableName
That technique is to import CSVs into SQL lookup tables for Laserfiche to use.
To generate a CSV from a SQL table, I'd use the tools built into SQL Server rather than muck about with Laserfiche Workflow. It would be harder up front, since you'd have to learn some SQL scripting, but it's a more correct and elegant solution that would be easier to maintain.
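For instance, SQL Server's bcp utility can dump a query straight to a delimited file and can be scheduled with SQL Server Agent. Server, database, table, and path below are placeholders, and note that bcp does not emit a header row on its own:

```
bcp "SELECT Field1, Field2, Field3 FROM YourDatabase.dbo.YourTable" queryout "D:\Export.csv" -c -t, -S YourServer -T
```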
I need to export a table from SQL to CSV.
I'm working on this now, I appreciate the responses.
I'm no programmer and I am finding this quite difficult to do.
I like the three approaches mentioned and I would deem it better to keep it in LF because I can then email the report to our accounts team from LF.
The goal is to display a report of Accepted and Not Billed quotes, which is one piece, if you will, of the quote system I built for our company.
The accounting team requested a CSV export of the report on top of having a live preview of all the information in Forms.
So, if I can generate this CSV directly from SQL using WF, I can ultimately automate the extraction and email her weekly reports.
Use SQL Server Management Studio for that. You can export query results as a CSV by right-clicking on the query results grid. My process is for going the other way around.
It's not a problem to export from SQL to CSV; I need to do it using WF.
And it needs to be automated.
Can you do that with SQL?
If the goal is to export a SQL table to a CSV file, why is Workflow involved? Why not use SQL's tools to export?
De Wet Van Der Merwe,
See my post above. It explains what is needed to use Workflow to export from a SQL table to a CSV file.
Hi everyone.
I have built a super simple, small, and compact solution for exporting data from SQL, Forms processes, etc. into a CSV document.
Basically, if you are using a For Each within your workflow to iterate through your data, this project can help you.
I created a GitHub repository and would like to invite anyone to check it out to achieve what this question set out to do.
https://github.com/Arbitrary5001/LaserficheWorkflowExportToCSV.git
Please feel free to contact me with any feedback/suggestions/etc.
Good luck guys.