Posted Wed, 13 Feb 2019 00:20:56 GMT by

Hi!

I'm using AM11 with Oracle DB, and one of my use cases is merging data from a dataset into a database table. I haven't found a specific action for that, so I've ended up looping through the dataset row by row and performing a merge (insert) and commit at every step.

Done that way, it takes half an hour to insert 20-30K rows. Very slow.
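
In rough SQL, each loop iteration does something like this (table and column names are made up here for illustration):

    -- Per-row round trip: one MERGE plus one COMMIT for every dataset row,
    -- which is what makes this so slow.
    MERGE INTO target_table t
    USING (SELECT :id AS id, :name AS name FROM dual) s
      ON (t.id = s.id)
    WHEN MATCHED THEN
      UPDATE SET t.name = s.name
    WHEN NOT MATCHED THEN
      INSERT (id, name) VALUES (s.id, s.name);
    COMMIT;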

Maybe you have examples of best practices for this process? Is there a better (faster) way to do it?

Thanks!

Posted Sat, 16 Feb 2019 08:43:50 GMT by

Hello Alex,

We have had clients with the same issue use queries, PowerShell scripts, etc. to accomplish the data dump at a much faster pace, while still automating it with AutoMate using a trigger.

Here is a link showing how to automate PowerShell scripts:

https://community.helpsystems.com/knowledge-base/automate/general/how-to-automate-powershell-scripts/

And here are some general tips on improving execution speed overall:

https://community.helpsystems.com/knowledge-base/automate/automate/Optimizing-Task-Execution-Speed/ 


Posted Fri, 22 Feb 2019 19:18:12 GMT by

Another, potentially faster, way is to loop through the dataset, but instead of doing an insert on each iteration, build up a SQL string by appending the row data to a text variable. When the loop is finished, you execute one insert instead.
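
For example, the accumulated string could end up as one Oracle multi-row insert, sent to the database in a single call (table, columns, and values below are hypothetical):

    -- Built up row by row in a text variable, then executed once at the end.
    INSERT ALL
      INTO target_table (id, name) VALUES (1, 'first')
      INTO target_table (id, name) VALUES (2, 'second')
      INTO target_table (id, name) VALUES (3, 'third')
    SELECT * FROM dual;

If you need merge semantics rather than plain inserts, the same idea should work by concatenating one MERGE per row inside a single anonymous BEGIN ... END; block and executing that once.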

Posted Fri, 08 Mar 2019 00:40:54 GMT by

Hi,

As it is Oracle, you'll probably find some benefit from using the "Dataset to CSV" action, then using SQL*Loader ( http://www.orafaq.com/wiki/SQL*Loader_FAQ ) to bulk-load the data into a staging table, then running a single MERGE command on the imported data.
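
The final step could look something like this (staging and target table/column names are just placeholders):

    -- One set-based MERGE from the SQL*Loader staging table into the target,
    -- instead of 20-30K row-by-row statements.
    MERGE INTO target_table t
    USING staging_table s
      ON (t.id = s.id)
    WHEN MATCHED THEN
      UPDATE SET t.name = s.name
    WHEN NOT MATCHED THEN
      INSERT (id, name) VALUES (s.id, s.name);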


Hope that helps. 

Posted Tue, 19 Mar 2019 22:02:16 GMT by

Thanks everybody!

Finally, I ended up using insert-by-insert at every step.

Bulk collecting all inserts/merges into a variable and then executing it once did not give any real benefit.

SQL*Loader will not fit because we lose control over errors that may occur.
