Posted Mon, 27 Aug 2018 22:25:47 GMT by

I'm using BPA v10.7.0.3 and am wondering if there are any size/row limits on datasets. I have a query that returns 39 columns (short string, date, int fields ... no blob or long text fields) for around 550,000 rows, but I have another use case that will query 6 columns for over 2 million rows and save the output to a dataset.

I think I recall reading a while back that there was a 2 GB limit on dataset size, but I'm not sure if that's still the case, or how someone could go about evaluating the size of a dataset to know 1) if it needs to be split into multiple datasets, or 2) if an alternate approach (e.g., spooling the output to a flat file) would be more appropriate.
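For a rough sense of scale, my own back-of-envelope math so far (a sketch only; the ~20 bytes average field size is a guess, and it ignores whatever per-row overhead the dataset itself adds):

    # Very rough estimate of the raw data volume the first query would produce.
    rows = 550_000
    columns = 39
    avg_bytes_per_field = 20  # guess: short strings, dates, ints

    estimated_bytes = rows * columns * avg_bytes_per_field
    print(f"~{estimated_bytes / 1024**2:.0f} MB")  # roughly 410 MB for this case

But I don't know how close that comes to what Automate actually keeps in memory for a dataset.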

Any help on this one?

Posted Thu, 06 Sep 2018 00:51:56 GMT by

Hello Adam,

I ran a test using Automate BPA Server v10 x64 and was able to create a dataset and get a row count for a table with 2.5 million rows and 9 columns. Looping through a dataset that large will consume a lot of resources and time (hours) on the machine while the task is running (BPAtask.exe), so the best thing to do is split the data if possible. Please keep in mind that if you are running the 32-bit version of Automate, the process is limited to roughly 2-3 GB of memory. Another thing to keep in mind is that it is best to run the task from the workflow level or the repository, since it will run faster there than in the Task Builder. Please let us know if you have any further questions.
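If splitting the data inside Automate is not practical, the flat-file approach mentioned in the original post is worth considering: stream the query results out in chunks instead of building one huge dataset. Here is a minimal sketch of that idea outside of Automate, in Python with pyodbc (the connection string, query, output file, and chunk size below are placeholders, not anything specific to BPA):

    import csv
    import pyodbc

    # Placeholder connection details -- substitute your own server/database.
    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 17 for SQL Server};"
        "SERVER=myserver;DATABASE=mydb;Trusted_Connection=yes;"
    )
    cursor = conn.cursor()
    cursor.execute("SELECT col1, col2, col3, col4, col5, col6 FROM big_table")

    CHUNK_SIZE = 50_000  # rows fetched per round trip; tune as needed

    with open("output.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow([d[0] for d in cursor.description])  # header row
        while True:
            rows = cursor.fetchmany(CHUNK_SIZE)
            if not rows:
                break
            writer.writerows(rows)  # only one chunk of rows in memory at a time

    conn.close()

The point of fetchmany() in the loop is that only one chunk is held in memory at any moment, which sidesteps the 32-bit memory ceiling entirely.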
