I have an OLE DB Source pulling from SQL Server, two lookups, and an OLE DB Destination writing back to SQL Server. The SELECT in the OLE DB Source needs a ROW_NUMBER value to create an auto-incrementing number that starts at the maximum number already in the destination table used by the OLE DB Destination. If I run more than one month's worth of data (about 25,000 rows), the SELECT for the max id from fact_destination gets blocked by the bulk insert. I have the default READ COMMITTED isolation level on.

In the past I have resolved the same or a similar situation by either selecting one month at a time or by turning on snapshot isolation, but I'm wondering if there is something better I could be doing. I have looked into setting the Maximum insert commit size, but I believe that only affects the batch size so that the transaction log doesn't fill up. I do have Table lock and Check constraints selected on the OLE DB Destination, and each task has its own connection to the database.

Is there anything else I can do to be able to select more data without blocking?
Pseudocode in the OLE DB source:

set nocount on;

declare @main table (column1 nvarchar(50), column2 nvarchar(50), column3 nvarchar(50));

insert into @main
exec storedproc '1/1/2012', '1/31/2012'; -- takes about 10 min to select data from another db and table

select row_number() over (order by column1, column2, column3)
       + (select coalesce(max(id), 0) from fact_destination) as id,
       getdate() as createdatetime,
       column1, column2, column3
from @main;
I would get your max number with an Execute SQL Task in the control flow, before the data flow runs. Save the max number in a package variable and use it in the source query. That way the read of fact_destination happens once, before the bulk insert takes its table lock, so the two no longer contend.
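A minimal sketch of that approach, assuming a package variable named User::MaxId (Int64) — the variable name and parameter mapping are illustrative, not from the original post:

```sql
-- Execute SQL Task (control flow, before the data flow):
-- ResultSet = Single row, column MaxId mapped to User::MaxId.
select coalesce(max(id), 0) as MaxId
from fact_destination;

-- OLE DB Source query (data flow), with the ? parameter
-- mapped to User::MaxId in the Parameters dialog:
set nocount on;

declare @main table (column1 nvarchar(50), column2 nvarchar(50), column3 nvarchar(50));

insert into @main
exec storedproc '1/1/2012', '1/31/2012';

select row_number() over (order by column1, column2, column3) + ? as id,
       getdate() as createdatetime,
       column1, column2, column3
from @main;
```

Because the source query no longer touches fact_destination, it cannot be blocked by the destination's table lock, regardless of how many months you load in one run.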