Maximum Insert Commit Size

  • 2 July 2010
  • Author: ShawnHarrison

A few days ago, I was working with a client who received the following warning from the OLE DB destinations in his SSIS packages.

[OLE DB Destination [40]] Information: The Maximum insert commit size property of the OLE DB destination "component "OLE DB Destination" (40)" is set to 0. This property setting can cause the running package to stop responding. For more information, see the F1 Help topic for OLE DB

The packages were executing successfully, but he wanted to get rid of the warning. This was an easy fix.

When you use an OLE DB destination and choose 'Table or view - fast load' as the data access mode, you will see the 'Maximum insert commit size' property. It specifies the maximum number of rows that are processed and committed to the destination in a single batch. By default, this property is set to 2147483647. A value of 0 means the entire rowset is committed in one batch at the end of the load, which is why the warning calls it out as a setting that can cause the package to stop responding. There are two ways to change this value.

You can right-click the destination in the data flow task and click Edit. You will see the property on the Connection Manager page of the OLE DB Destination Editor.

[Screenshot: Connection Manager Properties]

The other way to change it is in the Properties pane for the destination. You can find it under the Custom Properties section.

[Screenshot: Properties Pane]

I don't know how the value got changed from the default to 0, but it may have had something to do with the fact that he was converting these packages from DTS to SSIS.
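
For anyone who runs into the same warning, the 'Maximum insert commit size' setting in the editor and the FastLoadMaxInsertCommitSize entry under Custom Properties are the same underlying property, and its value is stored in the package's .dtsx XML along with the other fast load options. A rough sketch of the relevant piece of the component definition is below (attributes trimmed and values illustrative; the exact layout varies by SSIS version):

<component name="OLE DB Destination" componentClassID="...">
  <properties>
    <!-- other fast load settings (FastLoadKeepIdentity, FastLoadKeepNulls, FastLoadOptions) omitted -->
    <!-- 2147483647 is the designer default; a converted package may have 0 here -->
    <property name="FastLoadMaxInsertCommitSize" dataType="System.Int32">2147483647</property>
  </properties>
</component>

Searching the .dtsx files for FastLoadMaxInsertCommitSize is a quick way to find every destination that is still set to 0 without opening each package in the designer.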
