Bulk Insert into SQL Server using SqlBulkCopy

Posted by: Malcolm Sheridan , on 5/27/2009, in Category WinForms & WinRT
Views: 428374
Abstract: The following article demonstrates how to bulk insert data into SQL Server 2008 using SqlBulkCopy.
 
I was recently tasked with updating a SQL Server 2008 database with large amounts of data each day. The task at first seemed daunting: the files exceeded 400,000 records each, and several needed processing daily. I first tried LINQ to SQL, but with that amount of data the inserts were slow, to say the least. Then I remembered the SqlBulkCopy class. SqlBulkCopy lets you efficiently bulk load a SQL Server table with data from another source. It can only write data to SQL Server tables, but the data source is not limited to SQL Server; any source can be used, as long as the data can be loaded into a DataTable instance or read with an IDataReader instance. For this example the file will contain roughly 1,000 records, but the same code can handle much larger amounts of data.
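To make the IDataReader point concrete: WriteToServer also accepts a data reader, which streams rows through without first materialising them all in memory. Here is a minimal sketch of that route, assuming a hypothetical source table SourceCensis in another database; sourceConnectionString and destinationConnectionString are placeholders:

C#

using System.Data.SqlClient;

using (SqlConnection src = new SqlConnection(sourceConnectionString))
using (SqlCommand cmd = new SqlCommand("SELECT * FROM SourceCensis", src))
{
    src.Open();
    using (SqlDataReader reader = cmd.ExecuteReader())
    using (SqlBulkCopy copy = new SqlBulkCopy(destinationConnectionString))
    {
        copy.DestinationTableName = "Censis";
        // Rows are streamed from the reader straight into the destination table.
        copy.WriteToServer(reader);
    }
}

The rest of this article uses the DataTable route, which is the natural fit for a CSV file.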
To begin with, let’s create a table in SQL Server that will hold the data. Run the following T-SQL in SQL Server to create the table:
 
CREATE TABLE [dbo].[Censis](
          [Suburb] [varchar](200) NULL,
          [NotStated] [int] NULL,
          [NotApplicable] [int] NULL,
          [Fishing] [int] NULL,
          [Mining] [int] NULL,
          [Manufacturing] [int] NULL,
          [Electricity] [int] NULL,
          [Construction] [int] NULL
) ON [PRIMARY]
GO
 
The table above will hold Census data that is freely available for download in Australia.
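For reference, the loading code later in this article expects a plain comma-delimited file with one record per line and no header row. An illustrative example (made-up values, not real Census figures), with eight fields per line matching the eight columns of the table above:

Newtown,10,0,12,5,134,9,85
Marrickville,8,2,7,3,120,11,97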
 
The next thing to do is create a console application that will bulk load the data. Open Visual Studio 2008 and choose File > New > Project, then select Windows > Console Application.
 
Before moving on, I need to work backwards and explain the final method that bulk loads the data. SqlBulkCopy has a method called WriteToServer. One of the overloads of this method takes a DataTable as its parameter. Because a DataTable contains rows and columns, it was a logical choice for the task I was facing.
 
Jumping back to the example, we now know we need to create a DataTable that contains the information from the text file. The code below demonstrates how to do this:
 
C#
 
using System.Data;
using System.IO;

DataTable dt = new DataTable();
string line = null;
int i = 0;

using (StreamReader sr = File.OpenText(@"c:\temp\table1.csv"))
{
    while ((line = sr.ReadLine()) != null)
    {
        string[] data = line.Split(',');
        if (data.Length > 0)
        {
            // On the first line, create one (unnamed) column per field.
            if (i == 0)
            {
                foreach (var item in data)
                {
                    dt.Columns.Add(new DataColumn());
                }
                i++;
            }
            // Assign the split values to a new row and add it to the table.
            DataRow row = dt.NewRow();
            row.ItemArray = data;
            dt.Rows.Add(row);
        }
    }
}
 
VB.NET
 
Imports System.Data
Imports System.IO

Dim dt As New DataTable()
Dim line As String = Nothing
Dim i As Integer = 0

Using sr As StreamReader = File.OpenText("c:\temp\table1.csv")
    line = sr.ReadLine()
    Do While line IsNot Nothing
        Dim data() As String = line.Split(","c)
        If data.Length > 0 Then
            ' On the first line, create one (unnamed) column per field.
            If i = 0 Then
                For Each item In data
                    dt.Columns.Add(New DataColumn())
                Next item
                i += 1
            End If
            ' Assign the split values to a new row and add it to the table.
            Dim row As DataRow = dt.NewRow()
            row.ItemArray = data
            dt.Rows.Add(row)
        End If
        line = sr.ReadLine()
    Loop
End Using
 
 
In the code above, I created a DataTable that will store all the information from the CSV file, which resides in the C:\Temp directory. I use a StreamReader to open the file and read it line by line. Each line is split into a string array, and that array is assigned to a new DataRow as its ItemArray value, which sets the values of the row from the array.
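One caveat, raised several times in the comments below: String.Split cannot cope with quoted fields that contain embedded commas, such as 1,"Test1,Test2","Third Column". If your files may contain quoted fields, the TextFieldParser class in the Microsoft.VisualBasic.FileIO namespace (add a reference to the Microsoft.VisualBasic assembly; it is perfectly usable from C#) handles the quoting for you. A minimal sketch of the same loading loop under that assumption:

C#

using Microsoft.VisualBasic.FileIO;
using System.Data;

DataTable dt = new DataTable();
using (TextFieldParser parser = new TextFieldParser(@"c:\temp\table1.csv"))
{
    parser.TextFieldType = FieldType.Delimited;
    parser.SetDelimiters(",");
    parser.HasFieldsEnclosedInQuotes = true; // strips the quotes, keeps embedded commas intact

    while (!parser.EndOfData)
    {
        string[] data = parser.ReadFields();
        // Create one (unnamed) column per field, based on the first line.
        if (dt.Columns.Count == 0)
        {
            foreach (var item in data)
            {
                dt.Columns.Add(new DataColumn());
            }
        }
        DataRow row = dt.NewRow();
        row.ItemArray = data;
        dt.Rows.Add(row);
    }
}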
 
When the file has been read, the next thing to do is use the SqlBulkCopy class to insert the data into SQL Server. The following code demonstrates how to do this:
 
C#
 
using System.Configuration;
using System.Data.SqlClient;

using (SqlConnection cn = new SqlConnection(ConfigurationManager.ConnectionStrings["ConsoleApplication3.Properties.Settings.daasConnectionString"].ConnectionString))
{
    cn.Open();
    using (SqlBulkCopy copy = new SqlBulkCopy(cn))
    {
        // Map the first five source columns to the first five destination columns by index.
        copy.ColumnMappings.Add(0, 0);
        copy.ColumnMappings.Add(1, 1);
        copy.ColumnMappings.Add(2, 2);
        copy.ColumnMappings.Add(3, 3);
        copy.ColumnMappings.Add(4, 4);
        copy.DestinationTableName = "Censis";
        copy.WriteToServer(dt);
    }
}
 
VB.NET
 
Imports System.Configuration
Imports System.Data.SqlClient

Using cn As New SqlConnection(ConfigurationManager.ConnectionStrings("ConsoleApplication3.Properties.Settings.daasConnectionString").ConnectionString)
    cn.Open()
    Using copy As New SqlBulkCopy(cn)
        ' Map the first five source columns to the first five destination columns by index.
        copy.ColumnMappings.Add(0, 0)
        copy.ColumnMappings.Add(1, 1)
        copy.ColumnMappings.Add(2, 2)
        copy.ColumnMappings.Add(3, 3)
        copy.ColumnMappings.Add(4, 4)
        copy.DestinationTableName = "Censis"
        copy.WriteToServer(dt)
    End Using
End Using
 
SqlBulkCopy uses ADO.NET to connect to a database and bulk load the data. I created a SqlConnection object, and that object reference is used to create the SqlBulkCopy object. The DestinationTableName property references the table in the database where the data is to be loaded. A handy feature of SqlBulkCopy is the SqlBulkCopyColumnMappingCollection. Column mappings define the relationships between columns in the data source and columns in the destination, which is useful when the source file has columns that don’t need to be inserted into the database. Column mappings can be set by index, as in the example above, or by column name. Using the index is handy when you’re working with files that contain no column names. Finally, the data is sent to the database by calling the WriteToServer method.
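As an aside (one commenter below asks about this), the five index-based mappings above could just as easily be added in a loop, and mappings can also be declared by column name. A short sketch of both styles; this would sit inside the same using block as the copy object above, and the name-based lines assume you gave the source DataTable columns those names when creating them:

C#

// Index-based mappings added in a loop: source columns 0..4 map to destination columns 0..4.
for (int col = 0; col < 5; col++)
{
    copy.ColumnMappings.Add(col, col);
}

// Alternatively, name-based mappings (assuming the source columns were named to match the table).
copy.ColumnMappings.Add("Suburb", "Suburb");
copy.ColumnMappings.Add("NotStated", "NotStated");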
 

The SqlBulkCopy class has been around since .NET 2.0, but it is easy to forget about as the languages move forward. It is one object you should keep in mind whenever you need to transfer large amounts of data. The entire source code of this article can be downloaded over here.
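Several commenters below ask about loading millions of rows. Tuning at that scale is beyond this article, but SqlBulkCopy does expose a few settings worth knowing about: BatchSize commits rows in chunks rather than as one enormous batch, BulkCopyTimeout lifts the default 30-second operation limit, and SqlBulkCopyOptions.TableLock takes a bulk update lock on the destination table instead of row locks. A minimal sketch, assuming connectionString holds the same connection string used earlier:

C#

using (SqlBulkCopy copy = new SqlBulkCopy(connectionString, SqlBulkCopyOptions.TableLock))
{
    copy.DestinationTableName = "Censis";
    copy.BatchSize = 10000;   // commit every 10,000 rows instead of one large batch
    copy.BulkCopyTimeout = 0; // 0 means no timeout; the default is 30 seconds
    copy.WriteToServer(dt);
}

As one commenter notes, dropping or disabling indexes on the destination table before the load (and rebuilding them afterwards) also makes a large difference as the table grows.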

This article has been editorially reviewed by Suprotim Agarwal.



Author
Malcolm Sheridan is a Microsoft awarded MVP in ASP.NET, a Telerik Insider and a regular presenter at conferences and user groups throughout Australia and New Zealand. Being an ASP.NET guy, his focus is on web technologies and has been for the past 10 years. He loves working with ASP.NET MVC these days and also loves getting his hands dirty with jQuery and JavaScript. He also writes technical articles on ASP.NET for SitePoint and various other websites. Follow him on Twitter @malcolmsheridan






Feedback - Leave us some adulation, criticism and everything in between!
Comment posted by Peeyush on Wednesday, May 27, 2009 2:48 AM
I need to upload a CSV to the server, and insert the data using a stored procedure. I cannot use Bulk Insert or SqlBulkCopy, since there are some conditions for the insert, and it involves 5 tables where data is inserted. How can this be done?
Thanks
Comment posted by malcolm sheridan on Wednesday, May 27, 2009 5:20 AM
@Peeyush
If you're dealing with large amounts of data, then I would recommend using SqlBulkCopy.  If you're not dealing with large data, then you could use LINQ to SQL  to insert the data and test your conditions inside your class.
Comment posted by shiraabr on Wednesday, May 27, 2009 7:39 AM
Thanks! this is exactly what I was looking for!
Comment posted by Adnan on Wednesday, May 27, 2009 8:57 AM
Thanks for telling about bulkcopysql
Comment posted by Peeyush on Wednesday, May 27, 2009 11:55 AM
Thanks Malcolm,
The CSVs would be large enough. How will SqlBulkCopy help me to test the conditions and then do an insert? Can you point me to some article?
Comment posted by Morteza on Thursday, May 28, 2009 4:02 PM
Thanks. By the way, SQL Server 2008 has a new table-valued datatype that can be passed to a stored procedure as a parameter. It's ideal for batch inserting and processing.
Comment posted by Gali.Malli on Monday, June 29, 2009 6:22 AM
Many thanks for giving this much useful information with no confusion.
Comment posted by Teena Soni on Friday, July 31, 2009 7:42 AM
I used your code showing how to insert multiple rows into SQL Server and it works successfully, thanks.
Comment posted by preeti mandovra on Friday, July 31, 2009 7:44 AM
The code ran successfully.
Comment posted by lokesh on Sunday, August 23, 2009 3:05 PM
Dear Malcolm,

I tested the code and it is reading only one line. So I initialised i = 0 to read the next value. After you increment i++ the value will be 1, and it will not become 0 until you make it zero to read the next line:

if (data.Length > 0)
{
    i = 0;
    if (i == 0)
    {
        foreach (var item in data)
        {
            dt.Columns.Add(new DataColumn());
        }
        i++;
    }
}
Comment posted by TAmilMaran on Thursday, October 22, 2009 12:23 AM
I want some help. I have already inserted the data using bulk copy; now I want to update the data. Is there any chance to update using bulk copy?
Comment posted by Shah Rukh Khan on Friday, October 23, 2009 9:47 AM
Hey ...nnnniiice .....tallllkkkkkk.........
Comment posted by PeteGreg on Wednesday, October 28, 2009 8:43 PM
Very nice!

One thing to note: if your source CSV has a header row, you want to skip adding that as a DataRow after you create the columns...

if (i == 0)
{
    foreach (var item in data)
    {
        dt.Columns.Add(new DataColumn());
    }
    i++;
}
else
{
    DataRow row = dt.NewRow();
    row.ItemArray = data;
    dt.Rows.Add(row);
}
Comment posted by Mike Murphy on Monday, January 4, 2010 8:24 AM
Do you know if the SqlBulkCopy uses the SQLBulkLoad COM object in the background? We are using SQLXMLBulkload now and might consider implementing SqlBulkCopy as long as its not a wrapper using the same technology we're using now. ;-)
Comment posted by Bijayani on Monday, January 25, 2010 8:12 AM
Hi,

I happened to this post and would like to share a link for some knowledge where a software engineer has shared a tip on "Insert records from CSV file to SQL table using VB.NET".

Here is the link:
http://www.mindfiresolutions.com/insert-records-from-csv-file-to-sql-table-using-vbnet-766.php

Hope you find it useful and of assistance.

Thanks,
Bijayani
Comment posted by Eddie on Monday, February 8, 2010 8:32 PM
Great article but had a question. I am running this and works great until my table starts getting very large. Currently it contains about 80 million records. I have removed all indexes (aside from PK) and i have tried different methods but keep finding myself with a very long delay. It takes about 3+ mins to bulk insert about 50 thousand records. Any ideas on what the issue might be? Something on the table that is turned on or in the database. Appreciate any and all information.
Comment posted by md on Tuesday, February 23, 2010 4:58 AM
i want t add my local table to sql server table and i also want to check for multiple copies please help me
Comment posted by Malcolm Sheridan on Thursday, March 18, 2010 4:37 AM
@Eddie
80 million rows is a lot of data. Perhaps with a question like that you'd be better off asking a database developer.
Comment posted by James on Monday, March 29, 2010 3:08 PM
This is a great article. However, it does not handle the case of a csv file with data that may contain ","s. For instance, a row like this:
1, "Test1,Test2", "Third Column"

This is a pretty standard case. Can anyone think of a quick bit of code to allow for qualifiers like this?
Comment posted by Tim on Wednesday, May 19, 2010 1:30 PM
Great article. I was able to insert 300K plus rows, but what about, say, 3 million to 20 million? Has anyone done that?
Comment posted by Jeremy on Tuesday, July 20, 2010 10:29 AM
Very nice. I had been struggling with inserting data from an Excel file. Your code helped a lot; I was just missing the column mapping.
Comment posted by Haresh Ambaliya on Sunday, August 22, 2010 6:46 AM
Hi,
I find error in bulk copy operation
Error: The locale id '0' of the source column 'StandardName' and the locale id '1033' of the destination column 'StandardName' do not match. I am copying from SQL 7 to SQL 2000/2005.
can anyone help me?
Comment posted by Kevin on Monday, October 18, 2010 11:19 PM
you are a genius my friend.
Thanks a million for this article, I have a bulk update to do every 10 minutes and my test data was taking approx 90 seconds to be entered into the DB, the same data is now taking 0.8 seconds
Cheers
Comment posted by abc on Thursday, October 28, 2010 4:40 AM
When I upload the CSV file it has double inverted commas; I want to remove those commas and upload it. If that is possible, please post the code.
Comment posted by Abdul Rajak Samsudeen on Wednesday, December 22, 2010 6:48 AM
Thank you so much ....
Comment posted by Aswath on Monday, January 3, 2011 1:26 AM
Good one.....Really Helped.
Comment posted by Vinoth Ganesh on Wednesday, April 20, 2011 7:11 AM
Thanks a lot Malcolm. I am doing an application where we are supposed to bulk insert into many tables which have a parent-child relationship. Is it possible to map a single CSV file to multiple table columns in a single bulk import operation?
Comment posted by GreenTea on Thursday, May 19, 2011 10:34 PM
Hi there,
Great article, it helped me a lot with my school project. However, how do I make the program overwrite existing data? It just keeps appending the data. Thanks
Comment posted by Gravitas on Tuesday, May 31, 2011 12:25 PM
Excellent article - it works - thank you!
Comment posted by Gravitas on Tuesday, May 31, 2011 12:47 PM
A quick test of this code revealed that this technique writes 50,000 rows per second. I have a pretty standard PC.
Comment posted by Gravitas on Tuesday, May 31, 2011 12:48 PM
Make sure you have indexing turned off, and no primary keys, or else the write speed will slow down as the database gets larger. Use horizontal partitioning, by date, to speed up retrieval.
Comment posted by srinu on Thursday, June 2, 2011 5:19 AM
Please add the CSV file format... thanks for providing this code.
Comment posted by bhupal shetty on Tuesday, June 14, 2011 1:09 AM
Great article it was. It helped a lot of people who are working with bulk data insertion into the database.
Comment posted by miri on Thursday, August 11, 2011 2:52 AM
I use the bulk copy insert. I have a file that contains character fields with inverted commas and it does not work well. Any ideas?

Comment posted by Bruce on Thursday, September 29, 2011 10:42 AM
Thank you very much for this - just what I was looking for!
Comment posted by Rashmi Kant on Tuesday, November 1, 2011 7:35 AM
use the following link for bulk insertion using XML from asp.net
http://oops-solution.blogspot.com/2011/11/aspnetmultiple-insertion-of-record-into.html
Comment posted by Dru on Tuesday, November 22, 2011 5:38 AM
TextFieldParser is an easier way to parse CSVs, by the way.
Comment posted by Brian on Thursday, January 5, 2012 9:36 AM
Does the bulk insertion adhere to locks, thus locking the records and preventing reads until the last record is inserted? I was troubleshooting some code that I know uses bulk importing from .NET, but I was able to read dirty records while the data was importing. This was concerning.

For Peeyush, the process would be like this:

1) Import the data into a staging table that is a copy of your CSV file.
2) Run a stored procedure that processes the data (hopefully using sets, not one record at a time) into the 5 other tables you need.

Comment posted by RyanKiller01 on Tuesday, March 13, 2012 8:50 AM
Hi there. Is it possible to add the ColumnMappings in a loop so that the adding does not get very long?
Comment posted by Albert on Monday, May 7, 2012 11:10 PM
Thannnnnnnnnnnnnnnnnnnnnnnnnnnnkkkkkkkkssssssss  ! ! ! !
Comment posted by raj on Wednesday, August 22, 2012 5:55 AM
easy way to do it

http://sqlcopy.blogspot.in/
Comment posted by Mansour on Sunday, September 9, 2012 12:55 AM
I have an issue while uploading the Excel sheet: if the Excel column name has a dot, for example [Val.Type], I get an error saying "The given ColumnMapping does not match up with any column in the source or destination."
Comment posted by alao adebisi on Monday, September 24, 2012 4:47 AM
Please, how do I use SqlBulkCopy to move records from one single table that has many fields to three different tables that use a foreign key to the one with the larger fields? Thanks.
Comment posted by Anonimo - Colombia on Saturday, May 11, 2013 6:06 PM
Thank you very much. It optimized my process from 10 seconds to milliseconds.
Comment posted by Thank you so much!!! on Sunday, May 12, 2013 4:42 PM
This section of code is a life saver for me. I have about 15 million+ records which need to be placed in a SQL database. One quick question if I may: many CSV files will have double quotes as part of the data. How would we strip those out of the upload?

Thanks again!!!
Comment posted by Khuram Jan on Friday, January 3, 2014 4:34 AM
I have a Date column in the CSV file in the format dd.mm.yyyy, i.e. 30.02.2013 etc. How can we format it using SqlBulkCopy to insert the data into SQL Server (ver 2005)?
thanks
Comment posted by Gilbert on Wednesday, January 29, 2014 9:50 AM
VB.NET code to insert multiple items at once, and also code to upload an Excel file of student marks into a SQL database.
Comment posted by Sajjan Kumar on Wednesday, July 9, 2014 1:50 AM
SqlBulkCopy with updating existing data using c# code
Comment posted by SA on Monday, July 21, 2014 1:31 PM
Thanks
this code helped me a lot with SqlBulkCopy
Comment posted by Carab on Friday, January 23, 2015 12:44 PM
Thanks. It is helpful, but I have one problem: when I want to upload a large CSV file I get this error: "'System.OutOfMemoryException' occurred in mscorlib.dll". Is there a way to change the code? Thanks.
Comment posted by malathi on Tuesday, March 10, 2015 5:50 AM
I have a table with two columns only, together forming the primary key. SqlBulkCopy through .NET says the column cannot insert DBNull.Value.