Comment posted by
Peeyush
on Wednesday, May 27, 2009 2:48 AM
|
|
I need to upload a CSV to the server and insert the data using a stored procedure. I cannot use BULK INSERT or SqlBulkCopy, since there are some conditions for the insert, and it involves 5 tables where data is inserted. How can this be done?
Thanks
|
Comment posted by
malcolm sheridan
on Wednesday, May 27, 2009 5:20 AM
|
|
@Peeyush
If you're dealing with large amounts of data, then I would recommend using SqlBulkCopy. If you're not dealing with large data, then you could use LINQ to SQL to insert the data and test your conditions inside your class.
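For the smaller-data route, a minimal LINQ to SQL sketch might look like the following (the DataContext, entity, and property names are placeholders, and csvRows is assumed to be the parsed lines from the file):
using (var db = new MyDataContext(connectionString))
{
    foreach (string[] data in csvRows)
    {
        // apply whatever business conditions you need before inserting
        if (!string.IsNullOrEmpty(data[0]))
        {
            db.Customers.InsertOnSubmit(new Customer
            {
                FirstName = data[0],
                LastName = data[1]
            });
        }
    }
    db.SubmitChanges();   // one round trip at the end
}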
|
Comment posted by
shiraabr
on Wednesday, May 27, 2009 7:39 AM
|
|
Thanks! This is exactly what I was looking for!
|
Comment posted by
Adnan
on Wednesday, May 27, 2009 8:57 AM
|
|
Thanks for telling us about SqlBulkCopy.
|
Comment posted by
Peeyush
on Wednesday, May 27, 2009 11:55 AM
|
|
Thanks Malcolm,
The CSVs would be large enough. How will SqlBulkCopy help me to test the conditions and then do an insert? Can you point me to some article?
|
Comment posted by
Morteza
on Thursday, May 28, 2009 4:02 PM
|
|
Thanks. By the way, SQL Server 2008 has a new table-valued datatype that can be passed to a stored procedure as a parameter. It's ideal for batch inserting and processing.
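A rough sketch of that approach (the type, procedure, and column names below are made up for illustration; it assumes the dt DataTable built from the CSV, plus a table type and a stored procedure with a READONLY table parameter already created on the server):
using (var conn = new SqlConnection(connectionString))
using (var cmd = new SqlCommand("dbo.ImportCsvRows", conn))
{
    cmd.CommandType = CommandType.StoredProcedure;

    // pass the whole DataTable as a single table-valued parameter
    SqlParameter rows = cmd.Parameters.AddWithValue("@Rows", dt);
    rows.SqlDbType = SqlDbType.Structured;
    rows.TypeName = "dbo.CsvRowType";   // the table type defined on the server

    conn.Open();
    cmd.ExecuteNonQuery();
}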
|
Comment posted by
Gali.Malli
on Monday, June 29, 2009 6:22 AM
|
|
Many thanks for giving this much useful information with no confusion.
|
Comment posted by
Teena Soni
on Friday, July 31, 2009 7:42 AM
|
|
I used your code showing how to insert multiple rows into SQL Server and it works successfully, thanks.
|
Comment posted by
preeti mandovra
on Friday, July 31, 2009 7:44 AM
|
|
The code ran successfully.
|
Comment posted by
lokesh
on Sunday, August 23, 2009 3:05 PM
|
|
Dear Malcolm,
I tested the code and it is reading only one line, so I initialised i = 0 to read the next value. After you increment i++, the value will be 1, and it will not become 0 again until you set it back to zero to read the next line:
if (data.Length > 0)
{
    i = 0;
    if (i == 0)
    {
        foreach (var item in data)
        {
            dt.Columns.Add(new DataColumn());
        }
        i++;
    }
|
Comment posted by
TAmilMaran
on Thursday, October 22, 2009 12:23 AM
|
|
I want some help. I already inserted the data using bulk copy; now I want to update the data. Is there any way to update using bulk copy?
|
Comment posted by
Shah Rukh Khan
on Friday, October 23, 2009 9:47 AM
|
|
Hey... nice talk!
|
Comment posted by
PeteGreg
on Wednesday, October 28, 2009 8:43 PM
|
|
Very nice!
One thing to note: if your source CSV has a header row, you want to skip adding it as a DataRow after you create the columns...
if (i == 0)
{
    // first line is the header: create the columns only
    foreach (var item in data)
    {
        dt.Columns.Add(new DataColumn());
    }
    i++;
}
else
{
    // every other line becomes a data row
    DataRow row = dt.NewRow();
    row.ItemArray = data;
    dt.Rows.Add(row);
}
|
Comment posted by
Mike Murphy
on Monday, January 4, 2010 8:24 AM
|
|
Do you know if SqlBulkCopy uses the SQLBulkLoad COM object in the background? We are using SQLXMLBulkload now and might consider implementing SqlBulkCopy as long as it's not a wrapper using the same technology we're using now. ;-)
|
Comment posted by
Bijayani
on Monday, January 25, 2010 8:12 AM
|
|
Hi,
I happened upon this post and would like to share a link where a software engineer has shared a tip on "Insert records from CSV file to SQL table using VB.NET".
Here is the link:
http://www.mindfiresolutions.com/insert-records-from-csv-file-to-sql-table-using-vbnet-766.php
Hope you find it useful and of assistance.
Thanks,
Bijayani
|
Comment posted by
Eddie
on Monday, February 8, 2010 8:32 PM
|
|
Great article, but I had a question. I am running this and it works great until my table starts getting very large. Currently it contains about 80 million records. I have removed all indexes (aside from the PK) and I have tried different methods, but I keep finding myself with a very long delay. It takes about 3+ minutes to bulk insert about 50 thousand records. Any ideas on what the issue might be? Something on the table that is turned on, or in the database? Appreciate any and all information.
|
Comment posted by
md
on Tuesday, February 23, 2010 4:58 AM
|
|
I want to add my local table to a SQL Server table, and I also want to check for duplicate copies. Please help me.
|
Comment posted by
Malcolm Sheridan
on Thursday, March 18, 2010 4:37 AM
|
|
@Eddie
80 million rows is a lot of data. Perhaps with a question like that you'd be better off asking a database developer.
|
Comment posted by
James
on Monday, March 29, 2010 3:08 PM
|
|
This is a great article. However, it does not handle the case of a CSV file with data that may contain commas inside quoted fields. For instance, a row like this:
1, "Test1,Test2", "Third Column"
This is a pretty standard case. Can anyone think of a quick bit of code to allow for qualifiers like this?
|
Comment posted by
Tim
on Wednesday, May 19, 2010 1:30 PM
|
|
Great article. I was able to insert 300K+ rows, but what about, say, 3 million to 20 million? Has anyone done that?
|
Comment posted by
Jeremy
on Tuesday, July 20, 2010 10:29 AM
|
|
Very nice. I had been struggling with inserting data from an Excel file. Your code helped a lot. I was just missing the column mapping.
|
Comment posted by
Haresh Ambaliya
on Sunday, August 22, 2010 6:46 AM
|
|
Hi,
I get an error in the bulk copy operation.
Error: The locale id '0' of the source column 'StandardName' and the locale id '1033' of the destination column 'StandardName' do not match. I am copying from SQL 7 to SQL 2000/2005.
Can anyone help me?
|
Comment posted by
Kevin
on Monday, October 18, 2010 11:19 PM
|
|
You are a genius, my friend.
Thanks a million for this article. I have a bulk update to do every 10 minutes, and my test data was taking approx 90 seconds to be entered into the DB; the same data is now taking 0.8 seconds.
Cheers
|
Comment posted by
abc
on Thursday, October 28, 2010 4:40 AM
|
|
When I upload the CSV file it has double quotes around the values; I want to remove those quotes and then upload it.
If that is possible, please post that code.
|
Comment posted by
Abdul Rajak Samsudeen
on Wednesday, December 22, 2010 6:48 AM
|
|
Thank you so much ....
|
Comment posted by
Aswath
on Monday, January 3, 2011 1:26 AM
|
|
Good one.....Really Helped.
|
Comment posted by
Vinoth Ganesh
on Wednesday, April 20, 2011 7:11 AM
|
|
Thanks a lot Malcolm. I am working on an application where we are supposed to bulk insert into many tables which have a parent-child relationship. Is it possible to map a single CSV file to multiple table columns in a single bulk import operation?
|
Comment posted by
GreenTea
on Thursday, May 19, 2011 10:34 PM
|
|
Hi there,
Great article, it helps me a lot with my school project. However, how do I make the program overwrite existing data? This program just keeps appending the data. How do I make it overwrite the existing data? Thanks.
|
Comment posted by
Gravitas
on Tuesday, May 31, 2011 12:25 PM
|
|
Excellent article - it works - thank you!
|
Comment posted by
Gravitas
on Tuesday, May 31, 2011 12:47 PM
|
|
A quick test of this code revealed that this technique writes 50,000 rows per second. I have a pretty standard PC.
|
Comment posted by
Gravitas
on Tuesday, May 31, 2011 12:48 PM
|
|
Make sure you have indexing turned off, and no primary keys, or else the write speed will slow down as the database gets larger. Use horizontal partitioning, by date, to speed up retrieval.
|
Comment posted by
srinu
on Thursday, June 2, 2011 5:19 AM
|
|
Please add the CSV file format. Thanks for providing this code.
|
Comment posted by
bhupal shetty
on Tuesday, June 14, 2011 1:09 AM
|
|
Great article it was. It has helped a lot of people who are working with bulk data insertion into the database.
|
Comment posted by
miri
on Thursday, August 11, 2011 2:52 AM
|
|
I use the bulk copy insert. I have a file that contains character fields with quotation marks, and it does not work well.
Any ideas?
|
Comment posted by
Bruce
on Thursday, September 29, 2011 10:42 AM
|
|
Thank you very much for this - just what I was looking for!
|
Comment posted by
Rashmi Kant
on Tuesday, November 1, 2011 7:35 AM
|
|
Use the following link for bulk insertion using XML from ASP.NET:
http://oops-solution.blogspot.com/2011/11/aspnetmultiple-insertion-of-record-into.html
|
Comment posted by
Dru
on Tuesday, November 22, 2011 5:38 AM
|
|
By the way, TextFieldParser is an easier way to parse CSVs.
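A small sketch of using it for the quoted-comma case James mentioned above (the file path is only an example, and it assumes a reference to the Microsoft.VisualBasic assembly):
using Microsoft.VisualBasic.FileIO;

using (var parser = new TextFieldParser(@"C:\temp\data.csv"))
{
    parser.TextFieldType = FieldType.Delimited;
    parser.SetDelimiters(",");
    parser.HasFieldsEnclosedInQuotes = true;   // handles values like "Test1,Test2"

    while (!parser.EndOfData)
    {
        string[] fields = parser.ReadFields(); // surrounding quotes are stripped
        // add fields to the DataTable here, as in the article's code
    }
}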
|
Comment posted by
Brian
on Thursday, January 5, 2012 9:36 AM
|
|
Does the bulk insertion adhere to locks, thus locking the records and preventing reads until the last record is inserted? I was troubleshooting some code that I know uses bulk importing from .NET, but I was able to read dirty records while the data was importing. That was concerning.
For Peeyush, the process would be like this (a rough sketch follows):
1) Import the data into a staging table that is a copy of your CSV file.
2) Run a stored procedure that processes the data (hopefully using sets, not one record at a time) into the 5 other tables you need.
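A minimal sketch of that staging-table flow, assuming a staging table named dbo.CsvStaging and a stored procedure named dbo.usp_ProcessCsvStaging (both names are hypothetical), with dt being the DataTable built from the CSV:
using (var conn = new SqlConnection(connectionString))
{
    conn.Open();

    // 1) bulk copy the CSV rows into the staging table
    using (var bulk = new SqlBulkCopy(conn))
    {
        bulk.DestinationTableName = "dbo.CsvStaging";
        bulk.WriteToServer(dt);
    }

    // 2) let the stored procedure apply the conditions and fan the rows
    //    out into the destination tables with set-based statements
    using (var cmd = new SqlCommand("dbo.usp_ProcessCsvStaging", conn))
    {
        cmd.CommandType = CommandType.StoredProcedure;
        cmd.ExecuteNonQuery();
    }
}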
|
Comment posted by
RyanKiller01
on Tuesday, March 13, 2012 8:50 AM
|
|
Hi there. Is it possible to build the ColumnMappings in a loop so that the adding code does not get very long?
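A possible sketch, assuming the DataTable columns were given the same names as the destination columns (for example, taken from the CSV header row):
foreach (DataColumn col in dt.Columns)
{
    // map each source column to the destination column of the same name
    bulkCopy.ColumnMappings.Add(col.ColumnName, col.ColumnName);
}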
|
Comment posted by
Albert
on Monday, May 7, 2012 11:10 PM
|
|
Thanks!!!!
|
Comment posted by
raj
on Wednesday, August 22, 2012 5:55 AM
|
|
An easy way to do it:
http://sqlcopy.blogspot.in/
|
Comment posted by
Mansour
on Sunday, September 9, 2012 12:55 AM
|
|
I have an issue while uploading the Excel sheet. If the Excel column name has a dot, for example [Val.Type], I get an error saying "The given ColumnMapping does not match up with any column in the source or destination."
|
Comment posted by
alao adebisi
on Monday, September 24, 2012 4:47 AM
|
|
Please, how do I use SqlBulkCopy to move records from one single table that has many fields into three different tables that use a foreign key of the one with the larger set of fields? Thanks.
|
Comment posted by
Anonimo - Colombia
on Saturday, May 11, 2013 6:06 PM
|
|
Thank you very much. It optimized my process from 10 seconds down to milliseconds.
|
Comment posted by
Thank you so much!!!
on Sunday, May 12, 2013 4:42 PM
|
|
This section of code is a life saver for me. I have about 15 million+ records which need to be placed in a SQL database. One quick question if I may: many CSV files will have double quotes as part of the data. How would we strip those out of the upload?
Thanks again!!!
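One lightweight option (a sketch, assuming data is the string[] produced by splitting each CSV line, as in the article's code) is to trim the surrounding quotes before building the row; TextFieldParser, mentioned earlier in the comments, also strips them automatically:
// requires using System.Linq;
string[] cleaned = data.Select(field => field.Trim().Trim('"')).ToArray();

DataRow row = dt.NewRow();
row.ItemArray = cleaned;
dt.Rows.Add(row);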
|
Comment posted by
Khuram Jan
on Friday, January 3, 2014 4:34 AM
|
|
I have a Date column in the CSV file in the format dd.mm.yyyy, i.e. 30.02.2013, etc.
How can we format it using SqlBulkCopy to insert the data into SQL Server (ver 2005)?
thanks
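One possible approach (an untested sketch; the OrderDate column name and the dateText variable are placeholders): give the DataTable a typed DateTime column and parse each value with the exact format, so SqlBulkCopy sends a real datetime instead of a string:
// requires using System.Globalization;
dt.Columns.Add(new DataColumn("OrderDate", typeof(DateTime)));

// for each CSV field holding a date written as dd.MM.yyyy
DateTime parsed = DateTime.ParseExact(dateText, "dd.MM.yyyy",
                                      CultureInfo.InvariantCulture);
row["OrderDate"] = parsed;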
|
Comment posted by
Gilbert
on Wednesday, January 29, 2014 9:50 AM
|
|
Looking for VB.NET code to insert more than one item at once, and also code to upload an Excel file of student marks into a SQL database.
|
Comment posted by
Sajjan Kumar
on Wednesday, July 9, 2014 1:50 AM
|
|
SqlBulkCopy with updating existing data using C# code.
|
Comment posted by
SA
on Monday, July 21, 2014 1:31 PM
|
|
Thanks,
this SqlBulkCopy code helps me a lot.
|
Comment posted by
Carab
on Friday, January 23, 2015 12:44 PM
|
|
Thanks. It is helpful, but I have one problem: when I want to upload a large CSV file I get this error: "'system.outofmemoryexception' occurred in mscorlib.dll"
Is there a way to change the code? Thanks.
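One common fix (a sketch, not drop-in code; the path, table name, and CreateEmptyTable helper are assumptions): instead of loading the whole CSV into one DataTable, read the file in chunks and call WriteToServer once per chunk, so only a bounded number of rows is held in memory:
const int batchSize = 10000;                         // rows kept in memory per chunk
using (var reader = new StreamReader(@"C:\temp\large.csv"))
using (var bulk = new SqlBulkCopy(connectionString))
{
    bulk.DestinationTableName = "dbo.CsvData";

    DataTable dt = CreateEmptyTable();               // assumed helper: columns only, no rows
    string line;
    while ((line = reader.ReadLine()) != null)
    {
        dt.Rows.Add(line.Split(','));                // naive split, as in the article
        if (dt.Rows.Count >= batchSize)
        {
            bulk.WriteToServer(dt);
            dt.Clear();                              // release the rows just sent
        }
    }
    if (dt.Rows.Count > 0)
        bulk.WriteToServer(dt);                      // flush the final partial batch
}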
|
Comment posted by
malathi
on Tuesday, March 10, 2015 5:50 AM
|
|
I have a table with only two columns, which together form the primary key. SqlBulkCopy through .NET says the column cannot insert DBNull.Value.
|
|