SQL Insert from text file



I am taking a DB management class, but I would really like to play with the database at home, so I am trying to copy the DBs to my computer. I did a simple SELECT * FROM COLLECT and saved the results as a CSV, so I have all the records.

At home I created the table:

create table collection
(
BOOK_ID CHAR(5) NOT NULL,
SCAN_CODE CHAR(6) NOT NULL,
LIBRARY CHAR(8) NOT NULL,
BOOK_STATUS CHAR(1) NOT NULL,
DATE_LAST_OUT DATETIME NOT NULL,
TIMES_CHECKED_OUT NUMERIC(4) NOT NULL,
CHECK_OUT_INTERVAL CHAR(1) NOT NULL,
COST NUMERIC(7,2) NULL,
YEAR_ACQUIRED CHAR(4) NULL,
LAST_DATE_UPDATED DATETIME NOT NULL
)

Then I tried to insert the data, but I get errors and have no idea what I need to do:

BULK INSERT collection 
FROM 'C:\DB\collection.csv'
WITH
(
FIELDTERMINATOR = ','
)

Errors

Msg 4863, Level 16, State 1, Line 1

Bulk load data conversion error (truncation) for row 1, column 1 (BOOK_ID).

Msg 4866, Level 16, State 8, Line 1

The bulk load failed. The column is too long in the data file for row 1, column 10. Verify that the field terminator and row terminator are specified correctly.

Msg 7399, Level 16, State 1, Line 1

The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.

Msg 7330, Level 16, State 2, Line 1

Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)"

Sample of the data:
00220,A63982,JCLS,R,1999-07-20 00:00:00.000,5,C,12.59,1981,1990-09-04 00:00:00.000

00807,A64129,JCLS,I,1999-08-02 00:00:00.000,6,C,NULL,1973,1990-10-22 00:00:00.000

00206,A65257,JCLS,I,1999-07-12 00:00:00.000,1,C,NULL,1980,1991-01-30 00:00:00.000

00931,A65258,JCLS,I,1999-12-22 00:00:00.000,9,B,8.95,1975,1990-03-07 00:00:00.000

00669,A65382,JCLS,O,1999-12-19 00:00:00.000,3,C,7.95,1971,1990-09-01 00:00:00.000
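A hedged guess at the cause, for the record: Msg 4863 on row 1, column 1 usually means the first line of the file is a header row (the string "BOOK_ID" is seven characters, too long for CHAR(5)), and Msg 4866 on the last column often means the row terminator doesn't match the file. Here is a sketch of the same statement with both specified explicitly; the FIRSTROW and ROWTERMINATOR values are assumptions about how the file was saved:

-- Sketch only: assumes the export has a header row and Windows line endings.
BULK INSERT collection
FROM 'C:\DB\collection.csv'
WITH
(
FIELDTERMINATOR = ',',
ROWTERMINATOR = '\n',  -- BULK INSERT reads '\n' as CR+LF; adjust if needed
FIRSTROW = 2           -- skip the header row, if one was exported
)

Note also that the literal string NULL in the COST field of the sample rows will not be converted to a real NULL by BULK INSERT; a common workaround is to bulk load into a staging table whose columns are all VARCHAR, then INSERT ... SELECT into collection with NULLIF(COST, 'NULL').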

Thanks



First of all, I want to demonstrate an example.

A file (Datafile.txt) contains some fields separated by ",":

420,Ramesh,8,10000,999
421,Rakesh,8,10000,1000
422,Rajesh,8,10000,1001

The first thing is to read this file in its exact format, field by field, and pass those values to an INSERT SQL command. I am using Visual Basic 2005.

Check that the file format is valid and read the values:

Private Sub Button3_Click(ByVal sender As System.Object, ByVal e As System.EventArgs) Handles Button3.Click
    ' Open the file with a TextFieldParser, which reads delimited fields.
    ' Note the file name must be a string literal (in quotes).
    Dim tfp As FileIO.TextFieldParser = My.Computer.FileSystem.OpenTextFieldParser("Datafile.txt")

    tfp.Delimiters = New String() {","}
    tfp.HasFieldsEnclosedInQuotes = False

    ' Read one record at a time and pass its fields to the INSERT.
    While Not tfp.EndOfData
        Dim Fields() As String = tfp.ReadFields()
        InsertSQL(Fields)
    End While
End Sub

Private Sub InsertSQL(ByVal Values() As String)
    Dim cmd As New SqlClient.SqlCommand()
    cmd.CommandText = "Insert Into Student (StudentID,StudentName,Semester,Fees,RecNo) Values(@ID,@Name,@Sem,@Fees,@RecNo)"

    ' Connection string of the database -- adjust this for your own setup.
    Dim MySQLConnection As New System.Data.SqlClient.SqlConnection("Data Source=.\SQLEXPRESS;AttachDbFilename=|DataDirectory|\Database1.mdf;Integrated Security=True;User Instance=True")
    cmd.Connection = MySQLConnection

    ' One parameter per field read from the file.
    cmd.Parameters.Add(New SqlClient.SqlParameter("@ID", Values(0)))
    cmd.Parameters.Add(New SqlClient.SqlParameter("@Name", Values(1)))
    cmd.Parameters.Add(New SqlClient.SqlParameter("@Sem", Values(2)))
    cmd.Parameters.Add(New SqlClient.SqlParameter("@Fees", Values(3)))
    cmd.Parameters.Add(New SqlClient.SqlParameter("@RecNo", Values(4)))

    MySQLConnection.Open()
    cmd.ExecuteNonQuery()
    MySQLConnection.Close()
End Sub

I hope you can understand this. If there is any problem regarding the code, you may post your questions.


Sorry for not replying sooner; I was away for most of two weeks.

First things first: congrats Neo on writing far better stuff than I would have expected to see - specifically, using parameterized queries instead of string concatenation (even though no SqlDbType is specified for the parameters), and no buggy homebrew CSV parser.

However, it's still a VERY minimalistic (or quick and dirty) approach... Lots of little issues:

- no error handling at all, anywhere

- no comments, documentation, or anything like that (even though it's a short snippet)

- the loop REALLY should sit between the connection open/close, just executing the query multiple times (don't repeat the whole create connection/open/query/close process for every single row; there's no point in passing parameters around like that, or even in calling separate subs here)

- some of it is database specific (like the connection string and provider), and he hasn't said what database he's using (some things change a lot from one database to another)

- again, not just basic error handling, but handling of almost anything that could go wrong, like bad values in a CSV file (it always happens). If there is a DBNull, or a string where an int should be, something will go wrong and the whole thing will crash hard, without any really useful error message. It should at least handle things like DBNulls, and there really ought to be Try/Catch blocks.

Also, you're assuming he has not only basic programming knowledge, but specific knowledge of the .NET Framework and VB: enough to recreate the app (which really should have been a console app, IMO) and to tweak the code - new queries, proper parameters, and all - for every single table.

Lots of little things (no point in going further)... But anyway:

The real question is: what database are you using? Or even better, why are you doing this in the first place?

If you were taking Word documents home, would you export them to HTML, take those files home, and try to convert them back to Word? That makes no sense, and neither does using CSV files here. Instead of carrying CSV files over, carry a backup of the database! With many databases you can just carry the files themselves (like the .mdf files for SQL Server Express), and most databases have statements like "BACKUP DATABASE ..." that create backups (check the docs, e.g. BOL for SQL Server); carry those over and restore them at your place with "RESTORE DATABASE ..." or such (lots of the management tools will let you create backups easily too, i.e. the point-and-click way). Failing any of this, many GUI management tools will let you export to a format they can also import directly (or even generate the queries to recreate the tables and fill them with the data; there are 3rd-party tools for this too).
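For instance, here's a minimal sketch of that backup/restore round trip; the database name, logical file names, and paths are placeholders (RESTORE FILELISTONLY FROM DISK = '...' shows the real logical names, and WITH MOVE is only needed if the original paths don't exist on the home machine):

-- At school: write a full backup to a file you can carry home.
BACKUP DATABASE LibraryDB
TO DISK = 'C:\DB\LibraryDB.bak'
WITH INIT;

-- At home: restore it, relocating the data/log files if the original paths
-- don't exist on this machine.
RESTORE DATABASE LibraryDB
FROM DISK = 'C:\DB\LibraryDB.bak'
WITH MOVE 'LibraryDB' TO 'C:\DB\LibraryDB.mdf',
     MOVE 'LibraryDB_log' TO 'C:\DB\LibraryDB_log.ldf';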

There is no need to program anything here, much less to use code without any error handling, doing all the programming work manually for every table to be imported. Actually, I hardly ever bother to hand-write any of this database stuff (simple CRUD operations) anyway; it's too much work and too time consuming for such a simple thing. One just uses a decent code generator with a good template (saves a LOT of time). But again, there's no need even for that here: copy the files, make a backup, or generate scripts or something instead.



Yeah! You are right, a lot was cut from the code, such as error handling, comments, and documentation.

It can be done, but it was written in a hurry!

Sorry!


It can be done, but it was written in a hurry!

Sorry!

No problem, and no need to be sorry. My point was that this code still needs a fair amount of work (or to just be rewritten), and it requires a lot of tedious manual hand-coding of the queries and parameters for every single version of every single table in the whole database (tables might change several times a day in a database course); that's very time consuming for anything non-trivial, and error prone too. Considering there's no reason to even use CSV in the first place, writing a CSV importer is working around the real problem: he should be using a backup (or a script, or whatever) instead.

Besides, CSV files are rather useless for this in the first place. They carry just the raw data (in a poorly formatted way, at that): no table schemas, no relationships/constraints, nothing like that.

Something like a trial version of Red Gate SQL Packager - or any of the similar products - would do wonders if he can't just make/use backups.

