p2p.wrox.com Forums - SQL Server 2000 - Fetching fields from flat file (http://p2p.wrox.com/showthread.php?t=50595)

Jeff Moden November 22nd, 2006 01:10 AM

Fetching fields from flat file
 
Praveen,

Dunno what's wrong with the other thread, but I couldn't get the reply window to work correctly...

Anyway, using the data you posted, here's what you need to do...

Create a format file and store it somewhere the SQL Server can see it... here's what the file should look like... it's referenced as "C:\Test\TestImport.bcp" in the SQL code that follows...

Code:

8.0
9
1   SQLCHAR   0   16     ""     0   Junk1          ""
2   SQLCHAR   0   12     ""     2   DocumentID     ""
3   SQLCHAR   0   6      ""     0   Junk2          ""
4   SQLCHAR   0   8      ""     3   ReceivedDate   ""
5   SQLCHAR   0   8      ""     0   Junk3          ""
6   SQLCHAR   0   5      ""     4   Verifier       ""
7   SQLCHAR   0   843    ""     0   Junk4          ""
8   SQLCHAR   0   10     ""     5   MemberID       ""
9   SQLCHAR   0   1434   "\n"   0   Junk5          ""

...and here's the code that uses it... read the comments and compare what I've done with the format file against the BCP format file example in Books Online... Bulk Insert can use the same type of format file...

Code:

--===== If the test table exists, drop it
     IF OBJECT_ID('TempDB..#ImportTest') IS NOT NULL
        DROP TABLE #ImportTest

--===== Create the test table
 CREATE TABLE #ImportTest
        (
        RowNum INT IDENTITY(1,1) PRIMARY KEY CLUSTERED,
        DocumentID VARCHAR(12),
        ReceivedDate DATETIME,
        Verifier VARCHAR(5),
        MemberID INT
        )

--===== Do the import
   BULK INSERT #ImportTest
   FROM 'C:\Test\TestImport.txt' --DATA FILE!!!
   WITH (
        BATCHSIZE    = 300000, --Basically, import the whole file all at once
        CODEPAGE     = 'RAW',  --No special language requirement
        DATAFILETYPE = 'CHAR', --Normal text input
        FIRSTROW     = 1,      --No header
        FORMATFILE   = 'C:\Test\TestImport.bcp', --FORMAT FILE!!!
        TABLOCK                --Uses a single lock on whole table for max speed
        )
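
Just for comparison, the bcp utility can drive the exact same format file from the command line. Here's a rough, untested sketch of what that would look like, run through xp_cmdshell... note that bcp can't see a # temp table from outside your session, so "YourDatabase.dbo.ImportTest" below is a made-up permanent table with the same columns, and "YourServer" is whatever your server happens to be called...

Code:

--===== Sketch only... same data file and format file, driven by bcp instead of BULK INSERT
--      (xp_cmdshell needs sysadmin rights by default)
   EXEC master.dbo.xp_cmdshell
        'bcp YourDatabase.dbo.ImportTest in C:\Test\TestImport.txt -f C:\Test\TestImport.bcp -S YourServer -T'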



...and yes, the Bulk Insert code above has been tested...

Code:

SELECT *
   FROM #ImportTest

RowNum      DocumentID   ReceivedDate            Verifier MemberID
----------- ------------ ----------------------- -------- -----------
1           06276100011  2006-10-03 00:00:00.000 CSUCH    373944015
2           06276100012  2006-10-03 00:00:00.000 CSUCH    395468826
3           06276100013  2006-10-03 00:00:00.000 CSUCH    397600221

(3 row(s) affected)
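
By the way, if you ever need to double-check the offsets in that format file against the raw data, here's a rough sketch of one way to do it (a quick check of my own, assuming the flat file contains no tab characters, since the default field terminator would otherwise split the lines)... load each whole line into a single wide column and pull the same fields back out with SUBSTRING...

Code:

--===== Sketch only... stage each raw line as one wide column
     IF OBJECT_ID('TempDB..#RawLines') IS NOT NULL
        DROP TABLE #RawLines

 CREATE TABLE #RawLines (RawLine VARCHAR(3000)) --wide enough for the 2,342 character record

--===== No format file here... each whole line lands in the single column
   BULK INSERT #RawLines
   FROM 'C:\Test\TestImport.txt'
   WITH (ROWTERMINATOR = '\n')

--===== These start positions come straight from the widths in the format file
 SELECT SUBSTRING(RawLine,  17, 12) AS DocumentID,   --follows 16 bytes of Junk1
        SUBSTRING(RawLine,  35,  8) AS ReceivedDate, --follows 6 bytes of Junk2
        SUBSTRING(RawLine,  51,  5) AS Verifier,     --follows 8 bytes of Junk3
        SUBSTRING(RawLine, 899, 10) AS MemberID      --follows 843 bytes of Junk4
   FROM #RawLines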



--Jeff Moden

praveennk84 November 22nd, 2006 01:34 AM

Hi,
    Thanks for the solution you gave. It is working fine...
    But I have one more question: will this Bulk Insert work with the TABLE data type?

praveen KN

Jeff Moden November 22nd, 2006 08:30 AM

Why don't you just try it?

I don't know, because I never use the TABLE data type for this kind of thing... table variables aren't capable of using STATISTICS and can't be made to do so. Since both TEMP tables and TABLE variables start out in memory and both spill over into TEMPDB when they get big, there's no performance advantage there, either. In fact, the only thing I ever use table variables for is when I need a temporary data structure in a function... and if TEMP tables were allowed there, I wouldn't use TABLE variables at all.
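
That said, if you really do need the rows in a TABLE variable, the route I'd try first is to bulk load the temp table exactly as above and then copy the rows across with a plain INSERT/SELECT... here's a minimal sketch of that (I haven't run this particular snippet, so treat it as a starting point)...

Code:

--===== Sketch only... get the rows into a TABLE variable by way of the temp table
DECLARE @ImportTest TABLE
        (
        RowNum INT IDENTITY(1,1) PRIMARY KEY, --clustered by default
        DocumentID VARCHAR(12),
        ReceivedDate DATETIME,
        Verifier VARCHAR(5),
        MemberID INT
        )

--===== Copy from the temp table that the BULK INSERT filled earlier
 INSERT INTO @ImportTest (DocumentID, ReceivedDate, Verifier, MemberID)
 SELECT DocumentID, ReceivedDate, Verifier, MemberID
   FROM #ImportTest

 SELECT * FROM @ImportTest --must run in the same batch as the DECLARE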

Most folks figure that because TABLE variables live only at the procedure level, there's some sort of advantage to them disappearing on their own. But have you ever tried to troubleshoot the contents of something that disappears the moment the run completes?

Just in case anyone thinks I'm full of hooey on this, please study Q3/A3 and Q4/A4 of the following URL...

http://support.microsoft.com/default...&Product=sql2k
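
The short version of the STATISTICS point, as a sketch... once the temp table is loaded you can add indexes and refresh statistics on it, and there's no equivalent for a table variable beyond the constraints you declare up front...

Code:

--===== Works on a temp table... add an index and refresh statistics after the load
--      (IX_ImportTest_MemberID is just a name made up for the example)
 CREATE INDEX IX_ImportTest_MemberID ON #ImportTest (MemberID)
 UPDATE STATISTICS #ImportTest

--===== There's no CREATE INDEX or UPDATE STATISTICS for a table variable...
--      the only "indexes" it gets are the PRIMARY KEY/UNIQUE constraints
--      declared in the DECLARE itself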

--Jeff Moden

