Wrox Programmer Forums: Access (Microsoft Access database design and programming)
March 23rd, 2006, 07:35 PM
Authorized User
 
Join Date: Mar 2006
Posts: 19
Thanks: 0
Thanked 0 Times in 0 Posts
Design Practices

Hey everyone,

Being a fairly new Access and VB developer, I wanted to hear some opinions on existing code that our databases use. We have an old system called baby36 that is driven by RPG; it is basically a flat-file system. To get information from baby36 into Access for reporting, nightly scripts export the flat files to text files. In our Access databases there are several programs that check whether certain tables exist; if so, they are deleted and the new information is copied into the appropriate tables with VB code such as:

If fExistTable("tbl_DSY") Then
        DoCmd.DeleteObject acTable, "tbl_DSY"
DoCmd.TransferText acImportDelim, "DSY Import Specification", "tbl_DSY", "P:\TXT\DSY.TXT", False, ""

This seems rather slow on some databases, as some have as many as 70,000 records. Is this a good practice, or is there a better way to be doing this?

Also, after a while these databases grow fairly large because there is no compacting unless I do it manually, which of course means it has to be done late at night when no one else is using the database. In some cases we have had a database as large as 2 GB. Of course, that is mostly unclaimed space; it usually shrinks to under 50 MB in most cases after I compact the databases.
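For what it's worth, one way to script that manual compact is to run DAO's DBEngine.CompactDatabase against the bloated file from a separate database, since a database can't compact itself while it is open. A rough sketch, with placeholder paths:

Code:
Sub CompactReportingBackEnd()
    ' Rough sketch only. Run this from a separate "maintenance" database;
    ' the database being compacted must be closed. Paths are placeholders.
    Dim strSource As String
    Dim strTemp As String

    strSource = "P:\Data\Reporting.mdb"
    strTemp = "P:\Data\Reporting_compact.mdb"

    ' CompactDatabase requires that the destination file not exist yet
    If Dir(strTemp) <> "" Then Kill strTemp

    DBEngine.CompactDatabase strSource, strTemp

    ' Swap the compacted copy in for the original
    Kill strSource
    Name strTemp As strSource
End Sub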

Thanks,

Robert


 
March 23rd, 2006, 11:50 PM
Friend of Wrox
 
Join Date: Jun 2003
Posts: 1,093
Thanks: 1
Thanked 12 Times in 11 Posts

Hi Robert,

DoCmd.TransferText is actually significantly faster than the other two alternatives I've used for programmatically importing text files: basic binary I/O functions, and the FileSystemObject in the Microsoft Scripting Runtime library. I put together three subs, one for each approach, and timed the import of a table with 100,000 records on my local drive. DoCmd.TransferText was actually about 4 times faster (average of 3 seconds) than the other two approaches. If you're interested in the benchmarking code, I rewrote your routine as follows:

Code:
Sub ImportTextFile()
    ' Benchmarking variables
    Dim varTimeStart As Variant
    Dim varTimeEnd As Variant

    ' Start timer
    varTimeStart = Time
    Debug.Print "Start file import: " & varTimeStart

    If fExistTable("tblHundredThousand") Then
        DoCmd.DeleteObject acTable, "tblHundredThousand"
    End If
    DoCmd.TransferText acImportDelim, "tblHundredThousand Import Specification", "tblHundredThousand", "C:\tblHundredThousand.txt", False, ""

    ' End timer
    varTimeEnd = Time
    Debug.Print "End file import: " & varTimeEnd
    Debug.Print "Elapsed Time: "; DateDiff("s", varTimeStart, varTimeEnd) & " Seconds"
End Sub
I did the same for a binary I/O routine and a FileSystemObject routine (average of about 13 seconds each). These two approaches give you total control over the import, but I didn't even include the drop-and-re-create table functionality in them. DoCmd.TransferText, in addition to being the fastest of the three, also makes it super easy to load a table with new data.
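For reference, a FileSystemObject import looks roughly like the sketch below. This is not the exact routine timed above: it assumes a reference to the Microsoft Scripting Runtime and a made-up two-field comma-delimited layout.

Code:
Sub ImportTextFileFSO()
    ' Sketch of a FileSystemObject import (requires a reference to the
    ' Microsoft Scripting Runtime). The two-field comma-delimited layout
    ' is made up; a real version would match the import specification.
    Dim fso As New Scripting.FileSystemObject
    Dim ts As Scripting.TextStream
    Dim rs As DAO.Recordset
    Dim varFields As Variant

    ' The target table is assumed to exist already (no drop/re-create here)
    Set rs = CurrentDb.OpenRecordset("tblHundredThousand", dbOpenTable)
    Set ts = fso.OpenTextFile("C:\tblHundredThousand.txt", ForReading)

    Do While Not ts.AtEndOfStream
        varFields = Split(ts.ReadLine, ",")
        rs.AddNew
        rs.Fields(0).Value = varFields(0)
        rs.Fields(1).Value = varFields(1)
        rs.Update
    Loop

    ts.Close
    rs.Close
End Sub

The row-by-row AddNew/Update loop largely explains why this kind of routine runs so much slower than TransferText.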

So as it stands, and for what it's worth, I can't think of a faster approach than DoCmd.TransferText.

HTH,

Bob



 
March 24th, 2006, 12:03 AM
Friend of Wrox
 
Join Date: Jun 2003
Posts: 1,093
Thanks: 1
Thanked 12 Times in 11 Posts

 
Quote:
...timed the import of a table with 100,000 records on my local drive


'Course meant to say timed the import of a text file...

Bob

 
March 24th, 2006, 09:53 AM
Authorized User
 
Join Date: Mar 2006
Posts: 19
Thanks: 0
Thanked 0 Times in 0 Posts

Quote:
Originally posted by Bob Bedell
...So as it stands, and for what it's worth, I can't think of a faster approach than DoCmd.TransferText.

Thanks Bob,

Btw, seeing that certain tables are deleted and re-created each time the database is opened, is there a way to keep the database from growing as it does now? Or could I place some code in there to ensure that it only imports data once daily instead of every time the file is opened?
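Just to illustrate the once-a-day idea, a minimal sketch might look like this; it assumes a hypothetical local table tblImportLog with a single Date/Time field named LastImport, and reuses the existing fExistTable helper and import specification:

Code:
Sub ImportIfNotDoneToday()
    ' Illustration only: tblImportLog and LastImport are made-up names
    ' for a local one-column log table that records the last import.
    Dim varLast As Variant
    Dim blnRunImport As Boolean

    varLast = DMax("LastImport", "tblImportLog")

    If IsNull(varLast) Then
        blnRunImport = True                 ' never imported before
    ElseIf DateValue(varLast) < Date Then
        blnRunImport = True                 ' last import was before today
    End If

    If blnRunImport Then
        If fExistTable("tbl_DSY") Then
            DoCmd.DeleteObject acTable, "tbl_DSY"
        End If
        DoCmd.TransferText acImportDelim, "DSY Import Specification", _
            "tbl_DSY", "P:\TXT\DSY.TXT", False, ""

        ' Record that today's import has run
        CurrentDb.Execute "INSERT INTO tblImportLog (LastImport) VALUES (Now())", dbFailOnError
    End If
End Sub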

Robert






