• Need something faster than DLookup

    • This topic has 19 replies, 7 voices, and was last updated 15 years ago.
    #468493

    I have two tables that may contain information a user is looking for. One is the “active” table; the other is an “archived” table. Information gets archived periodically, but sometimes may need to be brought back to the active table. Users generally access the active table (by entering a Customer ID in a form), but if they enter a customer ID which does not exist in the active table, I use a DLookup function to check the archived table. If it exists there, they’re given the option of viewing the archived record and/or bringing it back to the active table. The problem is that, over time, the archived table is getting huge and the DLookup is getting slow. Is there a method of looking the customer ID up in the archived table that is faster than DLookup? I’ve read that opening a recordset is faster than DLookup. Is that true? How should I handle this situation? The database is split – the data physically resides in a separate database on the server.
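
    For reference, the check described above boils down to something like this sketch (table, field, and control names are placeholders, and CustomerID is assumed to be numeric):

    Dim varHit As Variant

    ' Look in the active table first, then fall back to the archive.
    If IsNull(DLookup("CustomerID", "tblActive", "CustomerID = " & Me.txtCustomerID)) Then
        varHit = DLookup("CustomerID", "tblArchive", "CustomerID = " & Me.txtCustomerID)
        If Not IsNull(varHit) Then
            ' Offer to view the archived record and/or bring it back to the active table.
        End If
    End If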

    • #1220856

      The best way to speed up your query is to make sure there is an index on CustomerID in the Archive table.
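
      If it turns out to be missing, one way to add it is to open the back end directly and run a bit of DDL – a sketch only, with an assumed path and the table/field names mentioned in this thread:

      Dim dbBack As DAO.Database

      ' The back-end path is an assumption; adjust to the real location.
      Set dbBack = DBEngine.OpenDatabase("\\Server\Share\BackEnd.mdb")
      dbBack.Execute "CREATE INDEX idxCustomerID ON Archive (CustomerID);", dbFailOnError
      dbBack.Close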

    • #1220858

      Both tables are indexed by CustomerID. Any other suggestions?

      • #1221091

        How big is “huge”? How many records in the archived table?

        Why not just write a select statement for a recordset that gets only the specific record?

    • #1220888

      I read somewhere that DLookup was optimized for local tables, but that a recordset was better on linked tables. Quite willing to be corrected on that, though.

      Have a look at Allen Browne’s ELookup function as well.

    • #1220894

      I have two tables that may contain information a user is looking for. One is the “active” table; the other is an “archived” table. Information gets archived periodically, but sometimes may need to be brought back to the active table. Users generally access the active table (by entering a Customer ID in a form), but if they enter a customer ID which does not exist in the active table, I use a DLookup function to check the archived table. If it exists there, they’re given the option of viewing the archived record and/or bringing it back to the active table. The problem is that, over time, the archived table is getting huge and the DLookup is getting slow. Is there a method of looking the customer ID up in the archived table that is faster than DLookup? I’ve read that opening a recordset is faster than DLookup. Is that true? How should I handle this situation? The database is split – the data physically resides in a separate database on the server.

      For a single lookup like that, I don’t think opening a recordset is any faster. I trust you have compacted the db with the archived table?

      I’m assuming this is a front-end/back-end situation? If so, you can open the archive table directly (not as a linked table) and then use Seek to find the record. This is as fast as it gets.

      • #1221016

        For a single lookup like that, I don’t think opening a recordset is any faster. I trust you have compacted the db with the archived table?

        I’m assuming this is a front-end/back-end situation? If so, you can open the archive table directly (not as a linked table) and then use Seek to find the record. This is as fast as it gets.

        Mark,
        What is the best way to do this, i.e., open the table directly? Use the IN external file clause in a query? Use a DAO Workspace? Other?

        Thanks

        • #1221110

          Mark,
          What is the best way to do this, i.e., open the table directly? Use the IN external file clause in a query? Use a DAO Workspace? Other?

          Thanks

          It’s been a while, and I’m answering this remotely (I’m not at a computer where I can check syntax). Basically, it would be:

          Dim db As DAO.Database
          Dim rs As DAO.Recordset

          Set db = … (it is here you’d directly reference the back end; can’t remember syntax)
          Set rs = db.OpenRecordset("tablename", dbOpenTable)

          I’d make these Public variables so they are kept open (otherwise you lose the speed advantage). Then when you need it:

          rs.Seek …

          You’ll have to use Help to fill in the blanks. Like I said, it has been a while.
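
          Filling in the blanks, a minimal sketch of that pattern might look like the following (the back-end path, table name, and index name are assumptions for illustration; Seek only works on a table-type recordset opened directly against the back end, not on a linked table):

          ' Module-level (Public), so the connection stays open between lookups.
          Public dbArchive As DAO.Database
          Public rsArchive As DAO.Recordset

          Public Sub OpenArchive()
              ' Path, table, and index names are assumptions for illustration.
              Set dbArchive = DBEngine.OpenDatabase("\\Server\Share\BackEnd.mdb")
              Set rsArchive = dbArchive.OpenRecordset("Archive", dbOpenTable)
              rsArchive.Index = "PrimaryKey"   ' or the name of the CustomerID index
          End Sub

          Public Function ArchiveHasCustomer(ByVal lngCustID As Long) As Boolean
              ' Assumes CustomerID is the numeric indexed field being sought.
              If rsArchive Is Nothing Then OpenArchive
              rsArchive.Seek "=", lngCustID
              ArchiveHasCustomer = Not rsArchive.NoMatch
          End Function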

    • #1221119

      Thanks, Mark.

      I tested and see that this would work, but am wondering whether a query that uses the IN ‘BackEnd.mdb’ clause (as the recordset source) might work as well.

      • #1221280

        Thanks, Mark.

        I tested and see that this would work, but am wondering whether a query that uses the IN ‘BackEnd.mdb’ clause (as the recordset source) might work as well.

        Yes, it probably will. The key is to define the recordset as public so you don’t have to constantly recreate it. That’s where you lose time.
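
        For what it’s worth, such a lookup through the IN clause might look something like this sketch (the path, table, field, and control names are placeholders):

        Dim rs As DAO.Recordset
        Dim strSQL As String

        ' Placeholder names; CustomerID is assumed to be numeric.
        strSQL = "SELECT * FROM Archive IN '\\Server\Share\BackEnd.mdb' " & _
                 "WHERE CustomerID = " & Me.txtCustomerID
        Set rs = CurrentDb.OpenRecordset(strSQL, dbOpenSnapshot)
        If Not rs.EOF Then
            ' The archived record was found.
        End If
        rs.Close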

        • #1221291

          Yes, it probably will. The key is to define the recordset as public so you don’t have to constantly recreate it. That’s where you lose time.

          Good point, thanks.

    • #1221272

      Larry’s question with regard to what is large is very relevant – we do searches on a table containing 4 million or more records and get sub-second response times, but the table has to be indexed properly, and you need a robust server that can keep the data in memory. (We link to a SQL Server back end.)

    • #1221631

      Thanks for everyone’s feedback. I don’t have 4 million records – more like 500,000 (and growing) – but I don’t have a “robust server”. This is a front-end/back-end situation, but I’m only using Access tables as my back end, not SQL Server, so I’m hampered both by that and by the fact that the network is slow. I’ll rewrite the code to open a recordset directly, as suggested by Mark, and see if that speeds things up. Thanks for your help.

      As an aside, Mark recommends “define the recordset as Public”. Is there an issue with keeping a recordset open? In the recesses of my mind, I remember being told “get in, get out” when dealing with data sets: “open the database, get your data and close the connection”. Am I locking the records and leaving the recordset open by defining it as public? Will this be setting myself up for corruption or frustration for other users trying to access the same records moments later? I’ll admit my schooling was several years ago, and maybe things have changed.

      • #1221644

        Thanks for everyone’s feedback. I don’t have 4 million records – more like 500,000 (and growing) – but I don’t have a “robust server”. This is a front-end/back-end situation, but I’m only using Access tables as my back end, not SQL Server, so I’m hampered both by that and by the fact that the network is slow.

        I still think you would be best served by a simple query in code (e.g. SELECT * FROM tablename WHERE criteria here). Remember, the Jet engine is optimized for queries.
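
        One way to run such a query in code is through a saved parameter query – a rough sketch, with placeholder query, parameter, table, and control names:

        ' Assumes a saved query named qryArchiveLookup defined roughly as:
        '   PARAMETERS prmCustID Long;
        '   SELECT * FROM Archive WHERE CustomerID = [prmCustID];
        Dim qdf As DAO.QueryDef
        Dim rs As DAO.Recordset

        Set qdf = CurrentDb.QueryDefs("qryArchiveLookup")
        qdf.Parameters("prmCustID") = Me.txtCustomerID
        Set rs = qdf.OpenRecordset(dbOpenSnapshot)
        If Not rs.EOF Then
            ' The archived record was found.
        End If
        rs.Close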

        I just ran a test against a 1.6GB table with 1.5+ million rows. The table is a linked Access table. With a query, selecting a specific row (against an indexed field) took well under 1 second. The linked table is on the same PC, so it is not slowed down by network access – which can be a significant slowdown. A slow network is even more reason to select a single row (or the minimum number of rows you need). I suspect (without testing) that having an open recordset that you use seeks on would be slower. I could be mistaken on that, but don’t think so. I’d be interested in a comparison. Come to think of it, I might do that and let you know if you are interested. Let me know. It would not take long to test.

        Let us know how it goes.

      • #1221687

        Thanks for everyone’s feedback. I don’t have 4 million records – more like 500,000 (and growing) – but I don’t have a “robust server”. This is a front-end/back-end situation, but I’m only using Access tables as my back end, not SQL Server, so I’m hampered both by that and by the fact that the network is slow. I’ll rewrite the code to open a recordset directly, as suggested by Mark, and see if that speeds things up. Thanks for your help.

        As an aside, Mark recommends “define the recordset as Public”. Is there an issue with keeping a recordset open? In the recesses of my mind, I remember being told “get in, get out” when dealing with data sets: “open the database, get your data and close the connection”. Am I locking the records and leaving the recordset open by defining it as public? Will this be setting myself up for corruption or frustration for other users trying to access the same records moments later? I’ll admit my schooling was several years ago, and maybe things have changed.

        You want to minimize the number of recordsets you have open, as each one takes up memory. But the performance benefit of holding open a recordset connected directly to the back end outweighs the memory penalty of keeping it open.

    • #1221688

      Mark – thanks for clarifying what happens with multiple recordsets open. It makes sense that it uses memory to leave it open, and it also makes sense that it would be faster if it was already open! There’s no issue with selecting a particular record when opening the recordset? Will Access lock that record until the recordset is closed again?
      Larry – I would be interested in a comparison between the seek and the regular query (since it sounds to me that you’d like to test that!). I have a lot of work to do with datasets in this project, so I want to optimize my interaction with the data. The data has to be split for security reasons – I just wish I had a faster network.

      • #1221695

        Larry – I would be interested in a comparison between the seek and the regular query (since it sounds to me that you’d like to test that!). I have a lot of work to do with datasets in this project, so I want to optimize my interaction with the data. The data has to be split for security reasons – I just wish I had a faster network.

        I’ll try to get to this in the next day. I’m pretty sure I can.

      • #1221759

        Mark – thanks for clarifying what happens with multiple recordsets open. It makes sense that it uses memory to leave it open, and it also makes sense that it would be faster if it was already open! There’s no issue with selecting a particular record when opening the recordset? Will Access lock that record until the recordset is closed again?

        Merely looking at a record doesn’t lock it.

    • #1221767

      A very interesting result – not exactly what I had expected, but good anyway! There were over 1.6 million rows in an attached table. I set up random lookups against the table’s unique key (4 fields), via separate tables which held only parts of that key, and then used three different methods in three completely separate runs, each run using only one method of lookup. Since I was doing all this work anyway, I decided to at least look at the .FindFirst method on an open recordset as well; the results on that were even worse than I expected. Here are the results (ms = milliseconds):

      FindFirst method:

        • Number of lookups: 10
        • Min ms: 655
        • Max ms: 4368
        • Avg ms: 2128

      Query method:

        • Number of lookups: 1000
        • Min ms: 0
        • Max ms: 46
        • Avg ms: 4

      Seek method:

        • Number of lookups: 1000
        • Min ms: 0
        • Max ms: 15
        • Avg ms: 0.05

      I must admit I’m surprised by how much faster the Seek is. It is definitely more complicated to use, but it would be worth using if one were doing thousands of lookups or more (for 1000 lookups, the total time is 4 seconds vs. 0.05 seconds). I think I’ve only used a Seek once myself.

      From what I can tell of the original problem description, I’d guess that the user will not be able to tell the difference between a query implementation and a .Seek implementation. My inclination would be to implement with a query and see how it goes. If it is too slow, then move to a .Seek and see if it improves response any – but if you have a slow network, I’m not sure you’ll see much improvement. I just don’t know how a slow network affects the response times (well, we all know it does slow them down). It could be that the lookup time is just in the noise.
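
      For anyone who wants to try a rough comparison themselves, a minimal timing sketch along these lines could look like the following (the back-end path, table, index, and key values are all placeholders, and it assumes CustomerID values 1–1000 exist):

      Public Sub TimeLookups()
          ' Rough comparison sketch only; names and path are placeholders.
          Const cLookups As Long = 1000
          Dim db As DAO.Database
          Dim rs As DAO.Recordset
          Dim i As Long
          Dim sngStart As Single

          Set db = DBEngine.OpenDatabase("\\Server\Share\BackEnd.mdb")

          ' Query method: open a fresh snapshot for each lookup.
          sngStart = Timer
          For i = 1 To cLookups
              Set rs = db.OpenRecordset( _
                  "SELECT * FROM Archive WHERE CustomerID = " & i, dbOpenSnapshot)
              rs.Close
          Next i
          Debug.Print "Query: " & Format((Timer - sngStart) * 1000 / cLookups, "0.00") & " ms per lookup"

          ' Seek method: one table-type recordset, reused for every lookup.
          Set rs = db.OpenRecordset("Archive", dbOpenTable)
          rs.Index = "PrimaryKey"
          sngStart = Timer
          For i = 1 To cLookups
              rs.Seek "=", i
          Next i
          Debug.Print "Seek:  " & Format((Timer - sngStart) * 1000 / cLookups, "0.00") & " ms per lookup"

          rs.Close
          db.Close
      End Sub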

    • #1221777

      That’s really helpful, Larry! I’ll be using a seek as often as possible from now on! You’re right about the network, though. I’ll have to talk to the hardware guys and see what can be done about getting some more speed or bandwidth or something.
      Mark: Thanks for clarifying that records aren’t locked when a record is only viewed.
