Recommended RAM for working w/large DBs?

John P

Board Regular
Joined
Oct 17, 2002
Messages
108
I am currently working with an Access DB that is 350MB in size and continues to grow. My laptop setup is 1.2GHz with 512MB of RAM. When copying rows, pasting, etc., I often receive a low-resources message. I would like to go down the path of requesting additional RAM from my boss, but need some additional proof that more RAM will help.

What would you recommend for RAM when working with such a large DB...keeping in mind that I would like to perform other tasks on the device while macro/query processes are running. I was thinking of trading the two 256MB chips for two 512MB chips.
 

First question is - do you really have an .mdb that requires 350MB?
Have you ever used the Tools > Database Utilities > Compact And Repair Database option?

Access has a habit of growing in size with everything you do. It creates a number of indexes (invisible to you) to speed up how you work with data, and none of that is cleared/cleaned up unless you compact the database.
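If compacting becomes a routine chore, it can also be scripted with DAO's DBEngine.CompactDatabase, which writes a compacted copy to a new file. A rough sketch, with made-up file paths - note you can't compact the database you're currently running code from, so this would live in a separate .mdb or script:

Code:
' Compact C:\Data\MyDB.mdb into a new file, then swap it into place.
' Paths and names are hypothetical -- adjust to your setup. Run this
' from a different database, not from MyDB.mdb itself.
Sub CompactMyDatabase()
    Dim strSrc As String, strDst As String
    strSrc = "C:\Data\MyDB.mdb"
    strDst = "C:\Data\MyDB_compacted.mdb"

    DBEngine.CompactDatabase strSrc, strDst

    Kill strSrc               ' remove the bloated original
    Name strDst As strSrc     ' rename the compacted copy into place
End Sub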

One of the others here will have to chime in with their personal recommendations. As with many things, memory is not always the be-all, end-all answer. On the other hand, if you're using Transactions or extremely large arrays to temporarily hold records/entire tables -- both store things in memory first, before the Commit or until you clear the array.

Another thought is - it's not my goal to be critical - but you mentioned copying & pasting rows in Access. When you do that, you're really using Windows & Office functionality instead of Access itself. You should take a look at the built-in export tools, or consider using Append queries to move data between tables.

If you have a particularly large table - say 100MB of data - and you're attempting to do a select-all copy & paste -- well, no wonder you're having resource problems!
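For the record, an append query for that kind of move is just a couple of lines of SQL, and the rows never pass through the clipboard. A sketch with made-up table and field names:

Code:
-- Move rows from a monthly staging table into the main table.
-- (tblMonthly, tblMain, and the field names are made up.)
INSERT INTO tblMain ( CustomerID, SaleDate, Amount )
SELECT CustomerID, SaleDate, Amount
FROM tblMonthly;

-- Then clear the staging table (Access idiom uses DELETE *):
DELETE * FROM tblMonthly;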

Mike
The DB is really 320MB, and I compact it after every use. I have a couple of tables with 300,000 rows of data each (one for 2003 and one for 2004). Each of the data elements is required because they want the ability to report off everything. I am copying and pasting now because they wanted a new data element added, and since the data I was provided for import was not clean, I had to clean a lot of it after import. If I start over and import everything, then I have to clean it all up again, which literally took a week plus the first time. The other issue is that there is not one key field that I could use for a lookup on the value, so I really needed to bring in the data from another table again. Believe me, if there were a better way, I would do it.
Got it - partial apology then, but I do think there is a way to streamline your process. How do you plan, for example, to add another field should they add one?

Another question is - is this 'new field' a product of several other fields? Is it (just as an example) a matter of combining date elements (Month/Day/Year) that are in 3 fields into a single field?

Making another assumption - if you can cut and paste a new field into the table, it's probably because the order of the records is the same: some of the field values were cleaned up, but record #5 is still record #5 in both source versions.

If so, you can go after this with code to update the tables.
The technique would be to import the new table (import only the one column - that makes it far smaller), then use something like DAO to open two recordset objects and copy the field values over. You probably can't use SQL and a JOIN to do this if the key fields have differences.

Please note, this makes the probably-wrong assumption that the field you wish to update is the 11th (index 10 - columns start at 0). You can also refer to a field by name, like !fieldname or .Fields("name").Value.

Also, this is pretty rough. There's no error checking in case rs2 runs out of records (should the tables not be the same length), and nothing compares which row each recordset is on. It needs polishing.

Code:
Dim dbs As DAO.Database
' Declare each variable's type explicitly: "Dim rs, rs2 As DAO.Recordset"
' would leave rs as a Variant.
Dim rs As DAO.Recordset, rs2 As DAO.Recordset
Dim strSQL As String

Set dbs = CurrentDb()

strSQL = "SELECT * FROM tblCurrent"
Set rs = dbs.OpenRecordset(strSQL, dbOpenDynaset)
strSQL = "SELECT * FROM tblNew"
Set rs2 = dbs.OpenRecordset(strSQL, dbOpenSnapshot)

With rs
  Do Until .EOF
    .Edit
    .Fields(10).Value = rs2.Fields(0).Value  ' copy the imported column into the 11th field
    .Update
    rs2.MoveNext   ' advance both recordsets in step
    .MoveNext
  Loop
End With

rs.Close
rs2.Close
Set rs = Nothing
Set rs2 = Nothing
Set dbs = Nothing

Mike
Gonna leave all that in. Sometimes I miss the obvious.
You said that you do not have a unique key, and that many field values were cleaned up/altered, which may prevent you from using SQL to join records and mass-update them.

But, you're able to cut and paste - and I'm assuming they're in the same order.

What if you added an autonumber field to both the current table and the imported table and then join on it?

You'd be able to add the extra field to the current table and then use a SQL UPDATE query to fill in the values. A SQL update query is going to run a lot faster than any code attempt like my last one - minutes-versus-hours type differences.
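Assuming both tables get an autonumber field called ID (names made up for illustration), the whole update boils down to something like:

Code:
-- Join the current table to the imported one on the autonumber
-- field and pull the extra column across in one pass.
-- tblCurrent, tblNew, ID, and NewField are hypothetical names.
UPDATE tblCurrent INNER JOIN tblNew
    ON tblCurrent.ID = tblNew.ID
SET tblCurrent.NewField = tblNew.NewField;

The one caveat is the same assumption as before: the autonumbers only line up if both tables hold their rows in the same order when the field is added.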

Mike
It is a completely unique field. It is not a concatenation of existing fields. I cannot copy and paste from another table because the records are not in the same order. In the normal process I have all my queries run against one table that I populate with monthly data. I then copy and paste append those rows to the existing table that is the source for the Excel pivot tables that produce the reports. I then delete the rows from the query table once the queries have run.

In this process I added a new field to the source. I copy and paste append the data for one month from the static table, run the queries, delete the rows I just used from the static table and copy/paste append the 'new' rows to the static table.

The process is the nature of the beast and there is no way to make the DB smaller, since it is all data required for reporting.
Well darn - better go get more RAM. What you're working with isn't small. Access can handle it, though. Maybe you should also look at a little more CPU muscle.

Internally, with my own employer, I've been thinking of requesting a high-end desktop for my own development work instead of the 1GHz/256MB IBM ThinkPad I'm using at the moment. It works for now, but I'm mostly using files a twentieth the size of yours.

And yes - seriously, both might be needed. RAM only goes so far.

Mike