Help - Huge table / select query / excel pivots

df9864

Board Regular
Joined
Sep 28, 2004
Messages
108
I have a tactical Access 2002 database whose end table holds 1.5 million rows of data (approx 60 columns) and counting.
The "user" needs to be able to query this on potentially any of the columns and view the results in a pivot table. My solution is to have them download this huge database to their C:\ drive; I have then created a simple select query in which they 'filter' their results and save.
Then I have a pivot table in Excel that points at this query and refreshes the data.

Problem is that it is SO slow.
I am not an "Access-head" (Excel is my thing!) but I have noticed the following:
1) the select query returns the data very quickly (maybe 1 or 2 seconds)
2) it is the pivot which seems really slow, which makes me think the 'query' remains an instruction rather than a table, so the pivot is querying the base table - and since it is doing that second-hand from Excel against such a large table, it is very slow
3) if I turn the select query into a make-table query it is really slow in Access (freezes on my spec) - NB it has to be a make-table into another database as I am approaching the Access size limit (2 GB, I've been told)

Any ideas? This database is only going to get bigger (currently at 1.6 GB!).
Fortunately I can kill it at the end of the year as it all goes into Oracle, but I would be really grateful for any tips in the meantime.
 

Why not try a Crosstab Query in Access?

It's similar to a pivot table in Excel.
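In Access SQL a crosstab uses TRANSFORM ... PIVOT. A minimal sketch of building one from VBA, where the table and field names (tblData, Region, SaleDate, Amount) are made up for illustration:

```vba
' Hypothetical sketch: build and open a crosstab query from Access VBA.
Sub MakeCrosstab()
    Dim sSQL As String
    sSQL = "TRANSFORM Sum(Amount) " & _
           "SELECT Region FROM tblData " & _
           "GROUP BY Region " & _
           "PIVOT Format(SaleDate, 'yyyy-mm')"
    'save it as a named query, then open it like any other
    CurrentDb.CreateQueryDef "qryXtab", sSQL
    DoCmd.OpenQuery "qryXtab"
End Sub
```

Each distinct value of the PIVOT expression becomes a column, much like the column area of a pivot table.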
 
I like Norie's suggestion but I'll chuck in another option for the hell of it.

If your query doesn't return more than 65,536 rows (the Excel row limit) you can push the data to Excel using the TransferSpreadsheet method, and then deal with the pivot table there.
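Something along these lines, where the query name and path are made up:

```vba
' Hypothetical sketch: export a saved query to a workbook, headings included.
Sub PushToExcel()
    DoCmd.TransferSpreadsheet acExport, acSpreadsheetTypeExcel9, _
        "qryFiltered", "C:\Temp\PivotSource.xls", True
End Sub
```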

Alternatively you can build a Totals query, which will reduce the number of rows, and push that data to Excel. The trick here is to GROUP BY categories such as Name, Date, Division, etc, and SUM / AVERAGE / COUNT / MAX / MIN the numerical data. This can be pulled into Excel using ADO -- the tedious bit is building the initial summary query; after that it's easy.
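The shape of such a Totals query, with made-up table and field names:

```vba
' Hypothetical sketch: a Totals (GROUP BY) query that collapses the detail rows.
Sub BuildSummarySQL()
    Dim sSQL As String
    sSQL = "SELECT Division, Project, " & _
           "Sum(Amount) AS TotAmount, Count(*) AS NumRows " & _
           "FROM tblBig " & _
           "GROUP BY Division, Project"
    Debug.Print sSQL    'feed this string to an ADO recordset
End Sub
```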

{Edit} Here's a sample of code that I'm using to pull summarised data from Access. In this sample, the original data is monthly for 8 years, for about 250 projects. I've returned data summarised by half-year, for just one project, by combining a parameter in ADO with the output of an Access query... final result, 16 rows. I know the amount of data doesn't compare with yours, but the bit that takes the longest is writing the headings.

Code:
Sub GetSummary_Query()
  Dim cnn As ADODB.Connection
  Dim rst As ADODB.Recordset
  Dim MyConn As String
  Dim sSQL As String, sProject As String
  Dim wshNew As Worksheet
  
  MyConn = Range("TPath").Value
  sProject = Sheets("Transfer Setup").Range("B4").Value
  
  'open the connection
  Set cnn = New ADODB.Connection
  With cnn
    .Provider = "Microsoft.Jet.OLEDB.4.0"
    .Open MyConn
  End With
  
  'build the SQL, filtering on Project if one was specified
  If sProject = "" Then
    sSQL = "SELECT * FROM qrySummary1"
  Else
    sSQL = "SELECT * FROM qrySummary1 WHERE Project = '" & sProject & "'"
  End If
  
  'open a forward-only recordset on the summary query
  Set rst = New ADODB.Recordset
  rst.CursorLocation = adUseServer
  rst.Open Source:=sSQL, ActiveConnection:=cnn, _
    CursorType:=adOpenForwardOnly, LockType:=adLockOptimistic, _
    Options:=adCmdText
  
  'write the headings, then dump the recordset below them
  Set wshNew = Worksheets.Add
  wshNew.Name = "Test Summary"
  Range("A1:Y1") = Array("Project", "Business Unit", "Type", "Status", "Acqn Type", _
    "On/Off", "Half Year", "Fin Year", "Acq Cost", "Constructed", "Settled", _
    "Dev Cost", "Gross Rev", "Cap Int", "Int Expensed", "GST", "Net Rev", _
    "Cost of Sales", "Gross Profit", "OB_NetFE", "CB_NetFE", _
    "OB_GrossFE", "CB_GrossFE", "Gross Land Creditor", "Version")
  Range("A2").CopyFromRecordset rst
  
  rst.Close
  cnn.Close
  
  'tidy up the number formats
  Range("G:G").NumberFormat = "mmm-yy"
  Range("I:I").NumberFormat = "$#,##0"
  Range("L:X").NumberFormat = "$#,##0"

End Sub


Denis
 
Thanks for the suggestions, but not exactly what I was looking for.
1) Crosstab query - it won't stop the performance issues, and users will want to see this in Excel in preference to Access.
2) Groupings - unfortunately the table is already grouped. Each grouping (all 1.5 million rows!) is already at the highest level - unfortunately there are that many unique combinations of data points for analysis.

I was more interested in methods of optimising (I know about variable/string types) and in how Excel interacts with Access (e.g. via query vs table).
 
If the summarised version is 1.5 million rows, you've outgrown Excel. Unless your filtered query returns fewer rows than the Excel row limit, you'll need to run the queries remotely. If you can get the rows below 65,536, the ADO approach is still worth a go.

Denis
 
Hi, I agree with Norie and Denis. If the data exceeds 65k rows then you have outgrown Excel. That said, with a filtered crosstab query you can copy and paste the data into Excel for those users who are more comfortable there.

You can improve the performance of the crosstab query by having a properly optimised database, with the correct keys and indexes set up for each table and the correct bits of data in the correct tables. You might also want to make sure you have a sufficiently large swap file on the PC.
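For example, indexes on the columns users filter most often can be added with a DDL statement (table and column names here are made up):

```vba
' Hypothetical sketch: index the most-queried columns via DDL.
Sub AddIndexes()
    CurrentDb.Execute "CREATE INDEX idxProject ON tblBig (Project)"
    CurrentDb.Execute "CREATE INDEX idxPeriod ON tblBig (Period)"
End Sub
```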

If you are approaching the 2 GB limit of Access then it may be time to look at optimisation. What data are you collecting, and what is the setup of the main table that holds the 1.5m records? I'm also curious about the make-up of the 60 columns in the main table. Also, what did you mean by this?

... and since it is doing it second hand from XL and it is such a large table, it is very slow.

Lastly, have you considered splitting the database up into smaller units?

Andrew. :)
 
Just a thought (possibly daft): why not use Excel to build the query (Data menu > Import Data) instead of using the Access query? Although my database isn't that big, I found it ran a lot faster that way.
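The macro-recorder version of that idea looks roughly like this (path and query name are made up):

```vba
' Hypothetical sketch: pull an Access query straight into a worksheet,
' the same route as the Data > Import External Data menu.
Sub ImportFromAccess()
    With ActiveSheet.QueryTables.Add( _
            Connection:="OLEDB;Provider=Microsoft.Jet.OLEDB.4.0;" & _
                        "Data Source=C:\Temp\MyData.mdb", _
            Destination:=Range("A1"))
        .CommandType = xlCmdSql
        .CommandText = "SELECT * FROM qryFiltered"
        .Refresh BackgroundQuery:=False
    End With
End Sub
```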
 