# Is there a limit to SQL query complexity in getting data into a PP sheet?



## RichardRayJH (Jan 31, 2013)

I've written a SQL query that gets some data for an analysis I want to present through PowerPivot 2012 using Excel 2010. It *really* can't be done using the PowerPivot tools on simple data imports; I need the SQL to shape the data. The query includes a table variable, a tally table with a cross join to a real data table, several joins, a couple of correlated subqueries and a partridge in a pear tree.
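For context, a minimal sketch of that query shape might look something like this — all table and column names here are hypothetical, just to illustrate the table variable, tally-table cross join, and correlated subquery pattern described above:

```sql
-- Hypothetical sketch only; none of these objects are from the actual query.
DECLARE @Params TABLE (StartDate date NOT NULL, EndDate date NOT NULL);
INSERT INTO @Params VALUES ('2013-02-18', '2013-02-24');

WITH Tally AS (
    -- Ad hoc tally table: numbers 1..1000
    SELECT TOP (1000) ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) AS n
    FROM sys.all_objects
)
SELECT s.StoreID,
       DATEADD(DAY, t.n - 1, p.StartDate) AS CalendarDate,
       s.Sales,
       (SELECT AVG(s2.Sales)              -- correlated subquery
        FROM dbo.DailySales AS s2
        WHERE s2.StoreID = s.StoreID)     AS AvgStoreSales
FROM @Params AS p
CROSS JOIN Tally AS t
JOIN dbo.DailySales AS s
  ON s.SaleDate = DATEADD(DAY, t.n - 1, p.StartDate)
WHERE DATEADD(DAY, t.n - 1, p.StartDate) <= p.EndDate;
```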

When I try to use it to get the data, PP thinks for a while and then throws an error complaining that


```
OLE DB or ODBC error.


An error occurred while processing table 'PresidentsWeekData'.


The current operation was cancelled because another operation in the transaction failed
```

I moved the whole shooting match to a 64-bit machine with 32 GB of RAM and installed 64-bit Excel 2010, but the error remains.

If need be I can create a table in a utility database, schedule a job to run the query and put the data into that table, then do a simple import from PP, but I'd rather do it directly and not create the maintenance issues. Is there a known limitation I might be bumping into?
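The staging-table fallback described above could be sketched roughly as follows — object names and the wrapping stored procedure are hypothetical, not from the actual setup: a persistent table in the utility database, a SQL Agent job that repopulates it, and then a trivial table import from PowerPivot.

```sql
-- One-time setup in a utility database (hypothetical names throughout):
CREATE TABLE Utility.dbo.PresidentsWeekData (
    StoreID      int   NOT NULL,
    CalendarDate date  NOT NULL,
    Sales        money NOT NULL
);
GO

-- Body of the scheduled SQL Agent job: refresh the snapshot.
TRUNCATE TABLE Utility.dbo.PresidentsWeekData;

INSERT INTO Utility.dbo.PresidentsWeekData (StoreID, CalendarDate, Sales)
EXEC Utility.dbo.usp_BuildPresidentsWeekData;  -- complex query wrapped in a proc

-- PowerPivot then imports with a plain query:
-- SELECT StoreID, CalendarDate, Sales FROM Utility.dbo.PresidentsWeekData;
```

The upside is that PowerPivot only ever sees a simple `SELECT` against a real table; the downside, as noted, is one more table and one more job to maintain per analysis.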


----------



## RichardRayJH (Jan 31, 2013)

Just to clarify, the query works fine in SSMS on either machine I've tested on.


----------



## powerpivotpro (Jan 31, 2013)

I am not aware of any limit.  Are there calculated columns already in the table?  Sometimes those can be the problem — the data refresh itself runs fine, but the calc columns are complex enough that you run out of RAM.

Another thing to try, if it's not calc columns, is to change the sort order of the data you are importing.  We have seen strange cases where sorting the incoming data one way vs. another yields failure one way and success the other.  (The theory is that the sort order impacts the compression routines.)
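A low-risk way to test the sort-order theory is to wrap the existing query unchanged and vary only the `ORDER BY` on the outermost `SELECT` — the column names below are hypothetical stand-ins:

```sql
-- Same result set, different ordering; swap the ORDER BY between refresh attempts.
SELECT q.*
FROM (
    /* ...the existing complex query goes here, untouched... */
    SELECT StoreID, CalendarDate, Sales
    FROM dbo.PresidentsWeekData
) AS q
ORDER BY q.StoreID, q.CalendarDate;   -- vs. ORDER BY q.CalendarDate, q.StoreID, etc.
```

Since only the outer `ORDER BY` changes, a success after reordering points at the import/compression side rather than the query itself.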

Here's a post:

Workarounds for “Canceled Due to Memory Pressure” « PowerPivotPro

But it's not clear to me that RAM is even the problem here for you.  Have you watched RAM consumption during the import process?


----------



## RichardRayJH (Jan 31, 2013)

It's an import into a brand new PowerPivot window. No existing data, sorts or calculated fields.

The query only returns about 400,000 rows of data. I've imported three 10-million-row tables into one workbook's PP storage on the 32 GB 64-bit machine and barely even moved the Available MB Perfmon counter off the pin. I will check that in this case, though, just in case there's some pathology in the query execution that's creating a problem. The cross join to the tally table that sets up the returned data could be getting sizeable, I suppose, but SSMS executes the query without a problem.

One of the tables being joined is on a linked server, but it's reached over Gb Ethernet on a server-to-server private VLAN, so I can't see that being a problem.

For now I've gone with the utility table idea. It works fine, but if this workbook is as much of a hit with senior management as I suspect it's going to be, there will be about a dozen of these same kinds of analyses, and one of them will likely run into the 40-million-row x 30-column range. I'd rather not have to run that through the utility table tactic.


----------



## powerpivotpro (Feb 1, 2013)

30 columns in large tables is scary in my experience.  PowerPivot handles high row counts MUCH better than it handles high column counts.

There are a number of posts on this topic, but here's one:

Less Columns, More Rows = More Speed! « PowerPivotPro 

But to reiterate an earlier point — we definitely have seen cases where we change the ORDER BY clause in a SQL query and magically a failed import starts succeeding.


----------

