Question

I have a cube with a dimension of roughly 100,000 rows. It runs extremely slowly, and I am not sure what direction to go in to fix the problem. I have created partitions for the measures, and the aggregates seem to run just fine. However, I can pull just a dimension into the browser in 2008 BIDS, without any measures at all, and it will take 10 minutes to load. Any suggestions on how I can troubleshoot this? I am using SQL Server 2008 Enterprise Edition. The server is huge, with 196 GB of memory.
I appreciate your help!


Solution

Build hierarchies on your dimension's attributes. Even if you can bring them all into the cube, who is going to search a single attribute among 100,000 possible values?

It's a Microsoft best practice to organize big dimensions into some sort of hierarchy and set the attribute's AttributeHierarchyVisible property to false, so the attribute can still be browsed through the hierarchy but not on its own.
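For example, once the members are organized into a user hierarchy, they stay reachable by drilling down even when the flat attribute hierarchy is hidden. A minimal MDX sketch, assuming a hypothetical [Sales] cube with a [Customer] dimension and a [Customer Geography] user hierarchy (Country -> City -> Customer):

    -- Hypothetical names: [Sales] cube, [Customer] dimension,
    -- [Customer Geography] user hierarchy with a [City] level.
    -- With AttributeHierarchyVisible = False on the flat customer attribute,
    -- users drill down through this hierarchy instead of scanning a single
    -- flat list of 100,000 members.
    SELECT
        [Measures].DefaultMember ON COLUMNS,
        NON EMPTY [Customer].[Customer Geography].[City].Members ON ROWS
    FROM [Sales];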

OTHER TIPS

Is the dimension slow while the cube is processing, or after the cube is processed? How is the dimension composed? Is it a SQL Server view with joins to other tables underneath, perhaps doing nested loop joins instead of merge or hash joins when reading the entire dimension?

Are you using bridge tables to get to your fact data? These are evaluated at run time and can take some time. Even though you are only bringing back dimension data in the browser, SSAS uses a default measure, which means your browsing query is most likely referencing fact data anyway.
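To illustrate the default-measure point, the two queries below behave the same: when you drag only a dimension into the browser, SSAS implicitly slices by the default member of the Measures dimension, so a measure group still gets touched. A sketch with hypothetical object names ([Sales] cube, [Customer] dimension):

    -- What a "dimension only" browse effectively asks for:
    SELECT
        [Customer].[Customer].Members ON COLUMNS
    FROM [Sales];

    -- ...is resolved as if you had written:
    SELECT
        [Customer].[Customer].Members ON COLUMNS
    FROM [Sales]
    WHERE ([Measures].DefaultMember);

A cheap check while troubleshooting is to put an explicit, lightweight measure (for example a simple row count) in the WHERE clause so you control which measure group is hit.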

Lastly, ensure memory is configured appropriately on the server and that SSAS is allowed to use enough of it. Is this server solely dedicated to SSAS, or are SQL Server or other services installed on it as well?
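On the SSAS side, the instance-level Memory\LowMemoryLimit and Memory\TotalMemoryLimit properties (in the server properties dialog in SSMS, or in msmdsrv.ini) govern how much of the 196 GB the Analysis Services service will actually use. As a quick check, SSAS 2008 also exposes DMVs you can run from an MDX query window to see how much memory the instance is holding per object; a minimal sketch using the built-in schema rowset:

    -- Built-in SSAS DMV: shrinkable and non-shrinkable memory held per object, in bytes.
    SELECT *
    FROM $SYSTEM.DISCOVER_OBJECT_MEMORY_USAGE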

Licensed under: CC-BY-SA with attribution