Resolutions for Common Issues and Facts While Using SSAS 2005

In general, these are some common issues and facts we must be aware of while working on SSAS; otherwise we end up wasting our precious time googling and reading articles in detail.

  1. Locking mechanisms are inherent in SQL Server Analysis Services 2005, so the old OLAPLockTypes setting is deprecated.
    * By default, if one user is updating or processing an object in SSAS 2005, other users will not be able to access that object until it is released.
    * Explicit Rollback and Commit transactions provided by the server can ensure that either both users or neither of them can update or read the cube simultaneously.
    * The Query Log table, available by default in the SQL Server 2000 Analysis Services repository, was helpful for determining OLAP usage statistics.
  2. SQL Server 2005 Analysis Services enables this option via its server properties, where you must manually specify the data source in which the query log table will be created.
    * The structure of this table is fixed and you cannot alter it.
    * Binary values are stored in the Dataset column.
    * You can run trace files in the background, initiated by SQL Server Profiler, to capture recent information about query runs. However, if you want to log every single query or activity, it is better to set up a dedicated trace.
    * How to leverage the QueryLog table? The Usage-Based Optimization Wizard is meant for this. It prompts for required details such as begin and end dates, and you can set the option to stop at a performance gain of 30%.
  3. Doing row count estimates while partitioning the cube helps in tuning query performance as well as determining whether the recommended limit (no more than 20 million rows per partition) has been reached. The key to the choice of storage settings (MOLAP or ROLAP) is here:
    * Set the aggregation performance gain based on frequency of use, e.g. set 45% performance gain with MOLAP on partition 200708, 30% with MOLAP on partition 200707, and so on.
    * For rarely queried partitions, use ROLAP with a slice value set (an AMO sketch of these partition storage settings appears after this list).
    * Apply proactive caching for frequently accessed MOLAP partitions; it is an enhancement feature in SSAS 2005.
  4. While processing dimensions, beware of those whose string stores exceed 4 GB, since an exception is thrown. Finally, Sunil found a workaround: a property of the dimension should be manually set to attributes rather than tables. Take a look at this link for the settings required to increase the size of the string store. A common error that occurs here is "File system error: A File Store error from Write File occurred. Physical file...". The solution is to change all the ProcessingGroup properties to ByAttribute (see the AMO sketch after this list).
  5. AMO can be used in ASP.NET by making sure that the account you assign to your application pool has OLAP administrator privileges. Once done, you can even delegate privileges to other users using AMO (see the AMO sketch after this list). You can change the account on the Identity tab of the selected application pool's Properties in inetmgr.
  6. Clearing Analysis Services 2000 process logs (.mdb files, or tempdb in case you have migrated the repository; Prashant is well aware of this issue).
    * A few possible methods are mentioned in the Microsoft web link:
    * Right-click Analysis Services, choose Properties, then Logging -> clear the log -> specify a drive with enough space. Run DBCC SHRINKDATABASE (tempdb) or DBCC SHRINKFILE (<logical file name>, <target size in MB>).
    * Use the ALTER DATABASE tempdb ... command to specify the minimum size. Stop and restart the Analysis Services service.
    * But in fact, the second step increased my tempdb space considerably.
  7. Have you come across this very common error: "Class Not Registered" when you click Browse Cube? The solution is to install OWC (Office Web Components).
  8. Validation of data by browsing dimensions and cubes consists of the following steps:
    * Check if all the levels show data after migration.
    * Browse the cube, check every dimension against the calculated members, drill down to the lowest level, and match this against the legacy system.
    * Note: pivoting should also show matching data down to the lower levels.
  9. After migration, the attribute hierarchy relationships get grouped under the lowest-level attribute (the key attribute). In that case, every attribute must be re-linked manually to its immediate parent attribute.
  10. Recreating the dimensions is required in the case of attributes having parent-child relationships.
  11. Have you come across the error "OLE DB or ODBC error: Login failed for user 'NT AUTHORITY\NetworkService'.; 42000"? This is because we are supposed to add/create the above account as a login on the SQL Server where your data source resides. Apply the sysadmin privilege to it if you are using AS2000. You may want to look at this link for detailed help.
  12. You might get an error: "Server Option cancelled.. check the driver...". The solution is to check whether IIS, Network DTC, and Network COM+ under Windows Installation Components -> Application Server have been checked, and that the respective services are running in the Services snap-in.
  13. "Error in OLAP Storage Engine: Invalid Key Error", Solution is to Check for the Data Quality issues and While processing if you require to ignore these errors , then Change settings under Dimension key errors tab to ignore errors.
  14. "Drill through failed. Error in the OLAP Engine .." . Solution : Under Analysis Services -> Advanced Properties -> Set Value of OLAP\Process\ROLAPProcessingEffort = count(Fact table ) + probable growth of rows. Provided the fact Dimensions have been added for the drill through with ROLAP Storage set.
  15. "Connection timed out .." Solution: Under Analysis Server -> Advanced Properties -> set Value of ConnectionTimeout= 0 and ExceedConnectionTimeout =0.

Comments

Alan Koo said…
You said:
* Apply proactive caching for frequently accessed MOLAP partitions; it is an enhancement feature in SSAS 2005.

Do you know why this doesn't work? I got this issue two weeks ago: proactive caching was working very well in testing, but when we went live it didn't work anymore, so we contacted Microsoft Premier Support, but they didn't find any reason for it to fail.

Thanks!

Alan Koo
Interesting to hear from you, Alan Koo. I am wondering why that didn't work in your live environment if it ran fine in your test environment. Now, the questions for you are:

1) Are the SQL Server 2005 editions in TEST and PROD the same?

2) Have you applied the same settings in PROD as in your TEST environment?

3) Are you running the latest service packs?


Thanks
Subhash Subramanyam
p said…
hi,

I am facing this error while processing one dimension.
I tried so many options mentioned in the blogs, and my dimension is already running under ByAttribute only..

File system error: A FileStore error from WriteFile occurred. Physical file: \\?\M:\MSSQL\Data\AAA.db\Dim.0.dim\68. Key.asstore. Logical file: . . Errors in the OLAP storage engine: An error occurred while the ' Key' attribute of the 'Primary Dim' dimension from the 'DNNB' database was being processed.
Please help me with any solution for this.....
