
How to Schedule and Automate Backups of All the SSAS Catalogs Within a Server Instance

  • 23 January 2014
  • Author: Anil

Over time, you may need to back up all the SSAS databases within a server instance automatically, and schedule these backups monthly or according to your own time set.

There are many alternative methods and solutions for this task, but I have researched how to back up all the SSAS databases within a particular server instance dynamically.

Previously, I shared “How to schedule and automate backup all the SQL databases within a server instance”.

Here, I am going to show how to back up all the cube databases automatically on a schedule.

These are the steps to follow.

1. Adding a linked server in SSMS.

You can add a linked server in SSMS by using a script such as:

 --Adding a linked server
 EXEC master.dbo.sp_addlinkedserver
    @server = N'SSAS_Backup'
  , @srvproduct=N'MSOLAP'
  , @provider=N'MSOLAP'
  , @datasrc=N'ANILMAHARJAN'      /* <<< My Analysis Services server name */
  , @catalog=N'YourDatabaseName'  /* <<< My Analysis Services database name */


 --Setup security as per your environment requirements.
 EXEC master.dbo.sp_addlinkedsrvlogin
    @rmtsrvname=N'SSAS_Backup'
  , @useself=N'False'
  , @locallogin=NULL
  , @rmtuser=NULL
  , @rmtpassword=NULL


2. Make an SSIS package.

You can create the SSIS package as shown below.

2.1. Create a table in SSMS that holds the names of all the catalogs within the server, by querying the SSAS DMVs from SSMS.

You can use the following script to capture all the current catalog names within the server. You can also create a log file, configured within the job agent, to track the entire backup process.


IF OBJECT_ID('CubeBackupInfo') IS NOT NULL
    DROP TABLE [CubeBackupInfo]

SELECT * INTO CubeBackupInfo FROM OPENQUERY(SSAS_Backup, 'select * from $system.dbschema_catalogs')

2.2. Read the total number of database backups to be made by reading the catalog count from the CubeBackupInfo table we just created above.

2.3. Read the backup XMLA into an SSIS variable from a particular file location. The backup XMLA can be generated manually, or by C# code embedded within SSIS.

The backup XMLA script can also be generated from SSMS (right-click the SSAS database, choose Back Up, then use the Script button).
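As a sketch, a minimal XMLA Backup command has the following shape; the DatabaseID and File values below are placeholders that the package replaces for each catalog:

```xml
<Backup xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Object>
    <DatabaseID>TestCube</DatabaseID>
  </Object>
  <File>TestCube-03-11-2011.abf</File>
  <AllowOverwrite>true</AllowOverwrite>
</Backup>
```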
2.4. Now, within a For Loop container, set the maximum loop count from the variable @Max_No_Backup, i.e. the number of backups to be made.

2.5. Read the catalog names of the databases within the server, one by one, from the CubeBackupInfo table we created earlier.

2.6. Modify the XMLA script using a Script Task, in which I have used C# code to modify the XMLA and generate the modified XMLA for each catalog name, one by one.

2.7. Back up all the cube databases within the server to the default SSAS backup location.

- Each backup file is named with the catalog name plus the system date, so the backup date is easy to identify, e.g. TestCube-03-11-2011.abf.

- A backup for the same catalog name is overwritten if it already exists.

e.g.: C:\Program Files\Microsoft SQL Server\MSAS10.MSSQLSERVER\OLAP\Backup
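The loop in steps 2.4 to 2.7 can be sketched outside SSIS as well. The following Python sketch (hypothetical helper names; the actual package does this with C# in a Script Task) shows the core logic: substituting each catalog name into an XMLA template and building the dated backup file name:

```python
from datetime import date

# Hypothetical XMLA template with placeholder tokens that are filled in
# for each catalog (the package itself does this in a C# Script Task).
XMLA_TEMPLATE = """<Backup xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Object>
    <DatabaseID>{catalog}</DatabaseID>
  </Object>
  <File>{file_name}</File>
  <AllowOverwrite>true</AllowOverwrite>
</Backup>"""

def backup_file_name(catalog, backup_date):
    # Catalog name plus system date, e.g. TestCube-03-11-2011.abf
    return "{0}-{1}.abf".format(catalog, backup_date.strftime("%m-%d-%Y"))

def build_backup_xmla(catalog, backup_date):
    # Step 2.6: produce the per-catalog XMLA backup command.
    return XMLA_TEMPLATE.format(
        catalog=catalog,
        file_name=backup_file_name(catalog, backup_date))

# Steps 2.4-2.5: loop over the catalog names read from CubeBackupInfo.
catalogs = ["TestCube", "SalesCube"]   # stand-ins for the DMV results
for name in catalogs:
    xmla = build_backup_xmla(name, date(2011, 3, 11))
    # In SSIS, this XMLA would then be executed against the SSAS instance.
```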

3. Make a SQL Server Agent job.

3.1. Create the job in SQL Server Agent, then add a step that runs and schedules the SSIS package.

3.2. Schedule the job to back up all the catalogs/databases within the particular server, e.g. monthly or weekly, according to your specific time set.

I hope this helps someone. I am also thinking of writing an article about 'Dynamic partition within a cube from underlying partition table' using a similar approach, a topic I have researched a lot as well.


Anil Maharjan



Highly motivated Business Intelligence Engineer with leadership abilities and teamwork skills, as well as the ability to accomplish tasks under minimal direction and supervision. Has more than 5 years of development and implementation experience in HealthCare Data Analytics and Telecommunication.
