I've recently come across backup failures on an SSAS database with the following error...
"Memory error: Allocation failure : Not enough storage is available to process this command."
So I've been thinking about the error & believe it can be worked around to a degree.
In Analysis Services there is a server property called MemoryLimitErrorEnabled that controls whether an error is raised if the estimated memory needed for an operation can't be granted. See this TechNet article for more in-depth info.
I'm pretty sure this setting is causing at least some of the errors. Since it applies to estimated memory usage only, there shouldn't be any detrimental effect in disabling it for the duration of a backup (presumably you won't be processing cube dimensions while taking backups). At worst the backup will simply fail a bit later, when it actually runs out of memory; at best it will succeed & not require manual intervention or a restart.
The plan would be to:
1. set MemoryLimitErrorEnabled to false
2. take a backup
3. set MemoryLimitErrorEnabled back to true
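Steps 1 and 3 can be done with an XMLA Alter against the server object. The sketch below is based on the kind of script SSMS generates when you change a server property; the server ID/name are placeholders, and the property path (Memory\MemoryLimitErrorEnabled) is what my instance shows, so check it against your own server's advanced properties:

```xml
<!-- Step 1: disable the estimated-memory check (Value 0 = false).
     For step 3, run the same command again with Value 1. -->
<Alter AllowCreate="true" ObjectExpansion="ObjectProperties"
       xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Object />
  <ObjectDefinition>
    <Server xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
      <ID>YourServerID</ID>
      <Name>YourServerName</Name>
      <ServerProperties>
        <ServerProperty>
          <Name>Memory\MemoryLimitErrorEnabled</Name>
          <Value>0</Value>
        </ServerProperty>
      </ServerProperties>
    </Server>
  </ObjectDefinition>
</Alter>
```

This can be run from an XMLA query window in SSMS, or dropped into a SQL Agent job step of type "SQL Server Analysis Services Command".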
Setting MemoryLimitErrorEnabled does not require a restart to take effect & can be scripted.
I've tested this method on a Development server & it worked perfectly.
Quick update: on implementing this solution I realised that the workflow of the SQL Agent job would need to change. The workflow is designed to always leave MemoryLimitErrorEnabled set to true, regardless of whether the backup step succeeds or fails. Please forgive the roughness of the document, but I don't have access to Visio.
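In lieu of a diagram, here is a rough sketch of the revised job as three SQL Agent steps of type "SQL Server Analysis Services Command"; the database ID and backup file name are illustrative, and the routing lives in each step's "on success"/"on failure" actions rather than in the XMLA itself:

```xml
<!-- Step 1: disable MemoryLimitErrorEnabled via an XMLA Alter on the
     server property. On success: go to step 2. On failure: quit the
     job reporting failure (the property was never changed). -->

<!-- Step 2: take the backup.
     On success: go to step 3. On failure: ALSO go to step 3, so the
     property is re-enabled no matter what happened to the backup. -->
<Backup xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Object>
    <DatabaseID>YourDatabaseID</DatabaseID>
  </Object>
  <File>YourDatabase.abf</File>
  <AllowOverwrite>true</AllowOverwrite>
</Backup>

<!-- Step 3: re-enable MemoryLimitErrorEnabled (same Alter, Value 1),
     then report the job outcome based on whether step 2 succeeded. -->
```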