If you are processing a large number of files to or from a consumer-level cloud service (this does not apply to Amazon S3, Microsoft Azure, Google Storage, Backblaze B2, OpenStack, OVH, or Egnyte), you may encounter out-of-memory errors. The error is logged as:

Failed to scan files: Out of memory

This out-of-memory error can occur when working with any of the following consumer-level cloud services:

- Box
- Dropbox
- Office 365 (OneDrive for Business and SharePoint)
- OneDrive (Personal)
- Google Drive
- SugarSync
- pCloud
- Citrix ShareFile
- Google Photos

The simple solution is to switch to the 64-bit version of SyncBackPro. This lets SyncBack use all available memory instead of being restricted to just 2GB (as is the case with the 32-bit version).

Technical Reason

Certain cloud metadata must be stored in memory. This data includes pointers, special IDs, parent IDs, and similar values, which are interlinked and relate to how the cloud service handles and processes cloud-based data files.

Because these file details cross-reference one another, SyncBackPro must keep them all in memory until the entire batch of in-scope files has been processed; it cannot simply move them to disk storage to free up memory.

As a result, if there are a large number of files to process, SyncBackPro will eventually run out of memory. The 32-bit version of SyncBackPro can only use 2GB of memory, regardless of how much RAM is installed in the PC.
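To get a feel for why the 2GB ceiling matters, here is a rough back-of-the-envelope sketch. The per-file metadata size is a hypothetical assumption for illustration only, not a SyncBackPro internal figure:

```python
# Hypothetical illustration: estimate how many file records could fit
# in the 2GB of memory available to a 32-bit process.

BYTES_PER_FILE_RECORD = 1024        # assumed: ~1 KB of interlinked metadata per file
ADDRESS_SPACE_32BIT = 2 * 1024**3   # 2GB usable by a 32-bit process

max_files = ADDRESS_SPACE_32BIT // BYTES_PER_FILE_RECORD
print(f"Approximate ceiling: {max_files:,} file records")
```

Under these assumptions the ceiling is around 2 million file records; with larger or more heavily interlinked metadata it would be reached far sooner, while a 64-bit process can simply keep allocating as long as free memory remains.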

If you encounter such errors during a profile run, we recommend splitting your data across several profiles and placing them in a Group so that each profile is processed sequentially. You may need to adjust the source path and file selections of each profile so that each one points to a subset of the data, allowing it to be backed up in batches.
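Conceptually, the Group approach works like the sketch below. The folder names and the `run_profile` function are hypothetical placeholders, not SyncBackPro APIs; the point is that each batch's metadata is released before the next batch starts:

```python
# Conceptual sketch of splitting one large backup into several smaller
# profiles run sequentially (as a Group does). Names are illustrative.

def run_profile(subfolders):
    """Placeholder for a backup run limited to the given subfolders."""
    return f"backed up {len(subfolders)} folder(s)"

all_subfolders = ["Documents", "Photos", "Music", "Videos", "Projects", "Archive"]

batch_size = 2  # e.g. two top-level folders per profile
profiles = [all_subfolders[i:i + batch_size]
            for i in range(0, len(all_subfolders), batch_size)]

for profile in profiles:  # the Group runs each profile one after another
    print(run_profile(profile))
```

Because only one batch's worth of interlinked file details is held in memory at a time, the peak memory use stays well below what a single profile covering everything would require.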

Alternatively, consider switching to the 64-bit version, which can use all available free memory.