Huge amounts of data being sent FROM backup server TO ShadowProtect client(s)

Hi,

We are using ShadowProtect Server on a few servers that back up to a remote backup server:

Server 1: 260GB, continuous incrementals

Server 2: 150GB, once-weekly full backup - disabled on the client

Server 3: 90GB, once-weekly full backup - disabled on the client

(the figures above are the backup sizes, by the way)

I can understand the data being sent from these servers to the backup server being large. What I can't understand is why, at seemingly random times roughly once or twice a month, I'm seeing HUGE amounts of data being sent FROM the backup server TO these clients?! For example, a couple of days ago it decided to send 325GB from the backup server to the client, resulting in an awesomely huge bill from our cloud provider.

This is not the first time it's happened. It happened about a month or so ago. The backup server has ImageManager installed, so I completely stopped and disabled the service, thinking that it was somehow verifying the backups.

Does anybody know why this would happen? It appears it's sending more data back to the client than is being backed up. If I were to guess, it's probably some "image verification" going on to make sure that the backup is valid and working.

Comments

STC-JoshS

I would check to see if your machines are running differentials; that would require the software to read the existing chain to figure out where it was, and could cause this. Differentials are usually triggered after a system experiences an unexpected shutdown.

someuser11

Thanks for the reply.

Nope, differentials aren't configured for that job. We do, however, have differentials configured that go to a (local) backup server. So we really have two backups that run each night: continuous incrementals to a remote backup server, and a differential to a local backup server.

STC-JoshS

I would still check to see if the incremental has done any differentials lately. Unless you are on SP5 and have the Self Healing incremental option disabled, SPX will do differential backups to repair the chain if needed.

someuser11

Oh ok, I didn't know it could do that. Yeah, I guess sometimes there are unexpected system failures, so that might be the cause of it. We are using SP4.

Is there any way to find out if it was actually doing a differential at that time? I'll check the logs; it will probably be in there, I guess. Thanks.

STC-JoshS

Yes, it'll be in the logs. It will have the word DIFFGEN in the backup log instead of VDIFF for a normal incremental.
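
If you want to check all of the logs quickly, here's a rough Python sketch of that search (the log folder path is only an assumption -- point it at wherever your backup logs actually live):

```python
# Quick scan of the backup logs for DIFFGEN / VDIFF entries.
# The log folder below is an assumption -- adjust it to your install.
from pathlib import Path

LOG_DIR = Path(r"C:\Program Files (x86)\StorageCraft\ShadowProtect\Logs")  # assumed location
KEYWORDS = ("DIFFGEN", "VDIFF")

for log_file in sorted(LOG_DIR.glob("*.log")):
    with log_file.open(errors="ignore") as fh:
        for line_no, line in enumerate(fh, start=1):
            if any(keyword in line for keyword in KEYWORDS):
                print(f"{log_file.name}:{line_no}: {line.strip()}")
```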

JDDellGuy

Possibly a non-optimal configuration in use

First off, I second the idea that a differential backup is running, as that would explain the data being sent to the clients for the purposes of calculating a differential.

Secondly, if I understand what you have in place, you are backing up a machine directly to a backup server which is located in a geographically remote location. This is a non-optimal configuration. A more optimal configuration would be to back up to a local backup server which runs ImageManager, and then have ImageManager configured to replicate those backups to the offsite backup server. Not only will this provide a faster backup connection to the computers being backed up, it will also solve the issue you're seeing with data being sent to the clients, because if the client had to take a differential backup, it would be communicating with a local backup server instead of a remote cloud server. It sounds like you already back up to a local backup server, but via a separate backup job. With that in mind, you already have all the pieces to do this; you just need to make some configuration changes.

Note that doing this requires that you use continuous incrementals, that you manage the backup chains with ImageManager (free), and that you set up replication in ImageManager with either intelligentFTP (free, but requires you to set up a receiving FTP server, which also runs ImageManager) or ShadowStream (not free, and requires the setup of ShadowStream Server and ImageManager on the remote backup server).

Let me know if you're interested in more explanation of the setup I'm describing. Changing your strategy should greatly improve both the performance and simplicity of your backup configuration. It would only need one backup job to accomplish both local and remote backups and, like I said, it would remove the issue you're seeing with data being sent back to the client from the remote backup server.
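
To make the replication step a little more concrete, here's a rough Python sketch of the idea that intelligentFTP automates for you: push any backup image files the offsite FTP server doesn't already have. This is only an illustration, not the product itself, and the host, credentials and folder paths are placeholders:

```python
# Conceptual sketch only -- ImageManager/intelligentFTP handles this (and much
# more) for you. Host, credentials and paths below are placeholders.
from ftplib import FTP
from pathlib import Path

LOCAL_BACKUP_DIR = Path(r"D:\Backups\Server1")   # local ImageManager-managed folder (example)
REMOTE_DIR = "/backups/server1"                  # folder on the offsite FTP server (example)

with FTP("offsite.example.com") as ftp:          # placeholder host
    ftp.login("backupuser", "changeme")          # placeholder credentials
    ftp.cwd(REMOTE_DIR)
    already_there = set(ftp.nlst())

    # Upload only the image files the offsite server doesn't have yet.
    for image in sorted(LOCAL_BACKUP_DIR.glob("*.sp*")):
        if image.name not in already_there:
            with image.open("rb") as fh:
                ftp.storbinary(f"STOR {image.name}", fh)
            print(f"uploaded {image.name}")
```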

__________________

StorageCraft Certified Engineer

lumacor

Agreed. The overall strategy definitely needs to be reviewed.

You could have something running locally, like an Intel NUC running Windows 10 with attached storage, like an iSCSI NAS or USB 3.0 disk.

The NUC could have ImageManager replicating data off to a low-spec cloud management device (e.g. an Azure F2-spec machine running FileZilla Server, with attached/mapped Blob storage). This is something you could easily play around with on an Azure trial.

__________________

StorageCraft Certified Master Engineer

Veeam Technical Sales Professional (v9)

someuser11

Ok, so this problem just happened again. For the record, there is nothing at all in the logs (on the clients) that has the word DIFFGEN, or VDIFF for that matter.

JDDellGuy, thanks for your suggestion. Yeah, we do have a local backup server, but it's currently a Linux server, so no image backups. But we can move that to a Windows machine that can handle it easily enough.

Yes, it would be great if you could tell me a little bit more about the setup, but it seems pretty easy: client -> local backup server -> replicated to remote backup server via either iFTP or ShadowStream. Is there much more involved here? I had ImageManager running on the remote backup server but ended up just turning it off. It consolidates backups and wraps them up, but over time you need to go in, remove the old backups, and then start with another full. I just manually do a full backup every now and again and remove the old ones at some point in the future.

Where can I find more information about ImageManager replicating off to a remote server?

Thanks all for your replies.

STC-JoshS

Here is the documentation for ImageManager's replication options:

https://www.storagecraft.com/support/book/storagecraft-imagemanager-user...

someuser11

Hi, just replying to say thanks, and that I'm now using ImageManager to replicate local backups to the remote backup server directly over a network share (via VPN). Didn't need to use iFTP or ShadowStream or anything like that. Seems to be working OK so far.

lumacor

Reliability

If your network / VPN connection doesn't suffer drops or anything, it sounds like you've resolved the issue.

Worth keeping in mind that the Network Replication doesn't support a resume function.

Not so much of an issue if incremental files are always small, but a headache if you have fairly large incremental backups transferring and then dropping about 90% of the way through.

I've had to configure FileZilla / iFTP over unreliable network / VPN connections before, to ensure replications are able to continue and resume from where they dropped.
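
A crude way to spot a dropped transfer after the fact is to compare file names and sizes between the local backup folder and the replicated copy on the remote share; a size mismatch usually means the copy didn't finish. A quick Python sketch (both paths are just examples):

```python
# Rough sanity check for incomplete replication: compare names and sizes
# between the local folder and the remote (VPN/mapped) copy.
from pathlib import Path

LOCAL = Path(r"D:\Backups\Server1")                 # source folder (example)
REMOTE = Path(r"\\remote-backup\Backups\Server1")   # replicated destination (example)

local_sizes = {p.name: p.stat().st_size for p in LOCAL.glob("*.sp*")}

for name, size in sorted(local_sizes.items()):
    remote_file = REMOTE / name
    if not remote_file.exists():
        print(f"MISSING on remote: {name}")
    elif remote_file.stat().st_size != size:
        print(f"SIZE MISMATCH (possible dropped transfer): {name}")
```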

Luke

__________________

StorageCraft Certified Master Engineer

Veeam Technical Sales Professional (v9)

someuser11

Interesting to know. On most servers the incrementals aren't that big, but sometimes they can reach 5GB+ for the Exchange server. I see that when ImageManager rolls up the backups into a weekly (for example), the incrementals are based on that roll-up, not the last full backup, which is good (it means the incrementals don't keep getting bigger and bigger). I can't delete the previous weeks' roll-ups though, can I?

lumacor

Query Path

You might find this useful to understand the minimum file dependencies required:

https://www.storagecraft.com/support/kb/article/202

You can run the query on the latest incremental image to find out what you need to keep in order to restore from that point.

In theory, you could remove anything outside of the listed dependencies. However, the files that remain will then be the only points you can restore from.
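
As a rough illustration, once you have the dependency list from that query, something like the following would show which image files in the folder fall outside the chain. The folder path and file names here are only examples -- paste in your actual query output:

```python
# List image files that are NOT part of the dependency chain returned by the
# query, i.e. candidates for removal. Path and names below are examples only.
from pathlib import Path

BACKUP_DIR = Path(r"D:\Backups\Server1")    # example folder
required = {
    "SERVER1_C_VOL-b001.spf",               # example entries -- replace with the
    "SERVER1_C_VOL-b001-cw001.spi",         # real dependency list from the query
    "SERVER1_C_VOL-b001-cw002.spi",
}

for image in sorted(BACKUP_DIR.glob("*.sp[fi]")):
    if image.name not in required:
        print(f"outside the dependency chain: {image.name}")
```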

Luke

__________________

StorageCraft Certified Master Engineer

Veeam Technical Sales Professional (v9)

someuser11

Thanks, that helped.
