Scenario: I have to deploy some application(s) across many co-located data centers. The collective size of the deployment is on the order of tens of GB.
Conventional methods like scp, rsync, and HTTP fall short:
scp cannot resume an interrupted transfer; every failure means starting the whole copy over from scratch.
rsync works well with text files but less so with binaries (it works nonetheless); the amount of CPU it consumes, however, is unacceptable.
HTTP can usually resume, but as more servers download the application concurrently, the origin's bandwidth limit slows down the entire process.
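To see why the single HTTP origin becomes the bottleneck, here is a back-of-envelope sketch: every byte must leave the origin's uplink once per downloader, so total transfer time scales linearly with the number of servers. The payload size, fleet size, and uplink speed below are hypothetical, not from the scenario.

```python
def origin_only_seconds(payload_gb, n_servers, origin_gbps):
    # Every server pulls the full payload from one origin, so the
    # origin's uplink must carry payload * n_servers in total.
    total_gbits = payload_gb * 8 * n_servers
    return total_gbits / origin_gbps

payload_gb = 50    # "tens of GB" per the scenario; exact size assumed
n_servers = 100    # hypothetical fleet size
origin_gbps = 10   # hypothetical origin uplink

t = origin_only_seconds(payload_gb, n_servers, origin_gbps)
print(f"~{t:.0f} s (~{t / 3600:.1f} h) until the last server finishes")
```

Doubling the fleet doubles the time, which is why the download slows down as more servers join.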