We’ve received a few questions like this, both online and in person, so I figured it was worth explaining in a little more detail.
On the Deployment server, we deploy a variety of applications: Windows .NET services, Python, Classic ASP, CSS/JS, and PHP, to name a few.
We chose to standardize the interface to the Deployment server to make creating new code deployment clients simpler. Our Deployment server is essentially an on-demand package creation and deployment system.
For example, when we push CSS/JS we minify the contents before packaging them up to be picked up by the static content servers. The same goes for the Python deployments, where we actually build a virtualenv on the deploy server and package that up prior to deploying. This allows the code deploy clients to simply download a ready-to-go package.
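The "build on the deploy server, ship a ready-to-go package" flow can be sketched roughly as follows. The application name, paths, and build step here are all hypothetical stand-ins (the real step would be minification or a virtualenv build); the point is that the expensive work happens once, server-side, and clients only ever see a finished tarball:

```shell
#!/bin/sh
# Hypothetical package build on the deploy server.
set -e

APP=myapp                          # hypothetical application name
BUILD_DIR=$(mktemp -d)
STAGE="$BUILD_DIR/$APP"
mkdir -p "$STAGE"

# Stand-in for the real build step (e.g. minifying CSS/JS, or
# "virtualenv ... && pip install ..."); here we just stage a file.
echo "console.log('hi');" > "$STAGE/app.js"

# Package it up; this tarball is what the code deploy clients fetch.
tar -C "$BUILD_DIR" -czf "$BUILD_DIR/$APP.tar.gz" "$APP"
```

Because the clients only download and unpack, they stay dumb and identical across application types, regardless of how involved the server-side build was.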
There are other side benefits as well. Packaging lets us cache the packages with Varnish to offload the push server during deploys, and it lets us keep a cached copy of the packages in each data center to speed up deployments locally. We also get a little extra security; it's really more the onion/layers reasoning here, but every little bit helps.
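Since a built package never changes once it's created, caching it is straightforward. A minimal VCL sketch of the idea, with a hypothetical backend hostname and TTL (not our actual config):

```vcl
vcl 4.0;

backend deploy_server {
    .host = "deploy.internal";   # hypothetical deploy server hostname
    .port = "80";
}

sub vcl_backend_response {
    # Packages are immutable once built, so cache them aggressively;
    # during a push, only the first client in each data center hits
    # the deploy server and the rest are served from cache.
    if (bereq.url ~ "\.tar\.gz$") {
        set beresp.ttl = 1h;
    }
}
```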
One con worth noting is that, unlike an rsync or direct-from-repo deployment, each deploy is a full package rather than an incremental one. At this point our largest application deployment is the CSS/JS application, which is still only about 70M compressed. We may need to move to a different model when bandwidth or package download time becomes the bottleneck, but we have a lot of breathing room for now.