single deployer machine

Jun 4, 2012 at 5:36 PM

Hello,

I've been researching the options available for TFS to perform automatic deployments up the environment promotion chain, e.g., Dev, QA, Staging, Production.  TFS Deployer appears to be a great way to control this using existing TFS capabilities.  What doesn't seem ideal is the setup on each target machine: installing Team Explorer, setting up TFS users, running a service, etc.  That seems like a lot of work just to copy some files and run a script.  I would like to get your thoughts on using the same concept, executed in a slightly different way.

The key to the idea is to have the build process generate the deployment packages needed for each environment, or, now that I think about it, just a base package to which the correct configuration is applied at deployment time, so that the appropriately configured (and signed) package is generated and deployed.  From there, build quality change events still trigger a deployment, but the build server then uses PowerShell 2.0/WinRM to remotely run commands or a script on the target machine that downloads the package from the drop location (or SharePoint, or wherever), extracts it, and executes the deployment.  This does require PowerShell on each target to be set up to accept remote commands.  Alternatively, the script could just launch msdeploy, which requires the WMSvc to be set up and running on each target.
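
Something along these lines is what I have in mind; the machine name, drop path, and deploy.ps1 script below are just placeholders for whatever the build produces:

    # Runs on the build/deployer machine after the build quality change fires.
    $target   = 'web01.contoso.local'
    $dropPath = '\\tfsbuild\drops\MyApp\MyApp_20120604.1'

    Invoke-Command -ComputerName $target -ScriptBlock {
        param($drop)

        # Stage the package locally on the target.
        $work = 'C:\Deployments\MyApp'
        New-Item -ItemType Directory -Path $work -Force | Out-Null
        Copy-Item -Path (Join-Path $drop '*') -Destination $work -Recurse -Force

        # Run the deployment script that ships with the package
        # (this could just as easily shell out to msdeploy instead).
        & (Join-Path $work 'deploy.ps1')
    } -ArgumentList $dropPath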

With this methodology, TFS Deployer only needs to be installed on one machine; I would just as soon install it on the build controller or a build agent.  Then each target just needs access to the drop folder and a WinRM configuration that allows remote access from the deployer machine.
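
As far as I can tell, the per-target setup would come down to something like this (run once as an administrator; giving the target account read access to the drop share is a separate permissions step):

    # Enable the WinRM listener and firewall exception so the deployer
    # machine can open remoting sessions against this target.
    Enable-PSRemoting -Force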

I'm not sure this is even possible, but this would be an ideal solution for me and probably quite a few other people.  This also opens the door to deploying to other operating systems (via SSH).  Any thoughts or hints would be great.  Thanks!

Coordinator
Jun 5, 2012 at 12:53 AM

Hi,

Installing a single instance of TFS Deployer on the Build Controller is exactly how I use TFS Deployer within the company I work for and is how I recommend others use it too.

There are some additional considerations to be aware of when using it this way though:

  • Build drop locations are on network shares (in TFS 2010 and prior, anyway), so a PS Remoting session will encounter second-hop authentication issues if the target machine tries to access the drop share. This can be solved in one of the following ways:
    • Kerberos delegation
    • Using CredSSP authentication for the Remoting session (which unfortunately requires explicit credentials and therefore storing the password somewhere).
    • Copying the required files from the drop location to the target machine before running the deployment via Remoting. You can use an existing admin share (eg ADMIN$, C$), a preconfigured deployment share on the target, or send the files over the Remoting pipe (not officially supported). Both this and the CredSSP option are sketched after this list.
  • Deployment scripts may also require access to second-hop resources other than the drop location share. A SQL Server on a third machine is a common example, but even accessing the certificate store local to the Remoting target machine can incur a second hop.
  • If some deployment environments are on a network segment isolated from the Build Controller, you may need an additional TFS Deployer instance per segment.
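
As a rough illustration of those two workarounds (machine names, shares, and credentials are placeholders):

    # Option 1: push the files to the target first so the remote session
    # only ever touches local paths (no second hop).
    $target  = 'web01.contoso.local'
    $drop    = '\\tfsbuild\drops\MyApp\MyApp_20120604.1'
    $landing = "\\$target\C$\Deployments\MyApp"

    New-Item -ItemType Directory -Path $landing -Force | Out-Null
    Copy-Item -Path "$drop\*" -Destination $landing -Recurse -Force
    Invoke-Command -ComputerName $target -ScriptBlock {
        & 'C:\Deployments\MyApp\deploy.ps1'
    }

    # Option 2: keep reading from the drop share inside the session and use
    # CredSSP so the credentials survive the second hop (needs
    # Enable-WSManCredSSP on the deployer and the targets, plus explicit credentials).
    $cred = Get-Credential 'CONTOSO\deployer'
    Invoke-Command -ComputerName $target -Authentication CredSSP -Credential $cred -ScriptBlock {
        & '\\tfsbuild\drops\MyApp\MyApp_20120604.1\deploy.ps1'
    }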

On the upside, both PS Remoting and the WMSvc (for use with MSDeploy) can be configured by Group Policy and Windows Server 2012 has Remoting enabled by default.

Regards,

Jason

Jun 8, 2012 at 1:15 AM

Hi Jason,

Thanks for the tips; that is very helpful.  I did find that deploying to environments on an isolated network segment will not work unless there are valid certificates to allow HTTPS communication.  Unfortunately, my IT team did not want to set up the revocation checking service, so none of the certificates on the servers are valid.

In the end, I am using msdeploy to cross the network boundaries.  I had already set this up for our development environment and it has been working for a while, although it deploys directly as part of the build process rather than via TFS Deployer.  Basically, it calls vsdbcmd and msdeploy from the deploy (and build) machine with the proper credentials.
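
For reference, the calls boil down to roughly the following; server names, site name, package paths, and credentials are placeholders, and the exact switches will depend on your Web Deploy and VSDBCMD versions:

    # Web content over the WMSvc endpoint on the target.
    & 'C:\Program Files\IIS\Microsoft Web Deploy V2\msdeploy.exe' `
        '-verb:sync' `
        '-source:package=C:\Drops\MyApp\MyApp.zip' `
        '-dest:auto,computerName=https://web01.contoso.local:8172/msdeploy.axd?site=MyApp,userName=CONTOSO\deployer,password=********,authType=Basic' `
        '-allowUntrusted'

    # Database changes via the database project's deployment manifest from the build.
    & 'C:\Program Files (x86)\Microsoft Visual Studio 10.0\VSTSDB\Deploy\vsdbcmd.exe' `
        '/a:Deploy' '/dd+' `
        '/manifest:C:\Drops\MyApp\MyApp.Database.deploymanifest' `
        '/cs:Server=sql01;Trusted_Connection=True'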

Until MS makes command-line remoting as simple as RDP, this is probably the best way to do it if you can rely solely on vsdbcmd and msdeploy.

Dave