Automatic Updating during Build?

Oct 19, 2010 at 7:53 PM

I just proposed to my boss that we use NuPack for internal dependencies. He really liked the idea, but he has one question which I haven’t really been able to find the answer to.

Is it possible to automate the updating of packages as part of our build?

We have certain libraries for which we would always want to pull the latest version, whether building locally or on the TFS server. I read in the FAQ that the reason there is no command-line tool is that NuPack automates Visual Studio, which makes sense… but would this also apply to updating a NuPack package that is already referenced?

If this is possible, can anyone point me in the right direction?

Oct 19, 2010 at 9:39 PM
You could do this with a special task (once the command-line tool exists), but my question is: why at build time? Updating your packages could break the build/tests, and breaking builds should be reserved for humans. :-)

If updating your packages were super easy (it will be with NuPack), then make it something that happens at night on one dev's workstation, where it can be verified and checked into source control before it breaks the build. You get your autopilot that way without needlessly breaking the build. Just have one developer per project in charge of updating packages. :D
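Concretely, that designated dev's routine could be as simple as this, run from the Package Manager Console in VS (just a sketch; the exact command names and switches may shift before release, and MyAwesomePackage is a made-up id):

    List-Package -Updates              # see which packages have newer versions on the feed
    Update-Package MyAwesomePackage    # pull an update into the solution

Then build, run the tests, and only check the changes into source control once everything is green.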

Of course I'm curious as to the reasoning.
"Be passionate in all you do"
Oct 19, 2010 at 10:43 PM

VS automation is an important part of NuPack. Suppose you have the following situation: you've installed an awesome package which depends on a cool package:

MyAwesomePackage v1 -> SomeCoolPackage v1

Now the makers of MyAwesomePackage update their package to v2, which adds another dependency:

MyAwesomePackage v2 -> SomeCoolPackage v2
                    -> AnotherCoolPackage v1

If your nightly build process updates MyAwesomePackage automatically to v2, it's going to have to install AnotherCoolPackage v1. That means it needs to update your development environment (aka your VS solution) with the new assemblies and files brought in by this new dependency.
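To make the cascade concrete, here's a toy PowerShell sketch of that resolution walk (not NuPack's actual resolver, just the shape of the work it would have to do):

    $deps = @{
        'MyAwesomePackage v2'   = @('SomeCoolPackage v2', 'AnotherCoolPackage v1')
        'SomeCoolPackage v2'    = @()
        'AnotherCoolPackage v1' = @()
    }

    function Install-WithDeps($pkg) {
        # depth-first: every transitive dependency has to land in the solution too
        foreach ($dep in $deps[$pkg]) { Install-WithDeps $dep }
        Write-Host "installing $pkg (assemblies, content files, references)"
    }

    Install-WithDeps 'MyAwesomePackage v2'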

I don't know about you, but it's not something I'd expect my build server to do. Not only that, keep in mind that it would need access to Visual Studio to properly add the new assembly references.

Oct 20, 2010 at 1:17 AM

Hey, thanks for the replies.

I totally understand what you’re saying. But the thing is that we would only be using this for internally developed libraries that our own solutions reference.

We have a script set up now that automatically copies these libraries to different locations on our TFS server to make sure everything stays up to date, but it is cumbersome to maintain, and using NuPack would be much simpler.

So, that being said, it would be awesome if we could have something during the build that would call the NuPack API to update the packages.

Oct 20, 2010 at 3:31 AM
Edited Oct 20, 2010 at 3:31 AM

Rob, Phil,

Thanks for your replies.  Let me try to describe our situation a bit further to see what best practice you would recommend.

We currently have multiple TFS projects, including some that contain multiple internal utility DLLs. Prior to our upgrade to TFS 2010, we used Dependency Replicator to push "released" libraries to global \Library folders in projects that had dependencies. Our client projects would look in that global directory for their references, which triggered an automatic refresh of dependencies during both local and TFS builds. After our upgrade, and for other reasons, we would like to move away from Dependency Replicator, hence we are considering NuPack.

I agree with you that from a best practice perspective, updating a dependency should be equivalent to a code change, in the sense that it should be done and tested by a developer before being checked into source control/build.  
However, in practice we have virtually never had a case where changes to internal libraries were not meant to be propagated to our client projects. Our strategy with internal DLLs has been to always keep our development branches synced to the latest internal libraries. This way, we are constantly building against and testing a single, latest version of our libraries, and we don't have to support outdated versions of our internal DLLs.

My concern with NuPack is that by making this update process manual and per-project, we will end up with dozens of projects erroneously referencing different versions of their dependencies.

What would be great is a way to centralize dependency propagation (something along the lines of the sketch after this list) that:
- would be scoped to an entire product branch (and could be tested/built/merged along with the source code in that branch)
- would ensure the same version of the dependency for all projects in that branch
- would give us control over when, and what version of, the dependency is propagated
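For illustration, I'm imagining roughly this, run once per branch from the Package Manager Console (the package ids are made up, and I'm assuming Update-Package can take a specific version):

    # hypothetical branch-level manifest: one pinned version per internal package
    $pinned = @{
        'Our.Logging' = '1.4.0'
        'Our.Data'    = '2.1.0'
    }

    foreach ($entry in $pinned.GetEnumerator()) {
        # assumed switch: move every project in the solution to the pinned release
        Update-Package $entry.Key -Version $entry.Value
    }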

What approach can you recommend?


Oct 20, 2010 at 4:25 PM

One approach you could take is to have an internal feed for these packages. When one of the internal libraries is updated, you push the update to the feed. Then, the next time someone opens up VS, they can run the command List-Package -Updates.
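Day to day, that would look something like this in the console (Our.InternalLib is a made-up package id):

    PM> List-Package -Updates          # lists packages with newer versions on the feed
    PM> Update-Package Our.InternalLib # pulls the newer version into the solution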

Even better, I think it's possible to have a package that runs a PS script every time VS opens. That package could run something like:

foreach ($id in (List-Package -Updates | select -ExpandProperty Id)) { Update-Package $id }

I believe all you need to do is put that command in an init.ps1 script in a package and make sure that package is installed.
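Put together, a sketch of that package's init.ps1 (the three-argument signature follows the package-script convention; treat both it and the availability of the console commands at that point as assumptions):

    param($installPath, $toolsPath, $package)

    # on each solution load, update anything that has a newer version on the feed
    foreach ($id in (List-Package -Updates | select -ExpandProperty Id)) {
        Update-Package $id
    }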