NuGet Design Issues within a large Development & Build Process Workflow

Aug 11, 2011 at 6:27 AM
Edited Aug 11, 2011 at 6:29 AM

The purpose of this discussion is to generate some productive dialogue around the issues experienced in using the current version of NuGet within a large corporate development & build process, and to help steer the design of the product down a route that ultimately works for everyone's requirements.  Apologies in advance for the length of this post.

Some basic requirements

  • We will host our own private NuGet feeds
  • 3rd party dependency references should be sourced internally from our NuGet feeds and package dir content will not be checked in to VCS
  • Internal dependency references (dependencies we build) should be sourced internally from our NuGet feeds and package dir content will not be checked in to VCS
  • Internally built packages can update on an hourly basis, which will trigger dependent CI build configurations
  • Internally built packages will be straightforward framework packages, containing only assemblies with no additional functionality (just smart RoboCopy)
  • Only one version of a dependency reference should be referenced in any given solution (no side-by-side versioning by default).  This would remove any need for HintPath updates / version control thrash (see the HintPath example after this list)
  • With our choice of VCS being TFS, file-locking and the TFS model need to be considered for VS and non-VS scenarios
  • Installs & Updates should be possible without dependency on Visual Studio (such as on a Build Server via MSBuild)
  • References can be specified using Maven-style versioning (not just dependency versions in the nuspec file), allowing for a static packages.config (based on the current design)
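
To illustrate the HintPath point above: with the default versioned packages folder, every package update rewrites the project file, whereas an unversioned folder leaves it static.  A rough example (the package id and framework folder are made up):

<!-- Versioned layout (current default): the HintPath changes on every update -->
<Reference Include="Internal.Framework">
  <HintPath>..\packages\Internal.Framework.1.0.42\lib\net40\Internal.Framework.dll</HintPath>
</Reference>

<!-- Unversioned (ExcludeVersion-style) layout: the HintPath never needs to change -->
<Reference Include="Internal.Framework">
  <HintPath>..\packages\Internal.Framework\lib\net40\Internal.Framework.dll</HintPath>
</Reference>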

Links to Specs & Issues

The requirement for not checking in dependencies is well accepted on the forums, and looks to be included as a spec for a future version:

http://nuget.codeplex.com/wikipage?title=Enabling%20Using%20NuGet%20Without%20Checking%20In%20Packages%20Folder

The requirement for single versions of assemblies has also been requested:

http://nuget.codeplex.com/workitem/914

The issues around updates and dependencies on Visual Studio have been noted too:

http://nuget.codeplex.com/workitem/913 - the workflow of the current update-package command is covered in a question below

http://nuget.codeplex.com/workitem/902

Issues

We've come up against some roadblocks (as expected!) in the current version of NuGet that have encouraged us to get involved in contributing to the code-base.  We'd like to request (and then perhaps provide) some guidance on the intended approach to re-factoring the code-base to support the ExcludeVersion install / update workflow for NuGet packages (across both single- and multiple-project solutions).

Unlike the command-line exe, the PowerShell commands / VS add-in do not support the ExcludeVersion capability.  This is not acceptable in our environment, as developers (headcount in the thousands) will all need to follow the same workflow, and have that workflow reflected on the build server.  Having spent 3 days re-factoring on our own fork of the code-base, we're currently about 90% complete, which is good, but we're now having difficulty understanding how the current workflow and design, specifically of the update-package command, can actually work in an ExcludeVersion workflow.  It seems the design is tightly coupled to the side-by-side versioning paradigm.

When updating a NuGet package (and consequently the project references) in a solution where only one version of a package can be installed, the workflow would have to work something like this (a rough pseudo-code sketch follows the list):

  • Check for update
  • Download new version to cache directory
  • Dereference current version from project files (if using ExcludeVersion, across whole solution in required projects)
  • Uninstall any VS object model changes if possible (if using ExcludeVersion, across whole solution in required projects)
  • Ensure old package is in cache, copy if not
  • Remove package from packages directory
  • Unpack new package from Cache directory to packages directory
  • Add references to project files (if using ExcludeVersion, across whole solution in required projects)
  • Install any VS object model changes if possible (if using ExcludeVersion, across whole solution in required projects)
  • Update packages.config files in all required projects to reflect change
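
Expressed as pseudo-code (every helper below is a hypothetical placeholder used only to mirror the steps above, not an existing NuGet cmdlet), the intent is roughly:

# Hypothetical pseudo-PowerShell - none of these helper functions exist in NuGet today.
function Update-PackageSolutionWide($id) {
    $latest   = Find-LatestVersion $id                       # check the feed for an update
    Save-PackageToCache $id $latest                          # download the new version to the cache first
    Save-InstalledPackageToCache $id                         # ensure the old package is cached (for rollback)
    $projects = Get-ProjectsUsing $id                        # ExcludeVersion => act across the whole solution
    foreach ($p in $projects) {
        Remove-AssemblyReferences $p $id                     # dereference the current version
        Invoke-UninstallActions   $p $id                     # undo any VS object model changes
    }
    Remove-Item ".\packages\$id" -Recurse -Force             # single, unversioned package folder
    Expand-PackageFromCache $id $latest ".\packages\$id"     # unpack the new version in place
    foreach ($p in $projects) {
        Add-AssemblyReferences  $p $id                       # re-reference (HintPath is unchanged)
        Invoke-InstallActions   $p $id                       # reapply any VS object model changes
        Set-PackagesConfigEntry $p $id $latest               # record the new version in packages.config
    }
}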

Currently it works something like this (we think):

  • Check for Update
  • Download new version to cache directory
  • Dereference current version from target project file
  • Unpack new package to versioned packages directory
  • Uninstall any VS Object model changes
  • Modify references to new package location
  • Install any VS object model changes (if using ExcludeVersion, across whole solution in required projects)
  • Update packages.config files in all required projects to reflect change
  • If not referenced in another project, remove from packages directory
  • Repeat for next project

In our opinion, the magnitude of the rework required exceeds what fits within the current design, which is why this is being pushed back to the community.  ExcludeVersion quite simply does not work across the whole toolset at the moment, and it is one of the primary things we need in place to make this work within a corporate environment.

So, questions that we have:

  1. What are the current design thoughts and timeframe around implementing ExcludeVersions across all NuGet tools?
  2. Who is currently looking at this, and are they open to input as to how this can be implemented?
  3. Why is there such a disparity in functionality between NuGet.exe, the PowerShell commands and the GUI?  For example, the workflow behind the PowerShell update-package command is not written with ExcludeVersion in mind.  Was this a conscious design choice?  If so, why?
  4. Can you provide some clarity or guidance around the NuGet verbiage and its actual implementation?  For example, Update does different things depending on which tool you use, and sometimes yields surprising results (NuGet.exe only updates the packages.config, whilst PowerShell updates the whole package and installs it).

Thanks in advance for any feedback gained from this discussion.  None of the above intends to take anything away from the amount and quality of work achieved to date on the product.

Aug 11, 2011 at 6:46 AM

Looking at the requirement "With our choice of VCS being TFS, file-locking and the TFS model need to be considered for VS and non-VS scenarios", it is unclear how this affects the subsequent workflows - is that covered in the "VS Object model changes" via the SCC interface?

Thanks,

-Steve

Developer
Aug 11, 2011 at 6:47 AM
  1. No time frame. ExcludeVersions was added to the command line for people that only wanted to use nuget.exe but not the rest of the tool chain (the old "nu project" workflow).
  2. It's hard to make this work correctly for everyone; we attempted a change to enforce one version per project and faced several issues (there are discussions about it here on the site).
  3. PowerShell and the GUI are running within VS and have DTE available.  Yes, it was a conscious design choice.  There are things you can do within VS that you can't do from the outside as easily (the DTE APIs handle a lot).  The way we designed NuGet was such that you can plug in different project systems/repositories, and everything is definitely possible from outside of VS, but it'll be hard to get 100% compatibility.
  4. See this post (http://nuget.codeplex.com/discussions/268464).
Aug 11, 2011 at 6:52 AM

Understood - thanks.  Sounds like it's completely unobtrusive when using git/hg, which is my primary concern.  My handling is entirely through .gitignore/.hgignore.  Thanks.

Aug 11, 2011 at 7:00 AM
Edited Aug 11, 2011 at 7:01 AM

(Note #1: personal, subjective, opinion follows. You know ... because it's a discussion forum and that's what we do here.)

I've yet to see a compelling reason for why people need to keep their packages out of source control. Sorry, but 'file size' doesn't cut the mustard. It's on the same level as wanting to rename 'packages' to 'lib' just ... because.

My personal opinion is that the NuGet team should continue to focus on their existing workflows. Spending time supporting the non-package-check-in-crowd is as yet unjustified.

You're doing a good job supporting the Get Latest -> F5 experience seamlessly from day 1. Let's keep the effort along that line.

(Note #2: The direct questions asked at the end of the OP are valid. Experiences should be consistent across clients where possible. I'm just weighing in that I think the premise of the post is of low priority.)

Aug 11, 2011 at 7:10 AM

NuGet works great for external dependencies, but it is a bad choice for internal dependencies, where you want to automatically update and rebuild all dependent projects on every commit.  Since NuGet can update more than just references (content files, config...), it would probably involve distinguishing between "full" and "DLL-only" packages (which can be updated without the need to commit changes to source control).  I'm not sure this is a good idea, but I like the way Maven manages dependencies and would like to see something like Maven in .NET.

Yes, you can use the command-line nuget.exe to download the packages, which works for referenced DLLs if ExcludeVersion is used, and then pull any content from the "packages" folder in a pre-build event of the project that is using it, but I agree with chappoo that the disparity in functionality between nuget.exe and the GUI is disturbing.
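
Roughly, that pre-build event looks something like this (the nuget.exe location, paths and package name are illustrative - check the switches against your nuget.exe version):

"$(SolutionDir)tools\nuget.exe" install "$(ProjectDir)packages.config" -OutputDirectory "$(SolutionDir)packages" -ExcludeVersion
xcopy /Y /I "$(SolutionDir)packages\Some.Package\content\*" "$(ProjectDir)"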

It would be great if ExcludeVersion was a solution-wide setting.

Aug 11, 2011 at 7:12 AM

I hear your point, but supporting the non-package-checkin crowd is huge to a lot of us.  It is especially applicable to certain combinations of project type and source code control selection.  In particular, things that do not yield good compression results (binaries, pre-compressed assets such as graphics, etc.) become very heavy over time in a DVCS.  I want source code control to be nimble and fast, and binary check-ins can quickly become a major contributor to missing that goal.  The promise of NuGet solving this problem in a clean/systematic way is extremely compelling.

Aug 11, 2011 at 7:22 AM

I have to say that after advocating committing binaries early on, I have become strongly against it. When using a DVCS, it's just a horrible thing to do.

Aug 11, 2011 at 7:25 AM

The only supported method that enables a transient packages folder workflow (i.e. not checking in your packages folder) is to embed a NuGet.exe install in each pre-build event.  TFS-controlled files are read-only unless checked out, and this causes problems when running NuGet from the command line.

Updates also fail because version numbers need to be updated in packages.config.
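
One partial workaround (a sketch only - it assumes tf.exe is on the path, the workspace is mapped, and that your nuget.exe version accepts a packages.config path for its update command) is to check the file out before the update runs:

tf.exe checkout packages.config
nuget.exe update packages.config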

Aug 11, 2011 at 7:36 AM

The ability to make a certain package dependent on a certain feed.

Example Feed Collection:

  1. Public Nuget Feed
  2. Internal Master Solution Feed
  3. Local Box Testing Prepare Feed
  4. Local Box Development Prepare Feed

When I am developing package X I would like to be able to cycle through my projects and have my local feeds referenced while I am developing multiple packages.  Currently I have to make sure that my current application uses and updates only from the Development Prepare Feed, so as to avoid conflicting with developers utilizing my Testing feed, so that the contents of the package can be validated across multiple locations before it is submitted to our internal Solution Feed.

This also goes for maintaining and spinning up new applications with verified, working compatibility against a specific NuGet package version.  We maintain a version that might exist in the public feed so that our less skilled developers can use the UI and not worry about utilizing the PowerShell interface, with which they are often not familiar.

Aug 11, 2011 at 7:46 AM
dkl wrote:

NuGet works great for external dependencies, but it is a bad choice for internal dependencies, where you want to automatically update and rebuild all dependent projects on every commit.  Since NuGet can update more than just references (content files, config...), it would probably involve distinguishing between "full" and "DLL-only" packages (which can be updated without the need to commit changes to source control).  I'm not sure this is a good idea, but I like the way Maven manages dependencies and would like to see something like Maven in .NET.

Yes, you can use the command-line nuget.exe to download the packages, which works for referenced DLLs if ExcludeVersion is used, and then pull any content from the "packages" folder in a pre-build event of the project that is using it, but I agree with chappoo that the disparity in functionality between nuget.exe and the GUI is disturbing.

It would be great if ExcludeVersion was a solution-wide setting.

Agreed.  Internal dependencies have the potential to be built several times a day, and each project that holds a reference to an old build would need to be updated - this means automated VCS check-ins if packages are persisted, which simply doesn't make sense and is not the job of the build system.

NuGet references are, in my opinion and in this case, MSBuild project references that emit different behaviour.  ExcludeVersion / maven version referencing should be similarly provisioned.

Aug 11, 2011 at 7:49 AM

I'm not even sure ExcludeVersion is the correct verb - as this presumes the version number is present to begin with.  It would be nice to have a static reference location by default to avoid dynamic HintPath updates, and then opt-in if it's your preference to have side-by-side.

Aug 11, 2011 at 8:05 AM

And to reiterate - managing 3rd-party dependency references and managing internal dependency references within the application's dependency graph are two very different use cases, but both should be supported :-)

Aug 11, 2011 at 9:32 AM
Edited Aug 11, 2011 at 9:32 AM

@dfowler So is it worth starting up a discussion around the technical changes that would be needed in the codebase to facilitate this functionality across all tools?  Adding ExcludeVersion purely to NuGet.exe causes problems, because anyone adding a reference via VS/PowerShell will get HintPath values littered with version numbers.

We have already started looking at this in some depth, as essentially our options are to get NuGet to facilitate this use case, or choose some other set of tools that does.  Even if forced down that route, we would look to use nuspec/nupkg as the "interface", similar to http://code.google.com/p/pepita/ but ideally we would love to see this functionality embedded.

@dkl I agree on there being two use cases, the "NuGet as a smarter robocopy" case and the "NuGet builds my solution/project/templated code/makes tea when I add it" case.  Perhaps a specific flag on the package is not such a bad idea, as there is already a move to standardise this in the naming of the package, representing what you are getting when you install the package.  Gut feel is that most people expect and probably want the simple version most of the time.

@chappoo ExcludeVersions doesn't seem like the correct verb.  I think that should be the default case, with IncludeVersion as a specific exception.  This would gel with my experience of never needing two different versions of the same dll in the same solution.

Developer
Aug 11, 2011 at 11:47 AM

Remember nuget has to work for everyone, not just your use case. Here's some food for thought:

  1. We're almost done with the next version of NuGet, so we wouldn't have anything official like this ready for you to use any time soon.
  2. We have a list of other features that are prioritized over this, and unless the community tells us this is definitely what they want, it'd have to be re-prioritized appropriately.
  3. This isn't something I'd tell anyone to send a pull request for, as the scope and impact of a feature like this is huge.

That said, we can definitely start a technical discussion on what it would take to implement this and what it would affect.

Coordinator
Aug 11, 2011 at 4:47 PM

Given the huge scope, I think we’d want a pretty solid functional design before we even considered working on it or would consider any pull request.

The challenge for us is that we don’t manage *our own* dependencies in this manner. So it’s very hard for us to implement this correctly because we’re not “scratching our own itch”.

However, if those in the community who really need this help drive a detailed functional spec with a list of scenarios so we understand why and how this is used, it’s much more likely we’ll get to implementing it someday. Especially if the community members actively act as stakeholders so we can get input the entire time. :)

Aug 19, 2011 at 5:48 PM

OK, so to just throw my hat into the ring.

I've been using NuGet for the last month, and have deployed my own internal company-wide NuGet server.

I have about 12 NuGet packages, of which only a few are independent of each other; almost all are dependent on one or two of the other packages.

One particular NuGet package has about 12 dependencies, and I still haven't figured out how to rebuild everything when that core package has changed, other than opening each solution, manually updating, rebuilding and running a batch file I've written to republish.

A single line change can take 20 minutes to publish through all the dependencies, which is clearly impractical (am I being thick?).

 

We use SVN.  Checking binaries into source control seems dumb, and I was shocked to see how poorly NuGet deals with the absence of packages.  I know the thought is not universal (as can be seen from this thread), but I'm firmly in the camp that thinks a package manager solution should be able to easily and automatically restore the missing packages.  It might be OK for my own packages (not that huge), but consider the PostSharp package (which we do use).  Combined with the fact that the nupkg file itself sits in the packages directory, that package alone is about 17MB, and it exists in ~20 solutions so far.

I'd also like to +1 the single version per solution.  I know the arguments for the alternative, but they seem to be the edge case (as I think your surveys indicate) rather than the norm, so it's frustrating that the focus appears inverted.

 

Finally, @Haacked, please, please, please, how do you manage *your own* dependencies?  Maybe if you could explain a compelling alternative I'd stop tearing my hair out.

Maybe there's a way to write a PowerShell script to open solutions in order, update and build?  I'm very new to PowerShell and probably need to find some good resources.

Coordinator
Aug 19, 2011 at 6:04 PM

I just don’t think it’s dumb to check in dependencies. :) After all, there’s only one thing I trust, and that’s my version control. I don’t trust network shares. I don’t trust my internet connection. I don’t trust my co-workers. :P

If you pull in dependencies from some external source, that source is not versioned and logged in a way you can verify. Someone could change something in source control, but I’ll know they changed it. I get commit emails. I can look at the version control log.

When I get latest, I know exactly what I’m getting. But if I rely on restoring packages from some other source, someone in theory could change the package from under me and what I have committed and tagged in source control doesn’t *exactly* match what I shipped.

Digression: Which actually gives me an idea. Perhaps we should store a hash in packages.config so we know when restoring a package, if the package has changed and can warn folks.

Having said that, I also don’t think it’s dumb to *not* check in dependencies. There are many valid reasons for it. Some folks think the risk is smaller than I do. So we do now have better support for that.

http://blog.davidebbo.com/2011/08/easy-way-to-set-up-nuget-to-restore.html

I’m using that on a project and it’s working out pretty well so far.

Aug 23, 2011 at 1:27 PM

Trust issues?  ;)

I agree that you should be able to use the tool both ways, and I think that some good work has been done to support the use of NuGet without committing packages.  But most large corporates create the majority of their own dependencies (I know, I know, it has to work for every use case...I just want "every use case" to include ours!).  We produce literally thousands of .dlls, and use comparatively few external dependencies.  We still have a requirement to manage our internally created libraries and their reuse, and NuGet is tantalisingly close to being the right answer to that problem.  We also need to ensure that we can reproduce our builds and ensure that dependencies are not tampered with, but source control is not the only way to do that.  Internally secured and published NuGet feeds do a reasonably good job of this (again, great work) as does a simple file share with some permissions.

Thargy, as stated we have a similar issue, although we have a few thousand projects we want to roll this out to.  We have scaled up to around 20 packages in a very nested dependency tree, and updating was hell.  We are actually looking to take one of the alternative clients (MooGet, MuGet, PepitaGet etc.) and modify it so that it simply ignores the packages.config version and grabs the latest package from our internal, controlled and secured feed, deploying it to an unversioned package directory.  This requires no modification to HintPaths, and provides a "ticking" build pipeline, but it does mean we can't use any of the VS2010 IDE integration as it is (yeah, we are looking at modifying that too, but it is a bigger job).  Will update and let you know how we go.

Coordinator
Aug 23, 2011 at 5:40 PM

Just so I'm clear, what's missing in NuGet today that you need? NuGet.exe does have an update command. It comes with a lot of caveats:

  1. It won't run .ps1 scripts
  2. It won't add new content files to your project

But it will:

  1. Update assembly references

It's designed for cases where the package really only contains an assembly and no other content. That sounds like your situation. Have you tried playing around with it?
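
Roughly:

nuget.exe update path\to\packages.config

(See nuget.exe help update for the full set of switches on your version.)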

Aug 23, 2011 at 11:46 PM
Edited Aug 24, 2011 at 12:20 AM

What we need is really a "get latest".  We want to be able to:

  1. Query a Source for every Package ID in a packages.config (ideally where Version="Latest" or some other flag - best would be a Maven-style version matcher)
  2. Download the latest package for that package ID, and do not update the packages.config to reflect a later version (particularly if we are using a flag)
  3. Install the package to a non-versioned package directory, removing the need to update the HintPath in the project file
  4. Echo old/new version updates to the console/logger very prominently for capture in CI build process
  5. No requirement to update anything other than binaries in this situation, as it will need to run via console/MSBuild for CI process.

NuGet Update currently doesn't support this workflow (and I understand it wasn't built with this in mind, so that is fine).  For this functionality to work, it:

  1. Requires single version per solution (non-versioned package directory)
  2. Requires flag or maven version match in the packages.config, not just the nuspec
  3. Requires change to not update the packages.config file to reflect an update (this is more like a source code get latest)
  4. Requires a change to the install process such that it is atomic, and does not install/uninstall via the packages directory (currently IDE update requires both packages to be present in the packages directory to complete the process)
Coordinator
Aug 24, 2011 at 1:16 AM

Interesting. I didn't understand the requirement for #2. Why wouldn't you want packages.config to be updated?  Is it that you always want the latest, so it shouldn't need to change?  If so, perhaps the convention would be (for this workflow) that packages.config wouldn't store any version.

Aug 24, 2011 at 1:30 AM

Well, the main issue is that a NuSpec can dictate a floating version requirement for its dependencies, but a package install can't.  Ideally you should be able to set a Maven-like version requirement so that a package dependency in packages.config can float on a branch version (say "1.3.*"), or simply float to the latest if you are brave.

So, we don't want packages.config to update, but we may need to constrain it to a branch (version stream) or even constrain individual dependencies explicitly, so we still need versions in the packages.config.  Something like this (based on http://docs.nuget.org/docs/reference/version-range-specification and using empty for unconstrained):

<packages>
  <package id="Castle.Core" version="2.5.2" />
  <package id="Internal.Framework" version="[1.0,2)" />
  <package id="Internal.Other.Package" version="" />
</packages>

Coordinator
Aug 24, 2011 at 2:04 AM

We do support that! http://docs.nuget.org/docs/release-notes/nuget-1.4#Constraining_Upgrades_To_Allowed_Versions

Unfortunately, I haven’t had the time to move that information over to the main docs. :(

Ack!

Phil

Aug 24, 2011 at 3:02 AM

The AllowedVersions almost gets us there, but it will still (I believe) update the Version when we run the update.  So it will constrain the update, but it will modify packages.config.  I guess making Version optional (so just having an AllowedVersion) would work...
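
To illustrate: the first line below uses the 1.4 allowedVersions syntax as I understand it, while the second, version-less form is the hypothetical one I'm after:

<package id="Internal.Framework" version="1.0.0" allowedVersions="[1.0,2.0)" />
<package id="Internal.Framework" allowedVersions="[1.0,2.0)" />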

Aug 24, 2011 at 3:41 AM

Oh, and it doesn't fix the non-versioned package directories... versioned HintPaths hurt...  :)

Coordinator
Aug 24, 2011 at 5:35 PM

Remind me again why modifying packages.config is bad?

As for the hint paths issue, we have an issue logged where we considered adding a flag to enable that behavior with nuget.exe. The downside would be that a solution with packages installed in such a way would not be compatible with NuGet within the package manager dialog.

Would that be a problem?

Aug 25, 2011 at 1:10 PM

Modifying packages.config is only really bad when:

  1. You use TFS and the command-line NuGet.exe.  Files are write-protected by default with TFS until they are checked out; nuget.exe doesn't hook into TFS (although the IDE plugins do), so it doesn't understand any of this.  This causes pain and confusion to all involved.
  2. Changes will not be checked in, and hence will not reflect the actual version you build against (which is explicitly wrong).  The workflow becomes "update and throw away changes".  Version ranges would make this explicit.

HintPath issue and NuGet.exe... we had a look at the current implementation of ExcludeVersion and logged a few issues/discussions.  It is a major problem in the workflow when hand-editing of HintPaths is required after a package add in order to remove the version.  We looked at changing this in both PowerShell and the VS IDE, which is why this discussion came about.  We found that some design choices had been made that caused some larger problems when you try to add the -ExcludeVersion flag to the PowerShell scripts and the VS IDE.  We are going to try to work around it for the moment whilst we look into the issues in the front end.  We were trying to gauge the level of interest in fixing any of those design choices with this post.  Still unclear on how we are doing on that point, which makes it a bit of a forking gamble for us to invest too much time in it!  :)

Links to related issues/discussions:

http://nuget.codeplex.com/discussions/268465

http://nuget.codeplex.com/discussions/268464

http://nuget.codeplex.com/workitem/1391

http://nuget.codeplex.com/workitem/1383

http://nuget.codeplex.com/workitem/1381

Aug 25, 2011 at 4:54 PM

Catching up on this thread and trying to understand your scenario...

It sounds like what you're looking for is the semantic equivalent of having all your assembly dependencies in a fixed location on a network share that always has the latest, such that you always build against the latest without needing to update anything that's checked in. But you also want to take advantage of NuGet's dependency management so that if you reference an assembly, all the dependent assemblies are automatically referenced.

Is that a decent description of your requirement?

Aug 25, 2011 at 9:52 PM

Maybe there is a very simple solution for your scenario: don't change the package version during development! By doing this, both the csproj and packages.config will remain unchanged.

The one thing that won't quite work correctly today is that when you run 'nuget.exe install packages.config etc...', it won't check the server if you already have that version installed (this is done so it's really fast). In the short term, you can get around this by wiping out the packages folder before running the command.

Longer term, we can think about adding a -alwaysCheckFeed flag that will cause it to always check the feed in case the hash for that version of the package has changed, which would cause it to get installed again.
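
In script form, the interim workaround is roughly this (paths are illustrative):

rd /s /q packages
nuget.exe install MyProject\packages.config -OutputDirectory packages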

Aug 26, 2011 at 12:13 AM

David, that is almost what we want.  However, we need the ability to identify specific versions as dependencies, not just floating versions.

As for the suggestion of not changing the version, although this would fix the technical issue, I don't think any development operations team or build team would really embrace this.  In a continuous delivery environment, any build that passes all gateway testing should be able to go to dev/test/qa/prod environments.  If we couldn't link back all the way from a binary on a server, via its version number to the package that it came from, to the build that created it (and hence all the way back to the included commits, their work items, business case etc) then it probably wouldn't fly.  

You may be able to look at using an abstract version number for the package version, and embed correctly versioned assemblies into the package (which would link to a build), but we are then losing most of the benefit of using NuGet in the first instance.

This approach of allowing floating and fixed version dependencies via an artefact repository is not a new pattern, and has been used and documented even within Microsoft (http://www.amazon.com/Build-Master-Microsofts-Configuration-Management/dp/0321332059).  What we are trying to do is not unusual in the build space, and is almost a requirement in the continuous delivery space.  I will look for some more reference material to help explain exactly what we are trying to achieve, as I am sure that other people have made a better attempt at explaining it than me.

Aug 26, 2011 at 2:03 AM

To be clear, I'm not suggesting that you don't change your assembly versions. Only that you keep your NuGet package version unchanged during development. This should allow tracking back what build created what. But I don't think knowing the package version where an assembly came from is interesting if the package contains nothing other than the assembly. At that point, the package is just a dumb container that enables NuGet use.

Aug 28, 2011 at 1:20 AM

Unfortunately, not changing the package version would mean an overwrite on each CI build, as the feed would only allow a single copy of the same package.  This means that NuGet couldn't be used as the artefact repository, and we would have to store the packages elsewhere as well.

Aug 28, 2011 at 3:18 AM

To me, it seems a bit contradictory to want to keep all the older versions on the NuGet server, while at the same time wanting to make sure that no one can use anything but the latest (within a range). The standard reasons for keeping all the older versions on there (not breaking anyone depending on them) just don't apply in this scenario.

So if it's simply a matter of archiving for the record, it doesn't seem far-fetched to use an alternate location for this purpose. But those older versions just don't need to be on your feed.

Note that I'm not saying that doing the full work to support version-less folders in NuGet would have no value, but it really feels like you could achieve 95% of the benefits with the current NuGet (well, if we added the minor change I mentioned, but that's an easy one).

Aug 28, 2011 at 9:15 AM

But we are not saying that we can't or won't specify a specific version.  We just need the ability to float on fast-changing dependencies, and also be able to tie this down tightly when we need to (close to a release).  Think of it as -SNAPSHOT (in a Maven sense).  The -alwaysCheckFeed flag might be useful for some, but really doesn't address any of our issues (as we see it).

The lack of support for non-versioned packages directories is probably the only real impediment to using NuGet the way we are envisioning it.  Unfortunately, changing this requires a rethink of the install process (not installing from the packages directory, but possibly from the cache), involves the single-version-per-solution issue, and a few other design changes.  I can understand why people are leery of making this change (based on scope), but even without our requirements in mind this seems like a pretty logical change.

This would still leave us a lot of messing about with pre-build/post-build events to clean up modified packages.config files, and we would probably have to roll our own version of NuGet.exe (or equivalent) to get the "update latest" functionality we need... but that is neither here nor there really.  It would still seem awkward that we were ignoring the hardcoded version number in the packages.config, but that is an issue of elegance rather than functionality.

Aug 28, 2011 at 10:58 PM
benphegan wrote:

But we are not saying that we can't or won't specify a specific version.  We just need the ability to float on fast-changing dependencies, and also be able to tie this down tightly when we need to (close to a release).

And the model I suggest allows exactly that, by choosing when you keep the package version the same and when you don't.

e.g. While you're developing 1.1, your package version is 1.1 (the assemblies in it can be 1.1.[build number]). Once 1.1 is final, the one that's left on your feed is the final one. Any work on 1.2 is now done in the 1.2 package. When you need to switch from depending on 1.1 to depending on 1.2, you can do this by upgrading from VS, which should not be a big deal since it'll happen quite rarely (once per release). 
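
As a sketch, the nuspec during the 1.1 cycle might look like this (the id and text are made up) - only the assemblies inside the package carry the full build number:

<?xml version="1.0"?>
<package>
  <metadata>
    <id>Internal.Framework</id>
    <version>1.1.0</version>
    <authors>Internal Build</authors>
    <description>Package version stays 1.1.0 for the whole cycle; the assemblies inside are versioned 1.1.[build number].</description>
  </metadata>
</package>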

Please try to see how such a model may work for you, as I think you are dismissing the option too quickly. I'm not saying that it absolutely will work for your case, but I'm yet to hear of a reason why it won't. :)

Aug 31, 2011 at 10:29 AM

Just to add my 2c: My company is currently using Ivy to manage our internal (and external) dependencies, but we'd love to move to NuGet if it could support our use-case.

We basically want to do what others above have talked about. The other wrinkle is what the dependency manager should do when one dependency (A) has been changed, but others (B, C) that are also dependent on it (e.g. B depends on A, C depends on B & A) have not yet been built. Ivy supports the sort of "Dependency Version Levelling" requested in http://nuget.codeplex.com/workitem/296.

If you don't update the version numbers, then you can't do dependency levelling as:

  • NuGet can't discriminate between versions
  • All the old versions have been overwritten anyway

Maybe there's a better way of structuring our build (I'd be glad to hear suggestions). We are a small, but not tiny, team (<20) working on a codebase that we've split into maybe 15 separate builds. Most of the coding work goes into 2-4 of those builds (which are mostly near the top of the dependency tree), but building everything (including installers & help) & running the tests takes > 1.5 hours, so we definitely don't want to just put it all in one big solution, or anything like that.

What we feel we need the dependency levelling for is so that if someone commits a breaking API change to one of the low-level solutions, it doesn't break anyone who pulls dependencies for a high-level solution before all the mid-level solutions have been built.  I guess that would also be solved by committing all the dependencies to source control (currently svn, maybe TFS soon), but that feels like quite a lot of stuff to be going into source control.

Aug 31, 2011 at 4:23 PM

A small variation of what I suggest above would I think work for your case: only increase the package version when there is a breaking change. This way:

  1. Non-breaking changes are automatically picked up
  2. Picking up breaking changes requires an explicit step. This needs to be done in VS, which makes sense since you probably have to change some code as a result of the breaking change.

Generally, NuGet was designed to require an explicit action to move from one dependency version to another. But by judicious use of package version re-use (i.e. keeping it the same), you can achieve a variety of interesting 'company internal' workflows.

Sep 9, 2011 at 2:15 AM

@davidebbo - I wanted to say +1 to the proposed "internal workflow."  In the open-source community this may seem useless, but when using NuGet to manage internal packages this makes a lot of sense.

Maybe NuGet could check the package hash and, if it is different, only do a package restore (by restore I mean downloading the DLLs or lib folder - I believe that is the same thing that happens when using the following workflow: http://docs.nuget.org/docs/workflows/using-nuget-without-committing-packages).  Any breaking change or app.config/web.config change would constitute a version increment and a manual update.

Sep 9, 2011 at 6:16 AM

@boydale1: right, that's basically the -alwaysCheckFeed flag I described above. I do think that generally, this workflow would work well if we implemented that piece.

Sep 9, 2011 at 2:17 PM

@davidebbo

Is it possible to restore a package from the Package Manager Console or would an uninstall then re-install be necessary in the current version of NuGet?

Is there an official Feature Request in the Issue Tracker for the alwaysCheckFeed option?

Sep 9, 2011 at 2:46 PM

@boydale1 We ended up modifying NuGet to add a "-Latest" switch to the command-line "install" command, and use that with "-ExcludeVersion" to ensure that the hint paths and packages directory do not reflect a version.  That way a developer can just explicitly request that the version is ignored, and the latest version is installed to the packages directory with no changes to project files.  This is not as clean or elegant as we would like, as we do not explicitly update the version in the packages.config - we are basically ignoring it.  We will probably end up just coding this up as an extension to the command line, and there are a bunch of other nasty issues with this that we are working through (hand-rewriting the damn hint paths to exclude the version number is the major one!  Fixing that requires a lot more changes.).

This, in combination with TeamCity NuGet triggers, does allow a ticking build graph for internal consumption of packages.  The suggested -alwaysCheckFeed flag did not really work for us to support this situation.

@swythan Using the packages.config to restrict our "-Latest" to the allowedVersions="" glob would allow a simplified "get me the latest non-breaking changes based on version number range dependencies across my entire build graph".  This is where we will be looking to head with the changes we are making; it sounds like this may provide the functionality you are looking for as well?
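
For reference, the invocation on our fork currently looks roughly like this (-Latest is our custom switch, not part of stock NuGet):

nuget.exe install packages.config -Latest -ExcludeVersion -OutputDirectory packages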

Sep 9, 2011 at 3:22 PM
Edited Sep 10, 2011 at 2:22 AM

@BenPhegan - You are taking a different approach.

I was looking for a solution where I could overwrite a package that is on the server and NuGet would know to restore the package because the binaries have changed.  Your solution would require that I store every build's nupkg on the server.  Storing 100+ CI builds a day on a server is impractical.  @davidebbo's solution allows you to NOT increment the package number during development, but still receive the latest binaries.

I've just read your entire post.  Is there any plan to release the functionality you mentioned to @swythan?

Sep 9, 2011 at 3:30 PM

@boydale1: What about hashing the nupkg? Then you could compare it to your installed package's hash. The thing is, it could be pretty expensive to do, so it should probably be done as a special command.

You could write an extension to nuget.exe that would do this pretty easily. I have a blog post out there that walks through this process step by step. http://devlicio.us/blogs/rob_reynolds/archive/2011/07/15/extend-nuget-command-line.aspx

____
Rob
"Be passionate in all you do"

http://devlicio.us/blogs/rob_reynolds
http://ferventcoder.com
http://twitter.com/ferventcoder

Sep 10, 2011 at 7:55 AM

Note that the NuGet feed already contains the package hash, so it's possible to check whether it has changed without actually downloading the package. That's the idea behind the alwaysCheckFeed flag. Right now, it's just an idea that's not really being tracked, but if needed we can put it on the radar for a future version. And for now, you can experiment with your own implementation as Rob suggests.
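
As a rough sketch of the check (assuming the feed hash is base64-encoded SHA512, and that you've already read it from the package's feed entry into $feedHash):

$bytes     = [System.IO.File]::ReadAllBytes("packages\Internal.Framework.1.0.0.nupkg")
$localHash = [Convert]::ToBase64String([System.Security.Cryptography.SHA512]::Create().ComputeHash($bytes))
if ($localHash -ne $feedHash) {
    # The package on the feed has changed since it was installed, so pull it down again.
    Write-Host "Internal.Framework has changed on the feed - reinstall it."
}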

Sep 12, 2011 at 10:56 AM

@BenPhegan - That sounds interesting. The "-latest" switch certainly reflects the Ivy version="latest.integration" setup that we are using. I'm afraid I've not actually tried NuGet (as it clearly can't just drop in in place of Ivy), but I'm guessing we'd have to maintain the allowedVersions="" filter manually, right?

Ivy's killer feature for us is its automatic dependency levelling, which means that no-one ever gets an inconsistent set of dependencies, even when the end-to-end build is half-way through.

We do, in fact, publish the outputs of every CI build into our Ivy repository; we delete them when the disk usage gets high, though.  They're only really useful during the same time window when the dependency levelling is in effect.

TBH I'm wondering about using the "commit dependencies to source control" workflow. If we added a step to the automated builds that committed the dependencies (but ONLY if the build succeeded), then no-one would have to worry about using NuGet at all (most of the time). The build server would be the only one doing a fetch. The major downside would be the amount of storage space potentially used by the dependencies in source control, given that (generally speaking) we wouldn't be able to purge old dependencies. Hopefully the delta compression algorithms in svn/TFS would handle this OK, but I'm not convinced they would.