NuGet in principle is great but causes us more pain than managing libraries manually

Dec 19, 2011 at 4:30 PM

We switched to using NuGet for managing our internal and external dependencies about four months ago, but we are now switching back to managing our dependencies the old, tried and tested way. It is not that we do not see NuGet as the solution; it is just that NuGet (as we understand it) does not address our scenarios very well. We will review NuGet at a later date once we can start getting traction on our projects again. Below is an example of our internal structure, how we are using NuGet, and what our issues are.

Our current project structure

We have the following project structure:

                        -> Deliverable Prod C
Int Dep A -> Int Dep B --> Deliverable Prod D
          \            \
           \            -> Deliverable Prod E
            -> Internal Prod F
  • All projects depend on external third party dependencies
  • Internal dependencies are built by our continuous integration server and then packaged to our internal NuGet server 
  • Internal packages have specific internal and third-party versions set in the packages
  • Internal dependencies produce more than one package. We treat the whole solution as a suite, but each project is a package in its own right and the packages can depend on each other.

How we were maintaining NuGet dependencies

  • We are using NuGet power tools so that dependencies are downloaded when building.
  • Using the GUI to install and update external packages.
  • When an internal dependency is packaged, any dependent projects are triggered and update their internal packages using the NuGet command line, restricted to our internal sources.

Using the above we are able to keep our versions of dependencies across projects consistent and have CI with both dependencies and code changes.
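For reference, the source restriction described above can be expressed either with the command line's -Source flag or in a NuGet.config checked in next to the solution. A minimal sketch, assuming a hypothetical internal feed URL:

```xml
<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <packageSources>
    <!-- "Internal" is a placeholder for the CI-fed internal server -->
    <add key="Internal" value="http://nuget.internal.example/feed" />
  </packageSources>
</configuration>
```

The CI update step can then be pinned to that feed with something like `nuget.exe update MySolution.sln -Source http://nuget.internal.example/feed`.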

Issues that we are having

Branching internal dependent projects is painful. If we need to branch an internal project we currently have to rip our internal package references out of NuGet and use assembly references in our lib folder.

Managing multiple packages. We split our internal dependency suites into multiple packages because we don't want all our projects to drag in all the third-party references that our internal packages use. For example, suppose we have a suite with two assemblies, one referencing Windsor and one referencing StructureMap. If we were to use a single package we would end up with both libraries being referenced, so we separated them into multiple packages; unfortunately this makes things harder to manage.

ReSharper using shortcut. We are avid ReSharper users, and prior to NuGet we used the shortcut that adds an assembly reference together with the using directive to manage references. We have tried not to use it, but it is an easy mistake to make, and avoiding it is a lot slower.

Visual Studio assembly search paths. Due to how Visual Studio searches for referenced assemblies, we have had issues with the wrong library being referenced from the bin folder of another project. This normally happens when there was an issue with an update and the project's assembly hint path no longer points to a valid path.

Updating projects takes a very long time. Because we have a large number of projects it can take a good 10 minutes to do an update of our internal dependencies. Previously updating dependencies was as simple as copying output assemblies from CI into our lib folder.

Packages.config files are corrupted when merging. When two people change a NuGet-referenced package to slightly different versions, a merge can leave both versions referenced. This does not show up as a problem straight away, as the latest version appears to be used, but when updating, either through the GUI or the command line, we can get strange results. We can mitigate the issue by watching out for merges on packages.config files, but people are not infallible.
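As an illustration of the merge damage described (package ids and versions are made up), a bad merge can leave both sides' entries in the file:

```xml
<?xml version="1.0" encoding="utf-8"?>
<packages>
  <!-- both branches' versions survived the merge -->
  <package id="Internal.Core" version="1.2.0" />
  <package id="Internal.Core" version="1.3.0" />
</packages>
```

The file still parses, which is why the problem stays hidden until the next update.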

Unversioned NuGet folders. We never have a case where half of our solution uses one version of a dependency and the other half another version, so the benefit of having versioned library folders is alien to us. Prior to NuGet we had a lib folder that contained our third-party libraries, and we upgraded by replacing the content of the folders with the latest release. Having a flat structure in NuGet would also help us when trying out a new version of an internal dependency without waiting for CI to push the new version into the repository. We have tried using the unversioned folders flag in the command line tool, but we had issues when trying to do updates.

Dec 19, 2011 at 5:39 PM

I think each of these issues could be independent discussions. :)

Branching issue: what specifically do you mean by this? I'm having trouble visualizing the issue.

Managing Multiple Packages: Same with "Managing Multiple Packages". What's the problem here? Did you know that launching NuGet by right clicking the solution makes it easier to install packages into multiple projects? Is there a specific workflow you expect here?

ReSharper: This sounds like a great feature request for R#. Ideally, if they auto-add a reference to an assembly installed within a NuGet package, they'd also update packages.config. Or better yet, call into the NuGet API to install it. NuGet doesn't re-download packages that are already installed in another project of the same solution.

Updating projects: That's probably something we need to look at. We welcome any help in finding the hot spots and optimization. :)

Unversioned NuGet folders: This comes up a lot with JS libraries where folks have multiple versions of a library in the same project due to dependencies. Having said that, I agree that unversioned folders solve a lot of problems. We've had a lot of discussions on that and it's not an easy change to make. It's something we'd like to do someday, but perhaps in a constrained manner for those who opt in.

Thanks for the feedback!

Dec 19, 2011 at 5:49 PM
Thank you for that well thought out email. It's clear that you want NuGet to work or you would not have made such an effort to lay out all of the pain you are experiencing. We may want to address each specific area and try to understand what you are doing.
Dec 19, 2011 at 11:45 PM
Edited Dec 20, 2011 at 12:10 AM

@haacked: Yes as you say they probably could be separate discussions. I'll try to give you specifics of what we are trying to achieve.

Branching: When we have enough features in our releasable projects we branch our code as a release candidate. We may need to make changes to that branch either during the UAT stages or as a live bug fix. This could require changes to our internal dependencies. During this time the current work stream will have pushed master forwards, which will have added code to our internal dependencies that we would not want to release. We will therefore want to also branch the dependency, make the changes required for the release candidate or live system, and merge those changes back into the master branch. I don't know how we would use NuGet to handle this, bearing in mind we could have branches for each of our releasable projects.

Managing Multiple Packages: I'm not sure what I really want from this; all I can pass on is our learnings. We have about 10 packages and 6 packages in our separate suites. Yes, I was aware of the solution-level NuGet GUI and we do use it, but we find it painful, perhaps because of the knock-on effects of having corrupt packages.config files, slow updates and the sheer number of packages. When it works it works beautifully: you update one of the top-level packages and all the packages get updated because of the chain of dependencies. However, when it doesn't work it can be painful, some examples of which are:

  1. Not all packages sit in the dependency chain so there is a lot of having to click through each package and click update. This isn't something that is broken, just time consuming. We have found a way to handle this in CI but that had other issues which I will cover at the end.
  2. If due to a merge we end up with the same package but different versions listed in the packages.config, the GUI will not show that the package has been updated, so you are never sure if everything is up to date. To fix this I normally have to go through each packages.config and remove the offending lines. Not a bug with NuGet, but painful nevertheless.
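Hunting for those offending lines can at least be automated. A minimal sketch (Python, with an inlined example file; in practice you would glob for every packages.config in the tree) that flags package ids listed more than once:

```python
# Hypothetical helper: find package ids that appear more than once in a
# packages.config - the symptom of the bad merge described above.
import xml.etree.ElementTree as ET
from collections import Counter

def find_duplicate_packages(packages_config_xml):
    """Return package ids listed more than once, sorted alphabetically."""
    root = ET.fromstring(packages_config_xml)
    ids = [pkg.get("id") for pkg in root.findall("package")]
    return sorted(pid for pid, count in Counter(ids).items() if count > 1)

# A merged file that kept both sides' entries for Internal.Core:
merged = """<?xml version="1.0" encoding="utf-8"?>
<packages>
  <package id="Internal.Core" version="1.2.0" />
  <package id="Internal.Core" version="1.3.0" />
  <package id="Castle.Windsor" version="2.5.3" />
</packages>"""

print(find_duplicate_packages(merged))  # ['Internal.Core']
```

Run across a solution, this turns "go through each packages.config" into a quick report of exactly which files need hand-fixing.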

Because of the issue with managing all the dependencies across the solution, I played around with an idea using our CI server (TeamCity).

  1. After a successful package of an internal dependency a build is triggered on the depending project
  2. Using the NuGet command line tool all the required packages both internal and external are installed
  3. Again using the NuGet command line tool an update is done on only the internal packages by using the sources flag
  4. Using MsBuild the solution is compiled against the updated packages
  5. If the compilation is successful the updated package config and project files are pushed into source control which triggers the main build and unit tests

This was working quite well, but I hit an issue when a new third-party dependency was added to one of our packages. When the update was called, because we had specified only our internal sources, the external dependency could not be resolved and therefore could not be installed, breaking the whole chain and rolling back the update.

I think the two things I can see us needing for this are, first, the ability to group packages as a suite: if you update one you want them all updated, regardless of whether they depend on one another, or even better, just being able to click on a higher-level node that represents the suite. Secondly, being able to update from a particular source while allowing NuGet to get sub-dependencies from their own sources, or perhaps the ability to use wildcards for the package names that you want updated.

Unversioned NuGet folders: I think for us because our projects in a solution always use the same version of external assemblies it could be just one flat folder. It doesn't even need to be structured like our lib folder because we wouldn't be maintaining it by hand we would have a shiny tool for that.

As ferventcoder said, we do want NuGet to work and it is the right tool for the job, but unfortunately we cannot afford the downtime at the moment. I will continue to use it on side projects and will revisit it for our core products some time next year, and I will try to give feedback here when I have anything of value.

Dec 20, 2011 at 12:55 AM

Hmm, still not understanding the Branching scenario. Could you start a new discussion with a specific example of how a problem arises?

As for managing multiple packages, one thing you might be able to do is take advantage of the Package Manager Console and PowerShell. There might be ways you can group packages and pipe them into the Update-Package command today that would save you time. You could even package those up as new commands.

Sep 10, 2014 at 7:06 PM
I saw these posts and our team is currently facing two of the same issues. These posts date back to 2011; I was wondering if there are solutions already available.
a. Updating projects takes a very long time.
- We are currently having this issue. It can take up to 30 mins for us to update on some projects. Is there a configuration to make deleting and adding of files faster?
b. Managing Multiple Packages.
 "As for managing multiple packages, one thing you might be able to do is take advantage of the Package Manager Console and PowerShell. There might be ways you can group packages and pipe them into the Update-Package command today that would save you time. You could even package those up as new commands.". - from haacked (Coordinator)

- We wanted to know if there is some sample code we can follow?
Thanks
Sep 11, 2014 at 1:36 AM
This is regarding my post above. I just checked my NuGet Package Manager Console. It seems I have an older version: mine is 2.7.40911.287 and the latest version is 2.8.50313.46. This might solve my problem in part a. However, I am still interested if you have any answers to part b, "Managing Multiple Packages".
Nov 7, 2014 at 12:54 PM
Edited Nov 7, 2014 at 1:51 PM
I know this post is a bit old, but since I recently tried to use NuGet to manage inter-solution dependencies, and because all of the pain I've been feeling while doing it matches the points (and more) that @bronumski mentioned, I thought it would be worth reviving this post. A little context may help:


Currently, I'm working on a middleware project that is divided into 3 solutions:
Common ------------------------->MainService
               \_____> Module A _______/
  • Common is a 10-project solution with generic libraries that are shared between different module services and worked on, infrequently, by multiple teams. Essentially they are frequently used reusable utility classes and DTOs.
  • Module A is a data/repository service that can be shared by multiple system components. They aren't currently web services, but may be in the future. They are integrated into the MainService as a plugin. The solution has about 60 projects, with both proxy classes and multiple implementations. About 6 packages are generated from this solution.
  • Main service is a large solution - 90+ cs projects with the main service and many plugin-like libraries.
  • Module A consumes Common; MainService consumes packages from Common and ModuleA
Previously, we had a simple dll-copy build step that we used to manage dependencies, but we started having problems when deploying because of dll versioning. ModuleA could for some reason be built against a compatible build of Common that differed from MainService's, and blow up at runtime. We also had a complicated CI pipeline because of this (and actually had some cyclical dependency references in the build cycle).

We thought of using NuGet to manage dependencies after other teams in the company had success with it (but in single-solution scenarios). Unfortunately, though, some of the conventions associated with NuGet were problematic and workarounds needed to be found. I believe I worked out a few, but we are still taking some hits in team performance, similar to what @bronumski mentioned:

Issues that we are having

Feature branching creates concurrency problems: We work with feature branches, and in the process of implementing a NuGet strategy to manage dependencies, I tried to bring SemVer into the process to manage version numbers. Our current convention is that all projects in a solution share the same version number, and the package version is equal to the dll version. Any change requires a version increment (major, minor, patch or revision depending on the type of change; revision is only used for regeneration of the same output). We have a shared AssemblyVersion.cs file linked to each project, and an internally developed tool to increment the version number in that file. (Originally the tool incremented each AssemblyInfo in the solution, but we later changed to a single file to reduce changes in the repository.)

This creates LOTS of problems: merges become somewhat of a hellish scenario. Though a change of version in the upmost project isn't so problematic, downstream solution updates show a lot of changes and merges become extremely difficult. An update to Common, for instance, will create a change in about 90 projects in MainService, and that is about 180 changed files (a .csproj and a packages.config in each project). If two devs make changes in the upstream project, when they merge changes in the downstream project the SCM will conflict on each line, especially because there is a version number in the package path. Merging csproj files has become a risky and time-consuming process.
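The root of the conflict is that the package version is baked into every reference hint path, so every version bump touches every csproj. An illustrative fragment (package and assembly names made up):

```xml
<Reference Include="Common.Core">
  <!-- the version number in the path changes in every project on every update -->
  <HintPath>..\packages\Common.Core.1.2.0\lib\net40\Common.Core.dll</HintPath>
</Reference>
```

Two branches that bumped to different versions will conflict on this line in every referencing project.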

Also, because of concurrent development in feature branches and devs deciding when to increment version numbers, two devs could eventually create different builds with the same version number. All in all, I'd say that the merge problem is the worst, and I haven't been able to solve it yet.

Could a proxy folder be a solution? For instance, a version of a package is installed into a folder (without the version number in the path), and projects reference the DLLs in that folder - somewhat of an intermediate stage. Packages could still be placed in the solution's /packages/ folder (with versions distinguished by the version number in the path), but a build step and a single package configuration file could copy the correct version of the dll to the proxy folder. For DLL-only packages this should work (csproj files would no longer be tied to the dependency's version number in the path), but it might be a problem for content files and packages that require install scripts... could package "type" identification / specialization help in some way?
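The proxy-folder idea above could be sketched roughly like this (Python; all paths and the wanted-version map are illustrative, and a real build step would read the wanted versions from a single configuration file):

```python
# Sketch: copy the single wanted version of each package's assemblies out of
# the versioned packages/ tree into one flat "proxy" folder that project
# references point at, so csproj hint paths never contain a version number.
import os
import shutil
import tempfile

def populate_proxy(packages_dir, wanted, proxy_dir):
    """Copy lib files for each {package_id: version} in `wanted` into proxy_dir."""
    os.makedirs(proxy_dir, exist_ok=True)
    copied = []
    for pkg_id, version in wanted.items():
        lib = os.path.join(packages_dir, f"{pkg_id}.{version}", "lib")
        for root, _dirs, files in os.walk(lib):
            for name in files:
                shutil.copy(os.path.join(root, name),
                            os.path.join(proxy_dir, name))
                copied.append(name)
    return sorted(copied)

# Demonstrate against a fake packages tree in a temp directory.
base = tempfile.mkdtemp()
lib = os.path.join(base, "packages", "Common.Core.1.2.0", "lib", "net40")
os.makedirs(lib)
open(os.path.join(lib, "Common.Core.dll"), "w").close()

result = populate_proxy(os.path.join(base, "packages"),
                        {"Common.Core": "1.2.0"},
                        os.path.join(base, "proxy"))
print(result)  # ['Common.Core.dll']
```

As noted, this only covers DLL-only packages; content files and install scripts would need separate handling.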

Unexpected safe-version updates: When a package depends on another, NuGet by default does a safe install of the dependencies and doesn't pull them up to the most recent version. I think this is a good thing in essence, especially for external libs whose API we don't control, but for internal packages, something we want always up to date, it can cause a lot of problems. I made a simple console app to work around this.
  1. All package dependencies are referenced in a Nuspec file, and a version range is used.
  2. If the dependency is in the same solution, when incrementing the version number, all lower version numbers in nuspec files, for the dependency are updated. Highest version is always exclusive and always the next major version.
  3. If the dependency is from an upstream solution, when updating the package in the solution, all lower version numbers in nuspecs are updated, at the end of the update.
So, if the Common.A package is at 1.0.1 and is updated to 1.1.0, then any package dependent on Common.A in the Common solution will declare in its nuspec that it depends on Common.A [1.1.0, 2.0). The same is done downstream, but during upstream package updates.
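The nuspec-rewriting step above can be illustrated with a small sketch (Python; the function name and the single-line nuspec fragment are made up for the example, and the range convention follows the text: inclusive lower bound at the new version, exclusive upper bound at the next major):

```python
# Sketch of the console-app step: raise the lower bound of a dependency's
# version range in a nuspec to the newly published version.
import re

def bump_dependency(nuspec_text, dep_id, new_version, next_major):
    """Rewrite dep_id's version attribute to the range [new_version,next_major)."""
    pattern = r'(<dependency id="%s" version=")[^"]*(")' % re.escape(dep_id)
    replacement = r'\g<1>[%s,%s)\g<2>' % (new_version, next_major)
    return re.sub(pattern, replacement, nuspec_text)

nuspec = '<dependency id="Common.A" version="[1.0.1,2.0)" />'
print(bump_dependency(nuspec, "Common.A", "1.1.0", "2.0"))
# <dependency id="Common.A" version="[1.1.0,2.0)" />
```

A real tool would parse the nuspec as XML rather than use a regex, but the transformation is the same.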

Since the GUI doesn't support this, I created a few console apps and included batch files to run these steps (and unfortunately this is failing on some computers because the NuGet console app is searching for target files that don't exist...). Regular updates via the GUI simply don't support this, and I believe in some cases they can actually downgrade a dependent package if a lower version is still in the packages folder.

Something else that I believe complicates things is that two projects in the same solution can reference different versions of a package. I can imagine this is useful in some cases, but should it be the default behavior?

ReSharper / VS woes: Updating through the GUI with ReSharper on, especially in large projects, is painful. When I started doing the integration I had so many VS crashes it hurt. Even with the command line, the project unload/reload cycle is still tough. Not sure how this can be improved.

Local vs remote repositories and NuGet.config: I like the idea that a dev can pull code from source control and everything just works. It's not always possible (for instance, SSDT still needs to be installed), but the NuGet executable and config can be included in a solution and the SCM. This means both remote and local repos can be defined and loaded by VS without the user having to configure / add keys to his own install of VS. Still, I've noticed that the solution's NuGet.config isn't always used, and the environment ends up using an installed nuget.exe instead of the solution's .nuget/nuget.exe (when available). I've worked around this by having all my batch files use relative paths to the solution's executable. Also, updates from a local repo are just so much faster that I ended up using batch files that explicitly pass the source in the command's arguments. A regular update call searches through every one of the repos defined, instead of prioritizing the repo list in the NuGet.config.
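For reference, a local dev feed can be declared in the solution's NuGet.config alongside the official one. A sketch (the local path and key names are illustrative); as noted above, the source list is searched rather than prioritized, so the batch files still pass -Source explicitly:

```xml
<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <packageSources>
    <!-- local folder feed used for day-to-day dev work -->
    <add key="LocalDev" value="C:\dev\LocalNuGetRepo" />
    <add key="Official" value="https://www.nuget.org/api/v2/" />
  </packageSources>
</configuration>
```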

CI is still using and publishing / getting from the remote "official" repo, but locally, dev work is done through the local (dev machine) repo.

Also, because I don't want to simply update all packages in the solution, especially external libs whose (possibly breaking) changes I don't know about, the GUI just isn't adequate for this (it's either one by one or all of them).

Command-line update doesn't update solution-level packages: I've added some upstream packages as solution-level packages, but the update call in my batch process, though indicating the package name explicitly and working on the solution file, doesn't update the package (I guess because it's not referenced in any specific project).


I like NuGet and what it adds. I might be looking at this kind of like the hammer problem (when all you have is a hammer, everything looks like a nail...). It might just not be made to cover this scenario. It works well in certain solutions (or single-solution scenarios), but it's painful to use with dev concurrency on multiple projects and very large projects. Maybe this requires a different tool on top of NuGet or parallel to NuGet.