I know this post is a bit old, but I recently tried to use NuGet to manage inter-solution dependencies, and much of the pain I've been feeling matches the points @bronumski mentioned (and then some), so I thought it would be worth reviving this post. A little context may help:
I'm currently working on a middleware project that is divided into 3 solutions:
Common ---> Module A ---> Main service
   \_______________________/
- Common is a 10-project solution with generic libraries that are shared between the different module services and worked on, infrequently, by multiple teams. Essentially, they are frequently used reusable utility classes and DTOs.
- Module A is a data/repository service that can be shared by multiple system components. They aren't currently web services, but may be in the future. They are integrated into the main service as a plugin. The solution has about 60 projects, with both proxy classes and multiple implementations. About 6 packages are generated from this solution.
- Main service is a large solution: 90+ C# projects, with the main service itself and many plugin-like libraries.
- Module A consumes Common; MainService consumes packages from both Common and Module A.
Previously, we had a simple DLL-copy build step to manage dependencies, but we started having deployment problems because of DLL versioning: Module A could for some reason be built against a build of Common that was compatible, but different from the one MainService used, and blow up at runtime. We also had a complicated CI pipeline because of this (and actually had some cyclic dependency references in the build cycle).
We thought of using NuGet to manage dependencies after other teams in the company had success with it (but in single-solution scenarios). Unfortunately, some of the conventions associated with NuGet were problematic and workarounds had to be found. I believe I worked out a few, but we are still taking hits in team performance, similar to what @bronumski mentioned.
Issues that we are having
Creates concurrency problems
We work with feature branches, and while implementing a NuGet strategy to manage dependencies I tried to bring SemVer into the process to manage version numbers. Our current convention is that all projects in a solution share the same version number, and the package version equals the DLL version. Any change requires a version increment (major, minor, patch or revision depending on the type of change; revision is only used for regenerating the same output). We have a shared AssemblyVersion.cs file linked into each project, and an internally developed tool to increment the version number in that file (originally the tool incremented each AssemblyInfo in the solution, but we later changed to a single file to reduce churn in the repository).
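For anyone unfamiliar with the shared-file approach, the setup is roughly this (version numbers are illustrative): one AssemblyVersion.cs at the repo root, added to every project via "Add As Link", so one edit bumps every assembly at once.

```csharp
// AssemblyVersion.cs - single shared file, linked into every .csproj
// (Add Existing Item -> Add As Link in Visual Studio).
// The increment tool rewrites only these two lines.
using System.Reflection;

[assembly: AssemblyVersion("1.1.0.0")]
[assembly: AssemblyFileVersion("1.1.0.0")]
```

The per-project AssemblyInfo.cs files then keep only the non-version attributes (title, description, etc.).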
This creates LOTS of problems: merges become something of a hellish scenario. Though a version change in the upmost project isn't so problematic, downstream solution updates show a lot of changes and merges become extremely difficult. An update to Common, for instance, will touch about 90 projects in MainService, which is about 180 changed files (a .csproj and a packages.config in each project). If two devs make changes in the upstream project, then when they merge changes in the downstream project the SCM will conflict on each line, especially because there is a version number in the package path. Merging .csproj files has become a risky and time-consuming process.
Also, because of concurrent development in feature branches, with devs deciding when to increment version numbers, two devs can eventually create different packages for the same version number. All in all, I'd say the merge problem is the worst, and I haven't been able to solve it yet.
Could a proxy folder be a solution? For instance: a version of a package is installed into a folder (without the version number in the path), and projects reference the DLLs in that folder - somewhat of an intermediate stage. Packages could still be placed in the solution's /packages/ folder (with versions distinguished by the version number in the path), but a build step, driven by a single package-configuration file, could copy the correct version of the DLLs to the proxy folder. For DLL-only packages this should work (.csproj files would no longer be tied to the dependency's version number in the path), but it might be a problem for content files and packages that require install scripts... could package "type" identification / specialization help in some way?
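The proxy-folder step I have in mind is essentially a pinned copy, something like this sketch (package name, version and paths are illustrative; the first two commands just simulate an already-installed package):

```shell
#!/bin/sh
set -e

# Simulate a package installed in the usual versioned location.
mkdir -p packages/Common.Core.1.1.0/lib/net45
touch packages/Common.Core.1.1.0/lib/net45/Common.Core.dll

# Build step: copy the pinned version into a version-less proxy folder,
# so .csproj HintPaths never contain a version number.
PKG=Common.Core
VER=1.1.0
mkdir -p "lib-proxy/$PKG"
cp "packages/$PKG.$VER/lib/net45/"*.dll "lib-proxy/$PKG/"
```

Projects would then reference `lib-proxy/Common.Core/Common.Core.dll`, and a version bump changes only the single file that records `VER`, not every .csproj.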
Unexpected safe-version updates
When a package depends on another, NuGet by default does a safe install of the dependencies and doesn't pull them up to the most recent version. I think this is a good thing in essence, especially for external libs whose API we don't control, but for internal packages that we want always up to date it can cause a lot of problems. I made a simple console app to work around this:
- All package dependencies are referenced in a Nuspec file, and a version range is used.
- If the dependency is in the same solution: when incrementing the version number, all lower bounds for that dependency in the nuspec files are updated. The upper bound is always exclusive and always the next major version.
- If the dependency is from an upstream solution: when updating the package in the solution, all lower bounds in the nuspecs are updated at the end of the update.
So, if the Common.A package is at 1.0.1 and is updated to 1.1.0, then any package dependent on Common.A in the Common solution will declare in its nuspec that it depends on Common.A [1.1.0, 2.0.0). The same is done downstream, but during upstream package updates.
Since the GUI doesn't support this, I created a few console apps and included batch files to run these steps (and, unfortunately, this is failing on some computers because the NuGet console app is searching for target files that don't exist...). Regular updates via the GUI simply don't support this, and I believe in some cases they can actually downgrade a dependent package if a lower version is still in the packages folder.
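As a concrete sketch, a dependent package's nuspec after such a bump would look something like this (the package id, version and metadata are illustrative, not our real packages):

```xml
<?xml version="1.0"?>
<package xmlns="http://schemas.microsoft.com/packaging/2011/08/nuspec.xsd">
  <metadata>
    <id>ModuleA.Repositories</id>
    <version>3.2.0</version>
    <authors>team</authors>
    <description>Example dependent package.</description>
    <dependencies>
      <!-- lower bound bumped to the new Common.A version; upper bound
           stays exclusive at the next major version -->
      <dependency id="Common.A" version="[1.1.0, 2.0.0)" />
    </dependencies>
  </metadata>
</package>
```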
Something else that I believe complicates things is that two projects in the same solution can reference different versions of the same package. I can imagine this is useful in some cases, but should it be the default behavior?
ReSharper / VS woes
Updating through the GUI with ReSharper on, especially in large projects, is painful. When I started doing the integration I had so many VS crashes it hurt. Even with the command line, the project unload/reload cycle is still tough. Not sure how this can be improved.
Local vs remote repositories and NuGet.Config
I like the idea that a dev can pull code from source control and everything just works. That's not always possible (for instance, SSDT still needs to be installed), but the NuGet executable and config can be included in a solution and committed to the SCM. This means both remote and local repos can be defined and loaded by VS without the user having to configure / add keys in his own install of VS. Still, I've noticed that the solution's NuGet.Config isn't always used, and the environment ends up using an installed nuget.exe instead of the solution's .nuget/nuget.exe (when available). I've worked around this by having all my batch files use relative paths to the solution's executable. Also, updates from a local repo are just so much faster that I ended up using batch files that explicitly pass the source in the command's arguments; a regular update call searches through every one of the defined repos instead of prioritizing the repo list in NuGet.Config.
CI still publishes to and pulls from the remote "official" repo, but locally, dev work is done through the local (dev-machine) repo.
Also, because I don't want to simply update all packages in the solution, especially external libs whose (possibly breaking) changes I don't know about, the GUI just isn't adequate for this (it's either one by one or all of them).
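By "both repos defined" I mean a solution-level NuGet.Config along these lines (the key names, local path and feed URL are placeholders, not our real ones):

```xml
<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <packageSources>
    <!-- local dev-machine repo, listed first so devs hit it by default -->
    <add key="Local" value="C:\LocalNuGetRepo" />
    <!-- company "official" feed, the one CI publishes to and pulls from -->
    <add key="Official" value="http://nugetserver.example/nuget" />
  </packageSources>
</configuration>
```

The batch files then force the local feed explicitly with something like `.nuget\nuget.exe update MySolution.sln -Source C:\LocalNuGetRepo`, instead of letting the update walk every configured source.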
Command-line update doesn't update solution-level packages
I've added some upstream packages as solution-level packages, but the update call in my batch process, though it names the package explicitly and works on the solution file, doesn't update the package (I guess because it isn't referenced in any specific project).
I like NuGet and what it adds. I might be looking at this kind of like the hammer problem (when all you have is a hammer, everything looks like a nail...). It might just not be made to cover this scenario. It works well in certain setups (single-solution scenarios, for instance), but it's painful with dev concurrency across multiple solutions, and in very large projects. Maybe this requires a different tool on top of NuGet, or parallel to it.