Why are packages associated at the solution level instead of the project level?

Topics: General
Aug 24, 2012 at 8:39 PM

I created an issue about this, as it causes difficulty for me and prevents me from truly embracing NuGet. I had hoped that someone would weigh in to explain why the decision was made to manage everything, NuGet-wise, at the solution level instead of the project level.

It seems bizarre to me given that references themselves are project-level concepts in Visual Studio.

Does anyone have any insight on this?

Aug 24, 2012 at 9:13 PM

Prior to making package restore the primary workflow, the expected workflow was for people to check packages into the source tree. Consequently, it seemed practical to have packages managed in a single repository, as opposed to having multiple copies checked into your source tree - particularly when you were more than likely to share a common version across projects in a given solution.
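As a sketch of what that convention looks like in practice: a NuGet.config placed next to the .sln can point every project in the solution at one shared packages folder (the key is real; the path value here is just illustrative):

```xml
<!-- NuGet.config beside the .sln file (path value illustrative) -->
<configuration>
  <config>
    <!-- All projects in the solution install/restore packages into this one folder -->
    <add key="repositoryPath" value="packages" />
  </config>
</configuration>
```

With this layout there is a single copy of each package version on disk, and every project's reference resolves into that shared folder.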

Aug 25, 2012 at 3:59 AM
Edited Aug 25, 2012 at 4:00 AM

Yeah. That's what I thought. That it was a historical thing. Hopefully this can be changed in the future! :)

Thanks for your reply.

Sep 20, 2012 at 11:58 AM
Edited Sep 20, 2012 at 11:59 AM
mcassidy wrote:

Yeah. That's what I thought. That it was a historical thing. Hopefully this can be changed in the future! :)

Thanks for your reply.

It's not a "historical thing", it's the sensible approach. What would you change it to... duplicating the same DLLs into each project? What would be the benefit of that?

Sep 20, 2012 at 3:41 PM

Many of my projects are Mercurial repositories, used as subrepos within different solutions. If I want to reuse one of the projects in a different solution, now I have to manage all my dependencies separately, instead of just including the project and having everything Just Work.

So now that I've answered your question, what would be the benefit of NOT allowing management of dependencies entirely within the scope of a project, having nothing to do with the solution?

Sep 21, 2012 at 9:48 AM

I think you may be misinterpreting this as a problem with NuGet - it's actually the incompatibility between your current workflow and NuGet that's the problem. This is a personal opinion, of course, but it only really makes sense for binary output to come from a single project/solution source, not from an arbitrary number of solutions containing a sub-repo/externals project reference. The problem is that any of those arbitrary solutions will end up building its own version of that binary output.

It may help to review your run-time build closure dependency graph and identify where versioned library packages should be published from / subscribed to.

Sep 21, 2012 at 9:58 AM

I read this from the perspective of using NuGet for managing your internal dependencies, so apologies if my response is out of context.  However, the theory still applies.  Binary output should come from a single build.

Sep 21, 2012 at 4:20 PM

If there's something wrong with my workflow, I'd love to know how to fix it, but I can't figure out how to apply what you said. NuGet (at least as I use it) is generally concerned with binary input, not binary output. It manages the third-party binaries that my code depends on.

For example, all my test projects reference NUnit, but there's no real build hierarchy or order to them. Are you suggesting that one of my test projects (or one central project referenced by all of my test projects) wraps NUnit, and the rest reference that? If I do that, I still have to manage that dependency separately.

Typically, my test project and the project being tested by the test project reside together in a single subrepository. If I want to include that in a solution that doesn't yet have a test project, I want to just include the subrepo, build, and have it all work. I don't think that's an outlandish workflow.

In addition, I can't have these projects (with their test project siblings) nested at different levels of the solution tree without resorting to hint paths, and even then, if the same project is nested at two different levels in two different solutions, there's no way to make it work at all, not even with hint paths.
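To illustrate the hint-path problem described above (package name, version, and paths are placeholders): a project's assembly reference stores a path relative to the .csproj file, which bakes in an assumed nesting depth below the solution's packages folder.

```xml
<!-- Illustrative .csproj fragment: the HintPath is relative to the project
     file, so it assumes the project sits exactly two levels below the
     solution-level packages folder. -->
<Reference Include="nunit.framework">
  <HintPath>..\..\packages\NUnit.2.6.1\lib\nunit.framework.dll</HintPath>
</Reference>
<!-- If another solution includes the same project one level deeper,
     this relative path no longer resolves. -->
```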

If I've misunderstood, please let me know.

Thanks for your input.

*goes away to google "run-time build closure dependency graph"*

Sep 21, 2012 at 10:08 PM

I realised that you were using NuGet to manage 3rd party dependencies (much like the rest of the community) - currently my day job involves managing internally built dependencies with NuGet, hence the glaring assumption.

It looks like you're using subrepositories for a number of reasons: to include internal dependencies in a given solution, and to expose the source code for those dependencies so that changing a dependency is easy. This makes it difficult to detect breaking API changes in other solutions that also consume that dependency. And in the event that your runtime build is composed of several of these solutions, each with references to the same dependency subrepository, which version of that dependency should you choose at runtime? I'm certainly not trying to pick holes in your approach here, but I think it's important to highlight this in order to answer your original question.

Ironically, if you actually used NuGet to manage these internally built dependencies then this would relieve both issues here, i.e. enabling the use of NuGet for 3rd party, and ensuring that you only ever produce binary output from one build source.  To do this, you'd need to implement some way, preferably automated (shameless plug), of publishing these internal dependencies to a private NuGet feed.  This impacts your current workflow in that any change to a dependency can't be made directly in the consuming solution, only the dependency's source.  But it would be mathematically correct, and would allow for the adoption of appropriate gateway processes to test APIs etc prior to publishing a packaged version.
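For what it's worth, a minimal package definition for such an internally built dependency might look like the sketch below (the package id, version, and description are placeholders, not anything from this thread):

```xml
<!-- MyCompany.Core.nuspec: a minimal, illustrative package definition -->
<package>
  <metadata>
    <id>MyCompany.Core</id>
    <version>1.0.0</version>
    <authors>MyCompany</authors>
    <description>Internal shared library, published to a private NuGet feed.</description>
  </metadata>
</package>
```

You would then produce and publish the package with `nuget pack MyCompany.Core.nuspec` followed by `nuget push` against the private feed's URL, and consuming solutions would reference the published version rather than the subrepo source.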

I hope that makes sense; it's how I manage software builds now, and the benefits far outweigh the overhead of setting it all up.

Dec 21, 2012 at 6:59 PM

Thanks chappoo. Your assumptions about why we use subrepos are correct. We currently have some large, messy solutions with poorly defined boundaries between projects, and that's what really makes it difficult to package our dependency projects nicely. There's lots of tweaking, and we're constantly breaking the APIs anyway. And recently we've not been shy about using ReSharper to do big, sweeping refactorings. That's not possible if we work on only one project at a time.

Not long ago I took your suggestion and packaged one of our more mature dependency projects and published it to a local NuGet feed for use in a new project. I was happy with the results. Hopefully, before too long, our refactoring will settle down and we'll be rid of some legacy mushiness. Maybe then we'll be able to transition more and more subrepos to local NuGet packages.

Thanks again for the reply.