Packages, directory structure, source control

Mar 23, 2011 at 3:58 AM

Hi,

Day one of NuGet and I am struggling with a few concepts. Perhaps I have become too accustomed to the way we work at my current company; I guess it’s a paradigm shift moving to a dependency management tool like NuGet.

Currently we have a source structure something like this:

├───Dev
├───Source
│   ├───App1
│   │   ├───Proj1
│   │   └───Proj2
│   ├───App2
│   │   └───proj1
│   └───App3
│       └───Proj1
└───ThirdParty
    ├───log4net
    ├───Moq
    └───NUnit

Pretty much anything that is needed to do the dev work or build on the CI server is checked into subversion. We have a number of applications under the Source directory, each of which could reference one or more third-party components stored in the ThirdParty directory. This has worked well: when we need to update a third-party component, we update the version in the ThirdParty directory and all solutions/projects get the update. With NuGet the directory structure would look something like:

├───Dev
├───Source
│   ├───App1
│   │   ├───packages
│   │   │   ├───log4net
│   │   │   ├───Moq
│   │   │   └───NUnit
│   │   ├───Proj1
│   │   └───Proj2
│   ├───App2
│   │   ├───packages
│   │   │   ├───log4net
│   │   │   ├───Moq
│   │   │   └───NUnit
│   │   └───proj1
│   └───App3
│       ├───packages
│       └───Proj1

As has been discussed (http://blog.davidebbo.com/2011/03/using-nuget-without-committing-packages.html), I do not like the idea of checking the packages folder for each application into svn, as this would lead to duplicated packages and increase the space required to check out the repo from svn.

However, I also don’t like the idea of relying on external websites being up in order to download the packages when doing a build. What happens if the site is down? What happens if a package has been removed for some reason? I still like the idea of being able to get the source from svn and build without worrying about external sites/shares being up. As long as I can get the source from my subversion repo, I should be able to build.

To get to the point, there are a few things I need to get my head around:

1. For our current way of working there is a shift from sharing the same library versions across all applications to one where each application maintains its own library versions using NuGet. This would cause the ThirdParty directory to disappear.

2. Not checking everything required for a build into your source control system, or finding a good way of doing so.

I would like to understand how people are addressing point 2.

In an enterprise environment you could have a local copy of all approved libraries. Would this be a file share, an HTTP location, or something committed to a common source control system (svn externals)? Perhaps you commit your package files into a packages directory within your source control environment and then point to that package location (going round in circles)?
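To make the file share / HTTP feed option concrete, something like this NuGet.config could point everyone at an internal source (the share path and feed URL below are purely hypothetical, just for illustration):

```xml
<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <packageSources>
    <!-- hypothetical internal locations; substitute your own share or feed -->
    <add key="InternalShare" value="\\buildserver\nuget-packages" />
    <add key="InternalFeed" value="http://nuget.example.internal/feed/" />
  </packageSources>
</configuration>
```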

Something just doesn’t feel right, so I am interested in hearing how people work to resolve these concerns, if they share them.

Appreciate any responses.

Regards


Coordinator
Mar 23, 2011 at 4:05 AM

Your diagram wasn’t clear. Is each App its own VS Solution? Perhaps including the .sln files in the diagram would make that clear.

Mar 23, 2011 at 4:13 AM

Yes. Within each App directory there is one .sln file and a number of subfolders representing the projects, each containing a .csproj file.

App1
├───App1.sln
├───Proj1
│   └───Proj1.csproj
└───Proj2
    └───Proj2.csproj

Hope that makes sense.

Mar 23, 2011 at 4:31 AM
Have you considered one repository == one solution?
____
Rob
"Be passionate in all you do"

http://devlicio.us/blogs/rob_reynolds
http://ferventcoder.com
http://twitter.com/ferventcoder
Mar 23, 2011 at 4:40 AM

At previous places I worked the applications were larger and it was 1 product = 1 repo. The current place I work has a history of a number of smaller solutions making up a repo (it was like this when I got here). Let’s just say there are a number of "tactical" solutions with little code reuse, something we are trying to change.

Mar 23, 2011 at 4:47 AM
At my current place we have set up a company NuGet feed, and we are going the route of restructuring for reuse (NuGet packages); we have found ourselves moving some things into their own repositories.

Not completely 1 repo == 1 solution, but close. :D
____
Rob
Mar 23, 2011 at 4:55 AM

Thanks ferventcoder. So all your solutions get the NuGet packages from your internal server? How do you get the packages from the public servers onto your private servers? Do you download them in VS or PowerShell and then copy them to your private server? Is this just a file share or an HTTP server, or is it some source control mechanism? If you don’t check any packages into source control, do you use the NuGet command line to get the packages onto your build server?

When you say "..restructuring for reuse (nuget packages).." do you mean that you are trying to pull out common components into separate repos, so that products use these common components in other solutions via a NuGet package?

thanks again.

Mar 23, 2011 at 5:58 AM

"So all your solutions get the nuget packages from your internal server?"

Solutions can get them from either. We are still trying to figure out whether we are going to maintain a version we want the rest of the developers to use and limit them to just our company feed, or not.

"How do you get the packages from the public servers onto your private servers? Do you download them in VS or PS and then copy to your private server?"

There are at least two ways I know of to download a package: one can pull it down using the Package Explorer, or using nuget.exe itself.

"Is this just a file share or http server is it some source control mechanism?"

We use the old MVC2 HTTP server app, which may still be in the NuGet source, for our feed.

"If you dont checkin any packages into source control do you use nuget command line to get the packages on your build server?"

Sorry for any confusion here. We check our packages in. I'm a build automation guy: I want the same inputs going into the build on every machine, so that there is an exact science to the build instead of a recipe with room for error. In the world of package management there are no guarantees that each package will be exactly the same version on each machine without checking the packages in. I'm firmly in the camp that believes your build should be hermetic (http://blog.bits-in-motion.com/2010/02/setting-up-distributed-build-part-1.html).

"When you say "..restructuring for reuse (nuget packages).." do you mean that you are trying to pull out Common components into seperate repos and then products use these Common components in other solutions via an nuget package?"

Yes, the common components are becoming nuget packages.

Mar 23, 2011 at 6:51 AM

Thanks Rob.

So just to be sure, you:

  1. Download the nupkg file to a local server (the nupkg file is not checked into source control)
  2. Solutions install the package from the internal server (assuming you go with the internal-only option)
  3. When the solution installs the package, the packages folder is created under the solution. This packages folder is checked into your source control system (including the nupkg file?)
  4. If your solution relies on internal common components, these are distributed as nupkg files on the local server, which also results in the files being extracted into the packages dir and checked into source control
  5. Since you do not have 1 solution = 1 repo, you may have multiple copies of dependencies checked into source control

I’m also interested in understanding how teams work if they do not check in their packages (the packages directory under the solution).

Thanks for the hermetic link. I am also in that camp, which is why I am asking all these questions. :) I really don’t like the idea of not having all dependencies in source control, whether that is the nupkg file or the extracted binaries. The thought of relying on some external server to be up for a clean build makes me cringe. We check in pretty much everything (NUnit runner, NCover runner, MSBuild dependencies, etc.). Heck, we even have CruiseControl checked in, although we are moving to Jenkins.

Regards,

Dom


Coordinator
Mar 23, 2011 at 5:16 PM

By the way, it may take some time, but we're incrementally adding features to make the sort of workflow where packages aren't checked into source control better and better over time. For example, see David Ebbo's blog post on Using NuGet Without Committing Packages.

Mar 24, 2011 at 5:25 AM

Just a terminology clarification: when you refer to packages, are you referring to the nupkg file or the extracted files under the packages directory? I still like the idea of either the nupkg file or the packages directory being checked into SC; I’m just so used to doing that in our current way of working. The thought of relying on some other server (not your source control system) being up to build your code doesn’t sit well with me; perhaps I just need time to get to this way of thinking.

Talking with one of the Java guys here, he was saying that with a tool like Maven you never check in the third-party jars. Our company has an internal Maven repository which devs point to; you can’t use resources without them being in the local Maven repository, and Maven handles the building of their projects. I guess this is one of the differences: NuGet is not linked into, and does not sit on top of, MSBuild, so fetching the dependencies has to happen outside of the build, and hence the packages need to be checked into your SC system (or, as in the article you linked to, you manually modify your csproj file to hook into the pre-build target). I will keep an eye on the work being done in this area with NuGet.
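From my reading, the pre-build hook in that article boils down to something like this in the .csproj (the Tools\nuget.exe path is just an assumption about where the exe is kept in the repo):

```xml
<!-- restore packages before build; assumes nuget.exe is checked in under a Tools folder -->
<Target Name="BeforeBuild">
  <Exec Command="&quot;$(SolutionDir)Tools\nuget.exe&quot; install &quot;$(ProjectDir)packages.config&quot; -o &quot;$(SolutionDir)packages&quot;" />
</Target>
```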

PS. Rob, which timezone are you in? Seems like you’re in the same one as me, based on the timing of your replies (Australia).

Regards,


Coordinator
Mar 24, 2011 at 5:42 AM

“Packages” refers to the nupkg file. Some people over here have taken to calling them “nup-keg” files. I’m still a holdout, calling them “Nu-Pee-Kay-jee” files. :)

As for NuGet, our goal is to support both approaches.

As an OSS developer myself, I’ve always taken the approach of checking in my lib folder because I assumed that my other contributors in far flung places in the world wouldn’t have access to my internal Maven repository. I wanted it to be dirt simple to get latest and start building. That’s been the philosophy behind NuGet from the start.

But now we’re maturing (3 months old already!!!) and also want to address some of the more enterprisey scenarios, such as not storing packages in source control. It will take some time, but we’ll carefully iterate and get there. :)

Mar 24, 2011 at 2:03 PM

Central Standard Time (US).

____
Rob