TFS or msbuild integration?

Nov 3, 2010 at 9:06 PM

Hi all,

Conceptual question.  Is it NuGet's intention to integrate with msbuild and/or TFS build definitions so that I can just get a project from source control, having no dependencies on that machine, and build it successfully?  That is, do you expect msbuild (or whatever your build tool is) to be able to see via a NuGet plugin or something that there are dependencies to analyze and download via NuGet before actually kicking off compilation?  If so, great.  If not, I'd be moving in that direction.

Until then, do I need a prebuild step to invoke NuGet's dependency analysis and download so that the rest of my build can assume I've got the required dependencies downloaded and placed in a well-known location (like Maven's local repository)?
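For what it's worth, a prebuild step like that could be sketched as an MSBuild target that runs before compilation. This is a hypothetical sketch, not anything NuGet ships today: the target name, the assumption that NuGet.exe is on the path, and the `..\packages` output folder are all my own inventions.

```xml
<!-- Hypothetical prebuild target in a .csproj: pull down dependencies
     before compiling. Assumes NuGet.exe is on the PATH and that the
     project keeps a packages.config listing its dependencies. -->
<Target Name="RestorePackages" BeforeTargets="Build">
  <Exec Command="NuGet.exe install packages.config -OutputDirectory ..\packages"
        Condition="Exists('packages.config')" />
</Target>
```

The rest of the build could then assume the assemblies sit in a well-known location, much like Maven's local repository.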

Sorry if I didn't understand this from the docs.  I tried, but all that I can glean from them right now is that you can use NuGet to more easily add all the necessary references to your project (and/or solution?) via either the GUI or console.

Thanks,

Matthew

Coordinator
Nov 3, 2010 at 9:42 PM

That's not the current direction of NuGet, but the question comes up a lot and it's something we may consider in a later version.

Our overview page (http://nuget.codeplex.com/documentation) outlines our current direction with NuGet. For the most part, it automates the steps that developers take today to get a library into their source tree. So we see it as a development-time task, not a build-time task.

One benefit of this approach is that after installing a package, a developer can commit their changes to their source repository, and another developer can get latest and be in exactly the same state. It also makes xcopy deployment easy.

Again, the request to support the model you described does come up from time to time, but it's not something we would tackle until after v1, if at all.

 

Developer
Nov 3, 2010 at 9:51 PM

A discussion along similar lines: http://nuget.codeplex.com/Thread/View.aspx?ThreadId=231541

At this point, NuGet doesn't have a way to hook into the build system. However, as part of the build system at my work, we use one custom task to create packages and another to call into the NuGet API and install the created packages.

 

Nov 4, 2010 at 2:55 PM

I read the other post, and can see the dilemma.  If you want to use NuGet as part of your build system to resolve and download dependencies, the expression of your dependencies must be in source control along with your code.

NB:  I'm coming from Java-land, where Maven* is used to handle project builds & transitive dependency resolution.

In Maven, one of the files that you keep in source control, along with your project, is the project's "POM" file (POM stands for "Project Object Model"), pom.xml.  Don't worry about the high falutin' sound of "project object model" -- what matters and pertains to NuGet is that it's where you express your dependencies**.  In today's Visual Studio-based world, a project's dependencies are expressed as references in the .csproj file.  It appears to me that the resolution of these references can use <HintPath> to find the reference's physical file location when the builder (Visual Studio, msbuild, etc.) can't resolve the reference itself.  If NuGet could be added as a "reference resolver" (my own term), then it could use a local, company, or remote repository to provide the physical assembly file.
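To make the hint-path point concrete, this is roughly what a file reference looks like in a .csproj today (the assembly name and path are purely illustrative):

```xml
<!-- A file reference in a .csproj: the builder tries its normal
     resolution rules first, then falls back to the HintPath to locate
     the physical assembly on disk. -->
<ItemGroup>
  <Reference Include="Ninject, Version=2.0.0.0, Culture=neutral">
    <HintPath>..\packages\Ninject.2.0.0.0\lib\Ninject.dll</HintPath>
  </Reference>
</ItemGroup>
```

A NuGet-based "reference resolver" would, in effect, supply the file that the hint path points at.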

One thing that's important to note about using NuGet to resolve dependencies during your build: in Maven, although version ranges are supported, they are almost never used.  To ensure predictable and repeatable builds, you should use exact versions instead of ranges.
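In Maven terms, the difference looks like this (the coordinates are illustrative):

```xml
<!-- Exact version: predictable, repeatable builds. -->
<dependency>
  <groupId>org.springframework</groupId>
  <artifactId>spring-core</artifactId>
  <version>2.5.6</version>
</dependency>

<!-- Version range: legal in Maven, but rarely used, because the
     resolved version can change from one build to the next. -->
<dependency>
  <groupId>org.springframework</groupId>
  <artifactId>spring-core</artifactId>
  <version>[2.5,3.0)</version>
</dependency>
```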

Additionally, the NuGet repository should disallow any published package from ever being changed.  Once made public, a package can never change.  That's exactly how the Maven repository works.  Some publishers, notably SpringSource and JBoss, maintain their own Maven repositories, some of which are prerelease in nature.  They still don't change a package once it's published to a prerelease repo, but they reserve the right to remove it at any time.  That way, you are discouraged from using those packages in the long term, and encouraged to move to the standard repo where the GA versions are published.

-matthew

* : Ant+Ivy is another common combination.  See the next footnote for more info.

** : Maven's "project object model" concept embodies several things, including not only a project's dependencies, but also a project's type (jar, war, etc), the location of source & test artifacts within a project, the build plugins used by the project, and more.  It employs a convention-over-configuration methodology, so that the file doesn't need to be as large as it would otherwise be.  For those familiar with NAnt, which is similar to Java's Ant, I can draw an analogy that might help you place Maven in the Java build ecosystem.  Firstly, Ant is a scripting tool that just happens to be used often for builds.  Secondly, if you use Ant for building, you still have to write (and rewrite, and rewrite, and so on) exactly how you build your project.  Over time, patterns emerged based on how people built their Java projects, including source layout, project types, and dependency management.  After a while, best practices were identified and generally accepted.  Maven is a tool that embodies those best practices in a convention-over-configuration way.  Currently, NuGet is tackling only dependency management, much like another Java project called Ivy.  Ivy competes only with Maven's dependency-management features, such that Java developers often choose between either Maven or Ant+Ivy.  Note that Maven provides an Ant plugin, so with Maven you can have the best of both worlds: best-practice encouragement and the ability to step way outside of the box if you need to.  On the flip side, Ivy can understand Maven repositories, too.  Love?  Sometimes.  :)

 

Coordinator
Nov 4, 2010 at 4:12 PM

We do keep a Packages.xml file within each project that lists the packages installed. So in theory, it’s possible to have a build provider pull in the packages at build time. We just haven’t implemented any such thing yet. However, this is an OSS project so we could be open to having that be a layer on top of NuGet. Perhaps we need a NuGet Contrib project for features like that. :)
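For anyone following along, that per-project file looks something like this (the package names and versions here are just illustrative):

```xml
<?xml version="1.0" encoding="utf-8"?>
<!-- Lists the packages installed into this project. A build-time
     provider could read this file and pull the packages down before
     compilation starts. -->
<packages>
  <package id="elmah" version="1.1" />
  <package id="Ninject" version="2.0.0.0" />
</packages>
```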

Nov 4, 2010 at 10:43 PM
Haacked wrote:

We do keep a Packages.xml file within each project that lists the packages installed. So in theory, it’s possible to have a build provider pull in the packages at build time. We just haven’t implemented any such thing yet. However, this is an OSS project so we could be open to having that be a layer on top of NuGet. Perhaps we need a NuGet Contrib project for features like that. :)

Hmmm.  I'm not sure what you mean by "build provider."  If that refers to Visual Studio, msbuild, devenv, nmake, NAnt, etc., then I think I know what it is.  The trick is how to get those build providers to notify you when it's time to resolve project references, especially since I don't think they define a uniform build lifecycle (like Maven's).  It's interesting that as I consider these concepts in .NET, I realize that Maven's build lifecycle begins after dependency resolution, which is implicit in its list of build lifecycle stages.  The full Maven build lifecycle includes dependency resolution before validate, initialize, etc. (for the build lifecycle) and before pre-clean, clean, etc. (for the clean lifecycle).  See http://maven.apache.org/guides/introduction/introduction-to-dependency-mechanism.html.

Also, I'm not sure that a layer on top of NuGet is what's needed, unless that layer is something (like from NuGet.Contrib) that you use to launch your build, which first does reference resolution & download, then delegates to the designated build provider (msbuild, VS, etc.).

There is another concept that I think you need to take into consideration.  Maven calls it "scope" (see http://maven.apache.org/guides/introduction/introduction-to-dependency-mechanism.html#Dependency_Scope).  For example, Maven's default scope, "compile", would mean in NuGet that the assembly needs to be present at both compile time and run time.  Another scope, "provided", indicates that the assembly is provided by the environment, like mscorlib or similar, instead of being located in a repository.  Other scopes include "runtime" (only needs to be present at run time, not compile time) and "test" (only needed during test compilation & execution).  See the aforementioned link for their descriptions.
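For readers unfamiliar with Maven, a scoped dependency looks like this (coordinates illustrative):

```xml
<!-- "test" scope: the dependency is on the classpath only while
     compiling and running tests; it is never packaged or deployed
     with the application. -->
<dependency>
  <groupId>junit</groupId>
  <artifactId>junit</artifactId>
  <version>4.8.1</version>
  <scope>test</scope>
</dependency>
```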

While I'm on a roll: does packages.xml support the notion of a "parent"?  That is, if I have a solution with multiple projects, can I define common or shared dependencies at the solution level, and have each project's packages.xml inherit from and possibly override the solution's packages.xml?  Maven supports this (the concept is called a "parent POM"), and it is used heavily for efficient dependency management in larger multi-module projects (the Maven term "module" corresponds to VS's "project", and "multi-module project" corresponds to VS's "solution").  Solution-level packages.xml files would allow for much smaller project packages.xml files and help keep things DRY.  Thoughts?

-matthew

Nov 4, 2010 at 11:43 PM

The model we have is that there can be any number of packages at the solution level, and then the packages.config for each project defines which of those packages are in use by that particular project.  So packages.config is a very small file.  I suppose we could have a concept of a solution-level packages.config that lists packages in use by all projects, but I don't think it would give enough benefit to be worth the extra logic.

Nov 5, 2010 at 9:20 PM
davidebbo wrote:

I suppose we could have a concept of a solution level packages.config that lists packages that are in use by all projects, but I don't think it would give enough benefits to be worth the extra logic IMO.

Hi David,

Good point.  I might not have explained myself clearly enough in my last post.  What I was trying to describe was the way Maven currently does dependency resolution for multi-module projects that use a parent POM.  The effective POM, that is, the one actually used for any given project, is the union of the project's local POM, its parent POM, its parent's parent POM, etc., and finally Maven's default POM (called the super POM), where children override parents.  This is really worth a quick read, since I would suck at trying to regurgitate it:  http://maven.apache.org/guides/introduction/introduction-to-the-pom.html
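A minimal sketch of the mechanism (names and versions illustrative): the parent declares shared defaults in <dependencyManagement>, and a child opts in by naming the dependency without a version, inheriting the parent's choice unless it specifies its own.

```xml
<!-- Parent pom.xml: establishes default versions, but forces nothing
     on the children. -->
<project>
  <groupId>com.example</groupId>
  <artifactId>parent</artifactId>
  <version>1.0</version>
  <packaging>pom</packaging>
  <dependencyManagement>
    <dependencies>
      <dependency>
        <groupId>org.springframework</groupId>
        <artifactId>spring-core</artifactId>
        <version>2.5.6</version>
      </dependency>
    </dependencies>
  </dependencyManagement>
</project>

<!-- Child pom.xml: opts in to spring-core; the version is inherited
     from the parent, and the child could override it by declaring one. -->
<project>
  <parent>
    <groupId>com.example</groupId>
    <artifactId>parent</artifactId>
    <version>1.0</version>
  </parent>
  <artifactId>child</artifactId>
  <dependencies>
    <dependency>
      <groupId>org.springframework</groupId>
      <artifactId>spring-core</artifactId>
    </dependency>
  </dependencies>
</project>
```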

What I heard in your quote above is that a solution-level packages.config would say that whatever dependencies are listed at the solution level are **necessarily** used by all child projects.  That's not what I was trying to say.  The solution-level packages.config would simply establish default dependencies and their versions for any project in the solution.  Each project can then choose whether to use the dependency at all, and whether to use the version specified in the solution or a different one.

Sorry if I'm not being clear.  I'm just very familiar with Maven, and I'm probably taking some bit of knowledge for granted that isn't so obvious on the .NET side.  Let me know if I need to explain further.

-matthew

Dec 29, 2010 at 9:19 AM

Sorry in advance for being a little bit bold.

Every developer on the NuGet team who is not yet familiar with Maven should take a Maven course.

Maven is widely adopted in the Java space, and NuGet addresses the same issues. Just as looking at (N)Hibernate before designing EF would have been a very good idea, looking at and understanding Maven is crucial for the long-term success of NuGet.

NPanday brings Maven to .NET, and "we" plan to support the NuGet protocol in future releases.

Dec 29, 2010 at 2:05 PM

@larscorneliussen,

Thanks.  I'll definitely have a look at NPanday.  Is there a .NET equivalent of Ivy, which is just the dependency-management part of Maven?  Or is that NPanday also?

-matthew

Dec 30, 2010 at 9:49 AM

What is the use case for removing parent dependencies? We're working on the equivalent of Maven's scope in OpenWrap, but we only do it per solution, and it's only add or update; i.e., the child descriptor can only override or add to the parent descriptor. It seems to me that introducing exclusion semantics would introduce a lot of complexity.

What we intend to do, however, is provide for optional packages: a parent package descriptor (or a referenced package) can specify the version ranges it supports, which would limit resolution to those packages, provided another referenced package or descriptor makes that dependency declaration non-optional. Wouldn't that cover the scenario that removing parent dependencies covers?