Check-in the entire 'packages' folder? Alternatives?

Nov 30, 2010 at 8:48 PM

So I just started using NuGet and TFS recently, and I like the idea and direction NuGet is going a lot.  Obviously there is a ways to go to get to where Rails 3's Gemfile or Node's npm is, but it is still very nice to see.  I'm less than impressed with TFS, but that isn't an appropriate discussion topic for here.


Anyways, the first thing I noticed was that it is a bit awkward to check in the library references.  NuGet pulls down a lot of stuff and creates a lot of folders.  I was able to add this with 'tf add packages /recursive', but it really is a lot of stuff that isn't needed.  It would obviously make more sense to just check in a single Gemfile-type file and have the person doing the checkout update the dependencies with NuGet before building.  Is this currently possible?  (I did check out the other thread on TFS integration, but that seemed more build-automation focused; I'm just talking about source control here.)


I am just curious: is checking in the packages folder the intended way of working with NuGet, or is there something else I should be doing?  This really isn't all that big of a problem with intelligent source control systems, but for those working with TFS, where file/folder-structure-based check-in is a bit awkward, I can see this being a bump in the road.

Nov 30, 2010 at 9:15 PM

One of our goals is that NuGet sets up everything such that NuGet is not needed once a package has been installed; i.e., if you check in something that uses it and another dev checks it out, there is no NuGet requirement on their box.  Hence, it's important that we expand at least the lib folder for each package so that the VS project can take a reference to it.  That being said, I think we do expand too many files in some cases, and there are some possible optimizations there.

But generally, you're correct that the intent is for the whole packages folder to go into source control, just like if you had a Lib folder with various references you would check them in.

 

Nov 30, 2010 at 11:02 PM

Good to know. I think that is a fine goal initially, but I would think ideally packages.config could be (and should be capable of being) used to pull down dependencies as needed.  I imagine, for example, deploying to Azure similarly to the way Heroku and Joyent do, fetching the dependencies based on a Gemfile or npm configuration.  Just saying that would be pretty neat, but I know static != dynamic of course.

Anyways, I'll rephrase my questions a bit here, since I don't think either was really answered completely:


1. Can a developer with nuget installed check out without the dependencies (but with the packages.config files) and update the dependencies on their end?

2. If you were working with TFS specifically, how would you recommend adding the dependencies and how would you recommend others check them out?  Is the command line approach "tf add packages /recursive" the best option at the moment?  It has the secondary problem of not checking out the dependencies when another dev opts to check out by selecting the .SLN file.  Is the solution here to just check out using the source explorer? Note: I will totally accept "don't use TFS" as an answer to this ;-).

Personally, I found this was a bit of a PITA with TFS, since TFS doesn't naturally add anything it doesn't see as part of the .SLN.  There seems to be a disconnect between what you want NuGet to be doing and the way TFS expects things to be organized.  Unfortunately, a lot of this would be solved if the whole .SLN approach were optional/abandoned and file-structure-based solutions were possible in both VS and TFS... but I digress.

Nov 30, 2010 at 11:13 PM

1: Not today, but this has come up in discussions.  I just opened http://nuget.codeplex.com/workitem/415 to track it in case we didn't have a bug already.  Feel free to add comments there.

2: We started out with using solution folders so that everything would be managed within the solution, with the goal of making TFS work better.  Unfortunately, it was working poorly in many ways (some not-fixable), so we moved away from that.  So yes, a manual add recursive is probably the best thing to do.  As for checking out, I didn't realize that TFS supported checking out everything based on a solution.  Checking out the whole directory would be the way to make this work correctly with the packages folder.

Nov 30, 2010 at 11:19 PM

Thanks on both counts, and for being so active in these forums.  I'll definitely be keeping a close eye on NuGet.  One final question (I apologize in advance if this has been asked elsewhere): is NuGet a community project, in that it will potentially be open to other contributors, or is it more of a Microsoft project?  I still don't totally understand how the Outercurve Foundation projects work.  And if it isn't completely open, is it open to extension? Will people be able to leverage what you build here and extend it?


Thanks again.

Nov 30, 2010 at 11:38 PM

It is definitely a community project, and we already have received a number of important contributions.  e.g. @Kiliman wrote most of the Package Sources management dialog.

Contributions are welcome! :)

Coordinator
Apr 27, 2011 at 5:07 PM

Hi All, I've written up a spec for how this feature will work. Note that it's not the final "end-all/be-all" approach, but just the first iteration of addressing this scenario. We'll probably continue to refine the features around this scenario as we go forward.

However, I do think and hope this first cut will address much of the pain people are talking about. Please review the spec here: http://nuget.codeplex.com/wikipage?title=Enabling%20Using%20NuGet%20Without%20Checking%20In%20Packages%20Folder

Apr 27, 2011 at 6:56 PM

The poorest assumption that has been made so far is that everyone does "Single Solution" development. In my current work we have over 20 solutions composing different parts of the system we want to work with/debug. We currently have one common libs root folder that is addressable by environment variable (MSBuild friendly). Until NuGet supports this kind of deployment scenario by allowing the specification of a common packages folder, either in the nuget.config (solution level) and/or packages.config (project level), it will just be another toy that is great for one-off apps and demos.

Apr 27, 2011 at 6:58 PM

Ok, I think this is great, and it means that NuGet would be useful for larger system solutions.  Something more like the Apache Maven solution, allowing multiple groups to consume binary bits for integration.   As an enterprise we have a set of common core components which are reusable across a wide swath. They deal with a lot of cross-cutting concerns: logging, security, etc. Then we also have a pluggable framework we're building around (think Orchard or DotNetNuke, etc.). The framework will follow a different release cycle, and we're coordinating by saying you can't consume a new feature until it's gone to prod.

So we want various systems to consume these shared pieces as binaries.  They may be consuming a daily build, or they may be using a more stable QA-validated build, or it may be just what's out in Prod.

Some use cases I can think of:

I need to be able to specify which package repository to use, as we'd probably have our own package repository for build artifacts.  Since we have our own, maybe it's just a single one for all.

I need to be able to pick the version.  During development I may want to always grab the latest version of things we've built, but specific versions of third-party libraries like, say, NLog or NHibernate.  When I move into a release-branch scenario, I then want to narrow that down to specific versions of things we've built.

In areas where we are building pluggable modules for an app, I need to have a build of just the unit tests which grabs binaries for the dependent components, just to execute the existing modules against new dependencies to ensure they still function.

Like I said before, I looked at Apache Maven and it seemed to work the way I think we would expect it, but Maven is an entire build solution and it would have been difficult for us to integrate into our TFS builds.

If this is helpful let me know, and I'm more than willing to help in any way I can. 

Apr 27, 2011 at 6:58 PM

I'm not entirely clear, but it appears from the current spec that this feature will be enabled by default.  I'm not sure that's a good idea.

For those of us that prefer the alternative approach (check in all packages into source control), you might end up with a solution that compiles on one machine, but not another:

  1. Developer pulls down package from NuGet gallery, but for whatever reason does not put packages into source.
  2. Another developer pulls down latest from source control and is able to build because he has access to the internet and the packages automatically download.
  3. Build fails on build machine because packages are not included and machine does not have access to the internet.

Or perhaps the build works on the build machine because it happens to be able to pull down the packages (which might even be worse).  Ultimately, I'm not sure I want my build process automatically pulling down packages unless I tell it to do so.  I'd rather know that everything I need to build my product is included in source control.

If I'm misreading the spec, you can ignore everything above :-)

Apr 27, 2011 at 7:02 PM
EddieGarmon wrote:

The poorest assumption that has been made so far is that everyone does "Single Solution" development. In my current work we have over 20 solutions composing different parts of the system we want to work with/debug. We currently have one common libs root folder that is addressable by environment variable (MSBuild friendly). Until NuGet supports this kind of deployment scenario by allowing the specification of a common packages folder, either in the nuget.config (solution level) and/or packages.config (project level), it will just be another toy that is great for one-off apps and demos.


This is a good point as well. 

Coordinator
Apr 27, 2011 at 7:06 PM

@EddieGarmon: We’ll start to look into that after this release. I know of large enterprises (like Dell) making use of NuGet as more than a toy. So I think it is possible to build larger system solutions. Especially if you think of those systems as isolated components developed independently.

@Sodablue Specifying which repository to use is possible today. With NuGet 1.3, we now have a per-user config file which stores package sources. So you could use that to specify a set of repositories on the build server.

As for the other scenarios, we’ll have to think through how they fit into NuGet vs. how they fit into MSBuild. NuGet, as you point out, isn’t a build system; it’s only one part of what Maven does. As we get more and more familiar with how Maven works, we can start to think about how we can better integrate with MSBuild.

@MCHampster: In most cases though, if you do check in the packages, you’ll never run into a situation where this fails. I mean, if your developer forgets to put the packages in source control, your build is broken with or without this feature. :)

But to be safe, we will allow you to turn it off.

Apr 27, 2011 at 8:04 PM

I think this is a great idea, and I can see lots of uses for it. The only beef I have with the spec is that I think the behavior should be off by default; it kind of flies in the face of how most folks have been handling binary dependencies for ages. And I think the CI scenario is a bit trickier than most. Take TeamCity: the way agents work, they check out a completely new copy, so they would be downloading all the binary dependencies on every build. That starts to add up really fast, never mind what it can do to your build times.

Apr 27, 2011 at 8:35 PM
My perspective is that it should not be the default behavior, but I don't have a concrete reason at the moment as to why; only a feeling. Perhaps it's too early. Once people really start to trust NuGet and it is battle-hardened, I think we can start thinking about making it the default. From my perspective it's too early to start shifting everyone by default. Those who do not want to store them are advanced users who are used to Maven; when it doesn't work perfectly, they will already be down the road of figuring out why it doesn't.

Plus for me it removes another variation in my build, which can already fail for various reasons. I want it to be simple by convention and advanced by configuration. Simple is checking in my dependencies. Advanced is magically pulling the right version down.
Coordinator
Apr 27, 2011 at 10:06 PM

Just to clarify, being “on by default” doesn’t force or require anyone to change their existing workflow. So we wouldn’t be shifting anyone by default. In fact, it really wouldn’t have any effect on most people.

Consider that being on by default is a no-op for those using the current model.


Does that change anyone’s opinion about being on by default? :)

Apr 27, 2011 at 10:16 PM
Edited Apr 27, 2011 at 10:17 PM

Great to hear this is now being worked on! (even though I was called a troll on #165 :)

@ferventcoder I think on by default is more of a fail-safe mechanism, in that if you are committing your packages and continue to do so, nothing changes. If you decide to stop doing that, then this behaviour surfaces. I don't think it's asking people to shift... (correct me if I'm wrong, of course).

@sodablue pretty much nailed it...

I am actively doing this right now:

- I have an internal repo on a fileshare where 'blessed' versions of 3rd party packages are stored.
- My CI builds publish new packages to this repo.
- That repo is mirrored to developer machines (optional) and the CI server.
- Packages whose version are 'fixed' and set in stone are added using the NuGet 'Add Library Reference...' dialog.
- I have a 'dependencies.xml' in the solution that defines tools and 'fluid' dependencies (packages) with a version range.
- 'Fluid' dependencies are stored in root\libs.
- A custom NuGet.exe (based on 1.2) that accepts a version range (issue #405), kept in the root to bootstrap the dependency restoration.
- A 'RestoreDependencies' powershell script in the root that restores the various packages.configs and parses the 'dependencies.xml'. If there is a local mirror defined via an environment variable, this is used as the source, otherwise the file share is used.
- A psake build task that runs this script, and thus invoked at CI time, each time.
- A chain of TeamCity build triggers that rebuild upstream dependants whenever a new package is created.
- When the developer wants to depend on a 'fluid' dependency, they must reference it in the usual manner ('Add Reference...'). The assumption is that the hint path and assembly name are not going to change with new versions of the fluid dependency.
- The developer must run the 'RestoreDependencies' script manually before opening the solution for the first time, and occasionally to keep it up to date when desired.
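A minimal sketch of what a restore step along these lines might look like. This is not the actual 'RestoreDependencies' script; the variable names, the file-share default, and the libs output folder are assumptions for illustration:

```shell
#!/bin/sh
# Prefer a local mirror if one is configured via environment variable,
# otherwise fall back to the central file share (both names are made up).
SOURCE="${LOCAL_NUGET_MIRROR:-\\\\fileshare\\nuget}"

# Restore every packages.config found under the source tree into libs.
# The leading 'echo' makes this a dry run; drop it to invoke nuget.exe.
find . -name packages.config | while read -r cfg; do
  echo nuget install "$cfg" -o libs -Source "$SOURCE"
done
```

The environment-variable fallback is the key design choice here: developer machines with a mirror get fast local restores, while machines without one transparently use the shared repo.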

I have approx 70 third-party dependencies and tools, and 20 solutions (a handful are OSS forks) with between 4 and 32 projects per solution (averaging in the teens), and it is growing fast. This is all working OK so far. Not perfect, though a much better situation than where I was last October.

I would say though, if the concept of 'fluid' dependencies is introduced, where one can build against 'the latest' (or some upper range) in CI, it is probably not desirable to auto update at dev time whenever the developer compiles.

I'm not sure that the ability to put packages into a 'holding state' before QA signoff (I call this concept 'channels') would be under the remit of NuGet. I'd see it as a separate tool that sits after the build process and is a feed to the build process (example: Artifactory and its TeamCity plugin).

I can post a sample solution of the above if anyone is interested.

Cheers,

- Damian ( twitter.com\randompunter )

Apr 27, 2011 at 10:41 PM
Haacked wrote:

I know of large enterprises (like Dell) making use of NuGet as more than a toy. So I think it is possible to build larger system solutions. Especially if you think of those systems as isolated components developed independently.

I work at a larger "enterprise" company as well, and I have similar feelings as @EddieGarmon.  While I agree with you in principle that it is good to treat systems as isolated components, pragmatically speaking, it can be useful to create solutions with cherry-picked projects in them based on what you are currently working on.  We've basically reached the point where we see .sln files as irrelevant.  What is important is the structure of our source tree and the structure of our build scripts.  From a build and source code perspective, we certainly look at things as independent systems, but we don't use .sln files to manage that.  Thus, having a system outside of .sln files to specify where packages should live is a must-have for us to use NuGet effectively.  It sounds like that may be the direction you are heading with the NuGet.config file.  I'm fine with a convention of some kind as well, which is obviously how it works today. My main issue today is that the convention relies on .sln files, and that is a no-go for us.

Coordinator
Apr 27, 2011 at 10:52 PM

We are moving closer and closer to that model I suppose. But I can explain why we have the current model.

In general, we need some way to group a set of related projects and a Solution is the best way we have at the moment. We aren’t interested in building our own model for grouping projects (at least not yet).

Keep in mind that packages add items to projects by automating the VS IDE: for example, adding project references, applying config transforms, running PS scripts, etc.

Now if you decide to upgrade that package across all your projects, the updated package may need to make updates to every project. But if those projects aren’t opened in VS, we can’t make the proper updates. So for those who have multiple solutions, this would be a problem because while you may update the package contents in the packages folder, you would have only updated the projects in the solution you currently have open. That wouldn’t be good.

Perhaps in the long run, we won’t rely on VS to make modifications to the project, but that’s expensive work and we don’t yet have the bandwidth to make that change since we have many other core features to implement. Ideally, we could convince the Visual Studio team to make their APIs available as assemblies that we could call into outside of VS.

Long story made short, I think that our long-term vision may head in the direction of supporting these non-solution-specific scenarios, especially if we start to gather a core set of highly motivated, talented contributors. But in the short run, we’re focused on building out these features incrementally. :)

Phil


Apr 27, 2011 at 11:02 PM
Haacked wrote:

 

@MCHampster: In most cases though, if you do check in the packages, you’ll never run into a situation where this fails. I mean, if your developer forgets to put the packages in source control, your build is broken with or without this feature. :)

 

But to be safe, we will allow you to turn it off.

Maybe I'm misunderstanding, but it seems to me that what you are saying is that with this feature, the build may not be broken, because NuGet would automatically go and get those external dependencies if it had a network connection.  I don't want that behavior.  If someone checks in the solution and for whatever reason the dependent binaries don't get checked in, or if they get deleted from source control, I want the build to be broken no matter where it is built.  With this turned on by default, it seems the dependencies would be downloaded automatically, if possible, so the build would succeed even though it shouldn't (because not everything necessary is there).

It seems like in this situation, it would not be a "no-op".  But again, I could totally be misunderstanding the spec.

By the way: I should say that NuGet rocks, and I really appreciate the work you guys are doing.  I hope one day to be able to contribute in some way myself!

Coordinator
Apr 27, 2011 at 11:08 PM

Ah yes, you are correct. This is a case where the behavior is different. In that case, you’re relying on the build being broken, but it’s not. Good point. :)


Phil

Coordinator
Apr 27, 2011 at 11:11 PM

Just to follow-up. I updated the spec to state that it's off by default. I agree that's probably the safest approach given that you should be required to opt-in to a breaking change, not opt-out. Thanks for the feedback!

Apr 28, 2011 at 1:09 AM

I'm really excited that this is being slated for NuGet 1.4. I'm currently using a slightly hacky approach using Dan Turner's fork of NuGet.exe (http://nuget.codeplex.com/discussions/249628). Basically I'm just running nuget.exe as a pre-build step, telling it to use the current project-level packages.config. So far, this has worked fine for our organization as we're slowly moving our dependencies to NuGet, and it plays nicely with Team Foundation Build as well!

I'd be fine with dropping the forked NuGet.exe in favor of the official build when it comes out, but it needs to accommodate multiple repositories, so that we can specify our internal repo in addition to the public NuGet repo. Perhaps I missed it, but I didn't see that requirement anywhere in the official spec.

Cheers,

Dan

Coordinator
Apr 28, 2011 at 3:49 AM

In NuGet 1.3 that we just released, package sources are now stored in a per-user location in a file named NuGet.config.

Here’s where that file lives on my machine:

C:\Users\philha\AppData\Roaming\NuGet\NuGet.config

If you open that file up, you’ll see that there are package sources specified. You could drop this file on your build server in the proper location and it’ll get picked up by NuGet. This is the same location that NuGet.exe looks in now.
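For reference, the contents of that per-user file look roughly like the following. The source names and URLs here are illustrative assumptions, not a verbatim copy of any particular machine's file:

```xml
<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <packageSources>
    <!-- illustrative entries; actual keys and URLs will vary per machine -->
    <add key="NuGet official package source" value="https://go.microsoft.com/fwlink/?LinkID=206669" />
    <add key="Internal builds" value="\\buildserver\nuget-feed" />
  </packageSources>
</configuration>
```

Dropping a file shaped like this into the build account's profile is what makes the build server see the same sources as a developer machine.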

The only tricky part right now is making sure that the user your build processes run under has a profile.

We don’t currently have any fallback behavior for this file, as it starts to get pretty complicated if we do. But we are exploring options for that sort of thing.

Apr 28, 2011 at 4:41 AM

Thanks for the detailed explanation Phil.  Your reasoning makes sense.  I realized that for now, the repositories.config file that gets created in the packages folder might be a good start at what I'm looking for.  If I could execute NuGet.exe against the repositories.config file instead of the packages.config file (or in addition to), then that would give me an easy way to pull in all the packages I need for a given source repository during a build, without an .sln file needing to be involved.  To @damianh's point, I would still love to be able to specify a version range in the packages.config rather than a specific version, but I would be content for now with the incremental step of being able to run nuget.exe against a repositories.config.  Obviously, it would be pretty easy to script an MSBuild target (or something similar) that peeks into the repositories.config file and runs nuget.exe against every entry, but it would be cool to see it baked in.
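Pending built-in support, a rough sketch of scripting that yourself: pull the path attributes out of repositories.config and feed each referenced packages.config to nuget.exe. The sed-based parsing and the output folder are assumptions about the file's shape, not an official mechanism:

```shell
#!/bin/sh
# Restore every packages.config listed in packages/repositories.config.
# The file is assumed to look roughly like:
#   <repositories>
#     <repository path="..\Proj1\packages.config" />
#   </repositories>
sed -n 's/.*path="\([^"]*\)".*/\1/p' packages/repositories.config |
while read -r cfg; do
  # paths are relative to the packages folder; 'echo' makes this a dry run
  echo nuget install "$cfg" -o packages
done
```

This runs once per source tree instead of once per project, which is exactly the convenience being asked for above.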

Apr 28, 2011 at 4:43 AM
For the advanced users with multiple solutions: It sounds like you want to use NuGet.exe (command line version). This will pull down your dependencies for you but it will not reference them in your projects. With what you are doing though, this sounds more appropriate to your needs.
____
Rob
"Be passionate in all you do"

http://devlicio.us/blogs/rob_reynolds
http://ferventcoder.com
http://twitter.com/ferventcoder
Apr 28, 2011 at 4:46 AM
On the other hand I just went to work for a company that had multiple solutions in a single set of folders under the same umbrella. It was a mess for them to deal with and we started to isolate each piece out. It's taken some rework but the end result in my mind is much more manageable.
Apr 28, 2011 at 1:33 PM

Just read the spec (man am I behind) and +1000 on turning this off by default. I think it's a great feature, but there are many situations where, when I grab a solution, I don't want my system reaching out to the internet by default. Several solutions I have are built on VMs that have no internet, so I download the latest from source control, grab the folder, toss it onto a USB stick, then open it up on the VM (or another disconnected machine). Also, my practice is to have the build server on the network but not on the internet, as I don't want build processes leaking out and grabbing things. If a user checked in the packages.config file, then the build should fail if I have the "go get packages" option turned off.

The only comment is around msbuild.exe. I'd rather not have the dependency on msbuild.exe, or rather on a concrete assumption about how I build my solution. For example, I have several solutions that are Java-based, but I'm using nuget.exe for package management. So nuget.exe needs to be able to do this itself. It's not clear to me where the responsibility lies here, but in some environments all I have is nuget.exe rather than Visual Studio. However it's incorporated, I'd like to ensure we put it in the command line tool as well.

CI servers should be using nuget.exe as their "source provider" or an external tool that shells out and grabs the goods. The workflow on my CI server could be a) pull down source tree from source control (which contains packages.config but no packages) then b) run nuget.exe to read the packages.config to find the location of the repo and pull down the right package version c) build the solution. If it works that way, I'd be a happy camper.
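Sketched as a build script, the CI workflow above might look something like this. The `RESTORE_PACKAGES` flag, the solution name, and the exact nuget.exe flags are assumptions for illustration; the point is that fetching is opt-in, so a disconnected build fails rather than silently downloading:

```shell
#!/bin/sh
set -e
# a) at this point the CI server has already pulled down the source tree,
#    which contains packages.config files but no packages folder

# b) only fetch packages when explicitly asked to; otherwise a missing
#    dependency should break the build rather than be fetched silently
if [ "$RESTORE_PACKAGES" = "true" ]; then
  # the leading 'echo' makes this a dry run; drop it to invoke nuget.exe
  find . -name packages.config -exec echo nuget install {} -o packages \;
fi

# c) build the solution (solution name is a placeholder)
echo msbuild MySolution.sln
```

With the flag unset, step (b) is skipped entirely and the build fails fast on any missing binaries, which matches the "off by default" behavior argued for above.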

So I guess I'm saying I want to have my cake and eat it too. Turned off by default but easily fetch the packages if needed or wanted.

Apr 28, 2011 at 3:05 PM
bsimser wrote:

Just read the spec (man am I behind) and +1000 on turning this off by default. I think it's a great feature but there are many situations when I grab a solution that I don't want my system reaching out to the internet by default.

I think it's entirely reasonable to have this off by default.  I suspect that people who are going to be using this functionality most likely are also going to be using custom nuget repositories.  So they'll be doing a variety of custom config as part of their build to get this working just right.

I also agree with bsimser's other points of being able to just do some sort of command line tool that pulls this down.  I assume that's how it would be implemented anyway, with the msbuild task merely being a wrapper around the same functionality.

 

Only other thing I'm worried about.  I think the custom repository needs to be in the source tree, rather than as an environment setting on the machine.  At least I need the option of doing this.

 

We try to be a big proponent of 'Get latest out of TFS -> Build' as a user story, rather than 'Get latest out of TFS -> Run 10 manual steps configuring stuff -> Build'.

Coordinator
Apr 28, 2011 at 3:46 PM

NuGet.exe already pulls packages (as David Ebbo’s blog post points out). So that will continue to work.

In a sense, the core feature we’re discussing here already exists. It’s just not yet integrated into the build process. So what we’ll do is add a build target so that whether you use VS with MSBuild, or just MSBuild, it’ll work.

If you’re not using MsBuild, just use NuGet.exe per David’s instructions. Winner!

Coordinator
Apr 28, 2011 at 4:34 PM

Regarding running NuGet.exe against repositories.config: I talked to David Fowler about this, and we’re not going to do it for this upcoming release (we need to scope it down), but we are considering it for the following release.

Earlier in this thread, I gave my reasons why we don’t do this today. However, we did note that probably 99% of packages are really delivery mechanisms for assemblies. So in theory, we could implement a feature like this that would support multi-solution systems in most cases.

This feature would be very constrained. For example, if packages had PS scripts that did important operations, they wouldn’t run because we wouldn’t have access to the DTE for every solution.

I opened a feature request for this: http://nuget.codeplex.com/workitem/1012

Apr 28, 2011 at 7:08 PM

@Haacked Our use case of running NuGet.exe against repositories.config would be to bring in packages that we did not check into source control.  The fact that it would not run the PS scripts and do any DTE modification would be fine, as those scripts are typically modifying project files and other things that we do check into source control, so they wouldn't need to be run again.  We get this today by running NuGet.exe against every packages.config, it would just be more convenient to run it once against repositories.config instead.

Coordinator
Apr 28, 2011 at 7:31 PM

@andyalm Well, the idea is that some folks want to use NuGet.exe to update every package in repositories.config. An update to a package might have new content or new PS scripts that need to run.

But as you point out, if all you want to do is reinstall the existing packages, this would work well.

Apr 28, 2011 at 7:44 PM

I think @andyalm's point is that running NuGet.exe over repositories.config would just be a shortcut over running it on each packages.config individually. So it would carry the same conditions and limitations:

  • Works great to restore the exact same packages that were previously installed
  • Does not work well for performing updates

So maybe there are reasons not to do this (like repositories.config possibly going away!), but I don't think the update limitation is a tangible reason, since it's no different from the current packages.config workflow.

Coordinator
Apr 28, 2011 at 7:45 PM

Oh, yeah. I totally agree (and thought I said so in the end).

I’m just looking ahead to future functionality. We should definitely do this, but probably not for 1.4 just yet. Let’s make sure we nail the first scenario and that we’re OK with supporting repositories.config first. :)

Apr 28, 2011 at 8:05 PM

@Haacked @davidebbo Sounds good guys.  Thanks for involving us in the discussion.

Apr 29, 2011 at 9:57 PM

@Haacked. We're currently using David's solution and not checking in the packages folder. In combination with our own local NuGet server, we're able to (1) sync the use of external packages throughout all our projects (a lot of them); (2) protect our projects from unsynchronized updates. We are just using a pre-build event in VS on the dev stations and an MSBuild script on TeamCity to do that. Sure, it would be nice to get rid of this additional pre-build-event copy line, but it is not a big deal.

Just .02c,

ms440

Apr 29, 2011 at 10:03 PM
EddieGarmon wrote:

The poorest assumption that has been made so far is that everyone does "Single Solution" development. In my current work we have over 20 solutions composing different parts of the system we want to work with/debug. We currently have one common libs root folder that is addressable by environment variable (MSBuild friendly). Until NuGet supports this king of deployment senario by allowing the specification of a common packages folder, either in the nuget.config (solution level) and/or packages.config (project level), it will just be another toy that is great for one off apps and demos.

There are 14 solutions in our system also. We placed all of them in the same folder where the common Packages folder lives. All individual projects live in the their folders. We used to have a common ExternalLibs root folder for all of the solutions but Packages folder replaced it nicely. The only difference with your scenario is that instead of having extra level of folders with individual solutions and their respective Packages and Proj1, ProjN folders, we have a structure were solution files live on the same level as Packages. Nuget works for us in this case.

May 7, 2011 at 9:32 PM

I'm a little confused by what the problem is with David Ebbo's solution (or using an MSBuild Target, which works on AppHarbor).

What is this work item going to do that's different to these build steps?

May 8, 2011 at 5:16 AM

I don't think anything would be that different. The main thing that could be improved is that today you have to manually bring in nuget.exe from somewhere, and then set up the pre build action. That could be made to just work.

May 8, 2011 at 9:34 AM

Ah, that makes sense, thanks!

May 27, 2011 at 4:35 AM
Edited May 27, 2011 at 4:36 AM

Been hacking on adding AllowedVersions to the InstallCommand as well as support for the new allowedVersions attribute in packages.config, though this may be a bit premature. Some discussion here: http://nuget.codeplex.com/discussions/259017. Fork here: http://nuget.codeplex.com/SourceControl/network/Forks/the_chrismo/AddAllowedVersionsToInstallCommand. Apologies to damianh for not doing due diligence to find his similar fork: http://nuget.codeplex.com/discussions/254693

Jun 1, 2011 at 6:32 PM

@haacked - Please explain how we can use the NuGet.Config with multiple package sources.

My NuGet.Config is as follows:

 

<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <packageSources>
    <add key="My Company NuGet Repository" value="\\server\share\My Company NuGet Repository" />
    <add key="NuGet official package source" value="https://go.microsoft.com/fwlink/?LinkID=206669" />
  </packageSources>
  <activePackageSource>
    <add key="My Company NuGet Repository" value="\\server\share\My Company NuGet Repository" />
  </activePackageSource>
</configuration>

 

I have a pre-build event in Visual Studio as follows:

nuget install "$(ProjectDir)packages.config" -o "$(SolutionDir)packages"

When I run the command, it seems to ignore whatever is in my NuGet.Config and uses the official repository anyway. It is able to install packages from the official NuGet repository but when I try to install internal packages:

------ Build started: Project: Test_Project, Configuration: Release x64 ------
  Unable to find version '4.1.0' of package 'Company.InternalPackage'.
C:\Windows\Microsoft.NET\Framework\v4.0.30319\Microsoft.Common.targets(902,9): error MSB3073: The command "nuget install "C:\Users\dpincas\Documents\Visual Studio 2010\Projects\Test_Team_Project\Test_Solution\Test_Project\packages.config" -o "C:\Users\dpincas\Documents\Visual Studio 2010\Projects\Test_Team_Project\Test_Solution\packages"" exited with code 1.
========== Build: 0 succeeded or up-to-date, 1 failed, 0 skipped ==========

However, when I supply the -source parameter, like:

nuget install "$(ProjectDir)packages.config" -o "$(SolutionDir)packages" -s "\\server\share\My Company NuGet Repository"

Then NuGet.exe is able to find the packages from my internal repository.

Am I using the NuGet.config file wrong, or has this multiple repository feature not been implemented?

Workitem 978 (http://nuget.codeplex.com/workitem/978) seems to address this and is marked as fixed, but I'm confused as to how to utilize the fix.

Developer
Jun 1, 2011 at 6:40 PM

The fix is in the upcoming version of nuget (not the current release), if you're in a hurry, you can download it from the ci machine (http://ci.nuget.org:8080/repository/download/bt21/1914:id/Console/NuGet.exe).

By default it'll use all sources defined in NuGet.config unless you specify sources using a semi-colon separated list. e.g. nuget install elmah -s "\\sever\share\MyCompanyShare;http://localhost/nuget;". You can also use the name of the source

e.g. nuget install elmah -s "My Company NuGet Repository;NuGet official package source"

Jun 1, 2011 at 7:01 PM

@dfowler - Thanks for the quick response. Sorry for my confusion! Phil's response earlier of:

"In NuGet 1.3 that we just released, package sources are now stored in a per-user location in a file named NuGet.config.
If you open that file up, you’ll see that there are package sources specified. You could drop this file on your build server in the proper location and it’ll get picked up by NuGet."

made me think that this was working in the current stable release.

Thank you for the clarification!