Restore packages from packages.config on build


It would be nice not to have to check the packages directory into source control. Basically, this would require a tool that installs all packages listed in packages.config.
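For reference, a typical packages.config pins each dependency to an exact version; the package ids and versions below are illustrative:

```xml
<?xml version="1.0" encoding="utf-8"?>
<packages>
  <!-- Each entry pins an exact package version; a restore tool would
       download any entry missing from the packages folder. -->
  <package id="Ninject" version="2.2.1.0" />
  <package id="xunit" version="1.8.0.1623" />
</packages>
```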

Note: This was discussed in the thread http://nuget.codeplex.com/Thread/View.aspx?ThreadId=236592
This was also duped in: http://nuget.codeplex.com/workitem/387
SPEC: http://nuget.codeplex.com/wikipage?title=Enabling%20Using%20NuGet%20Without%20Checking%20In%20Packages%20Folder

Related issue: http://nuget.codeplex.com/workitem/1198

This would make packages.config more analogous to Maven's pom.xml file.
Closed Oct 20, 2011 at 10:09 PM by aldion
  • can be enabled by right-click solution -> "Enable NuGet Package Restore"
    -- asks for confirmation (#1603)
    -- menu item is disabled when NuGet is busy (#1650)
    -- still works when default feed is disabled
    -- menu item name follows conventions (#1600)
    -- can be reenabled and restore files if they are removed (#1597)
  • restores packages from packages.config
    -- including PreReleases
    -- packages from local feed, nuget.server, official feeds
    -- picks up feeds from the roaming nuget.config file; disabled feeds are treated as such
    -- no issues if packages installed
    -- error cases (e.g. missing package) gave a meaningful error message
    -- note: doesn't restore packages behind auth, and the error message could be improved (#1655)
  • misc:
    -- doesn't work with websites (#1663)
    -- doesn't restore solution level packages (#1495)
    -- the default .nuget\nuget.exe from the feed gave me an overflow error on restore; it's fixed in the latest nuget.exe (#1595)
    -- can't update .nuget\nuget.exe unless the solution is closed first (#1599)
    -- the .nuget\nuget.config feed name should be set to something other than the URL (#1657)


joelmartinez wrote Oct 6, 2010 at 6:05 PM

I'm not sure about this ... it breaks the build server scenario because now one has to manually configure the server to include all of the packages for a given project. It's honestly better to check in dependencies like this on a project-by-project basis. IMO :-)

damianh wrote Oct 6, 2010 at 6:37 PM

"it breaks the build server scenario"
...not if the tool is (or includes) a Build Target to verify and download the dependencies.

Checking in binaries to VCS is yuck for several reasons: it bloats the repo, nobody is interested in diff'ing bins, it can be slow, etc.

Openwrap's approach, from http://strangelights.com/blog/archive/2010/05/16/1661.aspx:

"OpenWrap provides a set of msbuild targets that allow you to hook this DSL based description into your build process and, of course, Visual Studio. These msbuild targets will then take care of downloading the projects you require and their dependencies to a centralised cache on your local machine. This centralised cache will then be used to provide the references required to build your project."
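A NuGet equivalent of those hooks might look roughly like the MSBuild fragment below; the target name, condition, and nuget.exe path are assumptions for illustration, not NuGet's shipped targets file:

```xml
<!-- Hypothetical restore hook: target name, condition, and the path to
     nuget.exe are illustrative, not NuGet's actual NuGet.targets file. -->
<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <Target Name="RestorePackages" BeforeTargets="Build"
          Condition="Exists('$(ProjectDir)packages.config')">
    <!-- Download every package listed in packages.config into the
         solution-level packages folder before compilation. -->
    <Exec Command="&quot;$(SolutionDir).nuget\nuget.exe&quot; install &quot;$(ProjectDir)packages.config&quot; -o &quot;$(SolutionDir)packages&quot;" />
  </Target>
</Project>
```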

There is also another benefit to this approach (and please correct me if I am wrong). NuPack's packages.config defines a specific version of a dependency; OpenWrap allows you to specify a greater-than version. In a CI situation, you could have a build configuration whereby a project is consistently being built (and tested) against the latest version of some dependency. This would be very useful for quickly identifying breaking changes. With NuPack, you would have to do this manually for every new version of the dependency. OpenWrap's approach would work better with agile teams regularly releasing code.

I have hand-rolled a powershell solution that kinda does Openwrap's approach but am not proud of it at all. I would like to replace it with something a bit more solid. Committing packages to VCS won't work for my team at all.

Kiliman wrote Oct 6, 2010 at 7:48 PM

"Checking in binaries to VCS is yuck for several reasons - bloats the repo, nobody is interested in diff'ing bins, can be slow etc."
I disagree. Part of version control is being able to get the exact bits used to build a specific version. How do I know that some dependency will still be available when I have to rebuild a version from 6 months ago? That's why I commit my libs as well as tools (e.g., NAnt).

Although I do think that the actual .nupkg file should be downloaded to a common cache location, and only the lib folder added to the solution.

damianh wrote Oct 7, 2010 at 11:43 AM

"Part of version control is being able to get the exact bits used to build a specific version."
Do you put the .Net framework under version control too?

maburke wrote Oct 8, 2010 at 7:00 PM

I'm thinking about adding an Install-Packages cmdlet that reads any packages.config/packages.xml in the solution and installs missing dependencies. Is there a better name than "Install-Packages"? Maybe, "Restore-Packages"? Would it make more sense to have it as a part of Install-Packages?
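As a rough sketch of what such a cmdlet would do (shown here as a shell dry run rather than PowerShell, with made-up package names), it would read the id/version pairs out of packages.config and emit an install for each one:

```shell
# Hypothetical sketch of the proposed restore behavior, as a dry run.
# First, create a sample packages.config (ids and versions are made up).
cat > packages.config <<'EOF'
<?xml version="1.0"?>
<packages>
  <package id="Ninject" version="2.2.1.0" />
  <package id="xunit" version="1.8.0.1623" />
</packages>
EOF

# Extract each id/version pair and print the install command a restore
# tool would run for it (instead of actually invoking nuget.exe).
grep -o 'id="[^"]*" version="[^"]*"' packages.config |
  sed 's/id="\([^"]*\)" version="\([^"]*\)"/nuget install \1 -Version \2/'
```

Running this prints one `nuget install <id> -Version <version>` line per package; a real implementation would execute those only for packages missing from the packages folder.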

Haacked wrote Oct 8, 2010 at 11:04 PM

"Do you put the .Net framework under version control too?"
Of course not. Nor do you put the OS under version control. But you can reasonably expect the framework and OS to be on every machine. Or at the very least, you always have a min bar of pre-requisites that need to be in place in order to work on the project. I think that question is really just trolling.

I've used OSS libraries in the past that suddenly disappeared. The machine hosting the library dies, and the guy who ran it has long since stopped developing it. I don't think it's wise to assume that URLs are truly permanent when people and their projects drop off the planet all the time. That's why it's useful to have the specific libraries in source control.

Haacked wrote Oct 8, 2010 at 11:05 PM

But I should add, I like the idea of this command. :) Perhaps Restore-Package is good. We need to see if "Restore" is one of them "approved" PS verbs. List-Package -Restore is another option. :)

andrei_dzimchuk wrote Oct 11, 2010 at 8:46 AM

But isn't the issue of a library suddenly disappearing exaggerated?
If you released a version or two of your software that included that library, chances are you can find a distribution of your software somewhere and recover the missing lib from it. You can then put it in a local repository (not really source control) and make NuPack load it from that repository. Does NuPack support multiple alternative repositories, by the way?

If you never managed to release your software and the library you have been planning on using in V1 suddenly vanished... well maybe you just don't need that library. Look for alternatives.

Or maybe it's something you use at build time only and don't package with your application. What would that be? Test frameworks, but they are not likely to go away and there are alternatives; some custom build task, but you normally write those yourself... Can you give an example of what really screwed you?

damianh wrote Oct 11, 2010 at 3:45 PM

"Does nupack support multiple alternative repositories by the way?"
It would appear so. See http://weblogs.asp.net/bsimser/archive/2010/10/06/unicorns-triple-rainbows-package-management-and-lasers.aspx , Visual Studio Integration section. This could be very useful for private and corporate repositories.

AnglicanGeek wrote Oct 11, 2010 at 7:22 PM

I closed issue 210 as a duplicate of this issue.

AnglicanGeek wrote Oct 11, 2010 at 7:23 PM

I closed issue 209 as a duplicate of this issue.

ryanohs wrote Oct 11, 2010 at 11:29 PM

I agree with andrei. I think the idea of a library disappearing is exaggerated. Besides, I'd like this scenario for a corporate situation where I will have all dependent libraries (including 3rd party) hosted on a computer I control. We have several internal libraries that get updated frequently (2-3 times a week) and that are used by several projects. I don't want to bloat the repos with each new revision of the binaries. It should be up to each team using NuPack to decide whether they want to version dependencies with their code or not.

Haacked wrote Oct 14, 2010 at 12:31 AM

We'd like to support this at some point. We're going to move this to vNext.

maburke wrote Oct 14, 2010 at 3:35 PM

I worked up a simple proof-of-concept.

I'm not sure if it's getting the package list in the best way. It's also not getting solution packages. It would be nice to have a GUI hook. It would be really nice to automatically hook into a pre-build event. It would be nice to avoid re-fetching packages. It would be nice if packages.config stored the original package repos (a la bundler). Anyway, that's a brain dump. Any other comments?

Oh, and I did try it out (but didn't add tests, bad me). Here's my example project: http://github.com/spraints/nupack-restore-package
If you clone it, you can open the solution, hop into the console and run "Restore-Package", and then it's all set!

ascalonx wrote Jan 5, 2011 at 5:25 PM

This would be a great addition. I've always disliked having binaries checked into source control. Yes, it's nice for getting those perfect bits that were used to build that one time. But, having used maven in a past life, this workflow really sounds good to me.

With regards to a package suddenly disappearing: back in my java days, if there was any package we used frequently, we would clone it onto our organization's package server. This would pair really well with daisy-chaining local/network/remote repositories.

Also nice would be some tools to manage a local or network repository.

bahadirarslan wrote Jan 10, 2011 at 2:01 PM

I hope this feature will be available in the next release. I am not sure about NuGet's release cycles, but I couldn't see it in the v2 plans.

gtsiokos wrote Jan 19, 2011 at 11:23 PM

Good point Mr. Martinez

nicenemo wrote Jan 28, 2011 at 3:56 PM

I'd love to see this feature, the sooner the better, both in VS.NET and in Team Build. For now we're using a shared folder in team builds.

the_chrismo wrote Feb 8, 2011 at 4:01 PM

@joelmartinez - i think the idea is to have a part of the NuGet tooling that would automatically grab the dependent packages during a build so that you wouldn't have to manually configure the build server. Keep your packages in an artifact repository, not in source control. And any build could also publish its output back to the artifact repository as well for other builds to ... ahem ... build on.

Haacked wrote Feb 12, 2011 at 4:39 AM

For those that don't check in binaries, I'm curious how you handle tagged builds then? For example, when you ship 1.0 of your product, you probably tag your source tree "1.0" in source control. Then you might release versions 2 and 3 later on. But at some point, a customer with 1.0 calls you up and says the product is broken. Wouldn't you want to have the EXACT binaries you used to build 1.0 in your source tree?

I mean, a lot could happen if you relied on the build process to download the packages for that old version. Packages could get pulled from the gallery. Someone could replace the package in place to fix some minor bug.

Seems to me that in order to support older versions, you'd want to have 100% confidence that you could check out all the files needed to recreate any publicly released version of your product.

ryanohs wrote Feb 12, 2011 at 5:24 AM

@Haacked, We don't actually use NuGet, but we have a similar homegrown system that works like this: There is a separate internal repository of 3rd party assemblies that each programmer must check out. In each of our project repos we then have a ruby script that can retrieve the necessary dependencies and place them in the local lib folder.

So in the dependency repo we'd have


Each local project has a ruby script that specifies which dependencies and version numbers we need. The script pulls the correct dlls and places them in a local /lib/Spark/ folder. This way we don't have to check in each dependency in multiple places. The primary reason we did this is not 3rd party libs, but rather our own internal libraries (MSMQ message contracts and such) that get updated frequently. Basically we're trying to prevent bloating the size of our project repositories.
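A minimal shell sketch of that workflow (the paths, the deps-file format, and the Spark example are all illustrative stand-ins for the ruby script described above):

```shell
# Hypothetical sketch of the described workflow: copy pinned versions
# out of a checked-out dependency repo into the project's lib folder.
DEP_REPO=./dep-repo              # stand-in for the internal dependency checkout
mkdir -p "$DEP_REPO/Spark/1.1" lib
echo "fake dll contents" > "$DEP_REPO/Spark/1.1/Spark.dll"

# deps.txt pins one name and version per line (illustrative format)
printf 'Spark 1.1\n' > deps.txt

# For each pinned dependency, copy that exact version into lib/<name>/.
while read -r name version; do
  mkdir -p "lib/$name"
  cp "$DEP_REPO/$name/$version/"* "lib/$name/"
done < deps.txt
```

After the loop, lib/Spark/ holds exactly the version the deps file asked for, so the project repo never stores the binaries themselves.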

the_chrismo wrote Feb 12, 2011 at 6:13 AM

"I mean, a lot could happen if you relied on the build process to download the packages for that old version."

I disagree. Modern CI servers are very good at recreating old builds, with or without labeling in the CI server. TeamCity, for example, records the changeset # with each build and can reliably rebuild it.

Even so, I'd want a backup of my internal artifact repository ... and I realize at some point there's little difference between keeping binaries in version control vs. a network share somewhere that's backed up, but I side with the people that don't want to put (potentially large) binaries or any other build artifact in version control.

Haacked wrote Feb 12, 2011 at 9:18 PM

I know that CI servers can label the changeset, but what good is a changeset if the changeset is missing the files used to build that changeset? I guess if you backup the binaries used for every build somewhere else, that’s fine, but you have to maintain the mapping of your backup to your changeset. Is that the typical practice in this case?

I could imagine a workflow where you don't regularly check-in binaries except when tagging a public release.

damianh wrote Feb 12, 2011 at 10:16 PM

"Wouldn't you want to have the EXACT binaries you used to build 1.0 in your source tree?"
We have the EXACT binaries in our artifact repo, which is as important as the version control system. The two go together.
"I mean, a lot could happen if you relied on the build process to download the packages for that old version. Packages could get pulled from the gallery."
If you are serious about your product, you maintain your own 'local gallery'.
"Someone could replace the package in place to fix some minor bug."
That's a new version, and if someone does that, that is poor version management. I, for one, would not take a dependency on a library that did that. Or I put it in my local gallery and protect myself from such things.
"Seems to me that in order to support older versions, you'd want to have 100% confidence that you could check out all the files needed to recreate any publicly released version of your product."
As mentioned, source control + artifact repository gives you this.

The product I work on generates 10s (maybe 100s) of MBs of artifacts on a daily basis, and they are put into our artifact repository (ahem, a file share). These artifacts are picked up by lower-level dependents, which in turn are built by CI and generate more artifacts, which are picked up by the next level of dependents... well, you get the picture.

There is no way we are putting those into source control; it would explode!

Whatever the direction of the gallery and the accessibility of opensource libs, there is a definite requirement for a nuget compatible artifact repository that can be installed within an org and that nuget can restore packages from.

You might want to have a look at Apache Maven ( http://maven.apache.org/ ) and Artifactory ( http://www.jfrog.org/products.php ) for inspiration.

the_chrismo wrote Feb 13, 2011 at 5:57 AM

"I guess if you backup the binaries used for every build somewhere else, that’s fine, but you have to maintain the mapping of your backup to your changeset. Is that the typical practice in this case?"

Some of what may be missing here is what NuGet should help provide. My project files/config files/whatever in the changeset in version control should have a specific reference to the version of each dependency, and should trust NuGet to go off to the repo to get them. Even if the NuGet references said 'the latest build of library foo', (a) there'd likely be enough date info to reconstruct it, but more importantly, (b) the build output from that build would most likely also be in the artifact repository, greatly reducing the chance you'd need to rebuild from sources anyway.

Haacked wrote Feb 14, 2011 at 8:49 PM

Thanks for the input! So is an "artifact repository" a formal term? Or is it simply any location where you store your artifacts? So for example, if changeset "123" represents the 1.0 version of my application, somewhere there's a file share with a folder named "myapp_changeset123" or something like that?

That way, if I ever need to rebuild 1.0, I just need to check out the changeset 123 from my source repo, and grab the folder corresponding to the 123 changeset, and then rebuild.

I guess for small projects, that seems like overkill, but for large projects, I can see why you might want to do that given the sheer amount of artifacts you have. :)

damianh wrote Feb 15, 2011 at 9:52 PM

'Artifact Repository' appears to be a well-known term in Java land. It's also mentioned quite a lot in the Continuous Delivery book (worth a read if you can get a copy), in which an artifact repository is a key part of continuous build and deployment automation.

From a tooling perspective, .NET land appears to be very under-served compared to our Java friends. Perhaps it's a sign of relative 'enterprise' maturity, I don't know...

My current solution is a cobbled together powershell + fileshare + naming convention setup. Dependencies are defined in a text file which is checked into source control. This is parsed by a powershell script at CI build time, or manually by a developer at dev time.

If this workitem is done, I am hoping I can upload our own artifacts to our own local gallery (cough artifact repository) on each build and have nuget retrieve on-demand :)

the_chrismo wrote Feb 16, 2011 at 5:13 AM

What damian said.

SEWilson wrote Feb 23, 2011 at 3:31 AM

As with others, we have full CI in our build environment and the /packages/ folder doesn't really work for us as the CI process is meant to pull in the latest-greatest version of each dependency. In addition to using packages.config as an input to nuget.exe it would be convenient if (optionally):

1) The version numbers could be ignored, and the latest version of each package pulled
2) Package Management within Visual Studio, and thus Package Installation via Visual Studio, did not install to versioned folders (e.g. "-ExcludeVersion" as part of packages.config or similar, per-project)

The problem I'm having with this is that it's an excellent package manager, but it makes proper CI a real pain in the ass.

Once code is promoted from 'development' to 'qa' CI stops, and the versioning becomes important. The ability to pull down specific versions as-identified in packages.config becomes critical. "pre-lockdown" (e.g. in-development) is when these version numbers get in the way.

Additionally, configuring CI for each and every dependency is a pain without being able to feed the tool a list of packages to install.

It looks like I'll be writing a command-line tool to do what I need as I need this done by 8am tomorrow and I can't expect miracles :(

Otherwise, thanks for an excellent tool! :) /win

lindsve wrote Mar 9, 2011 at 7:45 AM

Anyone using a DVCS (Mercurial in my case) would know that this would make perfect sense. Binaries take up a heck of a lot of space in Hg, and it would be really, really nice if NuGet could take care of the dependencies for me instead of us having them in source control.


davidebbo wrote Mar 11, 2011 at 2:17 AM

I just blogged about a new feature of nuget.exe that supports this workflow: http://blog.davidebbo.com/2011/03/using-nuget-without-committing-packages.html. Please give it a try and let us know how that works in your environments.
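The workflow from that post boils down to a pre-build event that invokes nuget.exe against packages.config. A sketch of how that might look inside a .csproj (the relative path to nuget.exe here is an assumption):

```xml
<!-- Sketch of the pre-build-event approach, as it would appear in a
     .csproj; the path to nuget.exe is illustrative. -->
<PropertyGroup>
  <PreBuildEvent>"$(SolutionDir)Tools\nuget.exe" install "$(ProjectDir)packages.config" -o "$(SolutionDir)packages"</PreBuildEvent>
</PropertyGroup>
```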

damianh wrote Mar 31, 2011 at 4:04 PM

Example alternative PowerShell script that does the same, https://gist.github.com/896457, without the cost of a pre-build event; instead the dev has to run it manually occasionally.

damianh wrote Mar 31, 2011 at 4:52 PM

To get NuGet fully workable in CI, it will need to be able to update package references from outside Visual Studio. Mentioning it here because it's related. I've filed a bug here: http://nuget.codeplex.com/workitem/902 ; it needs your votes :)

Zasz wrote Apr 20, 2011 at 11:44 AM

I Neeeeeeeeed Thiiiiiisss... I think the way Maven does the dependency download is well thought out and highly successful; why shouldn't NuGet do the same? I need this feature together with a local repository at a project level, at a system level (maybe C:\Program Files\NuGet\LocalRepo), and if possible at a user level (C:\Users\<username>\.nuget).

Haacked wrote Apr 27, 2011 at 6:04 PM

Hi All, we're planning to implement the first iteration of this feature. Note that this first cut won't be the end-all/be-all, but it should address this particular issue pretty well.

Please review the spec here: http://nuget.codeplex.com/wikipage?title=Enabling%20Using%20NuGet%20Without%20Checking%20In%20Packages%20Folder

monoman wrote Jun 20, 2011 at 8:26 PM

Wasn't this issue solved with 1.3 and 1.4? Shouldn't this issue be closed?

hangy wrote Jun 30, 2011 at 10:59 AM

monoman: Unfortunately not. NuGet.exe 1.4.20615.182 does not include the MSBuild task, which is one of the requirements for closing this issue. It seems this was pushed back to 1.6 in August 2011, which has the main roadmap target "Improve Support for CI Scenarios".

dotnetjunky wrote Sep 19, 2011 at 8:22 PM

I'm doing this.

dotnetjunky wrote Sep 23, 2011 at 1:28 PM

Fixed in changeset 3b47d9f07360