Possible to use Nuget externally?

Feb 21, 2011 at 5:47 PM

Since there are many people who cannot use NuGet inside Visual Studio in any fashion, is it possible in any way to work with NuGet outside of Visual Studio?

Perhaps through the command line directly, or maybe by writing a console app that calls into the NuGet binaries?

Coordinator
Feb 21, 2011 at 7:08 PM
Check out our FAQ: http://nuget.codeplex.com/wikipage?title=Frequently%20Asked%20Questions&referringTitle=Home

Short answer, yes!

Phil


Feb 21, 2011 at 7:17 PM

"The command line tool, NuGet.exe, will download and unpack packages, but it won't automate Visual Studio and modify your project files."

With this caveat, doesn't this make NuGet just a weird version of Robocopy?

Coordinator
Feb 21, 2011 at 7:30 PM
How so? Robocopy copies files and that's it.

If you type NuGet.exe install EFCodeFirst.SqlServerCompact, it's going to download that package as well as all of its dependencies and unpack each of the packages into a conventional layout; all you have to do is add references to the assemblies in those directories in whatever project system you're using.
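
For example, a minimal sketch of what that looks like from a plain command prompt (the resulting folder names below are only illustrative):

    # Download the package and its whole dependency chain into the current
    # directory; no Visual Studio involved.
    NuGet.exe install EFCodeFirst.SqlServerCompact

    # Each package lands in its own conventional folder, e.g.:
    #   EFCodeFirst.SqlServerCompact.<version>\
    #   <dependency>.<version>\lib\<assembly>.dll
    # You then reference the assemblies under lib\ from your own project system.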

At some point, we can add the ability for NuGet.exe to list available updates for packages in that directory and to update them in place.

If Robocopy does all that, then I haven't used it in a long time. :)

Phil



Feb 21, 2011 at 7:49 PM

If it won't add the references, I guess that also means it's not capable of running any of the config or code transformation files that actually let packages get installed into consuming applications?

Feb 21, 2011 at 7:56 PM

That's correct; it does not apply a package to a project, but only takes it as far as making all the bits available. We might consider having it do more in the future.

Coordinator
Feb 21, 2011 at 8:44 PM

Out of curiosity, if you’re not using Visual Studio, what *are* you using? What kind of project files do you have?

Feb 21, 2011 at 10:13 PM
Haacked wrote:

Out of curiosity, if you’re not using Visual Studio, what *are* you using? What kind of project files do you have?

We use Visual Studio across our entire shop; however, NuGet doesn't function in any of our environments, from Windows 7 on bare metal, to Windows 7 on Hyper-V, to Windows 7 on Parallels on a Mac. All machines exhibit http://nuget.codeplex.com/discussions/239798

Feb 21, 2011 at 10:21 PM

The reason I posted this question is that I recently ended up building scripts to speed up the deployment of rapidly changing binaries (which I had originally wanted to use NuGet for), and for my one use case it worked out well to have some robocopy-ness in it. That made me think about whether it would be worth wrapping some conventions around it and making it a formal pattern at my shop. At that point I realized the scripts don't really do much besides copy stuff, and that I'd need to be able to invoke the config transformation machinery. That's when I realized I'd effectively be rebuilding NuGet, which made me wonder whether it would be possible to use NuGet directly, but it doesn't really seem to fit that way.

Coordinator
Feb 21, 2011 at 11:12 PM

If you have time, it might help to arrange an IM session with one of the NuGet devs to debug that issue. AFAIK, we should work in all those environments.

Feb 23, 2011 at 3:28 PM
Haacked wrote:

If you have time, it might help to arrange an IM session with one of the NuGet devs to debug that issue. AFAIK, we should work in all those environments.

I'm all for this, and hopefully the issue can get resolved, since making NuGet a core component of my shop's entire ALM was a primary goal for this year, as many of my dependencies are growing to be shared by many solutions.

Mar 31, 2011 at 3:53 PM
Edited Mar 31, 2011 at 4:54 PM

Throwing this into the mix... with respect to agile development, continuous integration, and adopting NuGet inside our org, the inability to script and automate the update of a project's package references from outside Visual Studio is a showstopper.

(Related: NuGet #165)

Edit: Filed a corresponding bug here.

Apr 6, 2011 at 8:17 AM

>the inability to script and automate the update of a project's package references from outside Visual Studio is a showstopper.

+1

With this feature, NuGet will change the way we work and be awesome; without it, the build server cannot be involved, so we wouldn't even start. I am off to hack at it with Reflector and figure out how I can force an "update-packages" from my build scripts.

Coordinator
Apr 6, 2011 at 5:38 PM

Hi folks. I think I'm still having a mental block on some aspects of this discussion. Please forgive my thick-headedness. I'd like to focus on one thing at a time, if you'll bear with me.

There are 2 issues related to this one: http://nuget.codeplex.com/workitem/818 and http://nuget.codeplex.com/workitem/902.

Issue 818 requests:

"Think things like Creating a new solution and adding 7 packages with one script, as well as other possibilities it can open up."

When I read this, I think, "all of this can be done inside the PowerShell console; make a command to do it."

But LeeCampbell adds to that issue:

"it seems that you are implying that if the Build server had VS installed, that I could automate this?"

Ah, so you want the build server to create a new solution and add 7 packages? That's the part I don't understand. Why would you want a build server to do this?

Please hear me out so you can correct my misconceptions. :) I can understand why you wouldn't want to check in the packages folder and binary packages. That's why we implemented the ability to restore packages from packages.config using NuGet.exe: http://blog.davidebbo.com/2011/03/using-nuget-without-committing-packages.html. That is something you can run on the build server.
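
For example (the project name and paths below are only illustrative), that restore is a single command:

    # Restore the packages listed in a project's packages.config into the
    # solution-level packages folder, with no Visual Studio involved.
    NuGet.exe install MyProject\packages.config -OutputDirectory packages
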
When I think of a build server, I think of a server that takes the specific changesets developers check in and automatically builds all the related artifacts, pulling in whatever is needed to make that build.

When I think of a NuGet package being installed into a project, I think of it changing the project in such a way that the developer can code against the library. That particular task is not something I would want a build server to automate, because a build server isn't going to write code against the library I just installed. A developer is. I'd be worried about a build server making changes to a solution that I know was working when I checked it in.

After a package is installed or upgraded, I would want a developer to test the application, make any necessary tweaks, write some unit tests, then check the code in under a known working state, and have the build server build that specific changeset, because it's in a known working state.

So again, I can understand why you'd want to automate pulling in the packages explicitly specified in a project on a build server, but I don't understand the other scenario. If I understood it better, we'd be able to make better progress on it, or perhaps help guide you to implement the workflow yourself.

 

Apr 7, 2011 at 10:28 AM

Thanks for the quick reply @Haacked.

I see our miscommunication. Let me explain my requirements and hopefully you can see why I have jumped on to this thread.

We are building a Silverlight composite application. It has client- and server-side components. There is a set of core (in-house) libraries, used for logging, communications, etc., that are cross-compiled to work in .NET 4 and SL4. There are also some libraries that are specific to .NET 4 (server-side, such as DataAccess) and others that are specific to SL4 (client-side, such as controls).

A client-side component might have a dependency graph that looks something like:

  • ModuleA
    • ModuleA.Contracts (our public interfaces that are exposed to other modules)
    • Internal Control Library
    • 3rd Party Control Library
    • Internal.Communications
      • Internal.Core
      • 3rd Party Communications library
      • Reactive Extensions
    • Internal.Core 
      • Reactive Extensions
      • 3rdParty Logging

A server-side component may then look like:

  • ServiceA
    • DataAccess
    • Internal.Communications
      • Internal.Core
      • 3rd Party Communications library
      • Reactive Extensions
    • Internal.Core 
      • Reactive Extensions
      • 3rdParty Logging

Consider that we have about 5 server-side components/services and 4 client-side modules. That covers the production-deployable code; on top of this we have:

  • fakes for many of our back-end dependencies, which also use the Communications library
  • Monitoring tools that use the Core and Communications libraries
  • Testing tools that use the Core and Communications libraries

When our team makes a fix or an improvement to the Core or Communications library, they currently have the manual process of updating all of the projects that depend on them. This process is very time consuming (2-6 hrs), and due to the friction and laborious nature of the task it is often done quite poorly. We also have a ramp-up coming where we expect to double the number of modules and services in the next year, which would make this manual process even more unbearable.

What we are looking for is the following automated process:

  1. when an internal library is modified, the build server will version, build, test, package, and publish it into our internal NuGet repository
  2. the build server will then trigger the dependent builds (4 modules, 5 services, fakes, test tools, and monitoring tools)
  3. builds for projects that consume a NuGet dependency will check the internal NuGet repository for newer versions and get the latest
  4. they will update the csproj files and packages.config to the latest versions and check in the changes.

External products (e.g. Reactive Extensions) will be added to our internal NuGet repo manually, as we don't want to update automatically as new versions of these come out.

FYI: I have managed to create a nasty MSBuild Task that does automate this final step of the process.
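
Very roughly, the idea is something like this (a simplified PowerShell sketch rather than the actual MSBuild task; the project path, package id, and versions are placeholders):

    # Hypothetical sketch only: bump one internal package reference outside
    # Visual Studio. Project path, package id, and versions are placeholders.
    $proj       = 'ModuleA\ModuleA.csproj'
    $packageId  = 'Internal.Core'
    $newVersion = '1.2.0'

    # Update the version recorded in packages.config.
    $configPath  = Join-Path (Split-Path $proj) 'packages.config'
    [xml]$config = Get-Content $configPath
    $pkg         = $config.packages.package | Where-Object { $_.id -eq $packageId }
    $oldVersion  = $pkg.version
    $pkg.version = $newVersion
    $config.Save((Resolve-Path $configPath).Path)

    # Point the HintPath in the csproj at the new package folder.
    $pattern     = '\\{0}\.{1}\\' -f [regex]::Escape($packageId), [regex]::Escape($oldVersion)
    $replacement = '\{0}.{1}\' -f $packageId, $newVersion
    (Get-Content $proj) -replace $pattern, $replacement | Set-Content $proj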

Our perceived alternative was to have the mega-solution where all projects were referenced and built together. This is what we initially had, but it became unusable as we have over 100 projects and build times exceeded 10 minutes. This eventually meant that devs would find ways to reduce the friction (i.e. not running tests, checking in without getting latest, creating mini-solutions), which just resulted in more broken builds and more developer downtime while people waited for almost-unrelated code to be fixed so the solution would build.

 

I hope that explains the pain that we are feeling and why we see NuGet as a great fit for our requirements. I also hope that explains why we have the (unusual) requirement for the build server to modify files and to check them in too.

Lee

Coordinator
Apr 7, 2011 at 5:05 PM

Thank you Lee. That makes total sense to me.

Here’s something interesting to consider. In most cases, I would bet that the only change that’s needed to your .csproj file would be to change the hint path, right? For example, here’s what gets added to a csproj file when you install a package that only has an assembly (no content).

<Reference Include="MyAssembly">
  <HintPath>..\packages\MyPackage.1.4\lib\NETFramework40\MyAssembly.dll</HintPath>
</Reference>

One feature request we heard from folks was to not put the version in the package folder. We’re still investigating this idea as it would have a huge impact on NuGet. But if we did this, that reference would look like:

<Reference Include="MyAssembly">
  <HintPath>..\packages\MyPackage\lib\NETFramework40\MyAssembly.dll</HintPath>
</Reference>

If we did this, then in theory it sounds like your process would work without automating VS, using NuGet.exe via the workflow David Ebbo described. Where it would break down is if these internal packages of yours added content to a project. But it sounds like that's not the intent of these internal packages, and you could work around it by refactoring the packages you want upgraded automatically to not include content.

Are my assumptions correct? I'm looking for ideas that can work sooner rather than later, as automating VS from outside of VS is a pain. Editing csproj files directly (not using VS) while maintaining parity with VS is also a pain.

Thanks for describing your scenario! We're still in the process of learning a lot about the myriad ways people build software (not just write it). :)

Apr 7, 2011 at 8:20 PM

Indeed, the tricky part is that while updating some packages simply means referencing a newer version of the same assembly, there are all kinds of scenarios where it means doing more things that can't easily be done outside of VS (e.g. running an install script that may rely on DTE).

As Phil suggests, things will become easier if we move to a non-versioned folder scheme, as it will at least allow for an in-place update outside of VS to happen. It won't be a complete update if the package has changed in any significant way, but it will work fine if the only change is a newer version of the existing DLL.

Aug 8, 2011 at 12:47 AM

I think the distinction between the use cases of NuGet is important - people build software in different ways, and the relationship between a project and the assemblies it references varies:

  1. Using NuGet to manage 3rd-party references (e.g. log4net, NUnit) - which is what it currently seems to be aimed at
  2. Using NuGet to manage the application's own in-house references (which is where the automated safe-update requirement comes in), and therefore requiring CI-server capability

Put simply, to be used in a corporate CI process the product really needs to support both of these. This discussion could be of interest: http://nuget.codeplex.com/discussions/268100

Jul 7, 2014 at 5:22 PM
Our use case for installing a package (in full) outside of Visual Studio seems to differ from the others here, so I would like to share it in the hope that it helps get this issue some traction:

One of the ways we release our software is through NuGet. We also strive for CI, which means we want to automate as much of our build/test process as we can. Currently, the only thing we have to test manually is the installation of our NuGet package. The reason is that we cannot currently have functional tests that install our package into a solution, build the solution, and see whether our deployment worked. Instead, someone has to manually install the package into a solution and then kick off the automated tests.

Unfortunately, this has bitten us in the ass: just recently, our NuGet package deployment went awry. Due to the manual nature of the test, the problem only surfaced at the very end of our release process (after everything had gone green). It took us several days to track down the source of the problem and resolve it, which caused us to release our software without a NuGet package update.

If we had an automated test for install/uninstall scenarios, we would have seen the problem weeks in advance and had more than enough time to resolve it, or possibly avoid the bulk of it.

We are currently investigating automation of Visual Studio, but that is no trivial task on a build server.

To generalize the problem: I feel that the inability to automate package install/uninstall discourages enterprise shops from using NuGet, since they can't have automated test scenarios for their own NuGet packages. Looking at other people's comments on this issue, this generally seems to be an enterprise problem surrounding automation. In general, anything one can do manually, an enterprise shop will probably want to automate at some point in an effort to eliminate or reduce risk.

Jan 19, 2015 at 8:12 PM
Edited Jan 19, 2015 at 8:14 PM
Our use case is an MSBuild system on TFS for a desktop application that needs to first restore existing packages and then automatically update certain internal dependencies. We have several solutions and projects sharing the same directory tree (some projects are shared, or have different build configurations, across solutions).

Currently we can restore packages, but we can't update individual packages solution-wide (since our projects depend on each other, they all have to have the same version of the same dependencies within a solution).
Our working solution is:
  1. call nuget restore X.sln
  2. search for all csproj files in the sln file
  3. for each Y.csproj file, re-write its packages.config file to set allowedVersions=[version] for the packages we're not updating, so that we can update certain packages.
  4. for each Y.csproj file, call nuget.exe update Y.csproj
It would be quite helpful if steps 2-4 could be replaced with "nuget.exe update PackageName X.sln".
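
In script form, steps 2-4 currently look roughly like this (a simplified sketch; package ids and paths are illustrative, and it assumes the script runs from the directory containing X.sln):

    # Simplified sketch of steps 2-4 (package ids and paths are illustrative).
    $sln       = 'X.sln'
    $updateIds = @('Internal.PackageA', 'Internal.PackageB')   # packages we DO want updated

    # Step 2: find every csproj referenced by the solution.
    $projects = Select-String -Path $sln -Pattern '"[^"]+\.csproj"' -AllMatches |
        ForEach-Object { $_.Matches } |
        ForEach-Object { $_.Value.Trim('"') }

    foreach ($proj in $projects) {
        $configPath = [System.IO.Path]::Combine((Split-Path $proj), 'packages.config')
        if (-not (Test-Path $configPath)) { continue }

        # Step 3: pin every package we are NOT updating to its current version.
        [xml]$config = Get-Content $configPath
        foreach ($pkg in $config.packages.package) {
            if ($updateIds -notcontains $pkg.id) {
                $pkg.SetAttribute('allowedVersions', "[$($pkg.version)]")
            }
        }
        $config.Save((Resolve-Path $configPath).Path)

        # Step 4: let nuget.exe update whatever is still allowed to move.
        .\nuget.exe update $proj
    }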