Resolve Solution Dependencies

Oct 8, 2010 at 6:53 AM

I used NuPack for the first time to add NUnit to an open source project. While committing to source control, I noticed that there are 5MB worth of files in the NUnit package. Fair enough, but I only needed one small dll for my project. I suppose I should have just picked out the DLL I needed to add and ignored the rest.

However, I was wondering if NuPack has a feature where it can scan a solution for packages.config files and re-download the dependencies? If so, I wouldn't need to add any of the package folder contents to my source control tree. Users could simply use NuPack to download the necessary packages. In fact, the VS add-in could check behind the scenes for you when you build. Does anything like that exist currently? And if not, do you think it would be a cool feature?

Mark

Oct 8, 2010 at 7:39 AM

Exactly, I was just about to write the same request.

I wouldn't want to check in 3rd party libs, since I have a packages.config that already lists all my dependencies and their versions. So when I check out a solution on a clean box, I want a context menu command, 'Update packages', that would go and download the missing ones at the exact versions specified in packages.config. If there were an option for VS to do it automatically, that would be great.
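For reference, a minimal packages.config along these lines (the ids and versions here are just examples):

```xml
<?xml version="1.0" encoding="utf-8"?>
<packages>
  <package id="NUnit" version="2.5.7.10213" />
  <package id="log4net" version="1.2.10" />
</packages>
```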

Andrei.

Oct 8, 2010 at 3:13 PM
Why wouldn't you want to check your third party libs in?

Just curious.

-d



Oct 8, 2010 at 3:57 PM

Because now I have nupack :)

Let me explain.

Before, I frequently checked in libraries, and I even tried to manage their versions by putting them in separate directories and then mapping particular version directories to my solution directory, where the libraries were actually referenced from. I also worked on big projects where we had hundreds of megabytes, if not gigs, of libraries in source control. In my opinion, source control is supposed to keep source. The source of the libraries is maintained by the guys developing those libraries. All I care about is the binary of a particular version that I tested and am comfortable with.

Now that we have nupack, and it stores everything it needs (including versions) in the packages.config metadata file, why should we store those libraries in source control? Of course, you can if you have reasons to, but nupack has the potential to manage this stuff on its own.

Being able to restore the libraries according to packages.config goes hand in hand with the ability to control the versions of the libraries. It would be great to easily tell nupack 'ok, let's update assemblyA to 1.2 but leave the others as is' and then, after a second thought and some testing, 'no, that was a bad idea, let's stick to 1.1'.
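The scan-and-restore step being discussed could be sketched like this (illustrative only - NuPack itself is .NET, and the function name and `Id.Version` folder layout here are assumptions, not its actual API):

```python
# Sketch: read packages.config, then report which packages listed there
# are missing from the local packages folder and need to be re-downloaded.
import xml.etree.ElementTree as ET
from pathlib import Path

def missing_packages(config_path, packages_dir):
    """Return (id, version) pairs listed in packages.config but absent on disk."""
    root = ET.parse(config_path).getroot()
    missing = []
    for pkg in root.findall("package"):
        pkg_id, version = pkg.get("id"), pkg.get("version")
        # Assumed on-disk layout: packages/<Id>.<Version>/
        if not (Path(packages_dir) / f"{pkg_id}.{version}").is_dir():
            missing.append((pkg_id, version))
    return missing
```

An actual implementation would then fetch each missing (id, version) pair from the feed and unpack it under the packages folder.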

Oct 8, 2010 at 4:09 PM
I see where you are going now. Although it most likely won't make v1, I like the picture you are painting. Please log a ticket so that we don't forget about it. And as always, we accept patches. :)

-d



Coordinator
Oct 8, 2010 at 4:28 PM

The concern I have with that approach is what if the owner of a library yanks it from the NuPack gallery? Now what?

My philosophy on source control is that as much as possible, I want a snapshot of my source code and its dependencies in source control. For example, I tag every version of the products that I ship. That way, I can go to any version of my application in source control and know that's *exactly* the version I shipped. I don't have to pull dependencies from an online resource and *hope* that's *exactly* bit for bit the version I shipped.

However, I'm not opposed to a command that does what you described in regards to scanning for missing packages. Someone could have accidentally deleted it. Or maybe someone doesn't share my philosophy. Please do log an issue and we'll consider it. :)

Oct 8, 2010 at 4:59 PM

Agreed that we could potentially have something like this.  Generally, we need some better support for getting things back into a clean state after they got messed up, and this can be part of that story.

Oct 8, 2010 at 5:24 PM

@haacked - my issue is not so much with having nunit.framework.dll in my source control - I do that anyway - it's just that another 5MB of stuff now comes along for the ride. It's my own fault - I should have noticed before checking in. But since my project is using Mercurial, all that junk is stuck with me for good (or at least that's how I understand Mercurial). Another option would be an NUnit-lite NuPack package.

Oct 8, 2010 at 5:57 PM

Registered #209, and there is already #210 about the same thing.

Oct 8, 2010 at 5:58 PM

I like this idea. It was the first thing I tried, after I installed a package with NuPack -- could NuPack install my dependencies from a packages.config file? Things like bundler and maven make it dead-simple to have a manifest in place of including every dependency, and it seems like a really useful thing for NuPack to have.

Issue 165 was my attempt to capture this idea, if you're looking for something to vote on. :)

Oct 8, 2010 at 8:37 PM
markheath wrote:

do you think it would be a cool feature?

I do :)

Oct 8, 2010 at 8:47 PM
The reason I like this feature is that using Hg has shown me that for long-lived projects, where your dependencies change over time, pulling the entire repository down to a new system is horrible with all of the binary references stored in the repository. I see this feature making remote DVCS work with nupack like peanut butter and jelly.

I think the paradigm of checking in all of your dependencies works better with a centrally located source control system, where as a user you normally pull down only the tip.

Oct 8, 2010 at 9:07 PM

But that does imply that just pulling down the Hg repository will give you something that doesn't compile, right?  i.e. you'll first need to run a NuPack cmd to bring down the package bits?  Just making sure we're on the same page.

Oct 8, 2010 at 9:18 PM

@erichexter - agreed, I don't want my Mercurial repository getting bloated up with all the packages I ever tried out.

@davidebb - yes, it won't compile, but you could have the option to include a custom nupack MSBuild extension (or a simple executable) that downloads the package dependencies for you. I think this is something openwrap can already do with o.exe, if I understand correctly.
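A pre-build hook like that might look roughly like the following in a .csproj or shared .targets file. The restore-packages.exe command is purely hypothetical (NuPack shipped no such tool at the time) - it stands in for whatever script or executable does the download:

```xml
<!-- Sketch of a pre-build restore hook; the command name is illustrative only. -->
<Target Name="RestorePackages" BeforeTargets="Build"
        Condition="Exists('$(SolutionDir)packages.config')">
  <Exec Command="restore-packages.exe $(SolutionDir)packages.config -o $(SolutionDir)packages" />
</Target>
```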

Oct 8, 2010 at 9:37 PM

The first line of my build script would execute the bits to load the packages. I could pretty much see standing up a feed server for my company, and we would make sure we have a copy of our packages hosted locally so that download speeds are fast. Think of this as a site cache for the packages we use across all of our projects. There are a bunch of scenarios that could build on top of each other to make this awesome.

Oct 8, 2010 at 9:38 PM
I think that is the basis behind openwrap: it manages your dependencies and updates for you in the back end. It also keeps a local cahce for you.
____
Rob
"Be passionate in all you do"

http://devlicio.us/blogs/rob_reynolds
http://ferventcoder.com
http://twitter.com/ferventcoder

Oct 8, 2010 at 9:38 PM
And if I can learn how to spell. cache.
Oct 8, 2010 at 9:44 PM
I definitely see this being useful. I'm still a keep-your-references-local guy, but everyone has their own way and I respect that.
 
I also like the idea that you keep your stuff down in a local cache. The server could quickly get overloaded if enough people wanted to update at the same time.
 
We're only talking like, what, a million or so users of NuPack projected over the next few years? Or is it more?
Oct 8, 2010 at 9:56 PM

The current estimate is 3.7 billion users by next April. :)

Note that having a local cache on the machine is something we've been planning all along, but left out of v1.  Our repository architecture should make this pretty easy when we get to it, by basically chaining IRepository instances.

Coordinator
Oct 8, 2010 at 9:58 PM

So what happens when you clone a repository, go to build, and you find out one of the packages you depend on was yanked from the gallery. At that point you probably do wish you had at least checked in the assembly into your repository, no?

Oct 8, 2010 at 10:00 PM
I prefer the broken build token myself actually. ;) 
Oct 8, 2010 at 10:02 PM

Phil, it's a choice users will have to make, but I do think we should support that model.  And in most cases, I think well known dependencies will not disappear.  Or maybe they'll be updated to a newer build, and the behavior will then be that you'll get an update when you ask for the packages (assuming the version requirements allow it).

Coordinator
Oct 8, 2010 at 10:15 PM

Sure, I'm not opposed to supporting this option too. Probably not in v1, but we should make sure we have the right core APIs so others *hint hint* can build this.

Oct 9, 2010 at 5:11 AM

So, we just went through a big TFS migration at Dell, and the issue of storing binaries in the repositories came up as a big one because, as a company, we want to manage this stuff a little better. In our case, we actually migrated all of the reference assemblies to their own repository. Using nupack, we will host an internal repository for all of our internal teams to share dependencies, but we will also add the 3rd party libraries to that repository. For instance, we created an AppFabric client package. That way, if something is needed, we have a team that will make sure those binaries are saved and backed up and everything else that the enterprise needs. So I guess I am laying out a scenario which is not a core scenario, and most shops would not need this. But yes, for those that do need it, we can step up and contribute that feature. I do not think it is a v1 feature either.

Oct 10, 2010 at 6:47 PM
Edited Oct 10, 2010 at 6:54 PM

Echoing erichexter here...

In our company, 3rd party dependencies, OSS or otherwise, are stored in a local repository (really just a filesystem at this point), and we have a shared PowerShell script that will 'download' the dependencies on demand and on each build by the CI server. This completely isolates us from having libraries yanked from the gallery / internet. Our own assemblies and components are deployed to this repository as part of the CI process as well.

Oct 11, 2010 at 7:55 AM

Turns out there is already issue #165.