335 votes

Allow package resource folders to be configurable

description

I <3 NuGet. But one thing I don't like is that it forces things like JavaScript files to be in one location. I keep my website organized a specific way, and I don't want my scripts in that folder. So it would be nice if certain folders could be configurable to drop items in an alternate location.

file attachments

comments

JeffHandley wrote Feb 8, 2012 at 8:32 PM

Interesting idea for sure.

JeffHandley wrote Mar 16, 2012 at 6:38 PM

An idea:

One could create a MyPackageName.config within the .nuget folder, specifying a mapping of where files should be installed. During package installation, we'd respect the mappings in place.

Going further, we could have UI around defining the mappings by folder or file.
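
A minimal sketch of what such a per-package mapping file might look like, assuming a hypothetical <contentMappings> schema (the file name, element and attribute names here are illustrative only, not an existing NuGet format):

<?xml version="1.0" encoding="utf-8"?>
<!-- .nuget/jQuery.config (hypothetical per-package mapping file) -->
<contentMappings>
  <!-- move everything the package drops under /Scripts into /Assets/js -->
  <map source="Scripts\*" target="Assets\js\" />
  <!-- move stylesheets into /Assets/css -->
  <map source="Content\*.css" target="Assets\css\" />
</contentMappings>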

balexandre77 wrote Oct 14, 2012 at 8:57 PM

This would be lovely; when using multiple libraries we tend to create folders for each library. An example is attached to this comment.

ParaSwarm wrote Nov 11, 2012 at 2:29 AM

I would love to include stuff like jQuery via NuGet if I could control the script location.

Amateur wrote Dec 3, 2012 at 8:10 PM

Any updates as to when or if this may be investigated? I think it's something that needs to be looked at, as projects lose structure when JavaScript files, for example, are added to folders that go against the natural structure of the project.

bartmax wrote Jan 4, 2013 at 1:29 PM

PLEASE!

I guess it would be great to have a MAP on the 'client|user-side' so existing packages will still work (not sure how they are created), but imagine that you could define a mapping like:

~/Scripts => ~/Content/Scripts

then when installing files you could parse the output directory and, if it's ~/Scripts*, change it to ~/Content/Scripts*
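
One possible shape for such a rewrite rule, sketched as a project-level nuget.config section (NuGet has no <pathRewrites> section today; the names are assumptions for illustration):

<configuration>
  <!-- hypothetical: rewrite content install targets for all packages in this project -->
  <pathRewrites>
    <rewrite from="~/Scripts" to="~/Content/Scripts" />
  </pathRewrites>
</configuration>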

JeffHandley wrote Jan 23, 2013 at 8:35 PM

Triage: moved into 2.3 due to high vote count.
This is not a guarantee that it will be completed in 2.3 though.

dotnetjunky wrote Feb 4, 2013 at 4:32 PM

We won't be able to do this in 2.3. Move to 2.4

roblang wrote Mar 13, 2013 at 3:53 PM

It's now March 2013 and this feature is becoming more pressing. I would love to be able to define a series of resource destinations per project in a nuget.config file.

koistya wrote May 22, 2013 at 11:53 PM

How about this:

[image attachment]

We don't even need a UI for this at first.

koistya wrote Jun 16, 2013 at 4:12 PM

Having the mapping info inside packages.config as opposed to nuget.config would allow installing packages from different vendors exactly as you need them. Different package vendors may have different folder structures. For example, one package installs scripts into the /Scripts root folder while another installs scripts into a /Scripts/{Vendor} folder, making your overall project structure inconsistent.

Instead vendors could just be forced to use /Scripts, /Styles, /Images or /Content folders for their assets and consumers would map these folders to their real-world project structures.

dotcom wrote Aug 15, 2013 at 8:42 AM

+1 for the idea from koistya, above.

bartmax wrote Aug 15, 2013 at 3:41 PM

I think the best approach would be an unopinionated folder structure.
Something like: I define that packages install into a /Modules folder.
Then each package installs all the necessary items under /Modules/{package}/ and, from there, anything the package owner decides.
Example for jQuery:

/Modules/jQuery/scripts
/Modules/jQuery/styles
/Modules/jQuery/styles/themes
/Modules/jQuery/images/

for bootstrap:

/Modules/Bootstrap/Less/{version}/
/Modules/Bootstrap/Scripts/
/Modules/Bootstrap/Styles/
/Modules/Glyphicons/Styles/
/Modules/Glyphicons/Fonts/

for knockout could be:

/Modules/knockoutjs/

It all depends on the package owner, and the structure is still easy to manage, update, delete, etc.
In this scenario, the package owner can include any files in its package folder without messing up the project at all.

Also, it would be great if the user could change the installation 'package name', so I can install SomeNamePackage into /Modules/MyNamePackage for that package.

To continue with the example above, I might configure jQuery to install into /Modules/TheAwesomejQueryFramework/, this being completely optional.

Packages that require some files to be placed in a specific folder, like glimpse.axd, should still be able to do so.

I think this gives the best of both worlds: package owners and users have the freedom to structure their folders as they wish.

You won't be able to have /Scripts/jquery.js and /Content/Styles/jquery.css, but there's no real benefit in having jquery.js in one folder and jquery.css in another.

The idea is not mine; Bower (http://bower.io/) is currently doing this, and it makes a lot more sense to me.
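
A rough sketch of how such an unopinionated root could be configured, assuming a hypothetical project-level setting plus an optional per-package folder alias (none of this exists in NuGet configuration today):

<configuration>
  <!-- hypothetical: every package's content goes under this root, in its own folder -->
  <contentModules root="Modules">
    <!-- optional alias: install SomeNamePackage's content into /Modules/MyNamePackage -->
    <alias package="SomeNamePackage" folder="MyNamePackage" />
  </contentModules>
</configuration>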

koistya wrote Aug 27, 2013 at 10:39 PM

Also, keep in mind that there can be many more file types than just .css, .less, .js... We need some generic solution to configure exactly where content files are going to be installed/restored.

[image attachment]

koistya wrote Aug 31, 2013 at 9:19 AM

/Scripts/knockout.js > /Scripts/knockout.min.js
/Scripts/knockout.debug.js > /Scripts/knockout.js

koistya wrote Aug 31, 2013 at 10:31 AM

Also, having the ability to skip some files during a restore would be beneficial. Consider a scenario where you want to install the Bootstrap 3 library, but you don't need any of its .js files because you're using AngularJS:
/Scripts/bootstrap/*.js > null

enorl76 wrote Sep 16, 2013 at 2:26 PM

Upvoted. My organization's standard is ~/Assets/js.

A redirect for, for instance, the jQuery NuGet package would be grand.

I would recommend redirect nodes WITHIN the package reference nodes:

<package [...]>
<redirect source="/Scripts/*" target="/Assets/js/" />
<redirect source="/images/*" target="/Assets/images/" />
</package>

The problem this could introduce, though, is that NuGet would now be responsible for "rewriting" the internal paths of JS libraries/CSS/etc... Ugh!

koistya wrote Nov 2, 2013 at 4:21 PM

Having the ability to redirect individual files is essential, as is the ability to ignore some files.
<package id="Twitter.Bootstrap.Less" version="3.0.1" targetFramework="net45">
  ...
  <redirect source="/Content/bootstrap/type.less" target="/Styles/core.type.less" />
  <redirect source="/Content/bootstrap/grid.less" target="/Styles/core.grid.less" />
  ...
  <redirect source="/Content/fonts/*.*" target="/Fonts/" />
  ...
  <ignore source="/Scripts/*.js" />
</package>

spiderM9 wrote Nov 8, 2013 at 9:22 PM

Please do not implement this at the solution-file level. It is too presumptuous to assume that solution files are even being used to build applications.

I like the <redirect...> above, assuming that it can be placed in a file that is discovered the same way the NuGet.config file is discovered: by walking up the source tree until a file with a particular name or file extension is found. If I have 600 project files and need to hand-curate the packages.config file for each one of them, well, forget it.

andresraieste wrote Dec 4, 2013 at 5:37 PM

I wouldn't make this overly complex. I really like Bower's solution, as it's unopinionated, as also stated earlier here. It's a tried and praised solution which, yes, does have its quirks, but it works best.

Therefore, I would add an optional attribute to <packages> tag in packages.config, something like this:
<?xml version="1.0" encoding="utf-8"?>
<packages defaultassetdir="Assets/Vendor">
  <package id="bootstrap" version="3.0.0" targetFramework="net451" />
</packages>
That way it is possible to install components into a custom directory, e.g. Assets/Vendor as in the example above.

Per-file configuration (overrides), on the other hand, is more flexible, but I think package management should be only about package management and not about introducing another layer of difficulty to the project. It could be misused a lot and might (or rather will) easily break in package upgrade scenarios, for example.

And since the discussion started around having a choice about the "Content" and "Scripts" folders, I'd say that all sane developers use minification and bundling anyway, so the package target attribute solution works very well for that reason.

andresraieste wrote Dec 4, 2013 at 5:58 PM

Oh, never mind, I just realized that an optional attribute on the <packages> tag wouldn't work at all for any package other than JS/CSS packages.

fejesjoco wrote Feb 20 at 12:56 PM

My colleagues and I don't really like that NuGet copies physical files (instead of references) into our projects at all. Storing them virtually could solve this problem as well. See https://nuget.codeplex.com/workitem/3903

enorl76 wrote Feb 20 at 1:54 PM

@fejesjoco: Your desire (for references instead of physical files) is technically out-of-scope for this particular request. You could keep your request more aligned with this one by adding the request for a better naming scheme, and then that parlays into you having a better naming scheme that should then override the package producer's naming scheme.

phil_ke wrote Mar 19 at 8:46 AM

How come this simple fix is not in place yet? It just completely destroys the folder structure of existing projects.

koistya wrote Mar 19 at 9:35 AM

As a workaround you can have client-side NuGet packages installed on a solution level (as opposed to a web application project) and have your build script copy required libraries to website's output folder. If you're working on a non-trivial web app, most likely you already have a client-side build system in place, also this way your source control system will be free of any 3rd party libraries. If you're new to client-side build systems, take a look at http://gulpjs.com and Gulp.js NuGet package.

Haggis777 wrote Apr 2 at 8:50 PM

This feature is the only reason I haven't embraced NuGet for my web projects. I can't stand having everything dumped into one directory. It's just sloppy.

jvanderstad wrote Apr 19 at 12:38 AM

@Haggis777.. I agree..

I just want a very clean folder structure. No random subdirs..
But I just love the update function..

so... big upvote!

vlab wrote May 15 at 4:42 PM

So 2 years and almost 300 votes later, what do we got?

milanjaros wrote May 15 at 5:56 PM

Vlab, it seems it's not even planned for release 3.0. :'-(

It's a sad truth that sometimes the feature which is no. 1 for users is not important to the product owner.

JeffHandley wrote May 15 at 7:19 PM

We put quite a lot of effort into trying to design this a while back, but we kept identifying so many issues in the user experience that we put it on the back burner.

When does the content file mapping need to be respected?

During package install

When a package is installed, the content file mappings need to be respected so that the content files are copied into the project in the desired structure.

During package uninstall

When a package is uninstalled, we have to know where the package contents were mapped in so that we can cleanly remove the package's contents.

What kinds of mappings are desired?

Rules for both folder and file name mappings would need to exist. Additionally, both generic rules for all packages and specific rules for particular packages/files would be expected to work (a combined sketch follows the examples below).

Generic rules to apply to all packages

Rules such as the following are desired:
  1. Put all JavaScript files into /js
  2. Put all CSS files into /css

Package/File specific rules

There are also scenarios for wanting to map specific files for specific packages
  1. Redirect /Content/bootstrap/type.less to /Styles/core.type.less
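
To make the two rule types concrete, here is the combined sketch referred to above; the <mappings> schema is invented for illustration and is not something NuGet reads today:

<mappings>
  <!-- generic rules, applied to every package -->
  <map source="**\*.js" target="js\" />
  <map source="**\*.css" target="css\" />
  <!-- package/file-specific rule -->
  <map package="Twitter.Bootstrap.Less"
       source="Content\bootstrap\type.less"
       target="Styles\core.type.less" />
</mappings>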

What is the desired user experience?

Ideal workflow

Let's paint the most ideal picture of how this would work.
  1. Package is installed (using its default content structure)
  2. The content files are moved around and/or renamed in Solution Explorer, and mappings are generated behind-the-scenes, and persisted
  3. When the package is uninstalled, the mappings are respected and the files are removed from their new places
  4. When the package is updated, the mappings survive for the new version to respect the same mappings

Acceptable workflow

Because the ideal workflow is pretty dreamy (with the amount of solution explorer integration that would be required), let's define an acceptable workflow that would be more manual.
  1. Package is installed (using its default content structure)
  2. The content files are moved around and/or renamed (either in Solution Explorer or outside of VS)
  3. A mapping file is created that defines the rules for how the content files were moved around
    1. Some of these rules might be generic, for all packages
    2. Some of these rules might be specific to the package's content files
  4. The package is then uninstalled
    1. The mappings that were hand-authored are respected during this uninstall.
    2. If the package uninstalls cleanly, then the rules were authored correctly.
  5. The package is then reinstalled
    1. The mappings are again respected.
    2. If the end result suits the user's desires, then the rules are doubly confirmed to be correct.
  6. The mappings are committed to source control with the project

What design work is there to do?

Looking at the acceptable workflow, we need to design the following aspects of the feature before we can act upon it.
  1. What is the persistence format for the mapping rules?
  2. Where are those mapping rules persisted? (packages.config, nuget.config, etc.)
  3. What scenarios are expected to work and which ones aren't?

Persistence Format

On the surface, the persistence format seems pretty straightforward. But once we got into the details of all the possible kinds of mappings users want, it evolved into something that matches what nuget.exe pack uses for mapping content files (see the snippet after the link below).

http://docs.nuget.org/docs/reference/nuspec-reference#File_Element_Examples
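
For reference, the nuspec <files> element used by nuget.exe pack takes src/target/exclude attributes; a mapping format for installed content could plausibly mirror that shape. The second element below is only a sketch of that reuse, not an implemented feature:

<!-- existing packing syntax (see the link above) -->
<files>
  <file src="scripts\**\*.js" target="content\Scripts" exclude="scripts\test\**" />
</files>

<!-- hypothetical install-time mapping reusing the same shape -->
<installMappings>
  <file src="Scripts\*.js" target="Assets\js" />
</installMappings>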

Persistence Location

Where do we store these mappings? Here are some options:
  1. packages.config
    1. This was shown a few times as being the intuitive place to store these mappings
    2. But when a package is uninstalled or upgraded, those mappings would then be lost because the package element would be removed from the packages.config file
    3. This also prevents the feature from being used at the solution level or even globally for the user/machine
    4. But the fact that the mappings would be lost when a package is upgraded was the true blocker for this approach
  2. nuget.config
    1. We could put these mappings into nuget.config at the project level, solution level, or anywhere else
    2. We already have hierarchical configuration support
    3. But this could lead to a situation of one config file being seen during package install, and then for some reason that nuget.config not applying to the project later on when the package is updated or uninstalled
  3. Somewhere else
    1. Perhaps we use a packages.mapping.config file in the project where mappings would be persisted
    2. These would only be respected at the project level

Working vs. Non-Working Scenarios

It's apparent there would be scenarios beyond the working boundaries of this feature. We'd need to clearly define those boundaries, but even with the best documentation imaginable we're sure to get bug reports when users hit scenarios that don't work.
  1. Path references within the content files
    1. What happens when a CSS file refers to an image file and the image file no longer exists where the CSS file expects?
    2. What happens when a package has an install.ps1 or uninstall.ps1 script that works on the content files?
    3. Are users expected to fix up paths in the files?
    4. If so, the package uninstall/update will fail because the file will be modified and left behind
    5. If content files are mapped into new locations, you're really out on your own and you have to be responsible for the package not functioning properly anymore
  2. Operating on mapped packages using old versions of NuGet
    1. What happens when one user upgrades NuGet, applies some content file mappings, checks in, and then another user without the latest version of NuGet tries to update the package?
    2. This would clearly leave the project in a broken state and it would have to be okay
  3. Limitations on content file mappings
    1. Is the existing mapping syntax we have sufficient?
    2. How do we avoid it evolving into regular expressions?

Why we halted the effort on designing this feature

As we talked through the working vs. non-working scenarios, we started realizing that the "happy path" for this feature is actually pretty narrow, and as soon as you wander off the happy path it falls apart pretty easily.

Should we build such a huge feature (that has a lot of impact and risk) when its working scenarios are so narrow? We didn't think so.

With that said, this problem is still on our minds. We'd love to hear more thoughts on how we could approach it so that it's robust and has a wide working surface area without many non-working scenarios that would basically become a bug farm.

milanjaros wrote May 17 at 5:55 PM

First of all I'd like to apologize for "complaining" ;) and say with respect "Thank you".

Design to do

Persistence Format

Sounds reasonable.

Persistence Location

Unfortunately, I don't see/understand the issue you mean by "some reason" in this text:
We already have hierarchical configuration support. But this could lead to a situation of one config file being seen during package install, and then for some reason that nuget.config not applying to the project later on when the package is updated or uninstalled.

Working vs. Non-Working Scenarios

Path references

The issue with paths is obvious. IMHO no-one can expect NuGet to handle that. There is no wizard or drag-and-drop in Explorer for this, so if we are modifying config files manually we need to read the documentation. And the documentation says: "If you do this you could get in trouble (with content paths)." ;) It is the responsibility of the developer.

On the other hand, I can imagine having a hook into the automatically running scripts which the developer and/or package owner could use to reflect the "content file mappings".

Old versions

I have a clear answer for this. Older versions just ignore these settings. If this lands in version 3.0 and one developer has 3.0 while the rest of the team has a lower version, then that developer is responsible for notifying the rest of the team to update NuGet to 3.0. If not, it will be broken (by that developer).

Limitations

Yes, I think it should use the same pattern; it makes sense. If regexp support is requested by someone in the future, we can vote for it.

Can we try to finish the design?

I don't see blockers (except persistence location), and as you can see this feature is requested. It consumes a lot of effort, sure. It has a huge impact and it is risky. But all of us here want to have clean solutions, so isn't it worth it? It doesn't seem too narrow a feature if you have 80 (or more) third-party files in the root of the Scripts folder (and this is not the only scenario ;)).

Can we talk about it a little bit while it is still on your minds? :o)

JeffHandley wrote May 22 at 7:34 PM

@milanjaros - no worries on "complaining." It was 100% perfectly reasonable for you all to yell at us for being so silent on this. It was our mistake for not commenting that we'd put a bunch of thought into this and had gotten blocked on the design. So, thank you for complaining that we were seemingly ignoring this issue. Sincerely.

Responses to your comments/questions

Persistence Location

I'm referring to edge cases here. Situations where a project might be moved around between a package install and uninstall operation. Or a situation where there was a mapping in place in a developer's nuget.config under %AppData% when the package was installed, but a different developer uninstalls the package and they don't have the same mappings configured.

We could solve this problem by copying the applied mapping into the packages.config file (or somewhere else in the project folder) so that uninstall would respect what was used at the time of install.

Path References

It's easy to say "you're on your own" in the documentation. But when users get themselves into bad situations, they still often file bugs and it costs us time to investigate and ultimately say "it was your fault." It is usually edge cases that lead to huge support costs for us, so we're always paranoid about edge cases.

What about not even copying content files into the project?

I got an email about this topic where someone asked if we could just completely avoid the problem by not copying content files into the project at all. They mentioned that we don't copy DLLs into the project folder--why do we copy content files in?

We’ve talked about this idea too, but it also doesn’t hold up.

DLLs aren't copied in

It's been stated that NuGet handles DLLs by not copying them into the project. I’ll make a small correction to that—they are copied into the project by way of the Reference that is added to them. Sure, we lay them down on disk under the /packages folder rather than adding them directly into the project, but we’re just following the same pattern you would have used to manually add a reference to a 3rd party library:
  1. You’d create a lib folder
  2. You’d put the DLLs in there
  3. You’d add a reference to the DLLs
After the Reference is in place in the project, MSBuild takes over. It does the work of resolving references and copying DLLs into the output folder. We don’t copy anything into the /bin folder, because the build does that for us.

Content Files are Different

But looking at content files, things are different. There’s no equivalent step for MSBuild to copy content files into the runtime folder. If you’re in a web application and you hit Ctrl+F5 to run the application locally, you need the javascript files to be served from the web root—the file must be there physically on disk under the project folder. When you’re deploying, the files need to be in the project in order to be deployed. If you’re using Web Deploy, it will only copy items that it knows are part of the project. And Git-based deployment systems like AppHarbor or Windows Azure Websites also look for physical content within the project folder, and marked as Content items in the project file itself.

Merely Automating Acquisition

Remember that NuGet is not involved in the steps of running or deploying an application. We’re merely automating the acquisition of artifacts. Trying to take an approach as drastic as what you’ve suggested would mean changing how the rest of the workflow works in VS and in deployment. We’ve not found a good way to simulate today’s behavior without putting content files directly into the project folder and referencing them in the project file. I’d be interested to see if someone could though.

Other Notes

  1. Content files can be anything. While they are often js/css/image files, they can also be:
    1. Web.config.transform files (that certainly need to be applied to the project folder)
    2. Code files (C#/VB), which are meant to be edited by the consumer
    3. T4 code generation template files
    4. Resx files, XAML files
    5. Or anything else
  2. When NuGet adds a content file into the project, we rely on the project system to recognize the file type and set its compile action accordingly.
    1. .tt files get a different compile action from .js files, which is different from .cshtml or .cs files
  3. Content files can contain source code transforms
    1. Especially useful for code files
    2. It can set the namespace and other properties used in tokens within the file to match the target project
  4. Burying content files in paths specific to the package/folder can lead to problems
    1. For js/css/image files, references to those paths can break more easily during updates
    2. Users then still need to dig up the paths to the files to reference them
    3. So we can’t just hide those files from VS either
It’s easy to look at this problem under the lens of js/css/image files, but really any type of content can be at play. We don’t want to get ourselves involved in build/run/deploy—we’re merely automatic artifact acquision. If we can find a new pattern for representing content artifacts in projects that still works through the whole lifecycle, then that’s cool—and we can change NuGet to do its automation differently. But we can’t solve this problem of unwanted content files in the project within NuGet—we have to solve it without NuGet first, and then teach NuGet the new pattern.

Idea for a Different Approach

What if we looked at this with a completely different approach? What if we don't try to solve the problem during install, uninstall, and update? Maybe we could do the following:
  1. After a package is installed, run a script to move files around in the project
  2. Before the package is uninstalled, run a compensating script to reverse any actions taken by the post-install script
We could define the mapping the same way we've talked about here, but instead of trying to have NuGet respect the mapping while copying the content files into the project, we let the content files come in naturally. But after we completely finish installing the package (including executing its install.ps1), we would then run code to read the mapping and move the files around. Then, before uninstalling (or updating) the package, we'd reverse those actions before executing the core package uninstall (including the uninstall.ps1).

In fact, we could make this a more general feature. We could allow the user to define postInstall.ps1 and preUninstall.ps1 scripts for packages. Those files would be carried with the project/solution to ensure the symmetry for install/uninstall and that all developers on the project would have the same scripts apply.

Thoughts on this approach?

Mohamed_Meligy wrote May 28 at 4:02 PM

Hello,

This is how I'd imagine this feature can work.

Today, if I understand correctly, content files are copied on install, and on uninstall they are removed from the same place they were copied to. No tracking of files takes place.

NuGet can keep this model as-is, and only provide a way to map relative paths, like moving "Scripts" to "Scripts/Vendor". It would then do no tracking, and on uninstall it would apply the reverse mapping and clear the files from the mapped location.

This will work just fine for mappings defined before installing packages, but adding mappings "after" packages are installed (e.g. the ones that are part of a project template) will require a strong warning about the need to manually move the files. You won't need to mention the lack of file-movement tracking afterwards, because that's what you have today (and bringing up the topic can in fact confuse people).

Since this feature may not have UI support, users should expect it to be a bit of an advanced thing. I don't think it's that complex, though. Messing up an upgrade/uninstall after moving files will not be much different from what happens today when moving the same files. Users may appreciate some verbosity about folder paths in the output log or errors, though.

This can start small, with static mappings (we can later add nicer features like tokens for, say, the package name, but let's avoid these in the first version because they can confuse users quite a lot when they use a token and don't fully understand what their mapping does), and as mentioned no UI support. UI support, if added later, can verify mappings and attempt to move files as well, but this can be complex and can wait several versions, I'd say.

The idea of executing user PowerShell on NuGet install/uninstall is nice in general, but I think it can be overkill for this feature, and the commands likely will not look very nice (they will probably end up as manual, unsupported file-move commands that increase operation time and are a potential source of errors).

JeffHandley wrote May 29 at 5:04 PM

Sorry to be obtuse, @Mohamed_Meligy, but I don't understand how your proposal differs from the mapping approach I explained wouldn't work.

Where are the mappings configured? What do they look like? How do we ensure that the mapping that was used at the time of install is recorded for reuse upon uninstall and/or update? How does this affect content files that have references to each other where changing paths would cause them to break?

Another Idea - Patches

Thinking about how content files themselves might need to be updated to reflect changed paths, I realized that something already exists to cover all of this: patches.

What if we used the patch/diff file format to apply/revert changes after install/before uninstall? These files could be easily generated by doing the following:
  1. Package is installed
  2. Commit to source control
  3. Rearrange and modify files from the package however you wish
  4. Create a patch file and save it into the .nuget folder following a convention
  5. Upon uninstall/update/re-install of the package, NuGet would see the patch file in place and apply/revert it appropriately
This would support file movements as well as content updates that are needed to reflect the file movements.

milanjaros wrote May 30 at 3:45 AM

Regarding Persistence Location: I got it and I agree. In that case I vote for packages.config. But... can we persist the mappings config between versions? Or is an upgrade equal to an "uninstall and install" operation?

The content files should be part of the project - AFAIK that's common practice across IDEs and platforms.

Hooks

You mentioned paths and their problems; I'd call it "content transformations". So, if someone has a particular reason to move files, the script hook I mentioned would also be useful, because of js/css/image references (links), namespaces, etc. So the different approach you described is exactly what I meant. :)

There could be a parameterized script before and after every stage of the automatically running scripts, i.e. preInit, postInit, preInstall, postInstall, preUninstall, postUninstall. For sure there will also be a need for the param($installPath, $toolsPath, $package, $project) variables.

Patches

This idea is really interesting. I still ask myself why it wasn't mine. :) Well, it would cover both content transformations/modifications and file moving. It would even support Build Action modification, i.e. modification of the project file.

But there are further questions for this:
  • Do patches have the same format for TFS/Git/Svn, and can we rely on it?
  • Is an underlying source control system needed to achieve uninstall/update/re-install of a package?
  • What about content changes between versions? E.g. my patch is for jquery-1.11.0.min.js, but will it be possible to apply it to the next version?

milanjaros wrote May 30 at 3:45 AM

A little off-topic: now I understand what you mean by edge cases. I don't usually "do" software for developers, but we always have some libraries and frameworks in the company. And the "error is always in the framework". :) In general, the "error is always in someone else's code". ;o)

@Mohamed_Meligy, I think you understand it correctly. The improvements you mentioned are really part of this discussion. Your ideas are good, so I'd welcome it if you could read the whole thread and join the discussion.

I'd also welcome it if the rest of the (roughly 300 voting) brains could join this thing...

milanjaros wrote May 30 at 4:09 AM

  • More about my jQuery example: what happens when the patch fails (and it should fail for a minified script)? Will installation fail?
I think the advantage of hooks is the lowest edge-case factor, because they seem to be the most generic solution we've found.

diryboy wrote Jun 2 at 6:33 AM

@bartmax

+1 for the unopinionated folder structure. It's very convenient for users to bring in libraries and for library developers to organize their stuff.

For example, the CodeMirror editor places the main js and css into one lib folder, then there are themes, addons, modes etc. Poking into it and "redirecting" the files to user-customized folders seems not worth the effort; instead, keeping all of them in one ~/Assets/libs/CodeMirror/{version}/ folder seems best.

JeffHandley wrote Jun 2 at 9:00 PM

So would NuGet just re-root every package's content files into a subfolder for that package? If we omit the version from the folder, that would allow packages that depend on content from their dependency packages to reliably know where the content is.

So in the project:

/Assets/{packageid}/

milanjaros wrote Jun 14 at 6:14 PM

Jeff, to be honest I didn't get your note, so I left it without a response because I took it just as a reaction to diryboy's comment. But I'd rather be sure whether this is or is not a proposal for the final solution. :) In my humble opinion, such re-rooting in general (for all packages) is not a good idea. You would not have the possibility of having two versions of one library side by side.

koistya wrote Jun 14 at 6:35 PM

As an alternative, you can install packages in the default location (./packages/ folder) and copy/concat them to a build folder during compilation. Here is an example:

http://visualstudiogallery.msdn.microsoft.com/d65d6b29-6dd7-4100-81b1-609e5afce356

I personally don't think there is much value behind this feature, it's mostly needed for front-end libraries (Java

koistya wrote Jun 14 at 6:37 PM

...I personally don't think there is much value behind this feature, it's mostly needed for front-end libraries (JavaScript, CSS etc.). In a real-world front-end project, you need to bundle all these assets anyway.

Mohamed_Meligy wrote Jun 15 at 1:10 AM

@koistya,
The point is not production URLs, it's organizing your own project the way that works for you. It's really a workaround for a bad default that ASP.NET has, having all NuGet packages writing directly to /Scripts (and /Content) without any way of namespacing.

There is no good convention for this like other platforms have: a base "vendor" (or whatever) folder, where each package gets its own folder (inside which the package can have any folder structure it likes).

Currently the full folder structure is available to package authors, which is powerful, but because of that you often end up with a very unmaintainable /Scripts folder. If only you could trick the package into thinking the Scripts folder is, say, /Scripts/vendor, or even better /Scripts/vendor/package-name, that'd be really nice.

As I mentioned, you don't need to worry about any safety or tracking for this really.

koistya wrote Jun 15 at 7:12 AM

@Mohamed_Meligy, as a workaround, 3rd-party client-side libraries can be installed at the solution level (./packages/) rather than the project level (./Project/Scripts). This way your project will be nice and clean, free of 3rd-party code, and will contain only your custom .js/.less files. During a build of the front-end app, the build script pulls the necessary libs from the ./packages/ folder along with your custom code and creates bundles (bundle.js, bundle.css etc.) for testing, debugging and publishing.

koistya wrote Jun 15 at 7:19 AM

@Mohamed_Meligy, there is also a way to customize the name of this ./packages/ folder (via nuget.config), similar to Bower. Bower just doesn't copy any files and keeps everything in its packages folder; NuGet does - but that's only needed for simple projects where you don't care about compiling your client-side code and 3rd-party libraries into bundles.

JeffHandley wrote Jun 18 at 4:16 AM

The tough thing here is that in order to make sure we don't break deployment of the web project through Web Deploy or Azure CSPKGs, we have to make sure the files are included in the CSPROJ file as Content items (when appropriate), with the paths you want them to have.

I wonder if we could get by with simply allowing a switch to force package content files into a /{packageid}/{packageversion} root folder, instead of going into the root of the project. We could have a nuget.config setting for a single, simple definition of the root content folder, and respect that.

For instance:

<contentRoot>/{id}/{version}/</contentRoot>

Then, when a package is installed, we'd copy the respected contentRoot value into the packages.config file:

<package id="jquery" version="1.10.0" targetFramework="net45" contentRoot="/{id}/{version}/" />

If you move things around after install, then you could just update the packages.config contentRoot entry for that package to reflect where you moved things. During uninstall, we'd respect it. And during a package update, we'd remember the previous contentRoot that was in place for the previous version.

We'd do this manipulation after running install.ps1 so that anything the install.ps1 was relying on would be stable. And we'd need to invert the operation before running uninstall.ps1 to make sure uninstall.ps1 can succeed too. But we'd have no other logic.

The following scenarios wouldn't work well:
  1. Content files that have absolute path references to each other
  2. Moving some files to one root folder and other files to other root folders
  3. web.config.transform would either be busted or would not respect this setting
This would also be a perf hit, since we'd be going through the DTE to move the content after it's already been added to the project in the original location.

I have been looking into Grunt though, to see how this problem is handled in Node. It doesn't exactly transfer over since there's no project file or DTE to worry about--it's just files on disk for Node. But Grunt is really code-centric too, like the PowerShell suggestion I made. Let's explore this for a moment though...

If we made this script-based, leaning on PowerShell scripts, we could:
  1. Create solution-level packages that provide init.ps1 scripts that export new PowerShell cmdlets (similar to Grunt tasks)
  2. Share those packages on nuget.org as common utilities for applying post-install/pre-uninstall transforms to package contents
  3. List in your nuget.config file the tasks to be run after installing/before uninstalling packages (could be parameterized per package)
I could create a MoveContentToPackageFolder NuGet package with the following:

\tools\init.ps1
  • exports Install-ContentToPackageFolder(string contentFileName, string packageId, string packageVersion)
  • That function is called for each content file NuGet adds to the project, and the function uses the DTE to move the file to the desired location: /{packageId}/{packageVersion}
  • exports Uninstall-ContentFromPackageFolder(string contentFileName, string packageId, string packageVersion)
  • That function is called for each content file NuGet needs to remove from the project, and the function uses the DTE to move the file back to the desired location, from /{packageid}/{packageversion}
\content.nuget\nuget.config.transform
  • Adds a new element into the project's nuget.config file (slightly ironic, I know) that configures these methods to be called when content files are installed/uninstalled (a rough sketch of that element follows below)
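
A rough sketch of what that transform might add to nuget.config (the element and attribute names are assumptions for illustration, not an existing schema):

<configuration>
  <!-- hypothetical: cmdlets exported by init.ps1 to run per content file -->
  <contentFileHooks>
    <onInstall cmdlet="Install-ContentToPackageFolder" />
    <onUninstall cmdlet="Uninstall-ContentFromPackageFolder" />
  </contentFileHooks>
</configuration>
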
Yeah, the more I think about it, the more I like a code-based solution for this. And being able to package the code up into NuGet packages themselves seems pretty handy.

I might have to spike out this concept of using init.ps1 scripts to export functions for NuGet to call.

Sorry if this wasn't entirely coherent - just rambling on the keyboard

bartmax wrote Jun 21 at 2:04 PM

Looks like the solution is pretty damn simple:

Change /Scripts to /Scripts/packages or /Scripts/{package-name} and 95% of the problems are solved, without any added complexity or breaking changes.

koistya wrote Jun 21 at 3:10 PM

@bartmax, or just

<package id="bootstrap" version="3.1.1" copyToProject="false" />

and they will remain in the /packages folder, similar to Bower.

bartmax wrote Jun 21 at 3:34 PM

@koistya see Aug 15 2013!! That was my suggestion back then, almost a year ago.

Mohamed_Meligy wrote Jun 21 at 3:43 PM

A property to disable copying to project isn't a bad idea at all, given:
  • It's available via command at least (UI support doesn't have to be from the time it's introduced)
  • It can survive updates (including those made from the UI), which is probably the tricky part about it (and without it, it's almost useless)

joebeazelman wrote Jun 27 at 5:14 PM

How about adding a special configuration file that behaves much like the web.config transformation files? During a NuGet install, it would check this transformation configuration file and install accordingly. For instance, if you're installing Bootstrap, you could configure the location and name of the folder, or decide whether or not to create a subfolder. The uninstall procedure would check the same file.
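
A sketch of what such a transformation configuration file could look like, loosely borrowing the web.config-transform idea described above (the file name and elements are hypothetical):

<!-- nuget.install.transform (hypothetical) -->
<installTransform>
  <package id="bootstrap">
    <content targetFolder="Assets\vendor\bootstrap" createSubFolder="true" />
  </package>
</installTransform>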

Come on Microsoft, this work has already been done; it just needs to be included in NuGet. Web.config transformations, MSBuild and TFS already have a system like this. I would add a UI feature where you can use the GUI to do the most common transformations, and give the user the ability to edit the configuration file for more complex scenarios.

As a temporary solution, you could just add an option to execute a batch file pre- and post-installation and let developers handle it themselves until you find a better solution. Forcing users to use predefined install locations is barely tolerable.

cervengoc wrote Jul 8 at 4:12 PM

I like the copyToProject option; it's simple, seems to have almost no impact at all, and is satisfactory: it's not that hard to maintain manual file copies after updates. I was looking for exactly something like that when I arrived at this discussion.

cervengoc wrote Jul 9 at 10:59 AM

Sorry, I meant "no risk at all"; I accidentally wrote "no impact at all" :)

nvivo wrote Aug 1 at 3:11 PM

@JeffHandley,

I think the proposed solutions are turning into quite complex ones. Maybe we are trying to go from 0 to 100% in one jump.

Most people don't actually want to redirect package by package or file by file. I believe most complaints about this boil down to the fact that we want NuGet files to be put "somewhere else", not into /Scripts and /Content, because those are generic top-level folders that we want to organize with our own files.

Once I have /Scripts with 30 files added by NuGet, I don't want to put my own files there because it is confusing. And adding /js and /css for my own files creates more confusion. A new dev looks at the project and asks, "which folder should I use? /js or /scripts or /assets?" It gets even uglier as NuGet packages add subfolders to these folders.

So maybe a "good enough" solution would be to just put NuGet files somewhere else and not pollute the project folder, just like Bower does. Making something like "/nuget_components" the root of installation would solve 90% of the problems. It would have its own Content and Scripts folders, wouldn't mess with the project structure, would require few changes to NuGet, and zero changes to packages.

In the future, this could move into a structure like "/nuget_components/{package-id}/", but I think this could be another step.
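
A minimal sketch of how that could surface as a single nuget.config setting, if it were ever implemented (the key name is an assumption; only keys like repositoryPath exist today):

<configuration>
  <config>
    <!-- hypothetical: root folder for all package content files -->
    <add key="contentFilesRoot" value="nuget_components" />
  </config>
</configuration>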

MBODM wrote Aug 6 at 2:46 AM

Absolutely THIS.

I think what nvivo wrote is the right direction. It is an 80/20 solution. If you look at recently released software development solutions from Microsoft and others, you can see that the following rule of thumb is widely adopted today (because it works): "get the first 80% easy and clearly structured, and don't try to get the last 20% perfect for every single person out there".

In the case above, it simply means: you cannot, definitely NOT, build a package manager that fits the needs of every single developer out there. After 300 posts or more, I think we can agree on that. So take nvivo's approach and make that 80% out there happy, in this case with a straightforward and simple solution which is also backward-compatible, because that is waaaaay better than nothing (the current situation). After that, try to achieve the last 20% with a layer for mapping the paths, including patch mechanisms, security and versioning functionality and so on, if you have the time and can get all the problems solved with that design.

Just my 2 cents.

(Sorry for the bad English, I'm a German guy.)

MBODM wrote Aug 6 at 3:12 AM

And by the way: it is not a shame to arrive at such a solution after your mind has gone so widely through all the pros and cons. Sometimes we must think through all the possibilities only to come to the conclusion that we "need" the simple way.

I know so many developers who think long and deep about a problem and then come up with a "beast of a solution", only to not admit that all the thinking and design time led to nothing in the end. They have the wrong perspective: it did not lead to nothing - it led them to the enlightenment that they need the simple solution they had in the first place. If you are a strong developer, you can handle this :)

Time to quote George Spencer-Brown:
"The value of crossing made again is NOT the value of the crossing." (Laws of Form)

Meaning: to come to a conclusion without any thinking or effort, and to come to the SAME conclusion after thinking about all the ways it can be done, is NOT the same! For example, you can NOT have the same feelings that you had with your first car, REGARDLESS of how many cars you buy in your life :)

A little bit philosophical, but I think that's the point of the 300 posts :)