# Extending NuGet.exe with plugins

**ecoffey** (Jan 24, 2011 at 4:06 PM)

A bit has changed in my branch, and in how I was approaching the solution, so I decided a fresh discussion would be helpful. You can see my previous thread here: http://nuget.codeplex.com/Thread/View.aspx?ThreadId=237902

In summary, I've been monkeying with different ways to build packages, settling on a combination of Mono.Cecil and parsing project files. Furthermore, I need to be able to install packages on VS2008 and Express instances, and that can't really be done with a VS2010 extension :-)

So what have I been doing to core NuGet? Through the power of MEF, I've extended the NuGet command line to support "plugins". I moved a lot of CommandLine into CommandLine.Core, so CommandLine is now just the parser, the program, and the built-in commands. The program also recurses up the directory tree looking for a folder named "NuGetPlugins"; if one is found, that directory is added to the AggregateCatalog. So now, given NuGet.CommandLine.Core and NuGet.Core, you can write a "plugin" that exposes an ICommand. The only things left in NuGet proper are the original refactoring (IPackageFileStrategy and the like) and some small changes to support "plugins". All of my crazy "create a package from your own DNA!" kind of stuff can be constrained to "plugins".

How do people feel about the way I structured the plugin logic? Right now it silently loads those plugins; should there be a way to show where a command came from? Or should it be more opt-in? i.e.:

```
nuget runfrom path\to\plugins\directory commandexposedbyplugin --someoptionforplugincommand
```

The code currently lives here: http://nuget.codeplex.com/SourceControl/network/Forks/ecoffey/PackageFromProject

It builds cleanly for me in VS2010, but I haven't gotten build.bat to work because of the FxCop stuff. I think my Pro version of VS might not be good enough.

**osbornm** (Jan 24, 2011 at 10:16 PM)

This is something we have wanted to get in for a bit now...
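(For context, the "NuGetPlugins" probing ecoffey describes above could look roughly like the following minimal MEF sketch. The class and method names here are invented for illustration; this is not the actual code from the fork.)

```csharp
using System.ComponentModel.Composition.Hosting;
using System.IO;

// Rough sketch of the plugin discovery described above: walk up the directory
// tree from a starting point, and if a "NuGetPlugins" folder is found, add it
// to the MEF AggregateCatalog so any exported ICommand implementations in
// those assemblies get composed in alongside the built-in commands.
static class PluginProbe
{
    // Walks up from 'start' looking for a sibling/ancestor "NuGetPlugins" folder.
    public static string FindPluginDirectory(string start)
    {
        for (var dir = new DirectoryInfo(start); dir != null; dir = dir.Parent)
        {
            string candidate = Path.Combine(dir.FullName, "NuGetPlugins");
            if (Directory.Exists(candidate))
                return candidate;
        }
        return null; // no plugin folder anywhere up the tree
    }

    // Adds the plugin directory (if any) to the catalog commands are composed from.
    public static void AddPluginCatalog(AggregateCatalog catalog, string start)
    {
        string pluginDir = FindPluginDirectory(start);
        if (pluginDir != null)
            catalog.Catalogs.Add(new DirectoryCatalog(pluginDir));
    }
}
```

With a DirectoryCatalog in the AggregateCatalog, MEF picks up plugin commands the same way it picks up the built-in ones, which is why the loading is "silent" unless something is added to report each export's origin.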
I'll take a look at your changes soon. Something that might be helpful is the changes I just committed to default.

Before:

```csharp
[Command("TestOne", "This is a test Command")]
public class TestOneCommand : ICommand {
    public List<string> Arguments { get; set; }
    public IConsole Console { get; private set; }

    [ImportingConstructor]
    public TestOneCommand(IConsole console) { }

    public void Execute() { }
}
```

After:

```csharp
public class TestTwoCommand : Command {
    public override void ExecuteCommand() { }
}
```

The major difference is that there is now an abstract base class that provides some common default behaviors (Arguments, Console, and support for '-?'). ICommand is still there if someone needs more control over their command's behavior. Another big change is that you don't have to add the Command attribute unless you would like to provide more metadata and/or restrictions for the command; by default we just use the class name minus the suffix "Command", if present, with no description. The last change is that you no longer need to export your command as an ICommand to get it to work. Hopefully this makes it easier for those of you who want to add commands.

**ecoffey** (Jan 24, 2011 at 10:22 PM)

Cool, I like those changes. Apart from formatting I feel OK about my code.

Also, I went ahead and implemented that "runfrom" command, just because it makes development a bit easier:

```
$> \path\to\bin\Debug\NuGet.exe runfrom \path\to\plugins\bin\Debug NewShinyNewCommand
```

**dfowler** (Developer, Jan 25, 2011 at 2:46 AM)

@ecoffey Do you really need to add all of those classes and interfaces to achieve what you're trying to do? I feel like we've lost sight of the feature, and all I see is a lot of changes without much context. What is the end-to-end scenario we're going for here?

**ecoffey** (Jan 25, 2011 at 3:19 AM)

I might be able to use the vanilla PackageBuilder, but I'm not sure. I still feel like the refactorings I made to the core are valid.
All I really did was call out the difference between metadata, package files, and writing a package. The ability to compose metadata together (i.e., information from running Mono.Cecil against an assembly, plus metadata from a nuspec file) and then use just the assembly as the package file has been very useful. So the context was to open up the core a little bit more, to make usage of the API a bit more extensible.

My reasoning is that if people are happy with the refactorings in the core and the changes to support "plugins", then we can have a discussion about particular usages of those changes. For instance, my 'packsolution' command might not be what everyone needs, but it's working well for me and has some of those original features we discussed. When it builds a package for a project, it looks at packages.config and project references to build the set of dependencies for the package, uses Mono.Cecil (implemented behind IPackageMetadataProvider) to scrape Id, Authors, Owners, etc. from the project's assembly, and finally outputs a package with just that project assembly. That could, and might even need to, change in behavior depending on what I encounter as I use it more. But the changes in vanilla NuGet to support that command seem minimal to me, considering what we're enabling.

Was there something in particular about my changes that seems troublesome? The original review cycle we did, about a month ago now, seemed more concerned with code idioms and formatting. Since then I've removed the commands I added, but left the core changes, and enabled plugins, which is pretty straightforward with MEF.

**dfowler** (Developer, Jan 25, 2011 at 3:56 AM)

The plugin changes for the console are unrelated; a good feature to have, but still not related to the original discussion. I don't have a big problem with the refactoring, but I want to know if it is still needed, since your approach to the problem has changed. What makes using the PackageBuilder more difficult?
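(As an aside, the Mono.Cecil-based scraping ecoffey mentions could look roughly like the sketch below. The helper name and return shape are hypothetical, and it assumes the Mono.Cecil package is available; it is not the fork's actual IPackageMetadataProvider implementation.)

```csharp
using Mono.Cecil;

// Hypothetical sketch of scraping package metadata from a compiled assembly
// with Mono.Cecil: the assembly name becomes the package Id, the assembly
// version becomes the package version, and AssemblyCompanyAttribute (when
// present) is read for the authors field.
static class AssemblyMetadataScraper
{
    public static (string Id, string Version, string Authors) Scrape(string assemblyPath)
    {
        AssemblyDefinition assembly = AssemblyDefinition.ReadAssembly(assemblyPath);
        string id = assembly.Name.Name;
        string version = assembly.Name.Version.ToString();
        string authors = "";
        foreach (CustomAttribute attr in assembly.CustomAttributes)
        {
            // Assembly-level [AssemblyCompany("...")] carries a single string argument.
            if (attr.AttributeType.Name == "AssemblyCompanyAttribute"
                && attr.ConstructorArguments.Count == 1)
            {
                authors = (string)attr.ConstructorArguments[0].Value;
            }
        }
        return (id, version, authors);
    }
}
```

Putting this behind an interface such as IPackageMetadataProvider is what lets assembly-derived metadata be mixed with nuspec-derived metadata, as discussed below in the thread.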
Do you remember the initial problems you had with it? It seems to me the only real difference is reading metadata. Your changes imply more of a pull model, and what we have now for PackageBuilder is a push model (setting properties vs. providing a specific implementation).

**ecoffey** (Jan 26, 2011 at 5:18 PM)

Cool, we're agreed that plugins are good. I still feel the refactoring is worthwhile, since to me it makes it more obvious where the different pieces of a package can come from.

I was originally going to use PackageBuilder: I would feed it a nuspec file and then layer in my own stuff. What led me down the refactoring path was that when you create a PackageBuilder from a nuspec file, it does a bunch of work figuring out what files are going to be in that package. The work is even greater when you don't specify files, since it includes the current working directory tree. For what I'm doing, all of that work would have been for naught, since I would immediately have thrown away the set of files PackageBuilder built in its constructor and replaced it with things I was calculating.

Doing it that way felt wasteful and non-obvious. Reading my code, you'd be confused about why I was setting the PackageBuilder file set to a whole new list instead of just appending; and reading just PackageBuilder, you would reasonably expect to find files that I was actually throwing away somewhere.

PackageBuilder also made it a bit more difficult to "compose" the metadata. With CompositePackageMetadata : IPackageMetadata you can have a cascading chain of "authority" for the resulting metadata, which makes it easier to answer questions about who should get the final say on what the metadata is. And composing doesn't have to stop at metadata: you could also compose and aggregate together IPackageFileStrategy instances. I can already think of a usage of that internally here at my company.
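(The cascading "authority" idea behind CompositePackageMetadata could be sketched as below. The interface and types are simplified stand-ins for illustration, not the fork's actual IPackageMetadata: each property comes from the first source in the chain that supplies a value.)

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Simplified stand-in for IPackageMetadata.
interface ISimpleMetadata
{
    string Id { get; }
    string Authors { get; }
}

class SimpleMetadata : ISimpleMetadata
{
    public string Id { get; set; }
    public string Authors { get; set; }
}

// Cascading composite: sources are ordered highest-authority first, and each
// property is answered by the first source that provides a non-empty value.
class CompositeMetadata : ISimpleMetadata
{
    private readonly IReadOnlyList<ISimpleMetadata> _sources;

    public CompositeMetadata(params ISimpleMetadata[] sources)
    {
        _sources = sources;
    }

    public string Id => First(m => m.Id);
    public string Authors => First(m => m.Authors);

    private string First(Func<ISimpleMetadata, string> selector) =>
        _sources.Select(selector).FirstOrDefault(v => !string.IsNullOrEmpty(v));
}
```

For example, nuspec-supplied metadata could be placed ahead of assembly-scraped metadata in the chain, so an explicit nuspec value always wins while missing values fall through to the assembly.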
Finally, PackageBuilder2 shows a decent start at using PackageBuilder in the most basic sense: create an empty object, mutate it until you're happy, and then write it out. All of that can be implemented under the covers with my refactorings.

Hopefully that answers your question.

**dfowler** (Developer, Jan 28, 2011 at 4:46 PM)

That sounds rational. If you want to move to the next step, I'd say re-implement the core pieces in one changeset and then send that for a review (this time actually replacing PackageBuilder, not having a PackageBuilder2). The command-line changes should be done separately.

**ecoffey** (Jan 28, 2011 at 6:25 PM)

Cool. So, we all remember how smoothly that went the last time I attempted to submit a review :-) If you look at the current PackageFromProject fork, what do you think the best way to isolate my core changes would be?

**dfowler** (Developer, Jan 28, 2011 at 8:59 PM)

I don't know if you can salvage that fork. You might have to create a new one and copy the work over.

**ecoffey** (Jan 28, 2011 at 9:01 PM)

Haha, fair enough :-) Looks like I was way worse at hg than I thought...

**ecoffey** (Jan 30, 2011 at 9:49 PM)

OK, the new fork (http://nuget.codeplex.com/SourceControl/network/Forks/ecoffey/PackageBuilderRefactoring) contains the one sane changeset for my core refactorings plus the PackageBuilder reimplementation. The review request is at: http://reviewboard.nupack.com/r/310/