Calling another PowerShell file from install.ps1

Aug 16, 2011 at 7:54 PM

I want to call another powershell file from within the install.ps1 file.

This stems from my practice of making reusable scripts. I have already tried the following:



param($installPath, $toolsPath, $package, $project)

.\ContentManagement\post-process-projectfile.ps1 $project.FileName



# My reusable script

I get an error that says it doesn't recognize the command/script:

The term '.\ContentManagement\post-process-projectfile.ps1' is not recognized as the name of a cmdlet, function, script file, or operable program. Check the spelling of the name, or if a path was included, verify that the path is correct and try again.
At C:\TFSWorkspaces\Workspace1\Shared\Frameworks\ResourceManger\packages\ReusableComponents\tools\install.ps1:22 char:49



Aug 16, 2011 at 8:02 PM

I suggest you don't rely on a relative path inside the install.ps1 script to locate the other script. Instead, concatenate $toolsPath, which is the folder containing your install.ps1, with the relative path to get the absolute path.
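A sketch of that approach, reusing the relative script path from the original attempt:

```powershell
param($installPath, $toolsPath, $package, $project)

# Build an absolute path from $toolsPath instead of relying on the
# current directory (which NuGet does not set to the tools folder):
$scriptPath = Join-Path $toolsPath "ContentManagement\post-process-projectfile.ps1"

& $scriptPath $project.FileName
```

Because the path is absolute, the call works regardless of what the current directory happens to be when install.ps1 runs.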

Aug 16, 2011 at 8:20 PM

You can look at the package SqlServerCompact (version 4.0.8482.1) for an example of how to do it.

Aug 16, 2011 at 9:18 PM
Edited Aug 16, 2011 at 9:18 PM

Why should I have to remove my relative reference? This code works from the PS command prompt, so it should work the same way when invoked from NuGet. Unless, of course, NuGet itself breaks the entire premise of relative file execution in PowerShell and what is running is some mutant version.

Does NuGet run PowerShell, or does it run a variant? NuGet's PowerShell support needs to be fully documented: which features are not available, which are modified, and which have been added. Please, I should not have to read your source code just to do something that isn't "outside the box".

Aug 16, 2011 at 9:25 PM

The issue is that when we execute your install.ps1 script, we don't set the current directory to its parent folder. We set the current directory to the location of the .sln file of the currently open solution. I agree that we need to document this behavior.

Aug 16, 2011 at 9:27 PM

Is there a way I can shim this "neglect" into my install.ps1?

If all I need to do is find the containing folder and set this value, then I have no problem with that solution. Again, my "hot fix" would go great alongside the documentation of this "feature".

Aug 16, 2011 at 9:32 PM

Take a look at the SqlServerCompact package.

Aug 16, 2011 at 9:35 PM
Edited Aug 17, 2011 at 6:59 PM

I find that:

. (Join-Path $toolsPath "GetSqlCEPostBuildCmd.ps1")

is not an acceptable process for me.

Why can I not accept this process? Because the moment the NuGet framework redefines what is stored in $toolsPath, I will have to update my script file. If NuGet allowed my script to behave as it does in a normal PowerShell environment, I would never have to worry about where my scripts execute relative to *insert script here*, and I would never have to worry about package compatibility with the next version of NuGet, unless something else fundamentally changed and caused a breaking event.
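As a sketch of a host-independent alternative, a script can derive its own folder at run time instead of depending on $toolsPath at all (in PowerShell 3.0 and later this value is also available directly as $PSScriptRoot):

```powershell
param($installPath, $toolsPath, $package, $project)

# Derive this script's own directory from its invocation info, so the
# call below works no matter what the host sets the current directory
# to, and no matter what $toolsPath contains:
$scriptDir = Split-Path -Parent $MyInvocation.MyCommand.Path

& (Join-Path $scriptDir "ContentManagement\post-process-projectfile.ps1") $project.FileName
</br>```

This keeps the relative folder layout of the package intact while removing the dependency on any host-provided variable.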

Aug 16, 2011 at 9:47 PM

This allows me to write my PS scripts more naturally:


param($installPath, $toolsPath, $package, $project)

Set-Location $toolsPath

# Call my script as I expect
.\ContentManagement\post-process-projectfile.ps1 $project.FileName

# Additional Scripts

Doing this lets someone look at my script, see that I am setting the execution location, and then add more scripts without having to adjust the path for each new one.

Aug 16, 2011 at 9:52 PM

If you want to go this route, please set the location back to what it was before your script ran (you can call pushd and then popd). People don't expect installing a package to change their location. Remember, it's not all about your package.
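One way to follow that advice, sketched with Push-Location/Pop-Location (the cmdlets behind the pushd/popd aliases) wrapped in try/finally so the caller's location is restored even if the inner script throws:

```powershell
param($installPath, $toolsPath, $package, $project)

# Remember the caller's location and move to the package tools folder.
Push-Location $toolsPath
try {
    # Relative calls now resolve against $toolsPath, as they would
    # when running the script by hand from that folder.
    .\ContentManagement\post-process-projectfile.ps1 $project.FileName
}
finally {
    # Always restore the caller's location, even on error.
    Pop-Location
}
```

The finally block matters because an exception in the inner script would otherwise leave the host sitting in the package's tools folder.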

Aug 16, 2011 at 9:55 PM
Edited Aug 17, 2011 at 7:39 PM

Ah yes, that would be nice. So the install processes all share the same executing context? That is odd; one would think that installations of dependency packages would run in scoped environments so that they don't interfere with each other.

If this is the case then I will have to toggle the path for my package only.

Actually, it is all about my package; whether I have dependencies is irrelevant up to a point. If people are not expecting an installing package to change the location, running the PowerShell scripts in their own contexts would prevent any bleed-over effects if I wanted to change paths to point all over the place. Since I would like my scripts to run as I expect them to be run, I will take the advice and use pushd/popd. And since I do have dependencies, I will make sure that my package's install/init/uninstall scripts all toggle the path, so that their executions are unaffected*.

* My solution still does not remove the effect of a change to $toolsPath, though at least I will have an easier time updating one line than having to correct multiple lines of code after such a change.


I have updated my script to be:


param($installPath, $toolsPath, $package, $project)

Push-Location $toolsPath

# Call my script as I expect
.\ContentManagement\post-process-projectfile.ps1 $project.FileName

# Additional Scripts

# Restore the caller's location, per the advice above
Pop-Location

Aug 17, 2011 at 1:26 AM

So much wrong in these statements.

When requesting work from someone who is providing it for free, one should never demand things.

Also, when something doesn't work the way you think it should, that doesn't mean it is incorrect.

Just a couple of notes on etiquette. Maybe I misread the tone...
"Be passionate in all you do"

Aug 17, 2011 at 3:44 AM
Edited Aug 17, 2011 at 3:45 AM

There is nothing unusual about script files executing in their requested context. With PowerShell you can spawn environments within environments, so it isn't hard to imagine that, since each NuGet package is independent of the others during installation, each should have its own separate executing PowerShell environment.

As for the tone: I have been very frustrated with the lack of documentation for anything beyond simple dealings with NuGet. If more of these advanced topics were documented, I might not even have had to post here, because I would have known the context of NuGet's PowerShell script execution and been able to handle this myself.

I see references to code bases instead of actual documentation as a disservice to the platform. I am sad to see the platform developers and promoters doing this. It would be great if documentation could be provided so that the platform can be promoted in a better light.

Open Source software is a community activity, not a dark covenant gathering of elders in the late hours of the night. 

Aug 17, 2011 at 3:49 AM
Thanks for the feedback. We're always trying to improve our docs and I'll work to make sure this gets in there. We're a small team with a lot to do. Indeed, as you say, it is a community effort.

Speaking of which, our docs are hosted in source control, and we accept pull requests if you're interested in helping out. :)

Aug 17, 2011 at 7:17 PM

I appreciate the help and patience demonstrated here. I hope to see features such as this covered in the documentation, as I don't understand the executing context of the scripts well enough myself to contribute to that documentation.