Failed to download package correctly. The contents of the package could not be verified.

Sep 16, 2011 at 10:50 PM


  I'm using a custom nuget server. I can't seem to download any files right now - and the error isn't telling me anything useful about what is going on.

  Here is the server: (this is what I have entered in VS 2010's feed list). It doesn't matter which project I install against; it is always the same right now. For example, you can try installing the package named "ROOTNET-Core-v5.30.01.win32.vc10" into a C# console app if you want. Here is the error from the output console:

------- Installing...Ninject -------


------- Installing...ROOTNET-Core-v5.30.01.win32.vc10 2.2 -------

Failed to download package correctly. The contents of the package could not be verified.

  I'm not sure what I've done wrong. I'm using the NuGet 1.5 NuGet.Server package to run it (I updated earlier today as part of my cross-checks). Many thanks for any help you can give me!

  Cheers, Gordon.

Sep 17, 2011 at 1:16 AM

This definitely has something to do with how the packages are being served: when I copy the package file locally via a basic copy and tell nuget to install it from that file, it works fine. So I don't think my package files are corrupt...

Sep 17, 2011 at 10:27 AM

I have tried both adding and removing the MIME type for that directory in IIS for "nupkg", set to application/octet-stream. I also cleared my client's internet caches just in case. In both cases I got the same behavior.

Sep 18, 2011 at 5:26 AM

This error typically means that the PackageHash from the feed doesn't match the hash of the downloaded package.

Sep 18, 2011 at 5:33 AM

Take a look at this issue

Sep 18, 2011 at 12:38 PM

Thanks! So I took a look at that work item. The symptoms are remarkably similar, but there are two potential differences:

1) I upgraded the server and the problem remained

2) I can reset the app pool and the problem remains as well.

But to debug this further - how can I query the feed URL to get the hash back? Can I do it from the nuget.exe command line? And then how can I inspect the actual .nupkg to extract the package hash? If those aren't the same, then it is likely I have the same problem; otherwise I will have to look harder.
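One way to compare the two hashes is a small script. Here is a sketch in Python, assuming the v2 OData feed shape and the base64-encoded SHA-512 hashing NuGet uses; the FEED URL is a placeholder, not the actual server address from this thread:

```python
import base64
import hashlib
import urllib.request
import xml.etree.ElementTree as ET

# Hypothetical feed URL -- substitute your own server's address.
FEED = "http://myserver/nuget"

def feed_hash(package_id, version):
    """Ask the OData feed what hash it advertises for one package."""
    url = f"{FEED}/Packages(Id='{package_id}',Version='{version}')"
    with urllib.request.urlopen(url) as resp:
        tree = ET.parse(resp)
    ns = "{http://schemas.microsoft.com/ado/2007/08/dataservices}"
    return tree.getroot().find(f".//{ns}PackageHash").text

def local_hash(path):
    """Hash the .nupkg the way NuGet does: base64-encoded SHA-512."""
    with open(path, "rb") as f:
        return base64.b64encode(hashlib.sha512(f.read()).digest()).decode()

# If these two values differ for the same package, the feed's hash is
# stale and the client reports "The contents of the package could not
# be verified."
```

If `feed_hash(...)` and `local_hash(...)` disagree for the same package, you are seeing the stale-hash problem described above.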

Many thanks for your help!


Oct 9, 2011 at 10:42 AM


  Ok - I'm getting a little desperate here. I'm really not sure how to debug this further. I've tried doing things like resetting IIS (even rebooting the machine) and nothing seems to change the behavior.



Oct 9, 2011 at 5:50 PM

It is indeed broken when running: nuget.exe install ROOTNET-Core-v5.28.00f.win32.vc10.debug -source

You say you're running a custom nuget server, but what bits exactly are you running? There are at least 3 nuget server implementations out there, each with various versions.

To answer your question, you can see the package hash in the feed. For example, for that package, you'll see:


Oct 9, 2011 at 7:05 PM

Thanks for your help!

I didn't realize there were 3 implementations - I've been using NuGet.Server v1.5.20818.9011. At the time I started this thread it was the most recent one - I see that it has been updated since (sorry, had I realized that I would have tested it before re-pinging this thread).

My requirements are very simple. I just need the server to scan a directory for nuget packages and serve that feed correctly. My build server pushes out the packages to build into that directory, and I'd like them served directly from there.

One thing that may not be quite "standard", I now realize, is that sometimes I will "re-push" a package: same version number, same package name. While it has the same contents (i.e. it is built from the same check-out tag in VCS), since the build times are different it could well be that the hash of the file on disk changes. If the server doesn't keep its hash up to date when a file is replaced with an identical one, that could be causing my problem. Perhaps there is a way to reset the server's cache? Just copy all the files out of the shared directory?
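To convince myself of this, here is a small Python sketch (nothing NuGet-specific) showing why a byte-for-byte rebuild of the same contents can still hash differently: a .nupkg is just a zip, and zip entries embed their build timestamps.

```python
import hashlib
import io
import zipfile

def build_nupkg(timestamp):
    """Build a tiny zip with identical contents but a given entry
    timestamp. A .nupkg is a zip, so a re-pushed package rebuilt at a
    different time changes in exactly this way."""
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w") as z:
        info = zipfile.ZipInfo("lib/readme.txt", date_time=timestamp)
        z.writestr(info, "identical contents")
    return buf.getvalue()

a = build_nupkg((2011, 9, 16, 10, 0, 0))
b = build_nupkg((2011, 9, 16, 11, 0, 0))  # same contents, later build

print(hashlib.sha512(a).hexdigest() == hashlib.sha512(b).hexdigest())  # False
```

Same logical contents, different bytes on disk, different hash - so any server-side hash cached against the old file is now wrong.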

I will update my version of NuGet.Server to the latest and see if it makes any difference. If another implementation would suit me better, I'd definitely be curious to know what I might use!

Oct 9, 2011 at 7:21 PM

Ok - I updated the web server to the latest version (all NuGet stuff is at 1.5.20902.9026) and the behavior is no different. My understanding is the server keeps no cache on disk, so it must have rebuilt everything when it was restarted. I've turned off the step that copies files into the Packages directory on my build server for now - so there is something about the combination of packages that is causing the trouble (please correct me if I'm wrong). I'll do some testing - like removing all but one, etc.

Oct 9, 2011 at 7:47 PM

Ok - a little more testing...

1) I removed all packages from the Packages directory, and restarted the application pools and web server.

2) I copied in one NuGet package and installed it (nuget.exe install ROOTNET-Core-v5.30.01.win32.vc10 -source ) - it works just fine.

3) Reset the server, copy all the packages in - seems to work just fine now too!

4) Copied the (32) packages out, executed a nuget query on the feed, copied them back in. Still seems to work.

Ok - it seems like it just "works" now - it seems I was in a funny state that persisted across an upgrade and that I can't get back into again. I'll start up my build server again and see if that causes the problem to reappear.

Sigh. Sorry!

Oct 10, 2011 at 1:07 AM

Glad you have it in a working state now! Though it's strange that it was acting up before. I would really expect a server restart to re-read all the packages from disk and hence have the correct hashes.

Oct 11, 2011 at 10:42 AM

Awesome, it is back! The only thing I did was delete a package from the directory and copy in a new one (with a different name, even!) and let it sit idle for over a day. This time restarting the server fixed it.

I'll let it go another day and see if it fails again... perhaps when IIS retires the code due to disuse it gets confused when it comes back? Also, the process of coming back seems to take a very long time (> 40 seconds).

Oct 11, 2011 at 9:39 PM

It takes a long time if you have lots of packages because it has to crack them all open when it starts. We've discussed possible solutions to make this better.

Oct 12, 2011 at 4:11 AM
davidebbo wrote:

It takes a long time if you have lots of packages because it has to crack them all open when it starts. We've discussed possible solutions to make this better.

Ah, cool. Right now I have 32 packages. I timed it - on this machine I'm using (old, but steady) it takes 72 seconds to spin up. 3 packages are 100 MB, 6 are about 20 MB, and the rest are 5 MB or below.

Did this discussion occur on another task here on the website? If so, can you point me to it? If not, what solution was finally settled on (though perhaps no one has had time to implement yet)?

Also, could the corruption be happening because I get fed up waiting the first 72 seconds and cancel and re-make my request? So the second request comes in before the server has finished cracking open the files? I'll see what I can do about testing this out.

Oct 12, 2011 at 4:36 AM

I think it was in the context of the new (not yet in use) NuGet gallery. We were discussing whether we could also make it cover the simple NuGet.Server scenario where files are just dropped in a folder. It would then get the benefit of being database driven. But right now it doesn't do that.

Not sure about the corruption, but that's an interesting theory if the code was not written to handle that...

Oct 12, 2011 at 5:01 AM

So are there plans to set NuGet.Server aside? I ask because, as it stands, NuGet.Server isn't that useful - this repository is low use, and every time you want to build something, having to wait 60+ seconds isn't going to work. I'm willing to do some work on it - an easy thing would be to add a small cache file alongside the package that contained the information (with a date check), or something similar. The nice thing about that is not having to deal with the infrastructure and deployment issues of a db - though that could be done as well. If there are some other small DBs that have free distribution policies (I'm not sure what satisfies the NuGet license) and aren't overly complex, I'd be happy to make an attempt that you all could review.
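For illustration, the sidecar-cache idea could look roughly like this. This is a Python sketch of the approach, not real NuGet.Server code - every name in it is hypothetical, and the server has no such cache today:

```python
import base64
import hashlib
import json
import os

def cached_hash(nupkg_path):
    """Return the package hash, recomputing only when the package file's
    mtime changes. The hash is kept in a small ".hash" sidecar file so it
    survives app-pool restarts."""
    cache_path = nupkg_path + ".hash"
    mtime = os.path.getmtime(nupkg_path)
    if os.path.exists(cache_path):
        with open(cache_path) as f:
            cached = json.load(f)
        if cached["mtime"] == mtime:  # package unchanged since last scan
            return cached["hash"]
    # (Re)compute and persist -- the slow path only runs for new or
    # replaced packages, not on every server spin-up.
    with open(nupkg_path, "rb") as f:
        digest = base64.b64encode(hashlib.sha512(f.read()).digest()).decode()
    with open(cache_path, "w") as f:
        json.dump({"mtime": mtime, "hash": digest}, f)
    return digest
```

The date check also covers the re-push scenario above: replacing a package bumps its mtime, so the stale hash gets thrown away instead of being served to clients.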

Oct 12, 2011 at 5:06 AM

I'm not sure at this point. We need to either fix it to be faster and more reliable, or make the new gallery handle that scenario. But first, we need to get the new gallery to the point that we can launch it, and it's getting close.

Oct 12, 2011 at 3:55 PM

What do you mean "every time you want to build something"? Is that a server reboot? If you prefer using a db then maybe NuGet.Server isn't for you and you should look at using the github project.

Oct 12, 2011 at 5:10 PM

No, that isn't a server reboot. I think what I was talking about up there was my continuous integration server. I want it to push its builds into the directory that the nuget server fetches files from - so these CI builds are available for installation via nuget in people's projects.

That was said above, before I'd understood better what is going on. Right now there seems to be one problem and one bug. The two may be connected, but I've not proved that yet. The problem is that the first request to the server takes over a minute to be served while the nuget server scans all the packages on disk. While the server rarely gets rebooted (history indicates just during patch dates), the application pool spins down the nuget server when there are no queries for a period of time. This server is basically used by no one right now - so it is often spun down. I suppose a workaround for this is to change the timeout or have some task that periodically fetches the package list.

The bug is the check-sum error discussed above. I've got some anecdotal evidence that if I'm patient with the first query of the server - and in the IDE wait the full 1+ minutes for the list of packages to come back - I can load them without errors. What I need to do now is go back and test interrupting that first query, redoing the query (or a search or similar), and see if that causes this failure mode again. I didn't understand previously what was causing the slow response - so I often thought things had hung and was interrupting the search.

If I can turn the bug on and off, then I'll let you know on this thread. Having characterized the bug I'm happy to take a look at the code and see if I can help. And this may or may not be connected with the bug that is listed as a task (see the issue above).

Oct 17, 2011 at 1:29 AM

Ok. I've now verified that if I interrupt the display of results in the NuGet GUI that is part of the IDE, then the download will fail with the above errors.

Interrupting means:

  • After the GUI comes up and is searching for all the packages available, interrupt it by clicking on one of the feeds (like my root packages feed).
  • Type a search term into the search box

The bug almost certainly seems to be on the server end, because I have not been able to download the package successfully without restarting the web server, or waiting a long time until IIS shuts it down due to inactivity.

The above comes up because of what I'll call a second bug, though others may disagree and call it a feature. The time for the web server to unpack and examine the metadata of all the packages in my Packages directory is over 70 seconds. This is probably caused by a few of my packages being quite large. Even this simple server needs a persistent caching mechanism - even if the bug above were fixed, it is not acceptable to have to wait 60+ seconds to get a package listing.

I'm happy to tackle either of these bugs over the next few weeks. Is the source code just a download & build affair for the web server?



Oct 17, 2011 at 6:56 PM

Thanks for debugging through the server. Yes, it is mostly download and build - the server sources are located under src\Server in our source tree. Instructions on getting started are available here - (ignore the setup debugging instructions - I just realized they are out of date)

Oct 23, 2011 at 8:56 AM


  I finally got a chance to start looking at this. You guys don't like comments, do you? ;-)

  You have no debugging instructions for running the server. While I'm at the very beginning of this, it seems pretty easy: select the Server project, click "Debug". You can then use the URL to send a query. One thing - you should make sure to exit the VS 2010 web server between runs: it seems to retain some state from initialization.

  I'm learning the code by thinking about it with a decent number of packages (30-40), some of them large (100 MB). One thing I notice is ServerPackageRepository::CalculateDerivedData: you read the full package into memory. For a package that is 100 MB, this will put real stress on the web server for a few seconds. As far as I can tell (at least in that method), you do this only to calculate the hash.

  It looks like you open the package twice - once to extract the manifest, and a second time to read the hash. I guess this is unavoidable - and the manifest read, while not fast, doesn't seem amazingly slow either (then again, I've only tried it on a small package). Does ZipPackage force a read of the complete package in order to unzip the file?

  It looks like a lock around the "if" in LocalPackageRepository::GetPackage will do the trick. This will serialize access to the local repository cache. However, this file is in the Core library, which I assume means it is used by lots and lots of clients, so I'm a bit nervous about making a change there. The other thing to do is lock the web server so all queries are single-threaded (since I think, at the lowest level, many of the web service calls end up in GetPackage). I am, btw, making these statements without testing anything - just from looking at the code. I'll try to do some testing tomorrow or early next week.
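  As an aside, the memory pressure from hashing a 100 MB package could be avoided by reading it in chunks rather than all at once. Here is a sketch of the streaming idea in Python (the real change would of course be in the C# server code, and `streaming_hash` is a hypothetical name):

```python
import base64
import hashlib

def streaming_hash(path, chunk_size=64 * 1024):
    """Compute the base64-encoded SHA-512 of a package by feeding the
    hash 64 KB at a time, so a 100 MB .nupkg never sits fully in memory
    the way CalculateDerivedData currently requires."""
    h = hashlib.sha512()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return base64.b64encode(h.digest()).decode()
```

  The result is byte-identical to hashing the whole file at once; only the peak memory use changes. A lock around the cache update (as discussed above) would still be needed for the concurrency bug - streaming only addresses the memory cost.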

  Also, you are using a test harness I'm not familiar with (I basically know only the Visual Studio test stuff). How can I run the tests so that if I do modify the Core library I can be sure I don't mess something up?