Question

I'm having a small issue loading a ~50 MB file into a database. Before anyone asks why, I need to add this feature for legacy purposes. The column is set up as type Image, which unfortunately means I can't load the data in chunks and concatenate them (Sybase doesn't allow that for Image columns).

I'm currently building my parameter from the DbProviderFactory using DbType.Binary. I've tried other types and nothing helps.
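For reference, here is a minimal sketch of roughly what the question describes. The provider invariant name, connection string, table/column names, row key, and the "@" parameter prefix are all assumptions for illustration, not taken from the original code:

```csharp
// Hypothetical provider name, connection string, table, column and key values --
// substitute your own. The parameter prefix ("@") can differ between ASE
// provider versions.
using System.Data;
using System.Data.Common;
using System.IO;

class BlobLoader
{
    static void Main()
    {
        byte[] payload = File.ReadAllBytes(@"C:\temp\bigfile.bin"); // ~50 MB file

        DbProviderFactory factory = DbProviderFactories.GetFactory("Sybase.Data.AseClient");
        using (DbConnection conn = factory.CreateConnection())
        {
            conn.ConnectionString = "..."; // your connection string here
            conn.Open();

            using (DbCommand cmd = conn.CreateCommand())
            {
                cmd.CommandText = "UPDATE LegacyDocs SET Payload = @blob WHERE Id = @id";

                DbParameter blob = factory.CreateParameter();
                blob.ParameterName = "@blob";
                blob.DbType = DbType.Binary;   // bound against the Image column
                blob.Value = payload;
                cmd.Parameters.Add(blob);

                DbParameter id = factory.CreateParameter();
                id.ParameterName = "@id";
                id.DbType = DbType.Int32;
                id.Value = 42;                 // hypothetical row key
                cmd.Parameters.Add(id);

                cmd.ExecuteNonQuery();
            }
        }
    }
}
```

Binding the whole 50 MB byte array this way sends it as a single statement, which is presumably what pushes the batch past the procedure cache limit.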

I get the following error:

There is not enough procedure cache to run this procedure, trigger, or SQL batch. Retry later, or ask your SA to reconfigure ASE with more procedure cache.

The procedure cache on the server is set to 100 MB.

Other Info:

  1. It works with ~35 MB files.
  2. The old code (written in PowerBuilder) uses UPDATEBLOB and it works there.

Any suggestions?


Solution

100 MB of procedure cache is pretty small, especially if you have a single procedure or batch that you know will take over 50 MB. That is the total procedure cache for the whole database instance, not per session or per procedure.

I would suggest you make your procedure cache bigger.
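If it helps, a hedged sketch of how the change could be issued follows. The size value is given in 2 KB pages on ASE (so 102400 pages is roughly 200 MB; verify against your version's documentation), the provider name is the same assumption as above, and in practice a DBA would normally just run the sp_configure line from isql:

```csharp
// Assumes an sa_role login and the same hypothetical provider name as above.
using System.Data.Common;

class BumpProcedureCache
{
    static void Main()
    {
        DbProviderFactory factory = DbProviderFactories.GetFactory("Sybase.Data.AseClient");
        using (DbConnection conn = factory.CreateConnection())
        {
            conn.ConnectionString = "..."; // connect with an sa_role login
            conn.Open();

            using (DbCommand cmd = conn.CreateCommand())
            {
                // 'procedure cache size' is a dynamic parameter in modern ASE versions,
                // so no server restart should be needed.
                cmd.CommandText = "sp_configure 'procedure cache size', 102400";
                cmd.ExecuteNonQuery();
            }
        }
    }
}
```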

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow