Question

I have a SharePoint list that is linked to MS Access (2010). It has just under 5,000 items (4,864), so it stays below the 5,000-item threshold at which lists lose functionality.

The list includes five calculated columns. Each takes an employee number from another column and uses a formula to produce a clickable link to my company's directory page, where that particular employee's info is displayed. I'm not sure whether these five columns are relevant to my problem, but I'm mentioning them just in case.

My problem is this: frequently, when I run any query that updates the list, I get this error: "There were errors executing the bulk query or sending the data to the server. Reconnect the tables to resolve the conflicts or discard the pending changes."

When I review the errors, Access says my changes conflict with changes a previous user made (that previous user being me). It frequently highlights several columns even though the data in them is identical, as well as the calculated columns, where the original shows the stored value and the new version shows #VALUE! (since Access can't evaluate the formulas).

However, if I click Retry All Changes, all the updates stick and everything seems fine. Why is this happening? It's proving very annoying and is really holding up the automation of my large number of queries. A list of 5,000 items isn't particularly big (and the items have roughly 100 columns, which I also didn't think was too large, given that even old versions of Excel allow 256 columns).

Is this due to poor query design and SQL? Is it a connectivity issue (which I doubt, as my connection seems fine)? Or is Access tripping over itself, executing the updates concurrently rather than row by row and locking itself up? I'm at my wits' end and really need to get this sorted, as it's preventing large-scale automation of my queries.

Edit: Sample query that has failed at least once (but passed often as well):

UPDATE [Universe Of Systems] INNER JOIN [Summary] ON 
       [Universe Of Systems].[SALSA ID] = [Summary].IT_System_ID 
SET    [Universe Of Systems].[SALSA Name] = [Summary].[IT_System_Name],
       [Universe Of Systems].[System Type] = [Summary].[IT_System_Type],
       [Universe Of Systems].[CAO Critical] = [Summary].[Critical], 
       [Universe Of Systems].[LOB] = IIf(IsNull([Summary].[LOB]),"None Specified",[Summary].[LOB]), 
       [Universe Of Systems].[Business Ownership] = IIf(InStr([Summary].[LOB],"Shared")=0,[Summary].[LOB],"Shared Application"), 
       [Universe Of Systems].[GS System?] = IIf(InStr([Summary].[LOB],"Global"),"Y","N"), 
       [Universe Of Systems].[Business Owner] = [Summary].[Business_Owner],
       [Universe Of Systems].[System Domain] = [Summary].[System Domain],
       [Universe Of Systems].Delivery = [Summary].[Delivery Unit], 
       [Universe Of Systems].Platform = [Summary].[Platform], 
       [Universe Of Systems].[Delivery Owner] = [Summary].[Delivery Owner],
       [Universe Of Systems].[Lifecycle Stage] = [Summary].[Lifecycle_Stage],
       [Universe Of Systems].[Service Management Level] = [Summary].[SML],
       [Universe Of Systems].[SoX] = [Summary].[SoX], 
       [Universe Of Systems].[Bld Code] = [Summary].[Location_of_IT_Assets];

Solution

Strategies for making this update work within SharePoint's constraints:

  1. Shorten your query: do you really need to update all 5k records at once? First, restrict the update to rows whose values have actually changed. If that is still too big to succeed, break the records into batches and update only about 1k at a time (see the first sketch below).

  2. Change your connection method from Access's linked-table (datasheet) approach to SharePoint's more robust and fault-tolerant web services. Use a tool to explore the web services and/or SOAP endpoints to learn what you can do, and look into creating web service connections in Access (see the second sketch below).
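
To illustrate option 1, here is a minimal VBA sketch of the batching idea. It assumes [SALSA ID] is numeric and roughly contiguous, and borrows the table and field names from the sample query above; only the first SET clause is shown, so you would repeat the rest of the SET list from the original query inside the SQL string.

    Public Sub UpdateInBatches()
        ' Minimal sketch: run the update in ~1,000-row slices so each
        ' bulk write pushed to SharePoint stays small.
        Dim db As DAO.Database
        Dim rs As DAO.Recordset
        Dim lo As Long, hi As Long
        Const BATCH_SIZE As Long = 1000

        Set db = CurrentDb
        ' Assumption: [SALSA ID] is numeric and roughly contiguous.
        Set rs = db.OpenRecordset( _
            "SELECT Min([SALSA ID]) AS LoID, Max([SALSA ID]) AS HiID " & _
            "FROM [Universe Of Systems];")
        lo = rs!LoID
        hi = rs!HiID
        rs.Close

        Do While lo <= hi
            ' Only the first SET clause is shown; repeat the rest from
            ' the original query. Adding value-comparison criteria here
            ' would also skip rows that have not actually changed.
            db.Execute _
                "UPDATE [Universe Of Systems] INNER JOIN [Summary] ON " & _
                "[Universe Of Systems].[SALSA ID] = [Summary].IT_System_ID " & _
                "SET [Universe Of Systems].[SALSA Name] = [Summary].[IT_System_Name] " & _
                "WHERE [Universe Of Systems].[SALSA ID] BETWEEN " & lo & _
                " AND " & (lo + BATCH_SIZE - 1) & ";", dbFailOnError
            lo = lo + BATCH_SIZE
        Loop
    End Sub

Wrapping the loop in VBA also gives you a natural place to log which batch failed, instead of getting one opaque bulk error for the whole list.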

A combination of the two approaches above will likely give you the best results. Since you are nearing the stated limits of what SharePoint supports, you are running into the real reasons those limits exist: every one of these updates makes SharePoint run a great deal of additional logic in the background. You can still do what you intend, but you will need to change how you do it.
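
As a starting point for option 2, here is a minimal VBA sketch of calling SharePoint 2010's Lists.asmx SOAP service (its UpdateListItems method) directly. The site URL, list name, item ID, and field value are hypothetical placeholders; note that SharePoint internal field names encode spaces (e.g. SALSA_x0020_Name).

    Public Sub UpdateViaWebService()
        ' Minimal sketch: send one small CAML batch per call instead of
        ' letting Access flush a large bulk update.
        Dim http As Object
        Dim envelope As String

        ' Hypothetical values; replace with your own site and list.
        Const SITE_URL As String = "http://yourserver/sites/yoursite"
        Const LIST_NAME As String = "Universe Of Systems"

        envelope = _
            "<?xml version=""1.0"" encoding=""utf-8""?>" & _
            "<soap:Envelope xmlns:soap=""http://schemas.xmlsoap.org/soap/envelope/"">" & _
            "<soap:Body>" & _
            "<UpdateListItems xmlns=""http://schemas.microsoft.com/sharepoint/soap/"">" & _
            "<listName>" & LIST_NAME & "</listName>" & _
            "<updates>" & _
            "<Batch OnError=""Continue"">" & _
            "<Method ID=""1"" Cmd=""Update"">" & _
            "<Field Name=""ID"">42</Field>" & _
            "<Field Name=""SALSA_x0020_Name"">Example name</Field>" & _
            "</Method>" & _
            "</Batch>" & _
            "</updates>" & _
            "</UpdateListItems>" & _
            "</soap:Body>" & _
            "</soap:Envelope>"

        Set http = CreateObject("MSXML2.XMLHTTP.6.0")
        http.Open "POST", SITE_URL & "/_vti_bin/Lists.asmx", False
        http.setRequestHeader "Content-Type", "text/xml; charset=utf-8"
        http.setRequestHeader "SOAPAction", _
            "http://schemas.microsoft.com/sharepoint/soap/UpdateListItems"
        http.send envelope
        Debug.Print http.Status; " "; http.responseText  ' per-item results
    End Sub

Each Batch element can carry many Method elements, so you control the chunk size yourself, and the SOAP response reports success or failure per item rather than the all-or-nothing bulk flush Access performs.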

Another option (especially if you are running SP 2013) would be to move your raw data into SharePoint and do your joins there instead of in the update query. It would be worth your time to look at the updates to cubes and related features in 2013. That functionality swallows farm memory whole, but there is some great capability there. The linked slide deck gives you a list of terms to google and a brochure-level overview of the integrated features, so take it with a salt mine.

Licensed under: CC-BY-SA with attribution
Not affiliated with sharepoint.stackexchange