Question

We are building an application that uses SharePoint Online as a document repository. There is a requirement for users to see the number of documents as a badge at the top of each page (pages are built using .NET, Angular, and HTML). This badge needs to be updated for all users already in the app whenever another user adds a document to the repository.

There is a debate over whether to use SignalR to update the badge number, or to have a timer on the page that updates it by querying the document count via the SharePoint Online Search API.

We expect millions of documents and hundreds of users, so the timer-plus-API approach worries me. I have two concerns: performance, and the latency before the Search API sees newly added documents, which would delay the badge update.

Has anyone done something similar, or can anyone comment on my concerns?

Best


The solution

It sounds like you're using SharePoint as some sort of real-time monitoring application. This isn't really a use case SharePoint was designed to handle.

You are right to be worried about API performance - it's not going to work given the number of documents you expect to have. But that's the least of your problems. You will encounter throttling limits very quickly if you attempt to push large amounts of data into SharePoint.

You've given a very thin description of what the application is trying to do, but based on what little detail there is, I would advise you not to do this in SharePoint, and to build a custom application instead. SharePoint was not designed for the performance you need.

If you really want to go this route, try using a remote event receiver or possibly a webhook to update a store external to SharePoint, and have your SignalR process update the clients. But if it were my project, I would push back hard on doing this in SharePoint.
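To make that concrete, here is a minimal sketch of what the webhook-to-SignalR flow could look like on an ASP.NET Core middle tier. The hub name, route, and the ICountStore abstraction for the external store are illustrative assumptions, not anything specified in the question.

```csharp
// Sketch: a webhook notification endpoint that refreshes an external document
// count and pushes it to connected clients via SignalR.
// Hub name, route, and ICountStore are illustrative assumptions.
using Microsoft.AspNetCore.Mvc;
using Microsoft.AspNetCore.SignalR;
using System.Threading.Tasks;

public class BadgeHub : Hub { }          // clients connect to this hub

public interface ICountStore             // store kept outside SharePoint
{
    Task<long> RefreshDocumentCountAsync();
}

[ApiController]
[Route("api/spo-notifications")]
public class SpoNotificationsController : ControllerBase
{
    private readonly IHubContext<BadgeHub> _hub;
    private readonly ICountStore _store;

    public SpoNotificationsController(IHubContext<BadgeHub> hub, ICountStore store)
    {
        _hub = hub;
        _store = store;
    }

    // SharePoint validates the webhook by sending ?validationtoken=... when the
    // subscription is created; it must be echoed back as plain text.
    [HttpPost]
    public async Task<IActionResult> Post([FromQuery] string validationToken)
    {
        if (!string.IsNullOrEmpty(validationToken))
            return Content(validationToken, "text/plain");

        // A real handler would read the notification body and query the change log;
        // here we simply refresh the external count and broadcast it.
        long count = await _store.RefreshDocumentCountAsync();
        await _hub.Clients.All.SendAsync("badgeCountChanged", count);
        return Ok();
    }
}
```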

Other tips

Interesting, we're doing the same. It's an enterprise application with 20M documents being migrated in and estimated growth of 2.5M new documents a year. SharePoint Online is being used for the document repository, with an Angular custom front-end app talking to a .NET Core middle tier. That connects to SPO through REST, as well as to various enterprise systems. ~300 active users are planned in the system.

We've gone through the scalability considerations and segmented documents into multiple site collections, libraries, and a folder structure to stay within the 5,000-item list view threshold, the 30M documents per library limit, and the 25 TB per site collection limit (see https://docs.microsoft.com/en-us/office365/servicedescriptions/sharepoint-online-service-description/sharepoint-online-limits).

Most of the document access is at the folder level, using SPO REST calls. We do need to list all documents with particular tags and show a header indicator of the count of those documents on every page. For these two use cases we're using SharePoint Search. Yes, there is search latency: it can be 5-15 minutes, though our small-set testing so far has shown 1.5-2.5 minutes. Aside from that issue, search is very scalable and performs extremely well; results typically come back in a few hundred milliseconds.
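For reference, a count-only query against the Search REST API can be kept very light by asking for zero rows and reading only the total. The sketch below assumes a bearer token is already available and that the nometadata JSON shape applies; the KQL text and managed property name are illustrative and would need to match your own tagging.

```csharp
// Sketch: query the SharePoint Online Search REST API for a document count only.
// Assumes an already-acquired bearer token; the KQL text is illustrative.
using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text.Json;
using System.Threading.Tasks;

public static class SearchCount
{
    public static async Task<long> GetTaggedDocumentCountAsync(
        HttpClient http, string siteUrl, string accessToken)
    {
        var kql = "IsDocument:true DocTag:Invoice";   // illustrative managed property
        // rowlimit=0 keeps the payload small; we only need the total.
        var url = $"{siteUrl}/_api/search/query" +
                  $"?querytext='{Uri.EscapeDataString(kql)}'&rowlimit=0";

        using var request = new HttpRequestMessage(HttpMethod.Get, url);
        request.Headers.Authorization =
            new AuthenticationHeaderValue("Bearer", accessToken);
        request.Headers.TryAddWithoutValidation("Accept", "application/json;odata=nometadata");

        using var response = await http.SendAsync(request);
        response.EnsureSuccessStatusCode();

        using var doc = JsonDocument.Parse(await response.Content.ReadAsStringAsync());
        // Assumed nometadata shape (verify against your tenant):
        // { "PrimaryQueryResult": { "RelevantResults": { "TotalRows": 12345, ... } } }
        return doc.RootElement
                  .GetProperty("PrimaryQueryResult")
                  .GetProperty("RelevantResults")
                  .GetProperty("TotalRows")
                  .GetInt64();
    }
}
```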

There has been some thought about using SignalR (https://dotnet.microsoft.com/apps/aspnet/signalr) for the search scenarios to reduce latency, but I still prefer the search approach, as it is significantly less complex.

On the throttling side, we are working with Microsoft Consulting to make sure we play nicely there. There are some good points on that topic in https://docs.microsoft.com/en-us/sharepoint/dev/general-development/how-to-avoid-getting-throttled-or-blocked-in-sharepoint-online. Decorating your HTTP traffic with a registered app is strongly recommended, as is properly respecting 429 return codes and their Retry-After header.
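A hedged sketch of both recommendations follows. The "NONISV|Company|App/Version" user-agent pattern comes from that throttling guidance; the company and app names and the retry cap here are made up.

```csharp
// Sketch: an HttpClient that decorates traffic and honors throttling responses.
using System;
using System.Net;
using System.Net.Http;
using System.Threading.Tasks;

public static class ThrottleAwareClient
{
    public static HttpClient Create()
    {
        var http = new HttpClient();
        // Decorate every request so Microsoft can identify the calling app.
        http.DefaultRequestHeaders.UserAgent.ParseAdd("NONISV|Contoso|DocBadge/1.0");
        return http;
    }

    public static async Task<HttpResponseMessage> SendWithRetryAsync(
        HttpClient http, Func<HttpRequestMessage> requestFactory, int maxRetries = 3)
    {
        for (var attempt = 0; ; attempt++)
        {
            var response = await http.SendAsync(requestFactory());

            // 429 (and 503) responses carry a Retry-After header that should be respected.
            if ((int)response.StatusCode != 429 &&
                response.StatusCode != HttpStatusCode.ServiceUnavailable)
                return response;

            if (attempt >= maxRetries) return response;

            var delay = response.Headers.RetryAfter?.Delta ?? TimeSpan.FromSeconds(10);
            response.Dispose();
            await Task.Delay(delay);
        }
    }
}
```

A factory delegate is used because an HttpRequestMessage cannot be sent twice; each retry builds a fresh request.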

Another lesson to share is to be careful if you're planning on load testing. Load testing against SPO is not supported by Microsoft and is a good way to get yourself throttled quickly. Building mocks for your SPO calls with reasonable delays is one way to still do load testing and stress the rest of your application to see how it behaves. We're using a combination of JMeter and Application Insights to do this, though it's early days still. This doesn't load test the Angular components, but everything downstream from them gets tested.
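As an illustration of the mocking idea, something like the following can stand in for the SPO search call during a load test; the interface, delay range, and returned count are invented for the example.

```csharp
// Sketch: a mocked SPO search client used to load test everything downstream
// of SharePoint without hitting the real service.
using System;
using System.Threading.Tasks;

public interface ISpoSearchClient
{
    Task<long> GetDocumentCountAsync();
}

public sealed class MockSpoSearchClient : ISpoSearchClient
{
    private static readonly Random Rng = new Random();

    public async Task<long> GetDocumentCountAsync()
    {
        // Simulate typical search response times (a few hundred milliseconds).
        await Task.Delay(Rng.Next(200, 600));
        return 20_000_000;
    }
}
```

Swapping the mock in for the real client via dependency injection keeps the rest of the pipeline unchanged during the test.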

Besides polling the Search API with a timer, the other option is to use a SharePoint webhook to listen for changes to a document library and then broadcast the change via SignalR or another real-time notification technology. This eliminates both the delay problem and the throttling problem.
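If you go the webhook route, the subscription has to be registered against the library first and renewed before it expires. A rough sketch, assuming a bearer token and an already-deployed notification endpoint; the list GUID, URL, and expiration window are placeholders.

```csharp
// Sketch: registering a SharePoint webhook subscription on a document library.
// Subscriptions expire and must be renewed periodically.
using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Text.Json;
using System.Threading.Tasks;

public static class WebhookRegistration
{
    public static async Task RegisterAsync(
        HttpClient http, string siteUrl, Guid listId,
        string notificationUrl, string accessToken)
    {
        var body = JsonSerializer.Serialize(new
        {
            resource = $"{siteUrl}/_api/web/lists(guid'{listId}')",
            notificationUrl,                                   // must answer the validation handshake
            expirationDateTime = DateTime.UtcNow.AddDays(90).ToString("o")
        });

        using var request = new HttpRequestMessage(
            HttpMethod.Post, $"{siteUrl}/_api/web/lists(guid'{listId}')/subscriptions")
        {
            Content = new StringContent(body, Encoding.UTF8, "application/json")
        };
        request.Headers.Authorization = new AuthenticationHeaderValue("Bearer", accessToken);
        request.Headers.TryAddWithoutValidation("Accept", "application/json;odata=nometadata");

        using var response = await http.SendAsync(request);
        response.EnsureSuccessStatusCode();
    }
}
```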

Licensed under: CC-BY-SA with attribution
Not affiliated with sharepoint.stackexchange