Question

The existing fragment

    foreach (var ownerCandidates in ownerToCandidatesDictionary)
    {
        foreach (var candidate in ownerCandidates.Value)
        {
            transactionEntities.AddToSomeEntity(someObject);
        }
    }
    transactionEntities.SaveChanges(System.Data.Objects.SaveOptions.AcceptAllChangesAfterSave);

Is rewriting to

    int i = 0;
    foreach (var ownerCandidates in ownerToCandidatesDictionary)
    {
        foreach (var candidate in ownerCandidates.Value)
        {
            transactionEntities.AddToSomeEntity(someObject);
        }
        if (i++ % 1000 == 0)
        {
            transactionEntities.SaveChanges(System.Data.Objects.SaveOptions.AcceptAllChangesAfterSave);
        }
    }

    transactionEntities.SaveChanges(System.Data.Objects.SaveOptions.AcceptAllChangesAfterSave);

gives us the same functionality in the case of successful program termination? My concern is that we keep adding entities: does calling SaveChanges in a loop save only what was added since the previous SaveChanges? Are we actually saving in batches here? If not, how can the original fragment be changed to avoid the following exception?

12/06/2012 7:50:37 PM : System.OutOfMemoryException: Exception of type 'System.OutOfMemoryException' was thrown.
   at System.Data.Mapping.Update.Internal.Propagator.Project(DbProjectExpression node, PropagatorResult row, TypeUsage resultType)
   at System.Data.Mapping.Update.Internal.Propagator.Visit(DbProjectExpression node)
   at System.Data.Common.CommandTrees.DbProjectExpression.Accept[TResultType](DbExpressionVisitor`1 visitor)
   at System.Data.Mapping.Update.Internal.Propagator.Propagate(UpdateTranslator parent, EntitySet table, DbQueryCommandTree umView)
   at System.Data.Mapping.Update.Internal.UpdateTranslator.<ProduceDynamicCommands>d__0.MoveNext()
   at System.Linq.Enumerable.<ConcatIterator>d__71`1.MoveNext()
   at System.Data.Mapping.Update.Internal.UpdateCommandOrderer..ctor(IEnumerable`1 commands, UpdateTranslator translator)
   at System.Data.Mapping.Update.Internal.UpdateTranslator.ProduceCommands()
   at System.Data.Mapping.Update.Internal.UpdateTranslator.Update(IEntityStateManager stateManager, IEntityAdapter adapter)
   at System.Data.EntityClient.EntityAdapter.Update(IEntityStateManager entityCache)
   at System.Data.Objects.ObjectContext.SaveChanges(SaveOptions options)

Solution

A subsequent call to SaveChanges() has no effect if no changes were made to the entity set since the previous SaveChanges(). Each call persists only the pending, not-yet-saved changes, so after a SaveChanges() the next batch is what goes to the database. Batching is the way to overcome the OutOfMemoryException during huge inserts with Entity Framework. Ideally SaveChanges() would be called only once, but because your data set is so large it must be divided into batches.
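As a sketch of that batching idea (the entity and context names come from the question; the type name SomeEntities, the batch size of 1000, and counting individual entities rather than owners are assumptions of this example), it also helps to dispose and recreate the ObjectContext between batches, since saved entities otherwise remain tracked in memory:

```csharp
// Sketch only: SomeEntities is a hypothetical ObjectContext subclass;
// ownerToCandidatesDictionary and someObject are taken from the question.
int added = 0;
var transactionEntities = new SomeEntities();

foreach (var ownerCandidates in ownerToCandidatesDictionary)
{
    foreach (var candidate in ownerCandidates.Value)
    {
        transactionEntities.AddToSomeEntity(someObject);

        // Flush every 1000 added entities (arbitrary batch size).
        if (++added % 1000 == 0)
        {
            transactionEntities.SaveChanges(
                System.Data.Objects.SaveOptions.AcceptAllChangesAfterSave);

            // Dispose and recreate the context so already-saved entities
            // are no longer tracked; this is what keeps memory bounded.
            transactionEntities.Dispose();
            transactionEntities = new SomeEntities();
        }
    }
}

// Persist whatever remains in the final, partially filled batch.
transactionEntities.SaveChanges(
    System.Data.Objects.SaveOptions.AcceptAllChangesAfterSave);
transactionEntities.Dispose();
```

Note that counting per entity (inside the inner loop) keeps batch sizes predictable even when some owners have many candidates, unlike the per-owner counter in the question's rewrite.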

Also, when adding entities in bulk you can get a significant performance improvement by temporarily disabling automatic change detection, i.e. setting AutoDetectChangesEnabled to false: context.Configuration.AutoDetectChangesEnabled = false; (note that this property belongs to the DbContext API, whereas the code above uses ObjectContext).
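In the DbContext API that setting refers to, the usual shape is the following (a sketch; MyContext, SomeEntities, and itemsToInsert are placeholder names, not from the question):

```csharp
// Hypothetical DbContext subclass with a DbSet<SomeEntity> named SomeEntities.
using (var context = new MyContext())
{
    // Skip the DetectChanges scan that EF normally runs on every Add();
    // with thousands of tracked entities this scan dominates insert time.
    context.Configuration.AutoDetectChangesEnabled = false;
    try
    {
        foreach (var item in itemsToInsert)
        {
            context.SomeEntities.Add(item);
        }
        context.SaveChanges();
    }
    finally
    {
        // Restore the default so later use of the context behaves normally.
        context.Configuration.AutoDetectChangesEnabled = true;
    }
}
```

Wrapping the restore in a finally block ensures the context is not left in the non-detecting state if SaveChanges throws.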

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow