
Large data import performance #9450


adamphones created

We would like to create a module (or use an app service) to import large amounts of data into the system. We noticed that when the current application services or domain services are used to import the data, the process takes ages or sometimes never finishes. The issue is due to the fact that EF tracks changes to all entities: as the loop iterates, memory fills up with tracked entities and, as a result, the process becomes very slow.

You can read about the issue here: https://weblog.west-wind.com/posts/2014/dec/21/gotcha-entity-framework-gets-slow-in-long-iteration-loops#:~:text=There%20are%20a%20few%20simple,tracking%20for%20the%20dbContext%20instance

The suggested fix is to periodically create a new instance of the DbContext, or to stop change tracking, to speed up the process.
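
For illustration, the pattern that article describes looks roughly like this in plain EF. This is only a sketch; MyDbContext, Products and the batch size are placeholders:

    // Recreate the DbContext every N inserts so the change tracker never
    // accumulates thousands of tracked entities (MyDbContext is hypothetical).
    const int batchSize = 100;
    var context = new MyDbContext();
    try
    {
        for (var i = 0; i < products.Count; i++)
        {
            context.Products.Add(products[i]);
            if ((i + 1) % batchSize == 0)
            {
                context.SaveChanges();
                context.Dispose();
                context = new MyDbContext(); // fresh, empty change tracker
            }
        }
        context.SaveChanges(); // flush the final partial batch
    }
    finally
    {
        context.Dispose();
    }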

I read through your documentation, and disabling the unit of work speeds up the process a bit (basically starting a separate unit of work for each batch in the loop and committing the changes at the end of each batch; see the sketch below). However, we believe that creating nearly 10,000 records (across various tables) should not take around 15 minutes.
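
This is roughly what we do today, sketched from memory. ImportProductsAsync, Product and the batch size of 100 are placeholders; _unitOfWorkManager and _productRepository are the injected IUnitOfWorkManager and IRepository<Product>, and Chunk needs .NET 6+ (any manual batching would do):

    // Disable the ambient unit of work on the import method and open a
    // short-lived one per batch, so each commit only tracks one batch.
    [UnitOfWork(IsDisabled = true)]
    public virtual async Task ImportProductsAsync(List<Product> products)
    {
        foreach (var batch in products.Chunk(100))
        {
            using (var uow = _unitOfWorkManager.Begin())
            {
                foreach (var product in batch)
                {
                    await _productRepository.InsertAsync(product);
                }
                await uow.CompleteAsync();
            }
        }
    }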

Somehow we need to access the DbContext and disable the following options to gain more speed while importing data, as suggested in this link: https://stackoverflow.com/questions/6107206/improving-bulk-insert-performance-in-entity-framework

yourContext.Configuration.AutoDetectChangesEnabled = false;
yourContext.Configuration.ValidateOnSaveEnabled = false;
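
(Those two flags are the EF6 spelling; if the underlying provider is EF Core, the closest equivalent we can see is the ChangeTracker property, and ValidateOnSaveEnabled no longer exists.) Something like the following is what we imagine, assuming the active context can be resolved through an injected IDbContextProvider<MyProjectDbContext>; the context type and field name are placeholders:

    // Assumption: _dbContextProvider is an injected
    // IDbContextProvider<MyProjectDbContext> and we are inside a unit of work.
    var context = _dbContextProvider.GetDbContext();

    // EF Core equivalent of the EF6 flag quoted above.
    context.ChangeTracker.AutoDetectChangesEnabled = false;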

My questions are:

1- What is your suggestion for importing large data (not just into a single table but into multiple tables) with the best performance?
2- How can we access the DbContext to disable those options while we import the data?
3- Would we still be able to use repository injections? For example: IRepository<Product>


1 Answer(s)