Saving records takes too long when using Unit of Work #9855


rasoulshams created

In my app, I need to go through hundreds of records one by one, process each record, gather and analyse data for it, create a new entity and store it in the database. The storage of each record affects the processing of the next one: once the newly created entity is inserted, it has to be read back along with other data from the database to help decide how the next record on the list is processed, and so on.

Since a UoW is used and data is persisted at the end of it, I call "CurrentUnitOfWork.SaveChangesAsync" to store each record after I process it within the loop, but this results in a very slow overall process. I tried to disable the UoW for this method using the "[UnitOfWork(IsDisabled = true)]" attribute, but I get an error stating "Cannot access a disposed object". I did use the virtual keyword as well, but no luck.
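
For reference, the loop looks roughly like the sketch below (the entity, repository and analysis names are simplified placeholders, not my real ones):

public class RecordProcessingAppService : ApplicationService
{
    private readonly IRepository<ResultEntity> _resultRepository;

    public RecordProcessingAppService(IRepository<ResultEntity> resultRepository)
    {
        _resultRepository = resultRepository;
    }

    public virtual async Task ProcessRecordsAsync(List<InputRecord> records)
    {
        foreach (var record in records)
        {
            // Analysing the current record depends on entities persisted in earlier iterations.
            var result = await AnalyseRecordAsync(record);

            await _resultRepository.InsertAsync(result);

            // Flushing here makes the new row visible to the next iteration's queries,
            // but this per-record round trip is what makes the whole job slow.
            await CurrentUnitOfWork.SaveChangesAsync();
        }
    }

    // Placeholder for the gathering/analysis logic described above.
    private Task<ResultEntity> AnalyseRecordAsync(InputRecord record) => throw new NotImplementedException();
}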

I have tried removing "CurrentUnitOfWork.SaveChangesAsync" and the process runs 6 to 7 times faster, but since all the data is then persisted only at the end of the unit of work, I don't have up-to-date data while iterating through the original records. I need the updated data after each newly created entity is inserted in order to process the next record.

Note that this is not a batch insert, as I know EF Core is not suitable for that; it is the individual insertion of numerous records.

I am currently running this as a Hangfire background job. I've tested the speed when it is not run as a job, and the results are the same.

Is there any way to speed this up? I hope this is clear, and I'd appreciate any help.

Many thanks.


1 Answer(s)
  • zony created
    Support Team

    Hi rasoulshams, I think you should use a temporary List to store the data processed in this pass, use a foreach to insert it into the DbContext after the processing is complete, and finally call SaveChangesAsync().
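
    A rough sketch of that idea (names are illustrative and assume the same kind of application service and injected repository as in the question):

    public virtual async Task ProcessRecordsAsync(List<InputRecord> records)
    {
        var processedResults = new List<ResultEntity>();

        foreach (var record in records)
        {
            // Decide how to process the current record from the in-memory results of this
            // pass (plus data already persisted), instead of re-reading the database for
            // entities that were created moments ago.
            var result = AnalyseRecord(record, processedResults);
            processedResults.Add(result);
        }

        // Insert everything gathered in this pass...
        foreach (var result in processedResults)
        {
            await _resultRepository.InsertAsync(result);
        }

        // ...and let the unit of work hit the database only once.
        await CurrentUnitOfWork.SaveChangesAsync();
    }

    This keeps the data you need for the next record in memory, so you avoid a database round trip per record and only pay for a single SaveChangesAsync at the end.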