I have been using BulkUpdate extensively in my program; it’s a very useful tool.
Once we began using the program actively, users noticed that saved data was being lost, or rather overwritten.
A user can change an item price from $10 to $20, and after 15 minutes or so the price is back to the original value, like a phantom flipping switches inside the program.
After some soul searching I figured out what the problem is.
We have a lot of background processes which run every hour to pull updated info from Amazon: buy-box price, selling price, FBA fees, etc.
The process goes like this:
The program gets a list of items from the table and loads them into memory using a regular EF Core query, modifies them, and then updates the table by passing the list of objects to the BulkUpdate function.
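To make the pattern concrete, here is a minimal sketch of one hourly pass, assuming the library is EFCore.BulkExtensions; the `Item` entity, `AppDbContext`, and `FetchBuyBoxPriceFromAmazon` names are illustrative, not the real ones from my codebase:

```csharp
using (var db = new AppDbContext())
{
    // Load the items to refresh into memory (regular EF Core query).
    List<Item> items = db.Items.Where(i => i.NeedsSync).ToList();

    foreach (var item in items)
    {
        // Each process refreshes only the field(s) it is responsible for,
        // e.g. this one updates the buy-box price.
        item.BuyBoxPrice = FetchBuyBoxPriceFromAmazon(item.Asin);
    }

    // Write everything back in one bulk statement.
    db.BulkUpdate(items);
}
```

Note that the entities passed to `BulkUpdate` still carry every other column as it was at load time, not just the one field the process changed.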
Now consider this scenario:
Process1, which modifies the buy-box column, starts at 3:00 p.m. and runs for 20 minutes. It gets a bunch of objects from the database and does its work.
Five minutes later, at 3:05 p.m., process2 starts; it gets its own list of objects from the database and begins modifying the selling-price column.
It finishes its work and updates the database at 3:15 p.m. Let’s say process2 changed the selling price of “item1” from $10 to $5.
But “item1” is also in the list that process1 read from the database. Now, at 3:20 p.m., process1 sends its updates back using BulkUpdate. It has many modified buy-box fields, but it overwrites each whole record with the stale values it read at 3:00 p.m., including the selling price of “item1” as $10, the way it was at 3:00 p.m., thereby clobbering the new $5 value that process2 had already written to the database.
So my question is: does BulkUpdate selectively update only the modified columns, or does it write the whole record?
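And if it writes the whole record, is something like the following the intended fix? This is a sketch of what I would hope for, assuming EFCore.BulkExtensions and its `BulkConfig.PropertiesToInclude` option; the `Item`/`BuyBoxPrice` names are again just illustrative:

```csharp
// Restrict the UPDATE to only the columns this process actually
// modified, so concurrent processes touching other columns are
// not clobbered.
var config = new BulkConfig
{
    // Only these properties would be written; all other columns
    // of the row should be left untouched by this statement.
    PropertiesToInclude = new List<string> { nameof(Item.BuyBoxPrice) }
};

db.BulkUpdate(items, config);
```

If each hourly process passed a config like this listing only its own columns, the lost-update problem described above would presumably go away, at the cost of maintaining a column list per process.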
Please comment.
Naftaly