
borisdj / EFCore.BulkExtensions


Entity Framework EF Core efcore Bulk Batch Extensions with BulkCopy in .Net for Insert Update Delete Read (CRUD), Truncate and SaveChanges operations on SQL Server, PostgreSQL, MySQL, SQLite

Home Page: https://codis.tech/efcorebulk

License: Other

C# 99.41% TSQL 0.59%
batch bulk copy efcore entity-framework-core entityframework entityframeworkcore mysql postgresql sql sqlbulkcopy sqlite sqlserver

EFCore.BulkExtensions issues

Temp tables staying in database

Hi Boris. I've noticed that periodically the temp tables BulkInsert creates end up not being dropped. What bothers me is that I now have a piece of code where it happens every time. What is the solution?

Could BulkUpdate specify conditions?

Hey:)
Thanks very much for EFCore.BulkExtensions. However, could BulkUpdate specify conditions?
I want sql like this:

update [dbo].[table1] 
set [IsActive] = arg1
where [Id] = arg2 and [OrderId] = arg3

Also, could BulkUpdate increment a counter? For example:

update [dbo].[table1]
set [count] = [count] + 1
where ...
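For what it's worth, later versions of EFCore.BulkExtensions added IQueryable-based batch operations that cover both requests. A minimal sketch, assuming the BatchUpdate extension is available and using a hypothetical Order entity mapped to [dbo].[table1] (arg1–arg3 are placeholder variables):

```csharp
using System.Linq;
using EFCore.BulkExtensions;

// UPDATE [dbo].[table1] SET [IsActive] = @arg1 WHERE [Id] = @arg2 AND [OrderId] = @arg3
context.Orders
    .Where(o => o.Id == arg2 && o.OrderId == arg3)
    .BatchUpdate(new Order { IsActive = arg1 });

// UPDATE [dbo].[table1] SET [count] = [count] + 1 WHERE ...
context.Orders
    .Where(o => o.IsActive)
    .BatchUpdate(o => new Order { Count = o.Count + 1 });
```

The second overload takes an update expression, which lets the new column value reference the old one (the [count] = [count] + 1 case).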

FireTriggers Option

I needed the ability to fire triggers on my tables when doing bulk inserts. I don't actually know how to do a pull request, but below is the code (separated by file) that I modified to allow me to do that, in case anyone else needs the ability too. It would be nice if this were in the package. This also gives access to all the SqlBulkCopyOptions. Essentially, I added a BulkCopyOptions property to BulkConfig, and anywhere BulkConfig was being used I modified it to accept BulkCopyOptions as a parameter. Note: I did not remove the KeepIdentity property, which corresponds to one of the SqlBulkCopyOptions values, but there are now 0 references to it. I also left the original code (commented out) in GetSqlBulkCopy(), simply because I only tested the change for my scenario and wasn't sure it followed the logic path expected in other scenarios.

BulkConfig.cs

using System.Collections.Generic;
using System.Data.SqlClient;

namespace EFCore.BulkExtensions
{
    public class BulkConfig
    {
        public bool PreserveInsertOrder { get; set; }

        public bool SetOutputIdentity { get; set; }

        public int BatchSize { get; set; } = 2000;

        public int? NotifyAfter { get; set; }

        public int? BulkCopyTimeout { get; set; }

        public bool EnableStreaming { get; set; }

        public bool UseTempDB { get; set; }

        public bool KeepIdentity { get; set; }

        public List<string> PropertiesToInclude { get; set; }

        public List<string> PropertiesToExclude { get; set; }

        public List<string> UpdateByProperties { get; set; }

        public SqlBulkCopyOptions BulkCopyOptions { get; set; }
    }
}

SqlBulkOperation.cs

public static void Insert<T>(DbContext context, IList<T> entities, TableInfo tableInfo, Action<decimal> progress)
        {
            var sqlConnection = OpenAndGetSqlConnection(context);
            var transaction = context.Database.CurrentTransaction;
            try
            {
                using (var sqlBulkCopy = GetSqlBulkCopy(sqlConnection, transaction, tableInfo.BulkConfig.BulkCopyOptions))
                {
                    tableInfo.SetSqlBulkCopyConfig(sqlBulkCopy, entities, progress);
                    using (var reader = ObjectReaderEx.Create(entities, tableInfo.ShadowProperties, context, tableInfo.PropertyColumnNamesDict.Keys.ToArray()))
                    {
                        sqlBulkCopy.WriteToServer(reader);
                    }
                }
            }
            finally
            {
                if (transaction == null)
                {
                    sqlConnection.Close();
                }
            }
        }

        public static async Task InsertAsync<T>(DbContext context, IList<T> entities, TableInfo tableInfo, Action<decimal> progress)
        {
            var sqlConnection = await OpenAndGetSqlConnectionAsync(context);
            var transaction = context.Database.CurrentTransaction;
            try
            {
                using (var sqlBulkCopy = GetSqlBulkCopy(sqlConnection, transaction, tableInfo.BulkConfig.BulkCopyOptions))
                {
                    tableInfo.SetSqlBulkCopyConfig(sqlBulkCopy, entities, progress);
                    using (var reader = ObjectReaderEx.Create(entities, tableInfo.ShadowProperties, context, tableInfo.PropertyColumnNamesDict.Keys.ToArray()))
                    {
                        await sqlBulkCopy.WriteToServerAsync(reader).ConfigureAwait(false);
                    }
                }
            }
            finally
            {
                if (transaction == null)
                {
                    sqlConnection.Close();
                }
            }
        }

private static SqlBulkCopy GetSqlBulkCopy(SqlConnection sqlConnection, IDbContextTransaction transaction, SqlBulkCopyOptions options = SqlBulkCopyOptions.Default)
        {
            if (transaction == null)
            {
                return new SqlBulkCopy(sqlConnection, options, null);
            }
            else
            {
                var sqlTransaction = (SqlTransaction)transaction.GetDbTransaction();
                return new SqlBulkCopy(sqlConnection, options, sqlTransaction);
            }
            //if (transaction == null)
            //{
            //    if (keepIdentity)
            //        return new SqlBulkCopy(sqlConnection, SqlBulkCopyOptions.KeepIdentity, null);
            //    else
            //        return new SqlBulkCopy(sqlConnection);
            //}
            //else
            //{
            //    var sqlTransaction = (SqlTransaction)transaction.GetDbTransaction();
            //    if (keepIdentity)
            //        return new SqlBulkCopy(sqlConnection, SqlBulkCopyOptions.KeepIdentity, sqlTransaction);
            //    else
            //        return new SqlBulkCopy(sqlConnection, SqlBulkCopyOptions.Default, sqlTransaction);
            //}
        }

Create extension method

I'm trying to create an override for the BulkInsert function. In my DbContext I use an override of SaveChanges() where I edit several properties of the object and then return base.SaveChanges().

Is it possible to create this for BulkInsert?

Exception on insert, but data is inserted into the db

Hi,
I have two problems and was hoping someone can point me in the right direction. I am trying to insert and get the Id populated by using the 'SetOutputIdentity = true' flag with v2.0.7.

  1. I got an exception: 'There is already an open DataReader associated with this Command which must be closed first.' The data was inserted into SQL Server despite the exception. (screenshot omitted)

  2. I am having the same issue as below, where the Id property does not change to reflect the Id in the table.
    https://github.com/borisdj/EFCore.BulkExtensions/issues/18

Any suggestions?

Thanks,
ED

Throws exception on BulkInsert when SetOutputIdentity = true and the Primary Key db column name is different than the actual model property

I have a model defined like so

public class Entity
{
    [Key]
    [DatabaseGenerated(DatabaseGeneratedOption.Identity)]
    public int Id { get; set; }

    //some other properties
}

However, using the fluent api in the context, the Id property is actually stored in the database as EntityId (name of the class + "Id").

When using SetOutputIdentity = true, it throws an exception saying "Invalid column: 'Id'", so it appears to be using the name from the model and not the actual column name somewhere.

Columns with spaces cause BulkInsertOrUpdate to fail

**How to reproduce:** Create a table that has a column name with spaces in it. Then try to run a query.

This can actually be easily fixed by detecting whitespace in the Column names and wrapping them with brackets [] when whitespace is detected.

NOTE: This does not occur on a mere BulkInsert, but a BulkInsertOrUpdate call.
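The proposed fix can be sketched as a small helper (names are illustrative, not the library's actual code); it brackets a column name whenever it contains whitespace, escaping any closing brackets per T-SQL quoting rules:

```csharp
using System;
using System.Linq;

// Wrap a column name in [] when it contains whitespace; doubling any
// ']' inside the name is how T-SQL escapes it within a delimited identifier.
string DelimitColumn(string columnName) =>
    columnName.Any(char.IsWhiteSpace)
        ? "[" + columnName.Replace("]", "]]") + "]"
        : columnName;

Console.WriteLine(DelimitColumn("Order Id")); // [Order Id]
Console.WriteLine(DelimitColumn("OrderId"));  // OrderId
```

Always bracketing (not just on whitespace) would also be safe and would additionally cover reserved words used as column names.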

Cannot drop the table

Hello,

I encountered an issue using EFCore.BulkExtensions.
The db user is set as db_owner on the related database.

(error screenshots omitted)

Does anyone have any clue on that?

Code I use for that: (screenshot omitted)

If you need any information, do not hesitate to ask.

PS: It's not about rights; I can delete a table with Navicat / SQL Management Studio / MSSQL Tools with the same credentials.

Btw, thanks for your awesome work, it's saving my life <3

Support for Owned Types (EF Core)

Class configuration

    public class CentraStageAlert : Entity<string>
    {
        public CentraStageAlert()
        {
            AlertMonitorInfo = new CentraStageAlertMonitorInfo();
            AlertSourceInfo = new CentraStageAlertSourceInfo();
        }

        public CentraStageAlertMonitorInfo AlertMonitorInfo { get; set; } // owned type

       // rest of props
    }
    public class CentraStageAlertMonitorInfo
    {
        public bool SendsEmails { get; set; }
        public bool CreatesTicket { get; set; }
    }

Entity Configuration

    public class CentraStageAlertConfiguration : IEntityTypeConfiguration<CentraStageAlert>
    {
        public void Configure(EntityTypeBuilder<CentraStageAlert> builder)
        {
            builder
                .Property(p => p.Id)
                .ValueGeneratedNever();

            builder
                .OwnsOne(p => p.AlertMonitorInfo);
        }
    }

The following exception is thrown when using the InsertOrUpdate method.

SqlException: Cannot insert the value NULL into column 'AlertMonitorInfo_CreatesTicket', table 'PortalCore.dbo.CentraStageAlerts'; column does not allow nulls. UPDATE fails.
The statement has been terminated.

After a bit of investigating myself, I think this line is part of the problem:

var allProperties = entityType.GetProperties().AsEnumerable();

The GetProperties() method does not return the navigation properties that are present. I think a combination of GetNavigations() and IsOwned() would be needed to get all the required properties.

Is adding support for Owned Types something you're interested in? If so, I can have a crack at a PR; I'm just not overly familiar with the SqlBulkCopy side of things.
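The suggestion above can be sketched roughly as follows. This is a hedged sketch against the EF Core 2.x metadata API (method names such as GetTargetType() differ across EF Core versions), assuming `context` is the DbContext:

```csharp
using System.Linq;
using Microsoft.EntityFrameworkCore;
using Microsoft.EntityFrameworkCore.Metadata;

var entityType = context.Model.FindEntityType(typeof(CentraStageAlert));

// The entity's own (scalar) properties, as the library already collects them.
var ownProperties = entityType.GetProperties();

// Additionally walk navigations that point at owned types and pull in
// their properties (e.g. AlertMonitorInfo_CreatesTicket).
var ownedProperties = entityType.GetNavigations()
    .Where(n => n.GetTargetType().IsOwned())
    .SelectMany(n => n.GetTargetType().GetProperties());

var allProperties = ownProperties.Concat(ownedProperties);
```

This only handles one level of ownership; nested owned types would need a recursive walk.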

Lost password in connectionString

BulkInsert method works fine. But BulkInsertOrUpdate throws an exception like below.

System.Data.SqlClient.SqlException occurred
  HResult=0x80131904
  Message=Login failed for user 'user'.
  Source=Core .Net SqlClient Data Provider
  StackTrace:
   at EFCore.BulkExtensions.SqlBulkOperation.Merge[T](DbContext context, IList`1 entities, TableInfo tableInfo, OperationType operationType) in C:\Users\llamar\Dropbox\Bapulcorp\Code\bapul-account\src\EFCore.BulkExtensions\SqlBulkOperation.cs:line 81
   at EFCore.BulkExtensions.DbContextBulkTransaction.Execute[T](DbContext context, IList`1 entities, OperationType operationType, BulkConfig bulkConfig) in C:\Users\llamar\Dropbox\Bapulcorp\Code\bapul-account\src\EFCore.BulkExtensions\DbContextBulkTransaction.cs:line 22
   at EFCore.BulkExtensions.DbContextBulkExtensions.BulkInsertOrUpdate[T](DbContext context, IList`1 entities, BulkConfig bulkConfig) in C:\Users\llamar\Dropbox\Bapulcorp\Code\bapul-account\src\EFCore.BulkExtensions\DbContextBulkExtensions.cs:line 15

Seems like the reason is context.Database.GetDbConnection().ConnectionString returns connectionString without the password.

public static void Insert<T>(DbContext context, IList<T> entities, TableInfo tableInfo, Action<double> progress = null, int batchSize = 2000)
{
    var sqlBulkCopy = new SqlBulkCopy(context.Database.GetDbConnection().ConnectionString)
    {
        DestinationTableName = tableInfo.InsertToTempTable ? tableInfo.FullTempTableName : tableInfo.FullTableName,
        BatchSize = batchSize,
        NotifyAfter = batchSize
    };
    sqlBulkCopy.SqlRowsCopied += (sender, e) => { progress?.Invoke(e.RowsCopied / entities.Count); };

    foreach (var element in tableInfo.PropertyColumnNamesDict)
    {
        sqlBulkCopy.ColumnMappings.Add(element.Key, element.Value);
    }
    using (var reader = ObjectReader.Create(entities, tableInfo.PropertyColumnNamesDict.Keys.ToArray()))
    {
        sqlBulkCopy.WriteToServer(reader);
    }
}

The reason is that after invoking the CheckHasIdentity method, the DbContext somehow loses its password.

public void CheckHasIdentity(DbContext context)
{
    int hasIdentity = 0;
    var conn = context.Database.GetDbConnection();
    try
    {
        conn.OpenAsync();
        using (var command = conn.CreateCommand())
        {
            string query = SqlQueryBuilder.SelectIsIdentity(FullTableName, PrimaryKey);
            command.CommandText = query;
            DbDataReader reader = command.ExecuteReader();

            if (reader.HasRows)
            {
                while (reader.Read())
                {
                    hasIdentity = (int)reader[0];
                }
            }
            reader.Dispose();
        }
    }
    finally
    {
        conn.Close();
    }

    HasIdentity = hasIdentity == 1;
}
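A likely explanation, for what it's worth: this is standard SqlClient behavior rather than something library-specific. Once a connection has been opened, SqlClient scrubs security-sensitive values (the password) from ConnectionString unless the connection string opts out. A hedged workaround sketch (server, database, and user values are placeholders):

```csharp
// "Persist Security Info=True" keeps the password visible in
// DbConnection.ConnectionString after the connection is opened, so a new
// SqlBulkCopy constructed from that string can still authenticate.
var connectionString =
    "Server=.;Database=MyDb;User Id=user;Password=<password>;Persist Security Info=True";
```

Note the trade-off: the password then stays readable on the open connection object, which is exactly what the scrubbing is meant to prevent.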

Thanks.

It would be nice to specify columns

One feature that would be useful is being able to specify the exact columns. Any column not listed would be ignored by BulkExtensions, which would allow the database to set the default values instead of EF Core. For example, I am having an issue while upserting a SQL Server timestamp column: the default value is generated in the temp table, and the operation then fails when merging into the actual table because timestamps can't be explicitly inserted.

I would like to allow BulkConfig to override the list of columns to update.
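For reference, BulkConfig already exposes include/exclude lists (see the BulkConfig.cs listing earlier on this page). A hedged sketch of excluding a rowversion/timestamp column so the database generates it; "RowVersion" is a hypothetical property name:

```csharp
using System.Collections.Generic;
using EFCore.BulkExtensions;

context.BulkInsertOrUpdate(entities, new BulkConfig
{
    // Columns not carried through the temp table / MERGE are left for the
    // database to populate (defaults, rowversion, computed values).
    PropertiesToExclude = new List<string> { "RowVersion" }
});
```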

Computed column error when SetOutputIdentity is set to true during BulkInsert

Can you please confirm the following issue?

I noticed that BulkInsert fails if you have a computed column in the table and you are using SetOutputIdentity = true.

Code for loading the data:

public IEnumerable<TestResult> SeedTestResults(IEnumerable<TestResult> data)
{
	///This code was auto generated, if you make any change in this method then next time when the code is regenerated your changes will be overwritten 
	if (data.Any() == false)
		return data;

	using (var transaction = Database.BeginTransaction())
	{
		var items = data.ToList();
                //No effect of PropertiesToExclude for Insert
		//List<string> excludedProperties = null;
		//excludedProperties = new List<string>();
		//excludedProperties.Add("TestStatus");

		this.BulkInsert(items, new BulkConfig { SetOutputIdentity = true, PreserveInsertOrder = true /*, PropertiesToExclude = excludedProperties*/ });
		var relatedItems = new List<TestMeasurement>();
		foreach (var item in items)
		{
			foreach (var subItem in item.TestMeasurements)
			{
				subItem.TestResultId = item.TestResultId;
			}
			relatedItems.AddRange(item.TestMeasurements);
		}
		this.BulkInsert(relatedItems, new BulkConfig { SetOutputIdentity = true, PreserveInsertOrder = true });
		transaction.Commit();
		return data;
	}	
}

Following is the merge query that gets generated:

MERGE dbo.[TestResults] WITH (HOLDLOCK) AS T
USING (SELECT TOP 60 * FROM dbo.[TestResultsTemp4b97610a] ORDER BY [TestResultId]) AS S
ON T.[TestResultId] = S.[TestResultId]
WHEN NOT MATCHED THEN
  INSERT ([BuildCode], [CustomerSequenceNumber], [ErrorMessage], [JobId], [RecipeId], [ShiftId], [StationId], [TestDate], [TestStatus], [TotalTimeInSeconds])
  VALUES (S.[BuildCode], S.[CustomerSequenceNumber], S.[ErrorMessage], S.[JobId], S.[RecipeId], S.[ShiftId], S.[StationId], S.[TestDate], S.[TestStatus], S.[TotalTimeInSeconds])
OUTPUT INSERTED.[TestResultId], INSERTED.[BuildCode], INSERTED.[CustomerSequenceNumber], INSERTED.[ErrorMessage], INSERTED.[JobId], INSERTED.[RecipeId], INSERTED.[ShiftId], INSERTED.[StationId], INSERTED.[TestDate], INSERTED.[TestStatus], INSERTED.[TotalTimeInSeconds]
INTO dbo.[TestResultsTemp4b97610aOutput];

Table script

CREATE TABLE [dbo].[TestResults](
	[TestResultId] [int] IDENTITY(1,1) NOT NULL,
	[JobId] [nvarchar](50) NULL,
	[BuildCode] [nvarchar](50) NULL,
	[CustomerSequenceNumber] [nvarchar](50) NULL,
	[TestStatus]  AS (case when ltrim(rtrim(isnull([ERRORMESSAGE],'')))='' then 'Pass' else 'Fail' end) PERSISTED NOT NULL,
	[ErrorMessage] [nvarchar](512) NULL,
	[TestDate] [datetime] NOT NULL,
	[TotalTimeInSeconds] [decimal](18, 2) NOT NULL,
	[StationId] [int] NOT NULL,
	[ShiftId] [int] NOT NULL,
	[RecipeId] [int] NULL,
 CONSTRAINT [PK_TestResults] PRIMARY KEY CLUSTERED 
(
	[TestResultId] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY]
GO
...
...
...
--removed indexes and other details for brevity

OnModelCreating code for TestResult table

modelBuilder.Entity<TestResult>(entity =>
{
	entity.Property(e => e.TestStatus)
		  .HasConversion<string>();

	entity.Property(e => e.ErrorMessage).HasMaxLength(512);

	entity.Property(e => e.JobId).HasMaxLength(50);

	entity.Property(e => e.TestDate).HasColumnType("datetime");

	entity.Property(e => e.TestStatus)
		.IsRequired()
		.HasMaxLength(4)
		.IsUnicode(false)
		.HasComputedColumnSql("(case when ltrim(rtrim(isnull([ERRORMESSAGE],'')))='' then 'Pass' else 'Fail' end)");

	entity.HasOne(d => d.Recipe)
		.WithMany(p => p.TestResults)
		.HasForeignKey(d => d.RecipeId)
		.HasConstraintName("FK_TestResults_Recipes");

	entity.HasOne(d => d.Shift)
		.WithMany(p => p.TestResults)
		.HasForeignKey(d => d.ShiftId)
		.OnDelete(DeleteBehavior.ClientSetNull)
		.HasConstraintName("FK_TestResults_PlantShifts");

	entity.HasOne(d => d.Station)
		.WithMany(p => p.TestResults)
		.HasForeignKey(d => d.StationId)
		.OnDelete(DeleteBehavior.ClientSetNull)
		.HasConstraintName("FK_TestResults_Stations");
});

Does it support the InMemory provider?

I want to move away from our current solution (a combination of temp tables, SqlBulkCopy and merge statements) to increase testability. The majority of the tests are already using the InMemory provider. Will this library support it?

Merge Matching on Null

I'm trying to BulkInsertOrUpdate with a defined list of UpdateByProperties. The only way I'm able to get it to update instead of insert is when the UpdateByProperties list contains all non-null values for a row. For example, a row of data consists of Field1, Field2, Value, with Field1 and Field2 as UpdateByProperties. If an existing row has Field1 = 'a' and Field2 = 'b' and the new data matches this, it will correctly update Value. But if Field2 is null in both the existing row and the new row, a second row gets inserted instead.

Is there a property I should be using to get around this, or is there a way I can get nulls to be matched?
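The underlying cause is that the generated MERGE joins with "=", and in SQL `NULL = NULL` is not true, so null keys never match. A null-tolerant ON clause could be built along these lines (illustrative helper, not the library's actual SqlQueryBuilder code):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// For each join column emit "T.[c] = S.[c] OR (both sides IS NULL)",
// so rows whose key parts are NULL on both sides still match.
string BuildNullSafeOn(IEnumerable<string> joinColumns) =>
    string.Join(" AND ", joinColumns.Select(c =>
        $"(T.[{c}] = S.[{c}] OR (T.[{c}] IS NULL AND S.[{c}] IS NULL))"));

Console.WriteLine(BuildNullSafeOn(new[] { "Field1", "Field2" }));
// (T.[Field1] = S.[Field1] OR (T.[Field1] IS NULL AND S.[Field1] IS NULL)) AND (T.[Field2] = S.[Field2] OR (T.[Field2] IS NULL AND S.[Field2] IS NULL))
```

The same effect is sometimes written with `EXISTS (SELECT T.[c] INTERSECT SELECT S.[c])`, which treats NULLs as equal natively.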

Double update when bulkupdate

Hello, I'm having trouble when using bulk update inside a transaction.

What happens:

  1. Insert new entities:

    await dbContext.BulkInsertAsync(newClasses, new BulkConfig
    {
        SetOutputIdentity = true,
        WithHoldlock = false,
        SqlBulkCopyOptions = SqlBulkCopyOptions.UseInternalTransaction
    });

  2. Each entry is in state Unchanged. Modify each inserted entity:

    foreach (var newClass in newClasses)
    {
        newClass.TableName = "newString";
    }

  3. Each entry is in state Modified. Run the bulk update:

    await dbContext.BulkUpdateAsync(newClasses, new BulkConfig
    {
        SetOutputIdentity = true,
        WithHoldlock = false
    });

  4. The update passes and the db is updated, but each entry in the dbContext is still in the Modified state.
  5. Calling SaveChanges then causes the EF tracker to update all the entities a second time.

And one extra question: why can a bulk update block the whole table inside a ReadCommitted transaction?

BulkDelete throws NullReferenceException in 2.0.9

After upgrading from 2.0.8 to 2.0.9, I started getting a NullReferenceException from some of my tests that use the BulkDelete functionality. It works fine when I revert the package back to 2.0.8.

Exception:
System.NullReferenceException: 'Object reference not set to an instance of an object.'

Stack trace:

at EFCore.BulkExtensions.TableInfo.b__65_2[T](IProperty a)
at System.Linq.Enumerable.Any[TSource](IEnumerable`1 source, Func`2 predicate)
at EFCore.BulkExtensions.TableInfo.LoadData[T](DbContext context, Boolean loadOnlyPKColumn)
at EFCore.BulkExtensions.TableInfo.CreateInstance[T](DbContext context, IList`1 entities, OperationType operationType, BulkConfig bulkConfig)
at EFCore.BulkExtensions.DbContextBulkTransaction.Execute[T](DbContext context, IList`1 entities, OperationType operationType, BulkConfig bulkConfig, Action`1 progress)
at EFCore.BulkExtensions.DbContextBulkExtensions.BulkDelete[T](DbContext context, IList`1 entities, BulkConfig bulkConfig, Action`1 progress)
at MyService.SqlService.BulkCopy.SqlBulkCopyService`1.BulkDelete[TEntity](Expression`1 deleteFilter)
at MyService.PerfTests.Search.SearchLiveDbFixture..ctor()

Here's my service wrapper which uses the BulkDelete functionality:

public void BulkDelete<TEntity>(
    Expression<Func<TEntity, bool>> deleteFilter)
    where TEntity : class
{
    var dbSet = _dbContext.Set<TEntity>();
    IQueryable<TEntity> entitiesToDelete;
    if (deleteFilter != null)
    {
        entitiesToDelete = dbSet.Where(deleteFilter).AsNoTracking();
    }
    else
    {
        entitiesToDelete = dbSet.AsNoTracking();
    }

    _dbContext.BulkDelete(entitiesToDelete.ToList());
}

Additional info:

  • _dbContext.Model doesn't seem to be null
  • entitiesToDelete is not null; the exception is thrown whether or not the list is empty

Any ideas?

Parent child relationship

I have a quick question: does the parent-child relationship work while saving the data?

For example, I have following classes:

public partial class Application
{
    public Application()
    {
        Stations = new HashSet<Station>();
    }

    public int ApplicationId { get; set; }
    public string ApplicationName { get; set; }

    public ICollection<Station> Stations { get; set; }
}

public partial class Station
{
    public Station()
    {
        TestResults = new HashSet<TestResult>();
    }

    public int StationId { get; set; }
    public string StationName { get; set; }
    public int ApplicationId { get; set; }

    public Application Application { get; set; }
    //other properties removed for brevity
}

Below code is inside OnModelCreating

modelBuilder.Entity<Application>(entity =>
{
    entity.Property(e => e.ApplicationName)
        .IsRequired()
        .HasMaxLength(50);
});

modelBuilder.Entity<Station>(entity =>
{
    entity.HasIndex(e => e.StationName)
        .HasName("IX_Stations")
        .IsUnique();

    entity.Property(e => e.StationName)
        .IsRequired()
        .HasMaxLength(50);

    entity.HasOne(d => d.Application)
        .WithMany(p => p.Stations)
        .HasForeignKey(d => d.ApplicationId)
        .OnDelete(DeleteBehavior.ClientSetNull)
        .HasConstraintName("FK_Stations_Applications");
});

Following is my input for seeding the Applications and Stations table.

void SeedApplications()
{
	using (var repository = RepositoryFactory.GetRepository())
	{
		var applications = repository.SeedApplications(new Application[] {
			new Application
			{   ApplicationName = "Front EOL", ApplicationId = 0, Stations = new Station[]
				{
					new Station{ StationName = "1600N" ,StationId = 0, ApplicationId = 0 },
					new Station{ StationName = "1600S" ,StationId = 0, ApplicationId = 0 },
					new Station{ StationName = "1700S" ,StationId = 0, ApplicationId = 0 },
				}
			},
			new Application
			{   ApplicationName = "FAM", ApplicationId = 0, Stations = (new Station[]
				{
					new Station{ StationName = "1650N",StationId = 0, ApplicationId = 0  },
					new Station{ StationName = "1700N",StationId = 0, ApplicationId = 0  },
					new Station{ StationName = "1750N",StationId = 0, ApplicationId = 0  }
				}
				)
			}			
		});
	}
}

Following is a class to create a Repository instance:

static class RepositoryFactory
{
	public static Repository GetRepository()
	{
		DbContextOptionsBuilder b = new DbContextOptionsBuilder();
		b.UseSqlServer(System.Configuration.ConfigurationManager.ConnectionStrings["TestResultServer"].ConnectionString);
		return new Repository(b.Options);
	}
}

Actual repository class:

public partial class Repository : DbContext
{
        public virtual DbSet<Application> Applications { get; set; }
        public virtual DbSet<Station> Stations { get; set; }

	public IEnumerable<Application> SeedApplications(IEnumerable<Application> data)
	{
		///This code was auto generated, if you make any change in this method then next time when the code is regenerated your changes will be overwritten 
		if(data.Any() == false)
			return data;
		var items = data.ToList();
		using (var transaction = Database.BeginTransaction())
		{
			Applications.AddRange(items);
			this.BulkInsert(items, new BulkConfig {  SetOutputIdentity = true });
			transaction.Commit();
			return items;
		}		
	}
}

I was hoping that after inserting the Applications records, the Stations records would be inserted too. But that never happens. The debug window output below shows that after the insert the Stations property is cleared.

Before bulk insert: (screenshot omitted)

After bulk insert: (screenshot omitted)

If I use the regular Insert from EF Core, it works fine (the code commented out in the screenshot above). The problem only occurs if I use the BulkInsert method.

I am wondering if you can please clear up my doubt and let me know if this is possible or not :)

Thanks.

Support for non public setter of PK and 'PreserveInsertOrder'

Hi,

I have an entity where the primary key has a public get and private set:

public class Entity 
{
    public int Id {get; private set;}
}

I'm not able to do a bulk insert on this with PreserveInsertOrder, because of this:

protected void UpdateEntitiesIdentity<T>(IList<T> entities, IList<T> entitiesWithOutputIdentity)
{
    if (this.BulkConfig.PreserveInsertOrder) // Updates PK in entityList
    {
        var accessor = TypeAccessor.Create(typeof(T));
        for (int i = 0; i < this.NumberOfEntities; i++)
            accessor[entities[i], this.PrimaryKeys[0]] = accessor[entitiesWithOutputIdentity[i], this.PrimaryKeys[0]];
    }
}

TypeAccessor.Create(typeof(T)) creates an accessor that handles only fully public properties.

I think the fix should be as simple as changing it to the following:

TypeAccessor.Create(typeof(T), true);
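A small sketch of the difference, assuming the FastMember NuGet package (which the library uses internally) and its two-argument Create overload:

```csharp
using System;
using FastMember; // NuGet: FastMember

public class Entity
{
    public int Id { get; private set; }
}

public static class Demo
{
    public static void Main()
    {
        var entity = new Entity();

        // allowNonPublicAccessors: true lets the accessor write through the
        // private setter; the single-argument overload cannot and would throw.
        var accessor = TypeAccessor.Create(typeof(Entity), allowNonPublicAccessors: true);
        accessor[entity, "Id"] = 42;

        Console.WriteLine(entity.Id);
    }
}
```

The trade-off is that non-public access may fall back to slower reflection-based member access in FastMember.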

Class structures with Discriminator aren't supported

Trying to do a BulkInsert with a structure that maps several types to one table using a Discriminator field results in an IndexOutOfRange exception in FastMember.
This is obviously due to the fact that the Discriminator column isn't present on the instances.

Cannot drop the table 'nameTable', because it does not exist or you do not have permission

I have this problem when saving:

Cannot drop the table 'dbo.nameTableTempf21dbf8f', because it does not exist or you do not have permission.

I suppose the error is in my connection string? On the other hand, my save is very slow: approximately seven minutes.

_context.Database.SetCommandTimeout(360);
//listTemp = listTemp.OrderBy(T => T.Id).ToList();
//_context.Database.SetCommandTimeout(0);

using (var transaction = _context.Database.BeginTransaction())
{
    //_context.BulkInsert(listTemp, new BulkConfig { SetOutputIdentity = true, BatchSize = 4000, NotifyAfter = 240 });
    _context.BulkInsert(listTemp, new BulkConfig { SetOutputIdentity = true, NotifyAfter = 360 });

    transaction.Commit();
}

Getting Deadlock Issues while doing Bulk insert

We are using the bulk insert package. In regular tests the inserts go through fine, but in load tests where multiple web jobs are trying to do bulk inserts we are getting deadlocks. Any pointers or settings that could help us solve the issue?

Does the update selectively update only the affected columns or the whole record

I have been using bulk update in my program extensively; it's a very useful tool.

Some time after we began using the program actively, users realized that saved data was being lost, namely overwritten.

A user can change an item price from $10 to $20, and after 15 minutes or so the price is back to the original value, like a phantom switch flipping inside the program.

After some soul searching I started to think about what the problem could be.

We have a lot of background processes which run every hour in order to get updated info from Amazon: buy-box price, selling price, FBA fees, etc.

The process goes like this:
The program gets a list of items from the table and loads them into memory using regular EF Core; it modifies them and updates the table by passing the list of objects to the BulkUpdate function.

Now consider this scenario:
Process 1, which modifies the buy-box column, starts at 3:00 p.m. and runs for 20 minutes. It gets a bunch of objects from the database and does its work.

Five minutes later, at 3:05 p.m., process 2 starts; it gets a list of objects from the database and starts its work of modifying the selling-price column.

It finishes in 5 minutes, and at 3:15 p.m. it updates the database. Say the selling price of "item1" was modified by process 2 from $10 to $5.

But "item1" is also in the list which process 1 took out of the database. Now, at 3:20 p.m., process 1 sends its updates back to the database using bulk update. It has many buy-box fields modified, but it overwrites the whole record with the old values it collected when it read the items from the database at 3:00 p.m., including the $10 selling price of "item1" as it was at 3:00 p.m., thus overwriting the new selling-price value that process 2 wrote to the database.

So my question is: does the update selectively update only the affected columns, or the whole record?

Please comment.

Naftaly

System.ArgumentOutOfRangeException when BulkInsertOrUpdate

Hi,

I got System.ArgumentOutOfRangeException: 'StartIndex cannot be less than zero.'
when running the code below.

   _db.BulkInsertOrUpdate(hotelSegments, new BulkConfig { BatchSize = hotelSegments.Count });

Stack Trace:

  at System.String.Remove(Int32 startIndex, Int32 count)
  at EFCore.BulkExtensions.SqlQueryBuilder.GetCommaSeparatedColumns(List`1 columnsNames, String prefixTable, String equalsTable)
  at EFCore.BulkExtensions.SqlQueryBuilder.MergeTable(TableInfo tableInfo, OperationType operationType)
  at EFCore.BulkExtensions.SqlBulkOperation.Merge[T](DbContext context, IList`1 entities, TableInfo tableInfo, OperationType operationType, Action`1 progress)
  at EFCore.BulkExtensions.DbContextBulkExtensions.BulkInsertOrUpdate[T](DbContext context, IList`1 entities, BulkConfig bulkConfig, Action`1 progress)

Connection is not opened - BulkMerge

In EFCore.BulkExtensions.TableInfo.CheckHasIdentity there is a call to conn.OpenAsync() whose returned task is not awaited. I think you have to await here, or use the synchronous Open() method.
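A sketch of the suggested fix: the fire-and-forget conn.OpenAsync() lets ExecuteReader run while the connection is still in the Connecting state. An async variant of the method could look roughly like this (body abridged; the query-execution details stay as in the original CheckHasIdentity):

```csharp
using System.Threading.Tasks;
using Microsoft.EntityFrameworkCore;

public async Task CheckHasIdentityAsync(DbContext context)
{
    var conn = context.Database.GetDbConnection();
    try
    {
        // Was: conn.OpenAsync();  (task discarded, connection may still be opening)
        await conn.OpenAsync().ConfigureAwait(false);

        // ... build and execute the SqlQueryBuilder.SelectIsIdentity query,
        //     read the result, and set HasIdentity, as in the original ...
    }
    finally
    {
        conn.Close();
    }
}
```

Alternatively, a purely synchronous method should simply call conn.Open().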

System.InvalidOperationException: ExecuteReader requires an open and available Connection. The connection's current state is connecting.
   at System.Data.SqlClient.SqlConnection.GetOpenTdsConnection(String method)
   at System.Data.SqlClient.SqlConnection.ValidateConnectionForExecute(String method, SqlCommand command)
   at System.Data.SqlClient.SqlCommand.ValidateCommand(Boolean async, String method)
   at System.Data.SqlClient.SqlCommand.RunExecuteReader(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, TaskCompletionSource`1 completion, Int32 timeout, Task& task, Boolean asyncWrite, String method)
   at System.Data.SqlClient.SqlCommand.ExecuteReader(CommandBehavior behavior)
   at EFCore.BulkExtensions.TableInfo.CheckHasIdentity(DbContext context) in D:\Projects\! GitHub\EFCore.BulkExtensions\EFCore.BulkExtensions\TableInfo.cs:line 61
   at EFCore.BulkExtensions.SqlBulkOperation.Merge[T](DbContext context, IList`1 entities, TableInfo tableInfo, OperationType operationType) in D:\Projects\! GitHub\EFCore.BulkExtensions\EFCore.BulkExtensions\SqlBulkOperation.cs:line 49
   at Auto1.Prices.Repositories.ProductRepository.InsertOrUpdate(IEnumerable`1 entities) in D:\Projects\Auto1.Prices\Auto1.Prices\Repositories\ProductRepository.cs:line 55
   at Auto1.Prices.Controllers.ExchangeController.UpdateProducts(ExchangeProduct[] products) in D:\Projects\Auto1.Prices\Auto1.Prices\Controllers\ExcangeController.cs:line 49
   at lambda_method(Closure , Object , Object[] )
   at Microsoft.AspNetCore.Mvc.Internal.ControllerActionInvoker.<InvokeActionMethodAsync>d__27.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
   at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
   at Microsoft.AspNetCore.Mvc.Internal.ControllerActionInvoker.<InvokeNextActionFilterAsync>d__25.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
   at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
   at Microsoft.AspNetCore.Mvc.Internal.ControllerActionInvoker.Rethrow(ActionExecutedContext context)
   at Microsoft.AspNetCore.Mvc.Internal.ControllerActionInvoker.Next(State& next, Scope& scope, Object& state, Boolean& isCompleted)
   at Microsoft.AspNetCore.Mvc.Internal.ControllerActionInvoker.<InvokeNextExceptionFilterAsync>d__24.MoveNext()
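The failure mode can be illustrated without a database at all: a Task returned by an OpenAsync-style method that is not awaited leaves the (here simulated) connection state as "Connecting" when the next call runs. This is only a sketch of the mechanism, not the library's actual code.

```csharp
using System;
using System.Threading.Tasks;

// Simulated connection state: "Closed" -> "Connecting" -> "Open".
var state = new[] { "Closed" };

async Task OpenAsync(string[] s)
{
    s[0] = "Connecting";
    await Task.Delay(50); // simulate the network handshake
    s[0] = "Open";
}

var pending = OpenAsync(state);  // BUG: not awaited
Console.WriteLine(state[0]);     // still "Connecting" - ExecuteReader would fail here

await pending;                   // the fix: await it (or use the synchronous Open())
Console.WriteLine(state[0]);     // "Open"
```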

Type Conversion failure in CheckHasIdentity()

It would seem the cast happening right in the while loop causes an exception in cases where there are no identities. I'm not entirely sure if this is something weird propagating in the environment I'm working with or if it is something else.

Steps to Reproduce

  1. Create a table that has a PK column which lacks any identity/auto-increment attributes so it sits as a normal column with the rest.
  2. Set BulkConfig to SetOutputIdentity to true. (i.e. new BulkConfig { SetOutputIdentity = true })
  3. Attempt a BulkInsert call. (haven't tested but this likely affects the other insertion methods as well)

Solution / Notes

In any case, the offending line is occurring in TableInfo.cs right here.

For fixing the problem, I was able to simply alter the statement to the following:
hasIdentity = reader[0] == DBNull.Value ? 0 : 1;

Though I haven't exactly spent time investigating this issue further so if it ends up being some kind of weird false positive due to my environment then I do apologize in advance for occupying your time. Please let me know if there's any further information you would need.

Thanks!
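The suggested one-line fix boils down to mapping DBNull to 0, which can be exercised in isolation with a plain DataTable (the column name last_value below is illustrative only, not the library's actual query):

```csharp
using System;
using System.Data;

var table = new DataTable();
table.Columns.Add("last_value", typeof(long));
table.Rows.Add(DBNull.Value); // table without identity column -> NULL
table.Rows.Add(7L);           // table with identity column -> current seed value

// The proposed fix, extracted as a helper:
static int HasIdentity(object cell) => cell == DBNull.Value ? 0 : 1;

Console.WriteLine(HasIdentity(table.Rows[0][0])); // 0
Console.WriteLine(HasIdentity(table.Rows[1][0])); // 1
```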

BulkUpdate error with Connect/ShowPlan Db Permission

I have a table that contains a unique, non-clustered, filtered index. There are no identity columns or triggers.

When I try to do a bulk update I receive a "CREATE TABLE permission denied in database" error.

The database access is using SQL authentication and user is granted Connect and Show plan permissions.

My current workaround is to grant Alter permission; however, according to the related bcp utility documentation:

A bcp in operation minimally requires SELECT/INSERT permissions on the target table. In addition, ALTER TABLE permission is required if any of the following is true:

  • Constraints exist and the CHECK_CONSTRAINTS hint is not specified.
  • Triggers exist and the FIRE_TRIGGER hint is not specified.
  • You use the -E option to import identity values from a data file.

I was expecting that specifying only the SqlBulkCopyOptions.CheckConstraints flag would make the Alter permission unnecessary, but the error still occurs.

Additionally I have successfully been able to do bulk inserts with the same user into a different table without a unique index.

Given ColumnMapping is invalid

I am getting this error and was wondering, do I need to implement every column in the database in my entity for this to work?

I have to craft a series of bulk update services for a project, but I only want to pass in a list of the users' IDs and the new value that all the users will be updated to.

Can I not create an entity that doesn't expose all the columns and have BulkUpdate work?
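For reference, a hedged sketch of what a partial-column update could look like. PropertiesToInclude on BulkConfig, and the User entity with an IsActive property, are assumptions here and may not exist in the version of the library you are using:

```csharp
// Hypothetical usage - PropertiesToInclude is an assumed BulkConfig option.
context.BulkUpdate(users, new BulkConfig
{
    PropertiesToInclude = new List<string> { nameof(User.Id), nameof(User.IsActive) }
});
```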

EntityFramework 6.x

I'm guessing EFCore.BulkExtensions is specific to EFCore. Any plans to release a version for old school EF 6.x?

Temp Table Clean-Up Question

Hello,

I'm using BulkInsertOrUpdate and, unrelated to this library, I sometimes run into data scenarios that cause the operation to fail, after which an exception is thrown in my code. Is there a setting I can enable to ensure that this library cleans up any tables it created in the event of an exception during the operation?

Support for custom ValueConverters (EF Core 2.1)

EF Core 2.1 introduces the concept of converters that allow you to convert to/from different data types in the DB. Do you have plans to incorporate these into the bulk extensions? See here for more information.

BulkInsert doesn't work inside transaction. Tries to force open a new connection.

using (var tran = Context.Database.BeginTransaction())
{
   Context.BulkInsert(assets);
   tran.Commit();
}

Calling BulkInsert like this results in the following exception:

{System.InvalidOperationException: The connection was not closed. The connection's current state is open.
   at System.Data.ProviderBase.DbConnectionInternal.TryOpenConnection(DbConnection outerConnection, DbConnectionFactory connectionFactory, TaskCompletionSource`1 retry, DbConnectionOptions userOptions)
   at System.Data.SqlClient.SqlConnection.TryOpen(TaskCompletionSource`1 retry)
   at System.Data.SqlClient.SqlConnection.Open()
   at EFCore.BulkExtensions.SqlBulkOperation.Insert[T](DbContext context, IList`1 entities, TableInfo tableInfo, Action`1 progress)
   at EFCore.BulkExtensions.DbContextBulkExtensions.BulkInsert[T](DbContext context, IList`1 entities, BulkConfig bulkConfig, Action`1 progress)

If I try to close the connection, which already seems like a hazardous action:

using (var tran = Context.Database.BeginTransaction())
{
   Context.Database.GetDbConnection().Close();
   Context.BulkInsert(assets);
   tran.Commit();
}

I get the following result:

This SqlTransaction has completed; it is no longer usable.

at System.Data.SqlClient.SqlTransaction.ZombieCheck()
at System.Data.SqlClient.SqlTransaction.Commit()
at Microsoft.EntityFrameworkCore.Storage.RelationalTransaction.Commit()

I did some digging into your source code and found that BulkInsert tries to open a connection without checking whether one is already open. I also found that the SqlBulkCopy is not given a reference to the transaction even if one is already ongoing.

I played around with your project a bit and made it work for my needs, but the question is: is this behavior by design, or could it be a bug?
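The change the author describes would roughly amount to the following fragment (a hedged sketch, not runnable standalone and not necessarily what was merged): reuse the context's connection and enlist SqlBulkCopy in the ambient transaction instead of opening a second connection.

```csharp
// Sketch: enlist SqlBulkCopy in the EF Core transaction, if any.
var connection = (SqlConnection)context.Database.GetDbConnection();
if (connection.State != ConnectionState.Open)
    connection.Open();

var transaction = context.Database.CurrentTransaction?.GetDbTransaction() as SqlTransaction;
using (var sqlBulkCopy = new SqlBulkCopy(connection, SqlBulkCopyOptions.Default, transaction))
{
    // ... configure column mappings and call WriteToServer as usual
}
```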

"The given ColumnMapping does not match up with any column in the source or destination" - When using GetDataTable<T>

First I was receiving this exception:

NotSupportedException: DataSet does not support System.Nullable<>.
Module "System.Data.DataColumn", line 164, col 0, in .ctor
Void .ctor(System.String, System.Type, System.String, System.Data.MappingType)
Module "System.Data.DataColumnCollection", line 8, col 0, in Add
System.Data.DataColumn Add(System.String, System.Type)
File "C:\Users\tom.adams\Source\Repos\EFCore.BulkExtensions\EFCore.BulkExtensions\SqlBulkOperation.cs", line 175, col 21, in GetDataTable
System.Data.DataTable GetDataTable[T](Microsoft.EntityFrameworkCore.DbContext, System.Collections.Generic.IList`1[T])
File "C:\Users\tom.adams\Source\Repos\EFCore.BulkExtensions\EFCore.BulkExtensions\SqlBulkOperation.cs", line 75, col 25, in MoveNext
Void MoveNext()
Module "System.Runtime.ExceptionServices.ExceptionDispatchInfo", line 12, col 0, in Throw
Void Throw()
Module "System.Runtime.CompilerServices.TaskAwaiter", line 46, col 0, in HandleNonSuccessAndDebuggerNotification
Void HandleNonSuccessAndDebuggerNotification(System.Threading.Tasks.Task)

This seems to be caused by the following line, in a case where the column was of type DateTime?:

dataTable.Columns.Add(columnName, property.PropertyType);

I managed to solve this by changing the line to:
dataTable.Columns.Add(columnName, Nullable.GetUnderlyingType(property.PropertyType) ?? property.PropertyType);

But then it throws this exception

System.InvalidOperationException: The given ColumnMapping does not match up with any column in the source or destination.
at System.Data.SqlClient.SqlBulkCopy.AnalyzeTargetAndCreateUpdateBulkCommand(BulkCopySimpleResultSet internalResults)
at System.Data.SqlClient.SqlBulkCopy.WriteToServerInternalRestContinuedAsync(BulkCopySimpleResultSet internalResults, CancellationToken cts, TaskCompletionSource`1 source)
--- End of stack trace from previous location where exception was thrown ---
at EFCore.BulkExtensions.SqlBulkOperation.InsertAsync[T](DbContext context, IList`1 entities, TableInfo tableInfo, Action`1 progress) in C:\Users\tom.adams\Source\Repos\EFCore.BulkExtensions\EFCore.BulkExtensions\SqlBulkOperation.cs:line 78

On this line

await sqlBulkCopy.WriteToServerAsync(dataTable);

Any ideas what could be causing this?
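The Nullable.GetUnderlyingType workaround can be verified in isolation: DataColumn rejects Nullable<T> but accepts the unwrapped type.

```csharp
using System;
using System.Data;

var table = new DataTable();
Type propType = typeof(DateTime?);

// table.Columns.Add("ModifiedOn", propType); // throws NotSupportedException

// Unwrap Nullable<T> first, as in the workaround:
table.Columns.Add("ModifiedOn", Nullable.GetUnderlyingType(propType) ?? propType);
Console.WriteLine(table.Columns["ModifiedOn"].DataType); // System.DateTime
```

Nullability is then represented per-cell via DBNull rather than in the column's type.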

tableInfo.BulkConfig.UseTempDB = false;

Would it be possible for you to remove the following line of code? The code works fine without it when used within an explicit transaction. The line causes an issue in SQL Server because it requires the account to have a high privilege like db_owner or similar, since normal tables are used as temp tables in the merge operation and have to be created and dropped.

if (operationType != OperationType.Insert)
{
    tableInfo.BulkConfig.UseTempDB = false; // TempDB can only be used with Insert.
    // Other operations are done with a custom temp table.
    // If using tempdb [#] it throws exception: 'Cannot access destination table'
    // (gets dropped too early, probably because the transaction ends).
}

Need To be careful when using BulkInsertOrUpdate With Composite Key

When using the BulkInsertOrUpdate operation, you have to be careful if your primary key is a composite key. Say you pass in a list of records, some of which are new and need to be inserted, and one part of a new record's key matches part of an existing composite key: the operation will treat it as an existing record and overwrite it instead of inserting a new one. I got burned by that and spent hours banging my head.
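The pitfall can be illustrated without a database: a merge must match on the full composite key, and matching on only one part of it misclassifies a new record as an existing one. The key names below are made up for the example.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Existing rows keyed by the full composite key (OrderId, LineNo).
var existing = new Dictionary<(int OrderId, int LineNo), string>
{
    [(1, 1)] = "existing row"
};

var newKey = (OrderId: 1, LineNo: 2); // new record sharing OrderId with an existing row

// Correct match on the full key: not found -> insert.
bool fullKeyMatch = existing.ContainsKey(newKey);

// Faulty match on only part of the key: found -> would overwrite instead of insert.
bool partialKeyMatch = existing.Keys.Any(k => k.OrderId == newKey.OrderId);

Console.WriteLine(fullKeyMatch);    // False
Console.WriteLine(partialKeyMatch); // True
```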

Bulk Insert Should Update ID's

I am doing a bulk insert but would then like to get the entities I am inserting to have their primary keys updated. Right now, they stay as zero. I would like to do this so I can bulk insert related entities but I need to set the foreign key.

How can I achieve this?

Batch Size

Thank you kindly for this library.

Previously I was using http://entityframework-extensions.net/?z=codeplex which had the ability to specify a batchSize. I found this was essential for my project as sometimes we need to insert millions of records which will fail unless we can batch them up. Any chance of adding BatchSize to this library?
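In the meantime, batching is easy to sketch on the client side; the BulkInsert call mentioned in the comment is a placeholder for whatever insert you use, and Chunk here is a hypothetical helper, not a library API:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Split a large list into fixed-size batches.
static IEnumerable<List<T>> Chunk<T>(IReadOnlyList<T> source, int batchSize)
{
    for (int i = 0; i < source.Count; i += batchSize)
        yield return source.Skip(i).Take(batchSize).ToList();
}

var records = Enumerable.Range(1, 5).ToList();
foreach (var batch in Chunk(records, 2))
    Console.WriteLine(batch.Count); // 2, 2, 1
// In real use, each batch would go to a separate (Bulk)Insert call.
```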

Bulk Insert Case sensitivity problem with column names

Bulk insert treats column names as case-sensitive somewhere; therefore, if there is even a single case difference in a column name in modelBuilder, the bulk insert fails.
For example, userID vs userId fails immediately:

entity.Property(e => e.userId)
                    .HasColumnName("userID")
                    .HasMaxLength(100);

The database column name is userId, while the C# code refers to it as userID.
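The symptom is consistent with an ordinal (case-sensitive) string lookup somewhere in the mapping; a plain dictionary shows the difference. This illustrates the failure mode only, not the library's actual code.

```csharp
using System;
using System.Collections.Generic;

var mapping = new Dictionary<string, string>(StringComparer.Ordinal)
{
    ["userId"] = "nvarchar(100)" // name as the database reports it
};

// C# side asks for "userID" - an ordinal comparison misses it:
Console.WriteLine(mapping.ContainsKey("userID")); // False

// A case-insensitive comparer would tolerate the mismatch:
var relaxed = new Dictionary<string, string>(mapping, StringComparer.OrdinalIgnoreCase);
Console.WriteLine(relaxed.ContainsKey("userID")); // True
```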

Nuget 2.0.9 'reference not set to an instance' Question, and DbSchema keyword support [SquareBrackets]

I am trying to bulk insert some data according to the example given here, but it throws an exception that I can't trace.

Here is the code

using(var tr = await _context.Database.BeginTransactionAsync())
            {
                await _context.BulkInsertAsync(model);
                await _context.SaveChangesAsync();
                tr.Commit();
            }

It throws the exception on _context.BulkInsertAsync.

Below is the stack trace

{System.NullReferenceException: Object reference not set to an instance of an object.

at EFCore.BulkExtensions.TableInfo.<LoadData>b__65_2[T](IProperty a)
 at System.Linq.Enumerable.Any[TSource](IEnumerable`1 source, Func`2 predicate)
 at EFCore.BulkExtensions.TableInfo.LoadData[T](DbContext context, Boolean loadOnlyPKColumn)
 at EFCore.BulkExtensions.TableInfo.CreateInstance[T](DbContext context, IList`1 entities, OperationType operationType, BulkConfig bulkConfig)
 at EFCore.BulkExtensions.DbContextBulkTransaction.ExecuteAsync[T](DbContext context, IList`1 entities, OperationType operationType, BulkConfig bulkConfig, Action`1 progress)
 at EFCore.BulkExtensions.DbContextBulkExtensions.BulkInsertAsync[T](DbContext context, IList`1 entities, BulkConfig bulkConfig, Action`1 progress)
 at AccountService.<model>d__17.MoveNext() in C:\Project\AccountService.cs:line 303
--- End of stack trace from previous location where exception was thrown ---

Thanks

Support for SQLite?

Hello,

Is there any support for SQLite? If not, is there a plan to add it in the future, and when?

Kind regards

Opened connections are not closed properly

If you try around 200-300 insertions, you will fail with an exception: cannot open a new connection.

It can be fixed easily: wrap sqlBulkCopy in a using block in the SqlBulkOperation.cs Insert method:

public static void Insert<T>(DbContext context, IList<T> entities, TableInfo tableInfo, Action<double> progress = null, int batchSize = 2000)
        {
            var sqlBulkCopy = new SqlBulkCopy(context.Database.GetDbConnection().ConnectionString)
            {
                DestinationTableName = tableInfo.InsertToTempTable ? tableInfo.FullTempTableName : tableInfo.FullTableName,
                BatchSize = batchSize,
                NotifyAfter = batchSize
            };

            using (sqlBulkCopy) // add using here
            {
                sqlBulkCopy.SqlRowsCopied += (sender, e) => { progress?.Invoke(e.RowsCopied / entities.Count); };

                foreach (var element in tableInfo.PropertyColumnNamesDict)
                {
                    sqlBulkCopy.ColumnMappings.Add(element.Key, element.Value);
                }
                using (var reader = ObjectReader.Create(entities, tableInfo.PropertyColumnNamesDict.Keys.ToArray()))
                {
                    sqlBulkCopy.WriteToServer(reader);
                }
            }
        }

Fix here: commit
