mikependon / repodb
A hybrid ORM library for .NET.
License: Apache License 2.0
A "conversion of CacheItem to IEnumerable" exception is thrown if a cache key is passed to repository.Query.
Please create a unit test and an integration test for this one.
As requested by one of our senior developers, please support array parameters in raw SQL statements.
Please see below.
var param = new[] { 1, 2, 3, 4 };
using (var connection = new SqlConnection(connectionString))
{
    var people = connection.ExecuteQuery<Person>("SELECT * FROM [dbo].[Person] WHERE Id IN (@Ids);", new { Ids = param });
}
When the Map attribute is defined at the class level with [dbo].[...] formatting, an exception is thrown if the PrimaryKey attribute and the Id field are not defined.
To avoid type coupling, the base class DataEntity must be removed.
As analyzed, please remove the multi-mapping support. Very few scenarios in actual implementations use this feature.
Captured by the Unit Testing - CreateMergeTest.TestCreateMergeWithoutMappings
If the data entity class does not have a PrimaryKey, passing a "where" parameter throws a "PrimaryKeyNotFoundException".
When parameters are created, we rely on the ADO.NET facility to derive the right database types. In cases where a type is not supported, we can use TypeMap to support it. This works so far, but it may not be the best approach for the following reasons:
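As a hedged illustration of the current workaround (assuming the TypeMap attribute accepts a System.Data.DbType; the entity and property names below are hypothetical):

```csharp
using System;
using System.Data;
using RepoDb;
using RepoDb.Attributes;

// Sketch only: assumes TypeMap takes a DbType argument, overriding
// whatever DbType ADO.NET would otherwise derive for the property.
public class Person : DataEntity
{
    public int Id { get; set; }

    // Hypothetical mapping: force the parameter to be sent as DATETIME2.
    [TypeMap(DbType.DateTime2)]
    public DateTime CreatedUtc { get; set; }
}
```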
Using Query works fine, but for BatchQuery the MapAttribute is ignored.
Here is a simple example:
CREATE TABLE [dbo].[Test](
[Id] [int] NOT NULL,
[WrongName] [nvarchar](50) NOT NULL
) ON [PRIMARY]
GO
public class Test : DataEntity
{
    public int Id { get; set; }

    [Map("WrongName")]
    public string RightName { get; set; }
}
using (var con = new SqlConnection(ConnectionString).EnsureOpen())
{
    con.Query<Test>(); // works fine

    // SqlException: Invalid column name 'RightName'.
    con.BatchQuery<Test>(0, 2, OrderField.Parse(new { Id = Order.Descending }));
}
The generated SQL is:
WITH CTE
AS (
SELECT ROW_NUMBER() OVER (
ORDER BY [Id] DESC
) AS [RowNumber],
[Id],
[WrongName] -- correct
FROM [Test]
)
SELECT [Id], [RightName] -- not correct
FROM CTE
WHERE ( [RowNumber] BETWEEN 1 AND 2 )
ORDER BY [Id] DESC
The problem is in SqlDbStatementBuilder.cs, line 61, which has .FieldsFrom(fields); the problem stops occurring when that line is replaced with .FieldsFrom(Command.BatchQuery).
I'd send a PR, but given the lack of tests and my own lack of deeper understanding, I don't know whether this would introduce further bugs or incur performance penalties.
This is already implemented in ExecuteReaderInternal, as requested by a team member, but it would be good to support this feature in the Scalar and NonQuery methods as well.
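If the array-parameter support requested earlier were extended to the Scalar and NonQuery methods, usage would presumably look like the sketch below (the parameter expansion behavior is an assumption, and the connection string is a placeholder):

```csharp
using System.Data.SqlClient;

// Placeholder connection string for illustration only.
var connectionString = "Server=.;Database=TestDb;Integrated Security=SSPI;";
var ids = new[] { 1, 2, 3, 4 };

using (var connection = new SqlConnection(connectionString))
{
    // Sketch: assumes @Ids would be expanded the same way as in ExecuteQuery.
    var count = connection.ExecuteScalar(
        "SELECT COUNT(*) FROM [dbo].[Person] WHERE Id IN (@Ids);",
        new { Ids = ids });

    var deleted = connection.ExecuteNonQuery(
        "DELETE FROM [dbo].[Person] WHERE Id IN (@Ids);",
        new { Ids = ids });
}
```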
An exception is thrown.
Additional information: Object cannot be cast from DBNull to other types.
IL Reflection Emitting Problem.
To support simplified expression-based coding when calling certain operations that require a Field, kindly support the following.
RepoDb.Field.From<Person>(p => p.Name);
Usable in the InlineMerge, Merge, and other operations.
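A minimal sketch of how such an expression-based factory could resolve the member name (the names and structure here are illustrative, not the actual RepoDb implementation):

```csharp
using System;
using System.Linq.Expressions;

public static class FieldFactory
{
    // Extracts the property name from an expression like p => p.Name.
    public static string From<T>(Expression<Func<T, object>> expression)
    {
        var body = expression.Body;

        // Value-type members arrive wrapped in a Convert node.
        if (body is UnaryExpression unary)
        {
            body = unary.Operand;
        }

        if (body is MemberExpression member)
        {
            return member.Member.Name;
        }

        throw new ArgumentException("Expression must be a member access.", nameof(expression));
    }
}

public class Person
{
    public int Id { get; set; }
    public string Name { get; set; }
}
```

FieldFactory.From<Person>(p => p.Name) then yields "Name", and the Convert-unwrapping branch makes p => p.Id (a value type boxed to object) yield "Id" as well.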
Merge failed when the PK column was not passed in the dataset. It used to work but now fails after the latest fixes in master.
Tests are available in my fork: https://github.com/rdagumampan/RepoDb.
Open the
[Test]
public void TestMergeInsert()
{
//arrange
var repository = new DbRepository<SqlConnection>(Constants.TestDatabase);
var fixtureData = new Customer
{
GlobalId = Guid.NewGuid(),
FirstName = "Juan-MERGED",
LastName = "de la Cruz-MERGED",
MiddleName = "Pinto-MERGED",
Address = "San Lorenzo, Makati, Philippines 4225-MERGED",
IsActive = true,
Email = "[email protected]",
LastUpdatedUtc = DateTime.UtcNow,
LastUserId = Environment.UserName
};
//act
repository.Merge(fixtureData);
//assert
var customer = repository.Query<Customer>(new { fixtureData.GlobalId }).FirstOrDefault();
customer.ShouldNotBeNull();
customer.Id.ShouldNotBe(0);
customer.GlobalId.ShouldBe(fixtureData.GlobalId);
customer.FirstName.ShouldBe(fixtureData.FirstName);
customer.LastName.ShouldBe(fixtureData.LastName);
customer.MiddleName.ShouldBe(fixtureData.MiddleName);
customer.Address.ShouldBe(fixtureData.Address);
customer.Email.ShouldBe(fixtureData.Email);
customer.IsActive.ShouldBe(fixtureData.IsActive);
customer.LastUpdatedUtc.ShouldBe(fixtureData.LastUpdatedUtc);
customer.LastUserId.ShouldBe(fixtureData.LastUserId);
}
System.Data.SqlClient.SqlException : Must declare the scalar variable "@Id".
at System.Data.SqlClient.SqlConnection.OnError(SqlException exception, Boolean breakConnection, Action`1 wrapCloseInAction)
at System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning(TdsParserStateObject stateObj, Boolean callerHasConnectionLock, Boolean asyncClose)
@rdagumampan - integration test scenarios captured this.
Please add support for this, or state whether it will be supported in the next release.
An exception is thrown when running multiple operations in parallel using Task.Factory.StartNew.
at System.Collections.Generic.Dictionary`2.Insert(TKey key, TValue value, Boolean add)
at System.Collections.Generic.Dictionary`2.Add(TKey key, TValue value)
at RepoDb.PropertyCache.Get[TEntity](Command command)
at RepoDb.PropertyMapNameCache.Get[TEntity](Command command)
at RepoDb.QueryBuilder`1.Fields(Command command)
at RepoDb.SqlDbStatementBuilder.CreateQuery[TEntity](QueryBuilder`1 queryBuilder, QueryGroup where, Nullable`1 top, IEnumerable`1 orderBy)
at RepoDb.DbRepository`1.Query[TEntity](QueryGroup where, IDbTransaction transaction, Nullable`1 top, IEnumerable`1 orderBy, String cacheKey)
at RepoDb.DbRepository`1.Query[TEntity](Object where, IDbTransaction transaction, Nullable`1 top, IEnumerable`1 orderBy, String cacheKey)
at LocalVisitLogHistoricalDataProcessor.Program.ProcessClassifications() in C:\Users\MIPEN\Desktop\LocalVisitLogHistoricalDataProcessor\LocalVisitLogHistoricalDataProcessor\Program.cs:line 119
at System.Threading.Tasks.Task.InnerInvoke()
at System.Threading.Tasks.Task.Execute()
Once the GetHashCode, Equals, IEquatable, and operator (== / !=) overrides have been introduced, this feature optimization can commence.
The performance benchmark result is gone.
Hi @rdagumampan,
Do you have the result in your local forked project? Can you place it here? https://github.com/RepoDb/RawDataAccessBencher/tree/master/Results
Thank you very much.
Best regards,
Michael
To support simplified expression-based coding when calling certain operations that require an OrderField, kindly support the following.
RepoDb.OrderField.Ascending<Person>(p => p.Name);
Usable in the Query operations.
When executing repository.Query, the resulting query uses TOP instead of the table's PK.
These two approaches return two different rows when we expect the same row.
var repository = new DbRepository<SqlConnection>(Constants.TestDatabase);
var customer = repository.Query<Customer>(1).FirstOrDefault();
var customer = repository.Query<Customer>(new { Id = 1 }).FirstOrDefault();
[Map("DatabaseCheckActivity")]
public class DatabaseCheckActivityAllocation
{
public Guid Id { get; set; }
public long LargePageAllocationsInKb { get; set; }
public long LockedPageAllocationsInKb { get; set; }
public long TotalVirtualAddressSpaceInKb { get; set; }
public long VirtualAddressSpaceReservedInKb { get; set; }
public long VirtualAddressSpaceCommittedInKb { get; set; }
public long VirtualAddressSpaceAvailableInKb { get; set; }
}
var allocation = _serviceMonitorRepository
.Query<WebModels.DatabaseCheckActivityAllocation>(new { Id = databaseCheckId })
.Select(databaseCheckActivity => Convert(databaseCheckActivity))
.FirstOrDefault();
Result: The allocation.Id is an Empty Guid
An exception is thrown when using Operation.In with more than two values.
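For context, a repro presumably looks like the dynamic query syntax below; the object shape (Operation/Value) is an assumption about the library's API, and the connection string is a placeholder:

```csharp
using System.Data.SqlClient;

// Placeholder connection string for illustration only.
var repository = new DbRepository<SqlConnection>("Server=.;Database=TestDb;Integrated Security=SSPI;");

// Fails once the Value array carries more than two items.
var customers = repository.Query<Customer>(new
{
    Id = new { Operation = Operation.In, Value = new[] { 10045, 10046, 10047 } }
});
```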
If any of the properties are Nullable within the DataEntity class, the Reflection Emitter fails to submit the correct data.
Found at v1.0.14
Frans:
be worth looking into compiling a generated expression tree which accepts a datareader and results in a new, materialized poco instance
:)
You can use the results, it was a release build run. As your set fetch pipeline is a bit behind the pack, it might be worth looking into compiling a generated expression tree which accepts a datareader and results in a new, materialized poco instance. So generate something like:
r => new p() { Field1 = r.GetInt32(0), Field2 = r.GetString(1), ..., Fieldn = r.GetInt32(n-1) } where the type of the get method called of course depends on the type of the field at ordinal n. For fields that are nullable, you have to check for nulls of course, so you then have to generate something like Field1 = r.IsDBNull(0) ? null : r.GetInt32(0); it's up to you of course to determine which fields are null and whether you need to do actual testing.
Linq to DB (and I suspect Tortuga does too) checks the resultset's schema table prior to consuming the resultset and, based on that, skips null checks for some fields which are actually nullable (if the metadata says no nulls are expected, you can skip the check). This is the reason Linq to DB is ahead of the pack (and I suspect Tortuga is too; there's no other way to get this close to the handwritten materializer in set fetches). It's unfortunately not reliable in many cases, so for this benchmark it might work, but in general cases it might not, hence I skipped implementing it in LLBLGen Pro (and therefore have to settle for a spot in the middle ;))
The generated expression tree represents a LambdaExpression, which is then compiled using its Compile() method and of course cached so you don't compile it again with the next query.
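The approach Frans describes can be sketched as follows: build a LambdaExpression over a DbDataReader, compile it once, and cache the delegate. This is a simplified, self-contained sketch for a hypothetical two-column Poco, not the library's actual emitter:

```csharp
using System;
using System.Data.Common;
using System.Linq.Expressions;

public class Poco
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public static class Materializer
{
    // Builds r => new Poco { Id = r.GetInt32(0),
    //                        Name = r.IsDBNull(1) ? null : r.GetString(1) }
    public static Func<DbDataReader, Poco> Build()
    {
        var reader = Expression.Parameter(typeof(DbDataReader), "r");

        var id = Expression.Call(reader, nameof(DbDataReader.GetInt32), null, Expression.Constant(0));

        // Nullable columns need an IsDBNull guard before the typed getter.
        var name = Expression.Condition(
            Expression.Call(reader, nameof(DbDataReader.IsDBNull), null, Expression.Constant(1)),
            Expression.Constant(null, typeof(string)),
            Expression.Call(reader, nameof(DbDataReader.GetString), null, Expression.Constant(1)));

        var body = Expression.MemberInit(
            Expression.New(typeof(Poco)),
            Expression.Bind(typeof(Poco).GetProperty(nameof(Poco.Id)), id),
            Expression.Bind(typeof(Poco).GetProperty(nameof(Poco.Name)), name));

        // Compile once; the caller caches the delegate for subsequent queries.
        return Expression.Lambda<Func<DbDataReader, Poco>>(body, reader).Compile();
    }
}
```

Build() would be invoked once per entity type and the compiled delegate cached, so the compilation cost is paid only on the first query.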
Tests show that BLOB types are currently not supported. The IL generator used to fast-read the data reader emits empty byte arrays for columns of type Image, VarBinary, or VarBinary(MAX). Insert works fine, but the repository cannot read the byte arrays back.
Integration tests are available here: https://github.com/rdagumampan/RepoDb
[Test]
public void BlobTypeMap()
{
//arrange
var resourceName = "RepoDb.IntegrationTests.Setup.hello-world.png";
var baseByteData = ExtractResource(resourceName);
var fixtureData = new Models.TypeMapBlob
{
image_column = baseByteData,
varbinary_column = baseByteData,
varbinarymax_column = baseByteData
};
//act
var sut = new DbRepository<SqlConnection>(Constants.TestDatabase);
var returnedId = sut.Insert(fixtureData);
//TODO: support guid primary key
//assert
var saveData = sut.Query<Models.TypeMapBlob>(top: 1).FirstOrDefault();
saveData.ShouldNotBeNull();
saveData.image_column.SequenceEqual(fixtureData.image_column).ShouldBe(true);
saveData.varbinary_column.SequenceEqual(fixtureData.varbinary_column).ShouldBe(true);
saveData.varbinarymax_column.SequenceEqual(fixtureData.varbinarymax_column).ShouldBe(true);
}
System.EntryPointNotFoundException : Entry point was not found.
at System.Collections.Generic.IEnumerable`1.GetEnumerator()
at System.Linq.Enumerable.SequenceEqual[TSource](IEnumerable`1 first, IEnumerable`1 second, IEqualityComparer`1 comparer)
at RepoDb.IntegrationTests.TestBlobDataTypeMapping.BlobTypeMap() in C:\play\_forked\RepoDb\RepoDb\RepoDb.Tests\RepoDb.IntegrationTests\TestBlobDataTypeMapping.cs:line 72
Frans:
check the resultset's schema table prior to consuming the resultset and based on that skip null checks for some fields which are actually nullable (if the meta-data says no nulls are expected, you could skip the check)
Intermittent: the exception below is thrown when repository.Query is called.
at System.ThrowHelper.ThrowArgumentException(ExceptionResource resource)
at System.Collections.Generic.Dictionary`2.Insert(TKey key, TValue value, Boolean add)
at System.Collections.Generic.Dictionary`2.Add(TKey key, TValue value)
at RepoDb.Reflection.TypeCache.Get(TypeTypes type)
at RepoDb.Reflection.ReflectionFactory.CreateTypes(TypeTypes[] types)
at RepoDb.Reflection.TypeArrayCache.Get(TypeTypes[] type)
at RepoDb.Reflection.DelegateFactory.GetDataReaderToDataEntityDelegate[TEntity](DbDataReader reader)
at RepoDb.Reflection.DelegateCache.GetDataReaderToDataEntityDelegate[TEntity](DbDataReader reader)
at RepoDb.Reflection.DataReaderConverter.ToEnumerable[TEntity](DbDataReader reader)
at RepoDb.Extensions.DbConnectionExtension.ExecuteQuery[TEntity](IDbConnection connection, String commandText, Object param, Nullable`1 commandType, Nullable`1 commandTimeout, IDbTransaction transaction, ITrace trace)
at RepoDb.DbRepository`1.ExecuteQuery[TEntity](String commandText, Object param, Nullable`1 commandType, Nullable`1 commandTimeout, IDbTransaction transaction)
at RepoDb.DbRepository`1.Query[TEntity](QueryGroup where, IDbTransaction transaction, Nullable`1 top, IEnumerable`1 orderBy, String cacheKey)
at RepoDb.DbRepository`1.Query[TEntity](Object where, IDbTransaction transaction, Nullable`1 top, IEnumerable`1 orderBy, String cacheKey)
at LocalVisitLogHistoricalDataProcessor.Program.ProcessClassifications() in C:\Users\MIPEN\Desktop\LocalVisitLogHistoricalDataProcessor\LocalVisitLogHistoricalDataProcessor\Program.cs:line 111
at System.Threading.Tasks.Task.InnerInvoke()
at System.Threading.Tasks.Task.Execute()
var queryContext = new { StartDateTimeUtc = (object)null };
repository.Query(queryContext);
I would expect a "SELECT ... WHERE StartDateTimeUtc IS NULL;".
When the query is executed, the parameter builder still processes properties that are marked with Ignore. In this case, DateInsertedUtc must be ignored when building the parameter bucket. If the property is not initialized, it is set to its default value.
[Map("[dbo].[Customer]")]
public class Customer : DataEntity
{
[Attributes.Ignore(Command.Insert | Command.Update | Command.Merge | Command.InlineUpdate), Primary(true), Map("Id")]
public int Id { get; set; }
public Guid GlobalId { get; set; }
public string FirstName { get; set; }
public string MiddleName { get; set; }
public string LastName { get; set; }
public string Address { get; set; }
public string Email { get; set; }
public bool IsActive { get; set; }
public DateTime LastUpdatedUtc { get; set; }
[Attributes.Ignore(Command.Update | Command.Insert)]
public DateTime DateInsertedUtc { get; set; }
public string LastUserId { get; set; }
}
[Test]
public void TestInsert()
{
//arrange
var repository = new DbRepository<SqlConnection>(RepoDbConnectionString, 0);
var fixtureData = new Customer
{
GlobalId = Guid.NewGuid(),
FirstName = "Juan",
LastName = "de la Cruz",
MiddleName = "Pinto",
Address = "San Lorenzo, Makati, Philippines 4225",
IsActive = true,
Email = "[email protected]",
//DateInsertedUtc = DateTime.UtcNow, COMMENTED FOR TESTING
LastUpdatedUtc = DateTime.UtcNow,
LastUserId = Environment.UserName
};
//act
repository.Insert(fixtureData);
//assert
var customer = repository.Query<Customer>().FirstOrDefault();
customer.ShouldNotBeNull();
customer.Id.ShouldNotBe(0);
customer.GlobalId.ShouldBe(fixtureData.GlobalId);
customer.FirstName.ShouldBe(fixtureData.FirstName);
customer.LastName.ShouldBe(fixtureData.LastName);
customer.MiddleName.ShouldBe(fixtureData.MiddleName);
customer.Address.ShouldBe(fixtureData.Address);
customer.Email.ShouldBe(fixtureData.Email);
customer.IsActive.ShouldBe(fixtureData.IsActive);
customer.DateInsertedUtc.ShouldBe(fixtureData.DateInsertedUtc);
customer.LastUpdatedUtc.ShouldBe(fixtureData.LastUpdatedUtc);
customer.LastUserId.ShouldBe(fixtureData.LastUserId);
}
System.Data.SqlTypes.SqlTypeException : SqlDateTime overflow. Must be between 1/1/1753 12:00:00 AM and 12/31/9999 11:59:59 PM.
at System.Data.SqlClient.TdsParser.TdsExecuteRPC(SqlCommand cmd, _SqlRPC[] rpcArray, Int32 timeout, Boolean inSchema, SqlNotificationRequest notificationRequest, TdsParserStateObject stateObj, Boolean isCommandProc, Boolean sync, TaskCompletionSource`1 completion, Int32 startRpc, Int32 startParam)
at System.Data.SqlClient.SqlCommand.RunExecuteReaderTds(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, Boolean async, Int32 timeout, Task& task, Boolean asyncWrite, Boolean inRetry, SqlDataReader ds, Boolean describeParameterEncryptionRequest)
at System.Data.SqlClient.SqlCommand.RunExecuteReader(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, String method, TaskCompletionSource`1 completion, Int32 timeout, Task& task, Boolean& usedCache, Boolean asyncWrite, Boolean inRetry)
at System.Data.SqlClient.SqlCommand.RunExecuteReader(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, String method)
at System.Data.SqlClient.SqlCommand.ExecuteScalar()
at RepoDb.Extensions.DbConnectionExtension.ExecuteScalar(IDbConnection connection, String commandText, Object param, Nullable`1 commandType, Nullable`1 commandTimeout, IDbTransaction transaction, ITrace trace) in C:\play\RepoDb\RepoDb\Extensions\DbConnectionExtension.cs:line 498
at RepoDb.DbRepository`1.ExecuteScalar(String commandText, Object param, Nullable`1 commandType, Nullable`1 commandTimeout, IDbTransaction transaction) in C:\play\RepoDb\RepoDb\DbRepository.cs:line 2058
at RepoDb.DbRepository`1.Insert[TEntity](TEntity entity, IDbTransaction transaction) in C:\play\RepoDb\RepoDb\DbRepository.cs:line 1094
at RepoDb.IntegrationTests.TestBasicCrud.TestInsert() in C:\play\RepoDb\RepoDb.IntegrationTests\TestBasicCrud.cs:line 38