Spring Data ArangoDB
Home Page: https://www.arangodb.com/docs/stable/drivers/spring-data-getting-started.html
License: Apache License 2.0
I would like to suggest the automatic creation of edges, via an annotation like @Link on a field:
class Athlete {
}

class Meeting {
    @Link("EventInMeeting")
    Set<Event> events;
}

class Event {
    @Link("AthleteInEvent")
    Set<Athlete> athletes;
}
In these cases, edge collections named EventInMeeting and AthleteInEvent could be created automatically. It might also make sense to add an indicator for from and to when a field on the other side is annotated as well; by default, the annotated side would be the from side. The annotation could also accept a class instead of a string, in which case the class is expected to be annotated with @Edge.
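For illustration, a minimal sketch of what such a hypothetical @Link annotation could look like. The annotation, its direction attribute, and the helper method are assumptions made for this proposal, not part of the current arangodb-spring-data API:

```java
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import java.util.Set;

public class LinkSketch {

    /** Hypothetical direction indicator for the annotated side. */
    public enum Direction { FROM, TO }

    /** Hypothetical annotation; not part of arangodb-spring-data. */
    @Retention(RetentionPolicy.RUNTIME)
    @Target(ElementType.FIELD)
    public @interface Link {
        /** Name of the edge collection to create automatically. */
        String value();
        /** Which side of the edge the annotated field represents; FROM by default. */
        Direction direction() default Direction.FROM;
    }

    static class Event { }

    static class Meeting {
        @Link("EventInMeeting")
        Set<Event> events;
    }

    /** Reads the edge collection name a mapping layer could use. */
    public static String edgeCollectionOf(Class<?> owner, String fieldName) {
        try {
            Link link = owner.getDeclaredField(fieldName).getAnnotation(Link.class);
            return link == null ? null : link.value();
        } catch (NoSuchFieldException e) {
            return null;
        }
    }
}
```

A mapping layer could scan entity classes for this annotation at startup and create the named edge collections before any documents are saved.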
It would be a good idea to document that a no-arg constructor is required for Document mapping.[1.1][1.2] If, similar to Hibernate, overriding of equals and/or hashCode is needed, it would be nice to have that documented as well.[2]
[1.1] Similar to 4.1.1. in
https://docs.jboss.org/hibernate/stable/core.old/reference/en/html/persistent-classes.html
[1.2] If there's no no-arg constructor in the presence of other constructors, a fairly cryptic exception is thrown which took a while to resolve:
org.springframework.data.mapping.MappingException: Parameter org.springframework.data.mapping.PreferredConstructor$Parameter@a6452387 does not have a name!
at org.springframework.data.mapping.model.PersistentEntityParameterValueProvider.getParameterValue(PersistentEntityParameterValueProvider.java:61)
at org.springframework.data.convert.ClassGeneratingEntityInstantiator$EntityInstantiatorAdapter.extractInvocationArguments(ClassGeneratingEntityInstantiator.java:248)
at org.springframework.data.convert.ClassGeneratingEntityInstantiator$EntityInstantiatorAdapter.createInstance(ClassGeneratingEntityInstantiator.java:221)
at org.springframework.data.convert.ClassGeneratingEntityInstantiator.createInstance(ClassGeneratingEntityInstantiator.java:86)
at com.arangodb.springframework.core.convert.DefaultArangoConverter.read(DefaultArangoConverter.java:171)
at com.arangodb.springframework.core.convert.DefaultArangoConverter.read(DefaultArangoConverter.java:111)
at com.arangodb.springframework.core.convert.DefaultArangoConverter.read(DefaultArangoConverter.java:93)
at com.arangodb.springframework.core.convert.DefaultArangoConverter.read(DefaultArangoConverter.java:63)
at com.arangodb.springframework.core.template.ArangoTemplate.fromDBEntity(ArangoTemplate.java:288)
at com.arangodb.springframework.core.template.ArangoTemplate.find(ArangoTemplate.java:448)
at com.arangodb.springframework.core.template.ArangoTemplate.find(ArangoTemplate.java:456)
at com.arangodb.springframework.core.convert.resolver.RefResolver.resolve(RefResolver.java:55)
at com.arangodb.springframework.core.convert.resolver.RefResolver.resolveOne(RefResolver.java:45)
at com.arangodb.springframework.core.convert.resolver.RefResolver.resolveOne(RefResolver.java:33)
at com.arangodb.springframework.core.convert.DefaultArangoConverter.lambda$readReference$8(DefaultArangoConverter.java:268)
at java.util.Optional.flatMap(Optional.java:241)
at com.arangodb.springframework.core.convert.DefaultArangoConverter.readReference(DefaultArangoConverter.java:251)
at com.arangodb.springframework.core.convert.DefaultArangoConverter.lambda$readReferenceOrRelation$6(DefaultArangoConverter.java:229)
at java.util.Optional.flatMap(Optional.java:241)
at com.arangodb.springframework.core.convert.DefaultArangoConverter.readReferenceOrRelation(DefaultArangoConverter.java:229)
at com.arangodb.springframework.core.convert.DefaultArangoConverter.readProperty(DefaultArangoConverter.java:216)
at com.arangodb.springframework.core.convert.DefaultArangoConverter.lambda$read$4(DefaultArangoConverter.java:180)
at org.springframework.data.mapping.model.BasicPersistentEntity.doWithAssociations(BasicPersistentEntity.java:355)
at com.arangodb.springframework.core.convert.DefaultArangoConverter.read(DefaultArangoConverter.java:178)
at com.arangodb.springframework.core.convert.DefaultArangoConverter.read(DefaultArangoConverter.java:111)
at com.arangodb.springframework.core.convert.DefaultArangoConverter.read(DefaultArangoConverter.java:93)
at com.arangodb.springframework.core.convert.DefaultArangoConverter.read(DefaultArangoConverter.java:63)
at com.arangodb.springframework.core.template.ArangoTemplate.fromDBEntity(ArangoTemplate.java:288)
at com.arangodb.springframework.core.template.ArangoTemplate.find(ArangoTemplate.java:448)
at com.arangodb.springframework.core.template.ArangoTemplate.find(ArangoTemplate.java:456)
at com.arangodb.springframework.repository.SimpleArangoRepository.findById(SimpleArangoRepository.java:109)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.springframework.data.repository.core.support.RepositoryComposition$RepositoryFragments.invoke(RepositoryComposition.java:377)
at org.springframework.data.repository.core.support.RepositoryComposition.invoke(RepositoryComposition.java:200)
at org.springframework.data.repository.core.support.RepositoryFactorySupport$ImplementationMethodExecutionInterceptor.invoke(RepositoryFactorySupport.java:636)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:185)
at org.springframework.data.repository.core.support.RepositoryFactorySupport$QueryExecutorMethodInterceptor.doInvoke(RepositoryFactorySupport.java:600)
at org.springframework.data.repository.core.support.RepositoryFactorySupport$QueryExecutorMethodInterceptor.invoke(RepositoryFactorySupport.java:580)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:185)
at org.springframework.data.projection.DefaultMethodInvokingMethodInterceptor.invoke(DefaultMethodInvokingMethodInterceptor.java:59)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:185)
at org.springframework.aop.interceptor.ExposeInvocationInterceptor.invoke(ExposeInvocationInterceptor.java:92)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:185)
at org.springframework.data.repository.core.support.SurroundingTransactionDetectorMethodInterceptor.invoke(SurroundingTransactionDetectorMethodInterceptor.java:61)
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:185)
at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:212)
at com.sun.proxy.$Proxy63.findById(Unknown Source)
at someJUnitIT.testSaveDelegate(UserRepositoryJUnitIT.java:78)
at someJUnitIT.testSaveWithAddress(UserRepositoryJUnitIT.java:67)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.junit.platform.commons.util.ReflectionUtils.invokeMethod(ReflectionUtils.java:389)
at org.junit.jupiter.engine.execution.ExecutableInvoker.invoke(ExecutableInvoker.java:115)
at org.junit.jupiter.engine.descriptor.TestMethodTestDescriptor.lambda$invokeTestMethod$6(TestMethodTestDescriptor.java:167)
at org.junit.jupiter.engine.execution.ThrowableCollector.execute(ThrowableCollector.java:40)
at org.junit.jupiter.engine.descriptor.TestMethodTestDescriptor.invokeTestMethod(TestMethodTestDescriptor.java:163)
at org.junit.jupiter.engine.descriptor.TestMethodTestDescriptor.execute(TestMethodTestDescriptor.java:110)
at org.junit.jupiter.engine.descriptor.TestMethodTestDescriptor.execute(TestMethodTestDescriptor.java:57)
at org.junit.platform.engine.support.hierarchical.HierarchicalTestExecutor.lambda$execute$3(HierarchicalTestExecutor.java:83)
at org.junit.platform.engine.support.hierarchical.SingleTestExecutor.executeSafely(SingleTestExecutor.java:66)
at org.junit.platform.engine.support.hierarchical.HierarchicalTestExecutor.execute(HierarchicalTestExecutor.java:77)
at org.junit.platform.engine.support.hierarchical.HierarchicalTestExecutor.lambda$null$2(HierarchicalTestExecutor.java:92)
at java.util.stream.ForEachOps$ForEachOp$OfRef.accept(ForEachOps.java:184)
at java.util.stream.ReferencePipeline$2$1.accept(ReferencePipeline.java:175)
at java.util.Iterator.forEachRemaining(Iterator.java:116)
at java.util.Spliterators$IteratorSpliterator.forEachRemaining(Spliterators.java:1801)
at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:481)
at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:471)
at java.util.stream.ForEachOps$ForEachOp.evaluateSequential(ForEachOps.java:151)
at java.util.stream.ForEachOps$ForEachOp$OfRef.evaluateSequential(ForEachOps.java:174)
at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
at java.util.stream.ReferencePipeline.forEach(ReferencePipeline.java:418)
at org.junit.platform.engine.support.hierarchical.HierarchicalTestExecutor.lambda$execute$3(HierarchicalTestExecutor.java:92)
at org.junit.platform.engine.support.hierarchical.SingleTestExecutor.executeSafely(SingleTestExecutor.java:66)
at org.junit.platform.engine.support.hierarchical.HierarchicalTestExecutor.execute(HierarchicalTestExecutor.java:77)
at org.junit.platform.engine.support.hierarchical.HierarchicalTestExecutor.lambda$null$2(HierarchicalTestExecutor.java:92)
at java.util.stream.ForEachOps$ForEachOp$OfRef.accept(ForEachOps.java:184)
at java.util.stream.ReferencePipeline$2$1.accept(ReferencePipeline.java:175)
at java.util.Iterator.forEachRemaining(Iterator.java:116)
at java.util.Spliterators$IteratorSpliterator.forEachRemaining(Spliterators.java:1801)
at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:481)
at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:471)
at java.util.stream.ForEachOps$ForEachOp.evaluateSequential(ForEachOps.java:151)
at java.util.stream.ForEachOps$ForEachOp$OfRef.evaluateSequential(ForEachOps.java:174)
at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
at java.util.stream.ReferencePipeline.forEach(ReferencePipeline.java:418)
at org.junit.platform.engine.support.hierarchical.HierarchicalTestExecutor.lambda$execute$3(HierarchicalTestExecutor.java:92)
at org.junit.platform.engine.support.hierarchical.SingleTestExecutor.executeSafely(SingleTestExecutor.java:66)
at org.junit.platform.engine.support.hierarchical.HierarchicalTestExecutor.execute(HierarchicalTestExecutor.java:77)
at org.junit.platform.engine.support.hierarchical.HierarchicalTestExecutor.execute(HierarchicalTestExecutor.java:51)
at org.junit.platform.engine.support.hierarchical.HierarchicalTestEngine.execute(HierarchicalTestEngine.java:43)
at org.junit.platform.launcher.core.DefaultLauncher.execute(DefaultLauncher.java:170)
at org.junit.platform.launcher.core.DefaultLauncher.execute(DefaultLauncher.java:154)
at org.junit.platform.launcher.core.DefaultLauncher.execute(DefaultLauncher.java:90)
at org.eclipse.jdt.internal.junit5.runner.JUnit5TestReference.run(JUnit5TestReference.java:86)
at org.eclipse.jdt.internal.junit.runner.TestExecution.run(TestExecution.java:38)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:538)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.runTests(RemoteTestRunner.java:760)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.run(RemoteTestRunner.java:460)
at org.eclipse.jdt.internal.junit.runner.RemoteTestRunner.main(RemoteTestRunner.java:206)
[2] https://docs.jboss.org/hibernate/stable/core.old/reference/en/html/persistent-classes-equalshashcode.html
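To illustrate footnote [1.2]: once any parameterized constructor is declared, the implicit default constructor disappears, so an explicit no-arg constructor has to be added back. A minimal example (the class and property names below are made up for illustration):

```java
public class User {

    private String name;

    // Without this explicit no-arg constructor, the implicit default
    // constructor disappears as soon as the parameterized one below is
    // declared, and document mapping fails with the cryptic
    // MappingException shown above.
    public User() {
    }

    public User(String name) {
        this.name = name;
    }

    public String getName() {
        return name;
    }
}
```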
I use this code to generate a repository just for some queries which are supposed to be generic, since I decide the query at run time and building a repository class for each collection is a tedious and long process. I tried an example to see if I get a response back, as I wasn't sure whether you can have a repository of Object. The funny part: the query executes and returns data when I debug, but I get a ClassCastException after the query executes.
public interface GenericArangoRepo extends ArangoRepository {
    @Query("FOR pv IN PV return pv")
    Object listPVs();
}
Caused by: java.lang.ClassCastException: Cannot cast com.arangodb.internal.ArangoCursorIterator to com.arangodb.springframework.core.template.ArangoExtCursorIterator
at java.lang.Class.cast(Class.java:3369) ~[na:1.8.0_161]
at com.arangodb.springframework.core.template.ArangoExtCursor.(ArangoExtCursor.java:41) ~[arangodb-spring-data-1.0.0.jar:na]
at com.arangodb.springframework.core.template.ArangoCursorInitializer.createInstance(ArangoCursorInitializer.java:48) ~[arangodb-spring-data-1.0.0.jar:na]
at com.arangodb.ArangoDatabase.createCursor(ArangoDatabase.java:334) ~[arangodb-java-driver-4.2.2.jar:na]
at com.arangodb.ArangoDatabase.query(ArangoDatabase.java:301) ~[arangodb-java-driver-4.2.2.jar:na]
at com.arangodb.springframework.core.template.ArangoTemplate.query(ArangoTemplate.java:314) ~[arangodb-spring-data-1.0.0.jar:na]
at com.arangodb.springframework.repository.query.ArangoAqlQuery.execute(ArangoAqlQuery.java:188) ~[arangodb-spring-data-1.0.0.jar:na]
at org.springframework.data.repository.core.support.RepositoryFactorySupport$QueryExecutorMethodInterceptor.doInvoke(RepositoryFactorySupport.java:499) ~[spring-data-commons-1.13.9.RELEASE.jar:na]
at org.springframework.data.repository.core.support.RepositoryFactorySupport$QueryExecutorMethodInterceptor.invoke(RepositoryFactorySupport.java:477) ~[spring-data-commons-1.13.9.RELEASE.jar:na]
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) ~[spring-aop-4.3.14.RELEASE.jar:4.3.14.RELEASE]
at org.springframework.data.projection.DefaultMethodInvokingMethodInterceptor.invoke(DefaultMethodInvokingMethodInterceptor.java:56) ~[spring-data-commons-1.13.9.RELEASE.jar:na]
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) ~[spring-aop-4.3.14.RELEASE.jar:4.3.14.RELEASE]
at org.springframework.aop.interceptor.ExposeInvocationInterceptor.invoke(ExposeInvocationInterceptor.java:92) ~[spring-aop-4.3.14.RELEASE.jar:4.3.14.RELEASE]
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) ~[spring-aop-4.3.14.RELEASE.jar:4.3.14.RELEASE]
at org.springframework.data.repository.core.support.SurroundingTransactionDetectorMethodInterceptor.invoke(SurroundingTransactionDetectorMethodInterceptor.java:57) ~[spring-data-commons-1.13.9.RELEASE.jar:na]
at org.springframework.aop.framework.ReflectiveMethodInvocation.proceed(ReflectiveMethodInvocation.java:179) ~[spring-aop-4.3.14.RELEASE.jar:4.3.14.RELEASE]
at org.springframework.aop.framework.JdkDynamicAopProxy.invoke(JdkDynamicAopProxy.java:213) ~[spring-aop-4.3.14.RELEASE.jar:4.3.14.RELEASE]
at com.sun.proxy.$Proxy53.bulkCommitItems(Unknown Source) ~[na:na]
at arangoboot.BulkRunner.run(BulkRunner.java:21) ~[classes/:na]
at org.springframework.boot.SpringApplication.callRunner(SpringApplication.java:732) [spring-boot-1.5.10.RELEASE.jar:1.5.10.RELEASE]
... 5 common frames omitted
At present:
And retrieval for 2. & 3. is problematic.
IMHO, persisted values should be as in the following examples:
Until that better future support arrives, there seems to be no choice but to use 1. for now, because (at least for UTC) it roughly looks like 2. would look with better support (though it's a bit confusing: in Java one would have a non-UTC-aware LocalDateTime, which would essentially be persisted as a UTC-based OffsetDateTime).
Hello!
I found an error while deserializing nested maps. See the following class:
public class Test {
    @Id
    private String id;
    @Key
    private String key;
    private final Map<String, Object> attrs = new HashMap<>();

    public void put(String key, Object value) {
        attrs.put(key, value);
    }
    // getters & setters skipped
}
Test test = new Test();
Map<String, Object> innerMap = new HashMap<>();
innerMap.put("key", "value");
test.put("map", innerMap);
arangoOperations.insert(test);
arangoOperations.find(test.getKey(), Test.class).get();
This raises a org.springframework.data.mapping.MappingException: No mapping metadata found for type java.lang.Object at DefaultArangoConverter.java:168.
The exception has its root in the following method of DefaultArangoConverter:
private Object read(final TypeInformation<?> type, final DBEntity source) {
    // ...
    final Optional<? extends ArangoPersistentEntity<?>> entity = Optional
            .ofNullable(context.getPersistentEntity(type.getType()));
    return read(type, source, entity);
}
As a side note: Optional was not intended to be used for method arguments (see here). Also, it would be clearer to raise the exception where it actually happens (at context.getPersistentEntity()). There is also a nice method on Spring Data's MappingContext which raises the exception for you: context.getRequiredPersistentEntity().
To fix the problem with nested maps and lists, we could check whether the type is java.lang.Object and create a map if the source is of type DBDocumentEntity, or a list if the source is of type DBCollectionEntity.
What do you think?
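A self-contained sketch of the dispatch proposed above, using stand-in types of the same names instead of the real driver classes (readObject stands in for the Object-typed branch that DefaultArangoConverter.read could take; none of this is existing API):

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class NestedReadSketch {

    /** Stand-in for the driver's DBDocumentEntity (a document is a map). */
    static class DBDocumentEntity extends HashMap<String, Object> { }

    /** Stand-in for the driver's DBCollectionEntity (an array is a list). */
    static class DBCollectionEntity extends ArrayList<Object> { }

    @SuppressWarnings("unchecked")
    static Object readObject(Class<?> targetType, Object source) {
        if (targetType == Object.class) {
            if (source instanceof DBDocumentEntity) {
                // nested document with no metadata -> plain Map
                return new HashMap<>((Map<String, Object>) source);
            }
            if (source instanceof DBCollectionEntity) {
                // nested array with no metadata -> plain List
                return new ArrayList<>((List<Object>) source);
            }
        }
        // fall through to the current behavior
        throw new IllegalStateException(
                "No mapping metadata found for type " + targetType.getName());
    }
}
```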
Store type hints in documents when they are persisted to the DB (if the property type and actual type do not match).
Example:
public interface Person {
    String getName();
}

public class Employee implements Person {
    private String name;

    public void setName(String name) {
        this.name = name;
    }

    @Override
    public String getName() {
        return name;
    }
}

public class Company {
    @Key
    private String key;
    private Person manager;

    public String getKey() {
        return key;
    }

    public Person getManager() {
        return manager;
    }

    public void setManager(Person manager) {
        this.manager = manager;
    }
}
Employee manager = new Employee();
manager.setName("Jon Doe");
Company comp = new Company();
comp.setManager(manager);
arangoOperations.insert(comp);
arangoOperations.find(comp.getKey(), Company.class);
This throws an exception because the interface Person cannot be instantiated. To fix this problem we need to store an additional property (e.g. _type) with the name of the class:
{
  "manager": {
    "_type": "package.Employee",
    "name": "Jon Doe"
  }
}
First, an org.springframework.data.convert.TypeMapper must be implemented that reads and writes the type information. Then the DefaultArangoConverter needs to write the type hint whenever the property type and the actual type do not match (here Person and Employee). If an object is read and has a type hint, that class must be used.
I will have a look at this at the weekend.
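A self-contained sketch of the type-hint idea, using plain maps as stand-ins for persisted documents. The method names follow the proposal above; only the _type key comes from the issue text, nothing here is existing API:

```java
import java.util.Map;

public class TypeHintSketch {

    /** Writes a hint only when the declared type and the actual type differ. */
    static Map<String, Object> write(Class<?> declaredType, Object value,
                                     Map<String, Object> document) {
        if (!declaredType.equals(value.getClass())) {
            document.put("_type", value.getClass().getName());
        }
        return document;
    }

    /** Resolves the class to instantiate on read: the hint wins over the declared type. */
    static Class<?> resolve(Class<?> declaredType, Map<String, Object> document) {
        Object hint = document.get("_type");
        if (hint == null) {
            return declaredType;
        }
        try {
            return Class.forName((String) hint);
        } catch (ClassNotFoundException e) {
            throw new IllegalStateException("Type hint points to unknown class: " + hint, e);
        }
    }
}
```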
after #78 is done, this feature should work for @Id
This should be well tested too.
E.g., given the following example (simplified) classes:
@Document("Parent")
class Parent {
    String one;
}

@Document("Child")
class Child extends Parent {
    String two;
}

@Document("Aggregate")
class Aggregate {
    @Ref
    Parent child;
}
After saving an instance of Aggregate with a child (having property two), an instance with a ref key like Child/4214688 is persisted (e.g., under an aggregate with a key like Aggregate/4214696). But when retrieving the data for key Aggregate/4214696, it ends up having a child property of type Parent (thus not providing property Child.two).
Frankly, based on the type in the key value, I expected the child property to be of type Child. This would be similar to
https://docs.oracle.com/javaee/5/api/javax/persistence/InheritanceType.html#TABLE_PER_CLASS
The way it is now, there seems to be no support for (automatically handled) inheritance with ArangoDB. E.g., in JPA there are 3 options for automatic inheritance mapping:
https://docs.oracle.com/javaee/5/api/javax/persistence/InheritanceType.html
I haven't tried a custom query (with joins etc.) yet, but even if that works, developers shouldn't be burdened with that kind of work in this day and age, IMHO.
P.S. Document annotation might be mangled by the system here. To be clear, the following is meant:
https://github.com/arangodb/spring-data#document
Same for Ref:
https://github.com/arangodb/spring-data#reference
When saving a simple Java bean (Character from the demo) I'm getting the error:
org.springframework.core.convert.ConversionFailedException: Failed to convert from type [java.lang.String] to type [java.time.Instant] for value ...
where the value is from a String property (name), containing nothing resembling a date.
Debugging this, I see that the method getOrCreateAndCache in com.arangodb.springframework.core.convert.CustomConversions calls producer.get(), which has the property sourceType.name set to "java.lang.String" but returns the type "java.time.Instant".
I'm stuck with Spring Boot 1.5.13.RELEASE, as I'm waiting for Apache Camel to support v2, and so I'm using version 1.1.2 of arangodb-spring-data.
Any ideas what may be causing this problem?
In the mapping section of the README it is stated that
The Java class needs a non parameterized constructor
but this is incorrect. The DefaultArangoConverter uses the EntityInstantiator from Spring Data together with a value provider, which is capable of handling non-default constructors.
You have two possibilities:
@PersistenceConstructor
All parameter names must match the names of class properties and must be retained at runtime (more info).
This "unknown feature" should be documented in the README. Issue #10 could also be closed.
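A sketch of an entity using the annotated-constructor possibility. To keep the snippet self-contained, the PersistenceConstructor annotation below is declared locally as a stand-in; real code would use org.springframework.data.annotation.PersistenceConstructor, and the Customer class and its properties are made up for illustration:

```java
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

public class PersistenceCtorSketch {

    /** Stand-in for org.springframework.data.annotation.PersistenceConstructor. */
    @Retention(RetentionPolicy.RUNTIME)
    @Target(ElementType.CONSTRUCTOR)
    public @interface PersistenceConstructor { }

    public static class Customer {
        private final String name;
        private final int age;

        // Parameter names must match the property names and must be kept at
        // runtime (compile with -parameters) for the value provider to work.
        @PersistenceConstructor
        public Customer(String name, int age) {
            this.name = name;
            this.age = age;
        }

        public String getName() { return name; }
        public int getAge() { return age; }
    }
}
```

This allows entities with final fields and no no-arg constructor, which the README currently suggests is impossible.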
It would be beneficial to have an annotation that generates _key (and therefore _id) values based on criteria defined in it. Take the Couchbase implementation for example: it's fairly simple to use, fairly well built, and makes the schema very flexible without the need to duplicate code.
My Person repository:
Iterable findByName(String name);
When I fetch data using a native query, or the findByName query with a filter, one column always fails like this:
java.time.format.DateTimeParseException: Text 'dewi' could not be parsed at index 0
at java.time.format.DateTimeFormatter.parseResolved0(DateTimeFormatter.java:1949) ~[na:1.8.0_172]
at java.time.format.DateTimeFormatter.parse(DateTimeFormatter.java:1851) ~[na:1.8.0_172]
at java.time.Instant.parse(Instant.java:395) ~[na:1.8.0_172]
and so on...
"dewi" is the parameter value that I input.
I'm using:
Spring Boot 2.0.2.RELEASE
arangodb-spring-data 2.1.3
DB: ArangoDB Enterprise Edition 3.3.8 and 3.3.9
This is a follow-up issue of #46 and DATACMNS-1323.
Oliver Gierke thinks that something like a cursor should not be used in client repository code. Any opinions on this?
If it should be further supported, we can't just override QueryMethod.getReturnedObjectType(), as this leads to an NPE; @mpv1989 added the workaround from #46.
The "Spring Data way", as pointed out by Oliver Gierke, is to override RepositoryMetadata.getReturnedDomainClass() in a custom class and return it from ArangoRepositoryFactory.getRepositoryMetadata().
According to Oliver Gierke, QueryMethod.getReturnedObjectType() is "some kind of left-over from the pre-ResultProcessor days". Therefore, we should also use the result processor, which can also be used for projections.
Any comments? ;)
I'm using arangodb-spring-data and get an error like in this issue, except that I don't use Spring Boot or Flux. My application is just a REST MVC Spring application that wants to use ArangoDB via a Spring Data repository. I should add that this application also uses a JPA repository to connect to MySQL.
We should log possible query warnings.
For instance, in the
https://github.com/arangodb/spring-data#reference
example, when address.id is null, the address is not currently auto-persisted; it has to be persisted separately (prior to persisting the parent entity). In the JPA implementations I've worked with, the association(s) would be auto-persisted, allowing an entire aggregate[1] to be persisted with only one call on the aggregate root.
I think it would be beneficial (for users and for ArangoDB) if arangodb/spring-data functioned similarly: it would make development easier and facilitate adoption of ArangoDB. The implementation also shouldn't be difficult...
Currently a field annotated with @Id is saved as _id in ArangoDB. Since this field is more for internal usage and does not offer any benefit to the user, we should change @Id to represent the document field _key instead, like @Key currently does. This eliminates the need of having two fields (for @Id and @Key) when using user-generated keys.
Feature request:
Add options for countProjection and existsProjection in @Query.
At the moment, results are only converted by ArangoResultConverter with predefined type mappings, while there is a need for writing queries that count or check existence, just like derived queries do.
This would be consistent with FetchType.LAZY:
https://javaee.github.io/javaee-spec/javadocs/javax/persistence/FetchType.html
This could in fact be implemented using the same enum.
P.S. In JPA, this frequently works as follows: a lazily-fetched association (especially if it's to-many) isn't fetched right away; if it's dereferenced, it's fetched at that moment (before the dereferencing method returns), i.e., on an as-needed basis.
P.P.S. I haven't seen anything similar in the current implementation, hence logging this here...
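One common way to approximate that JPA behavior is a supplier-backed lazy holder that defers the fetch until first dereference. A minimal sketch (not part of the current API; in a real implementation the loader would wrap the actual reference resolution):

```java
import java.util.function.Supplier;

/** Minimal lazy-loading holder: the value is fetched on first dereference only. */
public class Lazy<T> {

    private final Supplier<T> loader;
    private T value;
    private boolean loaded;

    public Lazy(Supplier<T> loader) {
        this.loader = loader;
    }

    /** Runs the loader on first call, then returns the cached value. */
    public synchronized T get() {
        if (!loaded) {
            value = loader.get();   // the actual fetch happens here, not at construction
            loaded = true;
        }
        return value;
    }
}
```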
In @Edge and DefaultArangoConverter.
The core equivalent from JPA 1 is the following annotation:
https://javaee.github.io/javaee-spec/javadocs/javax/persistence/MapKey.html
P.S.
But a few other "bells-and-whistles" have been added in JPA 2. E.g.:
https://javaee.github.io/javaee-spec/javadocs/javax/persistence/MapKeyEnumerated.html
P.P.S. It's also worth considering here whether to use the same annotation(s), or to add distinct one(s).
Hello @mpv1989,
I'm just looking at the TimeStringConverters and JodaTimeStringConverters, and I see that you are using java.util.Date there and the DateUtil with the non-thread-safe SimpleDateFormat (I saw your trick with ThreadLocal ;)). I think we should move to the new, thread-safe DateTimeFormatter and leave java.util.Date and SimpleDateFormat behind.
The next thing we should think about is how we serialize LocalDateTime and LocalDate. I saw that you are converting them to UTC in TimeStringConverters, but not in JodaTimeStringConverters.
As LocalDateTime and LocalDate are explicitly local (= without a timezone), we should maybe keep them as they are (since we don't know if the system default is really used). If the user really wants UTC, they could use LocalDateTime.now(ZoneOffset.UTC) or Instant, which is UTC by definition.
I'm not sure the auto-conversion is what users want. I saw a user complaining about it on SO for the Mongo Java driver.
How about these conversions:

Class | Format |
---|---|
java.util.Date | yyyy-MM-dd'T'HH:mm:ss.SSS'Z' |
LocalDate | yyyy-MM-dd |
LocalDateTime | yyyy-MM-dd'T'HH:mm:ss.SSS |
Instant | yyyy-MM-dd'T'HH:mm:ss.SSS'Z' |
OffsetDateTime | yyyy-MM-dd'T'HH:mm:ss.SSSXXX |
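The formats in the table above can be produced with the immutable, thread-safe java.time.format.DateTimeFormatter. A sketch of three of them (the class and method names are placeholders, not existing converter code):

```java
import java.time.Instant;
import java.time.LocalDate;
import java.time.LocalDateTime;
import java.time.ZoneOffset;
import java.time.format.DateTimeFormatter;

public class TimeFormats {

    // Immutable formatters: safe to share across threads, unlike SimpleDateFormat.
    static final DateTimeFormatter INSTANT_FMT =
            DateTimeFormatter.ofPattern("yyyy-MM-dd'T'HH:mm:ss.SSS'Z'").withZone(ZoneOffset.UTC);
    static final DateTimeFormatter LOCAL_DATE_FMT =
            DateTimeFormatter.ofPattern("yyyy-MM-dd");
    static final DateTimeFormatter LOCAL_DATE_TIME_FMT =
            DateTimeFormatter.ofPattern("yyyy-MM-dd'T'HH:mm:ss.SSS");

    static String format(Instant instant) {
        return INSTANT_FMT.format(instant);   // Instant needs a zone to resolve date fields
    }

    static String format(LocalDate date) {
        return LOCAL_DATE_FMT.format(date);
    }

    static String format(LocalDateTime dateTime) {
        return LOCAL_DATE_TIME_FMT.format(dateTime);   // no zone conversion applied
    }
}
```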
PR #33 introduced data pollution for mainstream inheritance types like TABLE/COLLECTION per ENTITY/DOCUMENT. For TABLE/COLLECTION_PER_CLASS inheritance there is already an entire table/collection dedicated to the class involved, so there's no need to store anything type-related as a property/column (let alone the fully qualified class name).
Does it support transactions?
I was trying to perform a transactional operation using Spring Data ArangoDB, but it failed. Is there any transaction mechanism in ArangoDB?
@Transactional(rollbackFor = Exception.class)
public Response createDocument(DocumentsDoc document) throws Exception {
    documentsRepository.save(document);
    throw new Exception("test");
}
Do the To and From annotations work in the Spring Data project?
When I try to retrieve edge documents using the From attribute, I get "nested exception is java.lang.StackOverflowError". I can see the docs and the edge in the database, but when I use the repository (actorRepository.findByName), I get the above error. When I remove the From annotation, I don't get that error.
@Edge(value = "roles")
public class Role {
    @Id
    private String id;
    @From
    private Actor actor;
    @To
    private Movie movie;
}

@Document
public class Movie {
    @Id
    private String id;
    // ...other stuff...
}

@Document
public class Actor {
    @Id
    private String id;
    private String name;
    // ...other stuff...
    @From
    private List<Role> roles;
}
Type:
java.sql.Date date;
OR
java.time.LocalDate date;
Requested:
this.date = Date.valueOf(LocalDate.now(ZoneOffset.UTC).toString());
OR
this.date = LocalDate.now(ZoneOffset.UTC);
E.g.: 2018-03-29
Persisted, e.g.:
2018-03-29T00:00:00.000Z
Here everything starting with T is really unnecessary, in terms of both storage and readability.
P.S.
For java.sql.Date, this also causes retrieval trouble (but this can be worked around by changing to LocalDate):
org.springframework.core.convert.ConversionFailedException: Failed to convert from type [java.lang.String] to type [java.sql.Date] for value '2018-03-29T00:00:00.000Z'; nested exception is java.lang.IllegalArgumentException
at org.springframework.core.convert.support.ObjectToObjectConverter.convert(ObjectToObjectConverter.java:112)
at
We should add support for reactive types.
understanding-reactive-types
going-reactive-with-spring-data
The WITH information in the AQL query for resolving @Relations is missing. This causes an error within a cluster setup.
I tried to use Spring Data ArangoDB 2.0.0 in spring-data-demo-master.
There is a ParseException when running CrudRunner.
Exception detail:
Caused by: com.arangodb.ArangoDBException: java.text.ParseException: Unparseable date: "Ned"
at com.arangodb.springframework.core.convert.TimeStringConverters.parse(TimeStringConverters.java:67) ~[arangodb-spring-data-2.0.0.jar:na]
at com.arangodb.springframework.core.convert.TimeStringConverters.access$000(TimeStringConverters.java:44) ~[arangodb-spring-data-2.0.0.jar:na]
at com.arangodb.springframework.core.convert.TimeStringConverters$StringToInstantConverter.convert(TimeStringConverters.java:105) ~[arangodb-spring-data-2.0.0.jar:na]
at com.arangodb.springframework.core.convert.TimeStringConverters$StringToInstantConverter.convert(TimeStringConverters.java:100) ~[arangodb-spring-data-2.0.0.jar:na]
at org.springframework.core.convert.support.GenericConversionService$ConverterAdapter.convert(GenericConversionService.java:385) ~[spring-core-5.0.6.RELEASE.jar:5.0.6.RELEASE]
at org.springframework.core.convert.support.ConversionUtils.invokeConverter(ConversionUtils.java:40) ~[spring-core-5.0.6.RELEASE.jar:5.0.6.RELEASE]
... 40 common frames omitted
Caused by: java.text.ParseException: Unparseable date: "Ned"
at java.text.DateFormat.parse(DateFormat.java:366) ~[na:1.8.0_161]
at com.arangodb.velocypack.internal.util.DateUtil.parse(DateUtil.java:60) ~[velocypack-1.0.14.jar:na]
at com.arangodb.springframework.core.convert.TimeStringConverters.parse(TimeStringConverters.java:65) ~[arangodb-spring-data-2.0.0.jar:na]
... 45 common frames omitted
pom:
<parent>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-parent</artifactId>
<version>2.0.2.RELEASE</version>
</parent>
<properties>
<project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
<java.version>1.8</java.version>
</properties>
<dependencies>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter</artifactId>
</dependency>
<dependency>
<groupId>com.arangodb</groupId>
<artifactId>arangodb-spring-data</artifactId>
<version>2.0.0</version>
</dependency>
</dependencies>
GeoPage.class.equals(method.getReturnType()) is meaningless and is always false.
P.S. Compiler warning:
Unlikely argument type for equals(): TypeInformation<capture#3-of ?> seems to be unrelated to Class
Using spring-boot 1.5.10.RELEASE and arangodb-spring-data 1.0.1, we get the following error:
Caused by: org.springframework.beans.factory.UnsatisfiedDependencyException: Error creating bean with name 'contractRepository': Unsatisfied dependency expressed through constructor parameter 1: Ambiguous argument values for parameter of type [com.arangodb.springframework.core.ArangoOperations] - did you specify the correct bean references as arguments?
at org.springframework.beans.factory.support.ConstructorResolver.createArgumentArray(ConstructorResolver.java:736)
at org.springframework.beans.factory.support.ConstructorResolver.autowireConstructor(ConstructorResolver.java:189)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.autowireConstructor(AbstractAutowireCapableBeanFactory.java:1193)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:1095)
at ...
I get "You have defined query method in the repository but you don't have any query lookup strategy defined. The infrastructure apparently does not support query methods!"
when trying to run a Spring Boot app with an Arango repository containing @Query:
public interface TraversalRepository extends ArangoRepository<BaseDocument> {
    @Query(value = "FOR vcenter IN PV RETURN vcenter")
    Iterable<BaseDocument> listPVs();
}
using Spring Boot version 2.1.0.BUILD-SNAPSHOT
and arangodb-spring-data 2.1.3
<dependencies>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-web</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.cloud</groupId>
<artifactId>spring-cloud-starter-netflix-eureka-client</artifactId>
</dependency>
<dependency>
<groupId>com.arangodb</groupId>
<artifactId>arangodb-spring-data</artifactId>
<version>2.1.3</version>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-test</artifactId>
<scope>test</scope>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-configuration-processor</artifactId>
<optional>true</optional>
</dependency>
</dependencies>
Any suggestions?
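For what it's worth, this error usually means no Arango-specific repository infrastructure was picked up at startup. A minimal configuration sketch, mirroring the setup shown in the getting-started guide (the package and database names here are hypothetical, since the actual configuration class is not shown above):

```java
import com.arangodb.ArangoDB;
import com.arangodb.springframework.annotation.EnableArangoRepositories;
import com.arangodb.springframework.config.AbstractArangoConfiguration;
import org.springframework.context.annotation.Configuration;

@Configuration
@EnableArangoRepositories(basePackages = { "com.example.repo" }) // hypothetical package
public class ArangoConfig extends AbstractArangoConfiguration {

    @Override
    public ArangoDB.Builder arango() {
        return new ArangoDB.Builder();
    }

    @Override
    public String database() {
        return "mydb"; // hypothetical database name
    }
}
```

Without a configuration like this on the classpath, the repository interfaces are scanned by Spring Data Commons but no query lookup strategy is wired in.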
The WITH information in the AQL query generated for resolving @Relations is missing. This causes an error within a cluster setup.
Hi!
Currently the DefaultArangoConverter must deal with non-DBEntity objects, because the DBEntityDeserializer only handles the root object. Nested objects are still "normal" hash maps and lists instead of DBDocumentEntity and DBCollectionEntity.
We could implement custom entity instantiators to produce DBEntity objects right away. Since DBDocumentEntity and DBCollectionEntity are themselves a hash map and a collection, we could do the following:
public class DBEntityModule implements VPackModule {
@Override
public <C extends VPackSetupContext<C>> void setup(C context) {
context.registerInstanceCreator(Map.class, new DBDocumentEntityInstantiator())
.registerInstanceCreator(Collection.class, new DBCollectionEntityInstantiator());
}
public static class DBDocumentEntityInstantiator implements VPackInstanceCreator<Map<?, ?>> {
@Override
public Map<?, ?> createInstance() {
return new DBDocumentEntity();
}
}
public static class DBCollectionEntityInstantiator implements VPackInstanceCreator<Collection<?>> {
@Override
public Collection<?> createInstance() {
return new DBCollectionEntity();
}
}
}
This would greatly simplify the readMap() and readCollection() methods of the converter. E.g. the new readMap() would look like the following:
private Object readMap(final TypeInformation<?> type, final DBDocumentEntity source) {
// ...
for (final Map.Entry<String, Object> entry : source.entrySet()) {
// ...
if (value instanceof DBEntity) {
map.put(key, read(valueType, (DBEntity) value));
}
// remove:
// else if (value instanceof Map) {
// map.put(key, read(valueType, new DBDocumentEntity((Map<? extends String, ? extends Object>) value)));
// } else if (value instanceof Collection) {
// map.put(key, read(valueType, new DBCollectionEntity((Collection<? extends Object>) value)));
// } else if (isSimpleType(valueType.getType())) {
// final Optional<Class<?>> customWriteTarget = conversions.getCustomWriteTarget(valueType.getType());
// final Class<?> targetType = customWriteTarget.orElseGet(() -> valueType.getType());
// map.put(key, conversionService.convert(value, targetType));
// }
else {
/// TODO: potentially convert value
map.put(key, value);
}
}
return map;
}
Can you tell me why you ever needed this part in a read method:
else if (isSimpleType(valueType.getType())) {
final Optional<Class<?>> customWriteTarget = conversions.getCustomWriteTarget(valueType.getType());
final Class<?> targetType = customWriteTarget.orElseGet(() -> valueType.getType());
map.put(key, conversionService.convert(value, targetType));
}
I have a custom query with only a few fields in the RETURN, like this:
@Query("FOR person IN persons RETURN {id : person._id, name : person.name, username : person.username, email : person.email, profilFacebookId: person.profilFacebookId, gender : person.gender}")
Page<PersonWithoutLinkPhotos> findAllPersonsWithoutLinkPhotos(Pageable arg0);
My controller:
final Pageable pageable = PageRequest.of(0, 5);
Page<PersonWithoutLinkPhotos> data = personRepository.findAllPersonsWithoutLinkPhotos(pageable);
but the result still shows all data.
Person class:
@Document("persons")
@HashIndex(fields = { "profilFacebookId", "username", "email" }, unique = true)
public class Person {
@Id
private String id;
private String name;
private String username;
private String email;
private String profilFacebookId;
private String gender;
private List linkPhotos;
private List linkPhotosCloud;
private boolean alive;
private Integer age;
@Relations(edges = ChildOf.class, lazy = true)
private Collection childs;
// getters and setters
}
PersonWithoutLinkPhotos class // my custom class for handling the result:
public class PersonWithoutLinkPhotos {
private String id;
private String name;
private String username;
private String email;
private String profilFacebookId;
private String gender;
// getters and setters
}
I used:
spring boot 2.0.2.RELEASE
arangodb-spring-data 2.1.3
Db : ArangoDB Enterprise Edition 3.3.8 and 3.3.9.
Hi,
So, I'm trying out ArangoDB in a Spring WebFlux + Boot webapp.
I've tried to keep it simple, config is thus:
package project.test.platformapi.config;
import com.arangodb.ArangoDB;
import com.arangodb.Protocol;
import com.arangodb.springframework.annotation.EnableArangoRepositories;
import com.arangodb.springframework.config.AbstractArangoConfiguration;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
@Configuration
@EnableArangoRepositories(basePackages = { "project.test.platformapi.repo" })
public class ArangoDBConfig extends AbstractArangoConfiguration
{
@Override
@Bean
public ArangoDB.Builder arango()
{
return new ArangoDB.Builder()
.useProtocol(Protocol.HTTP_JSON);
}
@Override
public String database()
{
return "test-database";
}
}
I have a repo like:
package project.test.platformapi.repo;
import com.arangodb.springframework.repository.ArangoRepository;
import project.test.platformapi.model.User;
public interface UserRepository extends ArangoRepository<User>
{
Iterable<User> findByName(String name);
Iterable<User> findAll();
User save(User user);
}
And then, on a @RestController, I have a simple:
@Autowired
private UserRepository userRepository;
My build.gradle file has been enhanced with:
compile(
'com.arangodb:arangodb-spring-data:2.0.1',
'org.apache.httpcomponents:httpclient:4.5.4'
)
Theoretically, everything should be in order. But when I try to actually run this:
2018-01-12 23:45:21.933 ERROR 33668 --- [ restartedMain] o.s.b.SpringApplication : Application startup failed
org.springframework.beans.factory.UnsatisfiedDependencyException: Error creating bean with name 'userRepository': Unsatisfied dependency expressed through constructor parameter 1: Ambiguous argument values for parameter of type [com.arangodb.springframework.core.ArangoOperations] - did you specify the co
rrect bean references as arguments?
at org.springframework.beans.factory.support.ConstructorResolver.createArgumentArray(ConstructorResolver.java:716) ~[spring-beans-5.0.2.RELEASE.jar:5.0.2.RELEASE]
at org.springframework.beans.factory.support.ConstructorResolver.autowireConstructor(ConstructorResolver.java:192) ~[spring-beans-5.0.2.RELEASE.jar:5.0.2.RELEASE]
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.autowireConstructor(AbstractAutowireCapableBeanFactory.java:1270) ~[spring-beans-5.0.2.RELEASE.jar:5.0.2.RELEASE]
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:1127) ~[spring-beans-5.0.2.RELEASE.jar:5.0.2.RELEASE]
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.getSingletonFactoryBeanForTypeCheck(AbstractAutowireCapableBeanFactory.java:946) ~[spring-beans-5.0.2.RELEASE.jar:5.0.2.RELEASE]
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.getTypeForFactoryBean(AbstractAutowireCapableBeanFactory.java:833) ~[spring-beans-5.0.2.RELEASE.jar:5.0.2.RELEASE]
at org.springframework.beans.factory.support.AbstractBeanFactory.isTypeMatch(AbstractBeanFactory.java:557) ~[spring-beans-5.0.2.RELEASE.jar:5.0.2.RELEASE]
at org.springframework.beans.factory.support.DefaultListableBeanFactory.doGetBeanNamesForType(DefaultListableBeanFactory.java:428) ~[spring-beans-5.0.2.RELEASE.jar:5.0.2.RELEASE]
at org.springframework.beans.factory.support.DefaultListableBeanFactory.getBeanNamesForType(DefaultListableBeanFactory.java:391) ~[spring-beans-5.0.2.RELEASE.jar:5.0.2.RELEASE]
at org.springframework.beans.factory.support.DefaultListableBeanFactory.getBeanNamesForType(DefaultListableBeanFactory.java:385) ~[spring-beans-5.0.2.RELEASE.jar:5.0.2.RELEASE]
at org.springframework.hateoas.config.HypermediaSupportBeanDefinitionRegistrar.registerBeanDefinitions(HypermediaSupportBeanDefinitionRegistrar.java:134) ~[spring-hateoas-0.24.0.RELEASE.jar:?]
at org.springframework.context.annotation.ConfigurationClassBeanDefinitionReader.lambda$loadBeanDefinitionsFromRegistrars$0(ConfigurationClassBeanDefinitionReader.java:360) ~[spring-context-5.0.2.RELEASE.jar:5.0.2.RELEASE]
at java.util.LinkedHashMap.forEach(LinkedHashMap.java:684) ~[?:?]
at org.springframework.context.annotation.ConfigurationClassBeanDefinitionReader.loadBeanDefinitionsFromRegistrars(ConfigurationClassBeanDefinitionReader.java:359) ~[spring-context-5.0.2.RELEASE.jar:5.0.2.RELEASE]
at org.springframework.context.annotation.ConfigurationClassBeanDefinitionReader.loadBeanDefinitionsForConfigurationClass(ConfigurationClassBeanDefinitionReader.java:144) ~[spring-context-5.0.2.RELEASE.jar:5.0.2.RELEASE]
at org.springframework.context.annotation.ConfigurationClassBeanDefinitionReader.loadBeanDefinitions(ConfigurationClassBeanDefinitionReader.java:117) ~[spring-context-5.0.2.RELEASE.jar:5.0.2.RELEASE]
at org.springframework.context.annotation.ConfigurationClassPostProcessor.processConfigBeanDefinitions(ConfigurationClassPostProcessor.java:328) ~[spring-context-5.0.2.RELEASE.jar:5.0.2.RELEASE]
at org.springframework.context.annotation.ConfigurationClassPostProcessor.postProcessBeanDefinitionRegistry(ConfigurationClassPostProcessor.java:233) ~[spring-context-5.0.2.RELEASE.jar:5.0.2.RELEASE]
at org.springframework.context.support.PostProcessorRegistrationDelegate.invokeBeanDefinitionRegistryPostProcessors(PostProcessorRegistrationDelegate.java:273) ~[spring-context-5.0.2.RELEASE.jar:5.0.2.RELEASE]
at org.springframework.context.support.PostProcessorRegistrationDelegate.invokeBeanFactoryPostProcessors(PostProcessorRegistrationDelegate.java:93) ~[spring-context-5.0.2.RELEASE.jar:5.0.2.RELEASE]
at org.springframework.context.support.AbstractApplicationContext.invokeBeanFactoryPostProcessors(AbstractApplicationContext.java:693) ~[spring-context-5.0.2.RELEASE.jar:5.0.2.RELEASE]
at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:531) ~[spring-context-5.0.2.RELEASE.jar:5.0.2.RELEASE]
at org.springframework.boot.web.servlet.context.ServletWebServerApplicationContext.refresh(ServletWebServerApplicationContext.java:138) ~[spring-boot-2.0.0.M7.jar:2.0.0.M7]
at org.springframework.boot.SpringApplication.refresh(SpringApplication.java:751) ~[spring-boot-2.0.0.M7.jar:2.0.0.M7]
at org.springframework.boot.SpringApplication.refreshContext(SpringApplication.java:387) ~[spring-boot-2.0.0.M7.jar:2.0.0.M7]
at org.springframework.boot.SpringApplication.run(SpringApplication.java:327) ~[spring-boot-2.0.0.M7.jar:2.0.0.M7]
at org.springframework.boot.SpringApplication.run(SpringApplication.java:1245) ~[spring-boot-2.0.0.M7.jar:2.0.0.M7]
at org.springframework.boot.SpringApplication.run(SpringApplication.java:1233) ~[spring-boot-2.0.0.M7.jar:2.0.0.M7]
at project.test.platformapi.PlatformApiApplication.main(PlatformApiApplication.java:21) ~[main/:?]
at jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:?]
at jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) ~[?:?]
at jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:?]
at java.lang.reflect.Method.invoke(Method.java:564) ~[?:?]
at org.springframework.boot.devtools.restart.RestartLauncher.run(RestartLauncher.java:49) ~[spring-boot-devtools-2.0.0.M7.jar:2.0.0.M7]
No clue how to proceed from here, have tried most things I can think of. Can someone please assist?
My repository:
@Query("FOR person IN persons FILTER LIKE(LOWER(person.name), '%LOWER(@name)%') RETURN {id : person._id, name : person.name, username : person.username, email : person.email, profilFacebookId: person.profilFacebookId, gender : person.gender}")
PersonWithoutLinkPhotos findByNameFB(@Param("name") String name);
always returns an error like this:
com.arangodb.ArangoDBException: Response: 400, Error: 1552 - AQL: bind parameter 'name' was not declared in the query (while parsing) at com.arangodb.internal.util.ResponseUtils.checkError(ResponseUtils.java:53) ~[arangodb-java-driver-4.4.0.jar:na] at com.arangodb.internal.velocystream.VstCommunication.checkError(VstCommunication.java:116) ~[arangodb-java-driver-4.4.0.jar:na]
But if I change it to LIKE(LOWER(person.name), LOWER(@name)) it works, though it does not match what I want: I have to write the full name to make it work.
I used:
spring boot 2.0.1.RELEASE
arangodb-spring-data 2.1.3
Db : ArangoDB Enterprise Edition 3.3.8 and 3.3.9.
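For what it's worth, the error above happens because AQL does not expand bind parameters inside string literals: in '%LOWER(@name)%' the @name is just text, so the declared parameter appears unused. A likely fix (a sketch, using AQL's CONCAT to build the LIKE pattern outside any string literal) would be:

```java
@Query("FOR person IN persons "
     + "FILTER LIKE(LOWER(person.name), CONCAT('%', LOWER(@name), '%')) "
     + "RETURN {id : person._id, name : person.name, username : person.username, "
     + "email : person.email, profilFacebookId : person.profilFacebookId, gender : person.gender}")
PersonWithoutLinkPhotos findByNameFB(@Param("name") String name);
```

This way partial names should match, since the wildcards are concatenated onto the lowercased bound value at query time.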
For a class with a declared @Edge annotation, there is no need to store any type-related properties/columns, because doing so causes numerous issues and inefficiencies (for graphs), such as:
Hi,
I'm trying out ArangoDB within a Spring-based microservice project. I use the following dependencies:
dependencies {
compile "org.springframework.boot:spring-boot-starter-cache"
compile "io.github.jhipster:jhipster"
compile "io.dropwizard.metrics:metrics-core"
compile "io.dropwizard.metrics:metrics-json"
compile "io.dropwizard.metrics:metrics-jvm"
compile "io.dropwizard.metrics:metrics-servlet"
compile "io.dropwizard.metrics:metrics-servlets"
compile "net.logstash.logback:logstash-logback-encoder"
compile "com.fasterxml.jackson.datatype:jackson-datatype-json-org"
compile "com.fasterxml.jackson.datatype:jackson-datatype-hppc"
compile "com.fasterxml.jackson.datatype:jackson-datatype-jsr310"
compile "com.fasterxml.jackson.core:jackson-annotations"
compile "com.fasterxml.jackson.core:jackson-databind"
compile "com.fasterxml.jackson.module:jackson-module-afterburner"
compile "com.ryantenney.metrics:metrics-spring"
compile "com.hazelcast:hazelcast"
compile "com.hazelcast:hazelcast-spring"
compile "javax.cache:cache-api"
compile "commons-codec:commons-codec"
compile "org.apache.commons:commons-lang3"
compile "commons-io:commons-io"
compile "javax.transaction:javax.transaction-api"
compile "org.lz4:lz4-java"
compile "org.springframework.boot:spring-boot-actuator"
compile "org.springframework.boot:spring-boot-autoconfigure"
compile "org.springframework.boot:spring-boot-loader-tools"
compile "org.springframework.boot:spring-boot-starter-mail"
compile "org.springframework.boot:spring-boot-starter-logging"
compile "org.springframework.boot:spring-boot-starter-aop"
compile "org.springframework.boot:spring-boot-starter-security"
compile ("org.springframework.boot:spring-boot-starter-web") {
exclude module: "spring-boot-starter-tomcat"
}
compile "org.springframework.boot:spring-boot-starter-undertow"
compile "org.springframework.boot:spring-boot-starter-thymeleaf"
compile "org.zalando:problem-spring-web"
compile "com.google.guava:guava:18.0"
compile "com.arangodb:arangodb-spring-data:2.0.2"
compile "org.apache.httpcomponents:httpclient:4.5.1"
compile "org.springframework.cloud:spring-cloud-starter"
compile "org.springframework.cloud:spring-cloud-starter-ribbon"
compile "org.springframework.cloud:spring-cloud-starter-hystrix"
compile "org.springframework.cloud:spring-cloud-starter-spectator"
compile "org.springframework.retry:spring-retry"
compile "org.springframework.cloud:spring-cloud-starter-consul-discovery"
compile "org.springframework.cloud:spring-cloud-starter-consul-config"
compile "org.springframework.cloud:spring-cloud-starter-feign"
compile "org.springframework.cloud:spring-cloud-spring-service-connector"
compile "org.springframework:spring-context-support"
compile "org.springframework.security:spring-security-config"
compile "org.springframework.security:spring-security-data"
compile "org.springframework.security:spring-security-web"
compile "io.jsonwebtoken:jjwt"
compile ("io.springfox:springfox-swagger2") {
exclude module: 'mapstruct'
}
compile "io.springfox:springfox-bean-validators"
compile "org.mapstruct:mapstruct-jdk8:${mapstruct_version}"
testCompile "com.jayway.jsonpath:json-path"
testCompile ("org.springframework.boot:spring-boot-starter-test") {
exclude group: 'com.vaadin.external.google', module: 'android-json'
}
testCompile "org.springframework.security:spring-security-test"
testCompile "org.springframework.boot:spring-boot-test"
testCompile "org.assertj:assertj-core"
testCompile "junit:junit"
testCompile "org.mockito:mockito-core"
testCompile "org.hamcrest:hamcrest-library"
testCompile "com.h2database:h2"
optional ("org.springframework.boot:spring-boot-configuration-processor") {
exclude group: 'com.vaadin.external.google', module: 'android-json'
}
}
When starting the unit tests, I get the following error stack trace:
java.lang.NoClassDefFoundError: org/springframework/data/convert/CustomConversions
at java.lang.Class.getDeclaredMethods0(Native Method)
at java.lang.Class.privateGetDeclaredMethods(Class.java:2701)
at java.lang.Class.getDeclaredMethods(Class.java:1975)
at org.springframework.util.ReflectionUtils.getDeclaredMethods(ReflectionUtils.java:613)
at org.springframework.util.ReflectionUtils.doWithMethods(ReflectionUtils.java:524)
at org.springframework.util.ReflectionUtils.doWithMethods(ReflectionUtils.java:510)
at org.springframework.util.ReflectionUtils.getUniqueDeclaredMethods(ReflectionUtils.java:570)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.getTypeForFactoryMethod(AbstractAutowireCapableBeanFactory.java:697)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.determineTargetType(AbstractAutowireCapableBeanFactory.java:640)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.predictBeanType(AbstractAutowireCapableBeanFactory.java:609)
at org.springframework.beans.factory.support.AbstractBeanFactory.isFactoryBean(AbstractBeanFactory.java:1484)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.doGetBeanNamesForType(DefaultListableBeanFactory.java:425)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.getBeanNamesForType(DefaultListableBeanFactory.java:395)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.getBeansOfType(DefaultListableBeanFactory.java:515)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.getBeansOfType(DefaultListableBeanFactory.java:508)
at org.springframework.context.support.AbstractApplicationContext.getBeansOfType(AbstractApplicationContext.java:1188)
at org.springframework.boot.SpringApplication.getExitCodeFromMappedException(SpringApplication.java:818)
at org.springframework.boot.SpringApplication.getExitCodeFromException(SpringApplication.java:804)
at org.springframework.boot.SpringApplication.handleExitCode(SpringApplication.java:790)
at org.springframework.boot.SpringApplication.handleRunFailure(SpringApplication.java:744)
at org.springframework.boot.SpringApplication.run(SpringApplication.java:314)
at org.springframework.boot.test.context.SpringBootContextLoader.loadContext(SpringBootContextLoader.java:120)
at org.springframework.test.context.cache.DefaultCacheAwareContextLoaderDelegate.loadContextInternal(DefaultCacheAwareContextLoaderDelegate.java:98)
at org.springframework.test.context.cache.DefaultCacheAwareContextLoaderDelegate.loadContext(DefaultCacheAwareContextLoaderDelegate.java:116)
at org.springframework.test.context.support.DefaultTestContext.getApplicationContext(DefaultTestContext.java:83)
at org.springframework.test.context.web.ServletTestExecutionListener.setUpRequestContextIfNecessary(ServletTestExecutionListener.java:189)
at org.springframework.test.context.web.ServletTestExecutionListener.prepareTestInstance(ServletTestExecutionListener.java:131)
at org.springframework.test.context.TestContextManager.prepareTestInstance(TestContextManager.java:230)
at org.springframework.test.context.junit4.SpringJUnit4ClassRunner.createTest(SpringJUnit4ClassRunner.java:228)
at org.springframework.test.context.junit4.SpringJUnit4ClassRunner$1.runReflectiveCall(SpringJUnit4ClassRunner.java:287)
at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
at org.springframework.test.context.junit4.SpringJUnit4ClassRunner.methodBlock(SpringJUnit4ClassRunner.java:289)
Your implementation looks right. Any hints?
Regards,
Holger
We used the spring-data library with a somewhat old version (2.0.1) and everything was OK. But a few days ago, after migrating to the latest stable version, we faced very strange behavior with the serialization of enums.
Just, for example, user status enum is:
public enum UserStatus {
VERIFIED("Verified"), PENDING("Pending"), BANNED("Banned");
@Getter
@JsonValue
private final String status;
UserStatus(String status) {
this.status = status;
}
@JsonCreator
public static UserStatus findValue(String status) {
return Stream.of(UserStatus.values())
.filter(value -> value.status.equals(status))
.findFirst()
.orElseThrow(() -> new IllegalStateException(String.format("UserStatus do not have such value: %s", status)));
}
}
Previously, it was saved in the database as a string, but now it is saved as the following structure:
"status": {
"name": "PENDING",
"status": "Pending",
"ordinal": 1
}
Could you add proper serialization of enums?
P.S. Fixed by adding custom converters. But it would be great to have a default converter for enums.
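A sketch of the conversion logic such custom converters would carry, in plain Java. In a real project these two functions would live in classes implementing Spring's Converter interface and be registered with the Arango configuration; the registration details are an assumption here and only the string round-trip is shown:

```java
import java.util.stream.Stream;

class EnumConverterSketch {
    enum UserStatus {
        VERIFIED("Verified"), PENDING("Pending"), BANNED("Banned");
        private final String status;
        UserStatus(String status) { this.status = status; }
        String getStatus() { return status; }
    }

    // Write side: store the enum as its display string, as version 2.0.1 did.
    static String write(UserStatus source) {
        return source.getStatus();
    }

    // Read side: look the display string back up, failing loudly on unknown values.
    static UserStatus read(String source) {
        return Stream.of(UserStatus.values())
                .filter(v -> v.getStatus().equals(source))
                .findFirst()
                .orElseThrow(() -> new IllegalStateException("Unknown status: " + source));
    }

    public static void main(String[] args) {
        System.out.println(write(UserStatus.PENDING)); // prints "Pending"
        System.out.println(read("Banned"));            // prints "BANNED"
    }
}
```

With a write converter like this registered, the document keeps the flat "status": "Pending" form instead of the nested name/status/ordinal structure.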
The Spring Data Commons project provides all the plumbing for doing the scanning for repositories.
I'm trying to implement one of the documentation examples:
@Document("persons")
public class Person {
@From
private List<Relation> relations;
}
@Edge(name="relations")
public class Relation {
...
}
But I'm unable to get a list of relations as expected.
First issue: @Edge doesn't have a "name" attribute (it's "value").
But it's not clear what attributes the Relation class should have.
Do I have to add from and to attributes? Should I annotate them with @From and @To?
What I want doesn't seem complicated:
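For reference, a sketch of what the edge entity could look like, assuming this library's documented @From/@To field annotations (the field names are hypothetical, and this is not verified against the exact version in use):

```java
// Hypothetical edge entity: @From/@To mark the vertices the edge connects.
@Edge("relations")
public class Relation {
    @Id
    private String id;

    @From
    private Person child;   // hypothetical field name: the "from" vertex

    @To
    private Person parent;  // hypothetical field name: the "to" vertex
}
```

Under this reading, a @From-annotated collection on Person would then resolve all Relation edges whose "from" vertex is that person.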
ikincil-dev@ziya-u16:~/development/projects/tiers/arangodb/truth/head/spring-data$ head -n 10 pom.xml | grep version
<version>2.1.7</version>
ikincil-dev@ziya-u16:~/development/projects/tiers/arangodb/truth/head/spring-data$ git tag | tail -n 5
2.1.2
2.1.3
2.1.4
2.1.5
2.1.6
ikincil-dev@ziya-u16:~/development/projects/tiers/arangodb/truth/head/spring-data$