Comments (8)
I think this is not supported, at least with Jackson's native Avro read implementation. The Apache Avro-lib-backed variant, while slower, might handle default values correctly.
As for how to enable the Apache Avro lib backend, I think there are unit tests that do that.
I agree, it'd be good to document this gap.
from jackson-dataformats-binary.
Thanks for your response.
I tried looking for a unit test, but I couldn't find one. I did, however, find ApacheAvroParserImpl. When I implemented it like this:
try (AvroParser parser = new ApacheAvroFactory(new AvroMapper()).createParser(payload)) {
    parser.setSchema(schema);
    TreeNode treeNode = parser.readValueAsTree();
    System.out.println(treeNode);
}
It unfortunately does not work (as in: no default values). Am I doing it correctly, or should I also use a different codec?
I made some changes, as the code I showed in my first message of course does not fully make sense: you cannot omit writing a value, even if it has a default. So I changed it to this:
String writingSchema = """
{
"type": "record",
"name": "Employee",
"fields": [
{"name": "age", "type": "int"},
{"name": "emails", "type": {"type": "array", "items": "string"}},
{"name": "boss", "type": ["Employee","null"]}
]}
""";
String readingSchema = """
{
"type": "record",
"name": "Employee",
"fields": [
{"name": "name", "type": ["string", "null"], "default" : "bram"},
{"name": "age", "type": "int"},
{"name": "emails", "type": {"type": "array", "items": "string"}},
{"name": "boss", "type": ["Employee","null"]}
]}
""";
String employeeJson = """
{
"age" : 26,
"emails" : ["[email protected]", "[email protected]"],
"boss" : {
"age" : 33,
"emails" : ["[email protected]"]
}
}
""";
When I do this and read the values back, I get the following exception: java.io.IOException: Invalid Union index (26); union only has 2 types
which is the same as reported in #164
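The specific index 26 is telling. A dependency-free sketch of why (UnionIndexDemo and its helpers are illustrative, not part of either library): Avro's binary encoding writes ints as zig-zag varints and carries no field names or tags, so the reader schema alone decides how the bytes are interpreted. The writer's first field here is age = 26; a reader whose first field is the union ["string", "null"] reads those same bytes as a union branch index.

```java
import java.io.ByteArrayOutputStream;

public class UnionIndexDemo {

    // Zig-zag varint encoding, as used by Avro binary encoding for int/long
    static byte[] encodeInt(int n) {
        long v = ((long) n << 1) ^ (n >> 31); // zig-zag: small magnitudes -> short codes
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        while ((v & ~0x7FL) != 0) {
            out.write((int) ((v & 0x7F) | 0x80));
            v >>>= 7;
        }
        out.write((int) v);
        return out.toByteArray();
    }

    // Matching decode; a union branch index is read with exactly this routine
    static int decodeInt(byte[] bytes) {
        long v = 0;
        int shift = 0, i = 0, cur;
        do {
            cur = bytes[i++] & 0xFF;
            v |= (long) (cur & 0x7F) << shift;
            shift += 7;
        } while ((cur & 0x80) != 0);
        return (int) ((v >>> 1) ^ -(v & 1)); // undo zig-zag
    }

    public static void main(String[] args) {
        byte[] payload = encodeInt(26);      // writer emits age = 26
        int unionIndex = decodeInt(payload); // reader expects a union index first
        System.out.println("decoded as union index: " + unionIndex); // prints 26
    }
}
```

Reading 26 where only branch 0 or 1 is legal is exactly the "Invalid Union index (26); union only has 2 types" failure: the decode succeeds mechanically, just against the wrong schema.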
The only other note I have is that this:
new ApacheAvroFactory(new AvroMapper())
is the wrong way around: it should be
new AvroMapper(new ApacheAvroFactory())
to have correct linking; then you should be able to create an ObjectReader / ObjectWriter through which you can assign the schema.
But I suspect that won't change things too much: either way you have an ApacheAvroFactory that is using the Apache Avro lib.
Ah thanks, I didn't know that. I tried it, but as you said, it indeed did not work.
What's weird: I even tried decoding it with the Apache Avro library myself, using GenericDatumReader (and everything that comes with it), but I got exactly the same error. That doesn't make sense, right? I'm sure that what I'm doing is allowed by Avro (adding a defaulted field to a reader schema that is not in the writer schema), as I have done it many times in my Kafka cluster.
Do you happen to know what the difference might be? Do my Kafka clients do anything special for this?
I finally get it. In my Kafka cluster the writer schema is stored along with the data. If you parse it like this, with the specific writer schema:
Schema avroSchema = ((AvroSchema) schema).getAvroSchema();
GenericDatumReader<GenericRecord> datumReader = new GenericDatumReader<>(writingSchema, avroSchema);
BinaryDecoder binaryDecoder = DecoderFactory.get().binaryDecoder(payload, null);
GenericRecord read = datumReader.read(null, binaryDecoder);
it does work. Normally Kafka does this for you, but I don't think the AvroMapper has a way to do it with 2 schemas.
@basimons The Avro module does indeed allow a 2-schema (read/write) configuration -- it's been a while, so I'll have to see how it was done. I think AvroMapper has methods to construct a Jackson AvroSchema from 2 separate schemas.
Ah. Close: AvroSchema has a method withReaderSchema(AvroSchema rs). You get both schema instances, then call the method on the "writer schema" (the one used to write records). From ArrayEvolutionTest:
final AvroSchema srcSchema = MAPPER.schemaFrom(SCHEMA_XY_ARRAY_JSON);
final AvroSchema dstSchema = MAPPER.schemaFrom(SCHEMA_XYZ_ARRAY_JSON);
final AvroSchema xlate = srcSchema.withReaderSchema(dstSchema);
and then you construct an ObjectReader as usual.
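For completeness, a sketch (not a verified implementation) of what the full read side could look like for the Employee schemas from earlier in this thread, assuming jackson-dataformat-avro is on the classpath. Here writingSchema/readingSchema are the JSON schema strings above, payload is the encoded record, and reading into a Map avoids needing an Employee POJO:

```java
import java.util.Map;

import com.fasterxml.jackson.dataformat.avro.AvroMapper;
import com.fasterxml.jackson.dataformat.avro.AvroSchema;

public class SchemaEvolutionSketch {
    static Map<String, Object> decode(byte[] payload,
                                      String writingSchema,
                                      String readingSchema) throws Exception {
        AvroMapper mapper = new AvroMapper();
        // Schema the bytes were actually written with
        AvroSchema writerSchema = mapper.schemaFrom(writingSchema);
        // Schema we want to read into (adds the defaulted "name" field)
        AvroSchema readerSchema = mapper.schemaFrom(readingSchema);
        // Resolve writer -> reader, analogous to GenericDatumReader with two schemas
        AvroSchema xlate = writerSchema.withReaderSchema(readerSchema);
        return mapper.readerFor(Map.class)
                .with(xlate)
                .readValue(payload);
    }
}
```

This mirrors the two-schema GenericDatumReader approach from earlier in the thread, but stays entirely within the Jackson API.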