benhaney / jsonrs

Rust-powered JSON library for Elixir
Home Page: https://hexdocs.pm/jsonrs
License: The Unlicense
Is it possible to get a performance benefit for a ~1 kB JSON payload as well?
Thanks
As the title says: requesting support for OTP 26.
I receive a payload as a charlist from :httpc, so I need to convert it into a string before I can Jsonrs.decode! it. The Jason library can decode it without the string conversion. Is it still worth using Jsonrs when I have this kind of conversion to do?
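For what it's worth, if I read the OTP docs correctly, :httpc can also be told to return the body as a binary directly via the body_format: :binary client option, which avoids the conversion entirely (worth verifying against your OTP release). For the conversion itself, a minimal sketch (the body literal is made up for illustration):

```elixir
# A charlist body from :httpc can be converted to a binary cheaply;
# IO.iodata_to_binary/1 also handles nested iolists, not just flat charlists.
body = '{"count": 42}'
binary = IO.iodata_to_binary(body)
# binary is now a UTF-8 binary suitable for a JSON decoder
```

The conversion is a single pass over the list, so for typical payload sizes its cost is small compared to the decode itself.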
When a struct contains an inner struct, both with Jsonrs.Encoder definitions, the Jsonrs.encode function will not pick up on the inner struct's definition.
The test below fails when it should pass: the inner struct's defimpl is not applied.
defmodule SomeStruct do
  defstruct [:field, :inner_struct]
end

defmodule InnerStruct do
  defstruct [:field, :ignored_field]
end

defimpl Jsonrs.Encoder, for: SomeStruct do
  def encode(%{field: f, inner_struct: is}) do
    %{f: f, is: is}
  end
end

defimpl Jsonrs.Encoder, for: InnerStruct do
  def encode(%{field: f}) do
    %{f: f}
  end
end

test "protocols" do
  is = %InnerStruct{field: "a", ignored_field: "b"}

  val = %SomeStruct{
    field: "a",
    inner_struct: is
  }

  assert ~s({"f":"a"}) == Jsonrs.encode!(is)
  assert ~s({"f":"a","is":{"f":"a"}}) == Jsonrs.encode!(val)
end
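One possible workaround (a sketch, not a confirmed Jsonrs fix): invoke the inner struct's encoder explicitly inside the outer implementation, so nesting does not depend on the library dispatching the protocol recursively. The stand-in protocol MyEncoder below is hypothetical and only demonstrates the dispatch pattern; with Jsonrs you would write Jsonrs.Encoder in its place:

```elixir
# Stand-in protocol (hypothetical; substitute Jsonrs.Encoder in real code).
defprotocol MyEncoder do
  def encode(value)
end

defmodule InnerStruct do
  defstruct [:field, :ignored_field]
end

defmodule SomeStruct do
  defstruct [:field, :inner_struct]
end

defimpl MyEncoder, for: InnerStruct do
  def encode(%{field: f}), do: %{f: f}
end

defimpl MyEncoder, for: SomeStruct do
  # Call the inner encoder explicitly instead of relying on the library
  # to recurse into nested structs via the protocol.
  def encode(%{field: f, inner_struct: is}) do
    %{f: f, is: MyEncoder.encode(is)}
  end
end
```

With this shape, the outer encode returns a tree of plain maps, so the JSON encoder never needs to see the inner struct at all.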
Hi @benhaney,
We are having some issues using this with Phoenix LiveDashboard: it makes the views crash.
I set up the project like this:
# config.exs
config :phoenix, :json_library, Jsonrs
After setting that up, go to /dashboard and you'll see the loading bar just keep loading.
To debug the issue, create a wrapper module (cloning the project and adding Jason as a dependency):
defmodule TestApp.Jsontest do
  def decode!(input, opts \\ []) do
    IO.puts("####### DECODE")
    IO.puts("INC VAL:")
    IO.inspect(input)
    IO.puts("JASON:")
    IO.inspect(Jason.decode!(input, opts))
    IO.puts("JSONRS:")
    IO.inspect(Jsonrs.decode!(input, opts))
    IO.puts("####### DECODE")
    Jsonrs.decode!(input, opts)
  end

  def encode_to_iodata!(input, opts \\ []) do
    IO.puts("####### ENCODE")
    IO.puts("INC VAL:")
    IO.inspect(input)
    IO.puts("JASON:")
    IO.inspect(Jason.encode!(input, opts))
    IO.puts("JSONRS:")
    IO.inspect(Jsonrs.encode!(input, opts))
    IO.puts("####### ENCODE")
    Jsonrs.encode!(input, opts)
  end
end
Then point the json_library config above at this module, and you'll see some differences between Jason and Jsonrs:
####### ENCODE
INC VAL:
[nil, "26", "phoenix", "phx_reply", %{response: %{}, status: :ok}]
JASON:
"[null,\"26\",\"phoenix\",\"phx_reply\",{\"response\":{},\"status\":\"ok\"}]"
JSONRS:
"[null,\"26\",\"phoenix\",\"phx_reply\",{\"response\":{},\"status\":\"Ok\"}]"
####### ENCODE
It seems like Jsonrs is capitalizing atoms when encoding them (:ok becomes "Ok"), which is breaking things here and there.
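Until the encoding is fixed upstream, one stopgap is to normalize atom values to strings yourself before handing the term to the encoder. The AtomNormalizer module below is a hypothetical sketch (not part of Jsonrs) and only handles plain maps and lists, not structs:

```elixir
defmodule AtomNormalizer do
  # Sketch: recursively convert atom *values* to their string form
  # before JSON encoding, so :ok encodes as "ok" rather than "Ok".

  # nil/true/false have fixed JSON representations; leave them alone.
  def normalize(atom) when is_atom(atom) and atom not in [nil, true, false],
    do: Atom.to_string(atom)

  def normalize(%{} = map), do: Map.new(map, fn {k, v} -> {k, normalize(v)} end)
  def normalize(list) when is_list(list), do: Enum.map(list, &normalize/1)
  def normalize(other), do: other
end
```

The traversal adds a full extra pass over the term, so it is only a workaround, but it keeps the JSON output byte-identical to Jason's for atom-bearing payloads like the Phoenix reply above.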
Example:
iex> foo_struct = %Foo{payload: ~T[12:00:00]}
%Foo{payload: ~T[12:00:00]}
iex> foo_map = %{payload: ~T[12:00:00]}
%{payload: ~T[12:00:00]}
iex> Jsonrs.encode!(foo_struct)
"{\"payload\":{\"calendar\":\"Elixir.Calendar.ISO\",\"hour\":12,\"microsecond\":[0,0],\"minute\":0,\"second\":0}}"
iex> Jsonrs.encode!(foo_map)
"{\"payload\":\"12:00:00\"}"
OTP 24 includes a new JIT that should greatly increase the performance of non-NIF encoders like Jason and Poison. It will be interesting to see how much of the performance advantage of Jsonrs shrinks with this update.
The largest current memory inefficiency when encoding involves an extra copy across the Rust-to-Elixir boundary to convert the final encoded Rust string into an Elixir binary. This could potentially be avoided by allocating Erlang binaries as the backing store of the destination vector instead of a Rust byte array; the final binary could then be passed directly to the BEAM and used without an additional copy. There's probably some computational overhead in allocating Erlang binaries rather than byte arrays to back the vector, which will need to be weighed against the advantage of halving the allocations.
I think there's a good chance this ends up being a net positive, but it needs testing.
System:
Operating System: macOS
CPU Information: Apple M1
Number of Available Cores: 8
Available memory: 16 GB
Elixir 1.14.0
Erlang 25.0.4
small_list = Enum.to_list(1..100)
big_list = Enum.to_list(1..10_000)
small_map = small_list |> Enum.map(fn x -> {"key#{x}", x} end) |> Enum.into(%{})
big_map = big_list |> Enum.map(fn x -> {"key#{x}", x} end) |> Enum.into(%{})

Benchee.run(%{
  "jsonrs.encode! - small" => fn -> Jsonrs.encode!(small_map) end,
  "Jason.encode! - small" => fn -> Jason.encode!(small_map) end
})

Benchee.run(%{
  "jsonrs.encode! - big" => fn -> Jsonrs.encode!(big_map) end,
  "Jason.encode! - big" => fn -> Jason.encode!(big_map) end
})
RESULT:
Operating System: macOS
CPU Information: Apple M1
Number of Available Cores: 8
Available memory: 16 GB
Elixir 1.14.0
Erlang 25.0.4
Benchmark suite executing with the following configuration:
warmup: 2 s
time: 5 s
memory time: 0 ns
reduction time: 0 ns
parallel: 1
inputs: none specified
Estimated total run time: 14 s
Benchmarking Jason.encode! - small ...
Benchmarking jsonrs.encode! - small ...
Name ips average deviation median 99th %
Jason.encode! - small 84.74 K 11.80 μs ±36.67% 11.33 μs 16.83 μs
jsonrs.encode! - small 25.35 K 39.45 μs ±56.39% 35.08 μs 94.58 μs
Comparison:
Jason.encode! - small 84.74 K
jsonrs.encode! - small 25.35 K - 3.34x slower +27.65 μs
Operating System: macOS
CPU Information: Apple M1
Number of Available Cores: 8
Available memory: 16 GB
Elixir 1.14.0
Erlang 25.0.4
Benchmark suite executing with the following configuration:
warmup: 2 s
time: 5 s
memory time: 0 ns
reduction time: 0 ns
parallel: 1
inputs: none specified
Estimated total run time: 14 s
Benchmarking Jason.encode! - big ...
Benchmarking jsonrs.encode! - big ...
Name ips average deviation median 99th %
Jason.encode! - big 642.38 1.56 ms ±10.56% 1.50 ms 1.93 ms
jsonrs.encode! - big 478.25 2.09 ms ±6.31% 2.07 ms 2.36 ms
Comparison:
Jason.encode! - big 642.38
jsonrs.encode! - big 478.25 - 1.34x slower +0.53 ms
The claims in the README don't seem to reflect these recent developments. Maybe adding a disclaimer about performance differences across Erlang VM / Elixir versions would help?
Cheers, and thanks for this library!
I am loving the approach of your library. At my day job we have a use case where we build some very large data structures, encode them to JSON, and then gzip that JSON. It is pretty common for us to see JSON documents of 140 MB+ that compress to ~9 MB after gzipping.
Would you be open to me extending your library with an option to do streaming gzip compression of the JSON as it is being generated? This would reduce my system's memory footprint significantly, since I would only need to hold the ~9 MB of gzipped JSON instead of the full 140 MB of raw JSON.
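The streaming idea above can be sketched on the BEAM side with Erlang's built-in :zlib, feeding JSON fragments into a deflate stream as they are produced so the full uncompressed string never has to exist at once. The chunks list below is a stand-in for fragments coming out of a streaming encoder:

```elixir
# Chunked gzip with :zlib. windowBits 15 + 16 = 31 selects the gzip
# wrapper (as opposed to raw deflate or zlib framing).
z = :zlib.open()
:ok = :zlib.deflateInit(z, :default, :deflated, 31, 8, :default)

# Stand-in for JSON fragments arriving incrementally.
chunks = [~s({"a":), ~s(1,"b":), ~s(2})]

compressed =
  chunks
  |> Enum.map(&:zlib.deflate(z, &1))
  |> Kernel.++([:zlib.deflate(z, [], :finish)])
  |> IO.iodata_to_binary()

:zlib.deflateEnd(z)
:zlib.close(z)

# :zlib.gunzip(compressed) restores the concatenated JSON fragments.
```

In a real streaming setup each deflate output would be written to a file or socket instead of accumulated, which is where the memory saving comes from.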
Hello, I'm trying to move from Jason to Jsonrs, but I'm hitting some weird type issues and Dialyzer is going crazy. One example is the code below:
@spec deserialize(binary) :: {term, Metadata.t() | nil} | :error
def deserialize(message) do
  case Jsonrs.decode!(message) do
    %{"route" => route, "reference" => reference, "payload" => payload} ->
      {payload, %Metadata{reference: reference, route: route}}

    %{"d" => payload} ->
      {payload, nil}

    _ ->
      :error
  end
end
Dialyzer just gives me this for the first case clause:
The pattern can never match the type.
Pattern:
%{
<<114, 101, 102, 101, 114, 101, 110, 99, 101>> => _reference,
<<114, 111, 117, 116, 101>> => _route,
<<112, 97, 121, 108, 111, 97, 100>> => _payload
}
Type:
binary()
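For readability: the pattern in the Dialyzer output is printed as raw byte sequences, but those binaries are just the string keys from the case clause, as the pattern matches below confirm:

```elixir
# Dialyzer prints binaries byte-by-byte; decoded, they are the map keys.
"reference" = <<114, 101, 102, 101, 114, 101, 110, 99, 101>>
"route" = <<114, 111, 117, 116, 101>>
"payload" = <<112, 97, 121, 108, 111, 97, 100>>
```

So the warning says the map pattern with those three string keys can never match a value of type binary().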
I don't understand how decode!() can be expected to return a binary() type in the Dialyzer run. I did look at Jsonrs's typespecs and everything seems correct: the spec says decode!() should return term, so I'm at a bit of a loss...
Worth noting: I had no Dialyzer issues at all with Jason, and the only thing that changed in this function was the module name of the decode function.
Hello.
$ FORCE_JSONRS_BUILD=true mix deps.compile jsonrs
==> jsonrs
Compiling 2 files (.ex)
== Compilation error in file lib/jsonrs.ex ==
** (RuntimeError) precompiled NIF is not available for this target: "aarch64-unknown-linux-musl".
The available targets are:
- aarch64-apple-darwin
- x86_64-apple-darwin
- x86_64-unknown-linux-gnu
- x86_64-unknown-linux-musl
- arm-unknown-linux-gnueabihf
- aarch64-unknown-linux-gnu
- x86_64-pc-windows-msvc
- x86_64-pc-windows-gnu
lib/jsonrs.ex:8: (module)
(elixir 1.14.0) lib/kernel/parallel_compiler.ex:346: anonymous fn/5 in Kernel.ParallelCompiler.spawn_workers/7
could not compile dependency :jsonrs, "mix compile" failed. Errors may have been logged above. You can recompile this dependency with "mix deps.compile jsonrs", update it with "mix deps.update jsonrs" or clean it with "mix deps.clean jsonrs"
Also, how do I go about requesting an aarch64-unknown-linux-musl target? I'm running in an Alpine 3 container on an M1 Mac host.
Thank you.
Hi,
The newer Rustler version fixes building this library with OTP 24 (see the related Rustler issue).
Could you please publish an updated library version? Thanks.