
Taoensso open source
Documentation | Latest releases | Get support

Nippy

The fastest serialization library for Clojure

Clojure's rich data types are awesome. And its reader allows you to take your data just about anywhere. But the reader can be painfully slow when you've got a lot of data to crunch (like when you're serializing to a database).

Nippy is an attempt to provide a reliable, high-performance drop-in alternative to the reader.

Used by Carmine, Faraday, PigPen, Onyx, XTDB, Datalevin, and others.

Latest release/s


See here for earlier releases.

Why Nippy?

  • Small, simple all-Clojure library
  • Terrific performance: the best for Clojure that I'm aware of
  • Comprehensive support for all standard data types
  • Easily extendable to custom data types
  • Robust test suite, incl. full coverage for every supported type
  • Auto fallback to Java Serializable when available
  • Auto fallback to Clojure Reader for all other types (including tagged literals)
  • Pluggable compression with built-in LZ4, Zstandard, etc.
  • Pluggable encryption with built-in AES128
  • Tools for easy + robust integration into 3rd-party libraries, etc.
  • Powerful thaw transducer for flexible data inspection and transformation
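A minimal usage sketch; `freeze` returns a byte array and `thaw` reads it back:

```clojure
(require '[taoensso.nippy :as nippy])

;; Serialize any supported Clojure value to a byte array:
(def frozen (nippy/freeze {:id 42 :tags #{:a :b} :name "Nippy"}))

;; ...and read it back:
(nippy/thaw frozen) ; => {:id 42, :tags #{:a :b}, :name "Nippy"}
```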

Performance

Since its earliest versions, Nippy has consistently been the fastest serialization library for Clojure that I'm aware of. Latest benchmark results:

[Benchmark results image]

Documentation

Funding

You can help support continued work on this project, thank you!! 🙏

License

Copyright © 2012-2024 Peter Taoussanis.
Licensed under EPL 1.0 (same as Clojure).

nippy's People

Contributors

danmason, fierycod, isaksky, johnchapin, kul, mpenet, postspectacular, ptaoussanis, slipset, weavejester, ztellman


nippy's Issues

Add custom serialization support

Thanks to James Reeves for this suggestion:

(thaw bytes {:readers {Vector3D read-vector} :writers {Vector3D write-vector}})

It might result in a prefix for the data like 80 \V \e \c \t \o \r \3 \D \

(defn read-vector [^DataInputStream s]
  (Vector3D. (.readDouble s) (.readDouble s) (.readDouble s)))

(defn write-vector [^DataOutputStream s ^Vector3D v]
  (.writeDouble s (.getX v))
  (.writeDouble s (.getY v))
  (.writeDouble s (.getZ v)))

Error with umlauts in strings (windows)

Hi,

I noticed a bug while freezing/thawing strings with umlauts.
I'm using Windows as my OS, so this is probably the reason.

Example:

(= "some text with umlauts äöü"
    (nippy/thaw (nippy/freeze "some text with umlauts äöü")))
=> false

This could be fixed in nippy.clj line 189 with (.getBytes x "UTF-8") instead of (.getBytes x)

Thanks.

String.getBytes and nippy

Hello,

I wanted to ask if nippy is doing anything special when you freeze-to-bytes a string because I have noticed that if I do this:

(thaw-from-bytes (.getBytes "my-string"))

then it simply won't work, even though I expected strings to work fine. I am curious, what is the reason for this?
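For context, a sketch of the distinction (using the current `freeze`/`thaw` names): frozen output is a Nippy envelope with a header and type information, not raw UTF-8 bytes, so `thaw` can only read data that `freeze` produced:

```clojure
(require '[taoensso.nippy :as nippy])

;; Raw String.getBytes output carries no Nippy header, so this fails:
;; (nippy/thaw (.getBytes "my-string")) ; => throws

;; Pair freeze with thaw instead:
(nippy/thaw (nippy/freeze "my-string")) ; => "my-string"
```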

Thank you for your time

Serialization problem across clojure versions

I have the following Java class as an example, in a project, let's say org.foo/dummy-project:

package org;

import java.util.UUID;
import clojure.lang.Keyword;
import java.io.Serializable;

public class Dummy implements Serializable {

  private Keyword foo;
  private UUID id;
  private static final long serialVersionUID = 327399273872L;

  public Dummy(Keyword foo, UUID id) {
    if (id==null) {
      throw new IllegalArgumentException("fooo");
    }
    this.foo = foo;
    this.id = id;
  }
}

Now suppose the above class is available in two Clojure projects, both using nippy 2.6.2, on Clojure 1.5.1 and 1.6.0 respectively. The following errors are encountered:

From 1.5.1 project

  (import 'java.io.FileOutputStream)
  (require '[taoensso.nippy :as n])

  (with-open [out (FileOutputStream. "/tmp/foo")]
    (.write out (n/freeze (org.Dummy. :foo (java.util.UUID/randomUUID)))))

From 1.6.0 project

  (import 'com.google.common.io.Files)
  (n/thaw (Files/toByteArray (clojure.java.io/file "/tmp/foo")))
  ; gives {:nippy/unthawable "org.Dummy", :type :serializable}

I noticed that this happened after the introduction of the Keyword member in the Dummy class.

exception using 2.0.0-RC1

Hi,

I migrated nippy from 1.2.1 to 2.0.0-RC1, and unfortunately I get an error when starting the app:

java.lang.IllegalArgumentException: No single method: freeze_to_stream_STAR_ of interface: taoensso.nippy.Freezable found for function: freeze-to-stream* of protocol: Freezable.

Is it caused by a broken jar on Clojars?

Thanks

Julius

Possible nippy issues found with Eastwood linter

The second ex-info call in get-auto-encryptor is only given one argument, but it requires either 2 or 3 args.

thaw-nippy-v1-data is a fn defined to take 1 arg, but in one place it is called with none.

Byte Encoding

What is the default encoding for bytes with nippy?

I get the following errors when working with this API, which assumes UTF-8:

user=> (import 'org.apache.hadoop.hbase.util.Bytes)
org.apache.hadoop.hbase.util.Bytes
user=> (n/thaw (Bytes/toBytes (Bytes/toString (n/freeze n/stress-data))))

CorruptionException last byte of compressed length int has high bit set  org.iq80.snappy.SnappyDecompressor.readUncompressedLength (SnappyDecompressor.java:425)
user=> (n/thaw (Bytes/toBytes (Bytes/toString (n/freeze n/stress-data {:compressor nil}))))

Exception No reader provided for custom type ID: 65  taoensso.nippy/thaw-from-stream (nippy.clj:365)
user=> (n/thaw (Bytes/toBytes (Bytes/toString (n/freeze {:a 1 :b "2" :c 0.1}))))

CorruptionException Invalid copy offset for opcode starting at 27  org.iq80.snappy.SnappyDecompressor.decompressAllTags (SnappyDecompressor.java:165)
user=> (n/thaw (Bytes/toBytes (Bytes/toString (n/freeze {:a 1 :b "2"}))))
{:a 1, :b "2"}
user=> (n/thaw (Bytes/toBytes (Bytes/toString (n/freeze {:a 1 :b "2" :c 9}))))
{:a 1, :c 9, :b "2"}
user=> (n/thaw (Bytes/toBytes (Bytes/toString (n/freeze {:a 1 :b "2" :c 9.1}))))
{:a 1, :c 9.1, :b "2"}
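For what it's worth, frozen output is arbitrary binary, so a bytes -> String -> bytes trip through a text codec (as with Bytes/toString above) can corrupt it. If a String transport is required, Base64 round-trips safely; a sketch using java.util.Base64 (Java 8+):

```clojure
(require '[taoensso.nippy :as nippy])
(import 'java.util.Base64)

;; Encode the frozen bytes as Base64 text, then decode before thawing:
(let [frozen  (nippy/freeze {:a 1 :b "2" :c 0.1})
      as-text (.encodeToString (Base64/getEncoder) frozen)]
  (nippy/thaw (.decode (Base64/getDecoder) as-text)))
;; => {:a 1, :b "2", :c 0.1}
```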

Thaw two or more nippy frozen concatenated strings

I am using nippy to write data to and read data from a socket.
A socket read can return the result of more than one write operation, which causes nippy to crash, since it cannot handle more than one frozen Clojure data object in a single thaw call.
Am I using the API correctly? Is there some way I can use the existing API to achieve the intended result?
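One way to use the existing API for this (assuming a Nippy version that exposes the lower-level `freeze-to-out!`/`thaw-from-in!` fns) is to work against the socket's DataOutput/DataInput directly, reading one value at a time; a sketch with in-memory streams standing in for the socket:

```clojure
(require '[taoensso.nippy :as nippy])
(import '[java.io ByteArrayInputStream ByteArrayOutputStream
                  DataInputStream DataOutputStream])

;; Write several values back-to-back to one stream:
(def baos (ByteArrayOutputStream.))
(let [out (DataOutputStream. baos)]
  (nippy/freeze-to-out! out {:msg 1})
  (nippy/freeze-to-out! out {:msg 2}))

;; Read them back one value at a time from the same byte stream:
(let [in (DataInputStream. (ByteArrayInputStream. (.toByteArray baos)))]
  [(nippy/thaw-from-in! in)
   (nippy/thaw-from-in! in)])
;; => [{:msg 1} {:msg 2}]
```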

Error while running thaw

Hey,

I get the following error when I thaw the result of the sample call (nippy/freeze nippy/stress-data) in my REPL:

CorruptionException Invalid copy offset for opcode starting at 23 org.iq80.snappy.SnappyDecompressor.decompressAllTags (SnappyDecompressor.java:165)

running
clojure 1.5.1
nippy 2.0.0-RC1

Decompress/deserialize from Java worker when data inserted via Amazonica

We are trying to utilize the Amazon Connector Library:

https://github.com/awslabs/amazon-kinesis-connectors

I read through this issue:

mcohen01/amazonica#36

I still want to compress on the wire and like how nippy does it; I just want to consume messages in other code like Java.

We are inserting records via Amazonica, but we need to decompress/deserialize via Java, so I created a small API method in a small Clojure library to use nippy, since that seemed to make sense.

I'm using a String since it makes calling the Clojure API from Java simpler, and I can use the default Amazon connector library code which converts the HeapByteBuffer to a String. I'll optimize that later.

(defn- -deserializeKinesisEvent
  [this byte-buffer-string]
  (let [
        byte-buffer (ByteBuffer/wrap (.getBytes byte-buffer-string))
        b (byte-array (.remaining byte-buffer))]
    (.get byte-buffer b)
    (nippy/thaw b)))

It looks like Snappy is the default for thaw/freeze with Amazonica, correct? I just want to confirm there is nothing else needed so I can try to get this working in Java or Clojure.

Exception when i try to call that method from Java to thaw:

at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
 [java]     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
 [java]     at java.lang.Thread.run(Thread.java:744)
 [java] Caused by: java.lang.Exception: Thaw failed: Encrypted data or wrong compressor?
 [java]     at taoensso.nippy$thaw$ex__3212.doInvoke(nippy.clj:359)
 [java]     at clojure.lang.RestFn.invoke(RestFn.java:423)
 [java]     at taoensso.nippy$thaw$try_thaw_data__3216.invoke(nippy.clj:377)
 [java]     at taoensso.nippy$thaw.doInvoke(nippy.clj:393)
 [java]     at clojure.lang.RestFn.invoke(RestFn.java:410)
 [java]     at datasnap_core.csv_gen.transformer$_deserializeKinesisEvent.invoke(transformer.clj:67)
 [java]     at datasnap_core.csv_gen.Transformer.deserializeKinesisEvent(Unknown Source)
 [java]     ... 15 more
 [java] Caused by: org.iq80.snappy.CorruptionException: last byte of compressed length int has high bit set
 [java]     at org.iq80.snappy.SnappyDecompressor.readUncompressedLength(SnappyDecompressor.java:425)
 [java]     at org.iq80.snappy.SnappyDecompressor.uncompress(SnappyDecompressor.java:38)
 [java]     at org.iq80.snappy.Snappy.uncompress(Snappy.java:37)
 [java]     at taoensso.nippy.compression.SnappyCompressor.decompress(compression.clj:19)
 [java]     at taoensso.nippy$thaw$try_thaw_data__3216.invoke(nippy.clj:369)
 [java]     ... 19 more
 [java] java.lang.reflect.InvocationTargetException
 [java]     at sun.reflect.NativeMethodAcc

[proposal] Try to avoid conflict in custom type definition

Hi Peter,

In our development I found it's very easy to run into a situation where nippy custom type ids conflict, especially when the type definitions are not in a single source file. It's hard to detect at compile time, and also in REPL development (the conflicting definition is often not loaded in the REPL).

My suggestion is to add some mechanism to avoid this conflict in the library. Currently there are two approaches:

  • Making custom type ids namespaced. This would solve the issue completely, but it might introduce serialization overhead.
  • Throwing an exception when overriding existing custom types. This can be done simply in extend-thaw and extend-freeze by just checking whether the id is already used in the custom-types map.
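As a point of reference, newer Nippy versions accept (namespaced) keyword ids in `extend-freeze`/`extend-thaw`, which makes accidental collisions across source files and libraries much less likely than small integer ids; a sketch:

```clojure
(require '[taoensso.nippy :as nippy])

(defrecord Point [x y])

;; A namespaced keyword id scopes the custom type to this app:
(nippy/extend-freeze Point :my-app/point
  [p out]
  (.writeDouble out (double (:x p)))
  (.writeDouble out (double (:y p))))

(nippy/extend-thaw :my-app/point
  [in]
  (->Point (.readDouble in) (.readDouble in)))

;; Round-trips the record through the custom codec:
(nippy/thaw (nippy/freeze (->Point 1.0 2.0)))
```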

Add alternative (high-compression) ICompressor(s)

Snappy (currently implemented) offers a sensible default, but there are a lot of cases where a higher compression ratio for higher CPU usage would be a good tradeoff.

As of v2, adding new compressors is very simple (just implement the ICompressor protocol in taoensso.nippy.compression).

GZip would be a good start, and I'm sure there are some other higher-compression Java libs we can support.
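As an illustration only (not part of Nippy), a GZip byte-array codec needs nothing beyond java.util.zip; wrapping these two fns in the ICompressor protocol would then be a small reify:

```clojure
(require '[clojure.java.io :as io])
(import '[java.io ByteArrayInputStream ByteArrayOutputStream]
        '[java.util.zip GZIPInputStream GZIPOutputStream])

(defn gzip-compress
  "Compresses a byte array with GZip, returning a new byte array."
  ^bytes [^bytes ba]
  (let [baos (ByteArrayOutputStream.)]
    (with-open [gz (GZIPOutputStream. baos)]
      (.write gz ba))
    (.toByteArray baos)))

(defn gzip-decompress
  "Decompresses a GZip'd byte array, returning the original bytes."
  ^bytes [^bytes ba]
  (let [baos (ByteArrayOutputStream.)]
    (with-open [gz (GZIPInputStream. (ByteArrayInputStream. ba))]
      (io/copy gz baos))
    (.toByteArray baos)))
```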

Nippy 2.0.0 is broken

Hi.
I just tried new carmine 2.0.0 and found that nippy 2.0.0 is broken:

I installed nippy from clojars:

[com.taoensso/nippy "2.0.0"]

And then required it in my REPL:

(require 'taoensso.nippy)

And got an error:

CompilerException java.lang.IllegalArgumentException: No single method: freeze_to_stream_STAR_ of interface: taoensso.nippy.Freezable found for function: freeze-to-stream* of protocol: Freezable, compiling:(taoensso/nippy.clj:123) 

Create taoensso.nippy/freezeable?

I'd like a function that would return whether or not a value is able to be frozen and thawed by nippy. This prevents having to attempt the serialization and catch any exceptions. It can also return the proper value for Clojure functions, which appear to be serializable, but won't work in a different REPL without AOT.

=> (taoensso.nippy/freezeable? 0)
true

=> (taoensso.nippy/freezeable? (atom 0))
false

=> (taoensso.nippy/freezeable? (fn [x] x))
false

Thanks,
Matt
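Until such a predicate exists, a pragmatic stand-in (a hypothetical helper, not Nippy API) is to attempt the round-trip and catch failures; note it shares the caveat above, since fns can appear to round-trip locally yet fail in another REPL without AOT:

```clojure
(require '[taoensso.nippy :as nippy])

;; Attempts a full freeze/thaw round-trip; cost is O(size of value):
(defn roundtrips? [x]
  (try
    (nippy/thaw (nippy/freeze x))
    true
    (catch Exception _ false)))

(roundtrips? 0) ; => true
```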

Support for reading from a DataInput

I'd like to use Nippy in a little Netty project I'm working on, and Netty exposes ByteBufInputStreams which act as both InputStream and DataInput. It'd be nice to avoid an extra byte[] copy by reading directly from a DataInput, and I think all the calls in thaw-from-stream could take that type instead. Then I'd just need to make thaw-from-stream publically accessible, or extend (thaw) to accept streams as well as byte arrays. Thoughts?

Perf+size optimization for numbers

A byte-sized long-typed number is currently serialized as a long (8 bytes). It should be relatively simple to optimize away these kinds of cases so that we're always storing a minimal amount of data.

There are a couple ways of doing this, and I'd like it to be simple to extend to custom types - so going to take a little time to think about it.

Update: Actually it occurs to me now that if this trick could be applied to all internal integers (incl. encoded data structure sizes), it could make a significant difference indeed.

sorted-maps are converted into hash-maps

Hi~
This lib is very useful, but I found a little problem with sorted-map:
after a round-trip, sorted-maps are converted into hash-maps.
This can be a problem for anyone who depends on sorted-map data structures.
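A hypothetical workaround until sorted maps are handled natively: register a custom type id for clojure.lang.PersistentTreeMap (note that a custom comparator still can't survive serialization):

```clojure
(require '[taoensso.nippy :as nippy])

;; Freeze the entries as a flat [k v k v ...] vector...
(nippy/extend-freeze clojure.lang.PersistentTreeMap :demo/sorted-map
  [m out]
  (nippy/freeze-to-out! out (vec (apply concat m))))

;; ...and rebuild a sorted-map (default ordering) on thaw:
(nippy/extend-thaw :demo/sorted-map
  [in]
  (apply sorted-map (nippy/thaw-from-in! in)))

(sorted? (nippy/thaw (nippy/freeze (sorted-map :b 2 :a 1)))) ; => true
```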

serializable? fails with StackOverflow on simple 'repeated' list

I had a simple data structure (a list of lists of strings) and noticed that serializable? fails with a StackOverflowError, but I can successfully call freeze on it.

The simplest case which breaks seems to be this:

(require 'taoensso.nippy.utils)
(def l (list 1))
(def ll (repeat 2 l))
(taoensso.nippy.utils/serializable? ll) ; => StackOverflowError

(If I use (def ll (list l l)) it does work.) So is it somehow related to Clojure's structural sharing?

My use case is a function to freeze/serialize a bunch of vars, where some of them might not be serializable.
So I want to test this before calling freeze on them.

:read-eval? true causes error on deftyped types with custom print-dup

Thawing with :read-eval? true seems to lead to an error if I use custom types with a custom print-dup.

Here my small example code:

(deftype MyType [a b c])
(defn new-type [a b c] (MyType. a b c))

(defmethod print-method MyType
  [t writer]
  (.write writer (str "#user.new-type[" (.a t) " " (.b t) " " (.c t) "]")))

(defmethod print-dup MyType
  [t writer]
  (.write writer (str "#=(user/new-type " (.a t) " " (.b t) " " (.c t) ")")))

(binding [*print-dup* true]
  (read-string (pr-str (new-type 1 2 3))))
;; => #user.new-type[1 2 3]

(nippy/thaw (nippy/freeze (new-type 1 2 3)) :read-eval? true)
;; Quit to level 1
;; Evaluation aborted

Support deftypes

Hi,

Is it possible to add support for deftyped classes? I can see you've left a data type id for them commented out. :) Looking at the class a deftype generates, it implements IType, a single constructor for all fields and a static method "getBasis" which defines the field order for the constructor so it should be possible to use reflection to read/create. I suppose I could use this to make a macro to generate the extend-freeze/extend-thaw but it would be handy to have fallback to runtime support for types that I haven't created a custom (de)serializer for (at least as an option).
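A rough sketch of the reflection idea above (hypothetical, not Nippy API): getBasis gives the field order, so fields can be frozen as a vector and the instance rebuilt via the positional constructor:

```clojure
(require '[taoensso.nippy :as nippy])

(deftype MyType [a b c])

;; Read a deftype instance's field values in getBasis order:
(defn basis-field-values [t]
  (let [klass (class t)
        basis (clojure.lang.Reflector/invokeStaticMethod
                klass "getBasis" (object-array 0))]
    (mapv (fn [sym]
            (let [f (.getDeclaredField klass (name sym))]
              (.setAccessible f true)
              (.get f t)))
          basis)))

(nippy/extend-freeze MyType :demo/my-type
  [t out]
  (nippy/freeze-to-out! out (basis-field-values t)))

(nippy/extend-thaw :demo/my-type
  [in]
  (let [[a b c] (nippy/thaw-from-in! in)]
    (MyType. a b c)))
```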

nippy uses old version of timbre

Nippy depends on an old version of timbre. This broke my build, because there's conflicting dependencies on encore. I think it would be fine to just upgrade the dependency, since I don't think there's any breaking API change. Right now, I've done it with just an exclusion, and that seems to work fine.

I don't know if hatnik is a tool you know of/like, but FWIW, this is what automatically notified me of the latest version existing.

Thanks a lot for all of your work on nippy/timbre/etc! I'm a big fan of all of your projects :)

Default implementation for IRecord

It seems that the default implementation for freezing IRecord could be made more efficient. As it stands right now, it creates a new in-memory map by doing (into {} x) because we'd like to preserve the declared fields as well as the underlying extra map.

There's an (undocumented) Clojure trick:

(.__extmap record_instance) will return the extra map. We can write that map directly and then also write the declared fields. The list of declared fields can be gotten via introspection like this:

(defn static? [field]
  (java.lang.reflect.Modifier/isStatic
   (.getModifiers field)))

(defn get-record-field-names [record]
  (->> record
       .getDeclaredFields
       (remove static?)
       (map #(.getName %))
       (remove #{"__meta" "__extmap"})))

An alternative to introspecting this at freeze time would be to allow the client code to register the record as freezable, which would introspect the class and record the list of fields once at registration time.

Comment: Edited to make the code visible - Peter

Thawing a frozen record returns back an older version of that Record Class

I think I am having the following issue; I may be wrong, as I haven't yet tracked it down fully, but I will document it here in case someone else is having a similar issue:

  1. I load up my project in Emacs with Cider, which allows live recompilation.
  2. I compile a record implementation, and a method that dispatches on that record. I freeze and thaw an instance of the record. This works fine - the method dispatches correctly on the instance that is thawed.
  3. I recompile the record and the method. I freeze and thaw an instance of the record. This no longer works - the method doesn't dispatch on the instance that is thawed, and claims no implementation can be found. I think this is because the classloader that nippy uses somehow caches the first version of the class that it found in step 2, so I freeze an instance of a record whose Java Type is Class1-CompiledAtStep3, but when it is thawed it comes back as Class1-CompiledAtStep2.

Two references to identical object come back as copies of object

Consider the following:

user> (def m1 {:a 1})
#'user/m1
user> (def m2 m1)
#'user/m2
user> (identical? m1 m2)
true
user> (def r {:m1 m1 :m2 m2})
#'user/r
user> r
{:m1 {:a 1}, :m2 {:a 1}}
user> (identical? (r :m1) (r :m2))
true
user> (def r2 (nippy/thaw (nippy/freeze r)))
#'user/r2
user> (identical? (r2 :m1) (r2 :m2))
false

In the example as given it's more or less benign, but the same behavior happens when the reference is to Java objects where identity matters.

Unfreezing fn when AOT

When I compile my code in AOT mode, I can't freeze data if I use a fn as a hashmap value.
If I remove AOT compilation, all works fine.

EvalReader not allowed for thaw of java.util.Date

When running thaw on java.util.Date a runtime exception is thrown in Clojure 1.5.1 and nippy 2.1.0

(require '[taoensso.nippy :as nippy])
(nippy/thaw (nippy/freeze (java.util.Date.)))
RuntimeException EvalReader not allowed when *read-eval* is false. clojure.lang.Util.runtimeException (Util.java:219)

Is there a work around for this (other than disabling the reader check?)

More helpful deserialization errors for custom types

Hi @ptaoussanis,

One thing that's bitten us a couple of times is that if you serialize some record and then forget to require the record definition on the receiving end before deserializing, you get this fairly generic error:

Thaw failed: Decryption/decompression failure, or data unfrozen/damaged.

I haven't looked at the wire format too much, but when deserialization fails would it be possible to guess that this is happening? It seems like the deserializer should be getting a class name it doesn't recognize.

Thanks!

Prefer APersistentMap over IPersistentMap

The IPersistentMap interface is implemented by records as well as normal map structures, so when serializing records Nippy incorrectly encodes them as maps.

user=> (require '[taoensso.nippy :as nippy])
nil
user=> (defrecord Foo [x])
user.Foo
user=> (nippy/thaw (nippy/freeze (Foo. 1)))
{:x 1}

The APersistentMap abstract class, on the other hand, is implemented by all map types except for records:

user=> (instance? clojure.lang.APersistentMap (hash-map))
true
user=> (instance? clojure.lang.APersistentMap (array-map))
true
user=> (instance? clojure.lang.APersistentMap (sorted-map))
true
user=> (instance? clojure.lang.APersistentMap (Foo. 1))
false

For this reason it's better for Nippy to dispatch its freezer off of APersistentMap instead of IPersistentMap.

Protocols on thawed records fail

(defprotocol ISample
  (greet [_]))

(defrecord Sample [x]
  ISample
  (greet [_] (str "g'day " x)))

(def s (Sample. "mate"))

(greet s)

(def e (thaw (freeze s)))

(greet e)

The thawed version fails with java.lang.IllegalArgumentException: No implementation of method: :greet of protocol: #'easy.scratch/ISample found for class: easy.scratch.Sample

Provide simple java API

This could be useful for people wanting to use Nippy in projects that use both languages, and since the API is not very large at the moment it shouldn't require too much work.

2.6.0-beta1 requires many more dependencies

When upgrading from 2.6.0-alpha1 to 2.6.0-beta1, I noticed that nippy is pulling in the phone book of dependencies. When running in jar-sensitive places (hadoop), this increases the likelihood of a conflict.

Here's alpha:

+--- com.taoensso:nippy:2.6.0-alpha1
| +--- org.clojure:clojure:1.4.0 -> 1.5.1
| +--- org.clojure:tools.reader:0.8.3
| +--- org.iq80.snappy:snappy:0.3
| --- org.tukaani:xz:1.4

Here's beta:

+--- com.taoensso:nippy:2.6.0-beta1
| +--- org.clojure:clojure:1.6.0-beta1
| +--- org.clojure:tools.reader:0.8.3
| +--- org.iq80.snappy:snappy:0.3
| +--- org.tukaani:xz:1.4
| +--- com.taoensso:encore:0.8.0
| | +--- com.cemerick:austin:0.1.4
| | | +--- org.clojure:clojure:1.5.1 -> 1.6.0-beta1
| | | +--- org.clojure:clojurescript:0.0-2014 -> 0.0-2173
| | | | +--- com.google.javascript:closure-compiler:v20131014
| | | | | +--- args4j:args4j:2.0.16
| | | | | +--- com.google.guava:guava:15.0
| | | | | +--- com.google.protobuf:protobuf-java:2.4.1
| | | | | +--- org.json:json:20090211
| | | | | --- com.google.code.findbugs:jsr305:1.3.9
| | | | +--- org.clojure:google-closure-library:0.0-20130212-95c19e7f0f5f
| | | | | --- org.clojure:google-closure-library-third-party:0.0-20130212-95c19e7f0f5f
| | | | +--- org.clojure:data.json:0.2.3
| | | | +--- org.mozilla:rhino:1.7R4
| | | | --- org.clojure:tools.reader:0.8.3
| | | --- com.cemerick:piggieback:0.1.3
| | | +--- org.clojure:clojure:1.5.1 -> 1.6.0-beta1
| | | +--- org.clojure:tools.nrepl:0.2.3
| | | --- org.clojure:clojurescript:0.0-2080 -> 0.0-2173 (*)
| | +--- com.keminglabs:cljx:0.3.2
| | | +--- org.clojure:clojure:1.5.1 -> 1.6.0-beta1
| | | +--- org.clojure:core.match:0.2.0
| | | +--- org.clojars.trptcolin:sjacket:0.1.0.3
| | | | +--- org.clojure:clojure:[1.3.0,) -> 1.6.0-beta1
| | | | +--- net.cgrand:regex:1.1.0
| | | | | --- org.clojure:clojure:[1.2.0,) -> 1.6.0-beta1
| | | | --- net.cgrand:parsley:0.9.1
| | | | +--- org.clojure:clojure:[1.2.0,) -> 1.6.0-beta1
| | | | --- net.cgrand:regex:1.1.0 (*)
| | | +--- com.cemerick:piggieback:0.1.0 -> 0.1.3 (*)
| | | --- watchtower:watchtower:0.1.1
| | | --- org.clojure:clojure:1.3.0 -> 1.6.0-beta1
| | +--- com.cemerick:clojurescript.test:0.2.2
| | | +--- org.clojure:clojure:1.5.1 -> 1.6.0-beta1
| | | --- org.clojure:clojurescript:0.0-1934 -> 0.0-2173 (*)
| | +--- org.clojure:clojure:1.6.0-beta1
| | +--- expectations:expectations:1.4.56
| | | +--- org.clojure:clojure:1.5.1 -> 1.6.0-beta1
| | | +--- erajure:erajure:0.0.3
| | | | +--- org.clojure:clojure:1.4.0 -> 1.6.0-beta1
| | | | --- org.mockito:mockito-all:1.8.0
| | | --- junit:junit:4.8.1
| | +--- reiddraper:simple-check:0.5.6
| | +--- org.clojure:clojurescript:0.0-2173 (*)
| | --- org.clojure:core.async:0.1.278.0-76b25b-alpha
| +--- expectations:expectations:1.4.56 (*)
| +--- reiddraper:simple-check:0.5.6
| +--- org.xerial.snappy:snappy-java:1.1.1-M1
| --- org.clojure:data.fressian:0.2.0
| +--- org.clojure:clojure:1.5.1 -> 1.6.0-beta1
| --- org.fressian:fressian:0.6.3

Fast serialization by default for UUIDs and Dates

I was a little surprised that UUIDs and Date objects fall back to the reader. With the new custom types mechanism in version 2.1 I was easily able to provide some custom serialization (reducing the roundtrip from 2600ms to 75ms!). However, since #uuid and #inst are in Clojure core, perhaps it's also worth including efficient serialization for these types by default?

For reference, this is how I converted the dates (in case it wasn't obvious):

(nippy/extend-freeze java.util.Date 1
  [date out-stream]
  (.writeLong out-stream (.getTime date)))

(nippy/extend-thaw 1
  [in-stream]
  (java.util.Date. (.readLong in-stream)))

Would you be interested in a pull request to implement binary serialization for both java.util.Date and java.util.UUID?

clojure.lang.PersistentVector$ChunkedSeq fails utils/freezable?.

First off, thanks for nippy--this library has saved my bacon on many occasions.

(utils/freezable? (clojure.lang.PersistentVector$ChunkedSeq [1 2 3] 0 0))
; => nil

Shouldn't this return a class? Uncommenting utils.clj:67 appears to fix the issue, as PersistentVector$ChunkedSeq implements clojure.lang.ISeq. The value freezes and thaws just fine.

adding option to freeze directly to file

Hey,
Not much of an issue, more like a feature request.

I have a huge structure that I need to freeze; I'm talking about lots and lots of GBs (my heap is at 100GB now).

I manage to construct this huge structure, but I can't freeze it, because nippy first has to keep the entire byte array on the heap. I don't have enough room on the machine for two representations of the data structure, and was wondering if there could be an option to spit these bytes directly to a file?

Thanks :)
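If your Nippy version exposes the lower-level `freeze-to-out!`/`thaw-from-in!` fns, you can already stream directly to disk without materializing the whole byte array. A sketch (`my-huge-structure` is a placeholder; note these fns bypass the default compression envelope, so read back with `thaw-from-in!` rather than `thaw`):

```clojure
(require '[taoensso.nippy :as nippy])
(import '[java.io DataInputStream DataOutputStream
                  FileInputStream FileOutputStream])

;; Stream the frozen bytes straight to a file:
(with-open [out (DataOutputStream. (FileOutputStream. "/tmp/big.nippy"))]
  (nippy/freeze-to-out! out my-huge-structure))

;; ...and stream it back in:
(with-open [in (DataInputStream. (FileInputStream. "/tmp/big.nippy"))]
  (nippy/thaw-from-in! in))
```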

Binary incompatible in freeze/thaw cycle

Hello.

Does nippy guarantee this contract?

(assert (= (seq (nippy/freeze a))
           (seq (nippy/freeze (nippy/thaw (nippy/freeze a))))))

If yes, lazy seqs break it:

(def a (map identity [1 2 3]))

(assert (= (seq (nippy/freeze a))
           (seq (nippy/freeze (nippy/thaw (nippy/freeze a))))))

;; AssertionError

Consider adding de/serialization for Throwable, Exception, ExceptionInfo

A few questions worth considering:

  • Is this a reasonable goal? The Reader does not support reading exception forms.
  • Which state should we capture? Can we drop stacktraces, keeping the message, causes, and ExceptionInfo data?
  • Assuming the goal is at all sound, is there a common-enough need to actually bake this in?
  • Consider using Serializable?

Thoughts welcome.

Working with Record

How should records be de/serialized?

user=> (defrecord MyRec [a b])
user.MyRec
user=> (n/thaw (n/freeze (MyRec. 1 2)))

ClassNotFoundException user.MyRec  java.net.URLClassLoader$1.run (URLClassLoader.java:366)

Regards

Need to control *read-eval*

Hi again,

We need to eval expressions from strings; we control the expression source, so there is zero risk for us.

Could you add a third entry point to thaw-from-bytes, or a separate one, allowing us to set the read-eval value?
In the meantime we've cloned 0.9.2 and tweaked it to leave read-eval at its default value.

Thank you,
Luc

Handling strings bigger than 64K

Hi,

readUTF and writeUTF cannot handle strings more than 64K in length.

I wrote these to specify the length as an int:

(defn- write-long-string
  [^DataOutputStream stream ^String string]
  (let [ba   (.getBytes string)
        size (alength ba)]
    (.writeInt stream size)
    (.write stream ba 0 size)))

(defn- read-long-string
  ^String [^DataInputStream stream]
  (let [size (.readInt stream)
        ba   (byte-array size)
        _    (.read stream ba 0 size)]
    (String. ba)))

and used them for these entries:

(freezer String id-string (write-long-string s x))
(freezer Object id-reader (write-long-string s (pr-str x)))

and in thaw-from-stream!*

id-reader (read-long-string s)
...
id-string (read-long-string s)

I tested these with several hundred kilobyte strings and it works flawlessly.
I chose to use an integer length; a long might be overkill. Up to you to change it to a long if you
see the need. I did not want to optimize this further; a two-byte "penalty" looks reasonable to me
to avoid the whole issue.

Thank you,

Luc P.

Custom java object serialization

Hello,

I am trying to serialize a custom Java object and, of course, I am getting:

Caused by: java.lang.IllegalArgumentException: No method in multimethod 'print-dup' for dispatch value

Is there a recommended way with nippy to serialize custom objects, or do I need to defmethod print-dup MyObject to make this work?
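Rather than print-dup, `extend-freeze`/`extend-thaw` let you declare a compact custom codec. A hypothetical sketch, assuming a com.example.Money class with getCurrency/getAmount accessors and a matching two-arg constructor:

```clojure
(require '[taoensso.nippy :as nippy])

;; Hypothetical class and type id; adapt to your object's accessors:
(nippy/extend-freeze com.example.Money :example/money
  [^com.example.Money m out]
  (nippy/freeze-to-out! out [(.getCurrency m) (.getAmount m)]))

(nippy/extend-thaw :example/money
  [in]
  (let [[currency amount] (nippy/thaw-from-in! in)]
    (com.example.Money. currency amount)))
```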

Thank you for your time

2.6.0 and 2.6.1 breaks clojure backward compatibility

My project was based on Clojure 1.4 and nippy 2.5.2; upon upgrading to 2.6.0 or 2.6.1 I get:

user=> (use 'taoensso.nippy)

FileNotFoundException Could not locate clojure/edn__init.class or clojure/edn.clj on classpath:   clojure.lang.RT.load (RT.java:432)
user=> (pst)
FileNotFoundException Could not locate clojure/edn__init.class or clojure/edn.clj on classpath: 
        clojure.lang.RT.load (RT.java:432)
        clojure.lang.RT.load (RT.java:400)
        clojure.core/load/fn--4890 (core.clj:5415)
        clojure.core/load (core.clj:5414)
        clojure.core/load-one (core.clj:5227)
        clojure.core/load-lib (core.clj:5264)
        clojure.core/apply (core.clj:603)
        clojure.core/load-libs (core.clj:5298)
        clojure.core/apply (core.clj:603)
        clojure.core/require (core.clj:5381)
        taoensso.encore/eval1956/loading--4784--auto----1957 (encore.clj:1)
        taoensso.encore/eval1956 (encore.clj:1)
nil
