chimney's Introduction

Chimney

Join the discussions at https://github.com/scalalandio/chimney/discussions

The battle-tested Scala library for data transformations. Supported for (2.12, 2.13, 3.3+) x (JVM, Scala.js, Scala Native).

Chimney documentation is available at https://chimney.readthedocs.io. Read the Docs keeps it versioned in case you need to access documentation for older versions.

If you are looking for up-to-date artifact versions ready to copy-paste into your build tool, we recommend Scaladex or Maven Repository.

Contribution

A way to get started is described in CONTRIBUTING.md, and a general overview of the architecture is given in DESIGN.md and in the Under the Hood section of the documentation.

chimney's People

Contributors

asdcdow, coreyoconnor, cucumissativus, gitter-badger, htmldoug, hughsimpson, jatcwang, jchapuis, jchyb, jgogstad, jorokr21, kmikulski, krzemin, lbialy, ldrygala, liff, mateuszkubuszok, mrtosz, noemirozpara, piotrkosecki, rayanral, reimai, rrramiro, saeltz, scala-steward, ulanzetz, wenta, willtrnr, yoohaemin, zarthross

chimney's Issues

Usage of tags can break derivation

Test case:

object TestConversion {
  import io.scalaland.chimney.dsl._
  import shapeless.tag
  import shapeless.tag._

  sealed trait Tag

  final case class Input(
    long:   Long @@ Tag,
    string: String
  )

  final case class Output(
    long: Long @@ Tag
  )

  val test = Input(tag[Tag](0L), "").transformInto[Output]
}

results in Could not find implicit for derivedTransformer...

Wrong type passed to withFieldConst results in ClassCastException

Minimal example:

case class Foo1(y: String)
case class Foo2(y: String, x: Int)

Foo1("test")
    .into[Foo2]
    .withFieldConst(_.x, "xyz")
    .transform

Unfortunately, this one results in a ClassCastException at runtime :(

The problem is caused by the fact that both the selector and the passed value refer to the same type variable, which in this case is inferred as Any. When the type is passed explicitly, the code does not compile, as expected. But I think we can do better than just advising users to always pass types explicitly :)

The idea for a fix would be to modify the signature of withFieldConst by introducing a separate type variable:
def withFieldConst[T, U](selector: To => T, value: U): TransformerInto[From, To, _]

Then we would like to verify that U is a proper subtype of T. We can't do this with a type bound (U <: T), as it would still be inferred in a single step as Any.

We could introduce something similar to the type-level evidence implicit ev: U <:< T, but most probably inside the macro code, as this is a whitebox macro that already takes implicit weak type tags. Plus, in the macro we can format a pretty error message.
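
A minimal sketch of how that evidence-based signature could look (placement and names are illustrative, not the actual Chimney API):

def withFieldConst[T, U](selector: To => T, value: U)(
  implicit ev: U <:< T
): TransformerInto[From, To, _]

With separate T and U, the compiler infers each independently, and the U <:< T evidence (or an equivalent check inside the macro) fails compilation for the Foo1/Foo2 example above instead of casting at runtime.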

Note that the same problem most probably exists in withFieldComputed and withFieldRenamed. Hopefully, the fix would be just the same.

Support for custom strategy of patching via options

EDIT: The actual description is in @krzemin's post below. This was originally a different issue.

Given

case class Foo(name: String, age: Int)
case class FooPatch(name: Option[String], age: Option[Int])

there could be a function taking instances of both and modifying the Foo with the fields that are present in FooPatch.
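
A hand-written sketch of the desired semantics (the helper name is hypothetical):

def patch(foo: Foo, p: FooPatch): Foo =
  Foo(
    name = p.name.getOrElse(foo.name),
    age  = p.age.getOrElse(foo.age)
  )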

Combination of supported features fails to compile

I have a test case during entity initialization that goes like this:

case class NewEntity(name: Option[String])
case class Entity(id: Long, name: Option[String], isDeleted: Boolean)

NewEntity(Some("name")).into[Entity]
    .withFieldConst('id, 0L)
    .withFieldConst('isDeleted, false)

I think we can start working on it in feature/feature-combination where I added the test case.

Add automatic field renaming according to a custom function or popular conventions

It might be useful to consider adding a method that automatically renames all fields of a class according to a provided f: String => String, something along these lines:

case class MyDTO(my_string_field: String, my_int_field: Int)
case class MyEntity(myStringField: String, myIntField: Int)

def rename(s: String) = {
  val words = s.split("_").toList
  (words.head :: words.tail.map(_.capitalize)).mkString
}

MyDTO("abc", 1).into[MyEntity].withFieldsAutorenamed(rename).transform

If running a user-provided function at compile time turns out to be impossible, or we deem it too dangerous, perhaps we could support transitions between the common case styles instead?

withFieldConst doesn't work for None

This doesn't work, but should!

case class X(a: Int)
case class Y(a: Int, b: Option[String])

X(5).into[Y].withFieldConst(_.b, None).transform

error: could not find implicit value for parameter transformer: io.scalaland.chimney.DerivedTransformer[X,Y,io.scalaland.chimney.internal.Modifier.fieldFunction[Symbol with shapeless.tag.Tagged[String("b")],X,None.type] :: shapeless.HNil]
       X(5).into[Y].withFieldConst(_.b, None).transform
                                              ^

When None is explicitly up-cast to Option[String], it works seamlessly:

X(5).into[Y].withFieldConst(_.b, None : Option[String]).transform

Optional non-empty fields in patched class overwritten by empty optional fields of patcher

Consider the following:

case class User(name: Option[String], age: Option[Int])

case class UserPatch(name: Option[String], age: Option[Int])

val user = User(Some("John"), Some(30))
val userPatch = UserPatch(None, None)

user.patchWith(userPatch) == user //expected behaviour, but *false* 
user.patchWith(userPatch) == User(None, None) //true

I would expect that, if the optional fields of the patcher are empty and those of the patched class are not, the latter should remain. Is the current behaviour the expected one?

local Transformer instance doesn't have precedence for value classes

Consider the following example:

case class VC(x: String) extends AnyVal

This works as expected, picking an instance for value classes:

VC("test").transformInto[String] // "test" : String

While in this example we would expect the local Transformer instance to be used:

implicit val vcTransformer: Transformer[VC, String] = _ => "abc"
VC("test").transformInto[String] // "test" : String, but should be "abc"

Better handling of Option[T] in target when no value provider can be found

Current support for Option[T] types is IMO not sufficient. For now we support a simple derivation rule: given a transformation A -> B, we can infer a transformation Option[A] -> Option[B]. It's useful, but it doesn't handle one common case, especially present in data type evolution:

case class Foo1(x: String)
case class Foo2(x: String, y: Option[Int])

We can't automatically transform Foo1 to Foo2 (and probably we shouldn't by default), but it would be nice to have a modifier that handles all optional values in the target so that, when no value provider in the source type can be found, we fill them with None automatically.

Example usage:

Foo1("abc").into[Foo2].withMissingOptionalsDefaultToNone.transform

Obviously, a shorter name than withMissingOptionalsDefaultToNone that still expresses the intent would be useful.

It should probably be addressed after fixing #46.
We should check how it interacts with the #45 proposal. Maybe we don't need both of them?

Improve selector API even more

After merging #34 it shouldn't be possible to pass an arbitrarily nested field selection as a selector - only a field selection on the captured lambda argument. But... not exactly.

Consider the following example:

val x: Foo = ...
val y: Bar = ...

x.into[Foo2].withFieldConst(c => y.f, ...).transform

That example would compile smoothly if both Bar and Foo contained a field f of the corresponding type. Otherwise, we end up with an ugly (shapeless-related) compilation error instead of "invalid selector". This can be improved by enforcing an additional check in the macro, so that the referred name is always the lambda argument itself, which should work both with explicit name references (c => c.f) and with wildcards (_.f).
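
To illustrate the intended rule (selectors here are illustrative):

x.into[Foo2].withFieldConst(_.f, ...)      // OK: wildcard rooted at the lambda argument
x.into[Foo2].withFieldConst(c => c.f, ...) // OK: explicit reference to the lambda argument
x.into[Foo2].withFieldConst(c => y.f, ...) // should be rejected with "invalid selector"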

Optional patcher

Hi. Can you provide something like an optional patcher? The idea would be to do something like this:

case class Foo(a: String, b: Int, c: Double)
case class Bar(a: Option[String], b: Option[Int])

val f = Foo("aaa", 123, 0.33)
val b = Bar(None, Some(987))
f.patchWithOpt(b) should be Foo("aaa", 987, 0.33) 

At the moment the way I can do it is:

f.copy(a = b.a.getOrElse(f.a), ...)
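
Spelled out fully for this example, the desired patching would be equivalent to this hand-written sketch:

def patchWithOpt(f: Foo, b: Bar): Foo =
  f.copy(
    a = b.a.getOrElse(f.a),
    b = b.b.getOrElse(f.b)
  )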

Complete test suite for protocol buffers

The idea is to:

  • configure scalaPB to run only in the Test configuration
  • define a simple example protocol with (possibly) all supported PB data types - case classes should be generated automatically by scalaPB
  • define a corresponding domain in the Scala world, but with value classes instead of primitives
  • write tests that check conversion round trips for various instances

It probably requires #7 to be completed (PB enums are like sealed hierarchies, but with a non-total mapping; we'll need to think about that).

@MateuszKubuszok WDYT?

Support reading from non-case classes

This would be really convenient in a Scala.js context. Native JS classes defined in Scala can't be case classes. Frequently, I want to transform native JS classes into my own internal case classes in order to use code shared with the JVM.

Obviously I would prefer both reading and writing, but I assume reading is much easier, since I think you can enumerate all of the class fields.
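
An illustrative Scala.js facade and the desired usage (all types here are hypothetical):

import scala.scalajs.js
import scala.scalajs.js.annotation.JSGlobal

@js.native
@JSGlobal
class JsPoint(val x: Double, val y: Double) extends js.Object

case class Point(x: Double, y: Double)

// Desired: read the facade's fields even though JsPoint is not a case class
// new JsPoint(1, 2).transformInto[Point]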

Fallback on existing target

Imagine a case where I have a public API model and an internal model:

case class ApiModel(name: Name, surname: Surname)
case class DomainModel(id: Id, name: Name, surname: Surname, additionalData: AdditionalData)

When I receive an ApiModel from the outside, I would like the data to be moved to a new DomainModel - but properties not overridden by the update could be taken from an existing instance, e.g.:

apiModel.into[DomainModel].fallbackOn(existingModel).transform

@krzemin WDYT?

Change base package to chimney?

With syntax-heavy libraries it usually helps a lot to have a shorter import. Currently with Chimney I find myself doing import io.scalaland.chimney.dsl._ a lot, but what I'd really like to do is just import chimney.implicits._ or chimney.dsl._.

I suggest renaming the base package in a future release (maybe by introducing a type alias to the old package, or moving everything and keeping deprecated type aliases that point to the new package).

Add an example of ADTs with more than enums

The current example:

sealed trait Color
object Color {
  case object Red extends Color
  case object Green extends Color
  case object Blue extends Color
}

sealed trait Channel
object Channel {
  case object Alpha extends Channel
  case object Blue extends Channel
  case object Green extends Channel
  case object Red extends Channel
}

doesn't show that it's possible to transform more complex ADTs with case class leaves. That could be improved :)
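
For instance, a pair of ADTs with case class leaves; Chimney matches subtypes by name and derives the field-wise transformations (names here are illustrative):

sealed trait Shape
object Shape {
  case class Circle(radius: Double) extends Shape
  case class Rectangle(width: Double, height: Double) extends Shape
}

sealed trait Figure
object Figure {
  case class Circle(radius: Double) extends Figure
  case class Rectangle(width: Double, height: Double) extends Figure
}

// (Shape.Circle(2.0): Shape).transformInto[Figure] // Figure.Circle(2.0)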

Improve handling of untyped polymorphic classes

I was able to transform a polymorphic class as in #61:

mono.into[PolyTarget[T]].withFieldComputed(_.poly, src => src.poly.asInstanceOf[T]).transform

But for my requirements I would prefer to omit the type entirely here and return PolyTarget[_]:

mono.into[PolyTarget[_]].transform
// compile error printed: class type required but PolyTarget[_] found

Would that be possible?

Control over configuration and reasonable defaults

So far we had only one configuration switch, which controlled whether we look up default values when performing case class transformer expansion. By default this behavior was turned on, and the user had the ability to disable it using the .disableDefaultValues DSL operation.

Considering the recent contributions #80 and #81, we have 2 new flags that add extra behavior around Java beans and optional values in target objects. And here comes the struggle: which of them should be turned on by default, and how do we want to control the configuration in some wider scope?

One of the proposals would be to have them all turned off by default (which means a change to the current semantics!) and provide the ability to turn them on with a single DSL operation:

sealed trait ExtraBehavior
case object DefaultValuesLookup extends ExtraBehavior
case object JavaBeansNaming extends ExtraBehavior
case object TargetOptionalDefaultsToNone extends ExtraBehavior

obj.into[Target]
  .enable(DefaultValuesLookup, JavaBeansNaming, TargetOptionalDefaultsToNone)
  .transform

// or even passed to `into`

obj.into[Target](enable(DefaultValues, Beans, OptionalDefaultsToNone))
  .transform

As this is a breaking change, for least surprise I would keep .disableDefaultValues in the DSL, but have it abort compilation with a message informing that it is no longer needed and that other usages should be adjusted accordingly.

We could also think about a separate configuration object that could be passed implicitly to .transformInto and .into, which would allow pre-configuring the library in the scope of a whole module or project, in case someone wants the extras on by default without boilerplate.
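
A sketch of that scoped-configuration idea (TransformerConfig and its enable method are hypothetical, not an existing API):

implicit val defaults: TransformerConfig =
  TransformerConfig.enable(DefaultValuesLookup, TargetOptionalDefaultsToNone)

obj.transformInto[Target] // would pick up `defaults` from the implicit scope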

Improve error handling

Currently, for such a simple invalid usage of the library:

case class DoSthCommand(id: Int, what: String, requestor: String, doImmediately: Boolean)
case class DidSthEvent(id: Int, didAt: Long, what: String, requestor: String)
val command = DoSthCommand(10, "Prepare good coffee", "John", true)
val event = command.into[DidSthEvent].transform

the produced error is somewhat cryptic:

error: could not find implicit value for parameter transformer: io.scalaland.chimney.DerivedTransformer[DoSthCommand,DidSthEvent,shapeless.HNil]
       command.into[DidSthEvent].transform

I think we can do better when it comes to error analysis and provide a more detailed compile-time message saying that a value for the didAt field should be provided with one of the DSL modifiers (like withFieldConst).
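
For reference, a version that compiles, supplying the missing field explicitly through the DSL:

val event = command.into[DidSthEvent]
  .withFieldConst(_.didAt, System.currentTimeMillis())
  .transform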

Writing to Java beans

As we now support reading from Java beans, for completeness we could also implement support for writing to them.

Let's assume that a Java bean is a class that:

  • has a public, no-arg constructor
  • for each target private field foo of type T, has a method with the following signature: def setFoo(value: T): Unit

IMO it's OK to provide initial support that covers only the case when we have all the required data in the source object. In the scope of this ticket it's not necessary to make DSL operations like withFieldConst, withFieldComputed and withFieldRenamed work with Java bean targets.
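
An illustrative bean-style target and the desired derivation (all names here are hypothetical):

class PersonBean {
  private var name: String = _
  private var age: Int = 0
  def setName(value: String): Unit = this.name = value
  def setAge(value: Int): Unit = this.age = value
}

case class Person(name: String, age: Int)

// Desired: instantiate via the no-arg constructor, then call the matching setters
// Person("John", 30).transformInto[PersonBean]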

Feature request: Support pluggable, map-like models

My use case is mapping between case classes and Map-like structures such as Jackson's ObjectNode. We do a lot of this when serializing data to and from JSON. It would be awesome if Chimney supported mapping to Map-like models where fields on the Map side are referenced by their names as strings. Ex:

val personNode: ObjectNode = person.into[ObjectNode]
  .withField(_.name, "theName")
  .withField(_.age, "theAge")

And the reverse:

val person: Person = personNode.into[Person]
  .withField("theName", _.name)
  .withField("theAge", _.age)

Obviously, to make this work with external models that Chimney doesn't know about, it would need a simple SPI allowing external model support to be plugged into Chimney. Ex:

trait MapLikeSupport[T] {
  def create(): T

  def set(map: T, key: String, value: Any): T

  def get(map: T, key: String): Any
}
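
A minimal sketch of that SPI implemented for Jackson's ObjectNode (assuming jackson-databind on the classpath; values are simplified to strings):

import com.fasterxml.jackson.databind.node.{JsonNodeFactory, ObjectNode}

object ObjectNodeSupport extends MapLikeSupport[ObjectNode] {
  def create(): ObjectNode = JsonNodeFactory.instance.objectNode()
  def set(map: ObjectNode, key: String, value: Any): ObjectNode =
    map.put(key, value.toString) // a real implementation would dispatch on the value type
  def get(map: ObjectNode, key: String): Any = map.get(key)
}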

Making it better

Since the main use case for this is serializing/deserializing between JSON and case classes, it would be awesome if Chimney supported bi-directional mappings for the above, so a mapping could be defined once and used in both directions. Ex:

case class Person(name: String, age: String)
val mapper = mapperFor[Person, ObjectNode]
  .withField(_.name, "theName")
  .withField(_.age, "theAge")

val personNode = mapper.mapTo(person1) // serialize
val person2 = mapper.mapFrom(personNode) // deserialize

Patching from another instance of the same case class

Thank you for this great library! It's incredibly useful together with Diode+React.

Here's a suggestion which I think naturally falls within the domain of Chimney: being able to patch a case class instance from another instance, specifying the fields to patch. It can be seen as nicer syntax for .copy(...), specifying the fields to override from the other instance.

For example:

user.patchFrom(anotherUser, _.email, _.phone, _.address)

which would be equivalent to:

user.copy(email = anotherUser.email, phone = anotherUser.phone, address = anotherUser.address)

Thank you!

ADT failing to be transformed when using Enumeratum

Hi, I was halfway through reporting this issue when I found the (easy) solution, but I figured I would report it anyway since it might be useful for someone else experiencing the same problem.

Problem

I have an ADT defined in a dependency of our project, and I've been using Chimney to transform it into our own ADT so we don't have to share the same domain code with this dependency. Here's a simple example that reproduces my issue:

sealed trait Animal
object Animal {
  sealed trait Reptile extends Animal
  sealed trait Mammal extends Animal

  case object Cat extends Mammal
  case object Dog extends Mammal

  case object Lizard extends Reptile
  case object Turtle extends Reptile
 }

Basically, I define the same ADT in our project with a small difference to help the type inference system:

sealed trait Animal extends Product with Serializable

And I have another class that makes use of it in this way:

case class Foo(animals: Map[Animal, Map[Animal.Reptile, BigDecimal]])

The transformation works seamlessly by defining a custom Transformer[LibAnimal, Animal].

Now I need to use enumeratum for my local representation of Animal in order to construct valid values from a string representation via withNameInsensitive. So my new ADT is defined as follows:

import enumeratum._

sealed trait Animal extends EnumEntry
object Animal extends Enum[Animal] {
  val values = findValues

  sealed trait Reptile extends Animal
  sealed trait Mammal extends Animal

  case object Cat extends Mammal
  case object Dog extends Mammal

  case object Lizard extends Reptile
  case object Turtle extends Reptile
 }

But now Chimney fails to compile, giving the following error:

[info] Compiling 5 Scala sources to /workspace/xxx/target/scala-2.12/classes ...
[error] /workspace/xxx/Converter.scala:27:33: type mismatch;
[error]  found   : scala.collection.immutable.Map[remote.Animal,scala.collection.immutable.Map[local.Animal.Reptile with Product with Serializable,scala.math.BigDecimal]]
[error]  required: Map[remote.Animal,Map[local.Animal.Reptile,BigDecimal]]
[error]     remoteFoo.into[Foo].transform
[error]                                 ^
[error] one error found
[error] (service / Compile / compileIncremental) Compilation failed
[error] Total time: 5 s, completed Sep 19, 2018 7:15:31 PM

Solution

Define the ADT by mixing in Product with Serializable, like this:

sealed trait Animal extends EnumEntry with Product with Serializable

Simple example failure

Hello

scala> case class One(text: Option[String])
scala> case class Two(text: Option[String])
scala> val one = One(None)
scala> import io.scalaland.chimney.dsl._
scala> one.transformInto[Two]
<console> error: could not find implicit value for parameter derivedTransformer: io.scalaland.chimney.DerivedTransformer[One,Two,shapeless.HNil]
             one.transformInto[Two]
                              ^

Scala version: 2.12.3
Library version: 0.1.5
Any advice?

Bidirectional mapping

Sometimes, when you have two classes, it might be nice to be able to define a single bi-directional transform instead of having to write one for each direction:

// To send to your fancy no-sql store; fixed Schema
case class PersistedUser(name: String, address: String) 

// To send to your users
case class ApiUser(name: String, streetAddress: String)

// Declare a 2 way transformer
val biTransformer = BiTransform[PersistedUser, ApiUser]
  .withFieldRenamed(_.address, _.streetAddress)

val p = PersistedUser("Coookie", "Sesame Street")
val a = biTransformer.from(p)      // PersistedUser => ApiUser
val pAgain = biTransformer.from(a) // ApiUser => PersistedUser (direction picked by argument type)

Not sure if or how this could be done, but it's just a thought, inspired somewhat by play-json's Format, which acts as both an encoder and a decoder, coupled with their functional combinator syntax, which comes together to allow for this sort of thing:

// Bi-directional formatter with renaming.
val locationFormat: Format[Location] = (
      (JsPath \ "lat").format[Double] and
      (JsPath \ "long").format[Double]
    )(Location.apply, unlift(Location.unapply))
Actual Ammonite usage of play-json:
14:00 $ amm
Loading...
Welcome to the Ammonite Repl 1.0.5
(Scala 2.12.4 Java 1.8.0_152)
If you like Ammonite, please support our development at www.patreon.com/lihaoyi
@ import $ivy.{`com.typesafe.play::play-json:2.6.10`}
import $ivy.$

@ import play.api.libs.json._, play.api.libs.functional.syntax._
import play.api.libs.json._, play.api.libs.functional.syntax._

@ case class Location(latitude: Double, longitude: Double)
defined class Location

@ implicit val locationFormat: Format[Location] = (
      (JsPath \ "lat").format[Double] and
      (JsPath \ "long").format[Double]
    )(Location.apply, unlift(Location.unapply))
locationFormat: Format[Location] = play.api.libs.json.OFormat$$anon$1@519e67e

@ val l = Location(123d, 123d)
l: Location = Location(123.0, 123.0)

@ val j = Json.toJson(l)
j: JsValue = JsObject(Map("lat" -> JsNumber(123.0), "long" -> JsNumber(123.0)))

@ val lAgain = j.as[Location]
lAgain: Location = Location(123.0, 123.0)

It seems to me like doing something like that in Chimney would be "even better" because we don't need to mess around with a JsPath and things are fully typed.

Tests failed to compile out of the box

Test specs failed to compile out of the box, e.g.:

[error] .../chimney/chimney/src/test/scala/io/scalaland/chimney/PatcherSpec.scala:8:17: value should is not a member of String
[error]   "Patcher DSL" should {

Support for Scala Native

As Chimney is based only on compile-time reflection, there should be no blockers preventing support for Scala Native.

The task would be to try it out using sbt-crossproject and identify possible problems.

Support for Proto3

In protobuf 3, every message field is now optional. It would be nice if Chimney could make this work:

case class Foo(x: Option[String], y: Option[Int])
case class Bar(x: String, y: Int, z: Boolean = true)

Foo(Some("abc"), Some(10)).transformInto[Bar]

Paths instead of symbol literals in DSL

Currently in the DSL we have:

.withFieldConst('field, ...)
.withFieldComputed('field, ...)
.withFieldRenamed('field1, 'field2)

But this provides weak IDE support for code completion. It'd be better to have syntax like:

.withFieldConst(_.field, ...)
.withFieldComputed(_.field, ...)
.withFieldRenamed(_.field1, _.field2)

Support default values of case class parameters

Currently, having the following 2 definitions:

case class Foo1(x: Int, y: String)
case class Foo2(x: Int, y: String, z: Boolean = false)

Chimney will fail to derive the Foo1 -> Foo2 transformation unless you explicitly pass a value for the z parameter using modifiers like withFieldConst.

But in this particular case we already have a default value for z in its definition! The task would be to implement library support for that case.
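
The desired derivation once defaults are supported, sketched as a comment:

Foo1(1, "abc").transformInto[Foo2] // would yield Foo2(1, "abc", z = false)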

Support reading from Java bean sources

Having a Java bean-like class:

class Foo(private var id: Int, private var name: String) {
  def getId: Int = id
  def getName: String = name
}

we would like to be able to transform it to a case class:

case class Bar(id: Int, name: String)

We cannot simply access the fields, because they're private, but we can add support for interpreting getters named getXyz as sources for xyz fields.
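
The desired derivation, sketched:

val bar = new Foo(1, "chimney").transformInto[Bar] // Bar(1, "chimney")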

Option to require explicit field discarding

I would very much appreciate it if usage of default values could be turned off.

At my current project we use default values quite liberally. I use Chimney to transform between semi-normalized model objects and huge, fat DTOs for the frontend to consume.

Combined with the fact that Chimney implicitly drops extra fields, adding new fields tends not to be typo-safe:

  case class DTO(media: List[String] = List())
  case class Model(medias: List[String] = List())

  DTO(List("1", "2", "3")).transformInto[Model] // Gives Model(Nil), silently losing data at runtime

Furthermore, I get that behavior if I add a field with a default to the model but forget to add it to the DTO. Since the frontend might take a week to catch up, it's quite a pain to figure out after doing a couple of other things 😅

If possible, I would like to get a compile-time error in these cases.

Another possible alternative is the option to disable default values. Both would be nice to have, actually

Better Java standard libraries interop

As we now support Java beans, it might be useful to support automatic transformations between:

  • Scala Option and Java Optional
  • Scala collections and Java collections
  • Scala maps and Java maps
  • Java enums and Scala sealed families of case objects

anything else?
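
For instance, the Option-to-Optional direction could be a hand-written transformer along these lines (a sketch assuming Chimney's Transformer type, which is a SAM):

import java.util.Optional
import io.scalaland.chimney.Transformer

implicit def optionToOptional[A]: Transformer[Option[A], Optional[A]] =
  opt => opt.fold(Optional.empty[A]())(Optional.of(_))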

Field renamings tied to type instead of instance

Firstly, thanks for writing this awesome lib; I think it covers ~95% of the use cases where we'd otherwise write things manually :)

One thing that might be useful is being able to define field renamings that aren't tied to a specific instance of a type. I think we can currently do renamings via .into from the DSL, powered by TransformerInto, but it seems to be tied to a specific source: From.

final class TransformerInto[From, To, C <: Cfg](
    val source: From,
    val overrides: Map[String, Any],
    val instances: Map[(String, String), Any]) {

In my mind, it might be nice to have something like:

import io.scalaland.chimney.dsl._

case class A(firstName: String, lastName: String, age: Int)
case class B(nameFirst: String, nameLast: String, age: Int)

// Fail compilation if not all required re-mappings are complete
// Note that age does not need to be re-mapped, as is the case with existing `.into`
implicit val aToBTransformer: Transformer[A, B] = TransformerInto
  .derive[A, B]
  .withFieldRenamed(_.firstName, _.nameFirst)
  .withFieldRenamed(_.lastName, _.nameLast)
  .toTransformer

val a = A(firstName = "joe", lastName = "Blow", age = 10)
val b = a.transformInto[B]

My current work-around is to define a method that uses the DSL, but something at the type level might be nice so it's (1) more easily reusable, and (2) picked up when deriving other Transformers that nest A and B (e.g. Transformer[ContainsA, ContainsB]):

def aToBTransform(a: A): B = {
  a.into[B]
    .withFieldRenamed(_.firstName, _.nameFirst)
    .withFieldRenamed(_.lastName, _.nameLast)
    .transform
}

Rename w/ transform?

I was thinking it might be nice to have a version of withFieldRenamed that renames and also applies an in-scope Transformer, so you can rename and transform at the same time; currently you need to use withFieldComputed.

e.g.

case class InternalTicket(createdAt: DateTime)

case class ExternalTicket(createdAtMillis: Long)

implicit val jodaDateTimeToLong: Transformer[DateTime, Long] = new Transformer[DateTime, Long] {
  def transform(dt: DateTime): Long = dt.toInstant.getMillis
}

val externalTicket = InternalTicket(DateTime.now).into[ExternalTicket]
  .withFieldRenamed(_.createdAt, _.createdAtMillis)
  .transform

Currently, it seems like we need to do

val externalTicket = InternalTicket(DateTime.now).into[ExternalTicket]
  .withFieldComputed(_.createdAtMillis, _.createdAt.toInstant.getMillis)
  .transform

which is marginally longer for this small example, but can get repetitive if there are more fields.

I also tried with an in-scope implicit conversion, but it doesn't get picked up.

As with my previous issue, it might be that this is already possible, but it doesn't seem like it 😅

Add default transformation from T to Option[T]

import io.scalaland.chimney.Transformer

trait ChimneyUtils {

  implicit def someByDefault[T]: Transformer[T, Option[T]] = (x: T) => Some(x)

  implicit def getByDefault[T]: Transformer[Option[T], T] = (x: Option[T]) => x.get

}

I wouldn't enable getByDefault by default, but the other one is rather safe.

Nested patchers?

For the time being we don't seem to support nested patchers. It would be nice to provide a useful example and decide whether we want to add support for it (as an additional rule that supports nested patching out of the box, or as a completely separate type class).
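
An illustrative nested case (types here are hypothetical):

case class Address(street: String, city: String)
case class User(name: String, address: Address)

case class AddressPatch(street: Option[String], city: Option[String])
case class UserPatch(name: Option[String], address: Option[AddressPatch])

// Desired: user.patchWith(userPatch) would recurse into address,
// patching its fields rather than replacing the whole object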
