skeuomorph's People

Contributors

47degfreestyle, 47erbot, alejandrohdezma, angoglez, antoniomateogomez, benfradet, benivf, bilki, calvellido, cb372, daenyth, dmarticus, dmartin-ta, dzanot, fedefernandez, franciscodr, isabelduran, israelperezglez, jesusmtnez, jlofgren, juanpedromoreno, l-lavigne, noelmarkham, oli-kitty, pepegar, rlmark, scala-steward, sloshy, thatscalaguy, vil1


skeuomorph's Issues

Rename Schemas to Types

I think it would make more sense to rename the schema.scala files to types.scala, and to rename the type ADTs accordingly, e.g. from AvroF to AvroType.

Improve Option support for Proto2

Source code generated by Mu creates case classes whose member variables have the base type, not wrapped in an Option. ScalaPB code generation (https://scalapb.github.io/generated-code.html) creates case classes where each proto2 optional field is wrapped in an Option, with a default value of None. Additionally, the default value for each repeated field is Nil.

Doing this makes instantiating case classes a better experience when using proto2, as you don't have to specify every field.
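For illustration, the ScalaPB-style output for a proto2 message with two optional fields and one repeated field could look roughly like this (Person and its fields are made up for the example):

  final case class Person(
      name: Option[String] = None, // proto2 optional -> Option, defaulting to None
      age: Option[Int] = None,
      emails: Seq[String] = Nil    // proto2 repeated -> defaulting to Nil
  )

  // Only the fields you care about need to be set explicitly:
  val ada = Person(name = Some("Ada"))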

finish move to higherkindness

We need to move the following:

  • ( #46 ) move the group ID of the modules to higherkindness (We probably need a different set of keys for this)
  • ( #26 ) move the skeuomorph.freestyle package to skeuomorph.mu

Add a new combinator to escape Scala reserved words

Given a valid schema in any format (mu, protobuf, avro, openapi...), when printing the Scala code, reserved words are not correctly escaped (a common one is type). Off the top of my head, the affected places include:

  • import paths
  • def, val identifiers
  • names in argument lists
  • others

A possible solution could be to add a new printer combinator that escapes those reserved words, and to use it instead of printing those identifiers raw.
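As a rough illustration, assuming a minimal Printer[A] wrapper around A => String (which is not necessarily Skeuomorph's actual printer API), the combinator could look like this:

  final case class Printer[A](print: A => String)

  // Scala reserved words that commonly clash with identifiers coming from schemas
  val scalaReservedWords: Set[String] =
    Set("type", "class", "object", "trait", "def", "val", "var", "import", "package", "new", "match")

  // Wrap an identifier in backticks when it collides with a reserved word,
  // instead of printing it raw
  def escapeReserved(printer: Printer[String]): Printer[String] =
    Printer { id =>
      val printed = printer.print(id)
      if (scalaReservedWords(printed)) s"`$printed`" else printed
    }

The same combinator could then be reused wherever identifiers are printed: import paths, def/val names, and argument lists.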

Static analysis on schemas

We should be able to perform static analysis on schemas. The idea is that, when a schema is uploaded to Catamorph, we should be able to warn the user if they're doing something that we know is wrong, or hint at a different approach.

For this issue to be done, we need to:

  • Create all the plumbing needed to create rules (see the sketch after this list)
  • Define and implement rules
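A very rough sketch of what the rule plumbing could look like; Rule, Diagnostic and Severity are hypothetical names, not existing Skeuomorph types:

  sealed trait Severity
  case object Hint    extends Severity
  case object Warning extends Severity

  final case class Diagnostic(severity: Severity, message: String)

  // A rule inspects a single layer of a schema AST (for any pattern functor F) and may
  // emit diagnostics; running it over a whole schema would be a fold that collects them.
  trait Rule[F[_]] {
    def name: String
    def check[A](layer: F[A]): List[Diagnostic]
  }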

Protobuf oneOf to generate coproducts without Option-ality

In proto3 there is no required constraint available.

As a consequence this

oneof value {
  A a = 1;
  B b = 2;
}

generates

Option[A] :+: Option[B] :+: CNil

I propose removing the Option-ality and generating the shapeless coproduct A :+: B :+: CNil instead.
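For reference, this is how the proposed representation looks with shapeless; A and B are placeholder message types:

  import shapeless.{:+:, CNil, Coproduct}

  final case class A(x: Int)
  final case class B(y: String)

  // The proposed generated type: no Option wrapper per branch
  type Value = A :+: B :+: CNil

  // A set oneof field is simply one branch of the coproduct
  val v: Value = Coproduct[Value](A(1))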

Rationale:

  1. The Option-ality seems redundant in the coproduct for evident reasons: if A is present in the coproduct then it is not optional.
  2. It generates more complex service APIs than necessary.
  3. It slows down runtime marshalling.

Given that skeuomorph is a schema transformation library, the user should have more flexibility at the time of use from Mu (i.e., at service API design time) and not be bound by the semantics of the source IDL (context: #91).

Resources

  1. https://stackoverflow.com/questions/42955621/require-a-oneof-in-protobuf
  2. https://news.ycombinator.com/item?id=18188519

Add documentation

We would like to have more descriptive documentation for Skeuomorph. The purpose of this issue is to enhance the current documentation, to let newcomers understand what this library does, and how to take advantage of the features.

Support for compression in avro/proto

gRPC operations support Gzip compression. We need to support that in the printer generation.

The @service annotation has a second param for specifying the compression, so we could potentially add a new field to the Operation type and use it for the compression param.
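For illustration, assuming mu's Gzip value and the @service annotation's second parameter, the printed service could end up looking like this; treat it as a sketch of the desired output, not the printers' current behavior:

  import higherkindness.mu.rpc.protocol._

  final case class HelloRequest(name: String)
  final case class HelloResponse(message: String)

  // The printer would emit Gzip as the second @service argument when compression is enabled
  @service(Protobuf, Gzip)
  trait HelloService[F[_]] {
    def sayHello(req: HelloRequest): F[HelloResponse]
  }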

Unified AST

introducing UAST

disclaimer: UAST term is borrowed from https://doc.bblf.sh/uast/uast-specification.html


Currently we declare different ASTs for different protocols. We
have AvroF for Avro, ProtobufF for Protocol Buffers, MuF for
describing Mu services...

the problem

However, we are repeating ourselves a lot. See the implementation of
AvroF:

  ...
  final case class TNull[A]()                  extends AvroF[A]
  final case class TBoolean[A]()               extends AvroF[A]
  final case class TInt[A]()                   extends AvroF[A]
  final case class TLong[A]()                  extends AvroF[A]
  final case class TFloat[A]()                 extends AvroF[A]
  final case class TDouble[A]()                extends AvroF[A]
  final case class TBytes[A]()                 extends AvroF[A]
  final case class TString[A]()                extends AvroF[A]
  final case class TNamedType[A](name: String) extends AvroF[A]
  final case class TArray[A](item: A)          extends AvroF[A]
  final case class TMap[A](values: A)          extends AvroF[A]
  ...

And the implementation of MuF:

  ...
  final case class TNull[A]()                                        extends MuF[A]
  final case class TDouble[A]()                                      extends MuF[A]
  final case class TFloat[A]()                                       extends MuF[A]
  final case class TInt[A]()                                         extends MuF[A]
  final case class TLong[A]()                                        extends MuF[A]
  final case class TBoolean[A]()                                     extends MuF[A]
  final case class TString[A]()                                      extends MuF[A]
  final case class TByteArray[A]()                                   extends MuF[A]
  final case class TNamedType[A](name: String)                       extends MuF[A]
  final case class TOption[A](value: A)                              extends MuF[A]
  final case class TEither[A](left: A, right: A)                     extends MuF[A]
  final case class TList[A](value: A)                                extends MuF[A]
  final case class TMap[A](value: A)                                 extends MuF[A]
  ...

We repeat basically all the leaves of the AST, because the different IDLs can represent more or less the same types, with minor differences.

Also, having all those different ASTs means that we cannot write functions that work on a generic AST; we have to specialize them for each one.

The solution

I would prefer using a single set of classes representing the different types possible in any schema, and then have each specific schema pick and choose from them to assemble its ADT, as follows.

First we would need to declare all our base leaves:

  final case class TNull[A]()
  final case class TBoolean[A]()
  final case class TInt[A]()
  final case class TLong[A]()
  final case class TFloat[A]()
  final case class TDouble[A]()
  final case class TBytes[A]()
  final case class TString[A]()
  final case class TNamedType[A](name: String)
  final case class TArray[A](item: A)          // sugar over Generic(NamedType("Array"), A)
  final case class TMap[A](keys: A, values: A) // sugar over Generic(NamedType("Map"), A, A)
  final case class TFixed[A](name: String, size: Int)
  final case class TOption[A](value: A)          // sugar over Generic(NamedType("Option"), A)
  final case class TEither[A](left: A, right: A) // sugar over Generic(NamedType("Either"), A, A)
  final case class TList[A](value: A)            // sugar over Generic(NamedType("List"), A)
  final case class TGeneric[A](generic: A, params: List[A])
  final case class TRecord[A](name: String, fields: List[Field[A]])
  final case class TEnum[A](name: String, symbols: List[String])
  final case class TUnion[A](options: NonEmptyList[A])

N.B: these classes do not extend anything, they're plain old case
classes.

Then, to combine those case classes into an algebraic data type, we can use coproducts. Coproducts and their motivations are well explained in this paper.

Higher-kinded coproducts are implemented in several libraries in the Scala FP world, such as Scalaz, Cats, and Iota. In order to do some dogfooding, we'll be using Iota's implementation.

Then, we would assemble our types using coproducts, as follows:

import iota._
import iota.TListK.:::

type MuType[A] = CopK[
  TNull :::
  TDouble :::
  TFloat :::
  TInt :::
  TLong :::
  TBoolean :::
  TString :::
  TBytes :::
  TNamedType :::
  TOption :::
  TEither :::
  TList :::
  TGeneric :::
  TArray :::
  TMap :::
  TRecord :::
  TEnum :::
  TUnion :::
  TNilK,
  A]


type AvroType[A] = CopK[
  TNull :::
  TBoolean :::
  TInt :::
  TLong :::
  TFloat :::
  TDouble :::
  TBytes :::
  TString :::
  TNamedType :::
  TArray :::
  TMap :::
  TRecord :::
  TEnum :::
  TUnion :::
  TFixed :::
  TNilK,
  A]

See how, in this case, each AST declaration picks only the types it needs.

Usage

The only missing part of this idea is how to use it. One of the
features motivating this change is code reuse, and generic
programming. In the following code block you can see how some generic
transformations on these ASTs could be implemented.

  def desugarList[F[α] <: CopK[_, α], A](
      implicit
      L: CopK.Inject[TList, F],
      G: CopK.Inject[TGeneric, F],
      N: CopK.Inject[TNamedType, F],
      A: Embed[F, A]
  ): Trans[F, F, A] =
    Trans {
      case L(TList(t)) => generic[F, A](namedType[F, A]("List").embed, List(t))
      case x           => x
    }

  def desugarOption[F[α] <: CopK[_, α], A](
      implicit
      O: CopK.Inject[TOption, F],
      G: CopK.Inject[TGeneric, F],
      N: CopK.Inject[TNamedType, F],
      A: Embed[F, A]
  ): Trans[F, F, A] =
    Trans {
      case O(TOption(a)) => generic[F, A](namedType[F, A]("Option").embed, List(a))
      case x             => x
    }

  def desugarEither[F[α] <: CopK[_, α], A](
      implicit
      E: CopK.Inject[TEither, F],
      G: CopK.Inject[TGeneric, F],
      N: CopK.Inject[TNamedType, F],
      A: Embed[F, A]
  ): Trans[F, F, A] =
    Trans {
      case E(TEither(a, b)) => generic[F, A](namedType[F, A]("Either").embed, List(a, b))
      case x                => x
    }

As you can see, these functions do not work on a specific AST.
Instead, they are generic, meaning that they will work on any AST for
which some injections exist.

work to be done

We need to implement Traverse derivation for Iota's coproducts. Traverse can be derived mechanically, and I think the fastest way would be a generic-like derivation à la shapeless.

resources

Some useful resources for understanding this approach are this talk and this paper.

Open-API Schemas: Client Generation

After recent work by @BeniVF, Skeuomorph is able to parse an API specification in the OpenAPI 3.0 format (a.k.a. Swagger) and to generate an http4s client library for it. However, HTTP being a somewhat complicated protocol, not all features are supported from the beginning.

The goal of this umbrella issue is to keep track of the features that are still pending support in Skeuomorph's OpenAPI module.

  • Support for query parameters: elements in query parameters should be included as input parameters of the client method (see the sketch after this list).
  • Support for getting a URI from the Location header of a 201 Created response, usually for a POST or PUT request that has created a new entity.
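A rough sketch of the kind of client method that query-parameter support could generate; PetStoreClient, Pet, and the decoder are illustrative, not Skeuomorph's current output:

  import org.http4s.{EntityDecoder, Method, Request, Uri}
  import org.http4s.client.Client

  final case class Pet(id: Long, name: String)

  // Hypothetical generated client: `limit` comes from a query parameter declared in the spec
  class PetStoreClient[F[_]](client: Client[F], baseUri: Uri)(
      implicit petsDecoder: EntityDecoder[F, List[Pet]]) {

    def listPets(limit: Option[Int]): F[List[Pet]] = {
      val base = baseUri / "pets"
      val uri  = limit.fold(base)(l => base.withQueryParam("limit", l))
      client.expect[List[Pet]](Request[F](method = Method.GET, uri = uri))
    }
  }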

The proto and avro printers need to receive a new param for setting the idiomatic gRPC to true

Relates to higherkindness/mu-scala#599

We need a way to indicate to the printers that the generated @service annotation should be populated with the namespace and methodNameStyle params.

Ideally, it should be a flag called something like useIdiomaticRPC that, when true, sets the namespace to Some($package) and methodNameStyle to Capitalize. When false, those params can be omitted.
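For illustration, with the flag enabled the emitted annotation could look like the following; the flag name, the package, and the request/response types are placeholders:

  import higherkindness.mu.rpc.protocol._

  final case class SayHelloRequest(name: String)
  final case class SayHelloResponse(message: String)

  // useIdiomaticRPC = true: namespace set to the proto package, methodNameStyle to Capitalize
  @service(Protobuf, namespace = Some("com.acme"), methodNameStyle = Capitalize)
  trait HelloWorldService[F[_]] {
    def sayHello(req: SayHelloRequest): F[SayHelloResponse]
  }

With the flag disabled, the printer would simply emit @service(Protobuf) and omit both params.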

After a talk with @juanpedromoreno, we thought the printer could receive some kind of metadata object with this flag and potentially others.

Improve package and directory structure

Our package structure does not match our directory structure, for example in the file src/main/scala/mu/print.scala:

package skeuomorph
package mu

This causes various inconveniences such as ScalaTest/Specs2 integration in IntelliJ IDEA being unable to identify and run tests together from the src/test root.

We also use chained package declarations, which is somewhat non-standard, and it might be worth revisiting that choice.

A re-arrangement of packages would also be a good time to change the package roots once we've agreed internally on a naming convention for higherkindness, as part of #18.

Similar suggestion for mu: higherkindness/mu-scala#480.

Add more tests

  • check laws of Functor/Traverse instances
  • test schema transformations
  • test pretty printing

Fix protobuf enum representation in generated Scala.

This issue is related to Mu issue higherkindness/mu-scala#611.

Given this protobuf enum:

enum Test {
  LATER = 0;
  HELLO = 1;
  GOODBYE = 2;
  HI = 5;
}

This scala code is currently generated:

sealed trait Test
object Test {
  case object LATER extends Test
  case object HELLO extends Test
  case object GOODBYE extends Test
  case object HI extends Test
}

Not only does this lose the integral value assigned to each enum member, but it does not work correctly with PBDirect - an empty field is sent over the wire and deserialization produces the wrong value.

For PBDirect, the enum needs to be encoded like this:

sealed trait Test extends Pos
object Test {
  case object LATER extends Test with Pos._0
  case object HELLO extends Test with Pos._1
  case object GOODBYE extends Test with Pos._2
  case object HI extends Test with Pos._5
}

One issue involved here is that, in mu/Transform.scala, both protobuf and avro enums are mapped to the MuF TSum, where the integral value associated with the enum members is lost.

Since avro enums don't have the concept of assigned integral values, this makes them not isomorphic with the protobuf enums. So, one way forward would be to create a new MuF instance named something like TSumIntegral or TSumValued that preserves the integer associated with the protobuf enums.
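For illustration, the new constructor could have a shape like the following; TSumValued is hypothetical, and the MuF trait below is only a stand-in so the snippet is self-contained:

  sealed trait MuF[A] // stand-in for the real skeuomorph MuF

  // Keeps the (name, integral value) pairs carried by protobuf enums, so the printer
  // can emit the `extends Test with Pos._5`-style encoding that PBDirect needs
  final case class TSumValued[A](name: String, fields: List[(String, Int)]) extends MuF[A]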

This would be a fairly simple change, but it does mean that the code generated by skeuomorph for the protobuf enums would be PBDirect specific. (Pos, Pos._1, etc. are defined in PBDirect.) This would mean that the MuF representation would not be completely decoupled from protobuf.

I would love to hear if someone has some better ideas.

Note: If we do add the extends Pos etc., import pbdirect.Pos will need to be added to the top of the generated source file. This is done by Mu in ProtoSrcGenerator.scala. The ProtoSrcGenTests would also need to be updated.
The skeuomorph README also contains an example enum and generated code.

Domain model / data schema in Scala, Protobuf and Avro

I'm looking to develop a combination of gRPC services (with mu-RPC) and REST endpoints which share a common domain model.

I would like to reduce the boilerplate of maintaining multiple versions of the model, and I'm wondering if skeuomorph can help. This provides more context for the motivation: https://medium.com/@sugarpirate/exploring-the-strongly-typed-graph-31fc27512326.

Do you see a problem in principle with attempting this?

Is skeuomorph mature enough to make the following possible?

  1. maintaining the source-of-truth model in protobuf
  2. generating Scala domain classes and @message classes (through a few custom morphisms)
  3. generating an Avro version of the same model from 1.

Or some combination of the above.

Proto2 primitives should be Option[primitive]

syntax = "proto2";

package src.main.hello;

message SayHelloRequest {
  optional string name = 1;
}
message SayHelloResponse {
  optional string message = 1;
}

service HelloWorldService {
  rpc SayHello (SayHelloRequest) returns (SayHelloResponse) {}
}

This should generate case classes with fields of type Option[String] instead of String. There is a semantic difference between what skeuomorph produces and the Proto2 spec. This means that you can't send the "not-set" string/int/bool value with skeuomorph without explicitly passing null (which is awful). This is problematic when dealing with APIs that expect a null value to be distinguishable from the default value.

See #89 and #92 which fix this for non-primitive types.

multimaster setup

Since the beginning of the UAST development, some features have been introduced to master. Since porting them all to UAST in a single PR is a big endeavor, we will use a multi-master setup with two branches:

series/0.0

We will rename the current master to series/0.0. In this branch we will try not to introduce new functionality, apart from fixes to code being used by other applications.

series/0.1

Once we all agree on the uast branch, we will rename it to series/0.1. All new functionality should go to this branch.

Stop merging dependent messages from imports

Currently, when we compile the proto file passed as an argument, the dependent messages from its imports are merged into the same companion object.

More details of the issue:

If we have author.proto:

package com.acme;

message Author {
    string name = 1;
    string nick = 2;
}

and a book.proto:

package com.acme;

import "author.proto";

message Book {
    int64 isbn = 1;
    string title = 2;
    repeated Author author = 3;
}

If we try to produce the Scala code for book.proto:

What we have

We are generating code like this:

package com.acme

object book {
  @message final case class Author(name: String, nick: String)
  @message final case class Book(isbn: Long, title: String, author: List[Author])
}

What we want to have

We want to generate code like this:

package com.acme

import com.acme.author.Author

object book {
  @message final case class Book(isbn: Long, title: String, author: List[Author])
}

Think of a better representation of Protocols AST

Currently we are putting a lot of effort into the Types AST, but we're not considering the Protocols AST. Some things that we might want to consider:

  1. It should work for both sync and async IDLs (think AsyncAPI/Pact).
  2. It should work for RPC-style IDLs and HTTP IDLs.

@BeniVF ping

Fix README

The following items are not properly set in the README.md

  • Badges
  • skeuomorph version

Parse proto files

It would be great to have a way to parse protocol buffers in
Skeuomorph!

This task would involve several smaller tasks to be completed (we can
address each one in a separate PR if we like):

  • compile the proto file into its binary format
  • get a FileDescriptorProto out of it
  • convert all message declarations into ProtobufF declarations.
  • create a protocol describing a Protobuf file.
  • create transformation between Protobuf's protocol and Mu's
    protocol.

Compile proto file

To do this we can use protoc-jar as a library.

Convert proto bytes into a FileDescriptorProto

FileDescriptorProto comes from the protobuf-java library, the official SDK for protobuf in Java. The idea would be to use FileDescriptorProto.parseFrom to convert the bytes generated by protoc compilation into something with more semantics, like a FileDescriptorProto.
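A minimal sketch of that step, assuming the descriptor has been written to desc.bin with protoc --descriptor_set_out (paths and the use of Scala 2.13's converters are illustrative):

  import java.nio.file.{Files, Paths}
  import com.google.protobuf.DescriptorProtos.{FileDescriptorProto, FileDescriptorSet}
  import scala.jdk.CollectionConverters._

  val bytes: Array[Byte] = Files.readAllBytes(Paths.get("desc.bin"))

  // protoc emits a FileDescriptorSet wrapping one FileDescriptorProto per compiled file
  val descriptorSet: FileDescriptorSet = FileDescriptorSet.parseFrom(bytes)
  val files: List[FileDescriptorProto] = descriptorSet.getFileList.asScala.toList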

Convert message declarations to ProtobufF

This is done with a Droste Coalgebra, and the process would be similar to the one we are doing for Avro here.

Create a skeuomorph.protobuf.Protocol

This should be similar to Avro's.

Conversion Protobuf -> Mu

This should be similar to Mu's fromAvroProtocol

Thanks @AdrianRaFo for https://github.com/AdrianRaFo/proton, from which I pulled the inspiration to create this issue a la skeuomorph

Fix proto imports cannot see other proto folders

@SemanticBeeng discovered an issue with the Protobuf imports: if we have two folders with proto files, even in the same module, we are not able to import from one folder to the other (you can see the diff here).

The real problem we are facing here is that we are missing the -I argument in the Protobuf compilation, which should allow us to add the other folder to the compilation, AFAIK.
