Collections support #758
Conversation
…raversable encoding to core module
@mentegy this is great! Let us know if there's something we can help with.
@fwbrasil Yes, I saw it, but it seems to be inactive now. Also, I have a slightly different design idea, as you can see from my latest commit. Please take a look: I have created a separate module for cassandra macros. It already supports primitive types (using boxing methods from Predef).
@mentegy cool, I'll take a look this evening
I started that PR long ago and did not have time to make much progress so it's nice that @mentegy will implement Cassandra collections support. |
* Base trait for encoding sql arrays via the async driver.
* We say `array` only in the scope of the sql driver. In Quill we represent them as instances of Traversable.
*/
trait ArrayAsyncEncoding extends ArrayEncoding {
Wdyt about having separate traits for encoders and decoders? It'd be more similar to the rest of the encoding classes.
Makes sense, will do.
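For illustration, a minimal sketch of what that split might look like; the trait and method names below are assumptions, not the actual Quill code:

```scala
// Sketch only: a possible encoder/decoder split for array support.
trait ArrayEncoding {
  // Placeholder type members standing in for the context's real Encoder/Decoder.
  type Encoder[T]
  type Decoder[T]
}

// One trait contributes only the implicit array encoders...
trait ArrayEncoders { this: ArrayEncoding =>
  implicit def arrayIntEncoder[Col <: Seq[Int]]: Encoder[Col]
  implicit def arrayStringEncoder[Col <: Seq[String]]: Encoder[Col]
}

// ...while the decoders live in their own trait, mirroring the other encoding classes.
trait ArrayDecoders { this: ArrayEncoding =>
  implicit def arrayIntDecoder[Col <: Seq[Int]]: Decoder[Col]
  implicit def arrayStringDecoder[Col <: Seq[String]]: Decoder[Col]
}
```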
import scala.reflect.macros.blackbox.{ Context => MacroContext }

trait CollectionsEncodingMacro {
Could you share some context on why you need these macros for cassandra? It's not clear to me
At first glance, macros looked easier to implement because of the following:
For Cassandra encoders/decoders we need some kind of mapper:
- boxing/unboxing primitives (this needs to be done manually)
- identity function for supported types
- mapped encoding
For list and set we could do something like the sql arrays - a collection encoder for each base type. But what about map?
One alternative is to introduce a type class which marks supported types and to put an instance for each into implicit scope, plus further encoders for AnyVals (oh, I completely forgot about AnyVals) and primitives, and another one for mapped encodings. That looked like a lot of boilerplate to me at first, which is why I went with macros.
Having such a "CassandraType[T]" would let us create a generic encoder using cassandra row.set[T]. We could also add a mechanism to define a custom CassandraType[T] in order to create cassandra codecs for it.
Wdyt? I can play around with it
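A minimal sketch of that type-class idea, assuming the Datastax Java driver 3.x get/set-by-class accessors; the names here (CassandraType, GenericCassandraEncoding) are hypothetical, not the final Quill API:

```scala
import com.datastax.driver.core.{ BoundStatement, Row }
import scala.reflect.ClassTag

// Sketch only: CassandraType[T] witnesses that the Cassandra codecs can handle T
// directly, so one generic encoder/decoder can serve every supported type
// instead of a hand-written pair per type.
trait CassandraType[T] {
  def cls: Class[T]
}

object CassandraType {
  def apply[T](implicit ct: ClassTag[T]): CassandraType[T] =
    new CassandraType[T] { val cls = ct.runtimeClass.asInstanceOf[Class[T]] }

  // Supported types are listed once; primitives go through their boxed forms.
  implicit val stringType: CassandraType[String]              = apply[String]
  implicit val boxedIntType: CassandraType[java.lang.Integer] = apply[java.lang.Integer]
}

object GenericCassandraEncoding {
  // Relies on the driver's generic get/set-by-class accessors.
  def decode[T](row: Row, index: Int)(implicit t: CassandraType[T]): T =
    row.get(index, t.cls)

  def encode[T](stmt: BoundStatement, index: Int, value: T)(implicit t: CassandraType[T]): BoundStatement =
    stmt.set(index, value, t.cls)
}
```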
MappedType[T, Cas] is designed to map any scala type T to an already supported type Cas via the cassandra codecs. type CassandraType[T] = MappedType[T, T] signals that the type T is already supported, so no transformation is needed. Both of these types are needed for collection support. Later on we can reuse this for UDTs, for example.
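For reference, a minimal self-contained sketch of these two shapes as described here; the exact signatures in quill-cassandra may differ:

```scala
// Shapes assumed from the discussion above, not copied from the final API.
case class MappedType[T, Cas](encode: T => Cas, decode: Cas => T)

object MappedType {
  // A type the codecs already support maps to itself, with no transformation.
  type CassandraType[T] = MappedType[T, T]

  def cassandraType[T]: CassandraType[T] = MappedType(identity, identity)
}

object MappedTypeExample {
  import MappedType._

  // Hypothetical example: a wrapper type stored as the UUID the codecs understand.
  final case class UserId(value: java.util.UUID)

  val userIdType: MappedType[UserId, java.util.UUID] =
    MappedType(_.value, UserId(_))

  implicit val uuidType: CassandraType[java.util.UUID] = cassandraType
}
```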
}
/* implicit def mappedEncodingForMappedType[I, O, Cas] is not working!!!
"Mapped encoding for mapped type" in {
This issue blocks me from completing cassandra collection support.
m2: MappedEncoding[O, I],
mappedType: MappedType[O, Cas]
): MappedType[I, Cas] = MappedType(m1.f.andThen(mappedType.encode), mappedType.decode.andThen(m2.f))
This implicit is not working, for reasons unknown to me. Maybe I'm missing something?
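For reference, a self-contained sketch of that full definition, reconstructed from the partial diff above; the surrounding case-class shapes are assumptions based on this thread:

```scala
// Sketch only: surrounding definitions are assumed, the implicit itself
// mirrors the snippet quoted above.
case class MappedEncoding[I, O](f: I => O)
case class MappedType[T, Cas](encode: T => Cas, decode: Cas => T)

object MappedTypeDerivation {
  implicit def mappedEncodingForMappedType[I, O, Cas](
    implicit
    m1:         MappedEncoding[I, O],
    m2:         MappedEncoding[O, I],
    mappedType: MappedType[O, Cas]
  ): MappedType[I, Cas] =
    MappedType(m1.f.andThen(mappedType.encode), mappedType.decode.andThen(m2.f))
}
```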
…eq (since it is more convenient)
- add tests in sql base module regarding arrays encoding
- add decoderUnsafe to MirrorDecoder to reduce ClassTags (removed unnecessary classtags from cassandra collection decoders)
@fwbrasil I think it's complete. Please review.
Seems like tut doesn't like my example with LocalDate in the readme. I also found that #751 has the same problem.
@fwbrasil tut still cannot compile; it complains that the mirror sql context does not have arrays encoding. Should I introduce arrays encoding for the sql mirror context (which means the mirror will encode/decode arrays for all idioms), or just make tut not compile that line?
@mentegy I think it'd be nice if the mirror source also supports arrays, but I don't think it's a blocker. We can create a ticket and work on it later.
@fwbrasil I have removed
It's great that we don't need macros for cassandra! Thank you for the contribution, it's an important feature.
@fwbrasil You're welcome!
@getquill/maintainers let me know if you have objections, since this is a relatively large change. I'm planning to merge it tomorrow.
Looks awesome 👍
Fixes #706, #730, #309, and probably more.
This PR brings collections support for row elements (SQL Arrays, Cassandra Collections).
SQL
Cassandra
- CassandraType[T] - a marker which signals to quill that type T is already supported by the Cassandra codecs, so no additional mapping is needed
- CassandraMapper[I, O] - an additional mapper to/from CassandraType[T]
- CassandraMapper from MappedEncoding and CassandraType
We can use CassandraType[T] for custom codecs and reuse it later on for UDTs, for example.
Checklist
- Update README.md if applicable
- Add [WIP] to the pull request title if it's work in progress
- Run sbt scalariformFormat test:scalariformFormat to make sure that the source files are formatted
@getquill/maintainers
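For illustration, a rough self-contained sketch of how the pieces listed above could fit together; the signatures are assumptions based on this thread rather than the final quill-cassandra API:

```scala
// Sketch only: shapes assumed from the PR description, not the final API.
case class MappedEncoding[I, O](f: I => O)

// Witnesses that Cas is handled directly by the Cassandra codecs.
case class CassandraType[Cas]()

// Maps a user type I to/from a codec-supported type O.
case class CassandraMapper[I, O](encode: I => O, decode: O => I)

object CassandraMapper {
  // A codec-supported type needs no transformation.
  implicit def identityMapper[T](implicit t: CassandraType[T]): CassandraMapper[T, T] =
    CassandraMapper(identity, identity)

  // Derive a mapper from a pair of MappedEncodings targeting a supported type.
  implicit def fromMappedEncodings[I, O](
    implicit
    to:   MappedEncoding[I, O],
    from: MappedEncoding[O, I],
    cas:  CassandraType[O]
  ): CassandraMapper[I, O] = CassandraMapper(to.f, from.f)
}

object Example {
  // Hypothetical user type stored as a UUID.
  final case class UserId(value: java.util.UUID)

  implicit val uuidIsCassandraType: CassandraType[java.util.UUID]    = CassandraType()
  implicit val encodeUserId: MappedEncoding[UserId, java.util.UUID]  = MappedEncoding(_.value)
  implicit val decodeUserId: MappedEncoding[java.util.UUID, UserId]  = MappedEncoding(UserId(_))

  // Summons the derived mapper.
  val mapper: CassandraMapper[UserId, java.util.UUID] =
    implicitly[CassandraMapper[UserId, java.util.UUID]]
}
```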