Commit 5e41069 (parent d757112)
Author: jared
Commit message: Draft of Slack feedback

File changed: docs/aiken-integration.md (+143 −68 lines)
# Lambda Buffers: Aiken Research Document

The Lambda Buffers team has researched the Aiken programming language in depth, with the aim of finding a technical path to integrating it with Lambda Buffers alongside the already integrated languages: Plutus, Haskell, Rust, and JavaScript.

The conclusion of this research phase is that, while it would indeed be possible for Lambda Buffers to provide limited support for Aiken, doing so would result in a poor user experience, in conflict either with the major language features of Aiken or with the key functionality provided by Lambda Buffers.

This document presents the challenges found in detail, and their impact on the feasibility and desirability of undertaking the Aiken integration.
## Aiken limitations
The version of Aiken investigated here is `aiken v1.0.28-alpha+c9a1519`.

### Aiken has no type class support
A key feature of Lambda Buffers is to provide both types and type class instances.
Aiken has no support for type classes, so one must generate the type class machinery oneself.
In other words, one must provide:
Moreover, it appears that Aiken does not provide any "back doors" to the type system.

It's clear by now that having an explicit type for an instance dictionary is not feasible in Aiken.
However, since an instance dictionary is a product type of functions, one can still achieve type classes via dictionary passing: replace each instance dictionary with one argument per type class method, and replace the function that creates an instance dictionary with one function per method.
This is indeed possible in Aiken. To demonstrate the technique, consider the following Haskell code (which loosely models code generated from Lambda Buffers):
```haskell
class Eq a where
```
This translation of type classes has some limitations, such as:

* Only Haskell2010 type classes would be supported.
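To make the dictionary-passing idea concrete, here is a small self-contained Haskell sketch (an illustrative model with hypothetical names such as `eqListWith`, not the document's elided example):

```haskell
-- A sketch of the dictionary-passing translation that a type-class-free
-- target language forces. With type classes one would write:
--
--   class Eq a where eq :: a -> a -> Bool
--   eqList :: Eq a => [a] -> [a] -> Bool
--
-- In the translation, the single-method `Eq` dictionary is replaced by one
-- explicit function argument per method.
eqListWith :: (a -> a -> Bool) -> [a] -> [a] -> Bool
eqListWith _ [] [] = True
eqListWith eqA (x : xs) (y : ys) = eqA x y && eqListWith eqA xs ys
eqListWith _ _ _ = False

main :: IO ()
main = print (eqListWith (==) [1, 2, 3 :: Int] [1, 2, 3]) -- True
```

Each use site must now thread the method arguments by hand, which is exactly the verbosity discussed below.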
While the above demonstrates how one can translate type class instances in Lambda Buffers to type-class-free code in Aiken,
it unfortunately leads to a bad user experience with respect to the "builtin" PlutusData type class in Aiken.
Aiken by default "generates" its own PlutusData instances for all composite types,
and provides some nice syntactic features that make writing smart contracts particularly readable.

A common pattern for writing a validator in Aiken is as follows.

```rust
pub type MyRecord<t> {a : t, b : Int }

validator {
  pub fn hello_world(_redeemer: MyRecord<Int>, _scriptContext: Data) {
    // ^~~~ this will automatically use Aiken's builtin PlutusData instances
    ...
  }
}
```
Unfortunately, with the type class system described in this section,
an Aiken developer can no longer write this (since Lambda Buffers would generate its own PlutusData instances, which are not used by Aiken),
and instead must write the validator more verbosely as follows.

```rust
pub type MyRecord<t> {a : t, b : Int }

validator {
  pub fn hello_world(redeemer: Data, _scriptContext: Data) {
    let actualRedeemer = myRecordFromData(intFromData)(redeemer)
    // ^~~~ Aiken users need to write more code in order to use Lambda
    // Buffers so that it will use Lambda Buffers' encoding for the
    // validator. Note that this sample assumes that
    // `myRecordFromData :: (Data -> t) -> Data -> MyRecord<t>`
    // exists, as would be generated by Lambda Buffers.
    ...
  }
}
```

Clearly, this increase in code bloat to express the same simple idea contradicts Aiken's promise of making smart contracts easy to write.
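The decoder shape assumed by the sample above can be loosely modeled in Haskell (a sketch under the stated assumption that Lambda Buffers would generate a `myRecordFromData`-style decoder; the `Data` type here is a minimal stand-in for the real UPLC builtin):

```haskell
-- A loose Haskell model (in the spirit of the document's earlier modeling)
-- of the hypothetical decoder Lambda Buffers would generate.
data Data = ListD [Data] | IntD Integer
  deriving (Show, Eq)

data MyRecord t = MyRecord { a :: t, b :: Integer }
  deriving (Show, Eq)

intFromData :: Data -> Integer
intFromData (IntD i) = i
intFromData _ = error "expected an integer"

-- The `Data -> t` decoder is passed explicitly, standing in for the missing
-- type class instance; records decode from a bare list of their fields.
myRecordFromData :: (Data -> t) -> Data -> MyRecord t
myRecordFromData fromDataT (ListD [x, IntD y]) = MyRecord (fromDataT x) y
myRecordFromData _ _ = error "unexpected Data shape"

main :: IO ()
main = print (myRecordFromData intFromData (ListD [IntD 1, IntD 2]))
-- prints: MyRecord {a = 1, b = 2}
```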
### Aiken's encoding of its data is different from Lambda Buffers' encoding

All onchain scripts must be compiled to UPLC, which must somehow represent the higher level language's constructs, such as its data types.
Often, data types in a higher level language are translated to UPLC's builtin `Data` type, which supports lists, constructors, integers, bytestrings, and maps.
Note that data which will exist as a datum or redeemer must admit a representation with this `Data` type.

Lambda Buffers chooses a particularly efficient encoding of its data types to `Data` for its target languages that compile to UPLC.
For example, a record like `MyRecord { a, b }` would be translated to `[a, b]`, i.e., records are lists of all record components[^recordsSpecialCases].
[^recordsSpecialCases]: There are some special cases for the encoding in Lambda Buffers. For example, singleton records are encoded as just the single element.

If Lambda Buffers compiled `MyRecord` to a [record in Aiken](https://aiken-lang.org/language-tour/custom-types) as follows.
```rust
type MyRecord {
  a: Int,
  b: Int,
}
```

Then, one can observe that Aiken will internally represent this as the following `Data`:

```
Constr 0 [a, b]
```
where we note that Aiken includes a redundant `Constr 0` tag, meaning Aiken's encoding is less efficient than Lambda Buffers' encoding.

In general, Aiken's documentation of the encoding from Aiken's types to UPLC `Data` is unclear,
but one can inspect the generated UPLC to verify that Aiken encodes the data as described above.
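The two encodings can be contrasted with a short Haskell sketch (the `Data` type below is a minimal stand-in for UPLC's builtin type, with only the constructors needed here; the helper names are hypothetical):

```haskell
-- Minimal stand-in for UPLC's builtin `Data`.
data Data
  = Constr Integer [Data]
  | List [Data]
  | I Integer
  deriving (Show, Eq)

-- Lambda Buffers' encoding of MyRecord {a, b}: a bare list of the fields.
lbEncoded :: Integer -> Integer -> Data
lbEncoded x y = List [I x, I y]

-- Aiken's encoding: the same fields under a redundant `Constr 0` tag.
aikenEncoded :: Integer -> Integer -> Data
aikenEncoded x y = Constr 0 [I x, I y]

main :: IO ()
main = do
  print (lbEncoded 1 2)    -- List [I 1,I 2]
  print (aikenEncoded 1 2) -- Constr 0 [I 1,I 2]
```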
This is awkward for Lambda Buffers: when Aiken works with the `MyRecord` type,
it is represented with the `Constr` tag, meaning that Lambda Buffers' efficient encoding would be translated to Aiken's inefficient encoding.
Ideally, one would change how Aiken encodes its data types internally so that Aiken could use Lambda Buffers' efficient encoding everywhere.
Thus, we lose all benefits of Lambda Buffers' efficient encoding when working with Aiken's mechanisms for defining types, because Lambda Buffers is forced to take an extra step to translate to Aiken's inefficient encoding.
As such, Aiken's opinionated way of encoding its data is at odds with Lambda Buffers.
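The extra translation step can itself be sketched in Haskell (a hypothetical model; `Data` is again a minimal stand-in for the UPLC builtin, and the conversion functions are illustrative names):

```haskell
-- Minimal stand-in for UPLC's builtin `Data`.
data Data
  = Constr Integer [Data]
  | List [Data]
  | I Integer
  deriving (Show, Eq)

-- The "massaging" forced at every boundary between the two encodings:
-- wrap the bare list into Aiken's `Constr 0` form on the way in, and strip
-- the tag again on the way out. Each direction costs an extra traversal.
toAiken :: Data -> Data
toAiken (List fields) = Constr 0 fields
toAiken d = d

fromAiken :: Data -> Data
fromAiken (Constr 0 fields) = List fields
fromAiken d = d

main :: IO ()
main = print (fromAiken (toAiken (List [I 1, I 2]))) -- List [I 1,I 2]
```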
To resolve the mismatch in the encoding of data between the two, one could alternatively sidestep Aiken's methods for defining types entirely, and instead use Aiken's opaque types to alias `Data`, providing one's own constructors and record accessors.
Interested readers may inspect the compiled UPLC to verify that the data encoding of `MyRecord` agrees with Lambda Buffers' encoding.
Note that this would most likely offer a poor user experience, as it would essentially replace a large part of Aiken's language constructs with our own generated functions for construction, deconstruction, and serialization / deserialization to `Data`.
In either case, mediating the `Data` serialization / deserialization mismatch between Aiken and Lambda Buffers puts a heavier mental overhead on the Aiken developer.
As in the previous section, an Aiken developer would expect to write a validator as follows.

```rust
pub type MyRecord {a : Int, b : Int }

validator {
  pub fn hello_world(_redeemer: MyRecord, _scriptContext: Data) {
    // ^~~~ this will automatically use Aiken's builtin Data serialization and deserialization
    ...
  }
}
```
But any of the solutions for mediating the `Data` encoding mismatch between Aiken and Lambda Buffers would force an Aiken developer to instead write a more verbose validator as follows.

```rust
pub type MyRecord {a : Int, b : Int }

validator {
  pub fn hello_world(redeemer: Data, _scriptContext: Data) {
    let actualRedeemer = myRecordFromData(redeemer)
    // ^~~~ Assume that Lambda Buffers would generate `myRecordFromData :: Data -> MyRecord`
    ...
  }
}
```
Again, it's clear this contradicts Aiken's goal of making smart contracts easy to write, as Lambda Buffers integration would increase the mental overhead of working with all generated data types.
### Aiken's packages only support fetching dependencies remotely
Lambda Buffers is more than just a code generator.
In addition to generating code for sharing types and semantics, its Nix tools augment a project's set of packages with a package of the generated Lambda Buffers code.

Aiken does support packages, but it appears to only officially support fetching packages from GitHub, GitLab, or BitBucket; i.e., it's unclear how to create a local package set augmented with Lambda Buffers' packages.
For example, attempting to build a project whose `aiken.toml` declares a dependency from any other source fails with `aiken build`,
where the error message makes it clear that it only expects the source of dependencies to be from either GitHub, GitLab, or BitBucket.
As such, it's unclear how to augment the local set of packages with a Lambda Buffers package.
Indeed, it's possible to trick Aiken into thinking that a Lambda Buffers package is already installed by preparing Aiken's build directory with the dependencies copied in ahead of time,
but this delves into implementation-specific details of Aiken that may break between releases.
An example of this technique is [here](https://github.com/mlabs-haskell/uplc-benchmark/blob/master/nix/aiken/lib.nix#L83).
## Summary of Aiken limitations
This section summarizes the Aiken limitations and incompatibilities with Lambda Buffers.

1. Aiken has no type classes, but Lambda Buffers requires type classes. As such, Lambda Buffers support for Aiken would require its own implementation of type classes.
   Unfortunately, implementing type classes is awkward in Aiken, because composite data types in Aiken cannot store functions.
   While there are workarounds to implement type classes in Aiken, they fundamentally create a poor user experience for an Aiken developer, as using Lambda Buffers' generated type classes such as PlutusData would be at odds with the builtin syntactic conveniences of Aiken's default PlutusData type class instances.

2. Aiken's PlutusData representation of its data types is different from Lambda Buffers' representation of PlutusData.
   This means that we have a choice of either:

   * translating Lambda Buffers types to Aiken's builtin composite types, which would lead to inefficient code in the already constrained onchain environment, since this would be "massaging" PlutusData representations when we would really want Aiken to use Lambda Buffers' PlutusData encoding directly; or

   * translating Lambda Buffers types to an opaque type alias in Aiken, which would require us to generate supporting functions for constructors and destructors, making Aiken's major language features obsolete and hence giving a poor user experience.

   To put this more explicitly, we either have inefficient code with a somewhat nice user experience for an Aiken developer, or efficient code with an awful user experience.

3. Creating local package sets in Aiken is unclear, but creating such local package sets is a principal feature of Lambda Buffers.
   Indeed, there are tricks one can do to work around this, but they depend on internal implementation details of Aiken that may break between releases.
All in all, it's clear that while it may be possible to integrate Aiken with Lambda Buffers, such an integration would have

* limited support for Lambda Buffers' key features; and

* a poor user experience for Aiken developers who use Lambda Buffers.

The extra effort needed to mitigate these challenges thus appears counterproductive to both Aiken's and Lambda Buffers' project goals.
Moreover, Aiken is still in an alpha release and is rapidly changing, so effort spent mitigating these challenges would be squandered as Aiken evolves.
Given these challenges, it would be unwise to undertake the Aiken implementation currently;
it would be wiser to revisit this later, and to focus today on matters of pressing importance that better foster adoption of Lambda Buffers.

Lambda Buffers has fortunately seen industry use in other projects such as [DeNS](https://github.com/mlabs-haskell/DeNS/tree/main), OrcFax, etc.,
and there has been feedback asking to improve Lambda Buffers' existing facilities, which would aid in fostering its adoption in the greater Cardano community.
Some of these issues include the following.

* Bugs:

  * Haskell backend bugs:

    * [Generated Haskell code is invalid](https://github.com/mlabs-haskell/lambda-buffers/issues/197)

    * [Missing dependencies from the generated files](https://github.com/mlabs-haskell/lambda-buffers/issues/124)

  * Plutarch backend bugs:

    * [Generated Plutarch code is invalid](https://github.com/mlabs-haskell/lambda-buffers/issues/148)

  * [Optimizing the Lambda Buffers compiler performance](https://github.com/mlabs-haskell/lambda-buffers/issues/76)

* Features:

  * [Completing the Plutus `.lbf` schemas to include all Plutus Ledger API types](https://github.com/mlabs-haskell/lambda-buffers/issues/175)

  * [Creating a versioning scheme](https://github.com/mlabs-haskell/lambda-buffers/issues/220)

  * [Separating the PlutusTx backend from a Haskell Plutus backend](https://github.com/mlabs-haskell/lambda-buffers/issues/221)

  * [Optimizing slow nix build times](https://github.com/mlabs-haskell/lambda-buffers/pull/193#issuecomment-1942114795)

  * [Improving error messages for better editor integration](https://github.com/mlabs-haskell/lambda-buffers/issues/152)
