`docs/aiken-integration.md` (+130 −32)
@@ -70,11 +70,37 @@ This makes it impossible to pass instance dictionaries via Aiken's builtin types
Alternatively, one could try to sidestep Aiken's builtin types by creating a type which is a Church encoded tuple
(i.e., implementing a tuple type via function types),
but doing so requires higher ranked types which again Aiken does not support.
Moreover, it appears that Aiken does not provide any "back doors" to the type system (e.g., TypeScript's `any` type) to trick the type system into accepting that a Church encoded tuple and its projections are well typed.
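To see why higher ranked types are needed, here is a small Haskell sketch of our own (not part of the document's generated code): a Church encoded pair is a function over a polymorphic continuation, so any consumer of the pair needs a `forall` to the left of an arrow, which is exactly the higher ranked quantification Aiken lacks.

```haskell
{-# LANGUAGE RankNTypes #-}

-- A pair represented as a function that hands both components to a
-- polymorphic continuation; the `forall` on the right-hand side makes
-- any consumer of `ChurchPair` higher ranked.
type ChurchPair a b = forall r. (a -> b -> r) -> r

mkPair :: a -> b -> ChurchPair a b
mkPair x y = \k -> k x y

-- Projections apply the pair to a chosen continuation.
fstC :: ChurchPair a b -> a
fstC p = p (\x _ -> x)

sndC :: ChurchPair a b -> b
sndC p = p (\_ y -> y)
```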

It's clear now that having an explicit type for an instance dictionary is not feasible in Aiken,
so, since an instance dictionary is a product type of functions, one can achieve type classes via dictionary passing by replacing each instance dictionary argument of a method with one argument per method of the type class, and by replacing the function that creates an instance dictionary with one function per method of the type class.
This is indeed possible in Aiken, and to demonstrate this technique, consider the following Haskell code (which loosely models code generated from LambdaBuffers):

```haskell
class Eq a where
  eq :: a -> a -> Bool

class PlutusData a where
  toData :: a -> Data
  fromData :: Data -> a

data MyOption a = MyJust a | MyNothing

instance Eq a => Eq (MyOption a) where
  eq (MyJust s) (MyJust t) = eq s t
  eq MyNothing MyNothing = True
  eq _ _ = False

instance PlutusData a => PlutusData (MyOption a) where
  toData (MyJust s) = Constr 0 [toData s]
  toData MyNothing = Constr 1 []

  fromData (Constr 0 [s]) = MyJust (fromData s)
  fromData (Constr 1 []) = MyNothing
  fromData _ = error "bad parse"
```
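To make the translation scheme concrete before looking at the Aiken code, here is a hedged Haskell sketch of our own (the names `myOptionEq` etc. are illustrative, not LambdaBuffers output; it reuses `MyOption`, `Data`, and `Constr` from the block above) of the same classes with the dictionaries erased: every method becomes a standalone function, and a method of a constrained instance takes the corresponding method for the type parameter as an explicit argument. The Aiken translation below follows the same shape.

```haskell
-- `eq` for `MyOption a`, given an `eq` for `a` passed explicitly.
myOptionEq :: (a -> a -> Bool) -> MyOption a -> MyOption a -> Bool
myOptionEq eqA (MyJust s) (MyJust t) = eqA s t
myOptionEq _   MyNothing  MyNothing  = True
myOptionEq _   _          _          = False

-- `toData` and `fromData` for `MyOption a`, given the corresponding
-- functions for `a` passed explicitly.
myOptionToData :: (a -> Data) -> MyOption a -> Data
myOptionToData toDataA (MyJust s) = Constr 0 [toDataA s]
myOptionToData _       MyNothing  = Constr 1 []

myOptionFromData :: (Data -> a) -> Data -> MyOption a
myOptionFromData fromDataA (Constr 0 [s]) = MyJust (fromDataA s)
myOptionFromData _         (Constr 1 [])  = MyNothing
myOptionFromData _         _              = error "bad parse"
```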

A translation to type class free code in Aiken is as follows.

```rust
use aiken/builtin as builtin
@@ -342,27 +368,33 @@ Ideally, one would want to change how Aiken encodes its data types internally so
Thus, we lose all benefits of LambdaBuffers' efficient encoding when working with Aiken's mechanisms to define types because LambdaBuffers is forced to take an extra step to translate to Aiken's inefficient encoding.
As such, Aiken's opinionated way of encoding its data is at odds with LambdaBuffers.

To resolve the mismatch in the encoding of data between the two, one could alternatively sidestep all of Aiken's methods for defining types and instead use Aiken's opaque types to alias `Data` and provide one's own constructors / record accessors as follows.

```rust
use aiken/builtin as builtin

pub opaque type MyRecord { data: Data }

// Constructor for `MyRecord`
pub fn myRecord(a: Int, b: Int) -> MyRecord {
  MyRecord { data: builtin.list_data([builtin.i_data(a), builtin.i_data(b)]) }
@@ -381,47 +413,93 @@ Interested readers may inspect the compiled UPLC to verify that the data encodin
          [
            (lam
              i_2
              [
                (lam
                  i_3
                  (lam
                    i_4
                    (lam
                      i_5
                      (force
                        [
                          [
                            [
                              i_3
                              [
                                (lam
                                  i_6
                                  (force
                                    [
                                      [
                                        [
                                          i_3
                                          [
                                            [
                                              (builtin equalsInteger)
                                              [
                                                (builtin unIData)
                                                [
                                                  i_1
                                                  [ (builtin unListData) i_6 ]
                                                ]
                                              ]
                                            ]
                                            (con integer 420)
                                          ]
                                        ]
                                        (delay
                                          [
                                            [
                                              (builtin equalsInteger)
                                              [
                                                (builtin unIData)
                                                [
                                                  i_1
                                                  [
                                                    i_0
                                                    [ (builtin unListData) i_6 ]
                                                  ]
                                                ]
                                              ]
                                            ]
                                            (con integer -420)
                                          ]
                                        )
                                      ]
                                      (delay (con bool False))
                                    ]
                                  )
                                )
                                [
                                  (builtin listData)
                                  [
                                    [ i_2 (con data (I 69)) ]
                                    [
                                      [ i_2 (con data (I -69)) ]
                                      (con (list data) [])
                                    ]
                                  ]
                                ]
                              ]
                            ]
                            (delay (con unit ()))
                          ]
                          (delay [ (error ) (force (error )) ])
                        ]
                      )
                    )
                  )
                )
                (force (builtin ifThenElse))
              ]
            )
            (force (builtin mkCons))
          ]
        )
        (force (builtin headList))
      ]
    )
    (force (builtin tailList))
  ]
)
```
@@ -482,11 +560,31 @@ $ aiken build

where the error message makes it clear that it expects dependencies to be sourced only from GitHub, GitLab, or BitBucket.

As such, it's unclear how to augment the local set of packages with a LambdaBuffers package.
Indeed, it's possible to trick Aiken into thinking that a LambdaBuffers package is already installed by preparing Aiken's build directory with the dependencies copied in ahead of time,
but this delves into implementation-specific details of Aiken that may break between releases.
An example of this technique is [here](https://github.com/mlabs-haskell/uplc-benchmark/blob/master/nix/aiken/lib.nix#L83).

## Summary of Aiken limitations

This section summarizes the Aiken limitations and incompatibilities with LambdaBuffers.

1. Aiken has no type classes, but LambdaBuffers requires type classes. As such, LambdaBuffers support for Aiken would require its own implementation of type classes.
   Unfortunately, implementing type classes is awkward in Aiken because composite data types in Aiken cannot store functions.
   It can be argued that this awkwardness creates a poor user experience for an Aiken developer, but this can be mitigated by the fact that the only type classes LambdaBuffers generates are relatively simplistic.

2. Aiken's PlutusData representation of its data types is different from LambdaBuffers' representation of PlutusData.
   This means that we have a choice of either:

   * Translating LambdaBuffers types to Aiken's builtin composite types, which would lead to inefficient code in the already constrained onchain environment, since this amounts to "massaging" PlutusData representations when we would really want Aiken to use the LambdaBuffers PlutusData encoding directly.

   * Translating LambdaBuffers types to an opaque type alias in Aiken, which would then require us to generate supporting functions for constructors and destructors, which would make Aiken's major language features obsolete and so give a poor user experience.

   To put this more explicitly, we either have inefficient code with a nice user experience for an Aiken developer, or efficient code with an awful user experience for an Aiken developer (see the sketch after this list).

3. Creating local package sets in Aiken is unclear, but creating such local package sets is a principal feature of LambdaBuffers.
   Indeed, there are tricks one can use to work around this, but they depend on internal implementation details of Aiken that may break between releases.
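To illustrate point 2, here is a rough Haskell sketch of our own (with a simplified stand-in for `Data`; the `Constr`-tagged shape for Aiken's builtin types is our assumption based on the discussion above) contrasting the two encodings of the earlier `MyRecord` example and the kind of "massaging" needed to bridge them.

```haskell
-- Simplified stand-in for PlutusData (names here are ours).
data Data = Constr Integer [Data] | List [Data] | I Integer

-- Assumed shape of the encoding that Aiken's builtin record types compile to.
aikenEncodedMyRecord :: Integer -> Integer -> Data
aikenEncodedMyRecord a b = Constr 0 [I a, I b]

-- The encoding used by the opaque `MyRecord` example earlier:
-- a bare list of fields with no constructor tag.
lbEncodedMyRecord :: Integer -> Integer -> Data
lbEncodedMyRecord a b = List [I a, I b]

-- Bridging the two requires "massaging" the representation at runtime.
toAikenEncoding :: Data -> Data
toAikenEncoding (List fields) = Constr 0 fields
toAikenEncoding d             = d
```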
## Alternative milestone 4 outputs

Seeing the aforementioned incompatibilities between Aiken and LambdaBuffers,