[gsoc24] Add an initial batch of proposals for compiler-research.org
vgvassilev committed Feb 5, 2024
1 parent 14a6164 commit 965480e
Showing 8 changed files with 194 additions and 1 deletion.
11 changes: 11 additions & 0 deletions _gsocorgs/2024/compres.md
@@ -0,0 +1,11 @@
---
title: "Compiler Research"
author: "Vassil Vassilev"
layout: default
organization: CompRes
logo: CompRes-logo.png
description: |
The Compiler Research Group brings together programming language enthusiasts at Princeton University and CERN. Its primary goal is to research foundational software tools that help scientists program for speed, interoperability, interactivity, flexibility, and reproducibility.
---

{% include gsoc_proposal.ext %}
17 changes: 17 additions & 0 deletions _gsocprojects/2024/project_Clad.md
@@ -0,0 +1,17 @@
---
project: Clad
layout: default
logo: Clad-logo.png
description: |
[Clad](https://clad.readthedocs.io/en/latest/) enables
automatic differentiation (AD) for C++. It is built on the LLVM compiler
infrastructure and is a plugin for the Clang compiler. Clad is based on
source code transformation: given the C++ source code of a mathematical
function, it can automatically generate C++ code for computing derivatives
of the function.
summary: |
Clad is an automatic differentiation (AD) tool for C++
---

{% include gsoc_project.ext %}
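
For illustration, a minimal sketch of the workflow this description refers to, using Clad's documented `clad::differentiate`/`execute` interface; the function `f` below is a made-up example:

```cpp
#include <cstdio>
#include "clad/Differentiator/Differentiator.h"

// A made-up scalar function: f(x, y) = x^2 * y.
double f(double x, double y) { return x * x * y; }

int main() {
  // Clad generates df/dx via source transformation at compile time.
  auto df_dx = clad::differentiate(f, "x");
  // Evaluate the generated derivative at (3, 4): d(x^2*y)/dx = 2*x*y = 24.
  printf("df/dx(3, 4) = %.2f\n", df_dx.execute(3, 4));
  return 0;
}
```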

34 changes: 34 additions & 0 deletions _gsocproposals/2024/proposal_Clad-GPUReverseMode.md
@@ -0,0 +1,34 @@
---
title: Enable reverse-mode automatic differentiation of (CUDA) GPU kernels using Clad
layout: gsoc_proposal
project: Clad
year: 2024
difficulty: medium
duration: 350
mentor_avail: June-October
organization:
- CompRes
---

## Description

Clad is an automatic differentiation (AD) Clang plugin for C++. Given the C++ source code of a mathematical function, it can automatically generate C++ code for computing derivatives of the function. Clad has found uses in statistical analysis and uncertainty assessment applications. In scientific computing and machine learning, GPU multiprocessing can provide a significant boost in performance and scalability. This project focuses on enabling the automatic differentiation of CUDA GPU kernels using Clad. This will allow users to take advantage of the power of GPUs while benefiting from the accuracy and speed of automatic differentiation.
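
A hypothetical sketch of what the project could enable is shown below. The kernel is illustrative, and the differentiation of `__global__` functions is the target capability, not an existing feature; the call is modeled on Clad's existing `clad::gradient` interface:

```cpp
#include "clad/Differentiator/Differentiator.h"

// An illustrative CUDA kernel: out[i] = in[i] * in[i].
__global__ void square(double* in, double* out, int n) {
  int i = blockIdx.x * blockDim.x + threadIdx.x;
  if (i < n) out[i] = in[i] * in[i];
}

int main() {
  // Target usage; the exact interface for kernels may differ
  // once the project lands.
  auto d_square = clad::gradient(square);
  // ... allocate device buffers and launch the generated derivative
  // kernel via d_square.execute(...) from host code.
  return 0;
}
```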

## Project Milestones

* Research automatic differentiation of code involving CUDA GPU kernels. Prepare a report and an initial strategy to follow. This may involve brainstorming and innovative solutions.
* Enable reverse-mode automatic differentiation of CUDA GPU kernels and calls to CUDA GPU kernels from the host code.
* Add proper tests and documentation.

## Requirements

* Automatic differentiation
* CUDA C++ programming
* C++ programming and Clang frontend

## Mentors
* **[Parth Arora](mailto:[email protected])**
* [Vassil Vassilev](mailto:[email protected])

## Links
* [Repo](https://github.com/vgvassilev/clad)
41 changes: 41 additions & 0 deletions _gsocproposals/2024/proposal_Clad-ObjectOrientedAD.md
@@ -0,0 +1,41 @@
---
title: Improve support for differentiating object-oriented constructs in Clad
layout: gsoc_proposal
project: Clad
year: 2024
difficulty: medium
duration: 350
mentor_avail: June-October
organization:
- CompRes
---

## Description

Clad is an automatic differentiation (AD) Clang plugin for C++. Given the C++ source code of a mathematical function, it can automatically generate C++ code for computing derivatives of the function. Clad has found uses in statistical analysis and uncertainty assessment applications.

Object-oriented programming (OOP) provides a structured approach for complex use cases, allowing for modular components that can be reused and extended. OOP also allows for abstraction, which makes code easier to reason about and maintain. Full OOP support is an open research area for automatic differentiation tools.

This project focuses on improving support for differentiating object-oriented constructs in Clad. This will allow users to seamlessly compute derivatives of algorithms in projects that use an object-oriented model. C++ object-oriented constructs include but are not limited to: classes, inheritance, polymorphism, and related features such as operator overloading.
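
A minimal sketch of the kind of usage this targets, assuming Clad's documented support for differentiating member functions; the `Circle` class is a made-up example:

```cpp
#include <cstdio>
#include "clad/Differentiator/Differentiator.h"

class Circle {
public:
  double scale = 2.0;
  // Member function to differentiate with respect to its parameter 'r'.
  double area(double r) { return scale * 3.14159265358979 * r * r; }
};

int main() {
  // Differentiate the member function with respect to 'r'.
  auto d_area = clad::differentiate(&Circle::area, "r");
  Circle c;
  // The object is passed first, then the arguments: expect 2*scale*pi*r.
  printf("d(area)/dr at r = 1: %.2f\n", d_area.execute(c, 1.0));
  return 0;
}
```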


## Project Milestones

* Study the current object-oriented differentiable programming support in Clad. Prepare a report on the missing constructs that should be added to support automatic differentiation of object-oriented paradigms in both forward and reverse mode AD.
* Known gaps include differentiation of constructors, limited support for differentiating operator overloads, reference class members, and the lack of a mechanism for specifying custom derivatives of constructors.
* Add support for the missing constructs.
* Add proper tests and documentation.


## Requirements

* Automatic differentiation
* C++ programming
* Clang frontend

## Mentors
* **[Parth Arora](mailto:[email protected])**
* [Vassil Vassilev](mailto:[email protected])

## Links
* [Repo](https://github.com/vgvassilev/clad)
85 changes: 85 additions & 0 deletions _gsocproposals/2024/proposal_Clad-constant-evaluation-contexts.md
@@ -0,0 +1,85 @@
---
title: Add support for consteval and constexpr functions in Clad
layout: gsoc_proposal
project: Clad
year: 2024
difficulty: medium
duration: 350
mentor_avail: June-October
organization:
- CompRes
---

## Description

In mathematics and computer algebra, automatic differentiation (AD) is a set of
techniques to numerically evaluate the derivative of a function specified by a
computer program. Automatic differentiation is an alternative to symbolic
differentiation and numerical differentiation (the method of finite
differences). Clad is based on Clang, which provides the necessary facilities
for code transformation. The AD library can differentiate non-trivial
functions, find partial derivatives for trivial cases, and has good unit test
coverage.

C++ provides the specifiers `consteval` and `constexpr` to allow compile-time
evaluation of functions. `constexpr` declares a possibility, i.e., the function
will be evaluated at compile time if possible and at runtime otherwise, whereas
`consteval` makes it mandatory, i.e., every call to the function must produce a
compile-time constant.
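
A minimal illustration of the difference between the two specifiers (a standalone snippet, not from the proposal):

```cpp
constexpr double sq(double x) { return x * x; }        // compile time if possible
consteval double cube(double x) { return x * x * x; }  // compile time, always

void demo(double runtime_arg) {
  constexpr double a = sq(3.0);    // evaluated at compile time
  double b = sq(runtime_arg);      // falls back to runtime evaluation
  double c = cube(3.0);            // OK: the argument is a constant expression
  // double d = cube(runtime_arg); // error: consteval requires a constant
}
```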

The aim of this project is to ensure that the generated derivative function
follows the same semantics, i.e., if the primal function is evaluated at
compile time (because of a `constexpr` or `consteval` specifier), then the
generated derivative code should carry the same specifier so that it can also
be evaluated at compile time.

This will enable Clad to demonstrate the benefits of performing automatic
differentiation directly in the C++ frontend, making full use of Clang's
infrastructure.

After successful completion of the project, the following code snippet should
work as expected:

```cpp
#include <cstdio>
#include "clad/Differentiator/Differentiator.h"

constexpr double sq(double x) { return x * x; }

consteval double fn(double x, double y, double z) {
  double res = sq(x) + sq(y) + sq(z);
  return res;
}

int main() {
  // The generated gradient should inherit the constant-evaluation
  // semantics of the primal function fn.
  auto d_fn = clad::gradient(fn);
  double dx = 0, dy = 0, dz = 0;
  d_fn.execute(3, 4, 5, &dx, &dy, &dz);
  printf("Gradient vector: [%.2f, %.2f, %.2f]\n", dx, dy, dz);
  return 0;
}
```

## Project Milestones

* Add support for differentiating `consteval` and `constexpr` functions in
  the forward mode.
* Add support for differentiating `consteval` and `constexpr` functions in
  the reverse mode.
* Extend the unit test coverage.
* Develop tutorials and documentation.
* Present the work at relevant meetings and conferences.

## Requirements

* Automatic differentiation
* C++ programming
* Clang frontend

## Mentors
* **[Vaibhav Thakkar](mailto:[email protected])**
* [Vassil Vassilev](mailto:[email protected])

## Links
* [Repo](https://github.com/vgvassilev/clad)
7 changes: 6 additions & 1 deletion gsoc/2024/mentors.md
@@ -6,6 +6,11 @@ layout: plain
**Note for contributors:** entries must be sorted in **last name** alphabetic order

## Full Mentor List (Name, Email, Org)
* Parth Arora [[email protected]](mailto:[email protected]) CompRes
* Jakob Blomer [[email protected]](mailto:[email protected]) CERN
* Benedikt Hegner [[email protected]](mailto:[email protected]) CERN
* Wim Lavrijsen [[email protected]](mailto:[email protected]) CompRes
* Alexander Penev [[email protected]](mailto:[email protected]) CompRes
* Vaibhav Thakkar [[email protected]](mailto:[email protected]) CompRes
* Vassil Vassilev [[email protected]](mailto:[email protected]) CompRes
* Valentin Volkl [[email protected]](mailto:[email protected]) CERN
Binary file modified images/Clad-logo.png
Binary file modified images/CompRes-logo.png
