🚂 AnyGrad: A Flexible Engine for Tensors and Neural Networks

Overview

AnyGrad is a simple tensor library that makes forward and backward passes easy. It pairs a high-performance C++ backend with a user-friendly Python frontend, and it is designed so the compute engine can be swapped with minimal code changes.

Note: version 0.0.2 does not yet support external engines. Integrations with engines such as NumPy and PyTorch are planned, and once available they will cover everything from basic tensor operations to high-level transformer training.

Installation

Install the library from PyPI:

pip install anygrad

If you'd like to work on the code:

git clone https://github.com/Ruhaan838/AnyGrad.git
cd AnyGrad
./setup.sh

Getting Started

Creating a Tensor

Create tensors by importing the library and instantiating Tensor. Gradients are not tracked by default; enable tracking with requires_grad=True:

import anygrad

# A tensor that does not calculate gradients
a = anygrad.Tensor([1, 2, 3])  

# A tensor with gradient tracking enabled
b = anygrad.Tensor([2, 3, 4], requires_grad=True)  

# A tensor with a specific data type (float64)
c = anygrad.Tensor([2, 3, 4], dtype=anygrad.float64)

Other supported data types:
anygrad.int32
anygrad.int64
anygrad.bool
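
For illustration, here is a minimal sketch of constructing tensors with each of these types. It assumes the dtype argument works uniformly for every listed type, just as it does for float64 above (an assumption, not confirmed by the docs):

import anygrad

# Assumption: dtype= accepts each listed type, mirroring the float64 example above
i32 = anygrad.Tensor([1, 2, 3], dtype=anygrad.int32)
i64 = anygrad.Tensor([1, 2, 3], dtype=anygrad.int64)
flags = anygrad.Tensor([1, 0, 1], dtype=anygrad.bool)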

Arithmetic Operations

Element-wise Operations

Perform calculations on tensors element by element:

d = a + b         # addition
d = a * d         # multiplication
d = d / 10        # division
e = d - 10        # subtraction
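
In most autograd libraries, an operation with at least one requires_grad=True input returns a tensor that also tracks gradients. Assuming AnyGrad follows that convention and exposes a requires_grad attribute (an assumption, not confirmed above), you could check:

# Assumption: requires_grad propagates through ops, as in typical autograd libraries
d = a + b
print(d.requires_grad)  # expected: True, since b tracks gradients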

Matrix Multiplication

You can multiply matrices in two ways:

# Using the @ operator:
a = anygrad.ones((1, 2, 3), requires_grad=True)
b = anygrad.ones((2, 3, 4), requires_grad=True)
c = a @ b         # tensor of shape (2, 2, 4)

# Or using the function:
c = anygrad.matmul(a, b)
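
The shape comment above implies NumPy-style batched matrix multiplication: the leading batch dimensions (1 and 2) broadcast together, while the trailing two dimensions follow the usual (m, k) @ (k, n) -> (m, n) rule. A quick shape check, assuming tensors expose a shape attribute (an assumption):

# Batch dims broadcast: 1 and 2 -> 2; matrix dims: (2, 3) @ (3, 4) -> (2, 4)
print(c.shape)  # expected: (2, 2, 4)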

Gradient Calculation

AnyGrad automatically computes gradients, which you can access after running the backward pass:

a = anygrad.Tensor([1, 2, 3], requires_grad=True)
b = anygrad.Tensor([2, 3, 4], requires_grad=True)
c = a * b 
result = c.sum()
result.backward()

print(a.grad)
print(b.grad)
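
For this example the gradients follow directly from the chain rule: with result = sum(a * b), the derivative with respect to a is b and the derivative with respect to b is a. So, exact print formatting aside, the outputs should carry these values:

# d(sum(a * b)) / da = b  ->  a.grad holds [2, 3, 4]
# d(sum(a * b)) / db = a  ->  b.grad holds [1, 2, 3]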

Contributing

Contributions are welcome! Whether you want to improve performance or enhance the documentation, please open an issue or submit a pull request.

License

This project is licensed under the terms outlined in the LICENSE file.