
Commit 89bbadb

Improvements to documentation/readme (#64)
1 parent: c81e243

5 files changed: 16 additions (+), 10 deletions (-)

README.md (+6 -6)
@@ -14,14 +14,14 @@

 # Torchhd

-Torchhd is a Python library for Hyperdimensional Computing.
+Torchhd is a Python library for *Hyperdimensional Computing* (also known as *Vector Symbolic Architectures*).

-* **Easy-to-use:** Torchhd makes it painless to develop a wide range of Hyperdimensional Computing (HDC) applications and algorithms. For someone new to the field we provide Pythonic abstractions and examples to get you started fast. For the experienced researchers we made the library modular by design, giving you endless flexibility to prototype new ideas in no-time.
-* **Performant:** The library is built on top of the high-performance PyTorch library, giving you optimized tensor execution without the headaches. Moreover, PyTorch makes it effortless to accelerate your code on a GPU.
+* **Easy-to-use:** Torchhd makes it painless to develop a wide range of Hyperdimensional Computing (HDC) applications and algorithms. For someone new to the field, we provide Pythonic abstractions and examples to get you started fast. For the experienced researchers, we made the library modular by design, giving you endless flexibility to prototype new ideas in no-time.
+* **Performant:** The library is built on top of the high-performance [PyTorch](https://pytorch.org/) library, giving you optimized tensor execution without the headaches. Moreover, PyTorch makes it effortless to accelerate your code on a GPU.

 ## Installation

-Torchhd is hosted on PyPi and Anaconda, use one of the following commands to install:
+Torchhd is hosted on PyPi and Anaconda. Use one of the following commands to install:

 ```bash
 pip install torch-hd
@@ -89,12 +89,12 @@ torchhd.functional.cosine_similarity(usd_of_mex, memory)
 # The hypervector for the Mexican Peso is the most similar.
 ```

-This example is from the paper [What We Mean When We Say "What's the Dollar of Mexico?": Prototypes and Mapping in Concept Space](https://redwood.berkeley.edu/wp-content/uploads/2020/05/kanerva2010what.pdf) by Kanerva. It first creates hypervectors for all the symbols that are used in the computation, i.e., the variables for `country`, `capital`, and `currency` and their values for both countries. These hypervectors are then combined to make a single hypervector for each country using a hash table structure. A hash table encodes key-value pairs as: `k1 * v1 + k2 * v2 + ... + kn * vn`. The hash tables are then bound together to form their combined representation which is finally queried by binding with the Dollar hypervector to obtain the approximate Mexican Peso hypervector. From the similarity output it shows that the Mexican Peso hypervector is indeed the most similar one.
+This example is from the paper [What We Mean When We Say "What's the Dollar of Mexico?": Prototypes and Mapping in Concept Space](https://redwood.berkeley.edu/wp-content/uploads/2020/05/kanerva2010what.pdf) by Kanerva. It first creates hypervectors for all the symbols that are used in the computation, i.e., the variables for `country`, `capital`, and `currency` and their values for both countries. These hypervectors are then combined to make a single hypervector for each country using a hash table structure. A hash table encodes key-value pairs as: `k1 * v1 + k2 * v2 + ... + kn * vn`. The hash tables are then bound together to form their combined representation which is finally queried by binding with the Dollar hypervector to obtain the approximate Mexican Peso hypervector. The similarity output shows that the Mexican Peso hypervector is indeed the most similar one.


 ## About

-Initial development of Torchhd was performed by Mike Heddes and Igor Nunes as part of their research in Hyperdimensional Computing at the University of California, Irvine. The library was extended with significant contributions from Pere Vergés and Dheyay Desai. Torchhd later merged with a project by Rishikanth Chandrasekaran who worked on similar problems as part of his research at the University of California, San Diego.
+Initial development of Torchhd was performed by [Mike Heddes](https://www.mikeheddes.nl/) and [Igor Nunes](https://sites.uci.edu/inunes/) as part of their research in Hyperdimensional Computing at the University of California, Irvine. The library was extended with significant contributions from Pere Vergés and Dheyay Desai. Torchhd later merged with a project by Rishikanth Chandrasekaran, who worked on similar problems as part of his research at the University of California, San Diego.

 ## Contributing

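For readers skimming this diff, here is a minimal sketch of the hash-table encoding the paragraph above describes. It reuses the `usd_of_mex` and `memory` names and the `random_hv`, `hash_table`, and `cosine_similarity` calls that appear in the changed files; the `bind` call, the dimensionality, and the exact variable layout are illustrative assumptions rather than the README's verbatim example.

```python
import torch
import torchhd.functional as functional

d = 10000  # hypervector dimensionality (assumed value)

# Hypervectors for the three variables (keys) and their values for both countries.
keys = functional.random_hv(3, d)        # country, capital, currency
usa_values = functional.random_hv(3, d)  # United States, Washington DC, Dollar
mex_values = functional.random_hv(3, d)  # Mexico, Mexico City, Peso

# Hash-table encoding of each country: k1 * v1 + k2 * v2 + k3 * v3
usa = functional.hash_table(keys, usa_values)
mex = functional.hash_table(keys, mex_values)

# Bind the two records together, then query with the Dollar hypervector
# to obtain an approximation of the Mexican Peso hypervector.
usa_to_mex = functional.bind(usa, mex)
usd_of_mex = functional.bind(usa_to_mex, usa_values[2])

# Compare the query result against every stored hypervector.
memory = torch.cat([keys, usa_values, mex_values])
print(functional.cosine_similarity(usd_of_mex, memory))
# The entry for the Mexican Peso should score highest.
```
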
docs/getting_started.rst (+7 -3)
@@ -41,7 +41,11 @@ The first step to encode these records is to define the basis-hypervectors for e
     seasons = functional.circular_hv(4, d)
     var = functional.random_hv(3, d)

-which creates hypervectors for the 3 fruit types, 10 weight levels, 4 seasons and the 3 variables.
+which creates hypervectors for the 3 fruit types, 10 weight levels, 4 seasons and the 3 variables. The figure below illustrates the distance between the pairs of hypervectors in each set:
+
+.. image:: images/basis-hvs.png
+   :width: 500
+   :align: center

 Similar behavior can be achieved using the classes in the :ref:`embeddings` module. The classes add convenience methods for mapping values to hypervectors. For example, to map the interval :math:`[0, 200]` to the ten weight hypervectors the :ref:`functional<functional>` version above requires an explicit mapping to an index:

@@ -67,7 +71,7 @@ whereas the :ref:`embeddings<embeddings>` have this common behavior built-in:
 Operations
 ----------

-Once the basis-hypervectors are defined, we can use the MAP operations from :ref:`functional` to represent more complex objects. The hypervector for record :math:`r_1` can then be created as follows:
+Once the basis-hypervectors are defined, we can use the MAP operations from :ref:`functional` to encode more complex objects by combining basis-hypervectors. The hypervector for record :math:`r_1` can be created as follows:

 .. code-block:: python

@@ -93,7 +97,7 @@ Alternatively, we can use one of the commonly used encodings provided in the :re
     values = torch.stack([fruits[0], weights[w_i], seasons[3]])
     r1 = functional.hash_table(var, values)

-The :ref:`structures` module contains the same encoding patterns in addition to binary trees and finite state automata, but provides them as data structures. This module provides class-based implementations of HDC data structures. Using the hash table class, record :math:`r_1` can be implemented as follows:
+The :ref:`structures` module contains the same encoding patterns in addition to binary trees and finite state automata, but provides them as data structures. This module provides class-based implementations of HDC data structures. Using the hash table class, record :math:`r_1` can be represented as follows:

 .. code-block:: python

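As a companion to the getting_started changes above, here is a minimal, self-contained sketch of the record encoding they walk through. The `random_hv`, `circular_hv`, and `hash_table` calls and the `fruits`/`weights`/`seasons`/`var`/`w_i` names come from the diff; `level_hv`, the `bind`/`bundle` calls, and the concrete weight index are assumptions made for illustration.

```python
import torch
import torchhd.functional as functional

d = 10000  # hypervector dimensionality (assumed value)

# Basis-hypervectors for the record fields and their values.
fruits = functional.random_hv(3, d)     # 3 unrelated fruit symbols
weights = functional.level_hv(10, d)    # 10 weight levels (assumed function name)
seasons = functional.circular_hv(4, d)  # 4 circularly correlated seasons
var = functional.random_hv(3, d)        # the 3 variables: fruit, weight, season

# Index obtained by mapping a raw weight in [0, 200] to one of the ten levels.
w_i = 7  # placeholder index for illustration

# MAP encoding of record r1: bind each variable with its value, then bundle.
r1 = functional.bundle(
    functional.bundle(
        functional.bind(var[0], fruits[0]),
        functional.bind(var[1], weights[w_i]),
    ),
    functional.bind(var[2], seasons[3]),
)

# Equivalent hash-table encoding shown in the diff.
values = torch.stack([fruits[0], weights[w_i], seasons[3]])
r1_alt = functional.hash_table(var, values)
```

Both forms follow the same bind-and-bundle pattern; `hash_table` simply packages it as a single call.
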
docs/images/basis-hvs.png (binary image, 105 KB)

docs/index.rst (+1 -1)
@@ -1,7 +1,7 @@
 Welcome to the Torchhd documentation!
 =====================================

-*Torchhd* is a Python library dedicated to Hyperdimensional Computing and the operations related to it.
+Torchhd is a Python library dedicated to *Hyperdimensional Computing* (also known as *Vector Symbolic Architectures*).

 .. toctree::
    :glob:

docs/structures.rst (+2)
@@ -5,6 +5,8 @@ torchhd.structures

 .. currentmodule:: torchhd.structures

+This module provides class-based implementations of HDC data structures.
+
 .. autosummary::
    :toctree: generated/
    :template: class.rst
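
To make the added sentence concrete, here is a hypothetical usage sketch of the class-based interface. The class name `HashTable` and the `add`/`get` methods are assumptions about the structures API at this commit, not signatures confirmed by the diff.

```python
import torchhd.functional as functional
from torchhd import structures

d = 10000  # hypervector dimensionality (assumed value)
keys = functional.random_hv(3, d)
values = functional.random_hv(3, d)

# A class-based hash table keeps the bundled key-value bindings internally,
# instead of the user composing bind/bundle calls by hand.
# Class and method names here are assumptions for illustration.
table = structures.HashTable(d)
table.add(keys[0], values[0])
table.add(keys[1], values[1])
table.add(keys[2], values[2])

# Querying with a key unbinds it, returning an approximation of the stored value.
approx = table.get(keys[0])
print(functional.cosine_similarity(approx, values))
```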
