New features in this release
Classical generative models 💻
Two classical generative energy-based models that learn from binary data are now available:
- `qml_benchmarks.models.energy_based_model.RestrictedBoltzmannMachine`: a wrapped version of scikit-learn's `BernoulliRBM` class.
- `qml_benchmarks.models.energy_based_model.DeepEBM`: an energy-based model trained with k-contrastive divergence. The energy function is defined via a multi-layer perceptron neural network; more structured energy functions can be implemented by changing the `MLP` class object to any other differentiable neural network written in Flax.
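To make the training principle concrete, here is a minimal NumPy sketch of contrastive divergence with k = 1 for a Bernoulli RBM. This is illustrative only and is not the library's implementation (which wraps scikit-learn's `BernoulliRBM` or a Flax-based energy network); all function names here are hypothetical.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_step(W, b, c, v0, rng, lr=0.1):
    """One contrastive-divergence (k=1) update for a Bernoulli RBM.

    Illustrative sketch, not the library's implementation.
    W: (n_visible, n_hidden) weights; b: visible biases; c: hidden biases;
    v0: (batch, n_visible) batch of binary training data.
    """
    # Positive phase: hidden activations driven by the data
    ph0 = sigmoid(v0 @ W + c)
    h0 = (rng.random(ph0.shape) < ph0).astype(float)
    # Negative phase: one Gibbs step, visible then hidden
    pv1 = sigmoid(h0 @ W.T + b)
    v1 = (rng.random(pv1.shape) < pv1).astype(float)
    ph1 = sigmoid(v1 @ W + c)
    # Parameter updates: data correlations minus one-step model correlations
    W += lr * (v0.T @ ph0 - v1.T @ ph1) / len(v0)
    b += lr * (v0 - v1).mean(axis=0)
    c += lr * (ph0 - ph1).mean(axis=0)
    return W, b, c
```

For k > 1, the negative phase simply runs k Gibbs steps instead of one before the parameter update.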
The models have a similar structure to the existing classifier models and feature two new methods, `model.sample` and `model.score`, which can be used to sample the probability distribution defined by the model and to score the model on a test dataset of samples.
New datasets for generative learning 🎲
Two new data-generating functions for dataset creation are now available:
- `qml_benchmarks.data.ising.generate_ising` generates datasets that are approximate thermal distributions of classical Ising spin Hamiltonians. Configurations are sampled with the standard Metropolis-Hastings Markov chain Monte Carlo method.
- `qml_benchmarks.data.spin_blobs.generate_spin_blobs` generates datasets that are analogues of 'Gaussian blobs' datasets, but for binary data. Bit-string configurations are sampled close in Hamming distance to a number of specified peak configurations. The generator also returns labels corresponding to the relevant peak configuration, so it can be used for multi-class classification as well. Another generator, `generate_8blobs`, produces a simple 16-bit dataset of this kind.
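The Metropolis-Hastings approach used for the Ising datasets can be sketched as follows. This standalone NumPy sketch illustrates the sampling idea only; the actual signature and options of `generate_ising` may differ, and the function name below is hypothetical.

```python
import numpy as np

def sample_ising(J, h, beta, n_samples, n_burn=1000, n_thin=10, seed=0):
    """Sample +/-1 spin configurations from the thermal distribution
    p(s) ~ exp(-beta * E(s)), with E(s) = -1/2 * s.T J s - h.T s,
    using single-spin-flip Metropolis-Hastings.

    Illustrative sketch; J is assumed symmetric with zero diagonal.
    """
    rng = np.random.default_rng(seed)
    n = len(h)
    s = rng.choice([-1, 1], size=n)
    samples = []
    for step in range(n_burn + n_samples * n_thin):
        i = rng.integers(n)
        # Energy change from flipping spin i: 2 * s_i * (local field)
        dE = 2 * s[i] * (J[i] @ s + h[i])
        # Metropolis acceptance rule
        if dE <= 0 or rng.random() < np.exp(-beta * dE):
            s[i] = -s[i]
        # Keep every n_thin-th configuration after burn-in
        if step >= n_burn and (step - n_burn) % n_thin == 0:
            samples.append(s.copy())
    return np.array(samples)
```

Burn-in discards the chain's initial transient, and thinning reduces correlation between consecutive kept samples.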
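The spin-blobs idea (sampling bit strings close in Hamming distance to peak configurations, with the peak index as a class label) can be sketched like this. The function name and the bit-flip parameterization are assumptions for illustration, not the library's API.

```python
import numpy as np

def sample_spin_blobs(peaks, n_samples, flip_prob=0.1, seed=0):
    """Sample bit strings near given peak configurations in Hamming distance.

    Illustrative sketch; names and parameters are hypothetical.
    peaks: (n_peaks, n_bits) array of 0/1 peak configurations.
    Each sample picks a peak uniformly at random, then flips each bit
    independently with probability flip_prob, so samples stay close to
    their peak in Hamming distance. The peak index is the label.
    """
    rng = np.random.default_rng(seed)
    peaks = np.asarray(peaks)
    labels = rng.integers(len(peaks), size=n_samples)
    X = peaks[labels].copy()
    flips = rng.random(X.shape) < flip_prob
    X[flips] = 1 - X[flips]
    return X, labels
```

With a small `flip_prob`, the expected Hamming distance from the chosen peak is `flip_prob * n_bits`, giving well-separated binary "blobs" when the peaks themselves are far apart.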
Hyperparameter optimization for generative models 🎯
The existing hyperparameter optimization script found in `/scripts` is now compatible with generative models that follow this structure; see the updated README for more information.