Request for the correct version of the code to reproduce the paper results #3
Comments
Hi BassantToblba1234, we are not sure what your question is. This is the code that we used to generate our results. It includes multiple different configurations, such as PCA.
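A minimal sketch of what such a PCA preprocessing step typically looks like, assuming scikit-learn and placeholder data; the component count, scaling, and dataset below are illustrative and are not taken from this repository:

```python
# Illustrative PCA preprocessing sketch (placeholder data and dimensions,
# not the repository's actual configuration).
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
data = rng.normal(size=(500, 64))        # stand-in for the real dataset

pca = PCA(n_components=4)                # reduce to a qubit-friendly dimension
reduced = pca.fit_transform(data)

# Rescale each feature to [0, pi] so it can serve as a rotation angle.
lo, hi = reduced.min(axis=0), reduced.max(axis=0)
angles = (reduced - lo) / (hi - lo) * np.pi
print(angles.shape)                      # (500, 4)
```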
Dear Prof Yingmao,
Thank you very much for your reply.
I mean that I’d like you to share with me the code that generates the results (for
QuGAN and the other GANs compared, such as Qi-GAN and TF-GAN).
I tried to use the shared code, but I could not reproduce the results
reported in the paper
(Table 1 and Figures 8, 10, and 11, as shown in the screenshots below).
I wonder if you could kindly share the code that produced them so that I can
run some experiments on it. I highly appreciate your help.
Best Regards,
Hi BassantTolba1234, we did not provide the code for the TFQ-GANs and Qi-GANs, as they were basically TensorFlow- and Qiskit-based implementations with default loss functions. You can check the papers below.
Quantum generative adversarial networks for learning and loading random distributions (Qi-GAN)
TensorFlow Quantum: A software framework for quantum machine learning (combining TFQ and TF)
As for QuGAN, due to random seeds (within the Qiskit simulators), you won't see exactly the same values; however, you should see similar trends. I'm not sure what values and trends of the Hellinger distance you obtained. We can answer specific questions related to the code, such as PCA dimensions, data labels, and parameter values.
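For readers comparing their own runs against the paper's Hellinger-distance curves: the metric itself is straightforward to compute with NumPy, and fixing the simulator seed (for example, Aer's seed_simulator option) makes an individual run repeatable, even though only the trends, not the exact values, are expected to match. A rough sketch under those assumptions, with made-up measurement counts:

```python
import numpy as np

def hellinger_distance(p, q):
    """Hellinger distance between two discrete probability distributions."""
    p = np.asarray(p, dtype=float) / np.sum(p)
    q = np.asarray(q, dtype=float) / np.sum(q)
    return float(np.sqrt(0.5 * np.sum((np.sqrt(p) - np.sqrt(q)) ** 2)))

# Hypothetical measurement histograms for the real and generated data.
real_counts = {"00": 480, "01": 260, "10": 180, "11": 104}
fake_counts = {"00": 430, "01": 290, "10": 196, "11": 108}

keys = sorted(set(real_counts) | set(fake_counts))
p = [real_counts.get(k, 0) for k in keys]
q = [fake_counts.get(k, 0) for k in keys]
print(hellinger_distance(p, q))   # 0 means identical distributions
```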
Thank you very much for your clarification.
Yes, I got it.
I have some questions, please.
1. How can I calculate the number of parameters of QuGAN, and how can I
control it?
2. Does QuGAN use training-set samples or test-set samples, and how many
samples are used?
3. How many labeled dataset samples (real data) were used, and where are
they used in the code?
I noticed that only the input dataset samples are transformed into
qubits in the code, not the real labeled output data.
Could you please kindly clarify these points for me?
Thank you very much in advance.
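On the first question, a generic way to see how many trainable parameters a variational Qiskit circuit exposes is to build it from a ParameterVector and inspect circuit.parameters; the layer layout below is only an illustrative sketch under that assumption, not the QuGAN generator itself:

```python
# Generic sketch: counting and controlling the trainable parameters of a
# variational circuit (illustrative layout, not the QuGAN architecture).
from qiskit import QuantumCircuit
from qiskit.circuit import ParameterVector

n_qubits, n_layers = 4, 3
theta = ParameterVector("theta", length=n_qubits * n_layers)

qc = QuantumCircuit(n_qubits)
for layer in range(n_layers):
    for q in range(n_qubits):
        qc.ry(theta[layer * n_qubits + q], q)   # one trainable angle per qubit per layer
    for q in range(n_qubits - 1):
        qc.cx(q, q + 1)                         # entangling block, no parameters

print(len(qc.parameters))   # 12 here; grows with n_qubits * n_layers
```

In a circuit of this shape, the parameter count is set by the number of qubits, the number of layers, and which gates in each layer are parameterized.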
Dear Prof,
could you please kindly share the version of the code that produces the same results as in the paper?
I look forward to your reply.
Thanks a lot.