
Rename new neural networks tutorial and add contents #6105


Draft · wants to merge 2 commits into base `main`

Conversation

@RalfG RalfG commented May 17, 2025

  • Renames the tutorial to a more fitting title
  • Adds metadata
  • Adds tutorial contents

@bebatut (Member) left a comment:

Here is my review with some suggestions.
Could you add the previous path in bin/check-url-persistence.sh here:
https://github.com/galaxyproject/training-material/blob/main/bin/check-url-persistence.sh#L32?
Thanks a lot!

- Discovering the basic training loop
- Learning the concepts of an activation function
- Training a simple fully-connected neural network
time_estimation: 1H

Suggested change
  time_estimation: 1H
+ redirect_from:
+ - /topics/statistics/tutorials/deep-learning-without-gai-with-python/tutorial.md

layout: tutorial_hands_on
title: "Practical deep learning with PyTorch"
level: Intermediate
draft: true

Suggested change
- draft: true

- intro-to-ml-with-python
- neural-networks-with-python
questions:
- to do

Could you add a few questions covered in the tutorial?

---


# 1 Neural networks with PyTorch

Suggested change
- # 1 Neural networks with PyTorch
+ # Neural networks with PyTorch

_ = torch.manual_seed(42)
```
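Since the tutorial seeds the global generator up front, it may help to show what that buys: a small sketch of how `torch.manual_seed` makes random tensor operations reproducible.

```python
import torch

# Seeding the global generator makes subsequent random ops reproducible
_ = torch.manual_seed(42)
a = torch.rand(3)

# Re-seeding with the same value replays the same random sequence
_ = torch.manual_seed(42)
b = torch.rand(3)

print(torch.equal(a, b))  # True: identical tensors from identical seeds
```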

## 1.1 Working with tensors

Suggested change
- ## 1.1 Working with tensors
+ ## Working with tensors


Now the dataset can be accessed by index, just like a list. Each item is a tuple containing the features and targets as tensors.
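As a quick illustration of index access, here is a sketch using `TensorDataset`; the data is made up for the example and the tutorial's actual dataset class may differ.

```python
import torch
from torch.utils.data import TensorDataset

# Illustrative data only: 5 samples with 3 features each, binary targets
features = torch.arange(15, dtype=torch.float32).reshape(5, 3)
targets = torch.tensor([0, 1, 0, 1, 1])

dataset = TensorDataset(features, targets)

# Indexing returns a (features, target) tuple of tensors, like a list
x, y = dataset[0]
print(x)  # tensor([0., 1., 2.])
print(y)  # tensor(0)
```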

*Assignment: Try to access the first 10 samples of the dataset. What do you see?*

Suggested change
- *Assignment: Try to access the first 10 samples of the dataset. What do you see?*
+ > <hands-on-title></hands-on-title>
+ > Try to access the first 10 samples of the dataset.
+ >
+ > > <question-title></question-title>
+ > > What do you see?
+ > {: .question}
+ {: .hands-on}


### 1.4.2 Extending the model architecture

*Assignment: Modify the `BreastCancerClassifier` class to include three hidden layers, each with a decreasing number of neurons (30, 15, 1). Use the ReLU activation function between each layer. The final output layer should have 1 neuron and use the sigmoid activation function.*

Suggested change
- *Assignment: Modify the `BreastCancerClassifier` class to include three hidden layers, each with a decreasing number of neurons (30, 15, 1). Use the ReLU activation function between each layer. The final output layer should have 1 neuron and use the sigmoid activation function.*
+ > <hands-on-title></hands-on-title>
+ > 1. Modify the `BreastCancerClassifier` class to include three hidden layers, each with a decreasing number of neurons (30, 15, 1).
+ > 2. Use the ReLU activation function between each layer.
+ >
+ > The final output layer should have 1 neuron and use the sigmoid activation function.
+ {: .hands-on}
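One possible reading of this exercise, sketched as a `nn.Module`: the class name comes from the tutorial, but the input size of 30 features and the 30 → 15 → 1 layer sizes are this sketch's interpretation of the assignment, not the tutorial's reference solution.

```python
import torch
from torch import nn

class BreastCancerClassifier(nn.Module):
    """Sketch: hidden layers of 30 and 15 neurons, sigmoid output."""

    def __init__(self, n_features: int = 30):  # 30 input features assumed
        super().__init__()
        self.layers = nn.Sequential(
            nn.Linear(n_features, 30),
            nn.ReLU(),
            nn.Linear(30, 15),
            nn.ReLU(),
            nn.Linear(15, 1),
            nn.Sigmoid(),  # squashes the output into (0, 1) for binary classification
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.layers(x)

model = BreastCancerClassifier()
out = model(torch.rand(4, 30))  # a batch of 4 samples
print(out.shape)  # torch.Size([4, 1])
```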


### 1.4.3 Training the model

*Assignment: Now repeat the training process by reusing your code from 1.3.3. How does the model perform? Is it better than the previous model?*

Suggested change
- *Assignment: Now repeat the training process by reusing your code from 1.3.3. How does the model perform? Is it better than the previous model?*
+ > <hands-on-title></hands-on-title>
+ > Repeat the training process by reusing your code from 1.3.3.
+ >
+ > > <question-title></question-title>
+ > > 1. How does the model perform?
+ > > 2. Is it better than the previous model?
+ > {: .question}
+ {: .hands-on}
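For reference, a generic training loop of the kind section 1.3.3 builds up looks like the sketch below; the model, data, and hyperparameters here are illustrative stand-ins, not the tutorial's code.

```python
import torch
from torch import nn

torch.manual_seed(42)

# Synthetic stand-in data: 64 samples, 30 features, binary targets
X = torch.rand(64, 30)
y = (X.sum(dim=1) > 15).float().unsqueeze(1)

model = nn.Sequential(nn.Linear(30, 1), nn.Sigmoid())
loss_fn = nn.BCELoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

losses = []
for epoch in range(50):
    optimizer.zero_grad()        # reset gradients from the previous step
    loss = loss_fn(model(X), y)  # forward pass + loss
    loss.backward()              # backpropagate
    optimizer.step()             # update the weights
    losses.append(loss.item())

print(losses[-1] < losses[0])  # True: the loss decreases on this toy data
```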


All configuration of the training process is done in the [`Trainer`](https://pytorch-lightning.readthedocs.io/en/stable/common/trainer.html#trainer) class. This class will take care of the training and validation loops, as well as logging and checkpointing. We will create an instance of this class and pass it our model and the training and validation data loaders.

*Assignment: Browse through the documentation for the trainer class and try to understand the different arguments of the class. Pay special attention to the `accelerator` and `devices` arguments, which are some of the most useful features of PyTorch Lightning.*

Suggested change
- *Assignment: Browse through the documentation for the trainer class and try to understand the different arguments of the class. Pay special attention to the `accelerator` and `devices` arguments, which are some of the most useful features of PyTorch Lightning.*
+ > <hands-on-title></hands-on-title>
+ > Browse through the documentation for the trainer class and try to understand the different arguments of the class. Pay special attention to the `accelerator` and `devices` arguments, which are some of the most useful features of PyTorch Lightning.
+ {: .hands-on}

language: python
pyolite: true
---


Suggested change
> <agenda-title></agenda-title>
>
> In this tutorial, we will cover:
>
> 1. TOC
> {:toc}
>
{: .agenda}

2 participants