Commit 36a603a

Typographical adjustments to paper by JOSS editor @danielskatz (#308)
1 parent 67973f5 commit 36a603a

1 file changed: +7 −7 lines changed

paper/paper.md

@@ -60,7 +60,7 @@ coupling the two, allowing users to call PyTorch models from Fortran.
 `FTorch` is open-source, open-development, and well-documented with minimal dependencies.
 A central tenet of its design, in contrast to other approaches, is
 that FTorch removes dependence on the Python runtime (and virtual environments).
-By building on the `LibTorch` backend (written in C++ and accessible via an API) it
+By building on the `LibTorch` backend (written in C++ and accessible via an API), it
 allows users to run ML models on both
 CPU and GPU architectures without needing to port code to device-specific languages.

@@ -77,8 +77,8 @@ and the development of data-driven components.
 Such deployments of ML can achieve improved computational and/or predictive performance,
 compared to traditional numerical techniques.
 A common example from the geosciences is ML parameterisation
-of subgrid processes — a major source of uncertainty in many models
-[e.g. @bony2015clouds; @rasp2018deep].
+of subgrid processes—a major source of uncertainty in many models
+(e.g., @bony2015clouds, @rasp2018deep).

 Fortran is widely used for scientific codes due to its performance,
 stability, array-oriented design, and native support for shared and distributed memory,
@@ -91,7 +91,7 @@ Ideally, users would develop and validate ML models in the PyTorch environment
 before deploying them into a scientific model.
 This deployment should require minimal additional code, and guarantee
 identical results as obtained with the PyTorch
-interface — something not guaranteed if re-implementing by hand in Fortran.
+interface—something not guaranteed if re-implementing by hand in Fortran.
 Ideally one would call out, from Fortran, to an ML model
 saved from PyTorch, with the results returned directly to the scientific code.

@@ -112,7 +112,7 @@ Python environments can be challenging.
 # Software description

 `FTorch` is a Fortran wrapper to the `LibTorch` C++ framework using the `iso_c_binding`
-module, intrinsic to Fortran since the 2003 standard
+module, intrinsic to Fortran since the 2003 standard.
 This enables shared memory use (where possible) to
 maximise efficiency by reducing data-transfer during coupling^[i.e. the same
 data in memory is used by both `LibTorch` and Fortran without creating a copy.]
@@ -169,7 +169,7 @@ projects is available at
 runtime from Fortran.

 * **TorchFort** [@torchfort]\
-Since we began `FTorch` NVIDIA has released `TorchFort`.
+Since we began `FTorch`, NVIDIA has released `TorchFort`.
 This has a similar approach to `FTorch`, avoiding Python to link against
 the `LibTorch` backend. It has a focus on enabling GPU deployment on NVIDIA hardware.

@@ -183,7 +183,7 @@ projects is available at

 * **SmartSim** [@partee2022using]\
 SmartSim is a workflow library developed by HPE and built upon Redis API.
-It provides a framework for launching ML and HPC workloads transferring data
+It provides a framework for launching ML and HPC workloads, transferring data
 between the two via a database.
 This is a versatile approach that can work with a variety of languages and ML
 frameworks. However, it has a significant learning curve, incurs data-transfer
