Annual fAPAR_max and LAI_max #403

Draft
wants to merge 14 commits into develop

Conversation

MarionBWeinzierl (Collaborator) commented on Jan 28, 2025

Description

Implements the computation of fAPAR_max and LAI_max, and provides example usages of the API (including test data).

Fixes #348.

Type of change

  • New feature (non-breaking change which adds functionality)
  • Optimization (back-end change that speeds up the code)
  • Bug fix (non-breaking change which fixes an issue)

Key checklist

  • Make sure you've run the pre-commit checks: $ pre-commit run -a
  • All tests pass: $ poetry run pytest

Further checks

  • Code is commented, particularly in hard-to-understand areas
  • Tests added that prove fix is effective or that feature works

MarionBWeinzierl added the enhancement (New feature or request) label on Jan 28, 2025
MarionBWeinzierl self-assigned this on Jan 28, 2025
MarionBWeinzierl marked this pull request as draft on Jan 28, 2025, 16:30
codecov-commenter commented on Jan 29, 2025

Codecov Report

Attention: Patch coverage is 21.42857% with 44 lines in your changes missing coverage. Please review.

Project coverage is 94.71%. Comparing base (2d06b2b) to head (9139375).
Report is 60 commits behind head on develop.

Files with missing lines                 Patch %   Lines
pyrealm/phenology/fapar_limitation.py    21.42%    44 Missing ⚠️
Additional details and impacted files
@@             Coverage Diff             @@
##           develop     #403      +/-   ##
===========================================
- Coverage    96.26%   94.71%   -1.55%     
===========================================
  Files           34       35       +1     
  Lines         2652     2708      +56     
===========================================
+ Hits          2553     2565      +12     
- Misses          99      143      +44     


MarionBWeinzierl (Collaborator, Author) commented:

@davidorme, could you have a look and sense-check what's there? As you suggested in #407 (I hope I understood that right), I used Scaler directly rather than extracting any functionality.

davidorme (Collaborator) left a comment:

Some optimisations and a couple of issues. I think we're on the right track here. I did wonder about making the class __init__ accept daily/subdaily observations and moving the resampling in from_Pmodel into there, but on balance I think it is right to have the straightforward usage as __init__ and then use factory methods to handle other core inputs.
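
For orientation, a minimal sketch of that split. The class name is guessed from the module path and the method name echoes the from_Pmodel factory mentioned above; the signatures are illustrative, not the PR's actual API:

class FaparLimitation:
    def __init__(self, fapar_waterlim, fapar_energylim, annual_total_precip):
        # Straightforward usage: already-annual arrays go straight in.
        ...

    @classmethod
    def from_pmodel(cls, pmodel, datetimes, growing_season, precip):
        # Factory: resample the daily/subdaily observations to annual
        # values, then delegate to the plain __init__ above.
        ...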

"""

# Extract years from datetimes
all_years = [np.datetime64(i, "Y") for i in datetimes]

Suggested change
all_years = [np.datetime64(i, "Y") for i in datetimes]
all_years = datetimes.astype("datetime64[Y]")
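
As a quick illustration of the vectorized form (the datetimes here are made up):

import numpy as np

datetimes = np.arange(
    np.datetime64("2000-01-01"), np.datetime64("2002-01-01"), np.timedelta64(1, "D")
)
all_years = datetimes.astype("datetime64[Y]")  # truncate each timestamp to its year
print(np.unique(all_years))  # ['2000' '2001']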

"""Class to compute the fAPAR_max and annual peak Leaf Area Index (LAI)."""

import numpy as np
from numpy.ma.core import zeros

Why not just use np.zeros? I think ma is the masked array module and that has significant performance issues.

Comment on lines +113 to +115
z: float = 12.227,
k: float = 0.5,
) -> None:

We can bundle these constants (and the hardcoded ones below) into a pyrealm.constants class - probably a new one for PhenologyConsts?
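
A minimal sketch of what that could look like, assuming the frozen-dataclass pattern used by the existing pyrealm.constants classes; the field meanings follow the docstring fragments quoted elsewhere in this review:

from dataclasses import dataclass

@dataclass(frozen=True)
class PhenologyConst:
    """Constants for the phenology module."""

    z: float = 12.227
    """Annual cost of building and maintaining leaves [mol m-2 year-1]."""
    k: float = 0.5
    """Light extinction coefficient."""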

Comment on lines +153 to +163
self.faparmax = -9999 * np.ones(np.shape(fapar_waterlim))
self.energylim = -9999 * np.ones(np.shape(fapar_waterlim))
self.annual_precip_molar = annual_total_precip

for i in range(len(fapar_waterlim)):
if fapar_waterlim[i] < fapar_energylim[i]:
self.faparmax[i] = fapar_waterlim[i]
self.energylim[i] = False
else:
self.faparmax[i] = fapar_energylim[i]
self.energylim[i] = True

Suggested change
self.faparmax = -9999 * np.ones(np.shape(fapar_waterlim))
self.energylim = -9999 * np.ones(np.shape(fapar_waterlim))
self.annual_precip_molar = annual_total_precip
for i in range(len(fapar_waterlim)):
if fapar_waterlim[i] < fapar_energylim[i]:
self.faparmax[i] = fapar_waterlim[i]
self.energylim[i] = False
else:
self.faparmax[i] = fapar_energylim[i]
self.energylim[i] = True
self.fapar_max = np.minimum(fapar_waterlim, fapar_energylim)
"""Maximum fapar given water or energy limitation."""
self.energy_limited = fapar_energylim < fapar_waterlim
"""Is fapar_max limited by water or energy."""
self.annual_precip_molar = annual_total_precip
"""The annual precipitation in moles."""

E.g.

In [8]: fapar_waterlim = np.arange(10)
In [10]: fapar_waterlim
Out[10]: array([0, 1, 2, 3, 4, 5, 6, 7, 8, 9])
In [12]: fapar_energylim = np.arange(9, -1, -1)
In [13]: fapar_energylim
Out[13]: array([9, 8, 7, 6, 5, 4, 3, 2, 1, 0])

In [15]: np.minimum(fapar_waterlim, fapar_energylim)
Out[15]: array([0, 1, 2, 3, 4, 4, 3, 2, 1, 0])
In [16]: fapar_waterlim < fapar_energylim
Out[16]: array([ True,  True,  True,  True,  True, False, False, False, False,
       False])

Comment on lines +188 to +190

annual_total_potential_gpp = get_annual(
pmodel.gpp, datetimes, growing_season, "total"

I think this needs to check that we are dealing with whole years in the input?
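
One possible guard, sketched under the assumption that "whole years" means running from 1 January to 31 December (the exact failure behaviour would need deciding):

import numpy as np

def assert_whole_years(datetimes: np.ndarray) -> None:
    """Raise unless the datetimes span complete calendar years."""
    years = datetimes.astype("datetime64[Y]")
    first_day = years[0].astype("datetime64[D]")  # 1 January of the first year
    last_day = (years[-1] + 1).astype("datetime64[D]") - np.timedelta64(1, "D")
    if (
        datetimes[0].astype("datetime64[D]") != first_day
        or datetimes[-1].astype("datetime64[D]") != last_day
    ):
        raise ValueError("Inputs must cover complete calendar years")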

Comment on lines +53 to +64
annual_x = zeros(len(years))

if method == "total":
for i in range(len(years)):
annual_x[i] = sum(
daily_x[growing_season & (years_by_day == years[i])].astype(np.int64)
)
elif method == "mean":
for i in range(len(years)):
annual_x[i] = np.mean(
daily_x[growing_season & (years_by_day == years[i])].astype(np.int64)
)

I think we can avoid this loop by reshaping the inputs?

Oh no, we can't. Leap years. Hang on - does SubdailyScaler handle leap years? It must do? Need to check on this.


I was mis-remembering this - SubdailyScaler reshapes evenly spaced (e.g. hourly) data into days by hours, so it is unaffected by leap years.
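
The per-year loop itself can still be vectorized without reshaping (and hence without leap-year trouble) by grouping on a year index; a sketch using the variable names from the diff above:

import numpy as np

# years_by_day labels each day with its year; growing_season is a boolean mask.
years, year_idx = np.unique(years_by_day, return_inverse=True)

masked = np.where(growing_season, daily_x, 0.0)
annual_total = np.bincount(year_idx, weights=masked, minlength=len(years))
n_growing = np.bincount(year_idx, weights=growing_season.astype(float))
annual_mean = annual_total / n_growing  # nan for years with no growing-season days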

/ (1.6 * annual_mean_vpd * annual_total_potential_gpp)
)

self.faparmax = -9999 * np.ones(np.shape(fapar_waterlim))

Generally, I'd use np.nan in cases like this. Here though, I think we can make this more efficient - see suggestion below.

leaves. [mol m^{-2} year^{-1}]
k: Light extinction coefficient.
"""


We should _check_shapes on the inputs. All should be equal (although aridity_index can be scalar; that use case is permitted by _check_shapes).
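
Assuming the existing check_input_shapes helper in pyrealm.core.utilities is what _check_shapes refers to, the guard could be a single call (argument names taken from this review):

from pyrealm.core.utilities import check_input_shapes

# Raises if the array arguments are not congruent; scalars pass through,
# which covers the scalar aridity_index use case.
check_input_shapes(fapar_waterlim, fapar_energylim, annual_total_precip, aridity_index)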

x: Array of values to be converted to annual values. Should be either daily (
same datetimes as growing_season) or subdaily (same datetimes as datetimes
array)
datetimes: Datetimes of the measurements as np.datetime62 arrays.

Suggested change
datetimes: Datetimes of the measurements as np.datetime62 arrays.
datetimes: Datetimes of the measurements as np.datetime64 arrays.

Comment on lines +34 to +35
scaler = SubdailyScaler(datetimes)
scaler.set_nearest(np.timedelta64(12, "h"))

We can't use the SubdailyScaler on data that isn't sampled at subdaily frequency. The code explicitly checks the spacing of observations within a day and breaks with coarser data (daily, monthly). I think we can still use the same approach if we nest this within the if statements below and solve the daily-and-coarser case in another way.


We should add a failure mode to SubdailyScaler for that case!
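
Until that failure mode exists, the dispatch could look something like this sketch (treating get_daily_means as the right extraction method here is an assumption):

import numpy as np

spacing = np.diff(datetimes).min()

if spacing < np.timedelta64(1, "D"):
    # Genuinely subdaily data: resample to daily values via the scaler.
    scaler = SubdailyScaler(datetimes)
    scaler.set_nearest(np.timedelta64(12, "h"))
    daily_x = scaler.get_daily_means(x)
else:
    # Daily or coarser data: no within-day resampling is needed.
    daily_x = x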

Labels: enhancement (New feature or request)
Projects: Status: Review

Successfully merging this pull request may close these issues:

Calculation of annual fAPAR and LAI values from a fitted PModel and precipitation data (#348)