
FIX: Deprecation and Future Warnings #444


Merged: 26 commits, May 19, 2025
Changes from 25 commits

Commits:
- 0645aa8: updates to perm_income_cons (mmcky, Mar 12, 2025)
- 3bca43b: check JAX deprecation fix (mmcky, Mar 12, 2025)
- 92a885d: Merge branch 'main' into fix-deprecation+future (mmcky, Mar 31, 2025)
- c2e23e6: Merge branch 'main' into fix-deprecation+future (mmcky, Mar 31, 2025)
- 33ff39f: fix deprecation in jax (mmcky, Mar 31, 2025)
- fef0379: Merge branch 'fix-deprecation+future' of https://github.com/QuantEcon… (mmcky, Mar 31, 2025)
- a529e01: Merge branch 'main' into fix-deprecation+future (mmcky, May 14, 2025)
- aaab768: fix linear_models (mmcky, May 14, 2025)
- b40262a: Merge branch 'fix-deprecation+future' of https://github.com/QuantEcon… (mmcky, May 14, 2025)
- 7d5c7d1: fix kalman.md (mmcky, May 14, 2025)
- 81a44ff: remove testing variable (mmcky, May 14, 2025)
- 14b5359: kalman - remove debug (mmcky, May 15, 2025)
- 1775aa0: fix deprecations warnings in kalman_2 (mmcky, May 15, 2025)
- ffbe583: fix markov_perf deprecations (mmcky, May 15, 2025)
- 6158254: review kesten_processes and looking OK (mmcky, May 15, 2025)
- 9418da8: fix pandas_panel deprecations (mmcky, May 15, 2025)
- 2bb6e08: fix string formatting warning (mmcky, May 15, 2025)
- 383dfef: fix missing solution-end makers (mmcky, May 15, 2025)
- 3a1aba2: fix missing solution end (mmcky, May 15, 2025)
- b62d770: ensure latex is installed for rendering of plot (collab) (mmcky, May 19, 2025)
- 6b1d639: Revert "ensure latex is installed for rendering of plot (collab)" (mmcky, May 19, 2025)
- 5fb9403: move texlive install for collab to action (mmcky, May 19, 2025)
- d73fa6c: check pickled environment failure (mmcky, May 19, 2025)
- 6343390: [linear_models] final deprecation notice (mmcky, May 19, 2025)
- 5f66dd9: remove debug (mmcky, May 19, 2025)
- 1d84fc5: add gpu backend code to the top of the lecture (mmcky, May 19, 2025)
11 changes: 6 additions & 5 deletions .github/workflows/collab.yml
@@ -10,6 +10,12 @@ jobs:
- uses: actions/checkout@v4
with:
ref: ${{ github.event.pull_request.head.sha }}
+      # Install build software
+      - name: Install Build Software & LaTeX (kalman_2)
+        shell: bash -l {0}
+        run: |
+          pip install jupyter-book==1.0.3 quantecon-book-theme==0.8.2 sphinx-tojupyter==0.3.0 sphinxext-rediraffe==0.2.7 sphinxcontrib-youtube==1.3.0 sphinx-togglebutton==0.3.2 arviz sphinx-proof sphinx-exercise sphinx-reredirects
+          apt-get install dvipng texlive texlive-latex-extra texlive-fonts-recommended cm-super
- name: Check nvidia drivers
shell: bash -l {0}
run: |
@@ -28,11 +34,6 @@ jobs:
branch: main
name: build-cache
path: _build
-      # Install build software
-      - name: Install Build Software
-        shell: bash -l {0}
-        run: |
-          pip install jupyter-book==1.0.3 quantecon-book-theme==0.8.2 sphinx-tojupyter==0.3.0 sphinxext-rediraffe==0.2.7 sphinxcontrib-youtube==1.3.0 sphinx-togglebutton==0.3.2 arviz sphinx-proof sphinx-exercise sphinx-reredirects
# Build of HTML (Execution Testing)
- name: Build HTML
shell: bash -l {0}
8 changes: 3 additions & 5 deletions lectures/back_prop.md
@@ -4,9 +4,9 @@ jupytext:
extension: .md
format_name: myst
format_version: 0.13
-jupytext_version: 1.11.5
+jupytext_version: 1.16.7
kernelspec:
-display_name: Python 3
+display_name: Python 3 (ipykernel)
language: python
name: python3
---
@@ -606,9 +606,7 @@ Image(fig.to_image(format="png"))

```{code-cell} ipython3
## to check that gpu is activated in environment

-from jax.lib import xla_bridge
-print(xla_bridge.get_backend().platform)
+print(f"JAX backend: {jax.devices()[0].platform}")
```
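The change above swaps the removed private accessor `jax.lib.xla_bridge` for the public device API. A minimal sketch of the replacement idiom (assuming a working JAX install; the backend name reported depends on the machine):

```python
import jax

# jax.lib.xla_bridge was removed from JAX's public surface; the active
# backend is now read off the device list instead.
backend = jax.devices()[0].platform  # "cpu", "gpu", or "tpu"
print(f"JAX backend: {backend}")
```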

Comment from @HumphreyYang (Collaborator):

Hi @mmcky,

This note is at the end of the lecture. Should we move it with the note below to the top?

Reply from @mmcky (Contributor, Author):

thanks @HumphreyYang I will add that change and then merge.

Appreciate you taking a look.

```{note}
10 changes: 4 additions & 6 deletions lectures/cass_fiscal.md
@@ -4,7 +4,7 @@ jupytext:
extension: .md
format_name: myst
format_version: 0.13
-jupytext_version: 1.16.6
+jupytext_version: 1.16.7
kernelspec:
display_name: Python 3 (ipykernel)
language: python
@@ -750,7 +750,7 @@ def plot_results(solution, k_ss, c_ss, shocks, shock_param,
R_bar_path = compute_R_bar_path(shocks, k_path, model, S)

axes[2].plot(R_bar_path[:T], linestyle=linestyle, label=label)
-axes[2].set_title('$\overline{R}$')
+axes[2].set_title(r'$\overline{R}$')
axes[2].axhline(1 / model.β, linestyle='--', color='black')

η_path = compute_η_path(k_path, model, S=T)
@@ -1041,7 +1041,7 @@
Indeed, {eq}`eq:euler_house` or {eq}`eq:diff_second` indicates that a foreseen increase in $\tau_{ct}$ (i.e., a decrease in $(1+\tau_{ct})/(1+\tau_{ct+1})$) operates like an increase in $\tau_{kt}$.

The following figure portrays the response to a foreseen increase in the consumption tax $\tau_c$.

```{code-cell} ipython3
shocks = {
@@ -1101,7 +1101,6 @@ The figure shows that:
- Transition dynamics push $k_t$ (capital stock) toward a new, lower steady-state level. In the new steady state:
- Consumption is lower due to reduced output from the lower capital stock.
- Smoother consumption paths occur when $\gamma = 2$ than when $\gamma = 0.2$.


+++

@@ -1111,8 +1110,6 @@

**Experiment 4: Foreseen one-time increase in $g$ from 0.2 to 0.4 in period 10, after which $g$ returns to 0.2 forever**



```{code-cell} ipython3
g_path = np.repeat(0.2, S + 1)
g_path[10] = 0.4
@@ -1136,6 +1133,7 @@ The figure indicates how:
- Before $t = 10$, capital accumulates as interest rate changes induce households to prepare for the anticipated increase in government spending.
- At $t = 10$, the capital stock sharply decreases as the government consumes part of it.
- $\bar{R}$ jumps above its steady-state value due to the capital reduction and then gradually declines toward its steady-state level.

+++

### Method 2: Residual Minimization
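One fix in this file replaces `'$\overline{R}$'` with the raw string `r'$\overline{R}$'`. A small self-contained check of why (a sketch, assuming CPython 3.6 or later, where an invalid escape sequence such as `\o` warns at compile time):

```python
import warnings

def compiles_with_escape_warning(src):
    """Compile src and report whether an invalid-escape warning fired."""
    with warnings.catch_warnings(record=True) as caught:
        warnings.simplefilter("always")
        compile(src, "<label>", "exec")
    # '\o' raises DeprecationWarning (3.6-3.11) or SyntaxWarning (3.12+)
    return any(issubclass(w.category, (DeprecationWarning, SyntaxWarning))
               for w in caught)

# The plain string contains the invalid escape '\o'; the raw string, as
# adopted in this PR, passes the backslash through to mathtext untouched.
plain_warns = compiles_with_escape_warning("label = '$\\overline{R}$'")
raw_warns = compiles_with_escape_warning("label = r'$\\overline{R}$'")
print(plain_warns, raw_warns)
```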
45 changes: 22 additions & 23 deletions lectures/kalman.md
@@ -3,8 +3,10 @@ jupytext:
text_representation:
extension: .md
format_name: myst
+format_version: 0.13
+jupytext_version: 1.16.7
kernelspec:
-display_name: Python 3
+display_name: Python 3 (ipykernel)
language: python
name: python3
---
@@ -29,10 +31,9 @@ kernelspec:

In addition to what's in Anaconda, this lecture will need the following libraries:

-```{code-cell} ipython
----
-tags: [hide-output]
----
+```{code-cell} ipython3
+:tags: [hide-output]
+
!pip install quantecon
```

@@ -54,9 +55,8 @@ Required knowledge: Familiarity with matrix manipulations, multivariate normal d

We'll need the following imports:

-```{code-cell} ipython
+```{code-cell} ipython3
import matplotlib.pyplot as plt
-plt.rcParams["figure.figsize"] = (11, 5) #set default figure size
from scipy import linalg
import numpy as np
import matplotlib.cm as cm
@@ -122,10 +122,9 @@ $2 \times 2$ covariance matrix. In our simulations, we will suppose that

This density $p(x)$ is shown below as a contour map, with the center of the red ellipse being equal to $\hat x$.

-```{code-cell} python3
----
-tags: [output_scroll]
----
+```{code-cell} ipython3
+:tags: [output_scroll]
+
# Set up the Gaussian prior density p
Σ = [[0.4, 0.3], [0.3, 0.45]]
Σ = np.matrix(Σ)
@@ -186,7 +185,7 @@ def bivariate_normal(x, y, σ_x=1.0, σ_y=1.0, μ_x=0.0, μ_y=0.0, σ_xy=0.0):

def gen_gaussian_plot_vals(μ, C):
"Z values for plotting the bivariate Gaussian N(μ, C)"
-m_x, m_y = float(μ[0]), float(μ[1])
+m_x, m_y = float(μ[0,0]), float(μ[1,0])
s_x, s_y = np.sqrt(C[0, 0]), np.sqrt(C[1, 1])
s_xy = C[0, 1]
return bivariate_normal(X, Y, s_x, s_y, m_x, m_y, s_xy)
@@ -213,15 +212,15 @@ The good news is that the missile has been located by our sensors, which report
The next figure shows the original prior $p(x)$ and the new reported
location $y$

-```{code-cell} python3
+```{code-cell} ipython3
fig, ax = plt.subplots(figsize=(10, 8))
ax.grid()

Z = gen_gaussian_plot_vals(x_hat, Σ)
ax.contourf(X, Y, Z, 6, alpha=0.6, cmap=cm.jet)
cs = ax.contour(X, Y, Z, 6, colors="black")
ax.clabel(cs, inline=1, fontsize=10)
-ax.text(float(y[0]), float(y[1]), "$y$", fontsize=20, color="black")
+ax.text(float(y[0].item()), float(y[1].item()), "$y$", fontsize=20, color="black")

plt.show()
```
@@ -284,7 +283,7 @@ This new density $p(x \,|\, y) = N(\hat x^F, \Sigma^F)$ is shown in the next fig

The original density is left in as contour lines for comparison

-```{code-cell} python3
+```{code-cell} ipython3
fig, ax = plt.subplots(figsize=(10, 8))
ax.grid()

@@ -298,7 +297,7 @@ new_Z = gen_gaussian_plot_vals(x_hat_F, Σ_F)
cs2 = ax.contour(X, Y, new_Z, 6, colors="black")
ax.clabel(cs2, inline=1, fontsize=10)
ax.contourf(X, Y, new_Z, 6, alpha=0.6, cmap=cm.jet)
-ax.text(float(y[0]), float(y[1]), "$y$", fontsize=20, color="black")
+ax.text(float(y[0].item()), float(y[1].item()), "$y$", fontsize=20, color="black")

plt.show()
```
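The repeated `float(y[0])` to `float(y[0].item())` edits in this file all work around the same NumPy deprecation. A minimal sketch of the behavior (the array `y` below is a hypothetical stand-in with the lecture's 2x1 shape):

```python
import numpy as np

# Since NumPy 1.25, converting an array with ndim > 0 to a Python scalar
# with float() is deprecated (and an error in NumPy 2.x for size > 1),
# so the lecture code now indexes down to a scalar or calls .item() first.
y = np.array([[1.5], [2.5]])   # 2x1 column vector

m_x = float(y[0, 0])   # index to a true scalar before converting
m_y = y[1].item()      # .item() extracts the lone element of a size-1 array

print(m_x, m_y)        # 1.5 2.5
```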
@@ -391,7 +390,7 @@ A
Q = 0.3 * \Sigma
$$

-```{code-cell} python3
+```{code-cell} ipython3
fig, ax = plt.subplots(figsize=(10, 8))
ax.grid()

@@ -415,7 +414,7 @@ new_Z = gen_gaussian_plot_vals(new_x_hat, new_Σ)
cs3 = ax.contour(X, Y, new_Z, 6, colors="black")
ax.clabel(cs3, inline=1, fontsize=10)
ax.contourf(X, Y, new_Z, 6, alpha=0.6, cmap=cm.jet)
-ax.text(float(y[0]), float(y[1]), "$y$", fontsize=20, color="black")
+ax.text(float(y[0].item()), float(y[1].item()), "$y$", fontsize=20, color="black")

plt.show()
```
@@ -577,7 +576,7 @@ Your figure should -- modulo randomness -- look something like this
:class: dropdown
```

-```{code-cell} python3
+```{code-cell} ipython3
# Parameters
θ = 10 # Constant value of state x_t
A, C, G, H = 1, 0, 1, 1
@@ -598,7 +597,7 @@ xgrid = np.linspace(θ - 5, θ + 2, 200)

for i in range(N):
# Record the current predicted mean and variance
-m, v = [float(z) for z in (kalman.x_hat, kalman.Sigma)]
+m, v = [float(z) for z in (kalman.x_hat.item(), kalman.Sigma.item())]
# Plot, update filter
ax.plot(xgrid, norm.pdf(xgrid, loc=m, scale=np.sqrt(v)), label=f'$t={i}$')
kalman.update(y[i])
@@ -641,7 +640,7 @@ Your figure should show error erratically declining something like this
:class: dropdown
```

-```{code-cell} python3
+```{code-cell} ipython3
ϵ = 0.1
θ = 10 # Constant value of state x_t
A, C, G, H = 1, 0, 1, 1
@@ -657,7 +656,7 @@ y = y.flatten()

for t in range(T):
# Record the current predicted mean and variance and plot their densities
-m, v = [float(temp) for temp in (kalman.x_hat, kalman.Sigma)]
+m, v = [float(temp) for temp in (kalman.x_hat.item(), kalman.Sigma.item())]

f = lambda x: norm.pdf(x, loc=m, scale=np.sqrt(v))
integral, error = quad(f, θ - ϵ, θ + ϵ)
@@ -745,7 +744,7 @@ Observe how, after an initial learning period, the Kalman filter performs quite
:class: dropdown
```

-```{code-cell} python3
+```{code-cell} ipython3
# Define A, C, G, H
G = np.identity(2)
H = np.sqrt(0.5) * np.identity(2)
24 changes: 8 additions & 16 deletions lectures/kalman_2.md
@@ -4,7 +4,7 @@ jupytext:
extension: .md
format_name: myst
format_version: 0.13
-jupytext_version: 1.14.4
+jupytext_version: 1.16.7
kernelspec:
display_name: Python 3 (ipykernel)
language: python
@@ -237,7 +237,7 @@ for t in range(1, T):
x_hat, Σ = kalman.x_hat, kalman.Sigma
Σ_t[:, :, t-1] = Σ
x_hat_t[:, t-1] = x_hat.reshape(-1)
-y_hat_t[t-1] = worker.G @ x_hat
+[y_hat_t[t-1]] = worker.G @ x_hat

x_hat_t = np.concatenate((x[:, 1][:, np.newaxis],
x_hat_t), axis=1)
@@ -253,7 +253,6 @@ We also plot $E [u_0 | y^{t-1}]$, which is the firm inference about a worker's
We can watch as the firm's inference $E [u_0 | y^{t-1}]$ of the worker's work ethic converges toward the hidden $u_0$, which is not directly observed by the firm.

```{code-cell} ipython3

fig, ax = plt.subplots(1, 2)

ax[0].plot(y_hat_t, label=r'$E[y_t| y^{t-1}]$')
Expand All @@ -273,6 +272,7 @@ ax[1].legend()
fig.tight_layout()
plt.show()
```

## Some Computational Experiments

Let's look at $\Sigma_0$ and $\Sigma_T$ in order to see how much the firm learns about the hidden state during the horizon we have set.
@@ -290,7 +290,6 @@ Evidently, entries in the conditional covariance matrix become smaller over tim
It is enlightening to portray how conditional covariance matrices $\Sigma_t$ evolve by plotting confidence ellipsoids around $E[x_t | y^{t-1}]$ at various $t$'s.

```{code-cell} ipython3

# Create a grid of points for contour plotting
h_range = np.linspace(x_hat_t[0, :].min()-0.5*Σ_t[0, 0, 1],
x_hat_t[0, :].max()+0.5*Σ_t[0, 0, 1], 100)
@@ -338,7 +337,6 @@ For example, let's say $h_0 = 0$ and $u_0 = 4$.
Here is one way to do this.

```{code-cell} ipython3

# For example, we might want h_0 = 0 and u_0 = 4
mu_0 = np.array([0.0, 4.0])

@@ -361,7 +359,6 @@ print('u_0 =', u_0)
Another way to accomplish the same goal is to use the following code.

```{code-cell} ipython3

# If we want to set the initial
# h_0 = hhat_0 = 0 and u_0 = uhhat_0 = 4.0:
worker = create_worker(hhat_0=0.0, uhat_0=4.0)
@@ -398,8 +395,8 @@ for t in range(1, T):
kalman.update(y[t])
x_hat, Σ = kalman.x_hat, kalman.Sigma
Σ_t.append(Σ)
-y_hat_t[t-1] = worker.G @ x_hat
-u_hat_t[t-1] = x_hat[1]
+[y_hat_t[t-1]] = worker.G @ x_hat
+[u_hat_t[t-1]] = x_hat[1]


# Generate plots for y_hat_t and u_hat_t
@@ -426,10 +423,9 @@ plt.show()
More generally, we can change some or all of the parameters defining a worker in our `create_worker`
namedtuple.

Here is an example.

```{code-cell} ipython3

# We can set these parameters when creating a worker -- just like classes!
hard_working_worker = create_worker(α=.4, β=.8,
hhat_0=7.0, uhat_0=100, σ_h=2.5, σ_u=3.2)
@@ -475,8 +471,8 @@ def simulate_workers(worker, T, ax, mu_0=None, Sigma_0=None,
kalman.update(y[i])
x_hat, Σ = kalman.x_hat, kalman.Sigma
Σ_t.append(Σ)
-y_hat_t[i] = worker.G @ x_hat
-u_hat_t[i] = x_hat[1]
+[y_hat_t[i]] = worker.G @ x_hat
+[u_hat_t[i]] = x_hat[1]

if diff == True:
title = ('Difference between inferred and true work ethic over time'
@@ -503,7 +499,6 @@ def simulate_workers(worker, T, ax, mu_0=None, Sigma_0=None,
```

```{code-cell} ipython3

num_workers = 3
T = 50
fig, ax = plt.subplots(figsize=(7, 7))
@@ -516,7 +511,6 @@ plt.show()
```

```{code-cell} ipython3

# We can also generate plots of u_t:

T = 50
@@ -539,7 +533,6 @@ plt.show()
```

```{code-cell} ipython3

# We can also use exact u_0=1 and h_0=2 for all workers

T = 50
@@ -568,7 +561,6 @@ plt.show()
```
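The `[y_hat_t[t-1]] = worker.G @ x_hat` edits in this file use list-target unpacking to extract the single element of a matrix product explicitly. A hedged sketch of the pattern (`G` and `x_hat` below are hypothetical stand-ins for `worker.G` and the Kalman state estimate, with assumed shapes):

```python
import numpy as np

G = np.array([[1.0, 0.5]])        # 1x2 observation matrix (assumed)
x_hat = np.array([[2.0], [4.0]])  # 2x1 state estimate (assumed)

y_hat_t = np.empty(3)

# List-target unpacking requires the right-hand side to yield exactly one
# item, so the single element is extracted explicitly instead of relying
# on implicit size-1 array-to-scalar conversion.
[y_hat_t[0]] = (G @ x_hat).ravel()

# An equivalent explicit extraction:
y_alt = (G @ x_hat).item()
print(y_hat_t[0], y_alt)   # 4.0 4.0
```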

```{code-cell} ipython3

# We can generate a plot for only one of the workers:

T = 50