Commit a79a74a

Update Optimize performance.md
1 parent f267b66 commit a79a74a

1 file changed: +25 −3 lines changed


Optimize performance.md

@@ -1,5 +1,5 @@

-***Cache data objects***
+## Cache data objects

**Define a cached function:**

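For context, a minimal sketch of the data-caching pattern this heading covers, assuming a hypothetical `load_data` function and an illustrative CSV URL (the actual snippet's code is not shown in this diff):

```python
import pandas as pd
import streamlit as st

# Illustrative function name and URL; the real snippet's body is not visible here.
@st.cache_data
def load_data(url):
    # Expensive data retrieval; runs only on a cache miss for this `url`.
    return pd.read_csv(url)

df = load_data("https://example.com/data.csv")   # first call computes and caches
df = load_data("https://example.com/data.csv")   # same argument: returns the cached result
```
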
@@ -30,7 +30,7 @@ clears values from all cached functions, either in-memory or on-disk.

These snippets demonstrate how Streamlit allows you to cache expensive computations or data retrievals to improve performance.

-***Cache global resources***
+## Cache global resources

E.g. TensorFlow session, database connection, etc.

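As a rough sketch of the resource-caching pattern above (the connection helper and database path are assumptions, not taken from the original snippet):

```python
import sqlite3
import streamlit as st

# Hypothetical resource factory; the original snippet only refers to a function foo().
@st.cache_resource
def get_connection(path="app.db"):
    # Create and return a non-data object shared across reruns and sessions.
    return sqlite3.connect(path, check_same_thread=False)

conn = get_connection()      # created once, reused afterwards
get_connection.clear()       # clear this function's cached resource
st.cache_resource.clear()    # clear all global resources from cache
```
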
@@ -53,4 +53,26 @@ Create and return a non-data object
foo.clear()
#Clear all global resources from cache

-st.cache_resource.clear()
+st.cache_resource.clear()
+
+## Deprecated caching
+
+These Streamlit snippets showcase caching with the `@st.cache` decorator:
+
+**Define a cached function:**
+
+The `@st.cache` decorator marks the function `foo()` for caching.
+
+Inside `foo()`, perform some expensive computation and return data.
+
+**Execute the cached function:**
+
+`d1 = foo(ref1)` executes `foo()` with the argument `ref1` and caches the result in `d1`.
+
+`d2 = foo(ref1)` retrieves the cached result for the same argument, so `foo()` is not executed again, and `d1 == d2`.
+
+`d3 = foo(ref2)` calls `foo()` with a different argument `ref2`, so the function executes again.
+
+These snippets demonstrate how Streamlit allows you to cache function results to avoid redundant computation and improve performance.

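A minimal sketch of the deprecated `@st.cache` walkthrough added above; `foo()`'s body and the values of `ref1`/`ref2` are illustrative, not from the original snippet:

```python
import streamlit as st

@st.cache  # deprecated in favor of st.cache_data / st.cache_resource
def foo(ref):
    # Placeholder for some expensive computation that depends only on `ref`.
    return ref * 2

ref1, ref2 = 10, 20
d1 = foo(ref1)  # executes foo() and caches the result
d2 = foo(ref1)  # same argument: cached result returned, so d1 == d2
d3 = foo(ref2)  # different argument: foo() executes again
```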