@@ -101,7 +101,7 @@ Generate a completion from a model as JSON.
 -v Show version information and exit.
 
 This function sends a prompt to a specified model and returns the model's response as a raw JSON object.
-If streaming is enabled via the global 'OLLAMA_LIB_STREAM' variable, it will return a stream of JSON objects.
+If streaming is enabled via the global 'OBL_STREAM' variable, it will return a stream of JSON objects.
 This is a foundational function for 'ollama_generate' and 'ollama_generate_stream', which process this JSON output into plain text.
 ```
 
@@ -117,7 +117,7 @@ Generate a completion from a model as JSON.
 -v Show version information and exit.
 
 This function sends a prompt to a specified model and returns the model's response as a raw JSON object.
-If streaming is enabled via the global 'OLLAMA_LIB_STREAM' variable, it will return a stream of JSON objects.
+If streaming is enabled via the global 'OBL_STREAM' variable, it will return a stream of JSON objects.
 This is a foundational function for 'ollama_generate' and 'ollama_generate_stream', which process this JSON output into plain text.
 ```
 
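The hunks above rename the streaming flag from `OLLAMA_LIB_STREAM` to `OBL_STREAM`. A minimal sketch of how a caller might branch on that flag, assuming only the 0/1 semantics the diff describes — the `generate_mode` helper is illustrative, not part of the library:

```shell
#!/usr/bin/env bash
# Illustrative only: branch on the renamed OBL_STREAM global.
# Per the diff, 'ollama_generate_json' returns a stream of JSON objects
# when this variable is 1, and a single JSON object otherwise.
generate_mode() {
  if [ "${OBL_STREAM:-0}" = "1" ]; then
    echo "stream of JSON objects"
  else
    echo "single JSON object"
  fi
}

OBL_STREAM=0
generate_mode
OBL_STREAM=1
generate_mode
```

This also matches the description of `ollama_generate_stream`, which simply sets the global to 1 before delegating to `ollama_generate_json`.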
@@ -162,7 +162,7 @@ Generate a completion from a model as a stream of JSON objects.
 -h Show this help and exit.
 -v Show version information and exit.
 
-This function sets the global 'OLLAMA_LIB_STREAM' variable to 1 and then calls 'ollama_generate_json'.
+This function sets the global 'OBL_STREAM' variable to 1 and then calls 'ollama_generate_json'.
 It is the basis for 'ollama_generate_stream', which further processes the output into a continuous stream of text.
 ```
 
@@ -177,7 +177,7 @@ Generate a completion from a model as a stream of JSON objects.
 -h Show this help and exit.
 -v Show version information and exit.
 
-This function sets the global 'OLLAMA_LIB_STREAM' variable to 1 and then calls 'ollama_generate_json'.
+This function sets the global 'OBL_STREAM' variable to 1 and then calls 'ollama_generate_json'.
 It is the basis for 'ollama_generate_stream', which further processes the output into a continuous stream of text.
 ```
 
@@ -248,7 +248,7 @@ Add a message to the current session's message history.
 -h Show this help and exit.
 -v Show version information and exit.
 
-This function appends a new message object to the 'OLLAMA_LIB_MESSAGES' array.
+This function appends a new message object to the 'OBL_MESSAGES' array.
 This history is then used by 'ollama_chat' and related functions to maintain a conversation with the model.
 ```
 
@@ -263,7 +263,7 @@ Add a message to the current session's message history.
 -h Show this help and exit.
 -v Show version information and exit.
 
-This function appends a new message object to the 'OLLAMA_LIB_MESSAGES' array.
+This function appends a new message object to the 'OBL_MESSAGES' array.
 This history is then used by 'ollama_chat' and related functions to maintain a conversation with the model.
 ```
 
@@ -276,7 +276,7 @@ Clear all messages from the current session.
 -h Show this help and exit.
 -v Show version information and exit.
 
-This function resets the 'OLLAMA_LIB_MESSAGES' array, effectively deleting the entire conversation history for the current session.
+This function resets the 'OBL_MESSAGES' array, effectively deleting the entire conversation history for the current session.
 This is useful for starting a new conversation without restarting the script.
 ```
 
@@ -289,7 +289,7 @@ Clear all messages from the current session.
 -h Show this help and exit.
 -v Show version information and exit.
 
-This function resets the 'OLLAMA_LIB_MESSAGES' array, effectively deleting the entire conversation history for the current session.
+This function resets the 'OBL_MESSAGES' array, effectively deleting the entire conversation history for the current session.
 This is useful for starting a new conversation without restarting the script.
 ```
 
@@ -302,7 +302,7 @@ Get the number of messages in the current session.
 -h Show this help and exit.
 -v Show version information and exit.
 
-This function returns the current number of messages stored in the 'OLLAMA_LIB_MESSAGES' array.
+This function returns the current number of messages stored in the 'OBL_MESSAGES' array.
 It can be used to check if a conversation has started or to monitor the length of the conversation history.
 ```
 
@@ -315,7 +315,7 @@ Get the number of messages in the current session.
 -h Show this help and exit.
 -v Show version information and exit.
 
-This function returns the current number of messages stored in the 'OLLAMA_LIB_MESSAGES' array.
+This function returns the current number of messages stored in the 'OBL_MESSAGES' array.
 It can be used to check if a conversation has started or to monitor the length of the conversation history.
 ```
 
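These hunks rename the session history array to `OBL_MESSAGES`. The append / count / clear behavior they describe can be sketched with a plain Bash array; the JSON literals below are illustrative stand-ins for the message objects the library builds:

```shell
#!/usr/bin/env bash
# Illustrative sketch of the OBL_MESSAGES session array:
# append one JSON message per turn, count the history, reset it.
OBL_MESSAGES=()

OBL_MESSAGES+=('{"role":"user","content":"hello"}')
OBL_MESSAGES+=('{"role":"assistant","content":"hi there"}')

echo "${#OBL_MESSAGES[@]}"   # number of messages in the session

OBL_MESSAGES=()              # clear the session history
echo "${#OBL_MESSAGES[@]}"
```

Resetting the array to `()` is what makes it possible to start a new conversation without restarting the script.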
@@ -895,7 +895,7 @@ Configure the 'thinking' mode for model responses.
 -h Show this help and exit.
 -v Show version information and exit.
 
-This function sets the 'OLLAMA_LIB_THINKING' environment variable, which controls whether the model's 'thinking' process is displayed.
+This function sets the 'OBL_THINKING' environment variable, which controls whether the model's 'thinking' process is displayed.
 Modes:
 - on: Show thinking output.
 - off: Hide thinking output.
@@ -912,7 +912,7 @@ Configure the 'thinking' mode for model responses.
 -h Show this help and exit.
 -v Show version information and exit.
 
-This function sets the 'OLLAMA_LIB_THINKING' environment variable, which controls whether the model's 'thinking' process is displayed.
+This function sets the 'OBL_THINKING' environment variable, which controls whether the model's 'thinking' process is displayed.
 Modes:
 - on: Show thinking output.
 - off: Hide thinking output.
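A sketch of the on/off dispatch these hunks describe, assuming `OBL_THINKING` simply stores the chosen mode — the `set_thinking` helper is hypothetical, not a library function:

```shell
#!/usr/bin/env bash
# Hypothetical helper: store the 'thinking' mode in the renamed global.
# Modes mirror the diff: on = show thinking output, off = hide it.
set_thinking() {
  case "$1" in
    on|off) OBL_THINKING="$1" ;;
    *)      echo "unknown mode: $1" >&2; return 1 ;;
  esac
}

set_thinking on
echo "thinking mode: ${OBL_THINKING}"
set_thinking off
echo "thinking mode: ${OBL_THINKING}"
```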
@@ -954,7 +954,7 @@ Get the version of the Ollama Bash Lib.
 -h Show this help and exit.
 -v Show version information and exit.
 
-This function returns the current version number of the library as defined in the 'OLLAMA_LIB_VERSION' variable.
+This function returns the current version number of the library as defined in the 'OBL_VERSION' variable.
 It is useful for checking the library version for compatibility or debugging purposes.
 ```
 
@@ -967,6 +967,6 @@ Get the version of the Ollama Bash Lib.
 -h Show this help and exit.
 -v Show version information and exit.
 
-This function returns the current version number of the library as defined in the 'OLLAMA_LIB_VERSION' variable.
+This function returns the current version number of the library as defined in the 'OBL_VERSION' variable.
 It is useful for checking the library version for compatibility or debugging purposes.
 ```
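Reading the renamed `OBL_VERSION` global directly is trivial; the value below is a placeholder, since the library itself defines the real version string:

```shell
#!/usr/bin/env bash
# Placeholder value: the real library defines OBL_VERSION itself.
OBL_VERSION="x.y.z"
echo "Ollama Bash Lib version: ${OBL_VERSION}"
```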