Do you need to file an issue?
- I have searched the existing issues and this bug is not already filed.
- My model is hosted on OpenAI or Azure. If not, please look at the "model providers" issue and don't file a new one here.
- I believe this is a legitimate bug, not just a question. If this is a question, please use the Discussions area.
Describe the bug
A json.decoder.JSONDecodeError ("Unterminated string starting at") is raised during the generate_text_embeddings workflow.
Could you tell me how to cope with this?
Thank you so much!!
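For reference, this particular JSONDecodeError is what Python's json module raises when a document ends in the middle of a string value, i.e. when the embeddings response body arrives truncated. A minimal illustration (the payload below is made up, not taken from the actual run):

```python
import json

# Hypothetical payload: a JSON embeddings response cut off in the middle of a
# string value, which is what a truncated HTTP body from the provider looks like.
truncated = '{"object": "list", "data": [{"object": "embedding", "index": 0, "embedding": [0.1, 0.2]}], "model": "text-embedding'

try:
    json.loads(truncated)
except json.JSONDecodeError as exc:
    # Prints an "Unterminated string starting at: ..." message, the same
    # failure reported by generate_text_embeddings below.
    print(exc)
```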
Steps to reproduce
graphrag index --root ./rag_book
Expected Behavior
No response
GraphRAG Config Used
```yaml
### This config file contains required core defaults that must be set, along with a handful of common optional settings.
### For a full list of available settings, see https://microsoft.github.io/graphrag/config/yaml/

### LLM settings ###
## There are a number of settings to tune the threading and token limits for LLM calls - check the docs.

encoding_model: cl100k_base # this needs to be matched to your model!

llm:
  api_key: ${GRAPHRAG_API_KEY} # set this in the generated .env file
  type: openai_chat # or azure_openai_chat
  model: gpt-4o-mini
  model_supports_json: true # recommended if this is available for your model.
  # audience: "https://cognitiveservices.azure.com/.default"
  # api_base: https://<instance>.openai.azure.com
  # api_version: 2024-02-15-preview
  # organization: <organization_id>
  # deployment_name: <azure_model_deployment_name>

parallelization:
  stagger: 0.3
  # num_threads: 50

async_mode: threaded # or asyncio

embeddings:
  async_mode: threaded # or asyncio
  vector_store:
    type: lancedb
    db_uri: 'output/lancedb'
    container_name: default
    overwrite: true
  llm:
    api_key: ${GRAPHRAG_API_KEY}
    type: openai_embedding # or azure_openai_embedding
    model: text-embedding-ada-002
    max_tokens: 8000
    api_base: https://platform.llmprovider.ai/v1
    # api_version: 2024-02-15-preview
    # audience: "https://cognitiveservices.azure.com/.default"
    # organization: <organization_id>
    # deployment_name: <azure_model_deployment_name>

### Input settings ###

input:
  type: file # or blob
  file_type: text # or csv
  base_dir: "input"
  file_encoding: utf-8
  file_pattern: ".*\\.txt$"

chunks:
  size: 1200
  overlap: 100
  group_by_columns: [id]

### Storage settings ###
## If blob storage is specified in the following four sections,
## connection_string and container_name must be provided

cache:
  type: file # or blob
  base_dir: "cache"

reporting:
  type: file # or console, blob
  base_dir: "logs"

storage:
  type: file # or blob
  base_dir: "output"

## only turn this on if running `graphrag index` with custom settings
## we normally use `graphrag update` with the defaults
update_index_storage:
  # type: file # or blob
  # base_dir: "update_output"

### Workflow settings ###

skip_workflows: []

entity_extraction:
  prompt: "prompts/entity_extraction.txt"
  entity_types: [organization,person,geo,event]
  max_gleanings: 1

summarize_descriptions:
  prompt: "prompts/summarize_descriptions.txt"
  max_length: 500

claim_extraction:
  enabled: false
  prompt: "prompts/claim_extraction.txt"
  description: "Any claims or facts that could be relevant to information discovery."
  max_gleanings: 1

community_reports:
  prompt: "prompts/community_report.txt"
  max_length: 2000
  max_input_length: 8000

cluster_graph:
  max_cluster_size: 10

embed_graph:
  enabled: false # if true, will generate node2vec embeddings for nodes

umap:
  enabled: false # if true, will generate UMAP embeddings for nodes

snapshots:
  graphml: false
  embeddings: false
  transient: false

### Query settings ###
## The prompt locations are required here, but each search method has a number of optional knobs that can be tuned.
## See the config docs: https://microsoft.github.io/graphrag/config/yaml/#query

local_search:
  prompt: "prompts/local_search_system_prompt.txt"

global_search:
  map_prompt: "prompts/global_search_map_system_prompt.txt"
  reduce_prompt: "prompts/global_search_reduce_system_prompt.txt"
  knowledge_prompt: "prompts/global_search_knowledge_system_prompt.txt"

drift_search:
  prompt: "prompts/drift_search_system_prompt.txt"
```
Logs and screenshots
14:00:59,942 httpx INFO HTTP Request: POST https://platform.llmprovider.ai/v1/chat/completions "HTTP/1.1 200 OK"
15:52:48,215 graphrag.callbacks.file_workflow_callbacks INFO Error Invoking LLM details={'prompt': ['JOHN WILEY & SONS, INC.:John Wiley & Sons, Inc. is a publishing company known for its scientific and academic publications, including works on immunology.', 'AMERICAN JOURNAL OF EPIDEMIOLOGY:The American Journal of Epidemiology publishes research related to epidemiology and public health, featuring studies like the one on Guillain-Barre syndrome during the H1N1 vaccination campaign.', 'CURRENT PROTOCOLS IN IMMUNOLOGY:A publication that provides standardized procedures and protocols for immunology research, including vaccine studies.', 'PACKAGING REGULATIONS:Regulations concerning the development, safety, and distribution of vaccines and biological agents.', 'VACCINES AND ADJUVANTS SAFETY PROJECT COMMITTEE:A committee focusing on the safety evaluation of vaccines and their adjuvants.', 'M. THE EX VIVO IFN-G ENZYME-LINKED IMMUNOSPOT ASSAY:', 'YURASOV S:Yurasov S is an author who has contributed research in the field of rheumatology.', 'NUSSENZWEIG MC:Nussenzweig MC is a researcher noted for work on autoreactive antibodies and their regulation.', 'ZAITSEVA M:Zaitseva M is an author involved in research regarding the use of human MonoMac6 cells.', 'ROMANTSEVA T:Romantseva T is a collaborator on research surrounding vaccine safety.', 'BLINOVA K:Blanova K contributed to research on the in vitro assay predictive of adjuvant safety.', 'DRANE D:Drane D is an author associated with immunopotentiators in modern vaccines.', 'VACCINE 2012:Vaccine 2012 is a significant event or publication discussing advancements in vaccine research and safety.', 'CURRENTS OPINION RHEUMATOL:Current Opinion in Rheumatology is a journal that discusses contemporary issues in rheumatology.', 'MONOMAC6:', 'VERTEBRATE ADAPTIVE IMMUNE CELLS:The immune cells that possess two types of antigen receptors, immunoglobulins and T-cell receptors, crucial for adaptive immunity.'], 'kwargs': {}}
15:52:48,216 datashaper.workflow.workflow ERROR Error executing verb "generate_text_embeddings" in generate_text_embeddings: Unterminated string starting at: line 1 column 107183 (char 107182)
Traceback (most recent call last):
File "/root/miniconda3/lib/python3.12/site-packages/datashaper/workflow/workflow.py", line 415, in _execute_verb
result = await result
^^^^^^^^^^^^
File "/root/miniconda3/lib/python3.12/site-packages/graphrag/index/workflows/v1/generate_text_embeddings.py", line 96, in workflow
await generate_text_embeddings(
File "/root/miniconda3/lib/python3.12/site-packages/graphrag/index/flows/generate_text_embeddings.py", line 100, in generate_text_embeddings
await _run_and_snapshot_embeddings(
File "/root/miniconda3/lib/python3.12/site-packages/graphrag/index/flows/generate_text_embeddings.py", line 123, in _run_and_snapshot_embeddings
data["embedding"] = await embed_text(
^^^^^^^^^^^^^^^^^
File "/root/miniconda3/lib/python3.12/site-packages/graphrag/index/operations/embed_text/embed_text.py", line 89, in embed_text
return await _text_embed_with_vector_store(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/root/miniconda3/lib/python3.12/site-packages/graphrag/index/operations/embed_text/embed_text.py", line 179, in _text_embed_with_vector_store
result = await strategy_exec(
^^^^^^^^^^^^^^^^^^^^
File "/root/miniconda3/lib/python3.12/site-packages/graphrag/index/operations/embed_text/strategies/openai.py", line 62, in run
embeddings = await _execute(llm, text_batches, ticker, semaphore)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/root/miniconda3/lib/python3.12/site-packages/graphrag/index/operations/embed_text/strategies/openai.py", line 102, in _execute
results = await asyncio.gather(*futures)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/root/miniconda3/lib/python3.12/site-packages/graphrag/index/operations/embed_text/strategies/openai.py", line 96, in embed
chunk_embeddings = await llm(chunk)
^^^^^^^^^^^^^^^^
File "/root/miniconda3/lib/python3.12/site-packages/fnllm/base/base.py", line 112, in call
return await self._invoke(prompt, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/root/miniconda3/lib/python3.12/site-packages/fnllm/base/base.py", line 128, in _invoke
return await self._decorated_target(prompt, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/root/miniconda3/lib/python3.12/site-packages/fnllm/services/retryer.py", line 109, in invoke
result = await execute_with_retry()
^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/root/miniconda3/lib/python3.12/site-packages/fnllm/services/retryer.py", line 93, in execute_with_retry
async for a in AsyncRetrying(
File "/root/miniconda3/lib/python3.12/site-packages/tenacity/asyncio/init.py", line 166, in anext
do = await self.iter(retry_state=self._retry_state)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/root/miniconda3/lib/python3.12/site-packages/tenacity/asyncio/init.py", line 153, in iter
result = await action(retry_state)
^^^^^^^^^^^^^^^^^^^^^^^^^
File "/root/miniconda3/lib/python3.12/site-packages/tenacity/_utils.py", line 99, in inner
return call(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "/root/miniconda3/lib/python3.12/site-packages/tenacity/init.py", line 398, in
self._add_action_func(lambda rs: rs.outcome.result())
^^^^^^^^^^^^^^^^^^^
File "/root/miniconda3/lib/python3.12/concurrent/futures/_base.py", line 449, in result
return self.__get_result()
^^^^^^^^^^^^^^^^^^^
File "/root/miniconda3/lib/python3.12/concurrent/futures/_base.py", line 401, in __get_result
raise self._exception
File "/root/miniconda3/lib/python3.12/site-packages/fnllm/services/retryer.py", line 101, in execute_with_retry
return await attempt()
^^^^^^^^^^^^^^^
File "/root/miniconda3/lib/python3.12/site-packages/fnllm/services/retryer.py", line 78, in attempt
return await delegate(prompt, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/root/miniconda3/lib/python3.12/site-packages/fnllm/services/rate_limiter.py", line 70, in invoke
result = await delegate(prompt, **args)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/root/miniconda3/lib/python3.12/site-packages/fnllm/base/base.py", line 152, in _decorator_target
output = await self._execute_llm(prompt, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/root/miniconda3/lib/python3.12/site-packages/fnllm/openai/llm/embeddings.py", line 133, in _execute_llm
response = await self._call_embeddings_or_cache(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/root/miniconda3/lib/python3.12/site-packages/fnllm/openai/llm/embeddings.py", line 110, in _call_embeddings_or_cache
return await self._cache.get_or_insert(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/root/miniconda3/lib/python3.12/site-packages/fnllm/services/cache_interactor.py", line 50, in get_or_insert
entry = await func()
^^^^^^^^^^^^
File "/root/miniconda3/lib/python3.12/site-packages/openai/resources/embeddings.py", line 236, in create
return await self._post(
^^^^^^^^^^^^^^^^^
File "/root/miniconda3/lib/python3.12/site-packages/openai/_base_client.py", line 1843, in post
return await self.request(cast_to, opts, stream=stream, stream_cls=stream_cls)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/root/miniconda3/lib/python3.12/site-packages/openai/_base_client.py", line 1537, in request
return await self._request(
^^^^^^^^^^^^^^^^^^^^
File "/root/miniconda3/lib/python3.12/site-packages/openai/_base_client.py", line 1640, in _request
return await self._process_response(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/root/miniconda3/lib/python3.12/site-packages/openai/_base_client.py", line 1737, in _process_response
return await api_response.parse()
^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/root/miniconda3/lib/python3.12/site-packages/openai/_response.py", line 431, in parse
await self.read()
^^^^^^^^^^^^^
File "/root/miniconda3/lib/python3.12/site-packages/openai/_response.py", line 266, in _parse
data = response.json()
^^^^^^^^^^^^^^^
File "/root/miniconda3/lib/python3.12/site-packages/httpx/_models.py", line 832, in json
return jsonlib.loads(self.content, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/root/miniconda3/lib/python3.12/json/init.py", line 346, in loads
return _default_decoder.decode(s)
^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/root/miniconda3/lib/python3.12/json/decoder.py", line 337, in decode
obj, end = self.raw_decode(s, idx=_w(s, 0).end())
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/root/miniconda3/lib/python3.12/json/decoder.py", line 353, in raw_decode
obj, end = self.scan_once(s, idx)
^^^^^^^^^^^^^^^^^^^^^^
json.decoder.JSONDecodeError: Unterminated string starting at: line 1 column 107183 (char 107182)
15:52:48,218 graphrag.callbacks.file_workflow_callbacks INFO Error executing verb "generate_text_embeddings" in generate_text_embeddings: Unterminated string starting at: line 1 column 107183 (char 107182) details=None
15:52:48,225 graphrag.index.run.run ERROR error running workflow generate_text_embeddings
Traceback (most recent call last):
File "/root/miniconda3/lib/python3.12/site-packages/graphrag/index/run/run.py", line 262, in run_pipeline
result = await _process_workflow(
^^^^^^^^^^^^^^^^^^^^^^^^
File "/root/miniconda3/lib/python3.12/site-packages/graphrag/index/run/workflow.py", line 103, in _process_workflow
result = await workflow.run(context, callbacks)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/root/miniconda3/lib/python3.12/site-packages/datashaper/workflow/workflow.py", line 369, in run
timing = await self._execute_verb(node, context, callbacks)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/root/miniconda3/lib/python3.12/site-packages/datashaper/workflow/workflow.py", line 415, in _execute_verb
result = await result
^^^^^^^^^^^^
File "/root/miniconda3/lib/python3.12/site-packages/graphrag/index/workflows/v1/generate_text_embeddings.py", line 96, in workflow
await generate_text_embeddings(
File "/root/miniconda3/lib/python3.12/site-packages/graphrag/index/flows/generate_text_embeddings.py", line 100, in generate_text_embeddings
await _run_and_snapshot_embeddings(
File "/root/miniconda3/lib/python3.12/site-packages/graphrag/index/flows/generate_text_embeddings.py", line 123, in _run_and_snapshot_embeddings
data["embedding"] = await embed_text(
^^^^^^^^^^^^^^^^^
File "/root/miniconda3/lib/python3.12/site-packages/graphrag/index/operations/embed_text/embed_text.py", line 89, in embed_text
return await _text_embed_with_vector_store(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/root/miniconda3/lib/python3.12/site-packages/graphrag/index/operations/embed_text/embed_text.py", line 179, in _text_embed_with_vector_store
result = await strategy_exec(
^^^^^^^^^^^^^^^^^^^^
File "/root/miniconda3/lib/python3.12/site-packages/graphrag/index/operations/embed_text/strategies/openai.py", line 62, in run
embeddings = await _execute(llm, text_batches, ticker, semaphore)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/root/miniconda3/lib/python3.12/site-packages/graphrag/index/operations/embed_text/strategies/openai.py", line 102, in _execute
results = await asyncio.gather(*futures)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/root/miniconda3/lib/python3.12/site-packages/graphrag/index/operations/embed_text/strategies/openai.py", line 96, in embed
chunk_embeddings = await llm(chunk)
^^^^^^^^^^^^^^^^
File "/root/miniconda3/lib/python3.12/site-packages/fnllm/base/base.py", line 112, in call
return await self._invoke(prompt, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/root/miniconda3/lib/python3.12/site-packages/fnllm/base/base.py", line 128, in _invoke
return await self._decorated_target(prompt, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/root/miniconda3/lib/python3.12/site-packages/fnllm/services/retryer.py", line 109, in invoke
result = await execute_with_retry()
^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/root/miniconda3/lib/python3.12/site-packages/fnllm/services/retryer.py", line 93, in execute_with_retry
async for a in AsyncRetrying(
File "/root/miniconda3/lib/python3.12/site-packages/tenacity/asyncio/init.py", line 166, in anext
do = await self.iter(retry_state=self._retry_state)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/root/miniconda3/lib/python3.12/site-packages/tenacity/asyncio/init.py", line 153, in iter
result = await action(retry_state)
^^^^^^^^^^^^^^^^^^^^^^^^^
File "/root/miniconda3/lib/python3.12/site-packages/tenacity/_utils.py", line 99, in inner
return call(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "/root/miniconda3/lib/python3.12/site-packages/tenacity/init.py", line 398, in
self._add_action_func(lambda rs: rs.outcome.result())
^^^^^^^^^^^^^^^^^^^
File "/root/miniconda3/lib/python3.12/concurrent/futures/_base.py", line 449, in result
return self.__get_result()
^^^^^^^^^^^^^^^^^^^
File "/root/miniconda3/lib/python3.12/concurrent/futures/_base.py", line 401, in __get_result
raise self._exception
File "/root/miniconda3/lib/python3.12/site-packages/fnllm/services/retryer.py", line 101, in execute_with_retry
return await attempt()
^^^^^^^^^^^^^^^
File "/root/miniconda3/lib/python3.12/site-packages/fnllm/services/retryer.py", line 78, in attempt
return await delegate(prompt, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/root/miniconda3/lib/python3.12/site-packages/fnllm/services/rate_limiter.py", line 70, in invoke
result = await delegate(prompt, **args)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/root/miniconda3/lib/python3.12/site-packages/fnllm/base/base.py", line 152, in _decorator_target
output = await self._execute_llm(prompt, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/root/miniconda3/lib/python3.12/site-packages/fnllm/openai/llm/embeddings.py", line 133, in _execute_llm
response = await self._call_embeddings_or_cache(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/root/miniconda3/lib/python3.12/site-packages/fnllm/openai/llm/embeddings.py", line 110, in _call_embeddings_or_cache
return await self._cache.get_or_insert(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/root/miniconda3/lib/python3.12/site-packages/fnllm/services/cache_interactor.py", line 50, in get_or_insert
entry = await func()
^^^^^^^^^^^^
File "/root/miniconda3/lib/python3.12/site-packages/openai/resources/embeddings.py", line 236, in create
return await self._post(
^^^^^^^^^^^^^^^^^
File "/root/miniconda3/lib/python3.12/site-packages/openai/_base_client.py", line 1843, in post
return await self.request(cast_to, opts, stream=stream, stream_cls=stream_cls)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/root/miniconda3/lib/python3.12/site-packages/openai/_base_client.py", line 1537, in request
return await self._request(
^^^^^^^^^^^^^^^^^^^^
File "/root/miniconda3/lib/python3.12/site-packages/openai/_base_client.py", line 1640, in _request
return await self._process_response(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/root/miniconda3/lib/python3.12/site-packages/openai/_base_client.py", line 1737, in _process_response
return await api_response.parse()
^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/root/miniconda3/lib/python3.12/site-packages/openai/_response.py", line 431, in parse
await self.read()
^^^^^^^^^^^^^
File "/root/miniconda3/lib/python3.12/site-packages/openai/_response.py", line 266, in _parse
data = response.json()
^^^^^^^^^^^^^^^
File "/root/miniconda3/lib/python3.12/site-packages/httpx/_models.py", line 832, in json
return jsonlib.loads(self.content, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/root/miniconda3/lib/python3.12/json/init.py", line 346, in loads
return _default_decoder.decode(s)
^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/root/miniconda3/lib/python3.12/json/decoder.py", line 337, in decode
obj, end = self.raw_decode(s, idx=_w(s, 0).end())
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/root/miniconda3/lib/python3.12/json/decoder.py", line 353, in raw_decode
obj, end = self.scan_once(s, idx)
^^^^^^^^^^^^^^^^^^^^^^
json.decoder.JSONDecodeError: Unterminated string starting at: line 1 column 107183 (char 107182)
15:52:48,229 graphrag.callbacks.file_workflow_callbacks INFO Error running pipeline! details=None
15:52:48,254 graphrag.cli.index ERROR Errors occurred during the pipeline run, see logs for more details.
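Since the decoder fails at char 107182, at least ~107 KB of the body arrived before it was cut off mid-string, which points at the proxy (or something in front of it) truncating large embedding responses rather than at GraphRAG itself. A rough probe with httpx, the HTTP layer visible in the traceback, could look like this (the endpoint and payload shape are assumptions based on the config above):

```python
import os

import httpx

# Hypothetical probe: post one embeddings request and compare the advertised
# Content-Length with the number of bytes actually received; a mismatch means
# the body is cut off before the JSON document is complete.
payload = {
    "model": "text-embedding-ada-002",
    "input": ["entity description placeholder text " * 40] * 16,
}
headers = {"Authorization": f"Bearer {os.environ['GRAPHRAG_API_KEY']}"}

with httpx.Client(timeout=120.0) as client:
    resp = client.post(
        "https://platform.llmprovider.ai/v1/embeddings",
        json=payload,
        headers=headers,
    )
    print("status:", resp.status_code)
    print("content-length header:", resp.headers.get("content-length"))
    print("bytes received:", len(resp.content))
```

If the two byte counts disagree, sending fewer texts per request (the embeddings batch settings in settings.yaml, if your version exposes them) should shrink each response body enough to avoid the truncation.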
Additional Information
GraphRAG Version: 1.0.1
Operating System: Ubuntu 20.04
Python Version: 3.12
Related Issues:
gudehhh666 added the bug and triage labels on Dec 22, 2024