
json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0) #11

Open
sunil-thapa99 opened this issue Jul 10, 2024 · 10 comments

Comments

@sunil-thapa99

sunil-thapa99 commented Jul 10, 2024

Hi, I'm getting an error for global search.

  • python -m graphrag.query --root ./ragtest --method global "what is the disclosures for companies"

INFO: Reading settings from ragtest/settings.yaml
creating llm client with {'api_key': 'REDACTED,len=9', 'type': "openai_chat", 'model': 'llama3', 'max_tokens': 4000, 'request_timeout': 180.0, 'api_base': 'http://localhost:11434/v1', 'api_version': None, 'organization': None, 'proxy': None, 'cognitive_services_endpoint': None, 'deployment_name': None, 'model_supports_json': True, 'tokens_per_minute': 0, 'requests_per_minute': 0, 'max_retries': 10, 'max_retry_wait': 10.0, 'sleep_on_rate_limit_recommendation': True, 'concurrent_requests': 25}
Error parsing search response json
Traceback (most recent call last):
File "/opt/anaconda3/envs/graphrag/lib/python3.12/site-packages/graphrag/query/structured_search/global_search/search.py", line 194, in _map_response_single_batch
processed_response = self.parse_search_response(search_response)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/anaconda3/envs/graphrag/lib/python3.12/site-packages/graphrag/query/structured_search/global_search/search.py", line 232, in parse_search_response
parsed_elements = json.loads(search_response)["points"]
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/anaconda3/envs/graphrag/lib/python3.12/json/init.py", line 346, in loads
return _default_decoder.decode(s)
^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/anaconda3/envs/graphrag/lib/python3.12/json/decoder.py", line 337, in decode
obj, end = self.raw_decode(s, idx=_w(s, 0).end())
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/opt/anaconda3/envs/graphrag/lib/python3.12/json/decoder.py", line 355, in raw_decode
raise JSONDecodeError("Expecting value", s, err.value) from None
json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0)

SUCCESS: Global Search Response: I am sorry but I am unable to answer this question given the provided data.
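
For anyone hitting the same thing: the traceback shows parse_search_response feeding the raw model output straight into json.loads, so any reply that is not pure JSON (extra prose, or a markdown-fenced JSON block, which local models often produce) fails on the very first character. A rough local workaround, sketched below as a hypothetical helper (parse_points_loosely is not part of graphrag), is to pull the JSON object out of the text before parsing:

```python
import json
import re


def parse_points_loosely(search_response: str) -> list:
    """Hypothetical fallback parser: try strict JSON first, then strip
    markdown fences and extract the outermost {...} object."""
    try:
        return json.loads(search_response)["points"]
    except (json.JSONDecodeError, KeyError):
        pass
    # Drop ```json ... ``` fences and any surrounding prose the model added.
    cleaned = re.sub(r"```(?:json)?", "", search_response)
    match = re.search(r"\{.*\}", cleaned, flags=re.DOTALL)
    if match is None:
        # Nothing JSON-like at all: surface the raw text for debugging.
        raise ValueError(f"model returned non-JSON output: {search_response[:200]!r}")
    return json.loads(match.group(0)).get("points", [])
```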

@vamshi-rvk

> Hi, I'm getting an error for global search.
> python -m graphrag.query --root ./ragtest --method global "what is the disclosures for companies"
> json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0)

How did you install graphrag: "pip install graphrag" or "pip install -e ."?

@sunil-thapa99
Author

> How did you install graphrag: "pip install graphrag" or "pip install -e ."?

It's pip install graphrag

@TheAiSingularity
Owner

> How did you install graphrag: "pip install graphrag" or "pip install -e ."?
>
> It's pip install graphrag

Install using "pip install -e ." and please follow the steps mentioned in the README exactly.

@mhet-droid

@sunil-thapa99 did "pip install -e ." work for you? I'm getting the same error when using "pip install -e ."

@sunil-thapa99
Author

@mhet-droid I'm also getting the same error, but in my case it depends on the data I use. If I use plain text it works fine, but if I convert other file formats to text and then run the query, it throws the error.
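
If the failures really do track the converted files, it may be worth normalizing them before indexing; text converted from PDF or DOCX often carries stray control characters or broken encodings that can derail the extraction and map prompts. A quick sanity pass over the input folder might look like the sketch below (hypothetical snippet, assuming the default ./ragtest/input layout; clean_converted_text is not part of graphrag):

```python
import unicodedata
from pathlib import Path


def clean_converted_text(path: Path) -> str:
    """Hypothetical pre-processing pass: decode as UTF-8 (replacing bad bytes)
    and drop control characters that often survive PDF/DOCX conversion."""
    raw = path.read_bytes().decode("utf-8", errors="replace")
    return "".join(
        ch for ch in raw if unicodedata.category(ch)[0] != "C" or ch in "\n\t"
    )


for txt in Path("./ragtest/input").glob("*.txt"):
    txt.write_text(clean_converted_text(txt), encoding="utf-8")
```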

@shailesh837

I am getting the error below with the "mistral" model, and I followed the README exactly as written:

Traceback (most recent call last):
  File "/home/spandey2/miniconda3/envs/graphrag-local-ollama/lib/python3.10/site-packages/datashaper/workflow/workflow.py", line 410, in _execute_verb
    result = node.verb.func(**verb_args)
  File "/home/spandey2/LLM_KG_RAG/graphrag-local-ollama/graphrag/index/verbs/graph/clustering/cluster_graph.py", line 102, in cluster_graph
    output_df[[level_to, to]] = pd.DataFrame(
  File "/home/spandey2/miniconda3/envs/graphrag-local-ollama/lib/python3.10/site-packages/pandas/core/frame.py", line 4299, in __setitem__
    self._setitem_array(key, value)
  File "/home/spandey2/miniconda3/envs/graphrag-local-ollama/lib/python3.10/site-packages/pandas/core/frame.py", line 4341, in _setitem_array
    check_key_length(self.columns, key, value)
  File "/home/spandey2/miniconda3/envs/graphrag-local-ollama/lib/python3.10/site-packages/pandas/core/indexers/utils.py", line 390, in check_key_length
    raise ValueError("Columns must be same length as key")
ValueError: Columns must be same length as key
12:35:08,7 graphrag.index.reporting.file_workflow_callbacks INFO Error executing verb "cluster_graph" in create_base_entity_graph: Columns must be same length as key details=None
12:35:08,7 graphrag.index.run ERROR error running workflow create_base_entity_graph
Traceback (most recent call last):
  File "/home/spandey2/LLM_KG_RAG/graphrag-local-ollama/graphrag/index/run.py", line 323, in run_pipeline
    result = await workflow.run(context, callbacks)
  File "/home/spandey2/miniconda3/envs/graphrag-local-ollama/lib/python3.10/site-packages/datashaper/workflow/workflow.py", line 369, in run
    timing = await self._execute_verb(node, context, callbacks)
  File "/home/spandey2/miniconda3/envs/graphrag-local-ollama/lib/python3.10/site-packages/datashaper/workflow/workflow.py", line 410, in _execute_verb
    result = node.verb.func(**verb_args)
  File "/home/spandey2/LLM_KG_RAG/graphrag-local-ollama/graphrag/index/verbs/graph/clustering/cluster_graph.py", line 102, in cluster_graph
    output_df[[level_to, to]] = pd.DataFrame(
  File "/home/spandey2/miniconda3/envs/graphrag-local-ollama/lib/python3.10/site-packages/pandas/core/frame.py", line 4299, in __setitem__
    self._setitem_array(key, value)
  File "/home/spandey2/miniconda3/envs/graphrag-local-ollama/lib/python3.10/site-packages/pandas/core/frame.py", line 4341, in _setitem_array
    check_key_length(self.columns, key, value)
  File "/home/spandey2/miniconda3/envs/graphrag-local-ollama/lib/python3.10/site-packages/pandas/core/indexers/utils.py", line 390, in check_key_length
    raise ValueError("Columns must be same length as key")
ValueError: Columns must be same length as key
12:35:08,7 graphrag.index.reporting.file_workflow_callbacks INFO Error running pipeline! details=None
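
This one fails earlier, during indexing rather than querying: cluster_graph assigns two columns at once, and when the clustering step returns nothing (typically because entity extraction produced an empty graph), the right-hand side is an empty DataFrame and pandas refuses the assignment. A minimal standalone reproduction of just the pandas behaviour (not the graphrag code itself):

```python
import pandas as pd

df = pd.DataFrame({"node": ["a", "b"]})

try:
    # Two target columns on the left, but the right-hand DataFrame is empty
    # (zero columns), e.g. when clustering returned no results.
    df[["level", "clustered_graph"]] = pd.DataFrame([])
except ValueError as exc:
    print(exc)  # Columns must be same length as key
```

So the entity-extraction output from the local model is usually the thing to check first.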

@aiChatGPT35User123

> Hi, I'm getting an error for global search.
> python -m graphrag.query --root ./ragtest --method global "what is the disclosures for companies"
> json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0)
>
> How did you install graphrag: "pip install graphrag" or "pip install -e ."?
>
> It's pip install graphrag

Did you solve this? I'm running into the same problem.

@leowenlu

I had a similar error, and I followed the steps in the README exactly:

python -m graphrag.query --root ./ragtest --method global "What is machinelearning?"


INFO: Reading settings from ragtest/settings.yaml
creating llm client with {'api_key': 'REDACTED,len=9', 'type': "openai_chat", 'model': 'mistral', 'max_tokens': 4000, 'temperature': 0.0, 'top_p': 1.0, 'request_timeout': 180.0, 'api_base': 'http://localhost:11434/v1', 'api_version': None, 'organization': 'org-ma3AH5CkGkk1si2Q6TafUK0Y', 'proxy': None, 'cognitive_services_endpoint': None, 'deployment_name': None, 'model_supports_json': True, 'tokens_per_minute': 0, 'requests_per_minute': 0, 'max_retries': 10, 'max_retry_wait': 10.0, 'sleep_on_rate_limit_recommendation': True, 'concurrent_requests': 25}
Error parsing search response json
Traceback (most recent call last):
  File "/data/leoprojects/graphRAG/graphrag-local-ollama/graphrag/query/structured_search/global_search/search.py", line 194, in _map_response_single_batch
    processed_response = self.parse_search_response(search_response)
  File "/data/leoprojects/graphRAG/graphrag-local-ollama/graphrag/query/structured_search/global_search/search.py", line 232, in parse_search_response
    parsed_elements = json.loads(search_response)["points"]
  File "/data/systems/miniconda3/envs/graphrag-ollama-local/lib/python3.10/json/__init__.py", line 346, in loads
    return _default_decoder.decode(s)
  File "/data/systems/miniconda3/envs/graphrag-ollama-local/lib/python3.10/json/decoder.py", line 337, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
  File "/data/systems/miniconda3/envs/graphrag-ollama-local/lib/python3.10/json/decoder.py", line 355, in raw_decode
    raise JSONDecodeError("Expecting value", s, err.value) from None
json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0)

SUCCESS: Global Search Response: I am sorry but I am unable to answer this question given the provided data.

@KDD2018

KDD2018 commented Aug 1, 2024

> I had a similar error, and I followed the steps in the README exactly:
> python -m graphrag.query --root ./ragtest --method global "What is machinelearning?"
> json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0)

Hi, I ran into this error, did you solve it?

@shutter-cp

microsoft/graphrag#575
