Commit

enable stepwise execution of query pipelines (run-llama#14117)
* enable stepwise execution of query pipelines

* improve the UX

* remove print
logan-markewich authored Jun 19, 2024
1 parent 7f4fc7b commit 3ff0576
Showing 3 changed files with 213 additions and 160 deletions.
93 changes: 92 additions & 1 deletion docs/docs/examples/pipeline/query_pipeline.ipynb
@@ -26,7 +26,8 @@
"- Chain together prompt and LLM\n",
"- Chain together query rewriting (prompt + LLM) with retrieval\n",
"- Chain together a full RAG query pipeline (query rewriting, retrieval, reranking, response synthesis)\n",
"- Setting up a custom query component"
"- Setting up a custom query component\n",
"- Executing a pipeline step-by-step"
]
},
{
@@ -1225,6 +1226,96 @@
"source": [
"print(str(output))"
]
},
{
"cell_type": "markdown",
"id": "5a1186f8",
"metadata": {},
"source": [
"## Stepwise Execution of a Pipeline\n",
"\n",
"Executing a pipeline one step at a time is useful if you want to:\n",
"- debug the order of execution\n",
"- log data between steps\n",
"- give feedback to a user about what is being processed\n",
"- and more!\n",
"\n",
"To execute a pipeline stepwise, create a `run_state` and then loop through the execution. A basic example is below."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "fec7873b",
"metadata": {},
"outputs": [],
"source": [
"from llama_index.core.query_pipeline import QueryPipeline\n",
"from llama_index.core import PromptTemplate\n",
"from llama_index.llms.openai import OpenAI\n",
"\n",
"# try chaining basic prompts\n",
"prompt_str = \"Please generate related movies to {movie_name}\"\n",
"prompt_tmpl = PromptTemplate(prompt_str)\n",
"llm = OpenAI(model=\"gpt-3.5-turbo\")\n",
"\n",
"p = QueryPipeline(chain=[prompt_tmpl, llm], verbose=True)"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "900210e6",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"1. Infernal Affairs (2002) - The original Hong Kong film that inspired The Departed\n",
"2. The Town (2010) - A crime thriller directed by Ben Affleck\n",
"3. Mystic River (2003) - A crime drama directed by Clint Eastwood\n",
"4. Goodfellas (1990) - A classic mobster film directed by Martin Scorsese\n",
"5. The Irishman (2019) - Another crime drama directed by Martin Scorsese, starring Robert De Niro and Al Pacino\n",
"6. The Departed (2006) - The Departed is a 2006 American crime film directed by Martin Scorsese and written by William Monahan. It is a remake of the 2002 Hong Kong film Infernal Affairs. The film stars Leonardo DiCaprio, Matt Damon, Jack Nicholson, and Mark Wahlberg, with Martin Sheen, Ray Winstone, Vera Farmiga, and Alec Baldwin in supporting roles.\n"
]
}
],
"source": [
"run_state = p.get_run_state(movie_name=\"The Departed\")\n",
"\n",
"next_module_keys = p.get_next_module_keys(run_state)\n",
"\n",
"while True:\n",
" for module_key in next_module_keys:\n",
" # get the module and input\n",
" module = run_state.module_dict[module_key]\n",
" module_input = run_state.all_module_inputs[module_key]\n",
"\n",
" # run the module\n",
" output_dict = module.run_component(**module_input)\n",
"\n",
" # process the output\n",
" p.process_component_output(\n",
" output_dict,\n",
" module_key,\n",
" run_state,\n",
" )\n",
"\n",
" # get the next module keys\n",
" next_module_keys = p.get_next_module_keys(\n",
" run_state,\n",
" )\n",
"\n",
" # if no more modules to run, break\n",
" if not next_module_keys:\n",
" run_state.result_outputs[module_key] = output_dict\n",
" break\n",
"\n",
"# the final result is at `module_key`\n",
"# it is a dict of 'output' -> ChatResponse object in this case\n",
"print(run_state.result_outputs[module_key][\"output\"].message.content)"
]
}
],
"metadata": {
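The stepwise loop added in the notebook above follows a get-state / get-next-keys / run / process cycle. The sketch below is a toy stand-in for that pattern — `MiniPipeline` and `RunState` here are hypothetical illustrations, not the llama_index API — showing the same control flow on a plain Python chain:

```python
from dataclasses import dataclass, field
from typing import Any, Callable, Dict, List


@dataclass
class RunState:
    """Toy run state: modules, their pending inputs, and recorded outputs."""

    module_dict: Dict[str, Callable[[Any], Any]]
    all_module_inputs: Dict[str, Any]
    result_outputs: Dict[str, Any] = field(default_factory=dict)


class MiniPipeline:
    """Toy sequential pipeline mirroring the stepwise-execution pattern."""

    def __init__(self, chain: List[Callable[[Any], Any]]):
        self.chain = chain

    def get_run_state(self, first_input: Any) -> RunState:
        modules = {f"module_{i}": fn for i, fn in enumerate(self.chain)}
        return RunState(modules, {"module_0": first_input})

    def get_next_module_keys(self, state: RunState) -> List[str]:
        # next runnable module = has an input but no recorded output yet
        for key in state.module_dict:
            if key in state.all_module_inputs and key not in state.result_outputs:
                return [key]
        return []

    def process_component_output(
        self, output: Any, key: str, state: RunState
    ) -> None:
        # record the output and feed it to the next module in the chain
        state.result_outputs[key] = output
        nxt = f"module_{int(key.split('_')[1]) + 1}"
        if nxt in state.module_dict:
            state.all_module_inputs[nxt] = output


# run the chain one step at a time, mirroring the notebook loop
p = MiniPipeline(chain=[lambda s: s.upper(), lambda s: s + "!"])
state = p.get_run_state("hello")
while keys := p.get_next_module_keys(state):
    for key in keys:
        out = state.module_dict[key](state.all_module_inputs[key])
        p.process_component_output(out, key, state)
print(state.result_outputs["module_1"])  # HELLO!
```

Because each iteration is an ordinary loop body, logging, debugging, or user-facing progress updates can be slotted between steps — which is the point of the feature this commit adds.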
2 changes: 1 addition & 1 deletion llama-index-core/llama_index/core/async_utils.py
@@ -26,7 +26,7 @@ def asyncio_run(coro: Coroutine) -> Any:
If there is no existing event loop, creates a new one.
"""
try:
-        loop = asyncio.get_running_loop()
+        loop = asyncio.get_event_loop()
if loop.is_running():
raise RuntimeError(
"Nested async detected. "
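The one-line change above swaps `asyncio.get_running_loop()` (which raises `RuntimeError` whenever no loop is currently running) for `asyncio.get_event_loop()` (which can return an existing loop that is not yet running, so the `loop.is_running()` check becomes meaningful). A minimal sketch of the surrounding pattern, reconstructed from the diff context rather than the library's full implementation:

```python
import asyncio
from typing import Any, Coroutine


def asyncio_run(coro: Coroutine) -> Any:
    """Sketch: run a coroutine, reusing the thread's event loop if possible."""
    try:
        # Returns the thread's loop even when it isn't running yet;
        # get_running_loop() would raise here and skip the check below.
        loop = asyncio.get_event_loop()
    except RuntimeError:
        # No event loop in this thread: run on a fresh one.
        return asyncio.run(coro)
    if loop.is_running():
        # Can't block inside an already-running loop.
        raise RuntimeError(
            "Nested async detected. "
            "Use nest_asyncio or await the coroutine directly."
        )
    return loop.run_until_complete(coro)
```

Note that on newer Python versions `asyncio.get_event_loop()` is deprecated when no loop is set, which is why the `RuntimeError` fallback to `asyncio.run` is kept.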