change back name to flaml
qingyun-wu committed Sep 2, 2023
1 parent 2eab9a3 commit 45b3f1d
Showing 10 changed files with 61 additions and 60 deletions.
Empty file added website/docs/Blog.md
Empty file.
4 changes: 2 additions & 2 deletions website/docs/Contribute.md
@@ -69,7 +69,7 @@ pip install -e autogen

### Docker

We provide a simple [Dockerfile](https://github.com/microsoft/autogen/blob/main/Dockerfile).
We provide a simple [Dockerfile](https://github.com/microsoft/flaml/blob/main/Dockerfile).

```bash
docker build https://github.com/microsoft/autogen.git#main -t autogen-dev
@@ -79,7 +79,7 @@ docker run -it autogen-dev
### Develop in Remote Container

If you use vscode, you can open the autogen folder in a [Container](https://code.visualstudio.com/docs/remote/containers).
We have provided the configuration in [devcontainer](https://github.com/microsoft/autogen/blob/main/.devcontainer).
We have provided the configuration in [devcontainer](https://github.com/microsoft/flaml/blob/main/.devcontainer).

### Pre-commit

18 changes: 9 additions & 9 deletions website/docs/Examples/AutoGen-AgentChat.md
@@ -4,12 +4,12 @@ AutoGen offers conversable agents powered by LLM, tool or human, which can be us
Please find documentation about this feature [here](/docs/Use-Cases/Autogen#agents).

Links to notebook examples:
* [Automated Task Solving with Code Generation, Execution & Debugging](https://github.com/microsoft/autogen/blob/main/notebook/autogen_agentchat_auto_feedback_from_code_execution.ipynb)
* [Auto Code Generation, Execution, Debugging and Human Feedback](https://github.com/microsoft/autogen/blob/main/notebook/autogen_agentchat_human_feedback.ipynb)
* [Solve Tasks Requiring Web Info](https://github.com/microsoft/autogen/blob/main/notebook/autogen_agentchat_web_info.ipynb)
* [Use Provided Tools as Functions](https://github.com/microsoft/autogen/blob/main/notebook/autogen_agentchat_function_call.ipynb)
* [Automated Task Solving with Coding & Planning Agents](https://github.com/microsoft/autogen/blob/main/notebook/autogen_agentchat_planning.ipynb)
* [Automated Task Solving with GPT-4 + Multiple Human Users](https://github.com/microsoft/autogen/blob/main/notebook/autogen_agentchat_two_users.ipynb)
* [Automated Chess Game Playing & Chitchatting by GPT-4 Agents](https://github.com/microsoft/autogen/blob/main/notebook/autogen_agentchat_chess.ipynb)
* [Automated Task Solving by Group Chat](https://github.com/microsoft/autogen/blob/main/notebook/autogen_agentchat_groupchat.ipynb)
* [Automated Continual Learning from New Data](https://github.com/microsoft/autogen/blob/main/notebook/autogen_agentchat_stream.ipynb)
* [Automated Task Solving with Code Generation, Execution & Debugging](https://github.com/microsoft/flaml/blob/main/notebook/autogen_agentchat_auto_feedback_from_code_execution.ipynb)
* [Auto Code Generation, Execution, Debugging and Human Feedback](https://github.com/microsoft/flaml/blob/main/notebook/autogen_agentchat_human_feedback.ipynb)
* [Solve Tasks Requiring Web Info](https://github.com/microsoft/flaml/blob/main/notebook/autogen_agentchat_web_info.ipynb)
* [Use Provided Tools as Functions](https://github.com/microsoft/flaml/blob/main/notebook/autogen_agentchat_function_call.ipynb)
* [Automated Task Solving with Coding & Planning Agents](https://github.com/microsoft/flaml/blob/main/notebook/autogen_agentchat_planning.ipynb)
* [Automated Task Solving with GPT-4 + Multiple Human Users](https://github.com/microsoft/flaml/blob/main/notebook/autogen_agentchat_two_users.ipynb)
* [Automated Chess Game Playing & Chitchatting by GPT-4 Agents](https://github.com/microsoft/flaml/blob/main/notebook/autogen_agentchat_chess.ipynb)
* [Automated Task Solving by Group Chat](https://github.com/microsoft/flaml/blob/main/notebook/autogen_agentchat_groupchat.ipynb)
* [Automated Continual Learning from New Data](https://github.com/microsoft/flaml/blob/main/notebook/autogen_agentchat_stream.ipynb)
4 changes: 2 additions & 2 deletions website/docs/Examples/AutoGen-OpenAI.md
@@ -4,5 +4,5 @@ AutoGen also offers a cost-effective hyperparameter optimization technique [EcoO
Please find documentation about this feature [here](/docs/Use-Cases/enhanced_inference).

Links to notebook examples:
* [Optimize for Code Generation](https://github.com/microsoft/autogen/blob/main/notebook/autogen_openai_completion.ipynb) | [Open in colab](https://colab.research.google.com/github/microsoft/autogen/blob/main/notebook/autogen_openai_completion.ipynb)
* [Optimize for Math](https://github.com/microsoft/autogen/blob/main/notebook/autogen_chatgpt_gpt4.ipynb) | [Open in colab](https://colab.research.google.com/github/microsoft/autogen/blob/main/notebook/autogen_chatgpt_gpt4.ipynb)
* [Optimize for Code Generation](https://github.com/microsoft/flaml/blob/main/notebook/autogen_openai_completion.ipynb) | [Open in colab](https://colab.research.google.com/github/microsoft/flaml/blob/main/notebook/autogen_openai_completion.ipynb)
* [Optimize for Math](https://github.com/microsoft/flaml/blob/main/notebook/autogen_chatgpt_gpt4.ipynb) | [Open in colab](https://colab.research.google.com/github/microsoft/flaml/blob/main/notebook/autogen_chatgpt_gpt4.ipynb)
8 changes: 4 additions & 4 deletions website/docs/Getting-Started.md
@@ -18,13 +18,13 @@ AutoGen is powered by collaborative [research studies](/docs/Research) from Micr

### Quickstart

Install AutoGen from pip: `pip install pyautogen`. Find more options in [Installation](/docs/Installation).
Install AutoGen from pip: `pip install "flaml[autogen]"`. Find more options in [Installation](/docs/Installation).


AutoGen enables next-gen LLM applications with a generic multi-agent conversation framework. It offers customizable and conversable agents which integrate LLMs, tools and humans.
By automating chat among multiple capable agents, one can easily make them collectively perform tasks autonomously or with human feedback, including tasks that require using tools via code. For example,
```python
from autogen import AssistantAgent, UserProxyAgent
from flaml.autogen import AssistantAgent, UserProxyAgent
assistant = AssistantAgent("assistant")
user_proxy = UserProxyAgent("user_proxy")
user_proxy.initiate_chat(assistant, message="Show me the YTD gain of 10 largest technology companies as of today.")
@@ -49,9 +49,9 @@ response = autogen.Completion.create(context=test_instance, **config)

### Where to Go Next?

* Understand the use cases for [multi-agent conversation](/docs/Use-Cases/multiagent_conversation).
* Understand the use cases for [multi-agent conversation](/docs/Use-Cases/agent_chat).
* Understand the use cases for [enhanced LLM inference](/docs/Use-Cases/enhanced_inference).
* Find code examples from [Examples](/docs/Examples).
* Find code examples from [Examples](/docs/Examples/AutoGen-AgentChat).
* Learn about [research](/docs/Research) around AutoGen and check [blogposts](/blog).
* Chat on [Discord](TBD).

4 changes: 2 additions & 2 deletions website/docs/Installation.md
@@ -5,10 +5,10 @@
AutoGen requires **Python version >= 3.8**. It can be installed from pip:

```bash
pip install pyautogen
pip install "flaml[autogen]"
```

or conda:
```
conda install pyautogen -c conda-forge
conda install "flaml[autogen]" -c conda-forge
```
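
A quick, hedged way to confirm the install worked — the exact version printed will vary, and the `flaml.autogen` subpackage is importable only when the `[autogen]` extra is installed:

```python
# Sanity check after installation: both imports should succeed
# when flaml is installed with the [autogen] extra.
import flaml
from flaml import autogen

print(flaml.__version__)  # prints the installed flaml version
```
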
57 changes: 29 additions & 28 deletions website/docs/Use-Cases/agent_chat.md
@@ -1,29 +1,27 @@
# Multi-agent conversation Framework
# Multi-agent Conversation Framework

AutoGen offers a unified multi-agent conversation framework as a high-level abstraction for using foundation models. It features capable, customizable and conversable agents which integrate LLMs, tools and humans via automated agent chat.
By automating chat among multiple capable agents, one can easily make them collectively perform tasks autonomously or with human feedback, including tasks that require using tools via code.

This framework simplifies the orchestration, automation and optimization of complex LLM workflows. It maximizes the performance of LLMs and compensates for their weaknesses. It enables building next-gen LLM applications based on multi-agent conversations with minimal effort.

### Conversable Agents
### Agents

We have designed a generic `ResponsiveAgent` class for Agents that are capable of conversing with each other through the exchange of messages to jointly finish a task. An agent can communicate with other agents and perform actions. Different agents can differ in what actions they perform after receiving messages. Two representative subclasses are `AssistantAgent` and `UserProxyAgent`.
We have designed a generic `ResponsiveAgent` class for Agents that are capable of conversing with each other through the exchange of messages to jointly finish a task. An agent can communicate with other agents and perform actions.

Different agents can differ in what actions they perform after receiving messages. Two representative subclasses are `AssistantAgent` and `UserProxyAgent`.

- `AssistantAgent`. Designed to act as an assistant by responding to user requests. It can write Python code (in a Python coding block) for a user to execute when a message (typically a description of a task that needs to be solved) is received. Under the hood, the Python code is written by an LLM (e.g., GPT-4). It can also receive the execution results and suggest code with bug fixes. Its behavior can be altered by passing a new system message. The LLM [inference](#enhanced-inference) configuration can be set via `llm_config`.

- `UserProxyAgent`. Serves as a proxy for the human user. Upon receiving a message, the UserProxyAgent will either solicit the human user's input or prepare an automatically generated reply. The chosen action depends on the settings of the `human_input_mode` and `max_consecutive_auto_reply` when the `UserProxyAgent` instance is constructed, and whether a human user input is available.
By default, the automatically generated reply is based on automatic code execution: the `UserProxyAgent` triggers code execution when it detects an executable code block in the received message and no human input is provided. Code execution can be disabled by setting `code_execution_config` to `False`. LLM-based replies are disabled by default; they can be enabled by setting `llm_config` to a dict corresponding to the [inference](#enhanced-inference) configuration.
When `llm_config` is set to a dict, `UserProxyAgent` can generate replies using an LLM when code execution is not performed.

The auto-reply capability of `ResponsiveAgent` allows for more autonomous multi-agent communication while retaining the possibility of human intervention.
One can also easily extend it by registering auto_reply functions with the `register_auto_reply()` method.
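
A minimal sketch of how the options described above fit together, assuming the `flaml.autogen` API used elsewhere on this page; the model name, API key placeholder and parameter values are illustrative, not prescriptive:

```python
from flaml.autogen import AssistantAgent, UserProxyAgent

# Illustrative LLM configuration; replace the model and key with your own.
llm_config = {
    "config_list": [{"model": "gpt-4", "api_key": "<your OpenAI API key>"}],
    "temperature": 0,
}

# An assistant whose replies are generated by the LLM.
assistant = AssistantAgent(
    name="assistant",
    system_message="You are a helpful assistant that writes Python code.",
    llm_config=llm_config,
)

# A user proxy that auto-executes code blocks and asks the human only at termination.
user_proxy = UserProxyAgent(
    name="user_proxy",
    human_input_mode="TERMINATE",   # "NEVER", "ALWAYS", or "TERMINATE"
    max_consecutive_auto_reply=10,  # cap on consecutive auto replies
    code_execution_config={"work_dir": "coding", "use_docker": False},
    is_termination_msg=lambda msg: (msg.get("content") or "").rstrip().endswith("TERMINATE"),
)
```
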

## Multi-agent Conversations

### Basic Example

Example usage of the agents to solve a task with code:
```python
from autogen import AssistantAgent, UserProxyAgent
from flaml.autogen import AssistantAgent, UserProxyAgent

# create an AssistantAgent instance named "assistant"
assistant = AssistantAgent(name="assistant")
@@ -33,7 +31,14 @@ user_proxy = UserProxyAgent(
name="user_proxy",
human_input_mode="NEVER", # in this mode, the agent will never solicit human input but always auto reply
)
```

## Multi-agent Conversations

### A Basic Two-Agent Conversation Example

Example usage of the agents to solve a task with code:
```python
# the assistant receives a message from the user, which contains the task description
user.initiate_chat(
assistant,
@@ -49,19 +54,16 @@ In the example above, we create an AssistantAgent named "assistant" to serve as
Please find a visual illustration of how UserProxyAgent and AssistantAgent collaboratively solve the above task below:
![Agent Chat Example](images/agent_example.png)

### Human Input Mode
### Customizable Conversation Pattern

The `human_input_mode` parameter of `UserProxyAgent` controls the behavior of the agent when it receives a message. It can be set to `"NEVER"`, `"ALWAYS"`, or `"TERMINATE"`.
- Under the mode `human_input_mode="NEVER"`, the multi-turn conversation between the assistant and the user_proxy stops when the number of auto replies reaches the upper limit specified by `max_consecutive_auto_reply` or the received message is a termination message according to `is_termination_msg`.
- When `human_input_mode` is set to `"ALWAYS"`, the user proxy agent solicits human input every time a message is received; and the conversation stops when the human input is "exit", or when the received message is a termination message and no human input is provided.
- When `human_input_mode` is set to `"TERMINATE"`, the user proxy agent solicits human input only when a termination message is received or the number of auto replies reaches `max_consecutive_auto_reply`.
### Dynamic Multi-Agent Conversation

### Function Calling
To leverage [function calling capability of OpenAI's Chat Completions API](https://openai.com/blog/function-calling-and-other-api-updates?ref=upstract.com), one can pass in a list of callable functions or class methods to `UserProxyAgent`, which corresponds to the description of functions passed to OpenAI's API.
Leveraging [function calling capability of OpenAI's Chat Completions API](https://openai.com/blog/function-calling-and-other-api-updates?ref=upstract.com), one can pass in a list of callable functions or class methods to `UserProxyAgent`, which corresponds to the description of functions passed to OpenAI's API.

Example usage of the agents to solve a task with function calling feature:
```python
from autogen import AssistantAgent, UserProxyAgent
from flaml.autogen import AssistantAgent, UserProxyAgent

# put the descriptions of functions in config to be passed to OpenAI's API
llm_config = {
@@ -138,18 +140,17 @@ user_proxy.initiate_chat(
)
```
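
For reference, a hedged sketch of the shape the `functions` entry in `llm_config` typically takes; the function name, description and parameters below are made up for illustration and follow OpenAI's published function-calling schema:

```python
# Illustrative only: a "functions" list following OpenAI's function-calling schema.
llm_config = {
    "functions": [
        {
            "name": "python",  # hypothetical tool name
            "description": "Run a Python code cell and return its output.",
            "parameters": {
                "type": "object",
                "properties": {
                    "cell": {
                        "type": "string",
                        "description": "Valid Python code to execute.",
                    }
                },
                "required": ["cell"],
            },
        }
    ],
}

# The callable implementations are then registered on the user proxy
# (typically via a function map) so it can execute a function when the
# assistant's reply requests a function call, as in the example above.
```
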

### Notebook Examples
### Diverse Applications Implemented with AutoGen

*Interested in trying it yourself? Please check the following notebook examples:*
* [Automated Task Solving with Code Generation, Execution & Debugging](https://github.com/microsoft/autogen/blob/main/notebook/autogen_agentchat_auto_feedback_from_code_execution.ipynb)
* [Auto Code Generation, Execution, Debugging and Human Feedback](https://github.com/microsoft/autogen/blob/main/notebook/autogen_agentchat_human_feedback.ipynb)
* [Solve Tasks Requiring Web Info](https://github.com/microsoft/autogen/blob/main/notebook/autogen_agentchat_web_info.ipynb)
* [Use Provided Tools as Functions](https://github.com/microsoft/autogen/blob/main/notebook/autogen_agentchat_function_call.ipynb)
* [Automated Task Solving with Coding & Planning Agents](https://github.com/microsoft/autogen/blob/main/notebook/autogen_agentchat_planning.ipynb)
* [Automated Task Solving with GPT-4 + Multiple Human Users](https://github.com/microsoft/autogen/blob/main/notebook/autogen_agentchat_two_users.ipynb)
* [Automated Chess Game Playing & Chitchatting by GPT-4 Agents](https://github.com/microsoft/autogen/blob/main/notebook/autogen_agentchat_chess.ipynb)
* [Automated Task Solving by Group Chat](https://github.com/microsoft/autogen/blob/main/notebook/autogen_agentchat_groupchat.ipynb)
* [Automated Continual Learning from New Data](https://github.com/microsoft/autogen/blob/main/notebook/autogen_agentchat_stream.ipynb)
* [Automated Task Solving with Code Generation, Execution & Debugging](https://github.com/microsoft/flaml/blob/main/notebook/autogen_agentchat_auto_feedback_from_code_execution.ipynb)
* [Auto Code Generation, Execution, Debugging and Human Feedback](https://github.com/microsoft/flaml/blob/main/notebook/autogen_agentchat_human_feedback.ipynb)
* [Solve Tasks Requiring Web Info](https://github.com/microsoft/flaml/blob/main/notebook/autogen_agentchat_web_info.ipynb)
* [Use Provided Tools as Functions](https://github.com/microsoft/flaml/blob/main/notebook/autogen_agentchat_function_call.ipynb)
* [Automated Task Solving with Coding & Planning Agents](https://github.com/microsoft/flaml/blob/main/notebook/autogen_agentchat_planning.ipynb)
* [Automated Task Solving with GPT-4 + Multiple Human Users](https://github.com/microsoft/flaml/blob/main/notebook/autogen_agentchat_two_users.ipynb)
* [Automated Chess Game Playing & Chitchatting by GPT-4 Agents](https://github.com/microsoft/flaml/blob/main/notebook/autogen_agentchat_chess.ipynb)
* [Automated Task Solving by Group Chat](https://github.com/microsoft/flaml/blob/main/notebook/autogen_agentchat_groupchat.ipynb)
* [Automated Continual Learning from New Data](https://github.com/microsoft/flaml/blob/main/notebook/autogen_agentchat_stream.ipynb)



@@ -158,4 +159,4 @@

*Interested in the research that leads to this package? Please check the following papers.*

* [AutoGen: Enabling Next-Gen LLM Applications via Multi-Agent Conversation Framework](https://arxiv.org/abs/2308.08155) Qingyun Wu, Gagan Bansal, Jieyu Zhang, Yiran Wu, Shaokun Zhang, Erkang Zhu, Beibin Li, Li Jiang, Xiaoyun Zhang and Chi Wang. ArXiv 2023.
* [AutoGen: Enabling Next-Gen LLM Applications via Multi-Agent Conversation Framework](https://arxiv.org/abs/2308.08155). Qingyun Wu, Gagan Bansal, Jieyu Zhang, Yiran Wu, Shaokun Zhang, Erkang Zhu, Beibin Li, Li Jiang, Xiaoyun Zhang and Chi Wang. ArXiv 2023.
4 changes: 2 additions & 2 deletions website/docs/Use-Cases/enhanced_inference.md
@@ -6,8 +6,8 @@ There are a number of benefits of using `autogen` to perform inference: performa
## Tune Inference Parameters

*Links to notebook examples:*
* [Optimize for Code Generation](https://github.com/microsoft/autogen/blob/main/notebook/autogen_openai_completion.ipynb)
* [Optimize for Math](https://github.com/microsoft/autogen/blob/main/notebook/autogen_chatgpt_gpt4.ipynb)
* [Optimize for Code Generation](https://github.com/microsoft/flaml/blob/main/notebook/autogen_openai_completion.ipynb)
* [Optimize for Math](https://github.com/microsoft/flaml/blob/main/notebook/autogen_chatgpt_gpt4.ipynb)
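
A rough sketch of what a tuning call can look like with `flaml.autogen`, assuming the `Completion.tune` API exercised in the notebooks above; the data fields, metric and budget values are placeholders:

```python
from flaml import autogen

# Placeholder tuning data; see the linked notebooks for realistic setups.
tune_data = [{"problem": "...", "solution": "..."}]  # hypothetical fields

def success_metrics(responses, problem, solution):
    # Hypothetical evaluation function: score the candidate responses for one instance.
    return {"success": any(solution in r for r in responses)}

config, analysis = autogen.Completion.tune(
    data=tune_data,
    metric="success",
    mode="max",
    eval_func=success_metrics,
    inference_budget=0.05,   # rough per-instance inference budget (USD)
    optimization_budget=1,   # rough total tuning budget (USD)
    num_samples=-1,          # -1: no explicit limit on configurations tried
)
```
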

### Choices to optimize

16 changes: 8 additions & 8 deletions website/docusaurus.config.js
@@ -26,15 +26,15 @@ module.exports = {
position: 'left',
label: 'Docs',
},
{to: 'blog', label: 'Blog', position: 'left'},
{
type: 'doc',
docId: 'FAQ',
position: 'left',
label: 'FAQ',
},
// {to: 'blog', label: 'Blog', position: 'left'},
// {
// type: 'doc',
// docId: 'FAQ',
// position: 'left',
// label: 'FAQ',
// },
{
href: 'https://github.com/microsoft/AutoGen',
href: 'https://github.com/microsoft/FLAML',
label: 'GitHub',
position: 'right',
},
6 changes: 3 additions & 3 deletions website/src/components/HomepageFeatures.js
@@ -7,7 +7,7 @@ const FeatureList = [
{
title: 'Customizable and Conversable Agents',
Svg: require('../../static/img/auto.svg').default,
docLink: './docs/getting-started',
docLink: './docs/Use-Cases/agent_chat#agents',
description: (
<>
AutoGen provides customizable and conversable agents that can be backed by
@@ -18,7 +18,7 @@ const FeatureList = [
{
title: 'Flexible Multi-Conversation Patterns',
Svg: require('../../static/img/extend.svg').default,
docLink: './docs/getting-started',
docLink: './docs/Use-Cases/agent_chat#multi-agent-conversations',
description: (
<>
AutoGen supports flexible conversation patterns for realizing complex and dynamic workflows.
@@ -28,7 +28,7 @@ const FeatureList = [
{
title: 'Diverse Applications',
Svg: require('../../static/img/fast.svg').default,
docLink: './docs/getting-started',
docLink: './docs/Use-Cases/agent_chat#notebook-examples',
description: (
<>
AutoGen offers a collection of working systems spanning a wide range of applications from various domains and complexities.
