
Commit

initial commit
lbeurerkellner committed Mar 9, 2023
1 parent eaa25c7 commit 6302cd7
Showing 10 changed files with 150 additions and 0 deletions.
48 changes: 48 additions & 0 deletions README.md
@@ -0,0 +1,48 @@
<div align="center">
<a href="https://lmql.ai">
<img src="https://raw.githubusercontent.com/eth-sri/lmql/web/lmql.svg" alt="Logo" width="80" height="80">
</a>

<h3 align="center">LMQL</h3>

<p align="center">
A query language for programming (large) language models.
<br />
<a href="https://arxiv.org/pdf/2212.06094"><strong>Read The Paper »</strong></a>
<br />
<br />
<a href="https://lmql.ai">Explore Examples</a>
·
<a href="https://lmql.ai/playground">Playground IDE</a>
·
<a href="https://github.com/eth-sri/lmql/issues">Report Bug</a>
<br/>
<br/>
<i>Full Code Release Coming Soon</i>
</p>
</div>

LMQL is a query language for large language models (LLMs). It facilitates LLM interaction by combining the benefits of natural language prompting with the expressiveness of Python. With only a few lines of LMQL code, users can express advanced, multi-part and tool-augmented LM queries, which are then optimized by the LMQL runtime to run efficiently as part of the LM decoding loop. To illustrate, consider the following LMQL program:

![lmql-overview](https://user-images.githubusercontent.com/17903049/222918379-84a00b9a-1ef0-45bf-9384-15a20f2874f0.png)
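
As a rough, text-only sketch of what such a query can look like (the overall syntax follows the LMQL paper; the model identifier and the concrete constraints are illustrative assumptions, not the exact program pictured above):

```
argmax
   "Q: What is the capital of France?\n"
   "A: [ANSWER]"
from
   "openai/text-davinci-003"
where
   STOPS_AT(ANSWER, "\n") and len(ANSWER) < 40
```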

<p align="center">
<a href="https://lmql.ai">Explore More Examples »</a>
</p>

## About LMQL

Large language models have demonstrated outstanding performance on a wide range of tasks such as question answering and code generation. At a high level, given an input, a language model can automatically complete the sequence in a statistically likely way. Building on this, users prompt these models with language instructions or examples to implement a variety of downstream tasks. Advanced prompting methods can even involve interaction between the language model, a user, and external tools such as calculators. However, obtaining state-of-the-art performance or adapting language models to specific tasks requires implementing complex, task- and model-specific programs, which may still involve ad-hoc interaction.

To address this, we present the novel idea of Language Model Programming (LMP). LMP generalizes language model prompting from pure text prompts to an intuitive combination of text prompting and scripting. Additionally, LMP allows constraints to be specified over the language model output. This enables easy adaptation to many tasks, while abstracting away language model internals and providing high-level semantics.
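
For illustration, the following sketch combines a scripted prompt (a Python-style loop) with a constraint on the model output, in the spirit of the examples from the LMQL paper; the decoder keyword, the model identifier and the concrete constraint are assumptions for illustration rather than material from this README:

```
sample
   "A list of things not to forget when going to the beach:\n"
   for i in range(4):
      "- [THING]\n"
from
   "openai/text-davinci-003"
where
   THING in ["sunscreen", "a towel", "sunglasses", "water"]
```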

To enable LMP, we implement LMQL (short for Language Model Query Language), which leverages the constraints and control flow from an LMP prompt to generate an efficient inference procedure that minimizes the number of expensive calls to the underlying language model.

We show that LMQL can capture a wide range of state-of-the-art prompting methods in an intuitive way, especially facilitating interactive flows that are challenging to implement with existing high-level APIs. Our evaluation shows that LMQL retains or increases accuracy on several downstream tasks, while significantly reducing the amount of computation required or, in the case of pay-to-use APIs, the cost.

### Code Release and Stability

We plan to release the LMQL source code soon, together with a packaged release on PyPI. Until then, feel free to experiment with LMQL in the web-based <a href="https://lmql.ai/playground">Playground IDE</a>, which includes the fully featured LMQL runtime and compiler.

The current version of LMQL should be considered an alpha release. Please report bugs and feature requests as GitHub Issues.

Binary file added dist/lmql-0.0.2.1-py3-none-any.whl
Binary file not shown.
Binary file added dist/lmql-0.0.2.1.tar.gz
Binary file not shown.
6 changes: 6 additions & 0 deletions pyproject.toml
@@ -0,0 +1,6 @@
[build-system]
requires = [
    "setuptools>=42",
    "wheel"
]
build-backend = "setuptools.build_meta"
23 changes: 23 additions & 0 deletions setup.cfg
@@ -0,0 +1,23 @@
[metadata]
name = lmql
version = 0.0.2.1
author = Luca Beurer-Kellner
author_email = [email protected]
description = A query language for language models.
long_description = file: README.md
long_description_content_type = text/markdown
url = https://lmql.ai
project_urls =
    Docs = https://docs.lmql.ai
classifiers =
    Programming Language :: Python :: 3
    Operating System :: OS Independent

[options]
package_dir =
    = src
packages = find:
python_requires = >=3.9

[options.packages.find]
where = src
61 changes: 61 additions & 0 deletions src/lmql.egg-info/PKG-INFO
@@ -0,0 +1,61 @@
Metadata-Version: 2.1
Name: lmql
Version: 0.0.2.1
Summary: A query language for language models.
Home-page: https://lmql.ai
Author: Luca Beurer-Kellner
Author-email: [email protected]
Project-URL: Docs, https://docs.lmql.ai
Classifier: Programming Language :: Python :: 3
Classifier: Operating System :: OS Independent
Requires-Python: >=3.9
Description-Content-Type: text/markdown

<div align="center">
<a href="https://lmql.ai">
<img src="https://raw.githubusercontent.com/eth-sri/lmql/web/lmql.svg" alt="Logo" width="80" height="80">
</a>

<h3 align="center">LMQL</h3>

<p align="center">
A query language for programming (large) language models.
<br />
<a href="https://arxiv.org/pdf/2212.06094"><strong>Read The Paper »</strong></a>
<br />
<br />
<a href="https://lmql.ai">Explore Examples</a>
·
<a href="https://lmql.ai/playground">Playground IDE</a>
·
<a href="https://github.com/eth-sri/lmql/issues">Report Bug</a>
<br/>
<br/>
<i>Full Code Release Coming Soon</i>
</p>
</div>

LMQL is a query language for large language models (LLMs). It facilitates LLM interaction by combining the benefits of natural language prompting with the expressiveness of Python. With only a few lines of LMQL code, users can express advanced, multi-part and tool-augmented LM queries, which are then optimized by the LMQL runtime to run efficiently as part of the LM decoding loop. To illustrate, consider the following LMQL program:

![lmql-overview](https://user-images.githubusercontent.com/17903049/222918379-84a00b9a-1ef0-45bf-9384-15a20f2874f0.png)

<p align="center">
<a href="https://lmql.ai">Explore More Examples »</a>
</p>

## About LMQL

Large language models have demonstrated outstanding performance on a wide range of tasks such as question answering and code generation. At a high level, given an input, a language model can automatically complete the sequence in a statistically likely way. Building on this, users prompt these models with language instructions or examples to implement a variety of downstream tasks. Advanced prompting methods can even involve interaction between the language model, a user, and external tools such as calculators. However, obtaining state-of-the-art performance or adapting language models to specific tasks requires implementing complex, task- and model-specific programs, which may still involve ad-hoc interaction.

To address this, we present the novel idea of Language Model Programming (LMP). LMP generalizes language model prompting from pure text prompts to an intuitive combination of text prompting and scripting. Additionally, LMP allows constraints to be specified over the language model output. This enables easy adaptation to many tasks, while abstracting away language model internals and providing high-level semantics.

To enable LMP, we implement LMQL (short for Language Model Query Language), which leverages the constraints and control flow from an LMP prompt to generate an efficient inference procedure that minimizes the number of expensive calls to the underlying language model.

We show that LMQL can capture a wide range of state-of-the-art prompting methods in an intuitive way, especially facilitating interactive flows that are challenging to implement with existing high-level APIs. Our evaluation shows that LMQL retains or increases accuracy on several downstream tasks, while significantly reducing the amount of computation required or, in the case of pay-to-use APIs, the cost.

### Code Release and Stability

We plan to release the LMQL source code soon, together with a packaged release on PyPI. Until then, feel free to experiment with LMQL in the web-based <a href="https://lmql.ai/playground">Playground IDE</a>, which includes the fully featured LMQL runtime and compiler.

The current version of LMQL should be considered an alpha release. Please report bugs and feature requests as GitHub Issues.

8 changes: 8 additions & 0 deletions src/lmql.egg-info/SOURCES.txt
@@ -0,0 +1,8 @@
README.md
pyproject.toml
setup.cfg
src/lmql/__init__.py
src/lmql.egg-info/PKG-INFO
src/lmql.egg-info/SOURCES.txt
src/lmql.egg-info/dependency_links.txt
src/lmql.egg-info/top_level.txt
1 change: 1 addition & 0 deletions src/lmql.egg-info/dependency_links.txt
@@ -0,0 +1 @@

1 change: 1 addition & 0 deletions src/lmql.egg-info/top_level.txt
@@ -0,0 +1 @@
lmql
2 changes: 2 additions & 0 deletions src/lmql/__init__.py
@@ -0,0 +1,2 @@
def run():
    print("This is the LMQL placeholder package.")
