The Memory Layer for Personalized AI.
Mem0 enhances AI agents and Large Language Models (LLMs) with an intelligent memory layer. By retaining and utilizing contextual information, Mem0 enables more personalized and effective AI interactions across various applications. Whether you're building customer support chatbots, AI assistants, or autonomous systems, Mem0 helps your AI remember user preferences, adapt to individual needs, and continuously improve over time.
Use cases enabled by Mem0 include:
- Personalized Learning Assistants: Enhance learning experiences with tailored content recommendations and progress tracking.
- Customer Support AI Agents: Provide context-aware assistance by remembering past interactions and user preferences.
- Healthcare Assistants: Keep track of patient history, treatment plans, and medication schedules for personalized care.
- Virtual Companions: Build deeper relationships with users by remembering personal details and past conversations.
- Productivity Tools: Streamline workflows by remembering user habits, frequently used documents, and task history.
- Gaming AI: Create immersive gaming experiences by adapting game environments based on player choices and progress.
The simplest way to set up Mem0 is to create a managed deployment with Mem0 Cloud. This hosted solution offers a hassle-free experience with automatic updates, advanced analytics, and dedicated support. Sign up for Mem0 Cloud to get started.
If you prefer to install and manage Mem0 yourself, you can use the open-source Mem0 package. Read the manual installation instructions below to get started with Mem0 on your machine.
The Mem0 package can be installed with pip from the terminal:
pip install mem0ai
Alternatively, you can get started with Mem0 in one click using the hosted platform.
Mem0 supports a variety of LLMs, with details available in our Supported LLMs documentation. By default, Mem0 uses gpt-4o. To use it, simply set your OpenAI API key in the environment variables.
import os
os.environ["OPENAI_API_KEY"] = "sk-xxx"
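If you prefer not to hard-code the key, you can read it from the existing environment and fail fast when it is missing. This is a small defensive sketch (the helper name is our own, not part of Mem0); `OPENAI_API_KEY` is the variable the default OpenAI backend expects:

```python
import os

def require_api_key(name: str = "OPENAI_API_KEY") -> str:
    """Return the API key from the environment, or raise a clear error."""
    key = os.environ.get(name)
    if not key:
        raise RuntimeError(f"{name} is not set; export it before initializing Memory.")
    return key
```

Failing early with a descriptive message is easier to debug than an authentication error surfacing later from deep inside an LLM call.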
Now, you can simply initialize the memory.
from mem0 import Memory
m = Memory()
You can perform the following operations on the memory:
- Add: store a new memory
- Update: update the memory with a given memory_id
- Search: fetch memories relevant to a query
- Get: return memories for a given user/agent/session
- History: list how a memory has changed over time for a specific memory_id
# 1. Add: Store a memory from any unstructured text
result = m.add("I am working on improving my tennis skills. Suggest some online courses.", user_id="alice", metadata={"category": "hobbies"})
# Created memory --> 'Improving her tennis skills.' and 'Looking for online suggestions.'
# 2. Update: update the memory
result = m.update(memory_id=<memory_id_1>, data="Likes to play tennis on weekends")
# Updated memory --> 'Likes to play tennis on weekends.' and 'Looking for online suggestions.'
# 3. Search: search related memories
related_memories = m.search(query="What are Alice's hobbies?", user_id="alice")
# Retrieved memory --> 'Likes to play tennis on weekends'
# 4. Get all memories
all_memories = m.get_all()
memory_id = all_memories[0]["id"] # get a memory_id
# All memory items --> 'Likes to play tennis on weekends.' and 'Looking for online suggestions.'
# 5. Get memory history for a particular memory_id
history = m.history(memory_id=<memory_id_1>)
# Logs corresponding to memory_id_1 --> {'prev_value': 'Working on improving tennis skills and interested in online courses for tennis.', 'new_value': 'Likes to play tennis on weekends' }
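To make the semantics of the five operations concrete without needing an API key or a running backend, here is a self-contained toy store that mimics their shape. This is purely illustrative (naive keyword matching instead of Mem0's vector similarity, and `ToyMemory` is our own name, not a Mem0 class):

```python
import uuid
from typing import Any

class ToyMemory:
    """Illustrative stand-in for Mem0's operations (not the real implementation)."""

    def __init__(self) -> None:
        self._store: dict[str, dict[str, Any]] = {}
        self._history: dict[str, list[dict[str, str]]] = {}

    def add(self, text: str, user_id: str) -> str:
        memory_id = str(uuid.uuid4())
        self._store[memory_id] = {"id": memory_id, "text": text, "user_id": user_id}
        self._history[memory_id] = [{"prev_value": "", "new_value": text}]
        return memory_id

    def update(self, memory_id: str, data: str) -> None:
        prev = self._store[memory_id]["text"]
        self._store[memory_id]["text"] = data
        self._history[memory_id].append({"prev_value": prev, "new_value": data})

    def search(self, query: str, user_id: str) -> list[dict[str, Any]]:
        # Naive keyword overlap; Mem0 uses embedding similarity instead.
        words = {w.lower().strip("?") for w in query.split()}
        return [m for m in self._store.values()
                if m["user_id"] == user_id
                and words & {w.lower() for w in m["text"].split()}]

    def get_all(self) -> list[dict[str, Any]]:
        return list(self._store.values())

    def history(self, memory_id: str) -> list[dict[str, str]]:
        return self._history[memory_id]
```

Note how `history` returns a list of `prev_value`/`new_value` records, matching the log shape shown above.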
Tip
If you are looking for a hosted version and don't want to set up the infrastructure yourself, check out Mem0 Cloud to get started in minutes.
- Multi-Level Memory: User, Session, and AI Agent memory retention
- Adaptive Personalization: Continuous improvement based on interactions
- Developer-Friendly API: Simple integration into various applications
- Cross-Platform Consistency: Uniform behavior across devices
- Managed Service: Hassle-free hosted solution
For detailed usage instructions and API reference, visit our documentation at docs.mem0.ai.
For production environments, you can use Qdrant as a vector store:
from mem0 import Memory
config = {
    "vector_store": {
        "provider": "qdrant",
        "config": {
            "host": "localhost",
            "port": 6333,
        }
    },
}
m = Memory.from_config(config)
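The same `from_config` pattern covers other settings as well; for example, a config can also include an `llm` section to select a different model. The sketch below follows this schema, but treat the exact provider and model names as assumptions to verify against the Supported LLMs documentation:

```python
# Sketch of a combined config; verify provider/model names against the docs.
config = {
    "llm": {
        "provider": "openai",
        "config": {
            "model": "gpt-4o",
            "temperature": 0.0,  # deterministic extraction of memories
        },
    },
    "vector_store": {
        "provider": "qdrant",
        "config": {"host": "localhost", "port": 6333},
    },
}
# m = Memory.from_config(config)  # requires a running Qdrant and an API key
```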
- Integration with various LLM providers
- Support for LLM frameworks
- Integration with AI Agents frameworks
- Customizable memory creation/update rules
Join our Slack or Discord community for support and discussions. If you have any questions, feel free to reach out to us there.
We value and appreciate the contributions of our community. Special thanks to our contributors for helping us improve Mem0.
This project is licensed under the Apache 2.0 License - see the LICENSE file for details.