Project Miyagi - Envisioning sample for Copilot stack
"Start with the customer experience and work backwards to the technology." - Steve Jobs
"Change is the only constant." - Ancient wisdom
Project Miyagi serves as the foundation for an envisioning workshop that reimagines the design, development, and deployment of intelligent applications using Microsoft's Copilot stack. It demonstrates that integrating intelligence transcends a simple chat interface and permeates every aspect of your product experience, utilizing semantic data to generate personalized interactions and effectively address individual needs. Through a comprehensive exploration of generative and discriminative use cases, Miyagi offers hands-on experience with cutting-edge programming paradigms that harness the power of foundation models in every workflow. Additionally, it introduces traditional software engineers to emerging design patterns in prompt engineering (chain-of-thought, few-shot, retrieval-augmentation), vectorization for long-term memory, and tools or affordances to augment and ground LLMs.
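To make the few-shot pattern mentioned above concrete, here is a minimal, dependency-free sketch of assembling a few-shot prompt. The task, headlines, and labels are illustrative stand-ins, not examples from the Miyagi codebase:

```python
# Few-shot prompt construction for a toy sentiment classifier.
# The examples below are hypothetical and only demonstrate the pattern.

FEW_SHOT_EXAMPLES = [
    ("Tech stocks rallied after strong earnings.", "positive"),
    ("The fund underperformed its benchmark again.", "negative"),
]

def build_few_shot_prompt(query: str) -> str:
    """Assemble instruction + labeled examples + the new, unlabeled query."""
    lines = ["Classify the sentiment of each headline as positive or negative.", ""]
    for text, label in FEW_SHOT_EXAMPLES:
        lines.append(f"Headline: {text}")
        lines.append(f"Sentiment: {label}")
        lines.append("")
    # The model completes the final, unlabeled "Sentiment:" line.
    lines.append(f"Headline: {query}")
    lines.append("Sentiment:")
    return "\n".join(lines)

prompt = build_few_shot_prompt("Markets steadied as inflation cooled.")
```

The resulting string would be sent as-is to a completion endpoint; the labeled pairs steer the model toward the desired output format without any fine-tuning.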
Note
Work in progress. Meanwhile, sign up at intelligentapp.dev for updates and check out our related repo that showcases Generative AI capabilities for cloud-native, event-driven microservices: Azure/reddog-solutions. For a preview, catch the recording on Cosmos DB Live TV.
The project includes examples of usage for popular frameworks and orchestrators such as Semantic Kernel, Guidance, Promptflow, LlamaIndex, LangChain, vector stores (Azure Cognitive Search, CosmosDB Postgres pgvector, and Qdrant), and generative image utilities such as DreamFusion and ControlNet. Additionally, it features fine-tuned foundation models from AzureML. Use this project to gain insights as you modernize and transform your applications with AI and fine-tune your private data to build your own Copilot.
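The vector stores listed above all ingest fixed-size text chunks before embedding. A hedged, plain-Python sketch of a simple overlapping-window chunker follows; the window and overlap sizes are arbitrary illustrative choices, not Miyagi defaults:

```python
def chunk_text(text: str, size: int = 200, overlap: int = 50) -> list[str]:
    """Split text into overlapping character windows for embedding.

    Overlap keeps a sentence that straddles a boundary visible in
    both neighboring chunks, so retrieval does not lose it.
    """
    if overlap >= size:
        raise ValueError("overlap must be smaller than size")
    chunks = []
    start = 0
    while start < len(text):
        chunks.append(text[start:start + size])
        start += size - overlap
    return chunks

doc = "word " * 100  # 500 characters of toy text
chunks = chunk_text(doc)
```

Each chunk would then be embedded (e.g., with text-embedding-ada-002) and upserted into one of the stores above; production chunkers usually split on token or sentence boundaries rather than raw characters.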
Embedded with intelligence and built on a scalable event-driven architecture, Project Miyagi emphasizes customer-centricity. It challenges you to rethink how AI can curate and create hyper-personalized user interactions, whether in a line-of-business (LOB) or consumer-facing app. It features an easy-to-understand use case that offers concrete examples of how the capabilities in these new AI platforms and architectures, supported by Azure, can be leveraged for valuable insights.
This polyglot codebase relies on a multitude of microservices, implementing several use cases using our Copilot stack. It includes generative text and images for personalized financial coaching, summarization, and agent-like orchestration. Built on a cloud-native EDA backbone, the architecture and codebase ensure enterprise-grade quality attributes such as availability, scalability, and maintainability.
Embark on a journey to transform your applications into cutting-edge, intelligent systems with the self-guided workshop and discover the art of the possible.
Due to the rapid pace of advancements in foundation models, we are incrementally implementing use cases for Miyagi in the experiments folder. So far, we have the following implemented:
- MVP with Personalize (Synthesis via Semantic Kernel) and Chat on Azure Container Apps.
- Miyagi ChatGPT Plugin
- Knowledge Graph memory using Langchain's entity cache
- Qdrant vector store for embeddings via Langchain
- MS Graph API intent invoked via Semantic Kernel's skills
- Miyagi prompt engineered chat interaction using LangChain's PromptTemplate
- Azure OpenAI GPT-3.5 basic flow
- GPT-3.5-turbo and Whisper-1 usage to transcribe audio and demonstrate a few-shot example
- DeepSpeed Chat MiyagiGPT (BYO Weights w/ RLHF - Reinforcement Learning from Human Feedback) - coming soon
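The retrieval-augmentation pattern behind the vector-store items above reduces to: embed the query, rank stored vectors by cosine similarity, and prepend the top hits to the prompt. A dependency-free sketch with toy, hand-made 3-dimensional embeddings (a real deployment would use text-embedding-ada-002 vectors and a store such as Qdrant):

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Toy long-term "memory": (snippet, embedding) pairs with made-up vectors.
MEMORY = [
    ("User prefers low-risk index funds.", [0.9, 0.1, 0.0]),
    ("User asked about crypto last week.", [0.1, 0.9, 0.1]),
    ("User retires in 2040.",              [0.2, 0.2, 0.9]),
]

def retrieve(query_vec, k=2):
    """Return the k memory snippets most similar to the query embedding."""
    ranked = sorted(MEMORY, key=lambda item: cosine(query_vec, item[1]), reverse=True)
    return [text for text, _ in ranked[:k]]

# A query embedding deliberately close to the "low-risk" memory.
context = retrieve([0.8, 0.2, 0.1])
prompt = "Context:\n" + "\n".join(context) + "\n\nAnswer the user's question."
```

Grounding the model with retrieved context like this is what lets the Miyagi use cases personalize responses without fine-tuning on user data.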
Interaction with foundation models goes beyond chat; this sample demonstrates several use cases.
This will be similar to the reddog product image-generation use case.
- Azure OpenAI
  - gpt-4
  - gpt-35-turbo
  - text-embedding-ada-002
- Semantic Kernel
- Use your own data with Azure OpenAI
- AzureML PromptFlow
- TypeChat
- Azure Functions
- APIM
- Service Bus
- Event Grid
- Logic Apps
- AKS / ACA
- Cosmos DB
- GitHub Actions
- Azure Monitor
- Azure DB for PostgreSQL
- Azure Cache for Redis
- Azure Storage
- Apache Kafka on Azure Event Hubs
- Azure HuggingFace Inference Endpoints
- LangChain
- Foundation models from Azure Cognitive Services
- Qdrant
- Microsoft DeepSpeed Chat
- Azure Web PubSub
- Azure Communication Services (ACS)
This project welcomes contributions and suggestions. Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit https://cla.opensource.microsoft.com.
When you submit a pull request, a CLA bot will automatically determine whether you need to provide a CLA and decorate the PR appropriately (e.g., status check, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repos using our CLA.
This project has adopted the Microsoft Open Source Code of Conduct. For more information, see the Code of Conduct FAQ or contact [email protected] with any additional questions or comments.