Introducing PriomptiPy: A Python-Based Library for Token Budgeting and Dynamic Prompt Rendering for LLMs

The Quarkle development team has released “PriomptiPy”, a Python adaptation of Cursor’s Priompt library, marking a significant step in the development of Python-based conversational AI. The port gives Python developers access to Priompt’s priority-based approach to prompt construction for large language model (LLM) applications.

PriomptiPy, a blend of the words ‘priority’, ‘prompt’, and ‘Python’, is a prompting library designed to streamline token budgeting in AI systems, which often struggle to handle extensive context. The library helps manage conversations that combine book summaries, instructions, conversation history, and more, which can otherwise grow to 8-10K tokens.

The development of PriomptiPy began when Quarkle found that their WebSocket backend, which runs in Python, prevented them from using the Priompt library directly. Consequently, they ported Priompt to Python so it would integrate smoothly with their existing framework.

Although PriomptiPy is not yet as exhaustive as Priompt, it is a promising start for developers keen to incorporate priority-based context management in their Python applications, aiding the development of AI-enabled agents and chatbots.

To demonstrate how PriomptiPy is used, the Quarkle team provides an example of a managed conversation built with the library. The example showcases message types such as SystemMessage, UserMessage, and AssistantMessage, along with “Scope” for prioritization. The library renders content in priority order and dynamically manages the conversation flow, which is critical when working within a limited token budget.
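A rough sketch of such a prioritized conversation is shown below. The component names (SystemMessage, UserMessage, AssistantMessage, Scope) are taken from the announcement, but the import path, the absolute_priority keyword, and the render() signature are assumptions rather than PriomptiPy’s documented API.

```python
# Illustrative sketch only: the component names come from the announcement,
# but the import path, the absolute_priority keyword, and the render()
# signature are assumptions, not PriomptiPy's documented API.
from priomptipy import AssistantMessage, Scope, SystemMessage, UserMessage, render

conversation = [
    # System instructions get the highest priority so they always survive.
    Scope(
        [SystemMessage("You are a helpful book-writing assistant.")],
        absolute_priority=1000,
    ),
    # Older turns sit in a lower-priority scope; they are the first content
    # to be dropped when the token budget runs out.
    Scope(
        [
            UserMessage("Summarize what happened in chapter one."),
            AssistantMessage("Chapter one introduces the narrator and ..."),
        ],
        absolute_priority=100,
    ),
    # The latest user message is nearly as important as the system prompt.
    Scope(
        [UserMessage("Now outline chapter two in five bullet points.")],
        absolute_priority=900,
    ),
]

# render() is expected to include scopes in priority order until the token
# budget is exhausted (the token_limit argument name is also an assumption).
rendered_prompt = render(conversation, token_limit=4096)
```

Because rendering is priority-driven rather than strictly chronological, the older low-priority turns are the first to be trimmed, while the system instructions and the latest user request survive almost any budget.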

PriomptiPy introduces logical components like Scope, Empty, Isolate, First, Capture, SystemMessage, UserMessage, AssistantMessage, and Function, each serving a specific role in constructing AI model prompts. However, the developers emphasize careful consideration of priorities for maintaining efficient and cache-friendly prompts.
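To make the priority mechanism concrete, the following self-contained toy sketch shows the underlying idea in plain Python. It is not PriomptiPy code: ToyScope, the whitespace “tokenizer”, and render_with_budget are hypothetical stand-ins that only illustrate how lower-priority scopes are dropped until the prompt fits the budget.

```python
# Toy illustration of priority-based rendering (not PriomptiPy code):
# each scope carries a priority and a token cost, and lower-priority
# scopes are dropped first until the prompt fits the token limit.
from dataclasses import dataclass

@dataclass
class ToyScope:
    priority: int   # higher = more important, dropped last
    text: str

def count_tokens(text: str) -> int:
    # Crude whitespace count standing in for a real tokenizer.
    return len(text.split())

def render_with_budget(scopes: list[ToyScope], token_limit: int) -> str:
    # Include scopes from highest to lowest priority while the budget allows.
    included: list[ToyScope] = []
    used = 0
    for scope in sorted(scopes, key=lambda s: s.priority, reverse=True):
        cost = count_tokens(scope.text)
        if used + cost <= token_limit:
            included.append(scope)
            used += cost
    # Restore the original document order for the surviving scopes.
    included.sort(key=lambda s: scopes.index(s))
    return "\n".join(s.text for s in included)

prompt = render_with_budget(
    [
        ToyScope(priority=1000, text="System: you are a helpful assistant."),
        ToyScope(priority=100, text="Summary of earlier chapters..."),
        ToyScope(priority=900, text="User: outline chapter two."),
    ],
    token_limit=12,
)
print(prompt)  # the low-priority summary is dropped to stay within budget
```

The real library operates over a richer component tree (Isolate, First, Empty, and so on) and a proper tokenizer, but the keep-highest-priority-first behaviour sketched here captures the same basic principle.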

The team notes that some features, such as runnable function calling and capture, are not yet supported in PriomptiPy but are planned for future development. Caching also presents a challenge that the team hopes to address with community input, and they invite contributions to PriomptiPy, which is open source under the MIT license.
