
Promptrefiner


Enhancing prompts with intelligent strategies for LLMs



🚀 Welcome to Promptrefiner

Helping you craft the perfect prompt for better LLM responses!

PromptRefiner is a lightweight Python library that helps you write better prompts for Large Language Models (LLMs) with minimal configuration. Crafting prompts that yield the desired results is hard: PromptRefiner takes your input, applies a selected strategy, and returns an improved prompt that draws more specific and effective responses.

It achieves this by leveraging an LLM to refine your prompt based on predefined strategies, making it easier to get high-quality responses.

Whether you're using a prompt for GPT-4, Claude, Mistral, or any other LLM, PromptRefiner ensures your input is well-structured for the best possible output.


✨ Key Features

✅ Supports 100+ LLM Clients – Works with OpenAI, Anthropic, Hugging Face, and more!
✅ Highly Customizable – Use different LLM clients per strategy or a single client for all.
✅ Command-Line First – Quickly refine prompts from the CLI for rapid experimentation.
✅ Extensible – Developers can create their own custom prompt refinement strategies.
✅ Seamless Integration – Works effortlessly in Python applications or scripts.


📥 Installation

Install PromptRefiner using pip:

pip install promptrefiner

⚡ Quick Start

🔧 Using from the Command Line

Before using promptrefiner, set the following environment variables (Windows users should use set instead of export):

export PREFINER_API_KEY="your-api-key-here"
export PREFINER_MODEL="openai/gpt-4"  # Change based on your LLM model
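
On Windows (cmd.exe), the equivalent assignments are (note: set applies to the current session only):

```shell
set PREFINER_API_KEY=your-api-key-here
set PREFINER_MODEL=openai/gpt-4
```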

Then refine a prompt directly from your terminal:

promptrefiner --strategy persona "Tell me about AI"

🐍 Using in a Python Script

Note

Make sure to set the environment variables PREFINER_API_KEY and PREFINER_MODEL before using PromptRefiner in your Python script.

from promptrefiner import PromptRefiner

prompt_refiner = PromptRefiner(strategies=["persona"])
refined_prompt = prompt_refiner.refine("Explain quantum mechanics.")
print(refined_prompt)

πŸ” How It Works

  1. The user provides a prompt (e.g., "Tell me about AI").
  2. The user selects a strategy (e.g., "verbose" for a more detailed response).
  3. PromptRefiner applies the system prompt template for that strategy.
  4. The templated prompt is sent to an LLM for refinement.
  5. The improved prompt is returned to the user.

🚀 Under the hood: Each strategy is backed by a system prompt template that guides the LLM to refine the user's input for better results.
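
The flow above can be sketched in plain Python. This is an illustrative mock, not the library's actual source: the STRATEGY_TEMPLATES dict, the template wording, and build_refinement_request are hypothetical stand-ins, and the final LLM call (step 4) is omitted.

```python
# Illustrative sketch only — NOT PromptRefiner's real implementation.
# Each strategy maps to a system prompt template that wraps the user's
# prompt before it is sent to an LLM for refinement.

STRATEGY_TEMPLATES = {
    "persona": (
        "Rewrite the following prompt so it asks the model to adopt "
        "a relevant expert persona before answering:\n\n{prompt}"
    ),
    "verbose": (
        "Rewrite the following prompt so it asks for a thorough, "
        "step-by-step answer with concrete examples:\n\n{prompt}"
    ),
}

def build_refinement_request(prompt: str, strategy: str) -> str:
    """Apply the chosen strategy's template to the user's prompt (steps 2-3)."""
    template = STRATEGY_TEMPLATES[strategy]
    return template.format(prompt=prompt)

# Steps 1-3: the templated request that would then go to the LLM.
print(build_refinement_request("Tell me about AI", "persona"))
```

In the real library, the string returned here would be sent to the configured LLM client, and the model's rewritten prompt would be handed back to the caller.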


🤔 Why Use PromptRefiner?

🔹 Improve prompt clarity & effectiveness – Get sharper, more relevant responses.
🔹 Save time – No need to manually tweak prompts for better results.
🔹 Optimized for developers & researchers – Quickly test different prompting strategies.
🔹 Fine-tune for different LLMs – Customize strategies for specific AI models.
🔹 Works for various use cases:

  • Chatbots & AI assistants
  • Content generation & summarization
  • Data extraction from LLMs
  • Code generation improvements

🚀 Join Us & Contribute!

We welcome contributors, feedback, and feature suggestions! 🚀

📌 GitHub Repo: darshit7/promptrefiner
📌 Documentation: Promptrefiner
📌 Report Issues & Ideas: Coming Soon

👥 Want to improve PromptRefiner? Open a GitHub issue or contribute a pull request! 🛠️


🚀 Refine your prompts. Supercharge your AI interactions. Try PromptRefiner today!