# ComfyUI_LiteLLM
ComfyUI_LiteLLM is an addon for ComfyUI that provides nodes for interacting with language models through the LiteLLM library, directly within the ComfyUI interface. It serves as a bridge between ComfyUI and LiteLLM, allowing you to use any model supported by LiteLLM in your ComfyUI workflows.
## Features
This addon provides ComfyUI nodes that interface with LiteLLM's functionality, including:
- `LiteLLMModelProvider`: Selects LiteLLM-supported models for use in other nodes
- `LiteLLMCompletion`: Generates completions using any LiteLLM-supported model
- `LiteLLMCompletionPrePend`: Similar to `LiteLLMCompletion` but with a pre-pended prompt
- `LiteLLMCompletionListOfPrompts`: Generates completions for a list of prompts
- `CreateReflectionFilter` and `FirstCodeBlockReflectionFilter`: Create reflection filters for use with `LiteLLMCompletionWithReflectionFilter`
- `LiteLLMCompletionWithReflectionFilter`: Generates completions with a reflection filter
- `LiteLLMCompletionProvider`: Provides a completion function for use in other nodes
- `LiteLLMMessage`: Creates LiteLLM messages
- `ShowLastMessage` and `ShowMessages`: Display message content in the UI
- `ListToMessages` and `MessagesToList`: Convert between list and message formats
- `AppendMessages`: Combines multiple message sets
- `MessagesToText` and `TextToMessages`: Convert between text and message formats
- `LLLMInterpretUrl`: Fetches and processes content from URLs
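These nodes are built on LiteLLM's message format: a list of role/content dictionaries. As a rough sketch (the model name below is a placeholder), a completion node is conceptually a thin wrapper over `litellm.completion`:

```python
# The message format used by LiteLLMMessage and the completion nodes:
# a list of {"role": ..., "content": ...} dictionaries.
messages = [
    {"role": "system", "content": "You are a helpful assistant."},  # e.g. a pre-pended prompt
    {"role": "user", "content": "Write a one-line image caption."},
]

# With the litellm package installed and the relevant API key in the
# environment (e.g. OPENAI_API_KEY), a completion node effectively runs:
#   import litellm
#   response = litellm.completion(model="gpt-4o-mini", messages=messages)
#   text = response.choices[0].message.content
```

`TextToMessages` and `MessagesToText` convert between this structure and plain text; `AppendMessages` concatenates such lists.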
## Supported Models
This addon supports all models that LiteLLM can interface with. This includes, but is not limited to:
- OpenAI models
- Anthropic models (Claude)
- Google models (PaLM, Gemini)
- Azure OpenAI
- AWS Bedrock models
- Cohere models
- AI21 models
- And many more
For the most up-to-date list of supported models, please refer to the LiteLLM documentation.
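LiteLLM routes a request to the right provider based on the model string, usually a provider prefix followed by the model name. A few illustrative identifiers (model names change over time, so check the LiteLLM docs for current ones):

```python
# Illustrative LiteLLM model identifier strings; the provider is
# selected by the prefix. Exact model names are examples only.
example_models = [
    "gpt-4o",                                # OpenAI (no prefix needed)
    "anthropic/claude-3-5-sonnet-20240620",  # Anthropic
    "gemini/gemini-1.5-pro",                 # Google
    "azure/<your-deployment-name>",          # Azure OpenAI (your deployment name)
    "bedrock/anthropic.claude-v2",           # AWS Bedrock
    "cohere/command-r",                      # Cohere
]
```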
## Installation

### Manual Installation
1. Clone this repository into the `custom_nodes` directory of your ComfyUI installation.
2. Install the required dependencies by running `pip install -r requirements.txt`.
3. Restart ComfyUI.
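The steps above look roughly like this; the ComfyUI path and the repository URL are placeholders to substitute with your own:

```shell
# Placeholder path and URL -- substitute your ComfyUI location and this repo's URL.
cd /path/to/ComfyUI/custom_nodes
git clone <this-repository-url> ComfyUI_LiteLLM
cd ComfyUI_LiteLLM
pip install -r requirements.txt
# Then restart ComfyUI.
```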
### Installation with ComfyUI-Manager
1. Add this addon's repository URL to your ComfyUI-Manager configuration.
2. Install the addon through ComfyUI-Manager.
3. Restart ComfyUI.
## Configuration
The addon reads configuration settings from `config.yaml`. You can customize the settings by modifying this file.

Please ensure that your API keys are stored in environment variables and not saved to the graph. You can, at your discretion, put them in the `config.yaml` file, but this is not recommended for security reasons.
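LiteLLM picks up provider credentials from standard environment variables; for example:

```shell
# Set only the variables for the providers you actually use.
export OPENAI_API_KEY="sk-..."          # OpenAI / Azure OpenAI (see LiteLLM docs for Azure)
export ANTHROPIC_API_KEY="sk-ant-..."   # Anthropic
export COHERE_API_KEY="..."             # Cohere
# AWS Bedrock uses the usual AWS credentials (AWS_ACCESS_KEY_ID, etc., or boto3 config).
```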
## Usage
After installation, the LiteLLM nodes will be available in the ComfyUI node editor under the "ETK/LLM/LiteLLM" category. You can use these nodes to integrate any LiteLLM-supported language model into your ComfyUI workflows.
To use a specific model, select it in the `LiteLLMModelProvider` node, and ensure you have the necessary API keys and configuration set up per LiteLLM's requirements.
## Contributing
Contributions are welcome! If you encounter any issues or have suggestions for improvements, please open an issue or submit a pull request.
## License
This addon is released under the MIT License.
## Dependencies
- pyyaml
- addict
- litellm
- boto3
For the full list of dependencies, please refer to the `requirements.txt` file.