
ComfyUI Node: Local Large Language Model (LLM_local)

Authored by heshengtao


Category

LLM Party (llm_party) / Model (model)

Inputs

system_prompt STRING

user_prompt STRING

model_type

  • GLM
  • llama
  • Qwen

model_path STRING

tokenizer_path STRING

temperature FLOAT

is_memory

  • enable
  • disable

is_tools_in_sys_prompt

  • enable
  • disable

is_locked

  • enable
  • disable

is_reload

  • enable
  • disable

main_brain

  • enable
  • disable

device

  • cuda
  • cuda-float16
  • cuda-int8
  • cuda-int4
  • cpu

max_length INT

tools STRING

file_content STRING
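
The model-related inputs (model_path, tokenizer_path, device, temperature, max_length) correspond to the usual arguments for loading and sampling from a local Hugging Face model. The snippet below is a minimal sketch of how these inputs could drive a local generation call with the transformers library; it is not the extension's actual implementation, and the mapping of the cuda-int8 / cuda-int4 options onto bitsandbytes quantization flags is an assumption.

    # Hypothetical illustration of what LLM_local does with its inputs;
    # not the extension's real code. Assumes transformers is installed
    # (plus bitsandbytes for the int8/int4 options).
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    def run_local_llm(system_prompt, user_prompt, model_path, tokenizer_path,
                      device="cuda", temperature=0.7, max_length=2048):
        tokenizer = AutoTokenizer.from_pretrained(tokenizer_path,
                                                  trust_remote_code=True)

        # Map the node's device choices onto load-time options (assumed mapping).
        load_kwargs = {"trust_remote_code": True}
        if device == "cuda-float16":
            load_kwargs.update(device_map="auto", torch_dtype=torch.float16)
        elif device == "cuda-int8":
            load_kwargs.update(device_map="auto", load_in_8bit=True)
        elif device == "cuda-int4":
            load_kwargs.update(device_map="auto", load_in_4bit=True)
        elif device == "cuda":
            load_kwargs.update(device_map="auto")
        # "cpu" falls through: the model stays on CPU in full precision.

        model = AutoModelForCausalLM.from_pretrained(model_path, **load_kwargs)

        # Combine the two prompt inputs into a single prompt string.
        prompt = f"{system_prompt}\n{user_prompt}"
        inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
        output_ids = model.generate(
            **inputs,
            max_length=max_length,
            do_sample=temperature > 0,
            temperature=max(temperature, 1e-5),
        )
        # Return only the newly generated text, without the prompt tokens.
        return tokenizer.decode(output_ids[0][inputs["input_ids"].shape[1]:],
                                skip_special_tokens=True)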

Outputs

STRING

STRING

STRING

Extension: comfyui_LLM_party

A set of block-based LLM agent node libraries designed for ComfyUI. The project aims to provide a complete set of nodes for building LLM workflows in ComfyUI, allowing users to quickly and conveniently assemble their own LLM workflows and easily integrate them into their existing SD workflows.
