
MLECO-6496: Introduce chat template functions for LLM frameworks

Yunus Kalkan requested to merge feature/chat-template into main
  • Add support for applying automatic chat templates in onnxruntime-genai and llama.cpp based on model metadata
  • Allow users to define a custom chat template by providing templates in the config file
  • Enable smoother switching between different LLM models
  • Rename llmPrefix to SystemPrompt to better reflect its purpose
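The resolution order described above (a user-defined template from the config file takes precedence, otherwise the template embedded in the model metadata is used) can be sketched roughly as follows. This is an illustrative sketch only, not the actual implementation; all names (`resolve_template`, `apply_chat_template`, `Message`) are hypothetical, and real deployments would delegate template application to the framework (e.g. llama.cpp's or onnxruntime-genai's own chat-template handling):

```python
# Hypothetical sketch of the template-resolution logic described in the MR.
from dataclasses import dataclass


@dataclass
class Message:
    role: str      # e.g. "system", "user", "assistant"
    content: str


def resolve_template(config_template, metadata_template):
    # A template from the config file wins; otherwise fall back to the
    # template stored in the model metadata; otherwise a plain default.
    return config_template or metadata_template or "{role}: {content}\n"


def apply_chat_template(template, messages, system_prompt=""):
    # Prepend the system prompt (formerly "llmPrefix") as a system message.
    chat = ([Message("system", system_prompt)] if system_prompt else []) + messages
    return "".join(template.format(role=m.role, content=m.content) for m in chat)


prompt = apply_chat_template(
    resolve_template(None, "<|{role}|>{content}</s>"),
    [Message("user", "Hello")],
    system_prompt="You are helpful.",
)
print(prompt)  # <|system|>You are helpful.</s><|user|>Hello</s>
```

Because the template is resolved per model rather than hard-coded, switching between LLMs only requires pointing at a different model (and optionally a different config entry), which is what enables the smoother model switching noted above.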

Change-Id: I738eab65667db4d775cd104521c85f7b194b1420
Signed-off-by: Yunus Kalkan yunus.kalkan@arm.com
