MLECO-6496: Introduce chat template functions for LLM frameworks

- Add support for applying automatic chat templates in onnxruntime-genai and llama.cpp based on model metadata
- Allow users to define a custom chat template by providing templates in the config file
- Enable smoother switching between different LLM models
- Rename llmPrefix to SystemPrompt to better reflect its purpose
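As a rough illustration of the template-application idea (not the actual implementation; the function name, fallback template string, and message format below are assumptions for this sketch):

```python
# Hypothetical sketch: apply a chat template to a list of role/content
# messages, falling back to a generic template when none is supplied
# (e.g. when model metadata provides no template).
def apply_chat_template(messages, template=None):
    if template is None:
        # Assumed generic fallback format; real frameworks read the
        # template from model metadata or a user-supplied config file.
        template = "<|{role}|>\n{content}\n"
    return "".join(
        template.format(role=m["role"], content=m["content"]) for m in messages
    )

prompt = apply_chat_template([
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hello!"},
])
```

A user-defined template from the config file would simply replace the fallback, which is what makes switching between models with different prompt formats smoother.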
Change-Id: I738eab65667db4d775cd104521c85f7b194b1420
Signed-off-by: Yunus Kalkan <yunus.kalkan@arm.com>