fix: handle optional content types in LLM response parsing
- Add get_text_content() helper method to PromptMessage class
- Update generate_suggested_questions_after_answer to use the helper method
- Properly handle str, list[PromptMessageContent], and None content types
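A minimal sketch of what such a helper could look like, assuming simplified stand-in definitions of PromptMessage, PromptMessageContent, and TextPromptMessageContent (class and field names are taken from the commit description; exact shapes are assumptions, not the project's actual implementation):

```python
from typing import Optional, Union


class PromptMessageContent:
    """Base class for structured content parts (assumed shape)."""


class TextPromptMessageContent(PromptMessageContent):
    """Text content part; `data` holds the text (assumed field name)."""

    def __init__(self, data: str):
        self.data = data


class PromptMessage:
    def __init__(self, content: Union[str, list[PromptMessageContent], None] = None):
        self.content = content

    def get_text_content(self) -> str:
        """Return the message text regardless of how content is stored.

        Handles the three possible content types:
        - None: returns an empty string
        - str: returned as-is
        - list[PromptMessageContent]: text parts are concatenated
        """
        if self.content is None:
            return ""
        if isinstance(self.content, str):
            return self.content
        # list of content parts: keep only the text parts
        return "".join(
            part.data
            for part in self.content
            if isinstance(part, TextPromptMessageContent)
        )
```

With a helper like this, generate_suggested_questions_after_answer can call message.get_text_content() instead of assuming the content is always a plain string.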
parent bcce68cead
commit 8e8f2fd826