
Building an OpenAI-style chat frontend with Gradio: streaming markdown output from DeepSeek and similar models

Environment

gradio==3.50.2
openai==1.63.1

Code

```python
import openai
import gradio as gr
from typing import List, Any, Iterator

api_key = "sk-**a8"
api_base = "https://api.deepseek.com/v1"  # DeepSeek's OpenAI-compatible endpoint

client = openai.OpenAI(api_key=api_key, base_url=api_base)


def chat_stream(
    message: str,
    history: List[List[str]],
    temperature: float = 0.7,
    top_k: int = 40,
    system_prompt: str = "You are a helpful assistant."
) -> Iterator[Any]:
    """Stream the chat completion response."""
    messages = [{"role": "system", "content": system_prompt}]

    # Append the conversation history
    for human_msg, ai_msg in history:
        messages.append({"role": "user", "content": human_msg})
        messages.append({"role": "assistant", "content": ai_msg})

    # Append the current message
    messages.append({"role": "user", "content": message})

    # Call the API with streaming enabled (openai 1.x client)
    response = client.chat.completions.create(
        model="deepseek-chat",  # can be swapped for another model
        messages=messages,
        temperature=temperature,
        top_p=1 - (1.0 / top_k) if top_k > 1 else 1.0,  # map top_k onto top_p
        stream=True
    )

    full_response = ""
    for chunk in response:
        if chunk.choices and len(chunk.choices) > 0:
            content = chunk.choices[0].delta.content
            if content is not None:
                full_response += content
                yield full_response


def clear_history():
    """Clear the chat history."""
    return []


# Build the Gradio interface
with gr.Blocks(css="footer {visibility: hidden}") as demo:
    gr.Markdown("# OpenAI Chat Interface")

    with gr.Row():
        with gr.Column(scale=4):
            chatbot = gr.Chatbot(height=500, label="对话记录", render_markdown=True)
            with gr.Row():
                message = gr.Textbox(
                    show_label=False,
                    placeholder="在这里输入您的消息...",
                    container=False,
                    scale=9
                )
                submit = gr.Button("发送", scale=1)

        with gr.Column(scale=1):
            system_prompt = gr.Textbox(
                label="系统提示",
                placeholder="设置AI的角色和行为...",
                value="You are a helpful assistant."
            )
            temperature = gr.Slider(
                minimum=0.0, maximum=1.0, value=0.7, step=0.1,
                label="Temperature",
                info="控制生成文本的随机性(值越高越随机)"
            )
            top_k = gr.Slider(
                minimum=1, maximum=100, value=40, step=1,
                label="Top K",
                info="从K个最可能的下一个词中选择(值越小结果越确定)"
            )
            clear = gr.Button("清除历史记录")

    # Submit handling
    def user(user_message, history):
        if user_message == "":
            return "", history
        return "", history + [[user_message, ""]]

    def bot(history, temperature, top_k, system_prompt):
        if not history:
            return history
        user_message = history[-1][0]
        bot_response = ""
        for response in chat_stream(user_message, history[:-1], temperature, top_k, system_prompt):
            bot_response = response
            history[-1][1] = bot_response
            yield history

    message.submit(user, [message, chatbot], [message, chatbot], queue=False).then(
        bot, [chatbot, temperature, top_k, system_prompt], chatbot
    )
    submit.click(user, [message, chatbot], [message, chatbot], queue=False).then(
        bot, [chatbot, temperature, top_k, system_prompt], chatbot
    )
    clear.click(clear_history, None, chatbot)

# Launch the app
if __name__ == "__main__":
    demo.queue()
    demo.launch(debug=True)
```

Demo

The finished page renders the assistant's reply as markdown in the Chatbot panel, updating it chunk by chunk as the stream arrives.
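A note on the Top K slider: an OpenAI-style chat completion request takes top_p rather than top_k, so chat_stream folds the slider value into top_p with the heuristic 1 - 1/top_k (and 1.0 when top_k is 1). The sketch below just tabulates what that mapping produces; the helper name is only for illustration and is not part of the script above.

```python
def top_k_to_top_p(top_k: int) -> float:
    """Same heuristic as in chat_stream: fold a top_k setting into top_p."""
    return 1 - (1.0 / top_k) if top_k > 1 else 1.0

for k in (1, 2, 10, 40, 100):
    print(f"top_k={k:<3} -> top_p={top_k_to_top_p(k):.3f}")
# top_k=1   -> top_p=1.000
# top_k=2   -> top_p=0.500
# top_k=10  -> top_p=0.900
# top_k=40  -> top_p=0.975
# top_k=100 -> top_p=0.990
```

It is only a rough correspondence, since top_k and nucleus (top_p) sampling are different truncation strategies, but it lets a single slider drive the request.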
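Before wiring everything into Gradio, it can help to confirm that the key and base URL really stream. A minimal console check, reusing the chat_stream generator from the script above (the prompt text is arbitrary):

```python
# Quick console smoke test; assumes api_key, api_base and chat_stream()
# from the script above are already defined.
previous = ""
for partial in chat_stream("用一句话介绍Markdown", history=[]):
    # chat_stream yields the cumulative text, so print only the newly added part
    print(partial[len(previous):], end="", flush=True)
    previous = partial
print()
```

If the reply appears token by token here, the same generator will drive the Chatbot updates in the UI.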
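One thing the script does not handle is a failed request (bad key, unreachable endpoint, rate limit): the exception simply propagates out of chat_stream. A hedged sketch of one way to surface such errors in the chat itself, using a wrapper; chat_stream_safe is a name introduced here for illustration, and openai.OpenAIError is the base exception class of the openai 1.x SDK.

```python
def chat_stream_safe(message, history, temperature=0.7, top_k=40,
                     system_prompt="You are a helpful assistant."):
    """Wrap chat_stream so API failures show up as a chat message instead of a crash."""
    try:
        yield from chat_stream(message, history, temperature, top_k, system_prompt)
    except openai.OpenAIError as exc:  # base class of the openai 1.x SDK exceptions
        yield f"**Request failed:** {exc}"
```

bot() would then iterate over chat_stream_safe instead of chat_stream; everything else stays the same.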

