Chat completion support
See original GitHub issue

I've been digging through the IKernel and function abstractions hoping to find a way to enable the gpt-3.5-turbo APIs (chat completion) and, more recently, the GPT-4 APIs, but given that ITextCompletion only takes a string as input, I haven't found a way to reasonably change the bits to enable the new behavior.
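The mismatch the issue describes can be sketched by contrasting the two request shapes. This is an illustrative Python sketch, not Semantic Kernel code; the function names and model identifiers are assumptions for demonstration only.

```python
# Why a string-only abstraction can't carry chat input:
# legacy completions take one prompt string, while chat
# completions take a list of role-tagged messages.

def text_completion_request(prompt: str) -> dict:
    """Legacy text-completion payload: the entire input is one string."""
    return {"model": "text-davinci-003", "prompt": prompt}

def chat_completion_request(messages: list[dict]) -> dict:
    """Chat-completion payload: input is a list of {'role', 'content'} dicts."""
    return {"model": "gpt-3.5-turbo", "messages": messages}

legacy = text_completion_request("Summarize: the quick brown fox ...")
chat = chat_completion_request([
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Summarize: the quick brown fox ..."},
])

print(type(legacy["prompt"]).__name__)   # str
print(type(chat["messages"]).__name__)   # list
```

An interface typed against the single-string shape has nowhere to put the message list, which is why exposing gpt-3.5-turbo through ITextCompletion alone is awkward.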
Issue Analytics
- State:
- Created: 6 months ago
- Comments: 13 (5 by maintainers)
Top Results From Across the Web

GPT models - OpenAI API
Our latest models, gpt-4 and gpt-3.5-turbo, are accessed through the chat completions API endpoint. Currently, only the older legacy models are available ...

API Reference
Represents a chat completion response returned by model, ... A unique identifier representing your end-user, which can help OpenAI to monitor and detect ...

OpenAI GPT Chat Completions Accounts For 97% Of Usage
Since the introduction of the Chat Completion API, it accounts for 97% of all GPT API usage as opposed to text completions.

How to Use OpenAI's Chat Completion API with Python
If you're looking to add a conversational AI feature to your application, OpenAI's Chat Completion API might be the solution you need.

How to work with the GPT-35-Turbo and GPT-4 models
The Chat Completion API is a new dedicated API for interacting with ... only supports gpt-35-turbo models, and the underlying format is more ...
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
Another tip I'll give you, for gpt-3.5-turbo at least, is that I would avoid sending "system" messages altogether. The model will very quickly abandon them, and I've gotten far better results by just including an extra "user" message containing the core system prompt.

Quick update: work is in progress; here's the pull request adding ChatGPT and DallE: https://github.com/microsoft/semantic-kernel/pull/161
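The tip about folding the system prompt into a user message can be sketched as a small helper. This is a hypothetical illustration of the message layout the commenter suggests, not code from the linked pull request; `build_messages` is an assumed name.

```python
def build_messages(core_prompt: str, user_input: str) -> list[dict]:
    """Put the would-be system prompt in an extra 'user' message
    instead of using the 'system' role, per the tip above."""
    return [
        {"role": "user", "content": core_prompt},
        {"role": "user", "content": user_input},
    ]

msgs = build_messages("Answer in one sentence.", "What is chat completion?")
print([m["role"] for m in msgs])  # ['user', 'user']
```

The resulting list drops into the `messages` field of a chat-completion request unchanged; only the role tagging differs from the conventional system/user split.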