
[Feature]: SemanticKernel FunctionCall #758

Open
justmine66 opened this issue May 28, 2024 · 3 comments
Comments

@justmine66

Background & Description

From https://github.com/kosimas/LLamaSharp/blob/master/LLama.Examples/Examples/SemanticKernelFunctionCalling.cs, but I am unable to find the AutoInvoke property used there:

```csharp
LLamaSharpPromptExecutionSettings llamaSettings = new()
{
    AutoInvoke = true,
};
```
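
For context, a minimal sketch of the usage being requested, assuming the Semantic Kernel 1.x API; `AutoInvoke` on `LLamaSharpPromptExecutionSettings` exists only in the kosimas fork at the time of writing, and `TimePlugin` is a hypothetical plugin used purely for illustration:

```csharp
using System;
using System.ComponentModel;
using Microsoft.SemanticKernel;
// LLamaSharpPromptExecutionSettings ships with the LLamaSharp Semantic Kernel
// integration; the exact namespace may differ between versions, and AutoInvoke
// exists only in the kosimas fork at the time of this issue.
using LLamaSharp.SemanticKernel;

// A real setup would register a LLamaSharp IChatCompletionService on the builder here.
var kernel = Kernel.CreateBuilder().Build();
kernel.Plugins.AddFromType<TimePlugin>();

var llamaSettings = new LLamaSharpPromptExecutionSettings
{
    AutoInvoke = true, // fork-only: lets the model call TimePlugin without manual plumbing
};

var answer = await kernel.InvokePromptAsync(
    "What time is it in UTC?",
    new KernelArguments(llamaSettings));
Console.WriteLine(answer);

// Hypothetical plugin, for illustration only.
public class TimePlugin
{
    [KernelFunction, Description("Returns the current UTC time.")]
    public string GetUtcNow() => DateTime.UtcNow.ToString("O");
}
```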

API & Usage

No response

How to implement

No response

@zsogitbe
Contributor

This is not part of the official LLamaSharp codebase. You need to use the LLamaSharp fork from 'kosimas' until he files a PR and it is merged into the official LLamaSharp code.

@justmine66
Author

> This is not part of the official LLamaSharp codebase. You need to use the LLamaSharp fork from 'kosimas' until he files a PR and it is merged into the official LLamaSharp code.

Thanks, I am very much looking forward to official support.

@zsogitbe
Contributor

You are welcome.

Please note that you can do the same, but better, with the Handlebars planner in Semantic Kernel: plugins/functions in a Handlebars plan are executed automatically.
This 'AutoInvoke' function-calling feature was first introduced by OpenAI and relies on a specialized model that can generate the function-calling syntax, whereas you can execute a Handlebars plan with any model.
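
A minimal sketch of that Handlebars planner route, assuming the experimental HandlebarsPlanner from the Microsoft.SemanticKernel.Planners.Handlebars package and a kernel that already has a chat completion service (for example LLamaSharp's) and some plugins registered; the goal string is just an example:

```csharp
#pragma warning disable SKEXP0060 // the Handlebars planner is marked experimental
using System;
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.Planning.Handlebars;

// Assumes a chat completion service and your plugins are registered on the builder here.
var kernel = Kernel.CreateBuilder().Build();

var planner = new HandlebarsPlanner();

// The planner asks the model to write a Handlebars template that calls the
// registered plugin functions; invoking the plan runs those calls automatically,
// without needing an OpenAI-style function-calling model.
var plan = await planner.CreatePlanAsync(kernel, "What time is it in UTC?");
Console.WriteLine(plan);   // inspect the generated template

var result = await plan.InvokeAsync(kernel);
Console.WriteLine(result);
```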
