Develop Flashcards
(23 cards)
What is the ABAP AI SDK?
A reuse library that provides access to generative AI via ISLM (Intelligent Scenario Lifecycle Management).
How do you use the ABAP AI SDK in your code?
By calling its released ABAP classes and interfaces, i.e. the public APIs.
What does the Completion API in the ABAP AI SDK do?
It generates text from a prompt using an LLM.
What is the purpose of the Prompt Library API?
To use ISLM-defined prompt templates to create prompts.
How can you debug issues when using the SDK APIs?
Use ABAP Cross Trace to inspect the call stack.
Which class and method do you use to instantiate the Completion API?
Use CL_AIC_ISLM_COMPL_API_FACTORY=>GET( ) and then CREATE_INSTANCE.
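A minimal ABAP sketch of the instantiation described above. Only the factory class and the two method names come from the card; the `islm_scenario` parameter name and the scenario name `'MY_AI_SCENARIO'` are assumptions, and the exact signature may differ in your release.

```abap
TRY.
    " Get the factory singleton, then create a Completion API instance
    " for a configured intelligent scenario.
    DATA(lo_api) = cl_aic_islm_compl_api_factory=>get(
                     )->create_instance( islm_scenario = 'MY_AI_SCENARIO' ).
  CATCH cx_root INTO DATA(lx_error).
    " Scenario missing or not set up for generative AI:
    " log or handle the error as appropriate.
ENDTRY.
```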
What does the interface IF_AIC_COMPLETION_PARAMETERS allow you to do?
Set model parameters such as temperature and maximum tokens, as well as arbitrary custom parameters.
What does the method SET_TEMPERATURE control?
How creative the LLM’s response should be (range 0 to 1).
What does SET_MAXIMUM_TOKENS do?
Sets the max number of tokens for the generated response.
What should you watch out for when using SET_ANY_PARAMETER?
Ensure the parameter is supported by the LLM to avoid runtime errors.
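The three setter methods from the cards above might be used as in this sketch. The accessor name `get_parameters` is hypothetical, the literal types are guesses, and the `SET_ANY_PARAMETER` signature (`name`/`value`) is an assumption:

```abap
" Hypothetical accessor returning IF_AIC_COMPLETION_PARAMETERS.
DATA(lo_params) = lo_api->get_parameters( ).

lo_params->set_temperature( '0.2' ).    " low value = more deterministic output
lo_params->set_maximum_tokens( 500 ).   " cap the length of the generated response

" Only pass parameters the target LLM actually supports,
" otherwise a runtime error can occur.
lo_params->set_any_parameter( name = 'top_p' value = '0.9' ).
```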
What are the two main methods provided by the Completion API to send prompts?
EXECUTE_FOR_STRING (prompt as a string) and EXECUTE_FOR_MESSAGES (prompt as a message list).
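A sketch of the simpler string variant. `EXECUTE_FOR_STRING` comes from the card; the result accessor `get_completion` is an assumption:

```abap
" Send a plain string prompt and read the generated text back.
DATA(lo_result) = lo_api->execute_for_string(
                    `Summarize the following incident in one sentence: ...` ).
DATA(lv_answer) = lo_result->get_completion( ).
```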
What does the system role message define?
The persona or behavior the LLM should adopt when answering.
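A sketch of the message-based variant with a system role message. Only `EXECUTE_FOR_MESSAGES` comes from the cards; the message-container class and its `add_*` method names are hypothetical:

```abap
" Hypothetical builder for a message list (system + user messages).
DATA(lo_messages) = lo_api->create_message_container( ).

" The system message defines the LLM's role/behavior.
lo_messages->add_system_message( `You are a concise assistant for ABAP developers.` ).
lo_messages->add_user_message( `Explain what an internal table is in one sentence.` ).

DATA(lo_result) = lo_api->execute_for_messages( lo_messages ).
```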
What is Function Calling in the Completion API?
It allows you to expose external functions/tools to the LLM so it can request information it doesn’t have, enriching its capabilities.
What does the LLM return when it needs a function call?
A response containing the name of the function to call and the parameter values to call it with.
How are complex function results handled?
You can format complex results as JSON strings and explain the structure in the function definition.
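One way to produce such a JSON string is sketched below. `/ui2/cl_json` is a standard serialization utility on many on-premise systems (in the BTP ABAP environment you would use the XCO JSON library instead); the surrounding function-calling methods are omitted because their names vary by SDK release:

```abap
" A structured function result to hand back to the LLM.
TYPES: BEGIN OF ty_flight,
         carrier TYPE string,
         price   TYPE p LENGTH 8 DECIMALS 2,
       END OF ty_flight.

DATA(ls_flight) = VALUE ty_flight( carrier = 'LH' price = '199.99' ).

" Serialize the structure to a JSON string.
DATA(lv_json) = /ui2/cl_json=>serialize( data = ls_flight ).

" Return lv_json as the function result, and describe this JSON
" structure in the function definition so the model can interpret it.
```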
Do all large language models support function calling?
No, function calling is not supported by all LLMs.
What is the main purpose of the Prompt Library API in the ABAP AI SDK?
To use prompt templates predefined in intelligent scenarios (managed via ISLM) to generate prompts with dynamic content for LLM interactions.
What two pieces of information do you need to get a prompt template instance?
The intelligent scenario name and the prompt template ID.
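A sketch of retrieving a template with those two inputs. The factory class and method names here are assumptions modeled on the Completion API factory, and both ID values are placeholders:

```abap
" Hypothetical Prompt Library factory; the two parameters are the
" intelligent scenario name and the prompt template ID from the card.
DATA(lo_template) = cl_aic_islm_prompt_lib_factory=>get(
                      )->get_prompt_template(
                        islm_scenario      = 'MY_AI_SCENARIO'
                        prompt_template_id = 'SUMMARIZE_TICKET' ).
```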
Why use prompt templates instead of hardcoding prompts?
To reuse recurring prompt texts efficiently and maintain consistency.
What tool is used for tracing errors and inspecting the call stack when using the ABAP AI SDK?
The ABAP Cross Trace in ABAP Development Tools.
How do you enable tracing for the ABAP AI SDK in the ABAP Cross Trace?
When creating a trace configuration, select the components ABAP Generative AI and SDK.
Why would you use the ABAP Cross Trace with the ABAP AI SDK?
To find errors and track the entire program process when calling the SDK APIs.
Where do you find the ABAP Cross Trace view in ABAP Development Tools?
Via Window > Show View > Other > ABAP Cross Trace.