abacusai.llm_response
Classes
- LlmCodeBlock – Parsed code block from an LLM response
- LlmResponse – The response returned by the LLM
Module Contents
- class abacusai.llm_response.LlmCodeBlock(client, language=None, code=None, start=None, end=None, valid=None)
Bases: abacusai.return_class.AbstractApiClass

Parsed code block from an LLM response
- Parameters:
client (ApiClient) – An authenticated API Client instance
language (str) – The language of the code block, e.g. python, sql, etc.
code (str) – The source code string
start (int) – Index of the first character of the code block in the original response
end (int) – Index of the last character of the code block in the original response
valid (bool) – Flag denoting whether the source code string is syntactically valid
- __repr__()
Return repr(self).
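The language/code/start/end/valid fields can be illustrated with a small, self-contained sketch. This is not the abacusai implementation (which is server-side); it only shows how a fenced block in raw LLM output might map onto the documented fields, using Python's built-in compile() as a stand-in syntax check for Python code:

```python
import re

def extract_code_blocks(text):
    """Illustrative parser producing dicts with fields analogous to LlmCodeBlock.

    NOT the abacusai implementation -- a sketch of how a fenced block in raw
    LLM output could map onto language/code/start/end/valid.
    """
    blocks = []
    for match in re.finditer(r"```(\w+)?\n(.*?)```", text, re.DOTALL):
        language = match.group(1) or ""
        code = match.group(2)
        valid = True
        if language == "python":
            try:
                # Syntax check only; the code is never executed.
                compile(code, "<llm>", "exec")
            except SyntaxError:
                valid = False
        blocks.append({
            "language": language,
            "code": code,
            "start": match.start(),    # index of the block's first character
            "end": match.end() - 1,    # index of the block's last character
            "valid": valid,
        })
    return blocks
```

Note that end here is an inclusive index of the last character, matching the wording of the parameter description above.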
- class abacusai.llm_response.AbstractApiClass(client, id)
- __eq__(other)
Return self==value.
- _get_attribute_as_dict(attribute)
- class abacusai.llm_response.LlmResponse(client, content=None, tokens=None, stopReason=None, llmName=None, codeBlocks={})
Bases: abacusai.return_class.AbstractApiClass

The response returned by the LLM
- Parameters:
client (ApiClient) – An authenticated API Client instance
content (str) – Full response from LLM.
tokens (int) – The number of tokens in the response.
stopReason (str) – The reason due to which the response generation stopped.
llmName (str) – The name of the LLM model used to generate the response.
codeBlocks (list[LlmCodeBlock]) – A list of parsed code blocks from the raw LLM response
- __repr__()
Return repr(self).
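A usage sketch of the documented attributes. In a real session an LlmResponse comes back from an authenticated ApiClient call, and the camelCase parameters above are exposed as snake_case attributes at runtime (e.g. code_blocks, stop_reason, llm_name); the SimpleNamespace stand-ins below only mimic that shape and are not real API objects:

```python
from types import SimpleNamespace

# Hypothetical stand-ins mirroring the documented attributes; real objects
# are returned by the ApiClient, not constructed by hand.
block = SimpleNamespace(
    language="sql", code="SELECT 1;", start=19, end=38, valid=True
)
response = SimpleNamespace(
    content="Here is the query:\n```sql\nSELECT 1;\n```",
    tokens=12,
    stop_reason="STOP",
    llm_name="OPENAI_GPT4",
    code_blocks=[block],
)

# Typical consumption pattern: keep only syntactically valid blocks
# in the language you intend to execute.
valid_sql = [
    b.code for b in response.code_blocks if b.valid and b.language == "sql"
]
```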