Will
04/26/2024, 2:32 AM

```yaml
providers:
  - id: openai:chat:mixtral-8x7b-32768
    config:
      apiBaseUrl: https://api.groq.com/openai/v1
      apiKeyEnvar: GROQ_API_KEY
  - id: openai:chat:llama2-70b-4096
    config:
      apiBaseUrl: https://api.groq.com/openai/v1
      apiKeyEnvar: GROQ_API_KEY
```
Looks like the model name is passed in via the id. Totally fine with me, but I just want to be able to rename the column to show Groq somewhere, otherwise it gets pretty confusing when comparing providers because the columns all show openai:chat:model.
Is there a way to rename this? Tbh it does feel a bit awkward to use the id field to handle the model.
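(For reference, the label field on a provider, used in the follow-up config below, is one way to rename that column; a minimal sketch with a made-up label:)

```yaml
providers:
  - id: openai:chat:mixtral-8x7b-32768
    label: groq-mixtral-8x7b  # shown in the results instead of the raw id
    config:
      apiBaseUrl: https://api.groq.com/openai/v1
      apiKeyEnvar: GROQ_API_KEY
```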
Will
04/26/2024, 11:13 PM

```yaml
prompts:
  - prompts/few-shot-json.js
  - prompts/true-false-json.js
providers:
  - id: openai:chat:llama3-70b-8192
    label: 'GROQ-LLAMA-70B'
    config:
      temperature: 0
      apiBaseUrl: https://api.groq.com/openai/v1
      apiKeyEnvar: GROQ_API_KEY
      response_format: { type: 'json_object' }
  - id: openai:chat:llama3-8b-8192
    label: 'GROQ-LLAMA-8B'
    config:
      temperature: 0
      apiBaseUrl: https://api.groq.com/openai/v1
      apiKeyEnvar: GROQ_API_KEY
      response_format: { type: 'json_object' }
```
I tried following the docs' instruction ("To reference a specific function in your prompt file, use the following syntax: filename.js:functionName"), but this didn't work. The code runs without error, but 0 evals are run.
```yaml
prompts:
  - prompts/few-shot-json.js:few-shot-prompt
  - prompts/true-false-json.js:true-false-prompt
providers:
  - id: openai:chat:llama3-70b-8192
    label: 'GROQ-LLAMA-70B'
    prompts:
      - prompts/few-shot-json.js:few-shot-prompt
    config:
      temperature: 0
      apiBaseUrl: https://api.groq.com/openai/v1
      apiKeyEnvar: GROQ_API_KEY
  - id: openai:chat:llama3-8b-8192
    label: 'GROQ-LLAMA-8B'
    prompts:
      - prompts/true-false-json.js:true-false-prompt
    config:
      temperature: 0
      apiBaseUrl: https://api.groq.com/openai/v1
      apiKeyEnvar: GROQ_API_KEY
```
When I remove the `prompts:` at the top, it errors out and says:

```
throw new Error(`There are no prompts in ${JSON.stringify(promptPathOrGlobs)}`);
      ^
Error: There are no prompts in {}
    at readPrompts (/Users/wl/.nvm/versions/node/v18.17.0/lib/node_modules/promptfoo/dist/src/prompts.js:273:15)
```
Can anyone provide an example? That would be super helpful. Thanks!
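(For reference, a minimal sketch of the kind of prompt file the filename.js:functionName syntax expects, assuming promptfoo resolves the part after the colon as a named export of the module. The export name, vars, and message content below are hypothetical.)

```js
// prompts/few-shot-json.js (hypothetical): the export name must match what the
// config references, e.g. prompts/few-shot-json.js:fewShotPrompt
module.exports.fewShotPrompt = async function ({ vars }) {
  // Return a plain string, or a JSON string of OpenAI-style chat messages
  return JSON.stringify([
    { role: 'system', content: 'Reply with a JSON object.' },
    { role: 'user', content: `Question: ${vars.question}` },
  ]);
};
```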
Henrique Coelho
04/29/2024, 3:27 PM

Rajesh Ch
04/29/2024, 3:38 PM

Henrique Coelho
04/29/2024, 4:33 PM

Dieter
04/30/2024, 7:33 AM

knathaannnn
04/30/2024, 8:38 PM

vitmont
04/30/2024, 9:27 PM

Currently the vars are defined in the test.yaml file, but it would be nice to supply additional vars on the command line that get read into the provider and are not defined in the test.yaml file, so it is configurable on the fly without editing the test definition.
For example, if we have:

```yaml
vars:
  a: "some var"
assert:
  ...
```

We can pull the vars into the provider (Python) similarly to:

```py
kwargs = json.loads(sys.argv[3])["vars"]
var_a = kwargs["a"]  # "some var"
```

But we could additionally supply another var such as:

```bash
promptfoo eval --var b="somevalue"
```

where we could then:

```py
kwargs = json.loads(sys.argv[3])["vars"]
var_a = kwargs["a"]  # "some var"
var_b = kwargs.get("b", "default_value")
```
Mariuz
05/01/2024, 4:52 PM

davidl
05/01/2024, 11:10 PM

Edgar
05/02/2024, 7:33 PM

jimd
05/02/2024, 7:47 PM

Will
05/03/2024, 5:54 PM

joemiller88
05/06/2024, 4:06 PM

```bash
promptfoo eval -c router/*
```

And getting this error:

```
/node/v20.5.0/lib/node_modules/promptfoo/dist/src/util.js:193
    throw new Error(`Unsupported configuration file format: ${ext}`);
```

Note - I am using JavaScript config files, so I assume that perhaps this has only been tested with the YAML format and is not set up to work with JavaScript config files?
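(For reference, a JavaScript config would typically export the same structure as the YAML; a hypothetical minimal sketch, since whether the installed version accepts .js configs via -c is exactly what's in question here:)

```js
// router/example.config.js (hypothetical): the config object exported as a CommonJS module
module.exports = {
  prompts: ['prompts/example.txt'],
  providers: ['openai:gpt-3.5-turbo'],
  tests: [{ vars: { question: 'hello' } }],
};
```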
knathaannnn
05/08/2024, 8:25 PM

andrew
05/09/2024, 1:49 AM

```
src
|_ prompts
   |_ prompt-that-classifies-utterance
   |  |_ prompt.ts
   |  |_ tests
   |  |_ output
   |  |_ testExecutor.ts
   |
   |_ prompt-that-extracts-dates
      |_ prompt.ts
      |_ tests
      |_ output
      |_ testExecutor.ts
```
I could cd into each /tests directory and run `promptfoo view .`, which would only show me evals associated with a specific prompt. This was very neat.
Now I don't see how I can distinguish between all of the prompts in my project, since they all get saved to a single sqlite database.
Is there a legitimate path forward with JSON, or is sqlite the only way?
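(One possibly relevant option, offered as a sketch rather than a confirmed answer to the shared-database question: promptfoo eval can also write its results to a file with -o / --output, so each prompt directory could keep its own JSON results next to its tests. Paths below follow the layout above.)

```bash
# Run an eval from one prompt's tests directory and keep its results as JSON
cd src/prompts/prompt-that-classifies-utterance/tests
promptfoo eval -o output/results.json
```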
Kirti
05/09/2024, 8:26 AM

anthonyivn
05/09/2024, 4:29 PM

knathaannnn
05/10/2024, 3:42 PM

ek1338
05/12/2024, 8:53 AM

ek1338
05/12/2024, 9:17 AM

purnama
05/12/2024, 7:48 PM

I'm getting `[FAIL] TypeError: prompt.trim is not a function`. Here's my YAML:

```yaml
description: "My first eval"
prompts:
  - "langfuse://test-chat-prompt:2"
providers:
  - openai:gpt-3.5-turbo-0613
  - openai:gpt-4
tests:
  - vars:
      question: "Albert is a startup founder who is a member of YCombinator"
```
astein91
05/14/2024, 10:18 PM

derangedgirl
05/15/2024, 10:02 AM

AlexisIMBERT
05/15/2024, 12:53 PM

Will
05/15/2024, 3:26 PM

not_one_idea
05/16/2024, 6:13 PM

Anthropic's API uses a separate system param to pass the system prompt, with the messages param restricted to user & assistant content.
Promptfoo docs state that the anthropic provider plays nicely with OpenAI-style "messages" prompt templates: https://www.promptfoo.dev/docs/providers/anthropic#prompt-template. When the anthropic provider runs across that style of template, it will parse the template and properly set the system and messages params sent to the Anthropic endpoint.
Unfortunately, the bedrock provider makes no mention of similar functionality: https://github.com/promptfoo/promptfoo/tree/main/examples/amazon-bedrock. The examples for bedrock all use pretty elementary prompt templates, with variables being appended at the end of some static instructions.
Can I use the OpenAI-style "messages" prompt templates with Anthropic via Bedrock? If not, how can I set the system prompt separately from the user messages when working with Anthropic via Bedrock?
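(For reference, the OpenAI-style chat template described on the linked Anthropic provider page looks roughly like the sketch below; the content and the question variable are placeholders. Whether the bedrock provider splits the system role out the same way is exactly the open question here.)

```json
[
  { "role": "system", "content": "You are a scheduling assistant." },
  { "role": "user", "content": "{{question}}" }
]
```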
MarkeD
05/17/2024, 5:02 AM

romanq
05/17/2024, 6:23 PM

foobar
05/18/2024, 8:01 PM