nyellin
06/22/2024, 5:35 PM
Plyra
06/23/2024, 4:56 PM
Mikko
06/25/2024, 12:43 PM
halsdunes
06/25/2024, 7:57 PM
`callApi` function manually via Node, it works. However, when I run it with promptfoo, it just returns an empty output. Any idea why this might be happening?
vikash
06/26/2024, 4:21 AM
Krimp
06/27/2024, 6:46 AM
Tem
07/01/2024, 6:07 PM
igotsevens
07/02/2024, 2:31 PM
DerModMaster
07/03/2024, 7:26 AM
```yaml
env:
  OPENAI_API_KEY: sk-proj-XXXXXXXXXX
  OPENAI_API_HOST: api.openai.com
  OPENAI_ORGANIZATION: org-XXXXXXXXXX
```
When I don't use the llm-rubric assert, everything works fine: the LLM generates a response to my prompt, which is visible in the result. But with the llm-rubric assert I only get an error, even when I set apiKey in the provider config of the config.yaml.
https://cdn.discordapp.com/attachments/1257960702099066911/1257960702363177082/image.png?ex=66864eb3&is=6684fd33&hm=9151f4f84015aad08ac44813c0cb75ef8174e28784615cd57f510ce21a6e261c&
PedroA
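For reference, model-graded asserts such as llm-rubric use a separate grading provider, and promptfoo lets you configure it explicitly under `defaultTest`. A minimal sketch, assuming the error is the grader failing to authenticate; the provider id and key below are placeholders:

```yaml
defaultTest:
  options:
    provider:
      id: openai:gpt-4o-mini   # grading model, placeholder choice
      config:
        apiKey: sk-proj-XXXXXXXXXX   # key for the grader, not the provider under test
```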
07/03/2024, 2:47 PM
ContinousPretraining
07/04/2024, 9:35 AM
asad
07/06/2024, 7:00 AM
yubo
07/07/2024, 2:35 AM
khat-er
07/08/2024, 10:06 AM
khat-er
07/09/2024, 1:37 PM
Gianfree
07/12/2024, 1:41 PM
`npx promptfoo@latest generate dataset --config generate_dataset.yaml`
I get the following error:
```
SyntaxError: "undefined" is not valid JSON
    at JSON.parse (<anonymous>)
    at synthesize (/Users/username/.npm/_npx/81bbc6515d992ace/node_modules/promptfoo/dist/src/testCases.js:253:27)
    at async Command.<anonymous> (/Users/username/.npm/_npx/81bbc6515d992ace/node_modules/promptfoo/dist/src/commands/generate.js:157:25)
```
generate_dataset.yaml contains the following:
```yaml
prompts:
  - 'Act as a travel guide for {{location}}'
  - 'I want you to act as a travel guide. I will write you my location and you will suggest a place to visit near my location. In some cases, I will also give you the type of places I will visit. You will also suggest me places of similar type that are close to my first location. My current location is {{location}}'
providers:
  - id: azureopenai:chat:deploymentNameHere
    config:
      apiHost: 'xxxxxxxx.openai.azure.com'
tests:
  - vars:
      location: 'San Francisco'
  - vars:
      location: 'Wyoming'
  - vars:
      location: 'Kyoto'
  - vars:
      location: 'Great Barrier Reef'
```
Normal evaluation using a promptfooconfig.yaml file with the same provider works perfectly. Any ideas on what could be wrong? Thanks!
chongus6280
07/12/2024, 7:05 PM
luiscosio
07/15/2024, 2:57 AM
```yaml
description: "My eval"
prompts:
  - file://prompts.json
providers:
  - id: 'http://localhost:1234/v1/chat/completions'
    config:
      method: 'POST'
      headers:
        'Content-Type': 'application/json'
      body:
        prompt: "{{prompt}}"
      responseParser: 'json.choices[0].message.content'
tests:
  - vars:
      topic: bananas
```
This is my prompts.json:
```json
[
  {
    "role": "system",
    "content": "You are a helpful assistant"
  },
  {
    "role": "user",
    "content": "Tell me about {{topic}}"
  }
]
```
This is the error:
```
SyntaxError: Bad control character in string literal in JSON at position 82
    at JSON.parse (<anonymous>)
    at HttpProvider.callApi (C:\Users\Luis\AppData\Roaming\npm\node_modules\promptfoo\dist\src\providers\http.js:37:37)
    at Evaluator.runEval (C:\Users\Luis\AppData\Roaming\npm\node_modules\promptfoo\dist\src\evaluator.js:297:62)
    at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
    at async processEvalStep (C:\Users\Luis\AppData\Roaming\npm\node_modules\promptfoo\dist\src\evaluator.js:621:25)
```
(the same trace is printed twice)
Backend is LM Studio (llama.cpp wrapper) and the model is gemma-2
ContinousPretraining
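A likely cause, though not confirmed in the thread: the rendered prompt is a multi-line string interpolated raw into the request body, so the resulting JSON contains literal newlines, which `JSON.parse` rejects as "bad control characters". The effect is easy to reproduce; serializing the body with a JSON encoder escapes the newline instead:

```python
import json

# A raw newline inside a JSON string literal is a control character,
# so parsing a hand-assembled body like this fails:
raw_body = '{"prompt": "line one\nline two"}'
try:
    json.loads(raw_body)
    parsed_ok = True
except json.JSONDecodeError:
    parsed_ok = False  # strict parsers reject the literal newline

# Building the body with a JSON encoder escapes the newline as \n,
# producing a string that round-trips cleanly:
safe_body = json.dumps({"prompt": "line one\nline two"})
roundtrip = json.loads(safe_body)
```

The same rule applies to the JavaScript `JSON.parse` in the stack trace above: the fix is to make sure the body is built by a serializer rather than by pasting the prompt into a template string.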
07/15/2024, 5:43 AM
peanutshawn
07/15/2024, 3:01 PM
`EvalRunner` defined in my custom provider. `EvalRunner`, in its constructor, creates a bunch of instance variables that require a network call.
```python
class EvalRunner:
    def __init__(self):
        self.tools = self._get_tools(key=api_key)
        self.prompts = self._get_prompts(key=api_key)
```
and in my `call_api` function, I instantiate `EvalRunner` like so:
```python
def call_api(prompt, options, context):
    config = options.get("config", None)
    runner = EvalRunner(config=runner_config, model_client=model_client)
```
I'm guessing that each time `call_api` runs, I instantiate a new runner and make two network calls. If I define the runner outside of `call_api`, does that cut down the number of network calls to only one? Thank you!
mrbanda
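A minimal sketch of the module-level approach described in the question. Names such as `EvalRunner`, `runner_config`, and `model_client` come from the message above; the network calls are stubbed out here, and whether this helps in practice depends on whether promptfoo re-imports the provider module for each call:

```python
# Stubbed dependencies, assumed for illustration only.
runner_config = {}
model_client = object()

class EvalRunner:
    instantiations = 0  # counts how many times the constructor runs

    def __init__(self, config=None, model_client=None):
        EvalRunner.instantiations += 1
        # In the real class these are network calls; stubbed here.
        self.tools = ["tool_a"]
        self.prompts = ["prompt_a"]

# Constructed once at module import time, not once per call_api invocation.
_runner = EvalRunner(config=runner_config, model_client=model_client)

def call_api(prompt, options, context):
    # Reuse the shared runner instead of constructing a new one per call.
    return {"output": f"echo: {prompt}", "tools": _runner.tools}

# Two calls, but only one EvalRunner construction.
call_api("hi", {}, {})
call_api("bye", {}, {})
```

As long as all calls happen in one long-lived process, the constructor (and its two network calls) runs exactly once.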
07/15/2024, 8:06 PM
mrbanda
07/16/2024, 2:54 AM
mrbanda
07/16/2024, 5:15 PM
Will
07/20/2024, 4:44 AM
mrbanda
07/20/2024, 2:57 PM
neoteristis
07/21/2024, 8:53 AM
neoteristis
07/22/2024, 12:21 PM
```yaml
prompts:
  prompts/agent-system-prompts/api-event-expert/phi3.v001.txt: api_event_expert_phi3_v001
providers:
  - file://./providers/phi3.t_0.tokens_1024.yml # <--- this line works
    config: # <--- how to add custom info like this ?
      prompts: # <---
        - api_event_expert_phi3_v001 # <---
```
Thank you 🙂
neoteristis
07/22/2024, 5:14 PM
```yaml
defaultTest:
  - vars:
      phi3_template_prompt: file://./prompts/model-templates/phi3.txt
      gemma_template_prompt: file://./prompts/model-templates/gemma.txt
      system_prompt: [ "system_prompt_001", "system_prompt_002" ]
```
neoteristis
07/22/2024, 10:43 PM
- with `file://` -> this does not result in an error; the `id:` is correctly taken into account, however none of the other parameters are.
- with `yaml`/`yml` -> straight up an error: `TypeError [ERR_UNKNOWN_FILE_EXTENSION]: Unknown file extension ".yml" for /app/test/providers/gemma.t_0.tokens_1024.yml`*

```yaml
providers: [
  file://./providers/phi3.t_0.tokens_1024.yml
]
```

```yaml
providers:
  - file://./providers/phi3.t_0.tokens_1024.yml
```

File content:

```yaml
id: ollama:chat:phi3
prompts:
  - phi3_template_prompt
config:
  temperature: 0
  max_tokens: 1024
```

*Full error:
https://cdn.discordapp.com/attachments/1265076695765880916/1265076696365662290/image.png?ex=66a031fc&is=669ee07c&hm=53506ccddddbf95af0f2950e4f4773277820494a51adcc66bd68d36c4754a32a&
chukitow
07/23/2024, 3:35 PM