## Testing
The `test/bruteforce_json_generation.py` script exercises the `/generate-json` endpoint with multiple user prompts and several AI prompt versions.
### How to Run Tests
- Ensure the FastAPI server is running (see "Running the Application" above).
- Place your test prompts in the `./test/user_prompts` folder. Each prompt should be a separate `.txt` file. Example: `./test/user_prompts/chatbot_intro.txt` containing:

  Create a chatbot for a tutor: introduction, lesson schedule, reminders, and FAQ.
- Run the test script:

  ```bash
  python test/bruteforce_json_generation.py
  ```
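Before running the whole suite, it can be useful to send a single prompt by hand. The sketch below does this with only the standard library; the payload shape (`{"prompt": ...}`) and the default host/port are assumptions, not taken from the actual API schema:

```python
import json
import urllib.request


def generate_json(prompt: str, base_url: str = "http://localhost:8000") -> dict:
    """POST one user prompt to the /generate-json endpoint and return the parsed response.

    The request body shape ({"prompt": ...}) and the default port are assumptions;
    check the FastAPI route's request model for the real schema.
    """
    req = urllib.request.Request(
        f"{base_url}/generate-json",
        data=json.dumps({"prompt": prompt}).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read().decode("utf-8"))
```
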
This script will:
- Load prompts from `./test/user_prompts`.
- Iterate through the predefined AI prompt versions (currently `v2` for `create_json_generation_prompt`, and `v3` and `v4` for `extract_state_names_node`).
- Send requests to the local API.
- Save generated JSON responses (or error messages) to the `./json_generations_v2` folder (or whichever folder name you set in the script's constants). Each file is named with the prompt name, the versions used, and a timestamp.