diff --git a/.cursorrules b/.cursorrules
index 8dadf5a..7d7f247 100644
--- a/.cursorrules
+++ b/.cursorrules
@@ -12,7 +12,7 @@ The goal is to help you maintain a big picture as well as the progress of the ta
 
 # Tools
 
-Note all the tools are in python. So in the case you need to do batch processing, you can always consult the python files and write your own script.
+Note all the tools are in python3. So in the case you need to do batch processing, you can always consult the python files and write your own script.
 
 ## Screenshot Verification
 
@@ -20,12 +20,12 @@ The screenshot verification workflow allows you to capture screenshots of web pa
 
 1. Screenshot Capture:
 ```bash
-venv/bin/python tools/screenshot_utils.py URL [--output OUTPUT] [--width WIDTH] [--height HEIGHT]
+venv/bin/python3 tools/screenshot_utils.py URL [--output OUTPUT] [--width WIDTH] [--height HEIGHT]
 ```
 
 2. LLM Verification with Images:
 ```bash
-venv/bin/python tools/llm_api.py --prompt "Your verification question" --provider {openai|anthropic} --image path/to/screenshot.png
+venv/bin/python3 tools/llm_api.py --prompt "Your verification question" --provider {openai|anthropic} --image path/to/screenshot.png
 ```
 
 Example workflow:
@@ -51,7 +51,7 @@ print(response)
 
 You always have an LLM at your side to help you with the task. For simple tasks, you could invoke the LLM by running the following command:
 ```
-venv/bin/python ./tools/llm_api.py --prompt "What is the capital of France?" --provider "anthropic"
+venv/bin/python3 ./tools/llm_api.py --prompt "What is the capital of France?" --provider "anthropic"
 ```
 
 The LLM API supports multiple providers:
@@ -67,16 +67,16 @@ But usually it's a better idea to check the content of the file and use the APIs
 ## Web browser
 
 You could use the `tools/web_scraper.py` file to scrape the web.
-```
-venv/bin/python ./tools/web_scraper.py --max-concurrent 3 URL1 URL2 URL3
+```bash
+venv/bin/python3 ./tools/web_scraper.py --max-concurrent 3 URL1 URL2 URL3
 ```
 This will output the content of the web pages.
 
 ## Search engine
 
 You could use the `tools/search_engine.py` file to search the web.
-```
-venv/bin/python ./tools/search_engine.py "your search keywords"
+```bash
+venv/bin/python3 ./tools/search_engine.py "your search keywords"
 ```
 This will output the search results in the following format:
 ```
@@ -90,15 +90,12 @@ If needed, you can further use the `web_scraper.py` file to scrape the web page
 
 ## User Specified Lessons
 
-- You have a python venv in ./venv. Use it.
-- Include info useful for debugging in the program output.
-- Read the file before you try to edit it.
+- You have a python venv in ./venv. Always use (activate) it when doing python development. First, to check whether 'uv' is available, use `which uv`. If that's the case, first activate the venv, and then use `uv pip install` to install packages. Otherwise, fall back to `pip`.
 - Due to Cursor's limit, when you use `git` and `gh` and need to submit a multiline commit message, first write the message in a file, and then use `git commit -F <filename>` or similar command to commit. And then remove the file. Include "[Cursor] " in the commit message and PR title.
 
 ## Cursor learned
 
 - For search results, ensure proper handling of different character encodings (UTF-8) for international queries
-- Add debug information to stderr while keeping the main output clean in stdout for better pipeline integration
 - When using seaborn styles in matplotlib, use 'seaborn-v0_8' instead of 'seaborn' as the style name due to recent seaborn version changes
 - Use 'gpt-4o' as the model name for OpenAI's GPT-4 with vision capabilities
 - When searching for recent news, use the current year (2025) instead of previous years, or simply use the "recent" keyword to get the latest information
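
Note on the batch-processing advice in the patched rules: the file says that when batch work is needed you should consult the python files and write your own script. The sketch below is not part of the patch; it is a minimal python3 example of driving the tools through their CLIs in one script. Only the `venv/bin/python3 ./tools/web_scraper.py --max-concurrent 3 ...` and `venv/bin/python3 ./tools/llm_api.py --prompt ... --provider ...` invocations and flags come from the rules above; the URL list, the prompt wording, and the output handling are illustrative assumptions.

```python
# Minimal batch-processing sketch (assumes it runs from the repo root with the venv in ./venv).
# Only the CLI invocations mirror the rules file; everything else is illustrative.
import subprocess

URLS = [
    "https://example.com/page1",
    "https://example.com/page2",
]

def run(cmd: list[str]) -> str:
    """Run a tool and return its stdout, raising if it exits non-zero."""
    result = subprocess.run(cmd, capture_output=True, text=True, check=True)
    return result.stdout

# The web scraper CLI accepts multiple URLs in one call (see the rules above).
pages = run(["venv/bin/python3", "./tools/web_scraper.py", "--max-concurrent", "3", *URLS])

# Hand the scraped content to the LLM CLI; the prompt wording is an assumption.
summary = run([
    "venv/bin/python3", "./tools/llm_api.py",
    "--prompt", "Summarize the following scraped pages:\n" + pages,
    "--provider", "anthropic",
])
print(summary)
```

Shelling out to the CLIs keeps the sketch independent of the tools' internal function names, which the rules file does not document.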