There are still some issues when integrating with ai_news_collector_lib, for example:
The LLM model is hardcoded to gemini-2.5-flash; it should be configurable (see the config sketch below).
The topics list is not tuned to how the lib handles topics.
Some collected news items are actually stale, e.g. reporting that Gemini 2.0 has just been released (see the freshness-filter sketch below).
The LLM context window limit is hardcoded to 3000, which is not enough; it should be configurable as well (covered by the same config sketch).
User query enhancement should be handled inside the lib's internal implementation (see the query-enhancement sketch below).
The duplicated search engine code should be merged into ai-news-collector-lib.
Metaso search always fails (see the retry/fallback sketch below).
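
A minimal sketch of lifting the two hardcoded values (the model id and the 3000 context limit) into configuration. The class and field names (`CollectorConfig`, `max_context_chars`) are hypothetical, not the lib's actual API, and the unit of the 3000 limit (characters vs. tokens) is an assumption:

```python
from dataclasses import dataclass


@dataclass
class CollectorConfig:
    # LLM model id; default keeps today's behavior, but callers can override it.
    model: str = "gemini-2.5-flash"
    # Context budget currently hardcoded to 3000; exposed here as a tunable
    # parameter (treated as characters in this sketch).
    max_context_chars: int = 3000


def build_prompt(articles: list[str], config: CollectorConfig) -> str:
    """Concatenate article snippets up to the configured context budget."""
    parts: list[str] = []
    used = 0
    for text in articles:
        if used + len(text) > config.max_context_chars:
            break
        parts.append(text)
        used += len(text)
    return "\n\n".join(parts)
```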
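
For the stale-news problem, a sketch of a freshness filter that drops items older than a cutoff. It assumes each collected item is a dict carrying an ISO-8601 `published_at` string; that field name is an assumption, not the lib's real schema:

```python
from datetime import datetime, timedelta, timezone


def filter_fresh(items: list[dict], max_age_days: int = 7) -> list[dict]:
    """Keep only items published within the last `max_age_days` days.

    Items without a parseable `published_at` are kept, so nothing is
    silently dropped just because the source omitted a date.
    """
    cutoff = datetime.now(timezone.utc) - timedelta(days=max_age_days)
    fresh = []
    for item in items:
        raw = item.get("published_at")
        try:
            published = datetime.fromisoformat(raw) if raw else None
        except ValueError:
            published = None
        if published is not None and published.tzinfo is None:
            # Assume UTC when the source gives a naive timestamp.
            published = published.replace(tzinfo=timezone.utc)
        if published is None or published >= cutoff:
            fresh.append(item)
    return fresh
```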
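
For moving query enhancement into the lib, a rough sketch of the intended shape; `generate` stands in for whatever LLM call the lib already wraps (prompt in, text out) and the prompt wording is only illustrative:

```python
def enhance_query(user_query: str, generate) -> str:
    """Rewrite a raw user query into a sharper news-search query.

    `generate` is assumed to be a callable around the lib's existing
    LLM client: it takes a prompt string and returns generated text.
    """
    prompt = (
        "Rewrite the following request as a concise news search query, "
        "adding key synonyms and the current year if relevant.\n"
        f"Request: {user_query}\nQuery:"
    )
    enhanced = generate(prompt).strip()
    # Fall back to the original query if the model returns nothing useful.
    return enhanced or user_query
```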
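
Until the Metaso failures are diagnosed, a hedged retry-then-fallback wrapper could at least keep collection running. `metaso_search` and `fallback_search` are placeholders for whatever search callables the lib exposes, not real function names:

```python
import logging
import time

logger = logging.getLogger(__name__)


def search_with_fallback(query: str, metaso_search, fallback_search,
                         retries: int = 2, delay_s: float = 1.0) -> list:
    """Try Metaso a few times, then fall back to another engine.

    Both search arguments are assumed to be callables that take a query
    string and return a list of results.
    """
    for attempt in range(1, retries + 1):
        try:
            return metaso_search(query)
        except Exception as exc:
            logger.warning("Metaso search failed (attempt %d/%d): %s",
                           attempt, retries, exc)
            if attempt < retries:
                time.sleep(delay_s)
    return fallback_search(query)
```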