The bot gets a voice

The simple cron job that sends me a weather report as an XMPP message has turned into a little persistent bot.

A couple of things changed. I switched from OpenWeatherMap's API to Open-Meteo because of the former's wildly inaccurate responses. I also pass the final data through a small LLM so it comes out as a few concise, human-readable sentences.
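The fetch-and-flatten step looks something like this. A minimal sketch, assuming a daily forecast request; the coordinates and the selection of daily variables are placeholders, not my actual config (Open-Meteo needs no API key):

```python
import requests

LAT, LON = 52.52, 13.41  # placeholder coordinates

def fetch_forecast() -> str:
    resp = requests.get(
        "https://api.open-meteo.com/v1/forecast",
        params={
            "latitude": LAT,
            "longitude": LON,
            "daily": "temperature_2m_max,temperature_2m_min,precipitation_sum",
            "timezone": "auto",
            "forecast_days": 1,
        },
        timeout=10,
    )
    resp.raise_for_status()
    daily = resp.json()["daily"]
    # One flat line of facts; the LLM turns this into readable sentences.
    return (
        f"max temp: {daily['temperature_2m_max'][0]} C, "
        f"min temp: {daily['temperature_2m_min'][0]} C, "
        f"precipitation: {daily['precipitation_sum'][0]} mm"
    )
```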

I am using llama.cpp(link) with the python binding(link). For the model I just grabbed ggml-org/gemma-3-4b-it-GGUF. It takes a couple of seconds to generate a response, which is not bad for an old, low-powered AM4 AMD CPU with 8GB of RAM. I have not done any tweaking or experimenting beyond playing with a few system messages, so it still sounds a bit robotic and dry.
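The summarization step is roughly this. A sketch using the python binding's chat API; the model filename, context size, and system prompt here are assumptions, not the exact setup:

```python
from llama_cpp import Llama

llm = Llama(
    model_path="models/gemma-3-4b-it-Q4_K_M.gguf",  # hypothetical filename
    n_ctx=2048,
    verbose=False,
)

def summarize(raw_weather: str) -> str:
    out = llm.create_chat_completion(
        messages=[
            {"role": "system",
             "content": "Write a short, friendly weather report in two or three sentences."},
            {"role": "user", "content": raw_weather},
        ],
        max_tokens=200,
        temperature=0.7,
    )
    # Response follows the OpenAI-style chat completion shape.
    return out["choices"][0]["message"]["content"].strip()
```

Tweaking the system message is where most of the "voice" comes from, which is why the current output still reads dry.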

Next up I want to try a couple of different things:
- dial in the prompt
- compile llama.cpp with CUDA support for my old GTX 1050 Ti (that will be an adventure)
- generate a card with PIL and send that as the message instead of plain text (a rough sketch of that idea follows)
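For the card idea, a rough sketch with PIL: render the summary onto a small image and hand the file path to the bot. The size, colours, and default font are placeholders:

```python
from PIL import Image, ImageDraw, ImageFont

def render_card(text: str, path: str = "weather.png") -> str:
    img = Image.new("RGB", (600, 300), color=(30, 30, 46))
    draw = ImageDraw.Draw(img)
    font = ImageFont.load_default()  # swap in a .ttf for something nicer
    # Naive word wrap so longer sentences don't run off the card.
    words, lines, line = text.split(), [], ""
    for w in words:
        if len(line) + len(w) + 1 > 50:
            lines.append(line)
            line = w
        else:
            line = f"{line} {w}".strip()
    lines.append(line)
    draw.multiline_text((20, 20), "\n".join(lines),
                        font=font, fill=(220, 220, 220), spacing=6)
    img.save(path)
    return path
```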