The models can’t do anything the inference library doesn’t allow, so you shouldn’t need to worry about a rogue model as long as you load it with a runtime someone you trust can vouch for. If you’re worried about Ollama itself, you can monitor its network usage (and block it in your firewall); with auto-update disabled, there shouldn’t be any network activity at all.
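For anyone who wants to check or enforce this, here’s a rough sketch on Linux. It assumes the official installer’s setup, where the Ollama service runs as a dedicated `ollama` user; adjust the user name if your setup differs:

```shell
# Watch for any open network connections belonging to ollama
sudo lsof -i -P -n | grep -i ollama

# Block all outbound traffic from the ollama service user
# (assumes the service runs as user "ollama", as the official
# Linux installer configures it -- change the name if yours differs)
sudo iptables -A OUTPUT -m owner --uid-owner ollama -j REJECT
```

Note the first command will still show the local listener on port 11434, which is just the API serving your own requests; it’s outbound connections to remote hosts you’d want to look for.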
Almost as scummy as the concept of a YouTuber using affiliate links.