<?xml version="1.0" encoding="utf-8" standalone="yes"?><rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Local-Ai on Hugo documents a daily AI experiment</title><link>https://hugomelis.nl/tags/local-ai/</link><description>Recent content in Local-Ai on Hugo documents a daily AI experiment</description><generator>Hugo</generator><language>en-us</language><managingEditor>hello@hugomelis.nl (Hugo Melis)</managingEditor><webMaster>hello@hugomelis.nl (Hugo Melis)</webMaster><lastBuildDate>Wed, 08 Apr 2026 08:59:18 +0200</lastBuildDate><atom:link href="https://hugomelis.nl/tags/local-ai/index.xml" rel="self" type="application/rss+xml"/><item><title>Can Local Voice Dictation Keep Up With My Thoughts?</title><link>https://hugomelis.nl/experiments/experiment-37-local-voice-dictation-for-ai-prompts/</link><pubDate>Thu, 26 Mar 2026 07:00:00 +0100</pubDate><author>hello@hugomelis.nl (Hugo Melis)</author><guid>https://hugomelis.nl/experiments/experiment-37-local-voice-dictation-for-ai-prompts/</guid><description>Speaking gave me fuller prompts with less friction than typing.</description></item><item><title>Can I Run a Decent Local AI Model on My iPhone 17 Pro?</title><link>https://hugomelis.nl/experiments/experiment-14-local-ai-on-iphone/</link><pubDate>Tue, 03 Mar 2026 08:00:00 +0100</pubDate><author>hello@hugomelis.nl (Hugo Melis)</author><guid>https://hugomelis.nl/experiments/experiment-14-local-ai-on-iphone/</guid><description>Local iPhone models were useful, private, and still limited.</description></item><item><title>Can I Build a Local Transcript and Summary Menu Bar macOS App?</title><link>https://hugomelis.nl/experiments/experiment-11-local-transscript-summary-menubar-macos-app/</link><pubDate>Sat, 28 Feb 2026 08:00:00 +0100</pubDate><author>hello@hugomelis.nl (Hugo Melis)</author><guid>https://hugomelis.nl/experiments/experiment-11-local-transscript-summary-menubar-macos-app/</guid><description>A local Mac app turned voice notes into private summaries.</description></item><item><title>Finding the Right Local AI Model to Run on My Hardware</title><link>https://hugomelis.nl/experiments/experiment-9-finding-right-local-ai-model-llmfit/</link><pubDate>Thu, 26 Feb 2026 08:00:00 +0100</pubDate><author>hello@hugomelis.nl (Hugo Melis)</author><guid>https://hugomelis.nl/experiments/experiment-9-finding-right-local-ai-model-llmfit/</guid><description>llmfit made local model selection faster and repeatable.</description></item><item><title>Running a Local Model to Allow My Thoughts to Flow Freely</title><link>https://hugomelis.nl/experiments/running-a-local-model-to-allow-my-thoughts-to-flow-freely/</link><pubDate>Tue, 24 Feb 2026 08:00:00 +0100</pubDate><author>hello@hugomelis.nl (Hugo Melis)</author><guid>https://hugomelis.nl/experiments/running-a-local-model-to-allow-my-thoughts-to-flow-freely/</guid><description>Local Ollama models made private thinking feel freer.</description></item></channel></rss>