Upvoting this. I definitely think more and more people will want to host their AI locally, and while getting started is already simple with ollama (shout out to the ollama contributors), keeping it always on and integrating it into other applications is what blocks people. The homelab seems like a natural place for this when you have a GPU.
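For the always-on part, a minimal docker-compose sketch is one common approach — this assumes the official `ollama/ollama` image, ollama's default API port 11434, and an NVIDIA GPU with the container toolkit installed:

```yaml
services:
  ollama:
    image: ollama/ollama
    restart: unless-stopped        # survives reboots, i.e. "always on"
    ports:
      - "11434:11434"              # ollama's default HTTP API port
    volumes:
      - ollama:/root/.ollama       # persist pulled models across restarts
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia       # pass the GPU through to the container
              count: all
              capabilities: [gpu]

volumes:
  ollama:
```

Other apps on the network can then hit `http://<host>:11434` for the API, which is how most integrations talk to it.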