Very nicely done, congrats! But not a word about the content? What is available to show on such a device? A self-refreshing single URL only? A full-blown Home Assistant client? What does the admin panel provide?
Right now it's a structured display rather than just a single URL refresh.
Out of the box it handles things like time, schedules, rotating messages, and weather. The layout is driven by JSON, and the admin UI lets you define slides, fields, and timing without touching code. There should be screenshots available in the GitHub repo.
It's not trying to be a full Home Assistant frontend — more like a lightweight, purpose-built display that boots straight into something usable.
You can extend it pretty easily since it's just PHP + JSON under the hood, but the default goal was: install it, get an immediately working wall display, then customize from there. You can change the number of fields per slide, make those fields static or weekly, change slide titles, adjust font sizes and type settings, tweak screen timing, turn slides on or off, set dimming, reboot, and screen-off schedules, change the weather location, and set up remote updating via email, limited to a single email account defined in the settings.
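To give a sense of the shape of such a setup (a sketch only: the field names below are hypothetical, the project's actual JSON schema is in its repo), a JSON-driven config covering one slide plus the schedules mentioned above might look roughly like this:

```json
{
  "slides": [
    {
      "title": "Today",
      "enabled": true,
      "duration_seconds": 15,
      "fields": [
        { "type": "clock", "font_size": 48 },
        { "type": "weather", "location": "Berlin, DE" },
        { "type": "message", "mode": "weekly", "text": "Trash day: Tuesday" }
      ]
    }
  ],
  "schedules": {
    "dimming": { "start": "22:00", "end": "06:30" },
    "screen_off": { "start": "00:00", "end": "05:30" },
    "reboot": "04:00"
  },
  "remote_update_email": "display@example.com"
}
```

The appeal of this style is that the admin UI only has to read and write one document, and non-programmers can still hand-edit it if the UI is unavailable.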
Interesting. I use HA myself (though not the AI/LLM side yet). I think it's a different layer: Home Memory is the physical reality of the whole house, not just what's wired and smart. How do you picture "running inside": chat from the HA UI, or just running on the same hardware? I haven't dug into the feasibility yet. The part I'm fairly sure about: this only shines with a model that reliably uses tools. 23 MCP tools is a lot for a small local model; Claude- or GPT-tier handles it fine. What conversation agent are you running in HA?
I've been running Home Assistant for 5 years now. No turning back; it's so addictive (in a good sense). I haven't had a chance to start with AI/LLM in HA either, nor have I ever used speech with HA. It will all come this year, I hope.

The other day, digging through a cluttered drawer looking for something, it dawned on me that the perfect solution to that problem would be talking to an AI and explaining that item X is in drawer Y of cabinet Z in room A. That would be a perfect interface to an inventory management application. And since I'm using HA more and more for absolutely everything in my life, the perfect place to keep all that data would be HA itself. Later on, all that data will be useful in HA anyway, one way or another.

So, yes, if I need to build talk-to-an-LLM-from-every-room infrastructure, that would make it much harder to include other projects in the pipeline. The HA mobile app already has talk-to-HA functionality, which I can't use for Home Memory. And, frankly, having some home-related data in Home Assistant and other home-related data in Home Memory feels like a split-brain situation to me. So, ideally, Home Memory should be an integration/add-on/HACS package for Home Assistant, in my humble opinion.

What you did there with a Windows server is a great start, and I will definitely test it out, but eventually I strongly believe it should be fully integrated into Home Assistant. Let me know if I can be of help. Thanks.
You just gave me the idea to 1) photograph a cluttered drawer and have AI identify all the things, 2) have it take that output and structure it (e.g., items A, B, and C are in drawer 1), and 3) maybe even put cheap cameras in the important places that update periodically. I dunno. Also, maybe have it connect to this Home Memory system; I'll check it out more. Thank you both!
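Step 2 above is the easy part to prototype even before wiring up a vision model: turn a sentence like "items A, B, and C are in drawer 1" into records you could push into HA or an inventory app. A minimal sketch (the function name and record shape are my own invention, not from any of the projects discussed):

```python
import re


def parse_inventory_line(line):
    """Parse a sentence like 'items A, B, and C are in drawer 1' into
    a list of {"item": ..., "location": ...} records.

    Hypothetical helper for prototyping; a vision/LLM pipeline would
    emit lines in this shape, or structured JSON directly.
    """
    m = re.match(r"items?\s+(.+?)\s+(?:are|is)\s+in\s+(.+)", line, re.IGNORECASE)
    if not m:
        return []  # sentence doesn't follow the expected pattern
    # Split the item list on commas and/or the word "and"
    names = re.split(r"\s*,\s*(?:and\s+)?|\s+and\s+", m.group(1))
    location = m.group(2).strip()
    return [{"item": n.strip(), "location": location} for n in names if n.strip()]
```

In practice you would probably skip the parsing entirely and ask the model for structured JSON output, but a fallback parser like this is handy when transcripts come back as plain text.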
Put all notifications on silent, except calls. Explain to everyone that to get your immediate attention they have to call you. Look at your (now silent) messages when you have spare cycles (in between focused sessions).