Can't wait to get this; my new home will be ready early next year. I will for sure get the homex set
3 seconds for a local LLM is a GREAT objective.... I am interested in a product like this, but I think I'd prefer to run it all locally on Unraid with a powerful GPU. Good work.
This concept is the holy grail for replacing a house full of Google Home or Alexa devices. I agree with everyone else that unless you can find some HW that is purpose built for LLMs, selling a server will likely not be as popular as having the customer provide their own compute. As for pricing, the max I think I could pay would be $100 CAD per room. Any more and it is too expensive to get multiples for every room. Obviously, the cheaper you make it, the more adoption you'd see. I assume the satellite would also run an audio streaming client like you did in your previous video. While I think an AUX out is a must so you can plug it into a real speaker, you might want to consider putting a smaller one onboard for those who just want a one-device solution.
So cool! On board, and gave my feedback through the link. I'd buy something like this in a heartbeat to replace my Alexa<>HA voice control setup. Essential to me would be 1/ local 5 GHz Wi-Fi 2/ TTS 3/ "just works" out of the box ..happy to do all the HA config but don't want any stuffing around with hardware 4/ ability to select what AI engine to use, both now and into the future. Thanks for the awesome video and fantastic build! What an exciting project!
Love your work, you inspired me to create something similar on my local hardware
I just love the work that you are doing! I tried to go in the exact same direction, but you are far more advanced. I also would love to see this setup as a product, because I realized that progress on HA also impacts my custom setup and therefore needs a lot of attention and maintenance. So yes, I absolutely would be interested if you can offer this as a product in the future.
Thank you for your work! This project is very cool and exciting! I’d love to buy a device like that for around $350, especially if everything is completely open source and adjustable, and I can keep adding functions and stuff. I wish you success with your project.👍👍👍
I am loving seeing this; it's something I have been planning on making for a bit
Ready to buy 4!
Super cool project. Looking forward to seeing this develop.
Love your work! Keep it up!
Love it! Get the performance a bit faster and I'll be ready to buy. I do need presence detectors, so I'd get this, but it's a bit too large to put in the corner of my rooms.
Love the concept. Would definitely buy the voice/Bluetooth/mmWave part if it could be ported over to a local LLM running on any personal hardware (old PC converted to a server, NVIDIA GPU acceleration, Apple silicon Mac, or any hardware that can run the llama server).
Yes yes yes!!! Great work. I want it
This is amazing!! Looking forward to the next videos!
Absolutely amazing. Looking forward to a sub 1 second response time that feels like you are talking to a human in real time!
Love the project. Keep up the good work.
Yes I would totally be interested in buying something like this
It would be cool to have a device that can plug directly into a sound system or single speaker and turn it into a voice assistant with actually good sound quality (for playing music/casting)