123 points by edg3ninja 5 months ago flag hide 16 comments
ml_enthusiast 5 months ago next
This is really exciting! Edge devices are becoming more and more common, and efficient ML inference can give them a big advantage in many applications.
hypeddeveloper 5 months ago next
Absolutely! Right now, cloud-based NN inference is just not scalable enough for many use-cases. I think the future of AI is going to be right at the edge.
ai_researcher 5 months ago next
There are still so many unanswered questions, like: does ML at the edge really adapt to its context, or is it simply running a pre-built model as-is? What if that model could do better with additional local data? Valuable data is being left in the dust...
ml_enthusiast 5 months ago prev next
Edge devices are becoming more powerful all the time! The paper mentioned TinyML; it has the potential to run ML algorithms on 10,000 devices with a state-of-the-art SoC, or 10 million with an older chip.
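To make the TinyML point concrete, here's a back-of-the-envelope sketch (the parameter count and flash size are hypothetical, not from the paper) of why quantization is what makes models fit on small chips at all:

```python
# Illustrative only: can a small CNN's weights fit in a typical
# microcontroller's flash once quantized? Numbers below are made up
# for the example, not taken from the article.
def model_size_bytes(num_params: int, bits_per_weight: int) -> int:
    """Storage needed for the weights alone (ignores activations/overhead)."""
    return num_params * bits_per_weight // 8

params = 250_000  # hypothetical small keyword-spotting CNN

fp32_size = model_size_bytes(params, 32)  # 1_000_000 bytes
int8_size = model_size_bytes(params, 8)   # 250_000 bytes

print(f"fp32: {fp32_size / 1024:.0f} KiB, int8: {int8_size / 1024:.0f} KiB")
# On a board with, say, 1 MiB of flash, the int8 version fits with room
# to spare, while the fp32 version crowds out everything else.
```

Same model, 4x smaller just from weight precision, which is roughly the gap between "runs on a state-of-the-art SoC" and "runs on an older chip".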
techjournalist 5 months ago next
You raise excellent questions, @AI_Researcher. In an upcoming article, we'll examine the adaptation issue, context awareness, and other factors that can help models learn from local data.
x_techlover 5 months ago next
Definitely curious about future evaluations on this topic. More than the edge or the cloud, it's the technology that rules the future.
randomguy09 5 months ago prev next
Maybe I'm old fashioned, but I really prefer server-side computations. Edge devices just seem weirdly limited in comparison.
opensourcefantasy 5 months ago next
Resource constraints will keep pushing for more chip-level improvements (thinking #QuantumComputing here too). Edge devices will far outpace desktops in a few years. It's now, not the future.
ml_enthusiast 5 months ago next
Quantum computing...wow! Are there any studies exploring QC at the edge while the hardware is still in early research? It might have even more potential than TinyML!
x_techlover 5 months ago next
I think QC is still confined to academic contexts. Edge computing power won't be anywhere near what's needed to realize that vision for a while. But ML on edge devices will mature and bring real benefits in the coming years.
smartfarmtech 5 months ago prev next
Here at smartFarmTech, we're looking at ML on edge devices to automatically detect and catalog plant growth stages. The speed and efficiency of on-device inference is crucial when your test sites are remote and isolated.
decentralizeddude 5 months ago next
I am really bullish on ML at the edge. But it's not just performance gains, it's about privacy and security too. Why trust your data with a third party when you can keep it safe on the edge device itself?
ethicsguru 5 months ago next
Decentralized, private - it sounds like a dream. But will it lead to fragmented, suboptimal models in the end? I guess there are trade-offs with every approach.
mlexperts 5 months ago prev next
Is there a chance to learn more about how the code changes were made, and how model conversion for the device was managed? Just curious if there's a detailed procedure the authors recommend.
somewhatconfused 5 months ago next
The article mentions that tools (e.g., OpenVINO) were used for adaptation and quantization. Non-standard NN architectures would probably need some tweaking. I can't wait to see those results.
ml_enthusiast 5 months ago next
Great pointers! The tech stack makes it sound exciting. Looking forward to reading more and attempting adapting ML for edge devices. Thanks, all!