128 points by tensorflowjs-user 5 months ago | 12 comments
username1 5 months ago
Nice writeup! I've been looking into serverless architectures recently, and this is a great example of how it can be useful.
author 5 months ago
Thanks! I'm glad you found it useful. The serverless aspect really helped me keep costs down and scale quickly.
username2 5 months ago
Have you considered using AWS Lambda or Google Cloud Functions for your serverless architecture? They offer good support for TensorFlow.js.
author 5 months ago
I evaluated both of those options, but I ended up going with Azure Functions for a few specific features, such as the Consumption plan. It's been a good choice so far.
username3 5 months ago
How did you handle data storage for your training and inference images? I'm looking for an optimal data storage solution for my ML project as well.
author 5 months ago
Great question! I opted for Azure Blob Storage for its scalability and integration with Azure Functions. Blob Storage can also serve the TensorFlow.js model files directly through a CDN, which greatly speeds up model loading.
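To make that concrete: a converted TensorFlow.js model is a `model.json` plus weight shards, and the client can fetch all of it straight from the CDN. A minimal sketch, assuming tf.js is loaded globally (e.g. via a `<script>` tag) and using a placeholder CDN endpoint and container layout:

```javascript
// Sketch: loading a TensorFlow.js model served from Blob Storage via CDN.
// Assumes a global `tf` (tf.js loaded in the page); the endpoint and
// /models/<name>/ layout are hypothetical placeholders.

// model.json references its weight shards by relative path, so uploading
// the converter's whole output folder to one blob container is enough.
function modelUrl(cdnBase, modelName) {
  return `${cdnBase.replace(/\/+$/, '')}/models/${modelName}/model.json`;
}

async function loadModel() {
  // tf.loadGraphModel fetches model.json, then the shards, over plain HTTP.
  return tf.loadGraphModel(
    modelUrl('https://<endpoint>.azureedge.net', 'mobilenet')
  );
}
```

Since the shards are static blobs, they cache well at the CDN edge, which is where most of the loading speedup comes from.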
username4 5 months ago
What kind of model did you use for image recognition? Something custom or a pre-trained model?
author 5 months ago
For this project, I used MobileNet, a pre-trained model available for TensorFlow.js. It's lightweight and efficient, which makes it a good fit for real-time image recognition at the edge.
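For reference, the pre-trained MobileNet package (`@tensorflow-models/mobilenet`) makes this a few lines. A sketch, assuming the tfjs and mobilenet scripts are loaded globally in the page; the helper `topLabel` is my own illustrative addition:

```javascript
// Sketch: classifying an <img> element with the pre-trained MobileNet
// package. Assumes global `mobilenet` from @tensorflow-models/mobilenet.

async function classifyImage(imgElement) {
  const model = await mobilenet.load();   // downloads pre-trained weights
  return model.classify(imgElement);      // [{ className, probability }, ...]
}

// Illustrative helper: pick the highest-probability prediction.
function topLabel(predictions) {
  return predictions.reduce((best, p) =>
    p.probability > best.probability ? p : best
  );
}
```

`classify` returns the top matches with probabilities, so no label-map bookkeeping is needed on your side.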
username5 5 months ago
It's exciting that you implemented this with TensorFlow.js! Any ideas on how to make it work offline, for example as a progressive web app (PWA)?
author 5 months ago
Yes, I've considered building a PWA around this concept. I believe the TensorFlow.js model files can be cached with the Service Worker API so that inference keeps working offline. I'm planning to work on that next!
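The offline part could look roughly like this: pre-cache the model files at install time and serve them cache-first on fetch. A sketch only, with a hypothetical cache name and file list, and guarded so the helpers can also run outside a worker context:

```javascript
// Sketch: service worker that pre-caches TF.js model files for offline use.
// Cache name and file paths are hypothetical placeholders.
const CACHE = 'tfjs-model-v1';
const MODEL_FILES = [
  '/models/mobilenet/model.json',
  '/models/mobilenet/group1-shard1of4.bin',
  // ...remaining weight shards
];

function isModelRequest(url) {
  // Match model.json and the converter's .bin weight shards.
  return /\/models\/.+\.(json|bin)$/.test(url);
}

// Only register handlers when actually running inside a worker context.
if (typeof self !== 'undefined' && typeof self.addEventListener === 'function') {
  self.addEventListener('install', (event) => {
    event.waitUntil(caches.open(CACHE).then((c) => c.addAll(MODEL_FILES)));
  });

  self.addEventListener('fetch', (event) => {
    if (isModelRequest(event.request.url)) {
      // Cache-first: serve the cached copy, fall back to the network.
      event.respondWith(
        caches.match(event.request).then((hit) => hit || fetch(event.request))
      );
    }
  });
}
```

The weight shards are immutable once uploaded, so cache-first is safe as long as you version the cache name when you ship a new model.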
username6 5 months ago
What is the deployment size and cold start time of your Azure Functions? I'm hoping to get a sense of performance when deciding on a serverless architecture.
author 5 months ago
For a single function (i.e., image recognition), the deployed size is around 20 MB. Cold start times vary but average around 1-1.5 seconds on the Consumption plan.