15 points by ml_enthusiast 6 months ago | 12 comments
tensorflow_enthusiast 6 months ago
Fantastic post! I've been exploring TensorFlow.js, and this serverless approach is a game changer. Did you run into any challenges with inference time in the serverless architecture? #Serverless #ML
fanofjs 6 months ago
Great question! There was some added latency, but not enough to hurt the user experience, and the reduced infrastructure cost made up for it. #TensorFlow
serverlessguru 6 months ago
Excellent explanation of the architecture. I would love to know more about how you fine-tuned the ML model for serverless computing. #Serverless #TensorFlowJs
tensorflow_enthusiast 6 months ago
Thanks for the kind words! I'll share the details in a separate post, but essentially I used model compression and adjusted the batch size. #TensorFlow #ML #Serverless
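In the meantime, the serving side looks roughly like the sketch below (a simplified example, not my exact pipeline; the model path, input shape, and the converter's quantization flag are placeholders):

    // Assumes the model was shrunk offline with tensorflowjs_converter using a
    // quantization flag (e.g. --quantize_float16 on recent converter versions);
    // the path and feature count here are placeholders.
    import * as tf from '@tensorflow/tfjs-node';

    export async function loadQuantizedModel(path = 'file://./web_model/model.json') {
      return tf.loadGraphModel(path);
    }

    export function scoreOne(model: tf.GraphModel, features: number[]): number[] {
      // Requests arrive one at a time in a function-per-request setup,
      // so inference runs on a batch of one instead of waiting to fill a batch.
      const input = tf.tensor2d([features]);          // shape [1, featureCount]
      const output = model.predict(input) as tf.Tensor;
      const scores = Array.from(output.dataSync());   // small output, sync read is fine
      tf.dispose([input, output]);
      return scores;
    }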
deeplearningwiz 6 months ago
Great to see more deep learning fans! Did you ever consider using Hugging Face or ONNX.js instead of TensorFlow.js? #ML #DeepLearning
tensorflow_enthusiast 6 months ago
I did take a look at them, but since I already have experience with TensorFlow, I decided to stick with it. Thanks for the recommendation! #TensorFlow #DeepLearning #HuggingFace #ONNX
ai_curious 6 months ago
Fascinating project. I'm also curious about deploying TensorFlow models using the serverless approach. Do you think it's more cost-effective than using traditional servers? #DeepLearning #TensorFlow #Serverless
tensorflow_enthusiast 6 months ago
Absolutely. There were some compromises, but once the model has been fine-tuned, the cost efficiency is well worth it. #DeepLearning #TensorFlow #Serverless
bigdataguru 6 months ago
This is a wonderful step towards democratizing large-scale deep learning. How do you manage the cold start issue in your architecture? #BigData #TensorFlow #DL
tensorflow_enthusiast 6 months ago
Thank you! I handle cold starts with a Lambda warm-up approach that keeps instances warm between requests, which does increase costs slightly. #BigData #TensorFlow #DL #Serverless
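Roughly, the pattern is the sketch below (simplified; the "warmup" marker is a convention rather than a Lambda built-in, and the scheduled pings can come from something like an EventBridge rule firing every few minutes):

    // Sketch of the warm-up pattern: a scheduled ping carries a marker payload and
    // the handler returns early, so the container (and the model loaded at init)
    // stays warm. Paths and the event shape are placeholders.
    import * as tf from '@tensorflow/tfjs-node';

    // Start loading during container init so even the first real request finds it ready.
    const modelPromise = tf.loadGraphModel('file://./web_model/model.json');

    export async function handler(event: { warmup?: boolean; features?: number[] }) {
      if (event.warmup) {
        await modelPromise;                    // keep the loaded model resident
        return { statusCode: 204, body: '' };  // nothing else to do on a ping
      }
      if (!event.features) {
        return { statusCode: 400, body: 'missing "features" in request' };
      }
      const model = await modelPromise;
      const input = tf.tensor2d([event.features]);      // single-example batch
      const output = model.predict(input) as tf.Tensor;
      const scores = Array.from(await output.data());
      tf.dispose([input, output]);
      return { statusCode: 200, body: JSON.stringify({ scores }) };
    }

The trade-off is what I mentioned above: the scheduled pings are billed like any other invocation, so you pay a small, steady cost in exchange for fewer slow first requests.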
jsballer 6 months ago
What about GPU-enabled functions? Have you played around with those to improve inference time in your serverless architecture? #Serverless #ML #Javascript
tensorflow_enthusiast 6 months ago
Unfortunately, GPU support isn't well-optimized on most serverless providers, so I relied on custom model optimizations instead. #Serverless #ML #TensorFlow #Javascript
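To give one flavor of what that can look like (a generic sketch rather than my exact code; the path and shapes are placeholders): lean on tfjs-node's native CPU kernels and warm the model once at container init so the first real request doesn't pay the one-off setup cost.

    // A common CPU-side mitigation when GPUs aren't available: run a throwaway
    // prediction at init so kernel and buffer setup happens before real traffic
    // arrives. The model path and feature count are placeholders.
    import * as tf from '@tensorflow/tfjs-node';

    export async function loadAndWarm(featureCount: number): Promise<tf.GraphModel> {
      const model = await tf.loadGraphModel('file://./web_model/model.json');

      // Dummy pass over zeros; the result is discarded, the point is the side
      // effect of initializing kernels and allocations up front.
      const dummy = tf.zeros([1, featureCount]);
      const out = model.predict(dummy) as tf.Tensor;
      out.dataSync();
      tf.dispose([dummy, out]);

      return model;
    }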