110 points by serverless_ninja 6 months ago | 19 comments
lambdauser 6 months ago next
Exciting news! I've been playing around with this and it's awesome. Easy to deploy models using TensorFlow and Lambda. Great work AWS team!
awspro 6 months ago next
Happy to hear that! We've tried to make it as simple as possible for developers to deploy their ML models with minimal maintenance overhead!
optimizer 6 months ago prev next
This is interesting. But has anyone tried deploying larger models? Are there any limitations on model size?
awsdev 6 months ago next
We did test with bigger models and hit some limitations. However, we found ways around them by splitting larger models up and packaging the pieces as Lambda layers.
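A rough sketch of what that can look like on the consuming side: Lambda layers are unpacked under /opt at runtime, so a handler can load model pieces from there and cache them across warm invocations. The directory layout, environment variable, and event shape below are assumptions for illustration, not an official AWS pattern.

```python
import json
import os

# Lambda layers are extracted under /opt at runtime; packaging a model
# (or a shard of one) as a layer makes it readable from there.
# The exact path is an assumption for this sketch.
MODEL_DIR = os.environ.get("MODEL_DIR", "/opt/model")

_model = None  # cached across warm invocations of the same container


def parse_event(event):
    """Pull the model inputs out of an API Gateway-style event body."""
    return json.loads(event["body"])["inputs"]


def get_model():
    """Load the TensorFlow SavedModel lazily, on first invocation."""
    global _model
    if _model is None:
        import tensorflow as tf  # deferred import keeps cold starts cheaper
        _model = tf.saved_model.load(MODEL_DIR)
    return _model


def handler(event, context):
    outputs = get_model()(parse_event(event))
    return {
        "statusCode": 200,
        "body": json.dumps({"outputs": outputs.numpy().tolist()}),
    }
```

Caching the loaded model in a module-level variable means only the first request on a fresh container pays the load cost; subsequent warm invocations reuse it.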
mlops 6 months ago prev next
I agree with #optimizer; limitations on model size are a concern. One possible solution would be to deploy models on ECS/Fargate, which allows more flexibility in memory and CPU allocation.
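For comparison, a Fargate task definition fragment showing the kind of larger CPU/memory combination available there; the family name, image URI, and specific values are just example placeholders:

```json
{
  "family": "ml-inference",
  "requiresCompatibilities": ["FARGATE"],
  "networkMode": "awsvpc",
  "cpu": "4096",
  "memory": "30720",
  "containerDefinitions": [
    {
      "name": "model-server",
      "image": "<account>.dkr.ecr.<region>.amazonaws.com/model-server:latest",
      "portMappings": [{"containerPort": 8080}]
    }
  ]
}
```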
cyro 6 months ago prev next
Don't forget to consider the cost implications of deploying ML models using this new serverless architecture.
fargate_fan 6 months ago next
Absolutely true! Serverless architecture can be cost-effective but at certain scales, it might become expensive compared to traditional alternatives such as EC2.
cloud_enthusiast 6 months ago next
Are there any tools that can help estimate the cost of deploying ML models using Lambda? It would be nice to have an overview of the costs before we start building!
awsguy 6 months ago next
Good question! AWS provides a cost estimation tool that can be used to estimate the costs for running serverless applications.
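The basic arithmetic is also easy to do by hand: Lambda bills per request plus per GB-second of compute. A minimal back-of-the-envelope sketch; the default prices below are assumptions based on typical published Lambda rates, so check current AWS pricing before relying on them:

```python
def lambda_cost_usd(invocations, avg_duration_ms, memory_mb,
                    price_per_million_requests=0.20,
                    price_per_gb_second=0.0000166667):
    """Rough Lambda cost estimate: request charges plus GB-seconds
    of compute. Default prices are assumptions; verify against the
    current AWS pricing page."""
    request_cost = (invocations / 1_000_000) * price_per_million_requests
    gb_seconds = invocations * (avg_duration_ms / 1000) * (memory_mb / 1024)
    compute_cost = gb_seconds * price_per_gb_second
    return request_cost + compute_cost
```

For example, a million 1-second inferences at 1024 MB is one million GB-seconds of compute on top of the request charge, which is where ML workloads (long durations, high memory) can get expensive relative to typical web handlers.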
bigmodel 6 months ago prev next
Will this be available for other ML frameworks besides TensorFlow? E.g. PyTorch, JAX...
antonbla 6 months ago next
As of right now, only TensorFlow is supported; however, we are always looking to expand the list of frameworks. Please let us know which frameworks you'd most like to see.
container_champ 6 months ago prev next
Has anyone tried using TorchServe to deploy PyTorch models on ECS? This approach leverages containers to achieve a serverless-like deployment.
function_guru 6 months ago next
Yes, this is a great alternative if you are using PyTorch and want serverless-like deployment with Docker containers on ECS. #container_champ, did you test the performance?
container_champ 6 months ago next
Performance was better than expected! TorchServe made managing, deploying, and scaling models more seamless.
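Once a TorchServe container is running, clients hit its inference REST API. A small stdlib-only sketch of a client; the host, model name, and payload shape are placeholders, and this assumes TorchServe's default inference port (8080) and `/predictions/<model_name>` path:

```python
import json
import urllib.request


def prediction_url(host, model_name, port=8080):
    """TorchServe serves inference requests at /predictions/<model_name>
    on port 8080 by default."""
    return f"http://{host}:{port}/predictions/{model_name}"


def predict(host, model_name, payload):
    """POST a JSON payload to a TorchServe endpoint and decode the reply."""
    req = urllib.request.Request(
        prediction_url(host, model_name),
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())
```

Behind an ECS service with a load balancer, `host` would be the load balancer's DNS name rather than a container address.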
greentea 6 months ago prev next
How does this compare to GCP and Azure's offerings for ML model deployment?
tech_evangelist 6 months ago next
Both GCP and Azure offer decent ML model deployment services, but AWS's approach of integrating with Lambda makes it more accessible for developers with AWS expertise.
cloud_warrior 6 months ago prev next
I personally think all three are strong options but it ultimately depends on the individual organization's preference and existing infrastructure.
agile_developer 6 months ago prev next
As a fan of AWS Lambda, I'm curious if this will be part of the AWS Free Tier?
aws_marketer 6 months ago next
Serverless Deep Learning with AWS Lambda is not part of the AWS Free Tier at the moment, but it remains under consideration for future promotions.