111 points by awsguy 6 months ago flag hide 15 comments
lambdalover 6 months ago next
This is such a cool project, I've been looking to build something similar for my company! I'm curious, what challenges did you face when building the pipeline? Any tips for those looking to do the same?
serverless_dev 6 months ago next
Thanks for the kind words! The main challenge I faced was working around the limitations of AWS Lambda, specifically the 15-minute execution time limit. I ended up using AWS Step Functions to orchestrate the transcoding process and handle time-intensive jobs.
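Roughly, the pattern is the classic "start job, then poll until done" loop in Step Functions, so no single Lambda invocation has to outlive the job. Here's a minimal sketch of the Amazon States Language definition as a Python dict; the Lambda ARNs and the `$.status` field are placeholders for whatever your check-status function returns, not values from my actual pipeline:

```python
import json

# Poll-until-done pattern: one Lambda starts the transcode, then the state
# machine loops Wait -> CheckStatus -> Choice until the job reports COMPLETE.
# Each Lambda invocation stays short; only the state machine is long-lived.
def build_transcode_state_machine(start_arn: str, check_arn: str) -> dict:
    return {
        "StartAt": "StartTranscode",
        "States": {
            "StartTranscode": {"Type": "Task", "Resource": start_arn, "Next": "WaitForJob"},
            "WaitForJob": {"Type": "Wait", "Seconds": 60, "Next": "CheckStatus"},
            "CheckStatus": {"Type": "Task", "Resource": check_arn, "Next": "IsJobDone"},
            "IsJobDone": {
                "Type": "Choice",
                "Choices": [
                    {"Variable": "$.status", "StringEquals": "COMPLETE", "Next": "Success"}
                ],
                "Default": "WaitForJob",
            },
            "Success": {"Type": "Succeed"},
        },
    }

definition = build_transcode_state_machine(
    "arn:aws:lambda:us-east-1:123456789012:function:StartTranscode",
    "arn:aws:lambda:us-east-1:123456789012:function:CheckTranscodeStatus",
)
print(json.dumps(definition, indent=2))
```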
aws_expert 6 months ago prev next
Step Functions is definitely the way to go when dealing with long-running Lambda functions. Another option is to use AWS Fargate to run a containerized transcoding application, but that can be more expensive.
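For the Fargate route, you'd launch a one-off containerized transcode with `ecs.run_task`. Sketch below with hypothetical cluster, task definition, and subnet names; the actual AWS call is commented out so this runs standalone:

```python
# Build the parameters for a one-off Fargate transcoding task.
# Cluster/task/subnet identifiers here are made up for illustration.
def build_fargate_task_params(bucket: str, key: str) -> dict:
    return {
        "cluster": "transcode-cluster",
        "taskDefinition": "video-transcoder:1",
        "launchType": "FARGATE",
        "networkConfiguration": {
            "awsvpcConfiguration": {
                "subnets": ["subnet-0abc"],
                "assignPublicIp": "ENABLED",
            }
        },
        "overrides": {
            "containerOverrides": [{
                "name": "transcoder",
                # Tell the container which object to transcode via env vars.
                "environment": [
                    {"name": "INPUT_BUCKET", "value": bucket},
                    {"name": "INPUT_KEY", "value": key},
                ],
            }]
        },
    }

params = build_fargate_task_params("uploads-bucket", "raw/movie.mp4")
# import boto3
# boto3.client("ecs").run_task(**params)
print(params["launchType"])
```

No 15-minute cap this way, but you pay for the task's full runtime, which is why it tends to cost more than Lambda for bursty workloads.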
cost_concious 6 months ago prev next
Speaking of cost, have you looked into using other cloud providers for transcoding? Google Cloud has some pretty competitive pricing and I've heard good things about their video API.
lambdalover 6 months ago next
I did consider Google Cloud, but ultimately decided to stick with AWS because of their extensive documentation and community support. In terms of cost, I haven't had any issues with AWS as of yet, but I'll definitely consider Google Cloud for future projects.
newbie_dev 6 months ago prev next
@serverless_dev could you explain how you set up the pipeline? I'm having trouble figuring out where to start.
serverless_dev 6 months ago next
Sure thing! I started by setting up a simple API Gateway to trigger the Lambda function when a new video is uploaded. Then, I used the AWS SDK to interact with Elastic Transcoder, which handles the actual transcoding process. From there, I used S3 to store the transcoded videos and implemented CloudFront for content delivery.
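The Lambda side is pretty small. Here's a rough sketch of the handler, assuming an S3-style event payload; the pipeline ID is a placeholder and the preset shown is AWS's generic 720p system preset (double-check the ID in your region's console). The `create_job` call itself is commented out so the snippet runs locally:

```python
# Lambda handler sketch: on a new upload event, build an Elastic Transcoder
# job request. Pipeline ID and output layout are illustrative assumptions.
def build_job_params(event: dict, pipeline_id: str) -> dict:
    # Pull the uploaded object's key out of the S3 event record.
    key = event["Records"][0]["s3"]["object"]["key"]
    return {
        "PipelineId": pipeline_id,
        "Input": {"Key": key},
        "Outputs": [{
            "Key": f"transcoded/{key}",
            "PresetId": "1351620000001-000010",  # generic 720p system preset
        }],
    }

def handler(event, context):
    params = build_job_params(event, "PLACEHOLDER_PIPELINE_ID")
    # import boto3
    # boto3.client("elastictranscoder").create_job(**params)
    return params

sample_event = {"Records": [{"s3": {"object": {"key": "raw/movie.mp4"}}}]}
print(handler(sample_event, None)["Outputs"][0]["Key"])  # transcoded/raw/movie.mp4
```

Elastic Transcoder writes the output to the pipeline's configured S3 bucket, and CloudFront just sits in front of that bucket.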
newbie_dev 6 months ago next
Thanks for the explanation, that makes a lot of sense! One more question, how did you handle the video metadata? Do you have any recommendations for a database solution?
lambdalover 6 months ago next
I used AWS DynamoDB to store the video metadata, as it's a fully managed NoSQL database and integrates well with the rest of the AWS ecosystem. As for recommendations, I would say to consider the volume and type of data you'll be dealing with before deciding on a database solution.
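The metadata write is a single `put_item` keyed on the video ID. Sketch below; the table name and attribute names are my own illustration, not necessarily what you'd want in production, and the actual AWS call is commented out:

```python
# Build a DynamoDB item for one video's metadata. "video_id" acts as the
# partition key; all other attributes are schemaless in DynamoDB.
def build_metadata_item(video_id: str, title: str,
                        duration_seconds: int, s3_key: str) -> dict:
    return {
        "video_id": video_id,            # partition key
        "title": title,
        "duration_seconds": duration_seconds,
        "s3_key": s3_key,                # where the transcoded file lives
        "status": "TRANSCODED",
    }

item = build_metadata_item("vid-001", "Launch demo", 95,
                           "transcoded/raw/movie.mp4")
# import boto3
# boto3.resource("dynamodb").Table("video-metadata").put_item(Item=item)
print(item["video_id"])  # vid-001
```

Since DynamoDB is schemaless, you can add attributes (resolutions, caption tracks, etc.) later without a migration.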
aws_expert 6 months ago prev next
Another option for managed databases on AWS is Amazon DocumentDB, which is designed for use with MongoDB workloads. It's a good choice if you're already familiar with MongoDB syntax and structure.
security_concious 6 months ago prev next
How do you handle security in your pipeline? Specifically, how do you ensure that only authorized users can access the transcoded videos?
serverless_dev 6 months ago next
I'm glad you brought up the topic of security! In my pipeline, I use Amazon Cognito for user authentication and authorization. Cognito integrates with AWS Identity and Access Management (IAM) to control access to your resources based on the user's authenticated state. For example, you can restrict access to a specific S3 bucket based on a user's IAM role, ensuring that only authorized users can view or download the transcoded videos.
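Concretely, the per-user restriction is just an IAM policy on the Cognito authenticated role that scopes each identity to its own S3 prefix. Sketch below; the bucket name is a placeholder, but `${cognito-identity.amazonaws.com:sub}` is a real IAM policy variable that gets substituted with the caller's Cognito identity ID at request time:

```python
import json

# IAM policy attached to the Cognito authenticated role: each user can only
# read objects under a prefix matching their own Cognito identity ID.
policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Action": ["s3:GetObject"],
        "Resource": [
            "arn:aws:s3:::transcoded-videos/"
            "${cognito-identity.amazonaws.com:sub}/*"
        ],
    }],
}
print(json.dumps(policy, indent=2))
```

So user A physically cannot fetch user B's videos, even with a direct S3 URL, because the substituted prefix won't match.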
security_concious 6 months ago next
Thanks for the detailed response! I'll definitely be looking into Cognito for my next project.
video_editing 6 months ago prev next
This is a great project, but have you considered adding support for closed captions or subtitles? I think that would make it even more user-friendly and accessible.
serverless_dev 6 months ago next
That's a great suggestion! I actually had closed captions and subtitles support in a previous version of the pipeline, but removed it to simplify the architecture. I'll definitely consider adding it back in a future version!