123 points by serverlesssam 7 months ago | 10 comments
someuser1 7 months ago
Great post! I've been thinking about building a serverless app for a while now. Do you mind sharing more about how you ensured 99.99% uptime? I'd imagine that would be quite challenging.
original_poster 7 months ago
Glad you enjoyed the post! Uptime was a major concern for me too, but I decided to use a combination of AWS Lambda@Edge and S3 for hosting. This way, game sessions are spun up as needed and torn down when not in use. The game has a small lobby where players can meet before starting a game, and since the lobby itself is serverless, it's what keeps the app available around the clock. The game itself uses WebRTC for peer-to-peer communication, avoiding dedicated game servers altogether.
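The lobby step described above could be sketched as a simple matchmaking queue. This is a hypothetical TypeScript sketch, not the author's code: the `Player` shape, the `pairPlayers` helper, and the two-player match size are all my assumptions.

```typescript
// Hypothetical sketch of the lobby's matchmaking step: players wait in a
// queue, and once enough have joined, a match is formed and a peer-to-peer
// game session can be negotiated over WebRTC.
interface Player {
  id: string;
  joinedAt: number; // epoch millis, used for first-come-first-served ordering
}

interface Match {
  players: Player[];
}

// Pull players off the front of the queue in join order, forming matches of
// `size` players each; anyone left over keeps waiting in the lobby.
function pairPlayers(
  lobby: Player[],
  size: number
): { matches: Match[]; waiting: Player[] } {
  const sorted = [...lobby].sort((a, b) => a.joinedAt - b.joinedAt);
  const matches: Match[] = [];
  while (sorted.length >= size) {
    matches.push({ players: sorted.splice(0, size) });
  }
  return { matches, waiting: sorted };
}
```

Once a match is formed, the matched peers would exchange WebRTC offers/answers through whatever signaling channel the lobby provides.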
someuser2 7 months ago
Interesting design. How do you maintain state and user sessions though? Are players required to rejoin a game every time it scales up?
original_poster 7 months ago
No, players don't need to rejoin games as I use a mix of Lambda, DynamoDB, and AWS AppSync to manage state and sessions. I store game sessions in a DynamoDB table, and AWS AppSync triggers a Lambda function to generate game states for players. For WebSocket-based real-time features, I use AppSync's GraphQL subscriptions to avoid maintaining long-polling connections. Everything is automated and runs within the AWS ecosystem.
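A session record like the one described might map to DynamoDB's low-level item format roughly like this. This is a hedged sketch: the table name (`GameSessions`), attribute names, and `GameSession` shape are my guesses, not the author's actual schema.

```typescript
// Hypothetical shape of a game-session record and the low-level DynamoDB
// PutItem input it would map to. Table and attribute names are assumptions.
interface GameSession {
  sessionId: string;
  playerIds: string[];
  state: "lobby" | "in-game" | "finished";
  updatedAt: number;
}

// DynamoDB's low-level API expects typed AttributeValue objects
// ({ S }, { N }, { SS }, ...), with numbers encoded as strings.
function toPutItemInput(session: GameSession) {
  return {
    TableName: "GameSessions",
    Item: {
      sessionId: { S: session.sessionId },
      playerIds: { SS: session.playerIds }, // string set
      state: { S: session.state },
      updatedAt: { N: String(session.updatedAt) },
    },
  };
}
```

An AppSync resolver backed by a Lambda could then write this item on every state change and push the new state to subscribed clients.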
someuser3 7 months ago
This feels innovative, but how do you manage possible performance differences between players for WebRTC gaming? Is there any sort of mitigation?
original_poster 7 months ago
That's a great question! Since the browser gives you little control over resource management or scheduling, performance differences between players' machines can still occur. Most of the heavy logic runs server-side via Lambda, so the browser is mainly responsible for rendering. To even out the differences between players, I implemented a few things: lag compensation, movement interpolation, and client-side prediction while the browser renders. Hope that helps explain how performance stays stable between players.
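The movement-interpolation idea above can be sketched in a few lines: the client renders a remote player slightly in the past, blending between the two authoritative snapshots that bracket the render time. The `Snapshot` shape and field names here are my assumptions, not the author's.

```typescript
// Hypothetical sketch of movement interpolation for a remote player.
interface Snapshot {
  t: number; // server timestamp (ms)
  x: number;
  y: number;
}

// Linearly interpolate position at renderTime between two snapshots.
// Alpha is clamped to [0, 1] so we never extrapolate past the newest data.
function interpolate(
  a: Snapshot,
  b: Snapshot,
  renderTime: number
): { x: number; y: number } {
  const span = b.t - a.t;
  const alpha =
    span === 0 ? 1 : Math.min(1, Math.max(0, (renderTime - a.t) / span));
  return { x: a.x + (b.x - a.x) * alpha, y: a.y + (b.y - a.y) * alpha };
}
```

Prediction would apply the same idea to the local player in the other direction: apply inputs immediately, then reconcile when the authoritative state arrives.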
someuser4 7 months ago
Thanks for the explanation. Did you face any difficulties deploying to a production environment?
original_poster 7 months ago
Yes, definitely. Here are some of the challenges I faced:

1. Ensuring 99.99% uptime proved more difficult than I imagined, requiring constant monitoring and refinements to the system.
2. Fine-tuning AWS Lambda@Edge was a bit tricky, especially since I had to make sure the latency for global players was minimal.
3. WebRTC can be hard to manage between browsers, so I provided a fallback system using WebSockets.
4. Cost optimization: keeping track of AWS fees and managing resources efficiently.
5. Calling AWS services directly from Lambda has a cold start problem, so I also resolved that by using the serverless-http-aws-lambda package.

Overall, there were many obstacles, but it's incredibly rewarding to see the app in production now.
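The WebSocket fallback mentioned in the list above could look something like this on the client. A hedged sketch only: the transport-selection rule and the backoff parameters are my own choices, not the author's implementation.

```typescript
// Hypothetical fallback logic: prefer WebRTC, drop to WebSocket when the
// browser lacks support or peer negotiation has failed.
function chooseTransport(
  webrtcSupported: boolean,
  negotiationFailed: boolean
): "webrtc" | "websocket" {
  return webrtcSupported && !negotiationFailed ? "webrtc" : "websocket";
}

// Exponential backoff schedule for WebSocket reconnect attempts,
// doubling from baseMs and capped at capMs (values are assumptions).
function backoffDelays(attempts: number, baseMs = 1000, capMs = 10000): number[] {
  const delays: number[] = [];
  for (let i = 0; i < attempts; i++) {
    delays.push(Math.min(capMs, baseMs * 2 ** i));
  }
  return delays;
}
```

In practice a real client would also add jitter to the delays so reconnecting players don't stampede the endpoint at the same instant.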
someuser5 7 months ago
Wow, that sounds like a huge lift! Have you open-sourced your implementation, and what tech stack did you choose for your front-end?
original_poster 7 months ago
Unfortunately, I haven't open-sourced anything yet, but I'm planning to upload the source code to my GitHub early next year (2023). The front-end of my app uses React with TypeScript for the primary UI framework. I also rely on Redux-Saga for side-effects, and AWS Amplify for the integration with my backend services and authentication.