#serverless

The best ways to test your serverless applications

  • For the serverless functions you write, test for each of the following risks: configuration (databases, tables, access rights), technical workflow (parsing and using incoming requests, handling of successful responses and errors), business logic, and integration (correctly reading incoming request structures, writing to databases in the right order).
  • Break up functions into hexagonal architecture (ports and adapters) with separation of concerns through layers of responsibility.
  • For unit tests, use a local or mock adapter to test the function’s business layer in isolation.
  • Use adapters to simulate third-party end services when testing integrations. Save memory and time by testing file storage integration against an in-memory adapter rather than running full integration tests.
  • For proper monitoring of integrations, use back-end tools such as IOpipe, Thundra, Dashbird, Epsagon, etc., and front-end tools such as Sentry or Rollbar. You can also use an open-source error tracking app such as Desole that you install in your AWS account.
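
The ports-and-adapters split described above can be sketched as follows. This is a minimal illustration, not the post's own code: the storage port, the `InMemoryStorage` adapter and the `handle_order` business function are all hypothetical names, standing in for whatever real service (e.g. DynamoDB) a production adapter would wrap.

```python
# Sketch of hexagonal architecture for a serverless function: the business
# layer depends only on a storage "port" (anything with a save method), so
# unit tests swap in an in-memory adapter instead of a real database.

class InMemoryStorage:
    """In-memory adapter implementing the storage port, for unit tests."""
    def __init__(self):
        self.items = {}

    def save(self, key, value):
        self.items[key] = value

def handle_order(storage, order):
    """Business layer: validates an order and stores it via the port."""
    if "id" not in order:
        raise ValueError("order must have an id")
    storage.save(order["id"], order)
    return {"status": "stored", "id": order["id"]}

# Unit test of the business layer in isolation -- no AWS calls involved:
storage = InMemoryStorage()
result = handle_order(storage, {"id": "42", "total": 10})
assert result == {"status": "stored", "id": "42"}
assert storage.items["42"]["total"] == 10
```

In production the same `handle_order` would receive a real adapter wrapping the database client; the business logic never changes.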

Full post here, 10 mins read

The good and bad of serverless

The good

  • It’s truly scalable and saves you from the pains of managing servers manually.
  • Serverless applications are a notch above Virtual Private Servers - you only pay for what you need.
  • Developers on your team don’t have to deal with the technicalities of setting up scaling policies or configuring load balancers, VPCs, server provisioning, etc.

The bad

  • Cold starts occur when a function has been idle. To mitigate them, ping your functions periodically to keep them warm, or set up a single function to handle all API calls so that cold starts happen only once.
  • The need for applications to be truly stateless. You must design your application to be ready to serve a request from a cold, dead state.
  • Not ideal for long-running jobs. Re-examine whether the time limit hinders your ability to process all the data or try using Lambda recursively.
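
The keep-warm mitigation above can be sketched as a handler that short-circuits on ping events. This is a hypothetical pattern, not the post's code: the `"keep-warm-ping"` event marker is an assumed convention you would configure in your scheduler.

```python
# Sketch of the keep-warm pattern: a scheduled "ping" event (marker name
# assumed) returns immediately, so real work only runs for genuine API
# calls while periodic pings keep the container warm.

def handler(event, context=None):
    if event.get("source") == "keep-warm-ping":  # assumed ping marker
        return {"warmed": True}                  # skip all real work
    # ... genuine request handling below ...
    return {"statusCode": 200, "body": "handled: " + event.get("path", "/")}

assert handler({"source": "keep-warm-ping"}) == {"warmed": True}
assert handler({"path": "/orders"})["statusCode"] == 200
```

Returning early keeps the ping cheap: you pay only a few milliseconds per scheduled invocation.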

Full post here, 9 mins read

Cold start/warm start with AWS Lambda

  • Programming language can impact the duration of a cold start in Lambda: Java and C# are typically slower to initialize than Go, Python or Node but they perform better on warm calls.
  • Adding a framework to structure the code deployed in Lambda increases execution time with cold calls, which can be minimized by using a serverless-oriented framework as opposed to a web framework. Typically, frameworks don’t impact warm calls.
  • In serverless applications, one way to avoid cold starts is to keep Lambda warm beyond its fixed 5-minute life by preventing it from being unloaded. You can do this by setting up a cron to invoke Lambda at regular intervals. However, AWS Lambda will still reset every 4 hours and autoscaling must be taken into account.
  • To avoid cold starts under concurrent calls triggered by autoscaling, keep pools of Lambda instances warm as above; but you will need to determine an optimal pool size to avoid wasting resources.
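
One way to observe the cold/warm distinction discussed above is to exploit the fact that module-level code runs once per container, at cold start, while warm invocations reuse the same module state. A minimal sketch (the handler and flag names are illustrative, not from the post):

```python
# Sketch: module-level state persists across warm invocations of the same
# Lambda container, so a global flag distinguishes cold from warm calls.
import time

_container_started = time.time()  # runs once per container, at cold start
_is_cold = True

def handler(event, context=None):
    global _is_cold
    cold = _is_cold
    _is_cold = False  # every later call in this container is a warm call
    return {"cold_start": cold,
            "container_age_s": time.time() - _container_started}

first = handler({})
second = handler({})
assert first["cold_start"] is True    # fresh container: cold start
assert second["cold_start"] is False  # same container: warm call
```

Logging this flag per invocation is a simple way to measure how often your warming strategy actually prevents cold starts.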

Full post here, 11 mins read

Tips to speed up serverless web apps in AWS

  • Keep Lambda functions warm by invoking a ping function on a schedule, using AWS CloudWatch Scheduled Events or the Serverless WarmUP plugin.
  • Avoid cross-origin resource sharing (CORS) overhead by serving your API and frontend from the same origin. Set the origin protocol policy to HTTPS when connecting API Gateway to AWS CloudFront, configure both API Gateway and CloudFront on the same domain, and set up their routing accordingly.
  • Deploy API gateways as REGIONAL endpoints.
  • Optimize the frontend by compressing assets such as JavaScript and CSS files with gzip before uploading them to S3. Send the correct Content-Encoding: gzip header, and enable Compress Objects Automatically in CloudFront.
  • Allocate appropriate memory to your Lambda functions: Lambda assigns CPU power in proportion to memory, so increasing the memory setting is also how you get more CPU speed.
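
The gzip-then-upload step above can be sketched with boto3. This is an illustrative helper, not the post's code: the bucket name is hypothetical, and the function only prepares the `put_object` arguments so it can run without AWS credentials.

```python
# Sketch of the frontend-compression step: gzip an asset and prepare the
# S3 upload arguments with the correct Content-Encoding header.
# The bucket name below is a placeholder.
import gzip

def gzipped_upload_args(key, data, content_type, bucket="my-frontend-bucket"):
    body = gzip.compress(data)
    return {
        "Bucket": bucket,
        "Key": key,
        "Body": body,
        "ContentType": content_type,
        "ContentEncoding": "gzip",  # lets browsers decompress transparently
    }

args = gzipped_upload_args("app.js", b"console.log('hi');" * 100,
                           "application/javascript")
assert args["ContentEncoding"] == "gzip"
assert len(args["Body"]) < len(b"console.log('hi');" * 100)  # smaller payload
assert gzip.decompress(args["Body"]).startswith(b"console.log")
# With AWS credentials configured, the actual upload would be:
#   boto3.client("s3").put_object(**args)
```

Because `Content-Encoding: gzip` is stored as object metadata, S3 and CloudFront serve the header back to browsers, which decompress automatically.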

Full post here, 4 mins read

4 serverless myths to understand before getting started with AWS

  • One myth is that serverless implies Functions as a Service (FaaS). Cloud services are serverless if no servers are exposed for you to administer, they scale automatically, and you pay only for what you use. In fact, serverless need not mean web-based apps, and can include real-time analytics and processing, so look beyond functions.
  • Don’t think that serverless is a silver bullet. Serverless technology is best suited for event-based architectures, rather than traditional client-server architecture, and you need to beware of recreating monolithic structures.
  • Another common myth is that serverless means an end to operational burdens. Though you need not administer servers, advanced observability is not built in, so you still need operational effort to monitor, maintain and scale effectively.
  • Don’t believe that serverless is infinitely scalable. Serverless services offer high availability but cannot scale infinitely - each service has limits, such as Lambda’s memory limits and Kinesis’ throughput limits - so optimize within those limits and plan for failure scenarios to ensure resilience.

Full post here, 6 mins read