![](https://static.wixstatic.com/media/9a3369_b894edd3ea4a4820bc8277d945674cc3~mv2.jpeg/v1/fill/w_971,h_421,al_c,q_85,enc_avif,quality_auto/9a3369_b894edd3ea4a4820bc8277d945674cc3~mv2.jpeg)
Let me start this article by telling you that "Serverless Architecture" does NOT mean running your code without any servers. Serverless means writing and deploying code without worrying about servers, containers, or even virtual machines.
Let me give you a quick walkthrough of the evolution of this concept.
Physical Servers
![](https://static.wixstatic.com/media/9a3369_c4c656f93fe9434c81d06dbf58fc044e~mv2.jpeg/v1/fill/w_318,h_301,al_c,q_80,enc_avif,quality_auto/9a3369_c4c656f93fe9434c81d06dbf58fc044e~mv2.jpeg)
Not so long ago, many of my clients invested huge amounts of money in procuring physical hardware to host their applications. Their mindset at the time was very clear: why rent when you can buy and keep the asset yourself? I am sure those investments are now occupying some valuable square feet of their office space. In short, we are all aware of the issues and limitations associated with physical servers and the advantages of moving to the cloud.
![](https://static.wixstatic.com/media/9a3369_fbb3650873b64fabbff8e7d6ae1a1186~mv2.jpeg/v1/fill/w_514,h_375,al_c,q_80,enc_avif,quality_auto/9a3369_fbb3650873b64fabbff8e7d6ae1a1186~mv2.jpeg)
Virtual Servers
The cloud was a revolutionary concept introduced to the public around 2005. Physical servers were replaced by virtual ones, and a "server" was no longer a physical entity but a unit of compute running somewhere, with storage and computing resources available on demand at the click of a button. Initially there were apprehensions about the cloud and concerns regarding data location and security, but slowly the cloud emerged as the winner over physical hardware.
Containers
![](https://static.wixstatic.com/media/9a3369_005da2d92c4445918f32379b2431309b~mv2.jpeg/v1/fill/w_387,h_403,al_c,q_80,enc_avif,quality_auto/9a3369_005da2d92c4445918f32379b2431309b~mv2.jpeg)
Application deployment, be it on a physical or virtual server, has its own challenges. As a further step toward application isolation and ease of deployment, containers were introduced. A container is not a VM, but it provides VM-like application isolation with a much lighter footprint: each container gets its own file system, CPU share, memory, and process space. Containers package your deployed resources and can run on VMs or on physical servers.
Serverless Architecture
Even with VMs and containers, you need to worry about procuring servers, deploying your application, making sure patches are up to date, ports are open, and antivirus is installed, and above all monitoring your application and making sure it scales up or down as needed.
Serverless is also called FaaS (Function as a Service): your application is broken down into functions that are hosted and executed by a provider.
You are charged only for the time your function actually runs. That's really cool, right?
![](https://static.wixstatic.com/media/9a3369_abc1bbca3f7141dca348e47ee2dd84b8~mv2.jpeg/v1/fill/w_980,h_497,al_c,q_85,usm_0.66_1.00_0.01,enc_avif,quality_auto/9a3369_abc1bbca3f7141dca348e47ee2dd84b8~mv2.jpeg)
Serverless architecture can be considered a step beyond microservice architecture. With microservices, you break down your monolithic application into a set of smaller, independent application units; with serverless, you further break down each microservice into a set of independently running functions. Each function executes in its own container or host, isolated from the others. Isolation is the key to scalability and availability.
Architecting serverless applications is not as simple as it may sound. You need to think differently to get the most out of this architecture; the development mindset we carry today needs a paradigm shift. It is also important to select the right kind of use case for such implementations. We will see some use cases in a later section.
Serverless Providers
Serverless architecture is implemented on cloud platforms, and almost all leading cloud service providers offer serverless computing. Let me give you a glimpse of some of them.
AWS Lambda
Serverless functions are called Lambda functions in AWS. AWS Lambda was introduced in November 2014. With Lambda, you can write serverless functions in any of the following programming languages:
Node.js | Python | Java | Ruby | C# | Go | PowerShell
Azure Functions
Microsoft introduced serverless computing with Azure Functions. With Azure Functions, you can write serverless code in any of the following programming languages:
C# | JavaScript | F# | Java | PowerShell | Python | TypeScript
IBM Cloud Functions
IBM's serverless computing service is called IBM Cloud Functions. With Cloud Functions, you can write code in any of the languages below:
Node.js | Swift | Python | Java | Ruby
Google Cloud Functions
Google's serverless computing offering is called Google Cloud Functions. The languages supported by these functions are:
Node.js | Java | Python | Go
Concurrency & Function Scaling
Hassle-free deployment and no-brainer scaling are what FaaS offers its users. Different providers expose scalability options in different ways. Let us discuss how concurrency is managed and scaling is offered by AWS and Azure.
AWS Lambda
AWS isolates function execution by spinning up a lightweight micro-container and executing the Lambda function inside it. So how does Lambda maintain isolation? For every concurrent request, it runs a new instance of the micro-container with a function instance inside it, which takes up your request and performs the execution.
![](https://static.wixstatic.com/media/9a3369_a3176089cb5349e58f51ad2cd5e2f1f5~mv2.jpeg/v1/fill/w_732,h_666,al_c,q_85,enc_avif,quality_auto/9a3369_a3176089cb5349e58f51ad2cd5e2f1f5~mv2.jpeg)
Each concurrent execution results in the creation of a new instance of the container and the function, so if there are 1,000 concurrent requests, 1,000 containers and function instances will be serving them simultaneously. Once a request is complete, AWS frees up the resources. So how many instances of your function are allowed at one point in time, i.e. concurrently?
AWS applies concurrency limits to your account per region. By default, the account concurrency is 1,000 per region, which means that across the Lambda functions in one region, at most 1,000 instances will be executing at any point in time. With 1,000 as the upper limit, you can carve out a lower value of your own for a function, which is called Reserved Concurrency and must be below 1,000. You can increase the upper limit by contacting AWS Support.
If your concurrency limit is, say, 500, then once your concurrency level reaches 500, all further requests will be throttled, which means subsequent requests will be rejected by Lambda until some of those 500 executions finish.
So what if you have multiple Lambda functions in the same region; will they all share the concurrency limit for your account? Sadly, the answer is yes: all functions share the same account-level limit, and that's where you have to be very careful when writing functions. A function belonging to one application may impact a function in another application if both are in the same region.
Luckily, there is an option to cap the concurrency per function, so it is recommended that you set a concurrency limit per function as well.
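If you prefer to manage this from code rather than the console, the AWS SDK exposes the same setting. Here is a minimal sketch using the AWS SDK for JavaScript (v2); the function name, region, and limit are just example values:

```javascript
// Sketch: reserve (cap) concurrency for a single Lambda function
// using the AWS SDK for JavaScript (v2).
const AWS = require("aws-sdk");

const lambda = new AWS.Lambda({ region: "us-east-1" }); // example region

async function capConcurrency() {
  // Reserve 100 concurrent executions for this function out of the
  // account's regional pool; requests beyond this are throttled.
  await lambda
    .putFunctionConcurrency({
      FunctionName: "HelloLambda",          // example function name
      ReservedConcurrentExecutions: 100,    // example limit
    })
    .promise();
}

capConcurrency().catch(console.error);
```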
Azure Functions
![](https://static.wixstatic.com/media/9a3369_866a4a3938ba4adf9d8f078e3e170bd4~mv2.jpeg/v1/fill/w_754,h_666,al_c,q_85,enc_avif,quality_auto/9a3369_866a4a3938ba4adf9d8f078e3e170bd4~mv2.jpeg)
Azure takes an altogether different approach to processing serverless functions. Azure creates a dedicated host instance to run the function and executes multiple function instances inside the same host. It is unclear how many function instances are executed inside a single host.
Azure provides options to choose a function hosting plan as per your application's needs. There are three types of plans: the Consumption Plan, the Premium Plan, and the Dedicated Plan.
With both the Consumption and Premium plans, Azure automatically ensures the availability of the compute power required to run these functions, and functions are automatically scaled out whenever required.
Consumption Plan
This is the default hosting plan provided by Azure. The Consumption plan adds or removes host instances automatically depending on incoming requests. It is important to note that you are charged for compute resources only while functions are running inside these hosts. Billing depends on the number of executions, execution time, and memory used.
Premium Plan
The Premium plan provides the same benefits as the Consumption plan with respect to host addition and auto-scaling. Beyond this, the Premium plan also provides the following additional benefits:
Perpetually warm instances to avoid any cold start (we will discuss cold starts in a later section)
VNet connectivity
Unlimited execution duration (60 minutes guaranteed)
Premium instance sizes (one core, two core, and four core instances)
More predictable pricing
High-density app allocation for plans with multiple function apps
Dedicated Plan
The Dedicated plan gives you the option to execute serverless functions on a VM of your choice. You can choose a Dedicated plan if you already have an underutilized VM or you want your functions to run on a specific VM image.
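Whichever plan you pick, the function code itself stays the same. For comparison with the Lambda handler we will write in the next section, here is a minimal sketch of an HTTP-triggered Azure Function in Node.js, assuming the classic JavaScript programming model where the HTTP trigger binding is declared in function.json:

```javascript
// index.js — sketch of an HTTP-triggered Azure Function (Node.js).
// The HTTP trigger and output binding are assumed to be declared in function.json.
module.exports = async function (context, req) {
  const name = (req.query && req.query.name) || "Azure Functions";

  // Setting context.res sends the HTTP response back to the caller.
  context.res = {
    status: 200,
    body: `Hello from ${name}`,
  };
};
```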
Creating A Sample Lambda Function
Let us try to create a simple serverless function using AWS Lambda. Go to your AWS services console and select Lambda; the screen below will be displayed.
![](https://static.wixstatic.com/media/9a3369_eca73755fec54d1f93e19a8938fa3c88~mv2.png/v1/fill/w_980,h_550,al_c,q_90,usm_0.66_1.00_0.01,enc_avif,quality_auto/9a3369_eca73755fec54d1f93e19a8938fa3c88~mv2.png)
Click the Create Function button to display the screen below.
![](https://static.wixstatic.com/media/9a3369_58c40ce8949a450d9a16dfcb72d82ecc~mv2.png/v1/fill/w_980,h_550,al_c,q_90,usm_0.66_1.00_0.01,enc_avif,quality_auto/9a3369_58c40ce8949a450d9a16dfcb72d82ecc~mv2.png)
You can choose to create a Lambda function from scratch, use a blueprint provided by AWS, or browse the public serverless application repository created by other users.
Let us create a simple Lambda function from scratch and give it the name HelloLambda. I will take a very simple example for the sake of keeping this post short. Clicking the "Create Function" button will create the Lambda function and take you to its configuration page.
![](https://static.wixstatic.com/media/9a3369_691c34a7d7d940df83a0c1ba1c582b3d~mv2.png/v1/fill/w_980,h_547,al_c,q_90,usm_0.66_1.00_0.01,enc_avif,quality_auto/9a3369_691c34a7d7d940df83a0c1ba1c582b3d~mv2.png)
AWS provides a built-in code editor used to write the code for your function. You also have the option of uploading your existing code using Action > Upload a zip file. You need to define a handler in the format exports.handler = function(); this will be the entry point to your function.
Your function can import and use other files if required; however, the handler is the starting point for your function. For simplicity, let us return "Hello Lambda World" in the response when this function is invoked.
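A minimal version of such a handler might look like the sketch below (the console's generated template may differ slightly); returning a statusCode and body lets the API Gateway trigger we add in the next section pass the body straight back to the caller:

```javascript
// index.js — entry point for the HelloLambda function.
exports.handler = async (event) => {
  // Return a simple response; with an API Gateway trigger the body
  // is what the caller (e.g. a browser) will see.
  return {
    statusCode: 200,
    body: JSON.stringify("Hello Lambda World"),
  };
};
```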
![](https://static.wixstatic.com/media/9a3369_71ddcd906a1d40ada58eb16842b43fcc~mv2.png/v1/fill/w_980,h_336,al_c,q_85,usm_0.66_1.00_0.01,enc_avif,quality_auto/9a3369_71ddcd906a1d40ada58eb16842b43fcc~mv2.png)
Click on the Deploy button to deploy your Lambda function.
Invoking Lambda Function
Once you have a Lambda function, you need a way to execute it. Lambda provides options to add triggers in order to run your function. A trigger is a Lambda resource or an external resource that you configure to invoke your function, and you can add one or many triggers. Click on the "Add Trigger" button to display the screen below.
![](https://static.wixstatic.com/media/9a3369_67f0aaee0df44c1898bb95e47de2fb1a~mv2.png/v1/fill/w_980,h_1037,al_c,q_90,usm_0.66_1.00_0.01,enc_avif,quality_auto/9a3369_67f0aaee0df44c1898bb95e47de2fb1a~mv2.png)
As you can see, a predefined list of events is available which can be used to trigger a Lambda function. Let us select "API Gateway" as one of the trigger events.
Selecting the API Gateway trigger will display a screen to configure API Gateway. You can create a new wrapper API and attach its endpoint to your Lambda function, so whenever the API is called, the Lambda function will be executed. Pretty cool, right? For simplicity, let's keep Security set to Open.
![](https://static.wixstatic.com/media/9a3369_16f721a0bdc54779ab9765fe13bb6d50~mv2.png/v1/fill/w_980,h_880,al_c,q_90,usm_0.66_1.00_0.01,enc_avif,quality_auto/9a3369_16f721a0bdc54779ab9765fe13bb6d50~mv2.png)
Clicking "Add" button will create an API and this trigger information will also be displayed graphically in your main window. Click on the API Gateway box and your API endpoint will be displayed.
![](https://static.wixstatic.com/media/9a3369_da45624fd8c24913a6285cbe2e8a5c99~mv2.png/v1/fill/w_980,h_771,al_c,q_90,usm_0.66_1.00_0.01,enc_avif,quality_auto/9a3369_da45624fd8c24913a6285cbe2e8a5c99~mv2.png)
Copy this endpoint and open it in a browser; "Hello Lambda World" will be displayed.
![](https://static.wixstatic.com/media/9a3369_f5a6add1651447dca98816ac670ae6d5~mv2.png/v1/fill/w_980,h_444,al_c,q_90,usm_0.66_1.00_0.01,enc_avif,quality_auto/9a3369_f5a6add1651447dca98816ac670ae6d5~mv2.png)
It is pretty simple to attach an API endpoint to execute your function. You can also choose triggers like S3 events, Alexa events, etc., or pick from a range of third-party services like MongoDB, Zendesk, Shopify, etc.
![](https://static.wixstatic.com/media/9a3369_b05a01f55c51489b8d00b4dcdd1671f1~mv2.png/v1/fill/w_980,h_1078,al_c,q_90,usm_0.66_1.00_0.01,enc_avif,quality_auto/9a3369_b05a01f55c51489b8d00b4dcdd1671f1~mv2.png)
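As an illustration of a non-HTTP trigger, the sketch below shows roughly how a handler could read an S3 trigger's event payload (the Records/s3 field names follow the standard S3 event notification format; the logic itself is just an example):

```javascript
// Sketch: Lambda handler for an S3 trigger that logs each uploaded object.
exports.handler = async (event) => {
  for (const record of event.Records) {
    const bucket = record.s3.bucket.name;
    // Object keys arrive URL-encoded, with spaces as '+'.
    const key = decodeURIComponent(record.s3.object.key.replace(/\+/g, " "));
    console.log(`New object uploaded: s3://${bucket}/${key}`);
  }
  return { processed: event.Records.length };
};
```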
Testing the Lambda Function
Instead of attaching an event just to test a Lambda function, a developer has the option to test the function directly from the console.
You can create a test event by clicking the "Test > Configure Event" menu in the code editor and pass any desired values from the test event to the Lambda function. Once the test event is created, select it and click the Test button to execute your Lambda function. The results will be displayed in the console window of the code editor.
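If you keep a copy of the code locally, you can also smoke-test the handler outside the console by requiring it and passing a test event yourself. A minimal sketch, assuming the handler lives in index.js as in the example above:

```javascript
// local-test.js — quick local smoke test for the handler (assumes index.js).
const { handler } = require("./index");

(async () => {
  const testEvent = { key1: "value1" }; // same shape as a console test event
  const response = await handler(testEvent);
  console.log(response); // expect: { statusCode: 200, body: '"Hello Lambda World"' }
})();
```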
Cold Start
A serverless function is not kept ready for execution the moment you deploy it. When the first request arrives, the Lambda function is initialized inside a container and only then is it ready to take requests. This initial load time is called a cold start.
This means there will be a time lag between your request reaching the server and the start of your function's execution. If your application sees unpredictable spikes, meaning you need to serve a lot of users in a very short span of time, you can have your Lambda functions initialized in advance so that the cold start is paid before the traffic arrives.
In your function console, go to the Concurrency section and click the "Add" button. This will display the screen below.
![](https://static.wixstatic.com/media/9a3369_024cc2005407439aa5262a8b9db43aa1~mv2.png/v1/fill/w_980,h_729,al_c,q_90,usm_0.66_1.00_0.01,enc_avif,quality_auto/9a3369_024cc2005407439aa5262a8b9db43aa1~mv2.png)
Provisioned concurrency pre-initializes execution environments so that your functions are preloaded and ready for execution. You can configure it for an alias or a version. For example, if I enter 100 in the Provisioned concurrency text box, this tells Lambda to provision 100 concurrent instances that will be ready to take up my function requests; Lambda will spin up 100 micro-containers and load the function in advance.
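The same setting can be applied programmatically. Here is a minimal sketch using the AWS SDK for JavaScript (v2); the function name, qualifier, and count are example values, and note that provisioned concurrency applies to a published version or alias, not $LATEST:

```javascript
// Sketch: pre-initialize 100 execution environments for a function version/alias
// using the AWS SDK for JavaScript (v2).
const AWS = require("aws-sdk");

const lambda = new AWS.Lambda({ region: "us-east-1" }); // example region

lambda
  .putProvisionedConcurrencyConfig({
    FunctionName: "HelloLambda",            // example function name
    Qualifier: "1",                         // example published version (or an alias)
    ProvisionedConcurrentExecutions: 100,   // environments to keep warm
  })
  .promise()
  .then(() => console.log("Provisioned concurrency requested"))
  .catch(console.error);
```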
Benefits Of Serverless Architecture
Some of the benefits that Serverless architecture provides are
Easy Deployment & Auto Scaling
Charged Only On Execution
Improved Latency & Global Reach
Shorter Development & Release Cycle
Use Case For Serverless Applications
Although there can be tons of use cases for implementing a serverless architecture, below are some of them:
Backend API for Websites
Microservices
Files & Data processing
Data transformation
Automation of CI/CD pipeline
Cron Jobs
Backend for IoT devices
You may have realized by now that serverless functions provide a new way of writing software and will keep evolving, replacing many of the development methods we use today.
Happy Reading...