Pavel Haletski is a Senior Software Engineer at Godel, specialising in Microsoft technologies and cloud computing. He recently ran a webinar for the Godel team about the value of serverless computing in Azure, and specifically Azure Functions – what they are, the challenges they solve and how to use them. He also ran a discussion, where he answered some tough questions about specific areas of Azure Functions.
What’s the problem that serverless computing solves?
Traditional on-premise servers are always switched on, which means you pay for the infrastructure whether it is running any systems or not. This is one of many reasons for the world’s shift toward cloud computing.
Despite its benefits over on-premise, cloud infrastructure is often managed poorly – there are plenty of ways for users to incur high spend when they don’t properly manage cloud environments. In general, infrastructure management is a time-sink for software engineers that takes focus away from actual development.
Infrastructure-as-a-service (IaaS) and Platform-as-a-service (PaaS) are cloud computing models that significantly reduce the overhead of infrastructure management (more so with PaaS). Yet neither removes the responsibility completely, which can still take developers’ focus away from code.
Serverless is the next level of provisioning beyond PaaS. It completely removes the need for developers to manage infrastructure. In recent years it’s become a more commonly adopted way for teams to manage applications that handle high volumes of events, because of its pay-as-you-go model and auto-scaling functionalities.
What is Azure Functions?
Azure Functions is a serverless solution that shifts the focus from managing infrastructure to writing code, via its event-based scaling model. An Azure function is code that is triggered by a specific event – the platform executes the code and provisions all the resources required to run it.
For example – if a new message is added to a certain Azure Storage queue about a customer’s product order, an Azure Function can run code to process the order. If this is an e-commerce site that handles huge peaks and troughs in orders over holiday seasons, Azure Functions could be the perfect solution for flexibly scaling infrastructure to support high volumes of event triggers.
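The order-processing logic in that example can be sketched in plain Python. The message shape, field names and `handle_order_message` helper below are all hypothetical – in a real function app this body would sit inside a queue-triggered handler wired up via the Azure Functions SDK:

```python
import json

def handle_order_message(raw_message: str) -> dict:
    """Parse a queue message describing a product order and compute its total.

    In a real Azure Function this would be the body of a queue-triggered
    handler; it is written as plain Python here so it can be run locally.
    """
    order = json.loads(raw_message)
    total = sum(item["price"] * item["quantity"] for item in order["items"])
    return {"order_id": order["order_id"], "total": total}

# Example message as it might arrive from the storage queue (invented shape):
msg = json.dumps({
    "order_id": "A-1001",
    "items": [
        {"sku": "mug", "price": 7.5, "quantity": 2},
        {"sku": "tee", "price": 12.0, "quantity": 1},
    ],
})
print(handle_order_message(msg))  # {'order_id': 'A-1001', 'total': 27.0}
```

Because the platform provisions resources per event, many such messages arriving at once simply mean many parallel executions of this handler.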
What’s the main benefit of Azure Functions over other cloud computing models?
For developers in industries like e-commerce, building and deploying code quickly is important. There is a specific set of triggers that applications commonly use which suit Azure Functions perfectly – primarily, high-volume or scheduled events. Applications like this can rack up huge bills if their infrastructure is managed poorly under other models such as IaaS.
How do Azure Functions work?
In a nutshell, an event triggers the function to run. Azure Functions has an auto-scaling mechanism to provision infrastructure dynamically depending on the volume of events at any given time.
Here are the key phrases, points and considerations:
- Triggers – for example a new message in the queue, a certain time of day, or a new file – indicate that an Azure Function should run.
- Time limitations – by default, functions have execution time limits to cap consumption – for example, the time limit on an HTTP-triggered function is 230 seconds.
- Durable functions – an extension of Azure Functions that works around these time limits. Durable functions are stateful – if the server shuts down in the middle of a long-running operation, a durable function can continue execution from where it left off.
- Azure Functions Proxies – let you define a single API surface for multiple functions, which makes it easier to manage lots of functions behind one endpoint.
- Bindings – a declarative way to connect functions to other resources (for example Azure Cosmos DB) without writing connector code or wading through lots of documentation.
- Cold starts – one consideration of Azure Functions is that when a function starts for the first time (for example, if an API endpoint is called for the first time) there can be a period of latency before the code runs. Its length depends on a number of factors, including the trigger type, operating system and language used. When running Azure Functions on the Dedicated plan, the Functions host is always running, so cold starts aren’t really an issue.
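The durable-function behaviour described above – resuming a long operation after a restart – can be mimicked very loosely in plain Python with explicit checkpoints. This is an illustrative sketch only, not the Durable Functions API; the step names and state file are invented:

```python
import json
import os
import tempfile

def run_with_checkpoints(steps, state_path):
    """Run a sequence of named steps, persisting progress after each one.

    Loosely mimics how a durable function resumes: on restart, steps whose
    results are already recorded are skipped, and work continues from there.
    """
    try:
        with open(state_path) as f:
            state = json.load(f)  # results from a previous (interrupted) run
    except FileNotFoundError:
        state = {}
    for name, fn in steps:
        if name in state:
            continue              # already completed before the restart
        state[name] = fn()        # do the work
        with open(state_path, "w") as f:
            json.dump(state, f)   # checkpoint after every step
    return state

path = os.path.join(tempfile.mkdtemp(), "state.json")
calls = []

def step_a():
    calls.append("a")
    return 1

def step_b():
    calls.append("b")
    return 2

# First run completes step_a, then the host "restarts" before step_b:
run_with_checkpoints([("a", step_a)], path)
# Second run resumes: step_a is skipped, only step_b executes.
state = run_with_checkpoints([("a", step_a), ("b", step_b)], path)
print(calls)  # ['a', 'b'] – step_a ran exactly once across both runs
print(state)  # {'a': 1, 'b': 2}
```

The real extension handles this via event-sourced orchestration history rather than a JSON file, but the resume-from-checkpoint idea is the same.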
When not to use Azure Functions?
By design, Azure Functions are simple and modular. A single function should not perform multiple tasks – as in a microservices architecture, the purpose of each function should be highly specific. Azure Functions also aren’t designed to support CPU-intensive or overly complex workflows.
Can you give us an example of how you have used Azure Functions?
Sure! My team is working with a client in the media industry on a solution that uses data to generate reports for various purposes using the Qlik BI platform.
The application is fed data from a lot of external sources, like Google Analytics and Microsoft Dynamics. Running on a scheduled timer, a trigger fires every day to execute a function that pulls new data from each source. Then the application aggregates this data and adds it to a central database. We use Azure Data Factory to validate the new data and clean it in preparation for loading it into our data warehouse.
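A loose plain-Python sketch of what that daily timer job does – pull new records from each external source and tag them before aggregation. The source names and record shape here are illustrative stand-ins, not the client's real connectors or schema:

```python
def pull_daily_data(sources):
    """Pull new records from each source and tag each with its origin.

    In the real solution this runs inside a timer-triggered Azure Function;
    `sources` maps a source name to a fetch callable (the connector).
    """
    aggregated = []
    for name, fetch in sources.items():
        for record in fetch():
            aggregated.append({**record, "source": name})
    return aggregated

# Stand-ins for the real connectors (Google Analytics, Microsoft Dynamics):
sources = {
    "google_analytics": lambda: [{"metric": "page_views", "value": 1200}],
    "dynamics": lambda: [{"metric": "new_leads", "value": 35}],
}
rows = pull_daily_data(sources)
print(len(rows))  # 2
```

The aggregated rows would then be written to the central database, with Azure Data Factory validating and cleaning them before they reach the warehouse.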
Once a month the team needs to pull more extensive reports – Azure Functions lets us scale up dynamically for this event. If we were using IaaS or PaaS this would be a lot more expensive.
What were some of the questions you answered on the webinar?
One question was “Can a function be deployed while it is being run?”. Whether that is safe depends entirely on the operation in progress, so in general it isn’t a good idea. Durable functions are ideal here – you can turn the function off during a long operation, and when you switch it back on it will pick up where it left off.
Another was “How long after a function stops running does a cold start occur?” The answer is, simply, ten minutes.
Somebody else asked, “If you implement something new, how do you make sure you don’t break anything when the application is under load?”. This is where monitoring becomes important. Azure Functions integrates easily with a monitoring solution, with dashboards that show the live performance of functions, and you can use it to set up alerts when problems occur.