In a well-architected system, an application often needs to execute tasks outside of a traditional web request. Some examples might include:
- Updating a search index
- Rolling up log tables into aggregate tables for reporting
- Sending email messages
- Executing tasks that depend on a third-party service
These tasks might need to run on a schedule, in response to certain events, or continuously. They may need to run .NET code, execute SQL queries or stored procedures, or interact with third-party services.
Setting up a dedicated “task server” to handle these duties is common practice, but it may not be a worthwhile expense if your tasks are simple.
With Azure WebJobs, we can use our existing web server to run these utilities. This has multiple benefits:
- A familiar API, consistent with other Azure services
- No extra cost, since a WebJob can run on our web server’s spare CPU
- No extra infrastructure to build and maintain
- Built-in logging and management with the Azure dashboard
Here are some scenarios in which WebJobs have come in handy:
AUTOMATING REPEATED TASKS
In a recent project, we built a WebJob that calls several SQL stored procedures to push our data up to Azure Search. With the data in Azure Search, we can take full advantage of its features on the frontend, allowing users to filter by distance, date, category, and more.
Here’s our WebJob function, which is called on the cron schedule specified in settings.job:
```csharp
namespace AzureSearchIndexer
{
    public class Functions
    {
        private readonly ISearchManager searchManager;

        [NoAutomaticTrigger]
        public void SyncSearchIndexes()
        {
            // Update search indexes with new data
            searchManager.SynchronizeSearchIndexes();
        }
    }
}
```
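Because the function is marked [NoAutomaticTrigger], it only runs when the host invokes it explicitly. As a rough sketch of what that entry point can look like (assuming the WebJobs 2.x SDK; wiring searchManager up through an IJobActivator is left out), each time the schedule fires, Azure starts the triggered WebJob, Main calls the function once, and the process exits:

```csharp
using Microsoft.Azure.WebJobs;

namespace AzureSearchIndexer
{
    public class Program
    {
        // Entry point for the triggered WebJob. Each time the schedule in
        // settings.job fires, Azure starts this process, the function runs
        // once, and the process exits.
        public static void Main()
        {
            var config = new JobHostConfiguration();
            var host = new JobHost(config);

            // Explicitly invoke the [NoAutomaticTrigger] function.
            host.Call(typeof(Functions).GetMethod("SyncSearchIndexes"));
        }
    }
}
```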
settings.job file (the cron expression below runs the job every five minutes):
```json
{
  "schedule": "0 */5 * * * *"
}
```
MESSAGE QUEUEING AND PROCESSING
In another project, we had to build an “analytics” system to track visits to a given page over time. Using an Azure Storage queue and a continuously running Azure WebJob, queued messages are automatically pulled off the queue and saved to the appropriate analytics storage. From this data, we can see which pages are visited most and least frequently, and which days experience higher traffic.
```csharp
// Log a ViewedEvent to a queue
pageEventsProxy.OnViewed(new ViewedEvent()
{
    ImpressionType = ImpressionType.Detail,
    Ids = new long[] { request.Id },
    Timestamp = DateTime.UtcNow,
    IpAddress = request.IpAddress,
    UserAgent = request.UserAgent,
    UserToken = request.UserIdentity
});
```
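One straightforward way to implement a proxy like pageEventsProxy is to serialize the event to JSON and add it to the storage queue. Here is an illustrative sketch only, assuming the WindowsAzure.Storage and Newtonsoft.Json packages and a queue named "queue" to match the trigger shown next; the class below is a sketch, not our production code:

```csharp
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Queue;
using Newtonsoft.Json;

// Illustrative proxy: serialize the event as JSON and add it to the
// storage queue that the continuous WebJob listens on.
public class PageEventsProxy
{
    private readonly CloudQueue queue;

    public PageEventsProxy(string storageConnectionString)
    {
        var account = CloudStorageAccount.Parse(storageConnectionString);
        var client = account.CreateCloudQueueClient();
        queue = client.GetQueueReference("queue");
        queue.CreateIfNotExists();
    }

    public void OnViewed(ViewedEvent viewedEvent)
    {
        // JSON is convenient here because the WebJobs SDK deserializes the
        // message body back into the ViewedEvent parameter of the
        // [QueueTrigger] function.
        queue.AddMessage(new CloudQueueMessage(JsonConvert.SerializeObject(viewedEvent)));
    }
}
```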
When a new message appears on the queue:
```csharp
namespace TelemetryProcessor
{
    public class Functions
    {
        private readonly IPageEventsManager pageEventsManager;

        // This function will be triggered when a message
        // arrives on the queue
        public void ProcessMessage([QueueTrigger("queue")] ViewedEvent message)
        {
            // Process the event
            pageEventsManager.OnPageViewed(message);
        }
    }
}
```
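Since this is a continuous WebJob, the host needs to stay alive and poll the queue. A minimal entry point sketch, assuming the WebJobs 2.x SDK and that the AzureWebJobsStorage connection string points at the storage account holding the queue:

```csharp
using Microsoft.Azure.WebJobs;

namespace TelemetryProcessor
{
    public class Program
    {
        // Entry point for the continuous WebJob. RunAndBlock keeps the host
        // alive, polling the storage queue and dispatching each message to
        // the ProcessMessage function above.
        public static void Main()
        {
            var config = new JobHostConfiguration();
            var host = new JobHost(config);
            host.RunAndBlock();
        }
    }
}
```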
KEEPING OUR ANALYTICS DATA UP TO DATE
Now that we are storing impressions for page views, we use another Azure WebJob to merge this impression data into our analytics database, which is structured for building statistics. The merge itself is handled by SQL stored procedures that we trigger from a scheduled WebJob. In this case, we’ve used a TimerTrigger, which tells our SyncFacts method to run every day at 7 AM UTC.
```csharp
namespace OlapSync
{
    public class Functions
    {
        // This function will run using a cron schedule - daily @ 7 AM UTC
        public static void SyncFacts([TimerTrigger("0 0 7 * * *")] TimerInfo timer)
        {
            using (var conn = new SqlConnection("some connection string"))
            {
                conn.Open();

                using (var cmd = new SqlCommand("olap.SyncFacts", conn))
                {
                    cmd.CommandType = CommandType.StoredProcedure;
                    cmd.ExecuteNonQuery();
                }
            }
        }
    }
}
```
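One detail worth calling out: with the WebJobs 2.x SDK, TimerTrigger lives in the Microsoft.Azure.WebJobs.Extensions package and must be registered on the host before the schedule will fire. A minimal host sketch under those assumptions:

```csharp
using Microsoft.Azure.WebJobs;

namespace OlapSync
{
    public class Program
    {
        // Continuous WebJob host. UseTimers() registers the TimerTrigger
        // extension; without it the SyncFacts schedule never fires.
        public static void Main()
        {
            var config = new JobHostConfiguration();
            config.UseTimers();

            var host = new JobHost(config);
            host.RunAndBlock();
        }
    }
}
```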
To recap, WebJobs let us automate indexing of our data for search, queue impressions of site and page visits, and keep analytics data up to date as it comes into our system. If you are looking to do something similar, please reach out; we would love to help!