Vainolo's Blog

Archive for the ‘Azure’ tag

Azure Functions – Part 1: Getting Started


I love to write software. But since I’m not a software developer anymore I don’t get much time to do it, and it has always bugged me how much work is needed to configure, maintain, and in general manage the infrastructure that software runs on, instead of working on the software itself, which is what I really want to do. So it’s no surprise that I’m a HUGE fan of serverless programming, as provided by Azure Functions (and others like AWS Lambda or Google Cloud Functions). Since I get some free credits from Microsoft, I had to give this a try… and I’m loving it :-).

I assume that you have an Azure subscription, but if not, just go to https://signup.azure.com and get yours free. At the time of writing this, you even get a $200 credit valid for 30 days. Great to start playing.

The end goal of this tutorial is to have a full-blown application for the analysis of financial data, all on top of a fully serverless infrastructure. Let’s see how far I can go. I’m very verbose in my tutorials because I’m a big consumer of them (both of others’ and of my own :-)), and I hate it when they don’t show all the steps needed to perform something, leaving you to guess, and many times you end up in an incorrect state that breaks the whole flow.

So let’s get started creating our first Azure Function! Navigate to https://portal.azure.com/, and after logging in you should be inside the Azure portal which looks something like this:

[Screenshot: the Azure portal dashboard]

To keep things organized, I have created a resource group to keep all of the resources associated with these tutorials in one place, and this is the tile shown in the dashboard (which I also created for these tutorials). A resource group is a logical container for resources in Azure, and helps keep your subscription organized.

So let’s click on the nice blue "Create Resources" button, and see where we go:

[Screenshot: the "Create Resources" screen]

This screen is probably different for every person depending on what they have in their subscription, and I can’t see Azure Functions here, so I’ll do a search:

[Screenshot: search results for "Function App"]

And yes, there it is, the first result in the search. Clicking on the "Function App" row will open a new "blade" (yes, that’s what those vertical panes are called) that gives a short description of what a Function App is:

[Screenshot: the "Function App" description blade]

This is what I came for, so I’ll click on the Create button, which as expected opens a new "blade" (you can see that a new blade has been created by looking at the horizontal scroll bar that keeps growing at the bottom of the screen) where we can enter the details for the Function App we are creating:

[Screenshot: the Function App creation blade]

Let’s fill in the details for the App. First, the App name must be unique across all Azure websites (on top of which Azure Functions run), so let’s call it "vainolo-azfun" (as expected, it’s not taken :-)). I’m going to use my existing subscription to pay for this App and add it to the resource group from where I started this whole process (if you come to this screen from a different place, you will have to choose the Resource Group here).

The Hosting Plan is how you pay for the App. There are two options: Consumption or App Service. Consumption is a "pay-as-you-go" plan where you are billed by the number of requests to the App, while App Service lets you reserve compute resources to serve your App. For tutorial and testing purposes, I’m assuming that the Consumption plan is cheaper than App Service (need to check this!), and regardless it is also more straightforward, so I chose Consumption. I leave the location as it is because I don’t really care here, but when you are developing a serious service you need to have your Apps close to your clients, so this is important.

Lastly, we need to create a Storage account, the place where Azure stores the code that is deployed to the function. I’ll just name this "vainoloazfun" (no dashes allowed here! And this also needs to be unique across all Storage accounts in Azure). I’m not going to enable Application Insights because this is not a real production App, but I’ll investigate its capabilities in the future.
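For reference, everything done here in the portal can also be scripted with the Azure CLI. The sketch below is my assumption of the equivalent commands, using the names chosen above; the resource group name and location are placeholders of my own, since I haven’t named them in the post:

```shell
# Resource group to hold everything (name and location are assumptions)
az group create --name azure-tutorials --location westeurope

# Storage account for the function's code: no dashes allowed, globally unique
az storage account create --name vainoloazfun \
    --resource-group azure-tutorials --sku Standard_LRS

# The Function App itself, on the pay-per-use Consumption plan
az functionapp create --name vainolo-azfun \
    --resource-group azure-tutorials \
    --storage-account vainoloazfun \
    --consumption-plan-location westeurope
```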

Ok, this is what I have now:

[Screenshot: the creation blade with all details filled in]

I’ll go ahead and click on Create, and see what happens… Ah, the creation blade closes, and after some time a message pops out of the little bell in the top right corner of the screen. When I click on it I see that the deployment of my Function App was successful:

[Screenshot: the deployment success notification]

I’ll just go ahead and click on the "Go to Resource" button to navigate to my new, nice and shiny Function App, that looks like this:

[Screenshot: the new Function App]

Great. But talk about overhead – to create a simple function I need a resource group, a storage account, a function app, and only then can I create a function. Man, this could be done way better. But let’s go on.

Let’s create our first function. Click on the "Functions" item in the drop-down list under "vainolo-azfun" and you will get the Functions screen, where a new Function can be created:

[Screenshot: the Functions screen]

I’ll go ahead and click on "+ New Function", getting the following screen:

[Screenshot: the new function template options]

There are many options here and I will surely investigate them in the future, but for our first example we’ll create a simple Function that responds "Hello World" to an HTTP request. This is done using the "HTTP trigger" template. I’ll click on that option and a new blade opens, asking me for the name of the Function and the language in which I want to write it:

[Screenshot: choosing the function’s name and language]

I’ll select C# as I’m more familiar with it than JS (where is Java???), and I’ll call my function "HelloWorld". Once the language of the function is defined, a new option may suddenly appear in the blade (bad UX practice!) asking for the Authorization level. I’ll select Anonymous because it’s the simplest way to test the function:

[Screenshot: the Authorization level option]

I now click "Create", and after a short wait this is what I get:

[Screenshot: the generated sample code]

Instead of getting a blank function (what I expected), Azure already fills my function with sample code that does much more than I wanted, parsing the request and responding based on the query parameters. I think this is great because it gives you an idea of what you are doing :-). I’ll just delete everything except the first and last lines of the function and change the return value to "Hello World":

using System.Net;

public static async Task<HttpResponseMessage> Run(HttpRequestMessage req, TraceWriter log)
{
    log.Info("C# HTTP trigger function processed a request.");
    return req.CreateResponse(HttpStatusCode.OK, "Hello World");
}

[Screenshot: the edited function]

That’s it. My function is ready! But does it work? It’s easy to check! Clicking on the "Save and Run" button, the function is compiled and a simple test is executed:

[Screenshot: the test run output]

And just as expected, my function returns "Hello World" as output.

But I’m a skeptic, and want to see this working outside of the Azure portal. To do this, I need the URL of the function, which is fetched by clicking on the "Get Function URL" link at the top of the editor, but it is also easy to construct manually: https://vainolo-azfun.azurewebsites.net/api/HelloWorld (the pattern is https://AppURL/api/FunctionName):

[Screenshot: the "Get Function URL" link]

I’ll open a new browser window and navigate to this URL, and Voila!

[Screenshot: the XML response in the browser]

Not exactly what I expected… What the server returns is an XML message that contains "Hello World". The reason is content negotiation: the client (Chrome in this case) asks for HTML or XML first in its Accept header (and other formats later), so the server automagically decides to serialize the text response as XML to make the browser happy. So let’s try another way. Using a tool I found called https://www.hurl.it, I can test it again, and here the response is once again as it was shown in the Azure portal:

[Screenshot: the response shown in hurl.it]
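The same content negotiation is easy to reproduce from the command line with curl, by setting the Accept header explicitly (the URL is my App’s; yours will differ):

```shell
# Ask for JSON: the function's string response comes back as plain "Hello World"
curl -H "Accept: application/json" https://vainolo-azfun.azurewebsites.net/api/HelloWorld

# Ask for XML, like a browser does: the same string arrives wrapped in an XML envelope
curl -H "Accept: application/xml" https://vainolo-azfun.azurewebsites.net/api/HelloWorld
```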

And there you have it, my first Azure Function ;-).

In summary, we created in this tutorial a Function App, which is a container for multiple Azure Functions, and a simple Azure Function that returns "Hello World" when called. It’s a good place to start!

And I hope to get some time to play with this some more very soon.

Written by vainolo

February 6th, 2018 at 11:39 pm

Taking Azure Redis Cache for a ride – testing a simple Producer/Consumer program


In my “younger” days, I worked in a team that developed/maintained an application written in Ada, which used Oracle 8 as its database. Our performance requirements didn’t allow us to query the DB all the time, so the team had developed an in-memory cache on top of the database. We had a full-blown ORM mapping from the DB to Ada, with in-memory querying mechanisms, cache refreshes, and all the good stuff.

But why am I talking about this now? Because that experience taught me to really like caching mechanisms. And now that Azure Redis Cache came out, I had to get my hands dirty. I’ve heard a lot about Redis from many sources, but I never had time to learn about it. Last week I decided to give it a try.

The specific use case I had in mind was using it as a FAST medium of communication between servers, in a publish-subscribe fashion. What a delight it was to find that this use case is supported by Redis! I wanted to know how much time it takes for a message to go from a producer to a consumer, so I created a simple ping-pong program, where one server sends ping and the other responds pong. I did this double trip because I don’t trust clock sync algorithms enough, and wanted to do all time measurements on the same machine. Since my code is very simple, just dividing by 2 gives the approximate time it takes for a message to get from one server to the other.
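Before writing any code, the pub/sub round trip can be tried by hand with redis-cli. This is a sketch assuming you have the cache’s host name and access key, and that the non-SSL port is enabled (redis-cli has no built-in TLS support); the channel name is the one used in the code below:

```shell
# Terminal 1: subscribe to the "ping" channel and wait for messages
redis-cli -h <cache-name>.redis.cache.windows.net -a <access-key> SUBSCRIBE ping

# Terminal 2: publish a message on "ping"; the subscriber in terminal 1 prints it
redis-cli -h <cache-name>.redis.cache.windows.net -a <access-key> PUBLISH ping 0
```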

I tried the two different cache tiers provided by Azure: the Basic tier with no replication, and the Standard tier with replication. Both tiers offer caches that are shared (not dedicated) and dedicated (sizes of 1GB and above). For my tests, I tried both the Basic and Standard tiers with a 1GB size, so that in both cases it is a dedicated cache.

The test code is fairly simple: two Azure worker roles. The Producer creates messages and pushes them to a Redis pub/sub channel, and the Consumer subscribes to these messages; when it receives one, it pushes it back on another Redis channel to which the Producer is subscribed. The Producer checks that the value received is the same as the value sent, and counts the time it took for the message to go back and forth. I used the StackExchange.Redis NuGet package to access Redis.

This is the code for the Producer:

using System.Diagnostics;
using System.Net;
using System.Threading;
using Microsoft.WindowsAzure.ServiceRuntime;
using StackExchange.Redis;

namespace Producer
{
    public class WorkerRole : RoleEntryPoint
    {
        Stopwatch watch;
        ISubscriber subscriber;
        int count = 0;
        const int ITERATIONS = 1000;
        // Set when all iterations have completed
        bool finished = false;

        public override void Run()
        {
            Trace.TraceInformation("Starting Test");
            ConnectionMultiplexer connection = ConnectionMultiplexer.Connect("");
            IDatabase cache = connection.GetDatabase();
            subscriber = connection.GetSubscriber();
            watch = new Stopwatch();
            subscriber.Subscribe("pong", (channel, message) => getPong(message));
            watch.Start();
            subscriber.Publish("ping", count);

            while (true)
            {
                Thread.Sleep(1000);
            }
        }

        private void getPong(RedisValue message)
        {
            watch.Stop();
            // Cast RedisValue to string so the comparison uses string equality
            if (!count.ToString().Equals((string)message))
            {
                Trace.TraceInformation("Got wrong message. Sent:" + count + ", Received:" + message);
            }
            count++;
            if (count == ITERATIONS)
            {
                Trace.TraceInformation("Finished Test. Running " + count + " iterations. Total time:" + watch.ElapsedMilliseconds + ". Average: " + watch.ElapsedMilliseconds / count);
                finished = true;
                return;
            }
            watch.Start();
            subscriber.Publish("ping",count);
        }
    }
}

And the code of the Consumer (which is pretty simple) is this:

using System.Diagnostics;
using System.Threading;
using Microsoft.WindowsAzure.ServiceRuntime;
using StackExchange.Redis;

namespace Consumer
{
    public class WorkerRole : RoleEntryPoint
    {
        public override void Run()
        {
            Trace.TraceInformation("Consumer started");
            Subscribe(ConnectionMultiplexer.Connect(""));
            while (true)
            {
                Thread.Sleep(1000);
            }
        }

        public void Subscribe(ConnectionMultiplexer connection)
        {
            IDatabase cache = connection.GetDatabase();
            ISubscriber subscriber = connection.GetSubscriber();
            subscriber.Subscribe("ping", (channel, message) => { subscriber.Publish("pong", message); });
        }
    }
}

The results of the tests in both cases were the same. After running 1000 iterations, both caches gave an average of 15ms latency for the full round trip, and assuming that there was no delay inside the Consumer (or that it is negligible), this means a message takes around 7.5ms to get to its destination. That is pretty damn fast!

I hope to have some more time to learn about Redis and its uses, and to have a real project where I can test its capabilities in a real-world environment. But for starters, it is very, very promising.

Written by vainolo

August 26th, 2014 at 10:32 pm

Assimilation has been Achieved – Vainolo.com is Now Hosted @ Azure


I was never a person to use Microsoft technologies as a platform for development – I am pretty comfortable with Java, Linux (for Dev, not desktop – Windows rules there), etc. But since I started working at MS, I decided to give them a try – specifically Microsoft Azure (I also get some freebies because of being an MS employee, so that gave me a push. Oh, and my last hosting service had a very bad response time, and was also close to renewal. In short, many reasons to try new things).

So I decided to migrate my blog to Azure. First I tried exporting and importing all the blog’s content as explained in the WordPress Codex, but that didn’t work: my export file was too large to import. OK… so I’ll export it in parts (posts, pages, feedback). That didn’t seem to work either. After trying a number of “automatic” solutions that didn’t work, I had to get my hands dirty.

So I downloaded the contents of my blog from my previous host (using FTP) and created a backup of the database (which I also downloaded). Then, using the WebMatrix tools provided by Azure, I downloaded my clean Azure blog and overwrote it with the contents of the blog I had downloaded before. Lastly, I had to import the database I had saved earlier into my local MySQL instance (which I did using the MySQL Workbench, a really nice product). After changing some basic settings in wp-config.php and some obvious rows in the wp-options table, I had a locally working WordPress install on my computer.
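For those who prefer the command line over MySQL Workbench, importing such a backup can also be done with the mysql client. This is a sketch; the database name and backup file name are placeholders of my own:

```shell
# Create a local database and load the backup dump into it (names are assumptions)
mysql -u root -p -e "CREATE DATABASE wordpress_blog"
mysql -u root -p wordpress_blog < blog-backup.sql
```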

From here it was very easy – just hit the “Publish” button in WebMatrix – and Voila! My site was up. Now I had to move my domain – and this was also very easy, following the instructions shown here.

After a couple of days it dawned on me that I also have to move my mail server to a new provider, since I am cancelling my current hosting. Once again I decided to try the MS way, and found this very helpful tutorial. It simply works!

So no more paying for hosting, and I also hope that the response time of my site improves. And thank you Microsoft for all the free services (but the UI of GMail is still better than yours). Sorry for not doing a step-by-step tutorial on how I managed the transfer – I didn’t write it down and now it is impossible to replay all the steps 🙂

“We are the Borg … Resistance is futile” –


Written by vainolo

May 1st, 2013 at 10:24 pm
