No-Code Container Chassis Tracking Dashboard Implemented with Azure IoT Plug and Play

On the roads, we often see trailer trucks, which are the combination of a prime mover and a container chassis that carries the freight. A container chassis is an important asset of a trucking company. It is an unpowered vehicle that is towed by another vehicle. If you still have no idea what it is, please watch the video below.

Ocean Trailer presents CIMC Combo Container Chassis.

Tracking container chassis is not a simple problem to solve. We not only need to build trackers, i.e. IoT devices that send back telemetry and sensor data collected from the container chassis, but also need another system to store, process, and display the data. This does not sound like a system that can be easily built within, let’s say, 5 minutes.

Now what if we could turn our smartphones into trackers and then install one of them on the container chassis? Also, what if we could make use of Microsoft Azure to provide an IoT data dashboard for us in just a few clicks?

Azure IoT Plug and Play

With the newly introduced IoT Plug and Play from Microsoft, we can build a very simple container chassis tracking dashboard without any coding.

A few days ago, Microsoft released a mobile app called IoT Plug and Play on both Android and iOS.

So, you may ask, why is this IoT Plug and Play interesting? This is because it can turn our iOS or Android device into an IoT device without any coding or device modelling. Our phones can then seamlessly connect to Azure IoT Central or IoT Hub, and the telemetry and sensor data from the devices will be automatically uploaded to Azure at a defined delivery interval.

In this post, I am just going to share what I have tried out so far. Hopefully it helps my friends who are looking for similar solutions.

Setup Azure IoT Central

Before we proceed further, we need to understand that even though the example I use here may sound simple, services such as Azure IoT Central are actually meant for production use, so that industries can build enterprise-grade IoT applications on a secure, reliable, and scalable infrastructure.

When we are setting up Azure IoT Central, we can get a quick start by directly applying templates, which are industry-focused examples available today. For example, using these templates on Azure, a logistics company can create an Azure IoT Central application to track shipments in real time across air, water, and land with location and condition monitoring. This plays an important role in the logistics industry because the technology can then provide total end-to-end supply chain enablement.

Dr Robert Yap, the Executive Chairman of YCH Group, shared about their vision of integrating the data flows in the supply chain with analytics capabilities.

In my example, I will start with a customised template which has nothing inside. We can then proceed to the “Devices” page to add a device for each of our phones.

First look of the Azure IoT Central dashboard.

Connect with Azure IoT Plug and Play

Now, how do we turn our phones into IoT devices?

First of all, we just need to download the IoT Plug and Play app (from the Google Play Store or the Apple App Store) to our phones. After that, we simply pair the new devices on Azure IoT Central with our phones by scanning the corresponding QR codes. That’s all; we should now be able to see the telemetry and sensor data collected from the phones on our dashboard, as shown in the following screenshot.

Data collected from accelerometer, gyroscope, magnetometer, and barometer on my phone.

Rules and Triggers

We are also able to specify rules in Azure IoT Central so that an action is triggered when the defined conditions are met. We can also integrate the rules with Power Automate and Azure Logic Apps to perform relevant automated workflows.

For example, we can have Azure IoT Central send us an email when a device is running on low battery.

Scheduled Jobs

Another cool feature in Azure IoT Central is that we can send commands back to the devices. In addition, we can send the commands on a schedule. For example, in the following screenshot, the “lightOn” command will be sent to all the devices in the Device Group, and thus the connected phones in the Device Group will switch on their flashlights at 11.30pm.

Don’t be scared if a flashlight suddenly comes on from a chassis at midnight.

Image Upload

In the IoT Plug and Play app, we can also try out the image upload feature, which allows us to submit images to the cloud from the IoT devices. As shown in the screenshot below, each IoT Central app can only link with one Azure Storage container. Hence, in the container, there will be a folder for each of the registered IoT devices so that uploaded files are categorised into their own folders accordingly.

We need to link Azure IoT Central to a container in Azure Storage.

So with the phones set up as IoT devices, we can now install them on the container chassis to continuously send back location data to Azure IoT Central. The business owner can thus easily figure out where their container chassis are located by looking at the dashboard.

Image Based CAPTCHA using Jigsaw Puzzle on Blazor

In this article, I will share how I deployed an image-based CAPTCHA as a Blazor app on Azure Static Web App.

PROJECT GITHUB REPOSITORY

The complete source code of this project can be found at https://github.com/goh-chunlin/Lunar.JigsawPuzzleCaptcha.

Project Motivation

CAPTCHA, which stands for “Completely Automated Public Turing test to tell Computers and Humans Apart”, is a type of challenge-response test used in computing to determine whether or not the user is human. Since the day CAPTCHA was invented by Luis von Ahn’s team at Carnegie Mellon University, it has been a reliable tool in separating machines from humans.

In 2007, Luis von Ahn’s team released a programme known as reCAPTCHA which asked users to decipher hard-to-read texts in order to distinguish between humans and bots.

The words shown in reCAPTCHA come directly from old books that are being digitised. Hence, it not only stops spam, but also helps digitise books at the same time. (Source: reCAPTCHA)

A team led by Prof Gao Haichang from Xidian University realised that, with the development of automated computer vision techniques such as OCR, traditional text-based CAPTCHAs are no longer considered safe for authentication. During an IEEE conference in 2010, they thus proposed a new approach, i.e. an image-based CAPTCHA which involves solving a jigsaw puzzle. Their experiments and security analysis further proved that humans can complete the jigsaw puzzle CAPTCHA verification quickly and accurately while bots rarely can. Hence, the jigsaw puzzle CAPTCHA can be a substitute for the text-based CAPTCHA.

Xidian University, one of the 211 Project universities and a high-level scientific research and innovation institution in China. (Image Source: Shulezetu)

In 2019, on CSDN (Chinese Software Developer Network), a developer 不写BUG的瑾大大 shared his implementation of a jigsaw puzzle CAPTCHA in Java. It’s a very detailed blog post, but there is still room for improvement in, for example, documenting the code and naming the variables. Hence, I’d like to take this opportunity to implement this jigsaw puzzle CAPTCHA in .NET 5 with C# and Blazor. I also host the demo web app on Azure Static Web App so that you all can access and play with the CAPTCHA: https://jpc.chunlinprojects.com/.

Today, jigsaw puzzle CAPTCHA is used in many places. (Image Credit: Hirabiki at HoYoLAB)

Jigsaw Puzzle CAPTCHA

In a jigsaw puzzle CAPTCHA, there is usually a jigsaw puzzle with at least one misplaced piece which users need to move to the correct place to complete the puzzle. In my demo, there is only one misplaced piece that needs to be moved.

Jigsaw puzzle CAPTCHA implementation on Blazor. (Try it here)

As shown in the screenshot above, there are two necessary images in the CAPTCHA. One of them is the misplaced piece of the puzzle. The other is the original image with a shaded area indicating where the misplaced piece should be dragged to. All users need to do is drag the slider to move the misplaced piece to the shaded area and complete the jigsaw puzzle within a time limit.

In addition, the CAPTCHA here only needs the user to drag the missing piece horizontally. This is not only the popular implementation of the jigsaw puzzle CAPTCHA, but also not too challenging for users to pass.

Now, let’s see how we can implement this in C# and later deploy the code to Azure.

Retrieve the Original Image

The first thing we need to do is get an image for the puzzle. We can have a collection of images that make good jigsaw puzzles stored in our Azure Blob Storage. After that, each time before generating the jigsaw puzzle, we simply fetch all the images from the Blob Storage with the following code and randomly pick one as the jigsaw puzzle image.

public async Task<List<string>> GetAllImageUrlsAsync() 
{
    var output = new List<string>();

    var container = new BlobContainerClient(_storageConnectionString, _containerName);

    var blobItems = container.GetBlobsAsync();

    await foreach (var blob in blobItems) 
    {
        var blobClient = container.GetBlobClient(blob.Name);
        output.Add(blobClient.Uri.ToString());
    }

    return output;
}

Define the Missing Piece Template

To increase the difficulty of the puzzle, we can have jigsaw pieces with different patterns, such as tabs appearing on different sides of the pieces. In this demo, I will stick to just one missing piece pattern, which has tabs on the top and right sides, as shown below.

The missing piece template.

The tabs are basically two circles with the same radius. Their centres are positioned at the middle points of the rectangle sides. Hence, we can now build a 2D matrix for the pixels indicating the missing piece template, where 1 means a pixel is inside the piece and 0 means it is outside the piece.

In addition, we know the general equation of a circle of radius r centred at (h, k) is as follows.

(x − h)² + (y − k)² = r²

Hence, if a point (i, j) is inside the circle above, then the following must be true.

(i − h)² + (j − k)² < r²

If the point (i, j) is outside of the circle, then the following must be true.

(i − h)² + (j − k)² > r²

With this information, we can build our missing piece 2D matrix as follows.

private int[,] GetMissingPieceData()
{
    int[,] data = new int[PIECE_WIDTH, PIECE_HEIGHT];

    double c1 = (PIECE_WIDTH - TAB_RADIUS) / 2;
    double c2 = PIECE_HEIGHT / 2;
    double squareOfTabRadius = Math.Pow(TAB_RADIUS, 2);

    double xBegin = PIECE_WIDTH - TAB_RADIUS;
    double yBegin = TAB_RADIUS;

    for (int i = 0; i < PIECE_WIDTH; i++)
    {
        for (int j = 0; j < PIECE_HEIGHT; j++)
        {
            double d1 = Math.Pow(i - c1, 2) + Math.Pow(j, 2);
            double d2 = Math.Pow(i - xBegin, 2) + Math.Pow(j - c2, 2);
            if ((j <= yBegin && d1 < squareOfTabRadius) || (i >= xBegin && d2 > squareOfTabRadius))
            {
                data[i, j] = 0;
            }
            else
            {
                data[i, j] = 1;
            }
        }
    }

    return data;
}

After that, we can also easily determine the border of the missing piece from the template data above. We can then draw the border of the missing piece for a better user experience when we display it on screen.

private int[,] GetMissingPieceBorderData(int[,] d)
{
    int[,] borderData = new int[PIECE_WIDTH, PIECE_HEIGHT];

    for (int i = 0; i < d.GetLength(0); i++)
    {
        for (int j = 0; j < d.GetLength(1); j++)
        {
            if (d[i, j] == 0) continue;

            if (i - 1 < 0 || j - 1 < 0 || i + 1 >= PIECE_WIDTH || j + 1 >= PIECE_HEIGHT) 
            {
                borderData[i, j] = 1;

                continue;
            }

            int sumOfSourrounding = 
                d[i - 1, j - 1] + d[i, j - 1] + d[i + 1, j - 1] + 
                d[i - 1, j] + d[i + 1, j] + 
                d[i - 1, j + 1] + d[i, j + 1] + d[i + 1, j + 1];

            if (sumOfSourrounding != 8) 
            {
                borderData[i, j] = 1;
            }
        }
    }

    return borderData;
}

Define the Shaded Area

Next, we need to tell the user where the missing piece should be dragged to. We will use the template data above and apply it to the original image we get from the Azure Blob Storage.

Due to the shape of the missing piece, the shaded area needs to be placed in the region highlighted in green below. Otherwise, the shaded area will not be shown completely, which gives users a bad experience. The yellow area would be fine too, but we don’t allow the shaded area to be there to avoid cases where the missing piece covers the shaded area when the images first load and thus confuses the users.

Random random = new Random();

int x = random.Next(originalImage.Width - 2 * PIECE_WIDTH) + PIECE_WIDTH;
int y = random.Next(originalImage.Height - PIECE_HEIGHT);

Green area is where the top-left corner of the shaded area should be positioned.

Let’s assume the shaded area is at the point (x, y) of the original image. Then, given the original image in a Bitmap variable called originalImage, we can have the following code to traverse that area and process its pixels.

...
int[,] missingPiecePattern = GetMissingPieceData();

for (int i = 0; i < PIECE_WIDTH; i++)
{
    for (int j = 0; j < PIECE_HEIGHT; j++)
    {
        int templatePattern = missingPiecePattern[i, j];
        int originalArgb = originalImage.GetPixel(x + i, y + j).ToArgb();

        if (templatePattern == 1)
        {
            ...
            originalImage.SetPixel(x + i, y + j, FilterPixel(originalImage, x + i, y + j));
        }
        else
        {
            missingPiece.SetPixel(i, j, Color.Transparent);
        }
    }
}
...

Now we can perform the image convolution with a kernel, a 3×3 convolution matrix, in the FilterPixel method shown below. Here we will be using a Box Blur. A Box Blur is a spatial domain linear filter in which each pixel in the resulting image has a value equal to the average value of its neighbouring pixels in the input image. By the Central Limit Theorem, repeated application of a box blur approximates a Gaussian Blur.

Kernels used in different types of image processing.

For the kernel, I don’t really follow the official Box Blur kernel or Gaussian Blur kernel. Instead, I dim the generated colour by forcing three pixels to be always black (when i = j). This is to make sure the shaded area is not only blurred but also darkened.

private Color FilterPixel(Bitmap img, int x, int y)
{
    const int KERNEL_SIZE = 3;
    int[,] kernel = new int[KERNEL_SIZE, KERNEL_SIZE];

    ...

    int r = 0;
    int g = 0;
    int b = 0;
    int count = KERNEL_SIZE * KERNEL_SIZE;
    for (int i = 0; i < kernel.GetLength(0); i++)
    {
        for (int j = 0; j < kernel.GetLength(1); j++)
        {
            Color c = (i == j) ? Color.Black : Color.FromArgb(kernel[i, j]);
            r += c.R;
            g += c.G;
            b += c.B;
        }
    }

    return Color.FromArgb(r / count, g / count, b / count);
}

What happens when we are processing a pixel that does not have all 8 neighbouring pixels? To handle this, we will take the value of the pixel at the opposite position, as described in the following diagram.

Applying kernel on edge pixels.
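
The population of the kernel is elided in the FilterPixel snippet above. A minimal sketch of how it could be filled, assuming that “opposite position” means mirroring across the centre pixel, is shown below; this is my interpretation for illustration, not necessarily the exact code in the repository.

for (int i = 0; i < KERNEL_SIZE; i++)
{
    for (int j = 0; j < KERNEL_SIZE; j++)
    {
        // Offsets of the neighbour relative to the centre pixel (x, y).
        int dx = i - 1;
        int dy = j - 1;
        int px = x + dx;
        int py = y + dy;

        // Neighbour falls outside the image: take the pixel at the opposite position instead.
        if (px < 0 || px >= img.Width || py < 0 || py >= img.Height)
        {
            px = x - dx;
            py = y - dy;
        }

        kernel[i, j] = img.GetPixel(px, py).ToArgb();
    }
}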

Since we have the two images ready, i.e. an image of the missing piece and another image showing where the missing piece needs to go, we can convert them into base64 strings and send the string values to the web page.
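
For the conversion itself, a small helper like the one below will do; this is a hedged sketch (the method name is mine), assuming the images are System.Drawing Bitmap objects.

private string ConvertToBase64(Bitmap image)
{
    // Save the bitmap as PNG into a memory stream and encode the bytes as a base64 string.
    using (var stream = new MemoryStream())
    {
        image.Save(stream, System.Drawing.Imaging.ImageFormat.Png);
        return Convert.ToBase64String(stream.ToArray());
    }
}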

Now, the next step will be displaying these two images on the Blazor web app.

API on Azure Function

When we publish our Blazor app to Azure Static Web Apps, we get fast hosting of our web app and scalable APIs. Azure Static Web Apps is designed to host applications where the API and frontend source code live on GitHub.

The purpose of the API in this project is to retrieve the jigsaw puzzle images and verify user submissions. We don’t need a full server for our API because Azure Static Web Apps hosts the API in Azure Functions. So we need to implement our API as Azure Functions here.

We will have two API methods here. The first one is to retrieve the jigsaw puzzle images, as shown below.

[FunctionName("JigsawPuzzleGet")]
public async Task<IActionResult> Run(
    [HttpTrigger(AuthorizationLevel.Anonymous, "get", Route = "jigsaw-puzzle")] HttpRequest req,
    ILogger log)
{
    log.LogInformation("C# HTTP trigger function processed a request.");

    var availablePuzzleImageUrls = await _puzzleImageService.GetAllImageUrlsAsync();

    var random = new Random();
    string selectedPuzzleImageUrl = availablePuzzleImageUrls[random.Next(availablePuzzleImageUrls.Count)];

    var jigsawPuzzle = _puzzleService.CreateJigsawPuzzle(selectedPuzzleImageUrl);
    _captchaStorageService.Save(jigsawPuzzle);

    return new OkObjectResult(jigsawPuzzle);
}

The Azure Function first retrieves all the images from the Azure Blob Storage and then randomly picks one to use in the jigsaw puzzle generation.

Before it returns the puzzle images in a jigsawPuzzle object, it also saves the puzzle into Azure Table Storage so that later, when users submit their answers, another Azure Function can verify whether they solved the puzzle correctly.

In the Azure Table Storage, we generate a GUID and store it together with the randomly generated location of the shaded area, as well as an expiry date and time so that users must solve the puzzle within a limited time.

...
var tableClient = new TableClient(...);

...

var entity = new JigsawPuzzleEntity
{
    PartitionKey = ...,
    RowKey = id,
    Id = id,
    X = x,
    Y = y,
    CreatedAt = createdAt,
    ExpiredAt = expiredAt
};

tableClient.AddEntity(entity);
...

Here, the GUID is used as the RowKey in the Table Storage. Hence, later when a user submits his/her answer, the GUID is sent back to the Azure Function to locate the corresponding record in the Table Storage.

[FunctionName("JigsawPuzzlePost")]
public async Task<IActionResult> Run(
    [HttpTrigger(AuthorizationLevel.Anonymous, "post", Route = "jigsaw-puzzle")] HttpRequest req,
    ILogger log)
{
    log.LogInformation("C# HTTP trigger function processed a request.");

    var body = await new StreamReader(req.Body).ReadToEndAsync();
    var puzzleSubmission = JsonSerializer.Deserialize<PuzzleSubmissionViewModel>(body, new JsonSerializerOptions { PropertyNamingPolicy = JsonNamingPolicy.CamelCase });

    var correspondingRecord = await _captchaStorageService.LoadAsync(puzzleSubmission.Id);

    ...

    bool isPuzzleSolved = _puzzleService.IsPuzzleSolved(...);

    var response = new Response 
    {
        IsSuccessful = isPuzzleSolved,
        Message = isPuzzleSolved ? "The puzzle is solved" : "Sorry, time runs out or you didn't solve the puzzle"
    };

    return new OkObjectResult(response);
}
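
The actual verification logic inside IsPuzzleSolved is not shown in this post. A minimal sketch, assuming a small horizontal tolerance in pixels and the expiry time stored earlier in the Table Storage, could look like this.

public bool IsPuzzleSolved(int submittedX, int expectedX, DateTimeOffset expiredAt)
{
    const int TOLERANCE = 5;   // Assumed tolerance, in pixels, between the submitted and expected positions.

    bool isWithinTime = DateTimeOffset.UtcNow <= expiredAt;
    bool isCloseEnough = Math.Abs(submittedX - expectedX) <= TOLERANCE;

    return isWithinTime && isCloseEnough;
}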

Since our API is hosted as an Azure Function on the Consumption Plan, as shown in the screenshot below, we need to note that our code in the Function runs in serverless mode, i.e. it effectively scales out to meet whatever load it is seeing and scales down when the code isn’t running.

The Azure Function managed by the Static Web App will be in Consumption Plan.

Since the Function runs in serverless mode, we will face the issue of serverless cold start. Hence, there is a latency that users must wait through, i.e. the period from when an event happens until the function has started up and finished responding to that event. More precisely, a cold start is an increase in latency for Functions which haven’t been called recently.

Latency will be there when Function is cold. (Image Source: Microsoft Azure Blog)

In this project, a friend gave me the feedback that he had encountered at least 15 seconds of latency before the jigsaw puzzle loaded.

Blazor Frontend

Now we can move on to the frontend.

To show the jigsaw puzzle images when the page is loaded, we have the following code.

protected override async Task OnInitializedAsync()
{
    var jigsawPuzzle = await http.GetFromJsonAsync<JigsawPuzzle>("api/jigsaw-puzzle"); // assuming a JigsawPuzzle DTO that mirrors the API response
    id = jigsawPuzzle.Id;
    backgroundImage = "data:image/png;base64, " + jigsawPuzzle.BackgroundImage;
    missingPieceImage = jigsawPuzzle.MissingPieceImage;
    y = jigsawPuzzle.Y;
}

Take note that we get not only the two images but also the GUID of the jigsaw puzzle record in the Azure Table Storage, so that later we can send this information back to the Azure Function for submission verification.

Here, we only return the y-axis value of the shaded area location because users are only allowed to drag the missing piece horizontally, as discussed earlier. If you would like to increase the difficulty of the CAPTCHA by allowing users to drag the missing piece vertically as well, you can choose not to return the y-axis value.

We then have the following HTML to display the two images.

<div style="margin: 0 auto; padding-left: @(x)px; padding-top: @(y)px; width: 696px; height: 442px; background-image: url('@backgroundImage'); background-size: contain;">
    <div style="width: 88px; height: 80px; background-image: url('data:image/png;base64, @missingPieceImage');">
            
    </div>
</div>

We also have a slider which is bound to the x variable and a button to submit both the value of x and the GUID back to the Azure Function.

<div style="margin: 0 auto; width: 696px; text-align: center;">
    <input type="range" min="0" max="608" style="width: 100%;" @bind="@x" @bind:event="oninput" />
    <button type="button" @onclick="@Submit">Submit</button>

</div>

The Submit method is as follows; it tells users whether or not they solved the jigsaw puzzle correctly. Here I use a toast library for Blazor written by Chris Sainty, a Microsoft MVP.

private async Task Submit()
{
    var submission = new PuzzleSubmissionViewModel
    {
        Id = id,
        X = x
    };
    
    var response = await http.PostAsJsonAsync("api/jigsaw-puzzle", submission);

    var responseMessage = await response.Content.ReadFromJsonAsync<Response>();

    if (responseMessage.IsSuccessful)
    {
        toastService.ShowSuccess(responseMessage.Message);
    }
    else
    {
        toastService.ShowError(responseMessage.Message);
    }
        
}

Now we can test how our app works!

Testing Locally

Before we can test locally, we need to provide the secrets and relevant settings to access Azure Blob Storage and Table Storage.

We first need to have a file called local.settings.json in the root of the Api project with the following content. (Remember to have “Copy to Output Directory” set to “Copy if newer” for the file.)

{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "",
    "FUNCTIONS_WORKER_RUNTIME": "dotnet",
    "CaptchaStorageEndpoint": "...",
    "CaptchaStorageTableName": "...",
    "CaptchaStorageAccountName": "...",
    "CaptchaStorageAccessKey": "...",
    "ImageBlobStorageConnectionString": "...",
    "ImageBlobContainerName": "..."
  },
  "Host": {
    "LocalHttpPort": 7071,
    "CORS": "*"
  }
}

The CORS setting is necessary as well; otherwise our Blazor app cannot access the API when we test the web app locally. We don’t have to worry about CORS when we publish to Azure Static Web Apps because it will automatically configure the app so that it can communicate with the API on Azure using a reverse proxy.

In addition, please remember to exclude local.settings.json from the source control.

In the Client project, since we are going to run our Api at port 7071, we shall let the Client know too. To do so, we first need to specify the base address for local development in the Program.cs of the Client project.

builder.Services.AddScoped(sp => new HttpClient { BaseAddress = new Uri(builder.Configuration["API_Prefix"] ?? builder.HostEnvironment.BaseAddress) });

Then we can specify the value for API_Prefix in the appsettings.Development.json in the wwwroot folder.

{
    "API_Prefix": "http://localhost:7071"
}

Finally, please also set both the Api and Client projects as startup projects in Visual Studio.

Setting multiple startup projects in Visual Studio.

Deploy to Azure Static Web App

After we have created an Azure Static Web Apps resource and bound it to a GitHub Actions workflow that monitors our GitHub repository, the workflow will automatically build and deploy our app and its API to Azure every time we commit or create pull requests against the watched branch. The steps have been described in my previous blog post about Blazor on Azure Static Web App, so I won’t repeat them here.

Since our API needs the secrets and connection settings for the Azure Storage, we need to specify them under Application Settings of the Azure Static Web App as well. The values will be accessible by the API methods in the Azure Functions.

Managing the secrets in Application Settings.

Yup, that’s all for implementing a jigsaw puzzle CAPTCHA in .NET. Feel free to try it out on my Azure Static Web App and let me know your thoughts about it. Thank you!

Jigsaw puzzle CAPTCHA implementation on Blazor. (Try it here)

References

The code of this Blazor project described in this article can be found in my GitHub repository: https://github.com/goh-chunlin/Lunar.JigsawPuzzleCaptcha.

Publish a Blazor Web App as Azure Static Web App

In 2018, the web framework Blazor was introduced. With Blazor, we can build web UI with C# instead of JavaScript. Blazor runs the client-side C# code directly in the browser, using WebAssembly.

When server-side rendering is not required, we can then deploy our web app on platforms such as Azure Static Web App, a service that automatically builds and deploys full stack web apps to Azure from a code repository, such as GitHub.

In this article, I will share how the website for Singapore .NET Developers Community and Azure Community is re-built as a Blazor web app and deployed to Azure.

PROJECT GITHUB REPOSITORY

The complete source code of this project can be found at https://github.com/sg-dotnet/website.

Blazor Web UI

The community website is very simple. It is merely a single-page website with some descriptions and photos about the community. It also has a section showing a list of meetup videos from the community YouTube channels.

We will build the website as a Blazor WebAssembly app.

Firstly, we will have the index.html defined as follows. Please take note that the code snippet below uses a CSS file which is not shown in this post. The complete and updated project can be viewed in the GitHub repo.

<!DOCTYPE html>
<html>

<head>
    <title>Singapore .NET Developers Community + Azure Community</title>
    <meta charset="utf-8" />
    <meta name="viewport" content="width=device-width, initial-scale=1.0, maximum-scale=1.0, user-scalable=no" />

    ...

    <link rel="icon" href="images/favicon.png" type="image/png">
    <link rel="stylesheet" href="css/main.css" />

    <base href="/" />
    <script src="_framework/blazor.webassembly.js"></script>
</head>

<body>
    <div id="app">
        <div style="position:absolute; top:30vh; width:100%; text-align:center">
            <h2>Welcome to dotnet.sg</h2>
            <div style="width: 50%; display: inline-block; height: 20px;">
                <div class="progress-line"></div>
            </div>
            
            <p>
                The website is loading...
            </p>
        </div>
    </div>

    <div id="blazor-error-ui">
        An unhandled error has occurred.
        <a href="" class="reload">Reload</a>
        <a class="dismiss">🗙</a>
    </div>

    <!-- Scripts -->
    <script src="javascript/jquery.min.js"></script>
</body>

</html>

Secondly, if we hope to have a similar UI template across all the web pages in the website, we can define the HTML template in, for example, MainLayout.razor, as shown below. With this template, the header and footer sections can be shared across different web pages.

@inherits LayoutComponentBase

<!-- Header -->
<header id="header" class="alt">
    <div class="logo"><a href="/">SG <span>.NET + Azure Dev</span></a></div>
</header>

@Body

<!-- Footer -->
<footer id="footer">
    <div class="container">
        <ul class="icons">
           ...
        </ul>
    </div>
    <div class="copyright">
        &copy; ...
    </div>
</footer>

Finally, we simply need to define the @Body of each web page in its own Razor file, for example the Index.razor for the homepage.

In the Index.razor, we will fetch the data from a JSON file hosted on Azure Storage. The JSON file is periodically updated by an Azure Function, which fetches the latest video list from the YouTube channel of the community. Instead of using JavaScript, here we can simply write C# code to do that directly in the Razor file of the homepage.

@code {
    private List<VideoFeed> videoFeeds = new List<VideoFeed>();

    protected override async Task OnInitializedAsync()
    {
        var allVideoFeeds = await Http.GetFromJsonAsync<VideoFeed[]>("...");

        videoFeeds = allVideoFeeds.ToList();
    }

    public class VideoFeed
    {
        public string VideoId { get; set; }

        public Channel Channel { get; set; }

        public string Title { get; set; }

        public string Description { get; set; }

        public DateTimeOffset PublishedAt { get; set; }
    }

    public class Channel
    {
        public string Name { get; set; }        
    }
}
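
The markup that renders the videoFeeds list is not included above; a simplified sketch of what it could look like in Index.razor (the CSS class here is a placeholder, not the one used in the actual repo) is as follows.

@foreach (var videoFeed in videoFeeds)
{
    <div class="video-item">
        <h3>@videoFeed.Title</h3>
        <p>@videoFeed.Channel.Name | @videoFeed.PublishedAt.ToString("dd MMM yyyy")</p>
        <p>@videoFeed.Description</p>
    </div>
}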

Publish to Azure Static Web App from GitHub

We will have our code ready in a GitHub repo with the following structure.

  • .github/workflows
  • DotNetCommunity.Singapore
    • Client
      • (Blazor client project here)

Next, we can proceed to create a new Azure Static Web App where we will host our website. In the first step, we can easily link it up with our GitHub account.

We need to specify the deployment details for the Azure Static Web App.

After that, we will need to provide the Build details so that a GitHub workflow will be automatically generated. That is a GitHub Actions workflow that builds and publishes our Blazor web app. Hence, we must specify the corresponding folder paths within our GitHub repo, as shown in the screenshot below.

In the Build Details, we must setup the folder path correctly.

The “App location” points to the location of the source code of our Blazor web app. For the “Api location”, although we are not using it in our Blazor project now, we can still set it as follows so that in the future we can easily add the Api folder.

With this setup ready, whenever we update the code in our GitHub repo via commits or pull requests, our Blazor web app will be built and deployed.

Our Blazor web app is being built in GitHub Actions.

Custom Domains

For the free tier of Azure Static Web Apps, we are only allowed to have 2 custom domains per app. Another piece of good news is that Azure Static Web Apps automatically provides a free SSL/TLS certificate for the auto-generated domain name and any custom domains we add.

CNAME record validation is the recommended way to add a custom domain; however, it only works for subdomains, such as “www.dotnet.sg” in our case.

For the root domain, which is “dotnet.sg” in our case, we can in principle do it in Azure Static Web App by using TXT record validation and an ALIAS record.

Take note that we can only create an ALIAS record if our domain provider supports it.

However, since there is currently no support for ALIAS or ANAME records in the domain provider that I am using, I have no choice but to use another Azure Function for binding “dotnet.sg”. This is because there is currently no IP address given in Azure Static Web App, but an IP address and a Custom Domain Verification ID are available in Azure Function. With these two pieces of information, we can easily map an A Record to our root domain, i.e. “dotnet.sg”.

Please take note that A Records are not supported for Consumption-based Function Apps. We must pay for an App Service Plan instead.

IP address and Custom Domain Verification ID on Azure Function. The root domain here is also SSL enabled.

After having the Azure Function ready, we need to perform a URL redirect from “dotnet.sg” to “www.dotnet.sg”. With just a proxy, we can create a Response Override with Status Code=302 and add a Location header pointing to https://www.dotnet.sg, as shown in the following screenshot.

HTTP 302 on Azure Function Proxy.
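
The proxy itself is defined in a proxies.json file in the Function App. A rough sketch of what it could contain for this redirect is shown below; the proxy name and route pattern here are assumptions for illustration.

{
  "$schema": "http://json.schemastore.org/proxies",
  "proxies": {
    "RedirectRootToWww": {
      "matchCondition": {
        "route": "/{*restOfPath}"
      },
      "responseOverrides": {
        "response.statusCode": "302",
        "response.headers.Location": "https://www.dotnet.sg"
      }
    }
  }
}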

With all these ready, we can finally get our community website up and running at dotnet.sg.

Welcome to the Singapore .NET/Azure Developer Community at dotnet.sg.

Export SSL Certificate For Azure Function

This step is optional. I need to go through this step because I have an Azure App Service managed certificate in one subscription but the Azure Function in another subscription. Hence, I need to export the SSL certificate and then import it into the other subscription.

We can export the certificate from the Key Vault secret.

In the Key Vault Secret screen, we then need to choose the correct secret version and download the certificate, as shown in the following screenshot.

Downloading the certificate as pfx.

After that, as mentioned in an online discussion about exporting and importing an Azure App Service Certificate which has no password, we shall use a tool such as OpenSSL to regenerate a pfx certificate with a password that Azure Function can accept, using the following commands.

> openssl pkcs12 -in .\old.pfx -out old.pem -nodes

> openssl pkcs12 -export -out .\new.pfx -in old.pem

We will be prompted for a password after executing the first command. We simply press enter to proceed because the certificate, as mentioned above, has no password.

OpenSSL command prompt.

With this step done, I can finally import the cert to the Azure Function in the other subscription.

Yup, that’s all for hosting our community website as a Blazor web app on Azure Static Web App!

References

The code of this Blazor project described in this article can be found in our community GitHub repository: https://github.com/sg-dotnet/website.

Modern Data Warehouse with Azure

This month marks my third year in the port and logistics industry.

In April, I attended a talk organised by NUS Business School on the future-ready supply chain. The talk was delivered by Dr Robert Yap, the YCH Group Executive Chairman. During the talk, Dr Yap mentioned that they innovated to survive because innovation has always been at the heart of their development and growth. To him and his team, technology is not only an enabler for the growth of their business, but also a competitive advantage of the YCH Group.

YCH Group has a vision of integrating the data flows in the supply chain with their unique analytics capabilities so that they can provide total end-to-end supply chain enablement and transformation. Hence, today I’d like to share how, with Microsoft Azure, we can build a data pipeline and a modern data warehouse that help logistics companies gear towards a future-ready supply chain.

Dr Yap shared about the 7PL™ Strategy in YCH Group.

Two months ago, I also had the opportunity to join an online workshop to learn from Michelle Xie, Microsoft Azure Technical Trainer, about Azure Data Fundamentals. The workshop consisted of four modules covering core data concepts, relational and non-relational data offerings in Azure, modern data warehouses, and Power BI. I will also share in this article what I learned in the workshop.

About Data

Data is a collection of facts, figures, descriptions, and objects. Data can be text written on paper, it can be in digital form stored inside electronic devices, or it can be facts in our minds. Data can be classified as follows.

Unstructured data such as images is frequently used in combination with Machine Learning or Azure Cognitive Services capabilities to extract data.

ETL Data Pipeline

To build a data analytics system, we normally have the following steps in a data pipeline to perform the ETL procedure. ETL stands for Extract, Transform and Load. ETL loads data first into a staging storage server and then into the target storage system, as shown below.

ETL procedure in a data processing pipeline.

  • Data Ingestion: Data is moved from one or many data sources to a destination where it can be stored and further analysed;
  • Data Processing: Sometimes the raw data may not be in a format suitable for querying. Hence, we need to transform and clean up the data;
  • Data Storage: Once the raw data has been processed, all the cleaned and transformed data will be stored to different storage systems which serve different purposes;
  • Data Exploration: A way of analysing performance through graphs and charts with business intelligence tools. This is helpful in making informed business decisions.

A map in the Power BI report showing the location of a prime mover within a time period.

In the world of big data, raw data often comes from different endpoints and is stored in different storage systems. Hence, there is a need for a service which can orchestrate the processes to refine these enormous stores of raw data into actionable business insights. This is where Azure Data Factory, a cloud ETL service for scale-out serverless data integration and data transformation, comes into the picture.

There are two ways of capturing the data in the Data Ingestion stage.

The first method is called batch processing, where a set of data is first collected over time and then fed into an analytics system to be processed as a group. For example, the daily sales data collected is scheduled to be processed every midnight. This is not just because midnight is the end of the day, but also because the business normally ends at night and thus midnight is the time when the servers are most likely to have spare computing capacity.

Another method is the streaming model, where data is fed into analytics tools as it arrives and is processed in real time. This is suitable for use cases like collecting GPS data sent from trucks, because every piece of new data is generated continuously and needs to be sent in real time.
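
As a rough illustration of the streaming model, the snippet below shows how a tracker could push a GPS reading to Azure Event Hubs using the Azure.Messaging.EventHubs SDK; the connection string, hub name, and payload fields are placeholders I made up for this sketch.

using Azure.Messaging.EventHubs;
using Azure.Messaging.EventHubs.Producer;
using System.Text;
using System.Text.Json;

// Placeholder connection details.
await using var producer = new EventHubProducerClient("<connection-string>", "<event-hub-name>");

// One GPS reading from the prime mover, serialised as JSON.
var reading = new
{
    TruckId = "SGP-0001",
    Latitude = 1.3521,
    Longitude = 103.8198,
    RecordedAt = DateTimeOffset.UtcNow
};

using EventDataBatch batch = await producer.CreateBatchAsync();
batch.TryAdd(new EventData(Encoding.UTF8.GetBytes(JsonSerializer.Serialize(reading))));

await producer.SendAsync(batch);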

Modern Data Warehouse

A modern data warehouse allows us to gather all our data at any scale easily, and to get insights through analytics, dashboard, and reports. The following image shows the data warehouse components on Azure.

Azure modern data warehouse architecture. (Image Source: Azure Docs)

For a big data pipeline, the data is ingested into Azure through Azure Data Factory in batches, or streamed near real-time using Apache Kafka, Event Hubs, or IoT Hub. This data then lands in Azure Data Lake Storage for long-term persisted storage.

The Azure Data Lake Storage is an enterprise-wide hyper-scale repository for large volumes of raw data. It is a suitable staging store for our ingested data before the data is converted into a format suitable for data analysis. Thus, it can store any data in its native format, without requiring any prior transformations. Data Lake Storage can be accessed from Hadoop with the WebHDFS-compatible REST APIs.

As part of our data analytics workflow, we can use Azure Databricks as a platform to run SQL queries on the data lake and provide results for the dashboards in, for example, Power BI. In addition, Azure Databricks also integrates with the MLflow machine learning platform API to support the end-to-end machine learning lifecycle from data preparation to deployment.

In the logistics industry, the need to store spatial data is greater than ever.

Let’s say a container trucking company collects data about each container delivery through an IoT device installed on the vehicle. Information such as the location and the speed of the prime mover is constantly sent from the IoT device to Azure Event Hub. We can then use Azure Databricks to correlate the trip data, and also to enrich the correlated data with neighbourhood data stored in the Databricks file system.

In addition, to process large amounts of data efficiently, we can also rely on Azure Synapse Analytics, which is basically an analytics service and a cloud data warehouse that lets us scale compute and storage elastically and independently, with a massively parallel processing architecture.

Finally, we have Azure Analysis Services, which is an enterprise-grade analytics engine as a service. It is used to combine data, define metrics, and secure the data in a single, trusted tabular semantic data model. As mentioned by Christian Wade, the Power BI Principal Program Manager at Microsoft, in March 2021, they have brought Azure Analysis Services capabilities to Power BI.

Pricing tiers available for Azure Analysis Services.

Relational Database Deployment Options on Azure and Hosting Cost

On Azure, there are two database deployment options available, i.e. IaaS and PaaS. The IaaS option means that we host our own SQL Server on Azure virtual machines. For the PaaS approach, we can use either Azure SQL Database, which is considered DBaaS, or Azure SQL Managed Instance. Unless the team needs OS-level access to and control of the SQL servers, the PaaS approach is normally the best choice.

Both PaaS and IaaS options include a base price that covers the underlying infrastructure and licensing. With IaaS, we can reduce the cost by shutting down the resources. However, with PaaS, the resources are always running unless we drop and re-create them when they are needed.

Cloud SQL Server options: SQL Server on IaaS, or SaaS SQL Database in the cloud, compared by the level of administration we have over the infrastructure and the degree of cost efficiency. (Image Source: Azure Docs)

SQL Managed Instance is the latest deployment option, which enables easy migration of most on-premises databases to Azure. It’s a fully-fledged SQL instance with nearly complete compatibility with the on-premises version of SQL Server. Also, since SQL Managed Instance is built on the same PaaS service infrastructure, it comes with all the PaaS features. Hence, if you would like to migrate from on-premises to Azure without management overhead, but at the same time you require instance-scoped features such as SQL Server Agent, you can try SQL Managed Instance.

Andreas Wolter, one of only seven Microsoft Certified Solutions Masters (MCSM) for the Data Platform worldwide, once came to the Singapore .NET Developers Community to talk about SQL Database Managed Instance. If you’re new to SQL Managed Instance, check out the video below.

Spatial Data Types

Visibility plays a crucial role in the logistics industry because it relates to the ability of supply chain partners to access and share operational information with other parties. Tracking asset locations with GPS is one example. However, how should we handle geography data in our database?

Spatial data, also known as geospatial data, is data represented by numerical values in a geographic coordinate system. There are two types of spatial data, i.e. the Geometry Data Type, which supports Euclidean flat-earth data, and the Geography Data Type, which stores round-earth data, such as GPS latitude and longitude coordinates.

In Microsoft SQL Server, native spatial data types are used to represent spatial objects. In addition, it is able to index spatial data, provide cost-based optimizations, and support operations such as the intersection of two spatial objects. This functionality is also available in Azure SQL Database and Azure Managed Instances.

geom_hierarchy
The geometry hierarchy upon which the geometry and geography data types are based. (Image Source: SQL Docs)

Let’s say now we want to find the closest containers to a prime mover as shown in the following map.

The locations of 5 containers (marked in red) and the location of the prime mover (marked in blue).

In addition, suppose we have a table of container positions defined with the schema below.

CREATE TABLE ContainerPositions
(
    Id int IDENTITY (1,1),
    ContainerNumber varchar(13) UNIQUE,
    Position GEOGRAPHY
);

We can then use a spatial function such as STDistance, which returns the shortest distance between two geography locations, in our query to sort the containers from the shortest to the longest distance from the prime mover.
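
For illustration, a query along these lines will do the sorting; the prime mover’s coordinates below are made-up sample values.

DECLARE @primeMover GEOGRAPHY = geography::Point(1.3521, 103.8198, 4326);

SELECT   ContainerNumber,
         Position.STDistance(@primeMover) AS DistanceInMetres
FROM     ContainerPositions
ORDER BY Position.STDistance(@primeMover);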

The container “HKXU 200841-9” is the nearest container to the prime mover.

In addition, starting from version 2.2, Entity Framework Core also supports mapping to spatial data types using the NetTopologySuite spatial library. So, if you are using EF Core in your ASP .NET Core project, for example, you can easily get mapping to spatial data types.

In the .NET Conference Singapore 2018, we announced the launch of Spatial Extension in EF Core 2.2.
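
To give an idea of how this looks, here is a hedged sketch of an EF Core mapping and query using NetTopologySuite types (it requires the Microsoft.EntityFrameworkCore.SqlServer.NetTopologySuite package); the entity, context, and connection string below are made up for illustration.

using System.Collections.Generic;
using System.Linq;
using Microsoft.EntityFrameworkCore;
using NetTopologySuite.Geometries;

public class ContainerPosition
{
    public int Id { get; set; }
    public string ContainerNumber { get; set; }
    public Point Position { get; set; }   // Mapped to the SQL Server geography type.
}

public class LogisticsContext : DbContext
{
    public DbSet<ContainerPosition> ContainerPositions { get; set; }

    protected override void OnConfiguring(DbContextOptionsBuilder options) =>
        options.UseSqlServer("<connection-string>", x => x.UseNetTopologySuite());
}

public static class NearestContainerQuery
{
    // Returns the containers sorted from nearest to farthest; Distance is translated to STDistance in SQL.
    public static List<ContainerPosition> GetNearestContainers(LogisticsContext db, Point primeMover) =>
        db.ContainerPositions
          .OrderBy(c => c.Position.Distance(primeMover))
          .ToList();
}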

Non-Relational Databases on Azure

Azure Table Storage is one of the Azure services for storing non-relational structured data. It provides a key/attribute store with a schemaless design. Since it’s a NoSQL datastore, it is suitable for datasets which do not require complex joins and can be denormalised for fast access.

Each Azure table consists of related entities, similar to rows in an RDBMS. Each entity can have up to 252 properties to store the data, together with a partition key. Entities with the same partition key are stored in the same partition on the same partition server. Thus, entities with the same partition key can be queried more quickly. This also means that batch processing, the mechanism for performing atomic updates across multiple entities, can only operate on entities stored in the same partition.

In Azure Table Storage, using more partitions increases the scalability of our application. However, at the same time, using more partitions might limit the ability of the application to perform atomic transactions and maintain strong consistency for the data. We can make use of this design to store, for example, data from each IoT device in a warehouse in a different partition in Azure Table Storage.
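
For example, using the Azure.Data.Tables SDK, a telemetry reading could be written with the device ID as the partition key; the table name and property names below are made up for this sketch.

using Azure.Data.Tables;

var tableClient = new TableClient("<connection-string>", "DeviceTelemetry");

// Partition per device: all readings from one device end up in the same partition.
var entity = new TableEntity(partitionKey: "device-001", rowKey: Guid.NewGuid().ToString())
{
    { "Temperature", 23.5 },
    { "Humidity", 60 },
    { "RecordedAt", DateTimeOffset.UtcNow }
};

tableClient.AddEntity(entity);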

For a larger-scale system, we can also design a data solution architecture that captures real-time data via Azure IoT Hub and stores it in Cosmos DB, which is a fast and flexible distributed database that scales seamlessly with guaranteed latency and throughput. If there is existing data in other data sources, we can also import data from sources such as JSON files, CSV files, SQL databases, and Azure Table Storage into Cosmos DB with the Azure Cosmos DB Migration Tool.

Azure Cosmos DB Migration Tool can be downloaded as a pre-compiled binary.

Globally, Industry 4.0 is transforming the supply chain into a smart and effective process that produces new streams of income. The key idea motivating Industry 4.0 is to guide companies in replacing current manual processes with digital technologies.

Hard copies of the container proof of delivery (POD), for example, are still necessary in today’s container trucking industry. Hence, storing images and files for later document generation and printing is still a key feature in a digitalised supply chain workflow.

Proof of Delivery is still mostly recorded on paper and sent via email or instant messaging services like WhatsApp. There is also no accepted standard for what a proof of delivery form should specify; each company more or less makes up its own rules.

On Azure, we can make use of Blob Storage to store large, discrete, binary objects that change infrequently, such as documents like the Proof of Delivery mentioned earlier.
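
As a simple illustration, using the Azure.Storage.Blobs SDK, a scanned POD document could be uploaded as follows; the container name, file path, and blob name are placeholders.

using Azure.Storage.Blobs;
using System.IO;

var container = new BlobContainerClient("<connection-string>", "proof-of-delivery");

// Upload the scanned POD, organised by container number.
using FileStream pod = File.OpenRead(@"C:\scans\HKXU2008419.pdf");
await container.UploadBlobAsync("HKXU2008419/pod-20210801.pdf", pod);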

In addition, there is another service called Azure Files that provides serverless enterprise-grade cloud file shares. Azure Files can thus completely replace or supplement traditional on-premises file servers or NAS devices.

Hence, we can upload files from a computer to the Azure File Share directly. The files will then be accessible on another computer which is also connected to the Azure File Share, as shown in the screenshot below.

We can mount Azure File Share on macOS, Windows, and even Linux.

The Data Team

Setting up a new data team, especially in a startup, is a challenging problem. We need to explore roles and responsibilities in the world of data.

There are basically three roles that we need to have in a data team.

  • Database Administrator: In charge of operations such as managing the databases, creating database backups, restoring backups, monitoring database server performance, and implementing data security and access rights policy.
    • Tools: SQL Server Management Studio, Azure Portal, Azure Data Studio, etc.
  • Data Engineer: Works with the data to build up data pipeline and processes as well as apply data cleaning routine and transformations. This role is important to turn the raw data into useful information for the data analysis.
    • Tools: SQL Server Management Studio, Azure Portal, Azure Synapse Studio.
  • Data Analyst: Explores and analyses data by creating data visualisations and reports which transform data into insights to help in business decision making.
    • Tools: Excel, Power BI, Power BI Report Builder

In 2016, Gartner, a global research and advisory firm, shared a Venn diagram on how data science is multi-disciplinary, as shown below. There are some crucial technical skills needed, such as statistics, querying, modelling, R, Python, SQL, and data visualisation. Besides the technical skills, the team also needs to be equipped with business domain knowledge and soft skills.

The data science Venn Diagram. (Image source: Gartner)

In addition, according to Lisa Cohen, Microsoft Principal Data Science Manager, the data team can be organised in two ways.

  • Embedded: The data science teams are spread throughout the company and each of the teams serves specific functional team in the company;
  • Centralised: There will be a core data team providing services to all functional teams across the company.

References