Cloud Migration – Lessons Learned

Enterprise cloud, hybrid cloud, SaaS, PaaS… What do they all mean? Our technologist shares insights from a cloud migration he led for a non-profit organization, and the lessons learned along the way.

“In the 2020s, legacy business processes and infrastructure will be among the biggest targets for organizations looking to optimize and cut operational costs. Symple IT Solutions can help your organization navigate the rapidly changing and unpredictable technology landscape.”

Symple Chief Technology Officer

The cloud we will be covering today is Microsoft’s, known as Microsoft Azure. Azure competes directly with Amazon’s AWS and holds the second-largest share of the cloud market. Microsoft has made tremendous strides embracing open-source technologies and baking them right into its products; a wise strategy, as the overall industry embraces open source and decentralization.

Serverless

If you’re an IT pro, or have been following the latest industry trends, there is a good chance you’ve heard or read the word serverless. Instead of giving you the Google definition of what serverless actually is, we will speak from our experience building out serverless services for an organization. The reference cloud provider here is Azure, but AWS and other providers offer similar services, usually under different names. The serverless pieces we leaned on fall into two groups (a sketch of both function trigger styles follows the list):

  • Functions
    • Event Driven
    • Schedules
  • Infrastructure
    • Database
    • App Services
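
To make the two function styles concrete, here is a minimal sketch using the in-process C# model for Azure Functions. The function names, the CRON schedule, and the placeholder logic are all hypothetical; treat this as an illustration of the trigger attributes, not our client’s actual code.

    using Microsoft.AspNetCore.Http;
    using Microsoft.AspNetCore.Mvc;
    using Microsoft.Azure.WebJobs;
    using Microsoft.Azure.WebJobs.Extensions.Http;
    using Microsoft.Extensions.Logging;

    public static class ExampleFunctions
    {
        // Schedule-driven: an NCRONTAB expression replaces the old Task Scheduler entry.
        // This one runs every day at 6:00 AM.
        [FunctionName("NightlySync")]
        public static void NightlySync(
            [TimerTrigger("0 0 6 * * *")] TimerInfo timer, ILogger log)
        {
            log.LogInformation("Scheduled sync started.");
            // ... batch work that used to run on the Windows Server ...
        }

        // Event-driven: fires the moment an external system POSTs to this endpoint,
        // instead of waiting for the next scheduled run.
        [FunctionName("OnSourceEvent")]
        public static IActionResult OnSourceEvent(
            [HttpTrigger(AuthorizationLevel.Function, "post")] HttpRequest req,
            ILogger log)
        {
            log.LogInformation("Event received.");
            // ... process the payload immediately ...
            return new OkResult();
        }
    }

One thing that trips people up: the NCRONTAB format used by timer triggers has six fields (seconds come first), not the five fields of standard cron.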

Our familiarity with traditional business processes and with the data integration patterns most common at small and medium-sized businesses lets us build a full picture of how to deliver serverless services; not just to be trendy, but to provide real business value by driving down costs and increasing reliability and uptime.

The case study: Our client has various programs that execute throughout the day on a scheduled basis. These tasks are centralized on a single Windows Server and are executed by the built-in Windows Task Scheduler. Upon initial analysis, we discovered that many of these processes fail to run, are not instant (when they should be), and business users often have to wait 24+ hours for a process to execute before they can complete tasks such as reporting and data entry. The code behind these tasks is compiled into .dll/.exe files, so we couldn’t easily tell which version of the application/service was actually running on the server (a quick way to inspect such binaries for version metadata is sketched below). These issues are what we call ‘low-hanging fruit’, and our techs acted on them immediately.
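When source-to-binary traceability is lost like this, the version metadata stamped into a compiled assembly is sometimes the only clue. Here is a small, hypothetical C# sketch (the path is a placeholder) showing how to read that metadata; note it only helps if the build actually stamped meaningful version numbers.

    using System;
    using System.Diagnostics;
    using System.Reflection;

    class InspectVersion
    {
        static void Main(string[] args)
        {
            // Placeholder path; point this at the scheduled task's binary.
            string path = args.Length > 0 ? args[0] : @"C:\Tasks\LegacyJob.dll";

            // AssemblyVersion, read from metadata without executing the code.
            var asmName = AssemblyName.GetAssemblyName(path);
            Console.WriteLine($"AssemblyVersion: {asmName.Version}");

            // FileVersion/ProductVersion, stamped at build time.
            var info = FileVersionInfo.GetVersionInfo(path);
            Console.WriteLine($"FileVersion:     {info.FileVersion}");
            Console.WriteLine($"ProductVersion:  {info.ProductVersion}");
        }
    }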

Our process: We started by locating the source code for all of these services. All of the code was written in C# on the .NET Framework, so our first step was converting it to .NET Core. This was not a must-do step, but the benefits of migrating legacy code to a modern, platform-agnostic framework made it a no-brainer; not to mention the savings of running code on a Linux environment versus a Windows one. Luckily, the libraries used in our client’s services were all simple HTTP and JSON libraries that ported easily to .NET Core, so the migration was fairly straightforward.

Next, we identified the critical services that need to run on an event basis rather than a schedule, and quickly discovered that our trigger application (the system whose database is modified first in the flow of the service) supported webhooks. Webhooks are essentially abstracted triggers that live at the application layer and monitor changes to the database (add/update/remove). They gave the source application a way to send an HTTP POST request, with a consistent JSON payload, to our custom function app written in .NET Core every time an event occurred; in this case, every time a user completed a course on our client’s Learning Management System. Once the webhook fired, our code immediately ran, processing all of our logic.

We were able to test and verify this locally using the Function App project template in Visual Studio (our preferred IDE); the template runs a local server that listens for POST requests on a localhost endpoint, and we used Postman to send JSON data to that endpoint. Once our code was tested, we were ready to move to the cloud. Visual Studio provides an intuitive way to publish a Function App project directly to Microsoft Azure, all within the IDE; no DevOps engineer or sysadmin required! Once we selected the resource group and geographic location to host the function, our function app was published to the cloud. Easy!

Having our custom integrations run in a serverless environment in the cloud offers a wide range of capabilities, such as queuing, 99.99% uptime, load distribution, intelligent error reporting and handling, and built-in verbose logging. A sketch of what the webhook-receiving function might look like follows.
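
Below is a minimal, hypothetical version of that HTTP-triggered receiver, written for the in-process C# Azure Functions model. The payload shape (UserId, CourseId, CompletedAt) and the function name are assumptions for illustration; a real LMS sends its own schema, so the DTO must be matched to the provider’s webhook documentation.

    using System;
    using System.IO;
    using System.Text.Json;
    using System.Threading.Tasks;
    using Microsoft.AspNetCore.Http;
    using Microsoft.AspNetCore.Mvc;
    using Microsoft.Azure.WebJobs;
    using Microsoft.Azure.WebJobs.Extensions.Http;
    using Microsoft.Extensions.Logging;

    public static class CourseCompletedWebhook
    {
        // Illustrative payload shape; align with the LMS's actual webhook schema.
        private class CourseCompletedEvent
        {
            public string UserId { get; set; }
            public string CourseId { get; set; }
            public DateTime CompletedAt { get; set; }
        }

        [FunctionName("CourseCompleted")]
        public static async Task<IActionResult> Run(
            [HttpTrigger(AuthorizationLevel.Function, "post")] HttpRequest req,
            ILogger log)
        {
            // Read the raw JSON body the webhook POSTed to us.
            string body = await new StreamReader(req.Body).ReadToEndAsync();
            var evt = JsonSerializer.Deserialize<CourseCompletedEvent>(
                body, new JsonSerializerOptions { PropertyNameCaseInsensitive = true });

            if (evt?.UserId is null)
                return new BadRequestObjectResult("Unrecognized payload.");

            log.LogInformation("User {UserId} completed course {CourseId}",
                evt.UserId, evt.CourseId);
            // ... run the downstream logic that used to wait for the nightly job ...
            return new OkResult();
        }
    }

Running the project locally starts the Functions host on http://localhost:7071, so a Postman POST to http://localhost:7071/api/CourseCompleted with a sample JSON body exercises the same code path the production webhook will hit.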