The Cloud – Anagnorisis and Peripeteia

In my work here at Cloudstep we have two distinct sides to our business: a consulting practice, “Jtwo Solutions”, and a cloud modelling software and services practice, “Cloudstep”. Working across both affords me the benefit of hands-on consulting, technical architecture and implementation, as well as scenario-based cost modelling with a wide range of government and commercial customers.

Recently, I’ve been reflecting on what it is that makes me happy about working with customers within these businesses. I decided to set myself the challenge of coming up with just two words that could articulate this in a concise form.

After reflecting on this for some time, two words came to mind: “Anagnorisis and Peripeteia”. After sleeping on it for a few days, these words have stuck.

So what the hell are Anagnorisis and Peripeteia…? In short, Aristotle made these words famous (for me anyway).

Aristotle

Anagnorisis:  the transition or change from ignorance to knowledge.

Peripeteia: a sudden or unexpected reversal of circumstances or situation.

When considering the meaning of these two words, I think they elegantly describe the two-way street that is IT consulting and cost modelling. I’ve always enjoyed the excitement of the changing IT landscape: ever evolving, disruptive yet inspiring, and endlessly yielding new opportunities.

Opportunity is what business thrives on; competitive advantage can be found here. Businesses that capitalise on the right new knowledge or technology win. The trouble is that “new” is short-lived, and you have to stay ahead of the curve. In the fast-paced, evolving IT space, anagnorisis is something you are constantly chasing.

I repeatedly find myself in the position of both educator and student, assisting clients with the relentless learning while learning myself. This is delightful, challenging and terrifying all at the same time, but it’s what makes IT interesting and enjoyable for me.

This brings me to the second word… peripeteia. Cloudstep provides customers with a multi-dimensional view of the cost of delivering application workloads. We do this by modelling teams of people, the functions they carry out, the applications they use, the infrastructure the applications live on, and the underlying hosting costs of that infrastructure (servers, storage, networks, data centres).

With this data we can accurately articulate the true cost of a specific workload and conduct a fair comparison with alternative delivery models such as software as a service or a public cloud implementation.
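
To make that concrete, here is a toy sketch of the kind of roll-up this modelling produces. The figures are entirely hypothetical and this is not the Cloudstep engine, just the shape of the comparison: sum a workload’s people, application and infrastructure costs, then weigh the total against an alternative delivery model.

# Toy cost roll-up: all figures are hypothetical annual costs.
workload = {
    "people": 120_000,          # team operating and supporting the workload
    "applications": 45_000,     # licensing and support
    "infrastructure": 80_000,   # servers, storage, networks, data centres
}

on_prem_total = sum(workload.values())
saas_quote = 190_000            # hypothetical SaaS alternative

print(f"On-premises: ${on_prem_total:,} vs SaaS: ${saas_quote:,}")
print("Cheapest path:", "SaaS" if saas_quote < on_prem_total else "on-premises")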

Anagnorisis happens here too, but what is really beautiful is the peripeteia that this knowledge can enable. Cloudstep gives businesses clarity and can help them see the most cost-effective path forward. For me, happiness is the moment a business can shift its focus away from undifferentiated workloads and direct its IT resources towards workloads specific to its core business, putting the effort into innovation in its own space.

The future I imagine for IT is one where we don’t have to spend as much time on undifferentiated workloads; rather, one where we have more time to thrive on the new opportunities that are yet to come.

A career with a flammable CV

Planned Obsolescence

A baked-in part of the design of technology products and an unavoidable side effect of a career in IT

In a discussion with a colleague recently we reflected on how our careers and our CVs race ahead while the invisible fuse line of obsolescence burns along behind, and cherished skillsets and competencies burn away. We have intimate knowledge of technologies nobody cares about anymore. We were deeply familiar with products from companies now confined to a fringe article on Wikipedia. We have programming languages on our CVs we’ll never use again; in fact, when they are mentioned in a meeting we resist the urge to admit any knowledge. The bullet points of our CVs settle over time into a thing we just call “experience”. The fact that we have to reinvent ourselves every five years is exhausting but also exhilarating. In some areas of IT that cycle is down to 12 or 18 months (Scriptaculous and Prototype, really? All the cool kids use React, jQuery and Bootstrap now).

Few industries suffer from this planned obsolescence the way IT does. Other professions are made redundant by change; ours has the redundancy built right in. Through the decades, we have several careers in one.

There are two ways we can deal with this reality and only one that offers a clear path forward.

Option 1: Build a Moat (bad)

We can hunker down with our CV and resist change. This is comfortable for a while; we end up being that heroic guru who saves the day every time. The march of progress continues though, and while we can fight change, it eventually overwhelms us. What made us special, essential even, is all of a sudden not needed anymore. The reaction to this is to build a moat around our technology or skillset. We white-ant suggestions of anything new and act to engineer a climate of fear of change. It’s not that the technology we work on is wrong, flawed or no longer in use; it’s just that improvements, efficiencies and lower costs can no longer be ignored. The cost and risk of change is eventually outweighed by the benefits that can be realised, and the moat strategy comes unstuck. This is often coupled with an unfortunate correlation between the point when you believe you are indispensable and the day you get your pink slip. You end up being the COBOL programmer you used to consider a dinosaur.

Option 2: Be Willing to Experiment (good)

Another way of dealing with relentless change is to make it part of our career. We should focus on the problem at hand, not the tool we use to solve it. When we become involved and invested in a particular technology it often becomes the focus, and we forget why we used it in the first place. Load balancers and highly available database services with big arrays of web servers in the middle are great, but their purpose is to deliver a website to people so they can go about their business more effectively. It doesn’t mean the technology and toolset aren’t important, but it is inescapable that they are a means to an end and no more. We need to be prepared to throw away what we know and embrace something new if it’s a better solution to our problem. If we look at what we do this way, the business will inevitably respect us for being part of the solution, not a roadblock. None of this means you throw everything out when something new comes along; there is still the rule of “if it ain’t broke, don’t fix it”. There’s a balance between keeping what works and being open to what’s new.

Observing this in the wild

Many of us in IT are consultants, though. We work in a wide range of organisations, from large public companies and government agencies through to non-profits and medium-sized businesses. This broad experience across different industries with different cultures is challenging and fascinating but never dull. What sticks out are the similarities: we see Moat Builders and Experimenters everywhere. In our practice we talk a lot about public cloud services in relation to traditional on-premises solutions, and this quickly flushes out the moat builders and the experimenters. We look around meeting tables and pick who’s who based on the body language. Crossed arms and leaning back are a good indicator. But there are some who are open, who lean in with open arms; they engage with the conversation and want to learn.

In advocating for new technologies and practices, it is part of our role to persuade people that these don’t represent a threat but an opportunity. We should encourage people to try new things, to return to being out of their depth for a while in order to progress. Ultimately the effort is well worth it.

How do we make Moat Builders into Experimenters?

People’s livelihood, self-respect and satisfaction come from being useful, making a difference and feeling like they contribute to something. There is a lot at stake, so people need to feel comfortable and they need to be motivated.

  • Sell the change. People need to buy in, and for that to happen they need to be sold on the idea. Explain to them why this new way of doing things is better than before.
  • Appeal to laziness. Explain how it is easier than before to do the same thing. Be careful, though, not to scare them into thinking that their job will be factored out.
  • Don’t call their baby ugly. People’s skills and experience are hard won and their accomplishments should be respected. Don’t belittle how it’s done now; explain how it could be better.
  • Keep going until they start convincing you. What you’re looking for is people starting to echo back the value of what you’re telling them. You want them to agree with you and become advocates.

This has all happened before

Looking outside of IT we see many examples of skills, professions and whole industries disappearing into history. The industrial revolution changed the nature of work, and mechanised manufacturing altered what it meant to be a craftsman. At each point, people were freed from mundane, unfulfilling and often dangerous work. Upheaval of this nature has consequences for individuals, but society and civilisation moved on. Whaling is no longer a sought-after skill, and neither is understanding X.25 protocol communications.

Don’t be frightened of a changing CV; just be prepared for the challenge of reinventing yourself over and over again.

Cognito authentication integration with Django using authorization code grant

Note: This assumes knowledge of AWS Cognito backend configuration and the underlying concepts; it’s mostly the setup from an application integration perspective that is covered here.

Recently we have been working on a Django project where a secure and flexible authentication system was required. As most of our existing infrastructure is on AWS, we chose Cognito as the backend.

Below are the steps we took to get this working and some insights learned on the way.

Django Warrant

The first attempt was using django_warrant, which is probably the first thing that comes up when you google ‘how to django and cognito’.

django_warrant works by injecting an authentication backend into Django which does some magic that allows your username/password to be submitted and checked against a configured user pool; on success it authenticates you and, if required, creates a stub Django user.
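
For context, wiring it up is just a settings change. This is roughly the configuration as I recall it from the project README; the setting names may have changed since, so verify against the source before relying on it.

# settings.py (a sketch from memory of the django_warrant README; verify names)
AUTHENTICATION_BACKENDS = [
    'django_warrant.backend.CognitoBackend',
    'django.contrib.auth.backends.ModelBackend',  # keep the default as a fallback
]

COGNITO_USER_POOL_ID = '<user_pool_id>'
COGNITO_APP_ID = '<client_id>'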

The basics of this were very easy to get working and integrate, but it had a few issues:

  • We still see username/password requests and have to send them on.
  • By default it can only be configured for one user pool.
  • Does not support federated identity provider workflows.
  • The GitHub project did not seem very active or recently updated.

Ultimately we chose not to use this module; however, inspiration was taken from its source code for some of the user handling we implemented later on.

Custom authorization_code workflow implementation

This involves using the Cognito hosted login form, which does both user pool and connected identity provider authentication (O365/Azure, Google, Facebook, Amazon).

The form can be customised with HTML, CSS and images and put behind a custom URL; other aspects of the process and events can be changed and reacted to using triggers and Lambda.

Once you are authenticated, Cognito redirects you back to the page of your choosing (usually your application’s login page or a custom endpoint) with a set of tokens. Using these tokens you then grab the authenticated user’s details and authenticate them within the context of your app.
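
Kicking the flow off is just a redirect to the hosted form. A minimal sketch of building that URL follows; the placeholders match the convention in the code further down, and the /login path and query parameters are the standard Cognito hosted UI format for this grant.

import urllib.parse

# Build the hosted UI login URL for the authorization code grant.
params = urllib.parse.urlencode({
    "response_type": "code",           # "token" would request implicit grant instead
    "client_id": "<client_id>",
    "redirect_uri": "https://your-app/login",
    "scope": "openid email profile",
})
login_url = "https://<domain>.auth.ap-southeast-2.amazoncognito.com/login?" + params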

The differences between authorization code grant and implicit grant are:

  • Implicit grant
    • Intended for client-side authentication (mostly JavaScript applications)
    • Sends both the id_token (JWT) and access_token in the redirect response
    • Sends the tokens behind an #anchor so they are not seen by the web server
    • https://your-app/login#id_token=n&access_token=n
  • Authorization code grant
    • Intended for server-side authentication
    • Sends an authorization code in the redirect response
    • Sends it as a normal GET parameter
    • https://your-app/login?code=n
    • Your application holds a preconfigured secret
    • Code + secret get turned into an id_token and access_token via the oauth2/token endpoint

We chose the authorization code grant workflow. It takes a bit more effort to set up, but it is generally more secure and avoids the hacky JavaScript shenanigans that would be needed to get implicit grant working with a Django server-side backend.

After these steps you can use boto3 or helper libraries to turn those tokens into a set of attributes (email, name, other custom attributes) kept by Cognito. Then you simply hook this into your internal user/session logic by matching on your chosen attributes, like email or username.
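
For example, with plain boto3 the access token can be exchanged for the user’s attributes directly. A minimal sketch (the region is an assumption):

import boto3

# Exchange an access token for the user's attributes (no helper library needed).
client = boto3.client("cognito-idp", region_name="ap-southeast-2")
resp = client.get_user(AccessToken=access_token)
attributes = {attr["Name"]: attr["Value"] for attr in resp["UserAttributes"]}
# e.g. {'email': 'joe@jtwo.solutions', 'given_name': 'Joe', ...}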

I was unable to find any specific library support to handle some aspects of this, like the token handling in Python or the Django integration, so I have included some code which may be useful.

Code

This can be integrated into a view to get the user details from Cognito based on a token; this view sits at the redirect URL that Cognito returns to.

import warrant
import cslib.aws

def tokenauth(request):
    authorization_code = request.GET.get("code")
    token_grabber = cslib.aws.CognitoToken(
        client_id="<client_id>",
        client_secret="<client_secret>",
        domain="<domain>",
        redir="<redir>",
        region="<region>",  # optional, defaults to ap-southeast-2 in the class below
    )

    id_token, access_token = token_grabber.get(authorization_code)

    if id_token and access_token:
        # This uses warrant (different than django_warrant),
        # a helper lib that wraps Cognito.
        # Plain boto3 can do this also.
        cognito = warrant.Cognito(
            "<user_pool_id>",
            "<client_id>",
            id_token=id_token,
            access_token=access_token,
        )

        # Their lib is a bit broken: because we don't supply a username it won't
        # build a legit user object for us, so we reach into the cookie jar....
        # {'given_name': 'Joe', 'family_name': 'Smith', 'email': 'joe@jtwo.solutions'}
        data = cognito.get_user()._data
        return data
    else:
        return None
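
From here, hooking the returned attributes into Django’s own user and session machinery is straightforward. A hedged sketch follows; matching on email is our choice rather than a requirement, so adapt it to your user model.

from django.contrib.auth import get_user_model, login

def login_cognito_user(request, data):
    # 'data' is the attribute dict returned by tokenauth() above.
    User = get_user_model()
    user, created = User.objects.get_or_create(
        username=data["email"],
        defaults={
            "email": data["email"],
            "first_name": data.get("given_name", ""),
            "last_name": data.get("family_name", ""),
        },
    )
    # Tell Django which backend authenticated this user so login() accepts it.
    login(request, user, backend="django.contrib.auth.backends.ModelBackend")
    return user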

A class that handles the oauth2/token workflow; this is mysteriously missing from the boto3 library, which seems to handle everything else quite well…

from http.client import HTTPSConnection
from base64 import b64encode
import urllib.parse
import json

class CognitoToken(object):
    """
    Why you no do this boto3...
    """
    def __init__(self, client_id, client_secret, domain, redir, region="ap-southeast-2"):
        self.client_id = client_id
        self.client_secret = client_secret
        self.redir = redir
        self.token_endpoint = "{0}.auth.{1}.amazoncognito.com".format(domain, region)
        self.token_path = "/oauth2/token"

    def get(self, authorization_code):
        headers = {
            "Authorization" : "Basic {0}".format(self._encode_auth()),
            "Content-type": "application/x-www-form-urlencoded",
        }

        query = urllib.parse.urlencode({
                "grant_type" : "authorization_code",
                "client_id" : self.client_id,
                "code" : authorization_code,
                "redirect_uri" : self.redir,
            }
        )

        con = HTTPSConnection(self.token_endpoint)
        con.request("POST", self.token_path, body=query, headers=headers)
        response = con.getresponse()

        if response.status == 200:
            data = json.loads(response.read().decode("utf-8"))
            return (data["id_token"], data["access_token"])

        return None, None

    def _encode_auth(self):
        # Auth is a base64 encoded client_id:secret
        string = "{0}:{1}".format(self.client_id, self.client_secret)
        return b64encode(bytes(string, "utf-8")).decode("ascii")
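
Usage then mirrors the view above, with the same placeholder values. Plain http.client keeps the class dependency-free, which was the point of writing it.

token_grabber = CognitoToken(
    client_id="<client_id>",
    client_secret="<client_secret>",
    domain="<domain>",
    redir="https://your-app/login",
)
id_token, access_token = token_grabber.get("<authorization_code>")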


Azure PowerShell ‘Az’ Module

https://azure.microsoft.com/en-us/blog/azure-powershell-az-module-version-1/

Microsoft released a new PowerShell module specifically for Azure late last year called “Az”. On the plus side, Az ensures that both Windows PowerShell and PowerShell Core users can get the latest Azure tooling on every platform, be it Windows PowerShell on Windows or PowerShell Core on my preferred operating system, macOS.

Microsoft state that the Az module will be updated on a two-week cadence and will always be up to date, so that’s nice.

I’ve resisted upgrading to the new Az module until the completion of a recent customer engagement, so as to avoid any complexity that a switch in modules might introduce. Call me risk averse, I know… So now that the project is complete, I’m excited to make the switch.

Ok so how do I upgrade from AzureRM to Az?

If you’ve been using PowerShell for Azure, you undoubtedly already have the AzureRM module installed, so it’s out with the old and in with the new… To accomplish this task I made use of some simple PowerShell to find the installed modules with a name like AzureRM and then uninstall them. Here is the code I lazily leeched from my colleague Arran Peterson after he successfully uninstalled the old modules.

Remove all the old AzureRM modules first…

$azurerm = Get-Module -ListAvailable | ? {$_.Name -like "AzureRM*"}

ForEach ($module in $azurerm) {
    $name = $module.Name
    $version = $module.Version
    Uninstall-Module -Name $name -MaximumVersion $version -Force
}

At the time of writing, the latest version available from the PowerShell Gallery is 1.5.0: https://www.powershellgallery.com/packages/Az/1.5.0

To install the module simply open PowerShell on your machine and enter:

Install-Module -Name Az

Boom, it’s that easy…
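
If you want to double-check before moving on, both of these are standard cmdlets:

# Confirm which version of Az was installed
Get-InstalledModule -Name Az

# Then sign in to Azure with the new module
Connect-AzAccount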

Ok great, but won’t this break all my scripts?

So when I first heard of the new module and the change in cmdlet namespace, my first reaction was shock… I’ve produced loads of PowerShell for customers over the past couple of years that uses the AzureRM-named cmdlets.

Microsoft state on their PowerShell Az blog that ‘Users are not required to migrate from AzureRM, as AzureRM will continue to be supported. However, it is important to note that all new Azure PowerShell features will appear only in ‘Az’.’ So my old stuff would continue to work. But they also state ‘Az and AzureRM cannot be executed in the same PowerShell session’, so I’d need to make customers aware that they cannot mix AzureRM and Az cmdlets within a single session.

This all sounded like a bunch of annoying conversations and explanations I’d be faced with. I began to feel frustrated and questioned why Microsoft saw the need to rename all of their cmdlets. I could feel a hate blog brewing…

However, as I read more I came across a diamond in the rough… AzureRM aliases. Ah, someone at Microsoft has considered my pain. I could feel the catharsis as I read the official migration guide https://github.com/Azure/azure-powershell/blob/master/documentation/migration-guides/Az.1.0.0-migration-guide.md and came across the following statement: ‘To make the transition to these new cmdlet names simpler, Az introduces two new cmdlets, Enable-AzureRmAlias and Disable-AzureRmAlias. Enable-AzureRmAlias creates aliases from the older cmdlet names in AzureRM to the newer Az cmdlet names. The cmdlet allows creating aliases in the current session, or across all sessions by changing your user or machine profile.’
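
In practice that looks like the following; the -Scope values are as I recall them from the migration guide, so check Get-Help Enable-AzureRmAlias if in doubt:

# Create AzureRM aliases for the current session only
Enable-AzureRmAlias -Scope Process

# Or persist the aliases for your user across sessions
Enable-AzureRmAlias -Scope CurrentUser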

What Now?

It’s time for a coffee, then back to more PowerShell… Happy days…

SD-WAN made easy

I’ll start by asking you two questions:

Are you paying too much for your Wide Area Network (WAN)?

And, is it the best method of connecting to the public Cloud?

At cloudstep.io we are constantly looking for ways to improve our customers’ connectivity to the public cloud. We consider cloud network connectivity a foundation service that must be implemented at the beginning of a successful cloud journey. Getting it right at the start is imperative if cloud service adoption is to truly reach its potential and not suffer from network issues like latency, bandwidth and round-trip time.

If the public cloud is going to become your new datacenter, then why not structure your network around it?

What if you could solve your cloud connectivity and WAN connectivity with a single solution? Azure WAN is a service that offers you a centralised, software-defined, managed network. Connect all your sites via VPN or ExpressRoute to Azure WAN and let Microsoft become the layer 3 network cloud that traditional telco providers are probably charging you through the nose for. Who better to become the service provider for your software defined network (SDN) than one of the largest software companies in the world: Microsoft.
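
As a sketch of how quickly this stands up with the Az module discussed earlier: the names, region and address space below are hypothetical, and New-AzVpnSite plus the related connection cmdlets would then attach each branch office.

# Resource group, the Virtual WAN itself, then a hub to terminate site connections
$rg  = New-AzResourceGroup -Name "network-rg" -Location "australiaeast"
$wan = New-AzVirtualWan -ResourceGroupName $rg.ResourceGroupName -Name "corp-wan" -Location "australiaeast"
$hub = New-AzVirtualHub -ResourceGroupName $rg.ResourceGroupName -Name "corp-hub" `
        -VirtualWan $wan -Location "australiaeast" -AddressPrefix "10.100.0.0/24"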

Commodity business grade internet services are becoming cheaper now thanks to things like the NBN, where in my opinion it is truly a race to the bottom on price, which is great for the consumer… finally! Procure NBN business grade connections for each of your office locations and then use Azure WAN to quickly deploy a secure network for site-to-site and site-to-Azure connectivity.

I believe a service like this is here to disrupt traditional network service providers and add great value to existing and new Microsoft Azure customers.

We are always looking to save money in a move to the cloud, and your network cost could potentially be your biggest reduction. Get in contact with us at cloudstep.io to see if we can help you reform your network.