It’s all about the community
Vendor community days are always a mixed bag. They tend to be marketing-focused, showcasing new and unproven technologies, although they do provide an opportunity to network. The AWS Community Day I attended recently was refreshingly different.
The event brought together a range of AWS users, start-ups and big corporates to create a truly community-driven gathering, with almost two thousand attendees in London. I relished the opportunity to learn and share experiences with like-minded people, and I was amazed by the diversity of use cases and cloud applications on show.
This is why being part of the AWS community is so beneficial: it is an opportunity to support one another through these growth periods in our careers. The beauty of the community approach is that, whatever your level, you can learn from and share your knowledge with others.
The AWS Community Day ran two streams: Introductory and Expert. I attended a mix of both, not only to test my AWS knowledge but also to get a better flavour of the diverse challenges other organisations are facing today.
In the spirit of community, here are the key lessons I learnt:
Leverage key AWS features to digitise masses of data
A global airline transformed its real-time flight data using AWS. This was achieved with services such as Lambda, API Gateway, VPC, CloudWatch, CodeDeploy, CodePipeline, Kubernetes, DynamoDB, Aurora and S3. The flight data is transformed into dashboards that display information such as aircraft altitude and speed, used daily by engineers and flight control officers. Companies with architectures that deliver at this scale are few and far between. The next time I am dealing with large-scale databases, I aim to transform data within real-time environments.
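To make the pattern concrete, here is a minimal sketch of the kind of Lambda-style transform such a pipeline might run before records reach a dashboard store. The event shape, field names and unit conversions are my own assumptions for illustration, not the airline's actual schema.

```python
# Hypothetical sketch: a Lambda-style handler that reshapes raw flight
# telemetry into flat records a dashboard could consume. Field names and
# the event structure are assumptions, not a real airline schema.

def transform_reading(raw: dict) -> dict:
    """Normalise one telemetry reading: metres -> feet, m/s -> knots."""
    return {
        "flight": raw["flight_id"],
        "altitude_ft": round(raw["altitude_m"] * 3.28084),
        "speed_kt": round(raw["speed_ms"] * 1.94384, 1),
        "timestamp": raw["ts"],
    }

def lambda_handler(event, context):
    # Stream or API Gateway events typically carry a batch of records;
    # here we simply map the transform over them.
    records = [transform_reading(r) for r in event["readings"]]
    # In a real pipeline these records would then be written to
    # DynamoDB or S3 for the dashboards to query.
    return {"count": len(records), "records": records}
```

The key design point is keeping `transform_reading` a pure function, so it can be unit-tested without any AWS infrastructure.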
AWS brings robots to life
I had been reading about DeepLens, a deep-learning-enabled camera that uses AI models to build computer-vision applications such as image scanners. This was my opportunity to see its AI capabilities in action.
After a few images of different birds were uploaded, the device scanned images around the room using its camera. Once DeepLens had identified the bird, it announced the result through the device's speaker. This was achieved using AWS SageMaker, DeepLens and TensorFlow, combined with a mix of supervised, unsupervised and continuous learning, giving a blank piece of hardware both AI and machine-learning capabilities. DeepLens is not yet a go-to tool for AI or machine learning, and many of its capabilities remain unexplored, so this use case is rarely seen in organisations. When implemented, however, it demonstrates a model that can learn from varied inputs such as images and respond by converting text to speech.
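In the demo, the model was trained in SageMaker with TensorFlow, which is beyond a short snippet. As a toy illustration of the supervised step only, here is a nearest-centroid classifier over feature vectors; the bird labels and vectors are entirely made up, and a real DeepLens deployment would use a trained deep-learning model instead.

```python
import math

# Toy stand-in for the supervised classification step: a nearest-centroid
# classifier over feature vectors. The labels and vectors are invented
# for the sketch; DeepLens would run a SageMaker-trained model instead.

def centroid(vectors):
    """Average a list of equal-length feature vectors component-wise."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def train(labelled):
    # labelled: mapping of label -> list of feature vectors,
    # e.g. {"robin": [[0, 0], [0, 1]], "owl": [[5, 5], [5, 6]]}
    return {label: centroid(vs) for label, vs in labelled.items()}

def classify(model, vector):
    """Return the label whose centroid is closest to the vector."""
    return min(model, key=lambda label: math.dist(model[label], vector))
```

The point of the sketch is the workflow, not the model: train on labelled examples, then classify unseen inputs and hand the label to an output stage (in the demo, text-to-speech).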
Scan, contain and repeat is the new mantra for containerisation
A simple yet commonly overlooked concept when scaling up services is scanning containers before using them. Scanning a container before use helps remove threats proactively, before they occur, and catches errors that might otherwise surface before, during or after production, using AWS ECR, Lambda and CodePipeline. It is such a simple concept that it astonishes me that most organisations don't adopt it.
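The "scan before you ship" gate boils down to one decision: given the scan's severity counts, is the image safe to promote? Here is a minimal standalone sketch of that gate; the severity-count dictionary loosely mirrors the summary an ECR image scan reports, but the thresholds and shape are my assumptions, not the AWS API.

```python
# Sketch of a "scan, contain and repeat" gate that a pipeline stage
# might run after a container image scan. The severity-count dict
# loosely mimics an ECR scan summary; the policy is hypothetical.

BLOCKING_SEVERITIES = {"CRITICAL", "HIGH"}

def should_deploy(severity_counts: dict) -> bool:
    """Return True only if the scan found no blocking vulnerabilities."""
    return not any(
        severity_counts.get(sev, 0) > 0 for sev in BLOCKING_SEVERITIES
    )
```

In a real pipeline, a Lambda invoked from CodePipeline would fetch the scan findings from ECR and use a gate like this to fail the stage, forcing the fix-and-rescan loop the talk described.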
This was a refreshing event as it changed my perspective on vendor-driven events. It’s our time as young architects to not only bring a fresh perspective into the technology space, but also be innovative and creative with our ideas to help shape the future of technology.
Email Shahrukh Khan for more information.