- Scalable cloud-based solution
- Automated document generation
- Enabling product innovation
Legacy technology often becomes a blocker for companies looking to innovate and scale revenues from their digital products.
Here, we provide some insight into a client and project where legacy black-box technology had a high cost of ownership that prevented them from innovating and showcasing their excellence in the domain of environmental intelligence data. Makemedia were engaged to research the problems and deliver the technical architecture and software to address key aspects that were limiting the business.
The client in this case produces a range of environmental reports, such as home buyers' reports, based on complex environmental and GIS data. Internal teams were spending a great deal of time maintaining obsolete technology that suffered regular outages and generally poor performance.
As part of a long-term program of works, our research took time to understand their operational workflows and GIS report generation business, the data behind it and the processes for obtaining that data. The primary development objective was to rebuild the underlying report generation process, and then to extend and integrate the benefits of this system into other aspects of the business.
GIS data and mapping technologies have complex and unusual infrastructure requirements. Combined with the disparate legacy systems already in place, this meant our delivery program involved a collaboration between our developers, the client's developers and their Amazon Web Services (AWS) DevOps team.
The nature of the products meant that we needed to ensure high availability of services at peak times whilst keeping the system's total cost of ownership (TCO) low.
The problem was fascinating: how to automate a business workflow with many separate data inputs, a variety of data formats and consultant commentary thrown into the mix, leading to the production of an automated, dynamic and flexible PDF document. The systems needed to be resilient and scalable, and to allow for future product innovation.
To tackle the problem, Makemedia used the pilot phase to research and understand the existing workflows, data flows and data repositories by observing customer-facing consultants engage with their end users and produce the complex environmental reports.
This observation was then supported by an in-depth systems architecture investigation. The combined results of contextual enquiry and systems analysis produced a full and thorough picture of the existing workflow.
Working with the client's technical teams, we then identified risk areas in the existing systems, known areas of instability and the unsupported black-box solutions.
Proof of Concept
One of the key findings from the observational research was the amount of productive time consultants lost creating and converting documents for their end customers. Automating this process was identified as key, and as an area of technical uncertainty and complexity.
To tackle this complex problem early, Makemedia worked with the client to engineer a Proof of Concept for an automated JSON-to-PDF solution. By addressing this issue first, consultant productivity would improve immediately while also providing a technical foundation for product innovation over the long term.
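The core idea of such a JSON-to-PDF pipeline can be sketched in a few lines. This is a minimal illustration, not the project's actual code: the field names and template are hypothetical, and Python's stdlib `string.Template` stands in for whatever HTML templating engine the production system used. The final HTML-to-PDF rendering step is noted but omitted, since it would depend on a specific renderer.

```python
import json
from string import Template

# Hypothetical HTML report template; the real system used far richer templates.
REPORT_TEMPLATE = Template("""\
<html>
  <body>
    <h1>$title</h1>
    <p>Prepared for: $customer</p>
    <h2>Consultant commentary</h2>
    <p>$commentary</p>
  </body>
</html>
""")

def render_report(raw_json: str) -> str:
    """Merge structured report data (JSON) into an HTML template.

    In production, the resulting HTML would then be passed to an
    HTML-to-PDF renderer; that step is omitted from this sketch.
    """
    data = json.loads(raw_json)
    return REPORT_TEMPLATE.substitute(
        title=data["title"],
        customer=data["customer"],
        commentary=data["commentary"],
    )

sample = json.dumps({
    "title": "Environmental Search Report",
    "customer": "A. Homebuyer",
    "commentary": "No significant flood risk identified at this site.",
})
html = render_report(sample)
```

Keeping the report as data (JSON) plus a presentation layer (HTML) is what enables both personalised reports and new products from the same pipeline.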
The PoC was then demonstrated to internal stakeholders and acted as a gate to the next stage of the project. With the research and Proof of Concept in place, we were able to move into an iterative development cycle with confidence.
As we moved into full production, we iterated a version of our Smart Q to manage the data workflow. The Smart Q was essential for managing and orchestrating the workflow, i.e. determining which tasks can run once the tasks they depend on have completed; a simple one-in, one-out queue would not be able to asynchronously handle multiple data sources, tasks and processes with elegant alerting and monitoring.
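The dependency-aware behaviour described above can be illustrated with a toy sketch: a task only becomes runnable once everything it depends on has completed. This is not the Smart Q implementation itself, and the task names are invented for the example.

```python
from collections import defaultdict

class DependencyQueue:
    """Toy sketch of a dependency-aware task queue: a task is runnable
    only once all tasks it depends on have completed.
    (Illustrative only; not the actual Smart Q implementation.)"""

    def __init__(self):
        self.deps = {}                    # task -> set of unfinished dependencies
        self.dependents = defaultdict(set)  # task -> tasks waiting on it
        self.done = set()

    def add_task(self, name, depends_on=()):
        pending = {d for d in depends_on if d not in self.done}
        self.deps[name] = pending
        for d in pending:
            self.dependents[d].add(name)

    def runnable(self):
        """Tasks whose dependencies are all satisfied and that haven't run yet."""
        return [t for t, pending in self.deps.items()
                if not pending and t not in self.done]

    def complete(self, name):
        """Mark a task finished and release any tasks that were waiting on it."""
        self.done.add(name)
        for t in self.dependents.pop(name, ()):
            self.deps[t].discard(name)

q = DependencyQueue()
q.add_task("fetch_gis_layers")
q.add_task("fetch_flood_data")
q.add_task("merge_datasets", depends_on=["fetch_gis_layers", "fetch_flood_data"])
q.add_task("generate_pdf", depends_on=["merge_datasets"])
```

A plain FIFO queue cannot express this: the two fetch tasks can run concurrently, but `merge_datasets` must wait for both, which is exactly the orchestration the Smart Q provided.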
Process monitoring was surfaced through a flexible, customisable Kibana dashboard on top of Elasticsearch (used as a NoSQL datastore), giving the customer services division more visibility of back-end operational performance than was previously available.
Distributed environment and AWS
Having previously implemented the Smart Q successfully in both Azure and AWS, we knew that hosting the services in AWS wouldn't be a technical challenge. Nevertheless, in a dedicated staging environment we were able to test a multi-node distributed deployment across multiple platforms to ensure scalability and resilience.
The power of the Smart Q and Elasticsearch was such that we were able to use AWS instances, developer laptops and in-house servers to create a virtual mesh network that comfortably handled peak traffic volumes under load testing.
The ability to scale environments up and down according to traffic and system load provided the client and their central IT team with a cost-effective, highly available cloud hosting solution.
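The scale-up/scale-down decision at the heart of such a setup can be reduced to a simple heuristic: size the worker pool to the queue backlog, bounded above to keep hosting costs predictable. The function below is a sketch with invented thresholds, not the project's actual scaling policy.

```python
def desired_nodes(queue_depth, tasks_per_node, min_nodes=1, max_nodes=10):
    """Toy scaling heuristic: one node per `tasks_per_node` queued items,
    clamped between a floor (availability) and a ceiling (cost control).
    All thresholds here are illustrative."""
    needed = -(-queue_depth // tasks_per_node)  # ceiling division
    return max(min_nodes, min(needed, max_nodes))
```

A rule like this, evaluated against the live queue-depth metric, is what lets the environment grow for peak report demand and shrink back to a minimal, low-cost footprint overnight.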
In addition, our client benefitted through:
- Personalised and complex PDF reports: Complex PDF reports produced automatically using an HTML template structure. This enabled the client to personalise reports, update the design and create new products.
- Rapid report generation: Reports produced in a fraction of the previous time, with improved accuracy.
- Improved consultant productivity: Consultant time freed up significantly by automating document creation and conversion.
- Cost-effective, scalable hosting: Installed in AWS as a scalable cloud-based architecture.
- Real-time monitoring: Dashboard reporting for real-time queue and reporting metrics. Customer services now have improved visibility of the reporting system's uptime, bottlenecks and queue load.
- Future proof platform: Paving the way for product innovation!