Alscient added value through our team of data scientists, who first explored the tools available on the market and then worked with our client to categorise the data and transform it into a format the solution could digest. Because the data sets were extensive, we ran them through a pre-processing engine to extract only the salient data items of most value to the analysis.
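The pre-processing step described above can be sketched as a simple field-extraction pass. This is a minimal illustration only: the field names and record shape are assumptions, not the client's actual schema.

```python
# Hypothetical set of salient fields retained by the pre-processing engine;
# the real field list belonged to the client's data sets.
SALIENT_FIELDS = {"lot_id", "material_grade", "lot_size", "hammer_price"}

def preprocess(records):
    """Keep only the salient fields from each raw record, dropping any
    record that is missing one of them."""
    out = []
    for rec in records:
        if SALIENT_FIELDS <= rec.keys():
            out.append({k: rec[k] for k in SALIENT_FIELDS})
    return out
```

Running the extensive raw data sets through a pass like this reduces both the volume and the noise presented to the downstream analysis.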
The data was staged in Amazon S3 and then ingested by a machine learning service within AWS to build a predictive model. The trained model was accessed via API calls from the client's own system, returning a real-time recommendation based on the specific characteristics presented to the model.
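The real-time API call can be sketched as below. The endpoint URL and JSON payload shape are illustrative assumptions; the source does not specify which AWS ML service hosted the model or how its request body was structured.

```python
import json
from urllib import request

# Placeholder endpoint: the real URL belongs to the client's deployment.
ENDPOINT = "https://example.invalid/model/predict"

def build_request(characteristics):
    """Serialise the lot characteristics into a JSON request body
    (the payload shape here is an assumption)."""
    body = json.dumps({"instances": [characteristics]}).encode("utf-8")
    return request.Request(ENDPOINT, data=body,
                           headers={"Content-Type": "application/json"})

def recommend(characteristics):
    """POST the characteristics to the model endpoint and return the
    real-time recommendation from the response."""
    with request.urlopen(build_request(characteristics)) as resp:
        return json.load(resp)["recommendation"]
```

The client's own system would call `recommend(...)` synchronously wherever a recommendation is needed.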
A data visualisation tool was also provided to our client using SAS Visual Analytics, a best-of-breed visualisation solution, which also ran within AWS on an EC2 instance. Since go-live we have been responsible for the ongoing maintenance of the pipeline which extracts data, pre-processes it and pushes it up to Amazon Comprehend for ingestion by the model.
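One practical detail of pushing documents to Amazon Comprehend is batching: Comprehend's synchronous `BatchDetect*` APIs accept at most 25 documents per call. The sketch below shows that batching step only; the actual send (e.g. boto3's `comprehend.batch_detect_entities(TextList=batch, LanguageCode="en")`) is noted in a comment rather than executed.

```python
# Amazon Comprehend's synchronous BatchDetect* APIs accept up to
# 25 documents per request.
COMPREHEND_BATCH = 25

def chunk_for_comprehend(docs, size=COMPREHEND_BATCH):
    """Split pre-processed documents into Comprehend-sized batches.
    Each batch would then be sent via boto3, e.g.
    comprehend.batch_detect_entities(TextList=batch, LanguageCode="en")."""
    return [docs[i:i + size] for i in range(0, len(docs), size)]
```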
We have also been responsible for performing upgrades to the solution, and for the ongoing management and resolution of incidents, via our ISO 20000 ITIL-certified service management function.
We initially held a series of discovery workshops with the client to catalogue their key data sources and to understand their primary strategic drivers for the solution. Any highly sensitive data was redacted before being presented to the model.
We also conducted competitor analysis, using publicly available information published by the auctioneers, to establish who our client's main competitors were in terms of lot size and the quality grade of material bought. We then completed price analysis, again drawing on the auctioneers' public data, using processing technology from AWS and SAS to identify key pricing trends over a three-year period.
A design document was then created which set out the proposed solution architecture based on our understanding of the client's requirements. This design was informed by similar implementations for our other clients and drew on our extensive data analytics experience.
The customer approved the design, after which a series of development sprints was conducted to build the core functionality. Several weeks were spent in the data exploration phase fine-tuning the developed model. The solution was then system-tested by our team and handed over to the customer to perform their own User Acceptance Testing.
The customer was provided with access to dedicated development, test and production environments, with a strict change management process in place to promote changes between environments.
A workflow pipeline (using AWS CodePipeline) was also built to retrain the model, ensuring it was continually updated over time.
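The retraining workflow could be laid out as a CodePipeline definition with source, pre-process and retrain stages. The stage names, action providers and overall layout below are illustrative assumptions, not the client's actual pipeline configuration (which would be passed to boto3's `codepipeline.create_pipeline`).

```python
# Illustrative stage layout for a periodic model-retraining pipeline;
# names and providers are assumptions, not the client's configuration.
retrain_pipeline = {
    "name": "model-retrain",
    "stages": [
        {"name": "Source",      # new raw data lands in S3
         "actions": [{"provider": "S3"}]},
        {"name": "PreProcess",  # extract the salient fields
         "actions": [{"provider": "Lambda"}]},
        {"name": "Retrain",     # rebuild and redeploy the model
         "actions": [{"provider": "Lambda"}]},
    ],
}
```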
The system is monitored by the Alscient service desk function, with alerting in place to proactively identify issues with the service. Our client uses this same team for periodic improvements to the analytic services which have been developed.
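Proactive alerting of this kind is typically configured as CloudWatch alarms. The parameters below are an illustrative sketch only (metric, namespace and threshold are assumptions); such a dictionary would be passed to boto3's `cloudwatch.put_metric_alarm`.

```python
# Illustrative CloudWatch alarm for the model endpoint; the metric,
# namespace, period and threshold are assumptions for the sketch.
latency_alarm = {
    "AlarmName": "model-endpoint-latency",
    "Namespace": "AWS/ApiGateway",
    "MetricName": "Latency",
    "Statistic": "Average",
    "Period": 300,                 # seconds per evaluation window
    "EvaluationPeriods": 2,        # breach twice before alerting
    "Threshold": 1000.0,           # milliseconds
    "ComparisonOperator": "GreaterThanThreshold",
}
```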