Data Delivery and Data Pipelines

  • Determine as-is and target architectures.
  • Establish the volume and types of data in scope, including delivery timeline requirements.
  • Identify critical data services and map system dependencies.
  • Recommend platform technologies, including a roadmap and migration approach.
  • Provide robust and highly resilient data engineering services.
  • Enable machine learning and deep learning models with robust, reliable feature pipelines.
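A robust feature pipeline can be sketched as a chain of transform steps that each validate their own output, so bad features fail fast instead of reaching a model. This is an illustrative sketch only; all names and steps here are hypothetical.

```python
from dataclasses import dataclass
from typing import Callable

# Hypothetical sketch: a feature pipeline as a chain of validated steps.
@dataclass
class Step:
    name: str
    transform: Callable[[dict], dict]
    validate: Callable[[dict], bool]

def run_pipeline(record: dict, steps: list[Step]) -> dict:
    """Apply each step in order, failing fast if validation fails."""
    for step in steps:
        record = step.transform(record)
        if not step.validate(record):
            raise ValueError(f"step '{step.name}' produced invalid features")
    return record

# Example steps (illustrative): impute a missing age, rescale income.
steps = [
    Step("impute_age", lambda r: {**r, "age": r.get("age") or 0},
         lambda r: r["age"] >= 0),
    Step("scale_income", lambda r: {**r, "income": r["income"] / 1000},
         lambda r: "income" in r),
]

features = run_pipeline({"age": None, "income": 52000}, steps)
```

Failing fast at the step that broke, rather than at training or serving time, is what makes a pipeline feel reliable in day-to-day use.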

Tooling and Automation

  • Determine as-is and target tooling and automation.
  • Identify target release and deployment models, putting the ops into DataOps.
  • Recommend a roadmap and migration approach from your existing data estate to a target data platform.
  • Enable a high degree of ML and DL model promotion and integration, ensuring that model pipelines receive the same care as data pipelines.
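Model promotion can be gated the same way code releases are: a candidate model is released only if it beats the current production model on agreed offline metrics. A minimal sketch, with illustrative metrics and thresholds:

```python
# Hypothetical sketch: a promotion gate comparing a candidate model's
# offline metrics against the current production model before release.
def should_promote(candidate: dict, production: dict,
                   min_gain: float = 0.01) -> bool:
    """Promote only if the candidate beats production on every metric
    by at least `min_gain` (the threshold here is illustrative)."""
    return all(
        candidate[m] >= production[m] + min_gain
        for m in production
    )

prod = {"auc": 0.81, "recall": 0.62}
cand = {"auc": 0.84, "recall": 0.66}
print(should_promote(cand, prod))
```

In practice a gate like this would sit in the CI/CD path, so promotion becomes an automated, auditable step rather than a manual hand-off.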

Governance and Operations

  • Determine as-is and target operational environments, including security, tracing and logging requirements.
  • Determine current governance mechanisms and how these can strike the right balance between control and innovation.
  • Establish data retention requirements.
  • Provide a model governance layer to ensure models perform as tested and that drift is monitored and can be actioned.
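One common drift signal a governance layer can monitor is the Population Stability Index (PSI) between the training and live feature distributions. The sketch below is illustrative; the bin proportions and the 0.2 alert threshold are assumptions (0.2 is a widely quoted rule of thumb, not a universal rule):

```python
import math

# Hypothetical sketch: Population Stability Index (PSI) as one simple
# drift signal between training and live feature distributions.
def psi(expected: list[float], actual: list[float]) -> float:
    """PSI over pre-binned proportions; > 0.2 is a common alert threshold."""
    eps = 1e-6  # guard against empty bins
    return sum(
        (a - e) * math.log((a + eps) / (e + eps))
        for e, a in zip(expected, actual)
    )

train_bins = [0.25, 0.25, 0.25, 0.25]   # reference distribution at training time
live_bins = [0.40, 0.30, 0.20, 0.10]    # proportions observed in production
drift = psi(train_bins, live_bins)
if drift > 0.2:
    print(f"drift alert: PSI={drift:.3f}")
```

Wiring a check like this into scheduled monitoring turns "drift is monitored" into an actionable alert with a concrete number attached.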

Technical Skills and Training

  • Identify technical skills gaps.
  • Detail approach to resolving missing skills and technical knowledge.
  • Define job roles and key skills.
  • Build targeted training modules.
  • Transform Product Managers into Data Product Managers.

Platform and Data Security

  • Identify data security levels and restrictions.
  • Identify services handling sensitive and personal data.
  • Understand any geographical constraints around handling data.
  • Determine appropriate security model and environment.

Insights from our experts