Manufacturing

Data solutions for discrete and process manufacturing, from supply chain visibility to predictive maintenance, all powered by AI agents for continuous quality and optimization.

Supply Chain • ODS Migration • Predictive Maintenance • Quality Analytics
All solutions powered by Claude Agents for Data Management
Use Cases

Data Solutions for Manufacturing

Technical approaches with architecture patterns and code examples.

Supply Chain Analytics

End-to-End Visibility & Optimization

Build a unified view of your supply chain from suppliers to customers, enabling real-time visibility, demand sensing, and optimization.

The Challenge

Manufacturing supply chains generate massive amounts of data across ERP, MES, WMS, TMS, and IoT systems. Without integration, companies operate blind to disruptions, inventory imbalances, and optimization opportunities.

  • Siloed data across ERP, MES, WMS, and TMS systems
  • No real-time visibility into supplier performance
  • Manual demand forecasting with poor accuracy
  • Inability to simulate supply chain scenarios

Agent-Powered Solution

Deploy Claude Agents to discover, profile, and integrate supply chain data sources, then build a unified analytics layer for visibility and optimization.

Data Discovery
Catalogue all supply chain data sources and their relationships
DQ Profiler
Profile inventory, shipment, and supplier data for quality issues
Data Modeller
Design the supply chain data model with demand signals
DQ Recommender
Generate validation rules for supply chain data integrity
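The DQ Recommender's output ultimately has to run somewhere as executable checks. A minimal sketch of what agent-generated validation rules could look like once materialised in code; the rule names and row fields here are hypothetical illustrations, not actual agent output:

```python
# Hypothetical agent-generated rules, materialised as predicates.
# Field names (quantity_on_hand, shipment_id, ...) are illustrative.
RULES = {
    "non_negative_on_hand": lambda r: r["quantity_on_hand"] >= 0,
    "known_location": lambda r: r["location_id"] is not None,
    "in_transit_has_shipment": lambda r: r["quantity_in_transit"] == 0
                                         or r["shipment_id"] is not None,
}

def validate(row: dict) -> list[str]:
    """Return the names of every rule the row violates."""
    return [name for name, check in RULES.items() if not check(row)]

row = {"quantity_on_hand": -5, "location_id": "DC-01",
       "quantity_in_transit": 0, "shipment_id": None}
print(validate(row))  # ['non_negative_on_hand']
```

In practice the agent would emit these as dbt tests or warehouse constraints; the dictionary-of-predicates form just makes the rule/violation contract explicit.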

Architecture


┌─────────────────────────────────────────────────────────────────┐
│                 SUPPLY CHAIN DATA PLATFORM                       │
└─────────────────────────────────────────────────────────────────┘

Source Systems:
┌─────────┐ ┌─────────┐ ┌─────────┐ ┌─────────┐ ┌─────────┐
│   ERP   │ │   MES   │ │   WMS   │ │   TMS   │ │   IoT   │
│ (SAP)   │ │ (Shop)  │ │ (Whse)  │ │ (Logis.)│ │ Sensors │
└────┬────┘ └────┬────┘ └────┬────┘ └────┬────┘ └────┬────┘
     │           │           │           │           │
     └───────────┴───────────┴───────────┴───────────┘
                             │
              ┌──────────────▼──────────────┐
              │     Discovery Agent         │
              │  • Source inventory         │
              │  • Schema mapping           │
              │  • Relationship detection   │
              └──────────────┬──────────────┘
                             │
              ┌──────────────▼──────────────┐
              │     Bronze Layer            │
              │  • Raw ingestion            │
              │  • CDC streaming            │
              │  • IoT time-series          │
              └──────────────┬──────────────┘
                             │
              ┌──────────────▼──────────────┐
              │     Profiler Agent          │
              │  • Inventory accuracy       │
              │  • Lead time patterns       │
              │  • Demand signal quality    │
              └──────────────┬──────────────┘
                             │
              ┌──────────────▼──────────────┐
              │     Silver Layer            │
              │  ┌───────────────────────┐  │
              │  │ dim_supplier          │  │
              │  │ dim_product           │  │
              │  │ dim_location          │  │
              │  │ fact_inventory        │  │
              │  │ fact_shipments        │  │
              │  │ fact_demand_signals   │  │
              │  └───────────────────────┘  │
              └──────────────┬──────────────┘
                             │
              ┌──────────────▼──────────────┐
              │     Gold Layer              │
              │  • Supply chain metrics     │
              │  • Demand forecasts         │
              │  • Optimization models      │
              └─────────────────────────────┘

Implementation Example

-- Supply Chain Analytics Models (dbt)
-- models/gold/supply_chain_metrics.sql

{{
  config(
    materialized='incremental',
    unique_key='metric_date_key',
    partition_by={'field': 'metric_date', 'data_type': 'date'},
    tags=['supply_chain', 'analytics']
  )
}}

WITH inventory_positions AS (
  SELECT
    location_id,
    product_id,
    DATE(snapshot_timestamp) as metric_date,
    quantity_on_hand,
    quantity_in_transit,
    quantity_on_order,
    safety_stock_level
  FROM {{ ref('silver_inventory_snapshots') }}
  {% if is_incremental() %}
  WHERE DATE(snapshot_timestamp) > (SELECT MAX(metric_date) FROM {{ this }})
  {% endif %}
),

demand_signals AS (
  SELECT
    product_id,
    location_id,
    DATE(signal_date) as metric_date,
    SUM(forecast_quantity) as forecasted_demand,
    SUM(actual_orders) as actual_demand
  FROM {{ ref('silver_demand_signals') }}
  GROUP BY 1, 2, 3
),

supplier_performance AS (
  -- Aggregated to product/date grain so the join below cannot fan out
  -- when multiple suppliers deliver the same product on the same day
  SELECT
    product_id,
    DATE(delivery_date) as metric_date,
    AVG(DATEDIFF('day', order_date, delivery_date)) as avg_lead_time,
    COUNT(CASE WHEN on_time_flag THEN 1 END)::FLOAT / COUNT(*) as otif_rate,
    AVG(quality_score) as avg_quality_score
  FROM {{ ref('silver_purchase_orders') }}
  GROUP BY 1, 2
)

SELECT
  {{ dbt_utils.generate_surrogate_key(['ip.metric_date', 'ip.location_id', 'ip.product_id']) }} as metric_date_key,
  ip.metric_date,
  ip.location_id,
  ip.product_id,
  -- Inventory Metrics
  ip.quantity_on_hand,
  ip.quantity_in_transit,
  ip.quantity_on_order,
  ip.safety_stock_level,
  -- Days of Supply
  CASE
    WHEN COALESCE(ds.forecasted_demand, 0) > 0
    THEN ip.quantity_on_hand / (ds.forecasted_demand / 30.0)
    ELSE NULL
  END as days_of_supply,
  -- Stockout Risk
  CASE
    WHEN ip.quantity_on_hand < ip.safety_stock_level THEN 'HIGH'
    WHEN ip.quantity_on_hand < ip.safety_stock_level * 1.5 THEN 'MEDIUM'
    ELSE 'LOW'
  END as stockout_risk,
  -- Demand Metrics
  ds.forecasted_demand,
  ds.actual_demand,
  ABS(ds.forecasted_demand - ds.actual_demand) / NULLIF(ds.actual_demand, 0) as forecast_error_rate,
  -- Supplier Metrics
  sp.avg_lead_time,
  sp.otif_rate,
  sp.avg_quality_score,
  CURRENT_TIMESTAMP() as _loaded_at
FROM inventory_positions ip
LEFT JOIN demand_signals ds
  ON ip.product_id = ds.product_id
  AND ip.location_id = ds.location_id
  AND ip.metric_date = ds.metric_date
LEFT JOIN supplier_performance sp
  ON ip.product_id = sp.product_id
  AND ip.metric_date = sp.metric_date

Expected Outcomes

40% improvement in forecast accuracy
Real-time visibility across 500+ suppliers
25% reduction in safety stock levels
$15M annual savings from inventory optimization

ODS Modernization

Legacy to Cloud-Native Platform

Migrate operational data stores from legacy on-premises systems to modern cloud platforms while ensuring zero data loss and business continuity.

The Challenge

Manufacturing ODS systems often run on aging infrastructure with complex ETL logic, undocumented transformations, and critical dependencies. Migration risks include production disruptions, data quality degradation, and loss of business logic.

  • Complex ETL dependencies across production systems
  • Undocumented business logic in stored procedures
  • Real-time integration with shop floor systems
  • Parallel operation requirements during cutover

Agent-Powered Solution

Use Claude Agents to reverse-engineer legacy ODS, document all transformations, generate migration validation rules, and orchestrate the cutover.

Data Discovery
Map all ODS objects, dependencies, and downstream consumers
DQ Profiler
Create baseline profiles for migration validation
DQ Recommender
Generate reconciliation rules for each table
Data Modeller
Design target architecture on Databricks/Snowflake
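One practical use of the Discovery Agent's dependency graph is sequencing the migration itself: an object moves only after everything it depends on has moved. A sketch using Python's stdlib topological sort; the object names and edges are illustrative, not a real ODS inventory:

```python
# Migration ordering from a dependency graph: each key maps an object
# to the objects it depends on. Names are hypothetical stand-ins.
from graphlib import TopologicalSorter

deps = {
    "bronze_production_orders": set(),
    "bronze_work_orders": {"bronze_production_orders"},
    "vw_order_status": {"bronze_work_orders", "bronze_production_orders"},
}

# static_order() yields objects with all dependencies satisfied first,
# and raises CycleError if the legacy ETL has circular references.
order = list(TopologicalSorter(deps).static_order())
print(order)
```

The same graph, grouped into levels rather than a flat order, gives the migration "waves" that can be cut over in parallel.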

Architecture


┌─────────────────────────────────────────────────────────────────┐
│                   ODS MODERNIZATION JOURNEY                      │
└─────────────────────────────────────────────────────────────────┘

PHASE 1: Discovery & Assessment
┌───────────────────┐
│   Legacy ODS      │
│   (Oracle/SQL)    │
│                   │
│ ┌───────────────┐ │
│ │ 200+ Tables   │ │
│ │ 500+ Views    │ │
│ │ 1000+ Procs   │ │
│ └───────────────┘ │
└────────┬──────────┘
         │
         ▼
┌─────────────────────┐     ┌─────────────────────┐
│  Discovery Agent    │────▶│  Dependency Graph   │
│  • Object catalog   │     │  • ETL lineage      │
│  • Proc analysis    │     │  • Consumer map     │
│  • Usage patterns   │     │  • Criticality      │
└─────────────────────┘     └─────────────────────┘

PHASE 2: Profiling & Baseline
┌─────────────────────┐     ┌─────────────────────┐
│  Profiler Agent     │────▶│  Quality Baseline   │
│  • Row counts       │     │  • Column profiles  │
│  • Distributions    │     │  • Anomaly patterns │
│  • Key analysis     │     │  • Business rules   │
└─────────────────────┘     └─────────────────────┘

PHASE 3: Target Design
┌─────────────────────┐     ┌─────────────────────┐
│  Modeller Agent     │────▶│  Modern Platform    │
│  • Medallion design │     │  ┌───────────────┐  │
│  • dbt scaffolding  │     │  │ Bronze (raw)  │  │
│  • Delta Lake/Ice   │     │  │ Silver (clean)│  │
└─────────────────────┘     │  │ Gold (curated)│  │
                            │  └───────────────┘  │
                            └─────────────────────┘

PHASE 4: Migration & Validation
┌───────────────────┐       ┌─────────────────────┐
│   Legacy ODS      │──────▶│  Databricks/        │
└───────────────────┘       │  Snowflake          │
         │                  └──────────┬──────────┘
         │                             │
         └──────────┬──────────────────┘
                    ▼
         ┌─────────────────────┐
         │  Recommender Agent  │
          │  • Row reconcile    │
         │  • Aggregate checks │
         │  • Business rules   │
         └─────────────────────┘

Implementation Example

-- ODS Migration Reconciliation Framework
-- Agent-generated validation queries

-- 1. Row Count Reconciliation
CREATE OR REPLACE TABLE migration_validation.row_counts AS
WITH legacy_counts AS (
  SELECT
    'PRODUCTION_ORDERS' as table_name,
    COUNT(*) as legacy_count,
    CURRENT_TIMESTAMP() as checked_at
  FROM legacy_ods.production_orders
  UNION ALL
  SELECT 'WORK_ORDERS', COUNT(*), CURRENT_TIMESTAMP()
  FROM legacy_ods.work_orders
  UNION ALL
  SELECT 'INVENTORY_TRANSACTIONS', COUNT(*), CURRENT_TIMESTAMP()
  FROM legacy_ods.inventory_transactions
),
target_counts AS (
  SELECT
    'PRODUCTION_ORDERS' as table_name,
    COUNT(*) as target_count
  FROM modern_dwh.bronze_production_orders
  UNION ALL
  SELECT 'WORK_ORDERS', COUNT(*)
  FROM modern_dwh.bronze_work_orders
  UNION ALL
  SELECT 'INVENTORY_TRANSACTIONS', COUNT(*)
  FROM modern_dwh.bronze_inventory_transactions
)
SELECT
  l.table_name,
  l.legacy_count,
  t.target_count,
  l.legacy_count - t.target_count as difference,
  CASE
    WHEN l.legacy_count = t.target_count THEN 'PASSED'
    WHEN ABS(l.legacy_count - t.target_count) <= 10 THEN 'WARNING'
    ELSE 'FAILED'
  END as validation_status,
  l.checked_at
FROM legacy_counts l
JOIN target_counts t ON l.table_name = t.table_name;

-- 2. Aggregate Reconciliation (Numeric Columns)
CREATE OR REPLACE TABLE migration_validation.aggregate_checks AS
SELECT
  'PRODUCTION_ORDERS' as table_name,
  'quantity_produced' as column_name,
  -- Legacy aggregates
  (SELECT SUM(quantity_produced) FROM legacy_ods.production_orders) as legacy_sum,
  (SELECT AVG(quantity_produced) FROM legacy_ods.production_orders) as legacy_avg,
  -- Target aggregates
  (SELECT SUM(quantity_produced) FROM modern_dwh.bronze_production_orders) as target_sum,
  (SELECT AVG(quantity_produced) FROM modern_dwh.bronze_production_orders) as target_avg,
  -- Validation
  CASE
    WHEN ABS(
      (SELECT SUM(quantity_produced) FROM legacy_ods.production_orders) -
      (SELECT SUM(quantity_produced) FROM modern_dwh.bronze_production_orders)
    ) < 0.01 THEN 'PASSED'
    ELSE 'FAILED'
  END as sum_validation,
  CURRENT_TIMESTAMP() as checked_at;

-- 3. Business Rule Validation
CREATE OR REPLACE TABLE migration_validation.business_rules AS
SELECT
  'NEGATIVE_QUANTITIES' as rule_name,
  'PRODUCTION_ORDERS' as table_name,
  (SELECT COUNT(*) FROM legacy_ods.production_orders WHERE quantity_produced < 0) as legacy_violations,
  (SELECT COUNT(*) FROM modern_dwh.bronze_production_orders WHERE quantity_produced < 0) as target_violations,
  CASE
    WHEN (SELECT COUNT(*) FROM legacy_ods.production_orders WHERE quantity_produced < 0) =
         (SELECT COUNT(*) FROM modern_dwh.bronze_production_orders WHERE quantity_produced < 0)
    THEN 'PASSED'
    ELSE 'REVIEW'
  END as validation_status
UNION ALL
SELECT
  'ORPHAN_WORK_ORDERS',
  'WORK_ORDERS',
  (SELECT COUNT(*) FROM legacy_ods.work_orders wo
   WHERE NOT EXISTS (SELECT 1 FROM legacy_ods.production_orders po WHERE po.order_id = wo.production_order_id)),
  (SELECT COUNT(*) FROM modern_dwh.bronze_work_orders wo
   WHERE NOT EXISTS (SELECT 1 FROM modern_dwh.bronze_production_orders po WHERE po.order_id = wo.production_order_id)),
  'CHECK';
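The PASSED/WARNING/FAILED ladder above is easy to exercise locally before pointing it at production systems. A sketch against an in-memory SQLite stand-in; the table names and the 10-row tolerance mirror the SQL, everything else is illustrative:

```python
# Row-count reconciliation exercised against an in-memory SQLite
# stand-in for the legacy_ods / modern_dwh schemas.
import sqlite3

def validation_status(legacy: int, target: int, tolerance: int = 10) -> str:
    """Same PASSED / WARNING / FAILED ladder as the SQL CASE expression."""
    if legacy == target:
        return "PASSED"
    if abs(legacy - target) <= tolerance:
        return "WARNING"
    return "FAILED"

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE legacy_production_orders (order_id INTEGER)")
conn.execute("CREATE TABLE bronze_production_orders (order_id INTEGER)")
conn.executemany("INSERT INTO legacy_production_orders VALUES (?)",
                 [(i,) for i in range(100)])
conn.executemany("INSERT INTO bronze_production_orders VALUES (?)",
                 [(i,) for i in range(97)])  # simulate 3 dropped rows

legacy = conn.execute(
    "SELECT COUNT(*) FROM legacy_production_orders").fetchone()[0]
target = conn.execute(
    "SELECT COUNT(*) FROM bronze_production_orders").fetchone()[0]
print(legacy, target, validation_status(legacy, target))  # 100 97 WARNING
```

A fixed absolute tolerance is a simplification; for large tables a percentage threshold is usually more appropriate.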

Expected Outcomes

100% data reconciliation on 500+ tables
80% reduction in infrastructure costs
Zero production downtime during cutover
10x query performance improvement

Predictive Maintenance

IoT Analytics & Failure Prediction

Leverage sensor data and machine learning to predict equipment failures before they occur, reducing unplanned downtime and maintenance costs.

The Challenge

Manufacturing equipment generates terabytes of sensor data that goes largely unused. Without predictive analytics, companies rely on time-based maintenance schedules that either waste resources on unnecessary maintenance or miss critical failures.

  • High-volume IoT data without analytics infrastructure
  • No correlation between sensor patterns and failures
  • Reactive maintenance culture with high downtime costs
  • Siloed data between OT and IT systems

Agent-Powered Solution

Build an IoT data platform with Claude Agents for data quality, feature engineering, and model monitoring.

Data Discovery
Catalogue all sensor feeds and maintenance records
DQ Profiler
Profile sensor data for anomalies and quality issues
DQ Recommender
Generate validation rules for sensor data integrity

Architecture


┌─────────────────────────────────────────────────────────────────┐
│               PREDICTIVE MAINTENANCE PLATFORM                    │
└─────────────────────────────────────────────────────────────────┘

IoT Edge Layer:
┌─────────┐ ┌─────────┐ ┌─────────┐ ┌─────────┐
│Vibration│ │ Temp    │ │Pressure │ │ Power   │
│ Sensors │ │ Sensors │ │ Sensors │ │ Meters  │
└────┬────┘ └────┬────┘ └────┬────┘ └────┬────┘
     │           │           │           │
     └───────────┴───────────┴───────────┘
                       │
            ┌──────────▼──────────┐
            │   Edge Gateway      │
            │   • Data collection │
            │   • Edge filtering  │
            │   • Buffering       │
            └──────────┬──────────┘
                       │
            ┌──────────▼──────────┐
            │   Streaming Layer   │
            │   (Kafka/Kinesis)   │
            └──────────┬──────────┘
                       │
     ┌─────────────────┼─────────────────┐
     ▼                 ▼                 ▼
┌─────────┐     ┌───────────┐     ┌─────────────┐
│ Bronze  │     │ Profiler  │     │ Real-time   │
│ (Raw)   │     │ Agent     │     │ Anomaly     │
│         │     │ • Quality │     │ Detection   │
└────┬────┘     │ • Patterns│     └─────────────┘
     │          └───────────┘
     ▼
┌───────────────────────────────────────┐
│   Silver Layer                        │
│   • Cleaned sensor readings           │
│   • Equipment master data             │
│   • Maintenance history               │
└──────────────────┬────────────────────┘
                   │
┌──────────────────▼────────────────────┐
│   Feature Engineering                  │
│   • Rolling statistics (1h, 24h, 7d)  │
│   • Trend indicators                   │
│   • Anomaly scores                     │
└──────────────────┬────────────────────┘
                   │
┌──────────────────▼────────────────────┐
│   ML Models                            │
│   • Remaining Useful Life (RUL)       │
│   • Failure probability               │
│   • Maintenance scheduling            │
└───────────────────────────────────────┘

Implementation Example

-- Predictive Maintenance Feature Engineering
-- models/silver/sensor_features.sql

{{
  config(
    materialized='incremental',
    unique_key='feature_key',
    partition_by={'field': 'reading_hour', 'data_type': 'timestamp'},
    tags=['predictive_maintenance', 'iot']
  )
}}

WITH sensor_readings AS (
  SELECT
    equipment_id,
    sensor_type,
    reading_timestamp,
    reading_value,
    DATE_TRUNC('hour', reading_timestamp) as reading_hour
  FROM {{ ref('bronze_sensor_readings') }}
  {% if is_incremental() %}
  WHERE reading_timestamp > (SELECT MAX(reading_hour) FROM {{ this }})
  {% endif %}
),

-- Hourly aggregates first: GROUP BY aggregation and window functions
-- cannot share a grain in one SELECT, so aggregate to hours here and
-- apply the rolling windows over the hourly rows below
hourly_stats AS (
  SELECT
    equipment_id,
    sensor_type,
    reading_hour,
    AVG(reading_value) as avg_1h,
    STDDEV(reading_value) as stddev_1h,
    MIN(reading_value) as min_1h,
    MAX(reading_value) as max_1h,
    COUNT(*) as readings_1h
  FROM sensor_readings
  GROUP BY equipment_id, sensor_type, reading_hour
),

-- Rolling statistics for pattern detection (24 and 168 hourly rows)
rolling_stats AS (
  SELECT
    *,
    AVG(avg_1h) OVER (
      PARTITION BY equipment_id, sensor_type
      ORDER BY reading_hour
      ROWS BETWEEN 23 PRECEDING AND CURRENT ROW
    ) as avg_24h,
    STDDEV(avg_1h) OVER (
      PARTITION BY equipment_id, sensor_type
      ORDER BY reading_hour
      ROWS BETWEEN 23 PRECEDING AND CURRENT ROW
    ) as stddev_24h,
    AVG(avg_1h) OVER (
      PARTITION BY equipment_id, sensor_type
      ORDER BY reading_hour
      ROWS BETWEEN 167 PRECEDING AND CURRENT ROW
    ) as avg_7d
  FROM hourly_stats
),

-- Trend and anomaly indicators
trend_features AS (
  SELECT
    rs.*,
    -- Trend direction (comparing to 24h average)
    CASE
      WHEN rs.avg_1h > rs.avg_24h * 1.1 THEN 'INCREASING'
      WHEN rs.avg_1h < rs.avg_24h * 0.9 THEN 'DECREASING'
      ELSE 'STABLE'
    END as trend_direction,
    -- Volatility indicator
    CASE
      WHEN rs.stddev_1h > rs.stddev_24h * 2 THEN 'HIGH'
      WHEN rs.stddev_1h > rs.stddev_24h * 1.5 THEN 'ELEVATED'
      ELSE 'NORMAL'
    END as volatility_level,
    -- Anomaly score (z-score based)
    CASE
      WHEN rs.stddev_24h > 0
      THEN ABS(rs.avg_1h - rs.avg_24h) / rs.stddev_24h
      ELSE 0
    END as anomaly_score
  FROM rolling_stats rs
)

SELECT
  {{ dbt_utils.generate_surrogate_key(['equipment_id', 'sensor_type', 'reading_hour']) }} as feature_key,
  tf.*,
  -- Risk classification
  CASE
    WHEN tf.anomaly_score > 3 THEN 'CRITICAL'
    WHEN tf.anomaly_score > 2 THEN 'HIGH'
    WHEN tf.anomaly_score > 1 THEN 'MEDIUM'
    ELSE 'LOW'
  END as risk_level,
  CURRENT_TIMESTAMP() as _processed_at
FROM trend_features tf
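The anomaly score above is a plain z-score of the last hour against the trailing day. The same logic in a few lines of Python, with hypothetical sensor values:

```python
# Z-score anomaly logic from the model above; the zero-stddev guard
# mirrors the SQL CASE. Input readings are hypothetical.
from statistics import mean, stdev

def anomaly_score(window_1h: list[float], window_24h: list[float]) -> float:
    """|mean_1h - mean_24h| / stddev_24h, or 0 when stddev is zero."""
    avg_1h, avg_24h = mean(window_1h), mean(window_24h)
    sd_24h = stdev(window_24h)
    if sd_24h == 0:
        return 0.0
    return abs(avg_1h - avg_24h) / sd_24h

def risk_level(score: float) -> str:
    if score > 3:
        return "CRITICAL"
    if score > 2:
        return "HIGH"
    if score > 1:
        return "MEDIUM"
    return "LOW"

baseline = [9.0, 10.0, 11.0] * 8   # stable trailing 24h window
spike = [12.0, 12.0]               # hot last hour
score = anomaly_score(spike, baseline)
print(round(score, 2), risk_level(score))  # 2.4 HIGH
```

The 1/2/3-sigma cut points are the classic defaults; in production they would be tuned per sensor type from the Profiler Agent's baselines.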

Expected Outcomes

45% reduction in unplanned downtime
30% decrease in maintenance costs
Real-time equipment health monitoring
$8M annual savings from optimized maintenance

Quality Management Analytics

SPC, Defect Analysis & Traceability

Implement data-driven quality management with statistical process control, root cause analysis, and complete product traceability.

The Challenge

Quality issues are often detected too late in the manufacturing process, leading to costly recalls, rework, and customer complaints. Manual quality processes cannot keep pace with production volumes.

  • Delayed detection of quality deviations
  • Limited traceability for root cause analysis
  • Manual SPC with inconsistent application
  • Siloed quality data across plants and processes

Agent-Powered Solution

Build a unified quality data platform with automated SPC, AI-powered defect detection, and complete batch traceability.

DQ Profiler
Profile inspection and measurement data for quality patterns
Data Discovery
Map all quality-related data sources and genealogy
DQ Recommender
Generate SPC rules and control limits from historical data

Architecture


┌─────────────────────────────────────────────────────────────────┐
│                  QUALITY ANALYTICS PLATFORM                      │
└─────────────────────────────────────────────────────────────────┘

Data Sources:
┌─────────┐ ┌─────────┐ ┌─────────┐ ┌─────────┐ ┌─────────┐
│ MES     │ │ LIMS    │ │ Vision  │ │ CMM     │ │ SPC     │
│ Systems │ │ (Lab)   │ │ Systems │ │ (Metrol)│ │ Manual  │
└────┬────┘ └────┬────┘ └────┬────┘ └────┬────┘ └────┬────┘
     │           │           │           │           │
     └───────────┴───────────┴───────────┴───────────┘
                             │
              ┌──────────────▼──────────────┐
              │     Quality Data Lake       │
              │  ┌────────────────────────┐ │
              │  │ fact_inspections       │ │
              │  │ fact_measurements      │ │
              │  │ fact_defects           │ │
              │  │ dim_product            │ │
              │  │ dim_equipment          │ │
              │  │ dim_batch              │ │
              │  └────────────────────────┘ │
              └──────────────┬──────────────┘
                             │
         ┌───────────────────┼───────────────────┐
         ▼                   ▼                   ▼
    ┌───────────┐      ┌───────────┐      ┌───────────┐
    │ SPC       │      │ Defect    │      │ Trace-    │
    │ Analytics │      │ Analysis  │      │ ability   │
    │ • Control │      │ • Pareto  │      │ • Batch   │
    │   charts  │      │ • Root    │      │  genealogy│
    │ • Cpk/Ppk │      │   cause   │      │ • Fwd/bwd │
    └───────────┘      └───────────┘      └───────────┘

Implementation Example

-- Quality Analytics: SPC Control Chart Data
-- models/gold/spc_control_charts.sql

{{
  config(
    materialized='incremental',
    unique_key='measurement_key',
    tags=['quality', 'spc']
  )
}}

WITH measurements AS (
  SELECT
    m.measurement_id,
    m.product_id,
    m.characteristic_id,
    m.batch_id,
    m.measurement_value,
    m.measurement_timestamp,
    c.specification_lower,
    c.specification_upper,
    c.nominal_value
  FROM {{ ref('silver_measurements') }} m
  JOIN {{ ref('dim_characteristics') }} c
    ON m.characteristic_id = c.characteristic_id
  {% if is_incremental() %}
  WHERE m.measurement_timestamp > (SELECT MAX(measurement_timestamp) FROM {{ this }})
  {% endif %}
),

-- Calculate control limits (using moving range method)
control_limits AS (
  SELECT
    product_id,
    characteristic_id,
    AVG(measurement_value) as x_bar,
    AVG(measurement_value) + 3 * (AVG(moving_range) / 1.128) as ucl,
    AVG(measurement_value) - 3 * (AVG(moving_range) / 1.128) as lcl,
    AVG(moving_range) / 1.128 as sigma_estimate
  FROM (
    SELECT
      *,
      ABS(measurement_value - LAG(measurement_value) OVER (
        PARTITION BY product_id, characteristic_id
        ORDER BY measurement_timestamp
      )) as moving_range
    FROM measurements
  ) mr  -- derived tables require an alias in most engines
  GROUP BY product_id, characteristic_id
),

-- Calculate process capability
capability AS (
  SELECT
    cl.product_id,
    cl.characteristic_id,
    -- Cp: Potential capability
    (m.specification_upper - m.specification_lower) / (6 * cl.sigma_estimate) as cp,
    -- Cpk: Actual capability (accounts for centering)
    LEAST(
      (m.specification_upper - cl.x_bar) / (3 * cl.sigma_estimate),
      (cl.x_bar - m.specification_lower) / (3 * cl.sigma_estimate)
    ) as cpk
  FROM control_limits cl
  JOIN measurements m ON cl.product_id = m.product_id
    AND cl.characteristic_id = m.characteristic_id
  GROUP BY cl.product_id, cl.characteristic_id, m.specification_upper,
           m.specification_lower, cl.x_bar, cl.sigma_estimate
)

SELECT
  {{ dbt_utils.generate_surrogate_key(['m.measurement_id']) }} as measurement_key,
  m.*,
  cl.x_bar,
  cl.ucl,
  cl.lcl,
  cap.cp,
  cap.cpk,
  -- Out of control conditions (Western Electric rules)
  CASE
    WHEN m.measurement_value > cl.ucl OR m.measurement_value < cl.lcl THEN 'RULE_1_BEYOND_LIMITS'
    WHEN m.measurement_value > m.specification_upper THEN 'OUT_OF_SPEC_HIGH'
    WHEN m.measurement_value < m.specification_lower THEN 'OUT_OF_SPEC_LOW'
    ELSE 'IN_CONTROL'
  END as control_status,
  -- Capability status
  CASE
    WHEN cap.cpk >= 1.33 THEN 'CAPABLE'
    WHEN cap.cpk >= 1.0 THEN 'MARGINAL'
    ELSE 'NOT_CAPABLE'
  END as capability_status,
  CURRENT_TIMESTAMP() as _loaded_at
FROM measurements m
JOIN control_limits cl
  ON m.product_id = cl.product_id
  AND m.characteristic_id = cl.characteristic_id
JOIN capability cap
  ON m.product_id = cap.product_id
  AND m.characteristic_id = cap.characteristic_id

Expected Outcomes

60% faster defect detection
35% reduction in scrap and rework
Complete batch traceability for recalls
Automated SPC across 100+ quality characteristics

Success Stories

Manufacturing in Action

Anonymised examples of how we've helped manufacturers transform their data operations.

Global Automotive • Supply Chain

Supply Chain Visibility Platform

Built a unified supply chain analytics platform integrating 200+ suppliers, enabling real-time visibility and demand sensing.

35% reduction in stockouts
£8M annual inventory savings
Snowflake • Fivetran • dbt

Industrial Equipment • Predictive Maintenance

Asset Health Analytics Platform

Integrated IoT sensor data from 500+ assets with maintenance records, enabling predictive maintenance and reducing unplanned downtime.

40% reduction in unplanned downtime
25% extension of asset lifespan
Databricks • Delta Lake • ML

Client names withheld by policy. References available under NDA.

Ready to Transform Your Manufacturing Data?

Book a discovery call to discuss your specific operational and data challenges.