Public procurement tooling: from messy tenders to decision-ready data (redacted)

By Alonso Valdés · 2025-09-20
#procurement #etl #pipeline #data-app #redacted

Context

In public procurement, information is often scattered across multiple sources, with inconsistent schemas and fields that drift over time. The goal of this project was to turn tenders and awards into a coherent, analysis-ready view for monitoring, prioritization, and reporting.

Note: this is a private case. Under NDA, screenshots are redacted and names, amounts and entities are omitted.

What I built

  • Pipeline for ingestion and normalization (validation, deduplication, change tracking); a sketch follows this list.
  • Data model designed for analytics (contracts, suppliers, items, timelines, awards).
  • App for exploration, filters, and exportable reports.
  • Matching rules to unify suppliers and classifications, also covered in the sketch below.
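
To make the ingestion and matching bullets concrete, here is a minimal sketch in Python of the record-level rules: text normalization, a hash-based deduplication key, field-level change tracking, and a simple supplier-matching key. The field names (tender_id, buyer, title, amount) and the legal-suffix list are illustrative assumptions, not the real schema or rules.

```python
# Minimal sketch, not the production code: normalize a raw tender record,
# derive a stable deduplication key, track field-level changes between runs,
# and unify supplier-name variants. All field names are assumptions.
from dataclasses import dataclass
from hashlib import sha256
import unicodedata


def normalize_text(value: str) -> str:
    """Lowercase, strip accents, and collapse whitespace for stable comparisons."""
    text = unicodedata.normalize("NFKD", value)
    text = "".join(c for c in text if not unicodedata.combining(c))
    return " ".join(text.lower().split())


def supplier_key(name: str) -> str:
    """Matching rule: drop punctuation and common legal suffixes so 'ACME S.A.' and 'Acme SA' unify."""
    tokens = [t for t in normalize_text(name).replace(".", "").split()
              if t not in {"sa", "sl", "srl", "ltd", "inc"}]
    return " ".join(tokens)


@dataclass(frozen=True)
class Tender:
    tender_id: str        # source identifier (assumed to exist in every feed)
    buyer: str            # contracting authority
    title: str
    amount: float | None  # some sources omit the estimated amount

    @property
    def dedup_key(self) -> str:
        """Stable key so the same tender seen in two sources collapses to one row."""
        raw = f"{self.tender_id}|{supplier_key(self.buyer)}"
        return sha256(raw.encode("utf-8")).hexdigest()


def diff_fields(old: Tender, new: Tender) -> dict[str, tuple]:
    """Change tracking: map each drifted field to its (old, new) values."""
    changes = {}
    for field in ("buyer", "title", "amount"):
        before, after = getattr(old, field), getattr(new, field)
        if before != after:
            changes[field] = (before, after)
    return changes
```

A real pipeline would tune the deduplication key and the suffix list per source and language; the sketch only shows the shape of the rules.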

Architecture (high level)

Sources (CSV / HTML / API) → Ingestion (validation + logs) → Normalization (model + matching) → Warehouse (analytics tables) → App + Reports (filters + export)
Simplified flow: sources → ingestion → normalization → warehouse → app.
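
As a rough illustration of that flow, the sketch below wires a single CSV source through ingestion, validation with logging, and a light normalization step into an in-memory stand-in for the warehouse. The connector, the mandatory fields, and the logging setup are assumptions for the example, not the actual stack.

```python
# Minimal sketch of the simplified flow: sources → ingestion → normalization → warehouse.
# A plain list stands in for the warehouse; field names are assumptions.
import csv
import logging
from pathlib import Path
from typing import Iterator

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("pipeline")


def extract_csv(path: Path) -> Iterator[dict]:
    """Ingestion: read one CSV source; the real connectors also cover HTML pages and an API."""
    with path.open(newline="", encoding="utf-8") as fh:
        yield from csv.DictReader(fh)


def validate(row: dict) -> bool:
    """Validation + logs: reject rows missing the fields assumed mandatory here."""
    missing = [f for f in ("tender_id", "buyer") if not row.get(f)]
    if missing:
        log.warning("rejected row, missing fields: %s", missing)
    return not missing


def normalize(row: dict) -> dict:
    """Normalization: trim and lowercase the buyer so downstream matching is stable."""
    row["buyer"] = " ".join(row["buyer"].lower().split())
    return row


def run(source: Path) -> list[dict]:
    """Run one source end to end and return the loaded rows."""
    warehouse = [normalize(row) for row in extract_csv(source) if validate(row)]
    log.info("loaded %d rows from %s", len(warehouse), source)
    return warehouse
```

In a fuller version, the HTML and API sources would plug in as additional extractors yielding the same row shape, and the list would be replaced by the warehouse's analytics tables.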

Redacted UI

Example screen: filters, event table, and detail panels (redacted).

Outcome

  • Lower friction for supplier, timeline, and award analysis.
  • Consistent entities and classifications over time.
  • A reusable base for audits, alerts, and recurring reporting.