BGM Express — Sales Call Playbook

Date: Friday, 18 April 2026, 2 pm
Contact: Benjamin Pencarski, Head of Digitalization (Leiter Digitalisierung), BGM Express (Hamburg / Bietigheim-Bissingen)
Gemma: Bijan Soltani, David Bader


1. Opening — Frame the Conversation

Goal: Position this as “we listened, here’s what we’d build” — not a generic pitch.

“Benjamin, we took another close look at your deck and our conversation. You don’t want a new platform — you want control, stability, and flexibility. We’re proposing an architecture that is code-first and vendor-agnostic. Not anti-Fabric, but: solving the problems Fabric is creating for you today.”

Key framing principles:
- Don’t position as “anti-Microsoft” — Benjamin said he’d stay on Fabric if it worked
- Frame as “code-first, vendor-agnostic” — the principles behind the modern data stack
- Emphasize that this solves each specific pain point from his presentation
- Show that we took his input seriously (reference his slides directly)


2. Recap — What We Heard

Briefly confirm our understanding. This builds trust and gives Benjamin a chance to correct before we present.

His pain points (from his presentation + call):
- Power Query M → Parquet handoff breaks unpredictably (twice in 4 weeks)
- Time intelligence limited: single active date relationship in Power BI
- Measures can’t be parameterized or reused — every variant is built manually
- No version control, no infrastructure-as-code
- Column-level security missing, governance not auditable
- Self-service for business users doesn’t exist
- AI/NLP: Fabric data agents expensive and underwhelming
- Budget must stay at €300–500/month total

His non-negotiables:
- Sales team keeps interactive dashboards (they love Power BI)
- Must be maintainable by one technical person (Benjamin)
- No vendor lock-in
- Auditable — new team members can onboard quickly

Parking lot (Phase 2+):
- Real-time layer (“Kolibri”) — live vehicle positions, live transport availability
- Pricing simulation — upload portfolio, calculate at current rates, simulate scenarios
- End-to-end pricing workflow (from price determination → agreement → contract → ERP entry)


3. Architecture Proposal

The Stack

┌─────────────────────────────────────────────────────────┐
│                      DATA SOURCES                       │
│              ERP (daily CSV dump or API)                │
└──────────────────────┬──────────────────────────────────┘
                       │
                       ▼
┌─────────────────────────────────────────────────────────┐
│                   INGESTION (dlt)                       │
│     Python-based, version-controlled, tested            │
│     Runs on small Linux VM (Hetzner/similar)            │
└──────────────────────┬──────────────────────────────────┘
                       │
                       ▼
┌─────────────────────────────────────────────────────────┐
│                WAREHOUSE (Snowflake)                    │
│     EU Frankfurt region, Standard edition               │
│     Auto-suspend, consumption-based pricing             │
│     XS warehouse for dbt | XS warehouse for BI queries  │
└──────────┬───────────────────────────┬──────────────────┘
           │                           │
           ▼                           ▼
┌─────────────────────┐   ┌────────────────────────────────┐
│  TRANSFORMATION     │   │         VISUALIZATION          │
│  (dbt Core)         │   │                                │
│                     │   │  Sales: Power BI (Import mode) │
│  - Version control  │   │  Rest:  Lightdash (self-hosted)│
│  - Tests            │   │  AI:    NLP layer (Phase 2+)   │
│  - Documentation    │   │                                │
│  - Metrics layer    │   │                                │
└─────────────────────┘   └────────────────────────────────┘

Orchestration: Airflow (on same VM as dlt + Lightdash)
All code in git — full audit trail, PR-based changes
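The guarantee this buys is that schema drift is caught at ingestion, not three layers later in a broken Parquet handoff. dlt does this declaratively; purely as a stdlib illustration of the idea (the file layout, column names, and `load_erp_dump` helper are hypothetical, not BGM’s actual schema):

```python
import csv
import io
from datetime import date

# Hypothetical daily ERP dump schema -- dlt infers and versions this for us;
# the sketch below just shows the "validate before load" behavior we rely on.
EXPECTED_COLUMNS = {"order_id", "order_date", "net_amount"}

def load_erp_dump(raw_csv: str) -> list[dict]:
    """Parse a daily CSV dump, coerce types, and fail loudly on schema drift."""
    reader = csv.DictReader(io.StringIO(raw_csv))
    if set(reader.fieldnames or []) != EXPECTED_COLUMNS:
        raise ValueError(f"schema drift: got columns {reader.fieldnames}")
    rows = []
    for row in reader:
        rows.append({
            "order_id": int(row["order_id"]),
            "order_date": date.fromisoformat(row["order_date"]),
            "net_amount": float(row["net_amount"]),
        })
    return rows

dump = "order_id,order_date,net_amount\n1,2026-04-17,199.50\n2,2026-04-17,80.00\n"
rows = load_erp_dump(dump)
print(len(rows), rows[0]["net_amount"])  # → 2 199.5
```

The point to make on the call: a renamed or dropped ERP column fails the pipeline run with a clear error, instead of silently producing a malformed Parquet file.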

Why This Stack

| Benjamin’s requirement | How this stack solves it |
|---|---|
| Time intelligence & flexible time comparisons | dbt date spine + unlimited date relationships in Snowflake (no Power BI single-relationship constraint) |
| Measures & flexibility | dbt metrics layer — define once, reuse everywhere, parameterizable |
| Security & governance | Snowflake RBAC + column-level security + git audit trail |
| Stability | No more Power Query M → Parquet breakage — transformations are SQL in dbt, tested before deployment |
| Maintainability | dbt docs auto-generate lineage + documentation; monitoring via Airflow |
| Auditability | Every change is a git commit with PR review — new team members read the repo |
| No lock-in | All open source (dbt, dlt, Lightdash, Airflow). Snowflake data exportable anytime |
| Self-service | Lightdash: business users explore data without SQL/DAX. Native dbt integration |
| AI / NLP | Built on top of the dbt semantic layer — text-to-SQL on structured, well-documented data |
| Cost-effectiveness | See cost model below — fits within €300–500/month |
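The date-spine point is worth making concrete if Benjamin asks: in dbt it is a generated calendar model that every fact table joins to, which is what removes the single-active-relationship constraint. A minimal stdlib sketch of the idea (column names are illustrative, not a prescribed dbt schema):

```python
from datetime import date, timedelta

def date_spine(start: date, end: date) -> list[dict]:
    """One row per calendar day, carrying the comparison keys that Power BI's
    single active date relationship makes awkward (prior-year, etc.)."""
    days = []
    d = start
    while d <= end:
        try:
            prior = d.replace(year=d.year - 1)
        except ValueError:  # Feb 29 has no prior-year twin
            prior = d.replace(year=d.year - 1, day=28)
        days.append({
            "date_day": d,
            "year": d.year,
            "month": d.month,
            "prior_year_day": prior,  # enables same-day prior-year comparisons
        })
        d += timedelta(days=1)
    return days

spine = date_spine(date(2026, 1, 1), date(2026, 1, 31))
print(len(spine), spine[0]["prior_year_day"])  # → 31 2025-01-01
```

In the warehouse, any number of fact-table date columns (order date, delivery date, invoice date) can join to this spine simultaneously — the constraint Benjamin hit in Power BI simply does not exist at the SQL layer.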

4. Cost Model

Monthly Estimate

Snowflake (EU Frankfurt, Standard, on-demand):
- dbt warehouse (XS, ~10 min/day, auto-suspend 60 s): ~€15/month
- BI query warehouse (XS, ~1.5 hrs active/day, auto-suspend 10 min): ~€180/month
- Storage (10–50 GB compressed): <€1/month
- Snowflake total: ~€195/month

Power BI Pro (for sales team):
- €14/user/month (price increase since April 2025!)
- Assuming 5 sales users: ~€70/month
- Note: check whether BGM already has M365 E5 (includes Power BI Pro)

Self-hosted infrastructure (Hetzner or similar):
- Small VM for Lightdash + Airflow + dlt: ~€20–30/month

Lightdash:
- Self-hosted (open source): €0
- (Cloud Pro at $3,000/month is out of budget scope)

dbt Core + dlt + Airflow: €0 (all open source)

Total: ~€285–295/month
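A quick sanity check of the arithmetic, useful if Benjamin probes the numbers. The per-credit price is an assumption to verify against current Snowflake list pricing for EU Frankfurt (here ~€2.70/credit, with an XS warehouse billing 1 credit per hour while running):

```python
# Hedged back-of-envelope check of the monthly estimate.
# Assumptions (verify before the call): ~EUR 2.70 per credit on-demand in
# EU Frankfurt, XS warehouse = 1 credit/hour, 30 days/month.
CREDIT_EUR = 2.70
DAYS = 30

dbt_hours_per_day = 11 / 60   # ~10 min of runs + idle minute before 60 s suspend
bi_hours_per_day = 2.2        # ~1.5 h active + idle tails before 10 min suspend
storage = 1.0                 # 10-50 GB compressed stays under EUR 1

snowflake = (dbt_hours_per_day + bi_hours_per_day) * DAYS * CREDIT_EUR + storage
power_bi = 5 * 14             # 5 Pro seats at EUR 14/user/month
vm = 25                       # midpoint of the EUR 20-30 Hetzner VM range

total = snowflake + power_bi + vm
print(f"Snowflake ~EUR {snowflake:.0f}, total ~EUR {total:.0f}")
```

Under these assumptions Snowflake lands near €195 and the total near €290 — consistent with the €285–295/month estimate above, with the credit price being the main sensitivity.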

If Power BI Pro is already included in existing Microsoft licenses: ~€215–225/month.

Compared to current: Benjamin spends €300–500/month on Fabric today. This stack is at the lower end or below current spend, while delivering significantly more capability.

Cost scaling


5. Power BI Integration — Specifics for Benjamin

Benjamin cares about this. Be prepared to go deep.

How Power BI connects to Snowflake:
- Native connector built into Power BI (no third-party drivers)
- Recommended: Import mode (not DirectQuery) for the sales team

Why Import mode:
- Sub-second visual load times (vs. 89 seconds on DirectQuery for large datasets!)
- Full DAX feature set — calculated columns, time intelligence, everything works
- No Snowflake credits burned during dashboard interactions (only at refresh time)
- 8 scheduled refreshes/day on Pro (one every 3 hours) — sufficient for daily ERP batch data
- Sales team won’t notice a difference from the current Fabric experience

What changes vs. current Fabric setup:
- Data freshness: currently near-real-time (Fabric Direct Lake) → now up to 3-hour lag (Import mode refresh cycle). For daily CSV batch data from the ERP this is irrelevant — the data was already stale by design.
- Semantic model: stays in Power BI, backed by Snowflake views instead of OneLake. Measures stay in DAX for Power BI, but are now ALSO defined in dbt (SQL) for Lightdash/AI/other consumers.
- The breakage problem goes away: dbt transformations are tested before deployment. No more surprise Parquet schema mismatches.
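If Benjamin asks what “tested before deployment” means concretely, it maps to dbt’s built-in schema tests, declared next to the model. A sketch of the shape (model and column names are hypothetical, not BGM’s actual schema):

```yaml
# models/staging/stg_erp_orders.yml -- hypothetical model name
version: 2
models:
  - name: stg_erp_orders
    description: "Daily ERP order dump, typed and deduplicated"
    columns:
      - name: order_id
        tests:
          - unique
          - not_null
      - name: net_amount
        tests:
          - not_null
```

`dbt test` runs these against the warehouse; a failing test blocks the deployment in CI, which is exactly the guardrail the current Fabric setup lacks.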

Gotchas to be transparent about:
- Power BI Pro is now €14/user/month (up from €10)
- Dataset size limit of 1 GB on Pro — likely fine for BGM’s data volume, but verify
- If the sales team uses “Analyze in Excel” heavily → Import mode supports this fine
- Schema changes in Snowflake require a metadata refresh in Power BI Desktop


6. Lightdash Positioning — Addressing Benjamin’s Skepticism

Benjamin tested Lightdash and was not impressed. We need to address this head-on.

What to say:

“Benjamin, you tested Lightdash in the demo and weren’t convinced. We understand that. A demo environment without a real data model behind it can’t create a wow effect. Lightdash lives off a clean dbt model — metrics, descriptions, relationships. Once that’s in place, Lightdash becomes a self-service tool that practically explains itself. We’d be happy to show you a production instance at an existing customer.”

Key arguments:
1. Demo ≠ production: Lightdash on a well-modeled dbt project is fundamentally different from a generic demo. The semantic layer IS the product.
2. Solves the right problem: Lightdash is for non-sales users (ops, management, finance) who just need tables, filters, and self-service exploration. It’s not competing with Power BI for the sales team.
3. Native dbt integration: Lightdash reads dbt metrics, descriptions, and relationships directly. No duplicate definitions.
4. NLP/AI on the roadmap: Lightdash is actively building AI agents and NLP querying — built on the dbt semantic layer.
5. Offer a POC: show Benjamin a real Lightdash production instance. If he’s still not convinced after seeing real data models, we can discuss alternatives (Metabase, Evidence).
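The “no duplicate definitions” point can be shown in one snippet: Lightdash reads metric definitions straight out of a dbt column’s `meta` block, so the metric lives in the same file as the model. A hypothetical sketch (model, column, and metric names are illustrative):

```yaml
# Hypothetical column in a dbt model's YAML -- Lightdash picks this up directly.
version: 2
models:
  - name: fct_shipments
    columns:
      - name: net_amount
        description: "Net shipment value in EUR"
        meta:
          metrics:
            total_net_amount:
              type: sum
              label: "Total net amount (EUR)"
```

One definition in git serves Lightdash exploration, documentation, and any future NLP layer — there is no second place where the metric could drift out of sync.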

If he pushes back hard on Lightdash:
- Fallback: Metabase (also open source, self-hosted, more mature)
- Fallback: Evidence (code-first BI, markdown-based dashboards)
- Nuclear option: Power BI for everyone (simpler, but more expensive per user and less self-service)


7. Phasing & Timeline

Present as phases so it feels manageable, not overwhelming.

Phase 1: Core Analytics Infrastructure (4–6 weeks)

Phase 2: Self-Service & Expanded Analytics (2–4 weeks)

Phase 3: AI/NLP & Pricing Simulation (4–6 weeks)

Phase 4: Real-Time (“Kolibri”) (scoping TBD)

Implementation target: Benjamin mentioned summer for implementation. Phase 1 could start in May/June, with Phase 2 following immediately.


8. Maintenance Model

Who runs this day-to-day?

“Benjamin, you’re technically skilled and can run this infrastructure yourself. The big difference from today: everything is code. If you want to change a transformation, you change a SQL file in dbt, write a test, and deploy. No more black box. And with AI support — Claude, Copilot — you can write dbt models, Python pipelines, and SQL much faster.”

Two options:
1. Self-maintained: Benjamin maintains the stack with AI assistance (Claude Code, GitHub Copilot). dbt + git make changes safe and reversible. Snowflake is managed (no infra to maintain). The VM needs occasional updates.
2. Gemma maintenance retainer: a small monthly retainer for Gemma to handle monitoring, updates, dbt model changes, and incident response. Gives Benjamin peace of mind without a full-time hire.

Recommendation: Start with Gemma building Phase 1 + 2, then transition to Benjamin for day-to-day with an optional retainer for support. This de-risks the migration and trains Benjamin on the new stack.


9. Objection Handling

“Why Snowflake? Isn’t that expensive?”

“For your data volume — daily CSV dumps from the ERP — we’re talking about roughly €200/month for Snowflake. That’s less than your current Fabric spend. Snowflake is consumption-based: when nothing runs, you pay nothing. And in return you get an enterprise-grade warehouse with column-level security, Time Travel, and zero-copy cloning for tests.”

“Why not just fix Fabric?”

“You could — but the fundamental problems remain: Power Query has no version control. The M → Parquet break is systemic. The time-intelligence limitation is a Power BI design constraint, not a Fabric one. You’d have the same problems in a tidier environment.”

“Who runs this when I’m not around?”

“Everything is code in a git repo. Any developer — or Gemma on a retainer — can clone the repo and keep going. dbt automatically generates documentation and data lineage. It’s auditable by design.”

“Lightdash didn’t win me over”

See Section 6 above — offer a production instance demo.

“What happens to my existing Power BI reports?”

“They stay. We migrate them to Snowflake as the data source, in Import mode — which means performance stays the same or gets better. Your sales team won’t notice a difference.”

“Can we start small first?”

“Yes, that’s exactly what we’re proposing. Phase 1 migrates only what exists today. No feature creep, no big bang. Stability and control first. Everything else we build on top of that.”

“What does the project cost in total?” (Gemma fees)

[Bijan: prepare your pricing for the engagement — Phase 1 scope + optional retainer. The infrastructure costs above are Benjamin’s direct costs. Gemma’s fees are separate.]


10. Close / Next Steps

Goal for end of call: Agreement on next step, ideally a scoped Phase 1 proposal.

Proposed close:

“Benjamin, as a next step we’d write you a concrete proposal for Phase 1 — scope, timeline, costs. You can then decide whether we start in the summer. Does that work for you?”

Ideal outcomes (in order of preference):
1. Benjamin agrees to a Phase 1 proposal → Bijan sends a formal proposal next week
2. Benjamin wants to see a Lightdash production demo first → schedule the demo
3. Benjamin wants to think about it → schedule a follow-up in 1–2 weeks
4. Benjamin wants to compare with staying on Fabric → prepare a side-by-side comparison


Quick Reference: Key Numbers


Prepared by Gemmbot for Bijan Soltani & David Bader — April 17, 2026