Technology Platform

One Model.
One Language.
One Picture.

Vertallax is purpose-engineered for commercial GCs — a bitemporal, event-driven intelligence platform built around a canonical operating model that compresses institutional knowledge into daily operational clarity.

120+
Schema Tables
11
Functional Domains
118
Registered Event Types
140k+
Lines of Application Code
Core Stack
Django 6 / Python
PostgreSQL / JSONB
Anthropic Claude / Sonnet
Bitemporal / Event-sourced
Polymorphic / Canonical
01 — Platform Architecture

From Raw Input
to Decisive Action

All data — structured, semi-structured, unstructured, external — flows through a single import spine into a canonical operating model that surfaces awareness, enables decisions, and captures execution. The loop closes through continuous learning.

Compounding Returns
02 — Event Spine, Polymorphic & Canonical

The Backbone
of Every Record

Three interlocking architectural primitives give Vertallax its structural integrity. Together they ensure that every piece of data — regardless of origin, type, or timing — connects to a single, consistent operating model.

Primitive 01

The Event Spine

118 registered event types flow through a single temporal spine. Every state change — whether an Opportunity stage transition, a Task completion, or a Project milestone — writes a typed event record with actor, timestamp, and full payload. Nothing is deleted; history is immutable.

The spine is the single source of truth for everything the firm has ever done, believed, or decided. Pipeline Replay reconstructs any past state by scrubbing the event log.

118 Event Types · Immutable Log
Primitive 02

Polymorphic Model

A single event structure captures any object type. An Opportunity state change, a ResourceCapacity update, and a PriceTransaction all resolve to the same event schema — with a discriminator field for the object type and a JSONB payload for type-specific data.

This means the spine never needs to be extended for new object types — only the payload schema changes. The query interface is uniform across the entire domain model.

JSONB Payload · Typed Discriminator
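
The uniform query interface can be illustrated with three object types sharing one event shape; only the payload differs. All field names and payload keys below are assumptions for the sketch:

```python
# One event shape for every object type; type-specific data lives in payload.
EVENTS = [
    {"object_type": "Opportunity", "event_type": "stage_changed",
     "payload": {"stage": "Proposal"}},
    {"object_type": "ResourceCapacity", "event_type": "capacity_updated",
     "payload": {"hours": 32}},
    {"object_type": "PriceTransaction", "event_type": "price_recorded",
     "payload": {"unit_price": 41.50}},
]

def query_events(events, object_type=None, **payload_filters):
    """One query interface for any object type: filter by the
    discriminator first, then by keys inside the typed payload."""
    return [e for e in events
            if (object_type is None or e["object_type"] == object_type)
            and all(e["payload"].get(k) == v
                    for k, v in payload_filters.items())]
```

In PostgreSQL the same idea maps onto a discriminator column plus JSONB containment operators; the Python version above just makes the uniformity visible.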
Primitive 03

Canonical Objects

A canonical object layer — Company, Person, Project, Opportunity, Resource — anchors all enrichment, activity, and intelligence to stable, normalized identifiers. Every import, integration, and AI response resolves to a canonical record.

Canonical objects survive source system changes. A Procore project and an imported bid tab both resolve to the same canonical Project ID, enabling cross-domain analytics without duplication.

UUID Primary Keys · Cross-domain Identity
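
A toy sketch of canonical resolution, using a deliberately naive matching rule (normalized-name equality) — the platform's actual entity-resolution logic is not described here:

```python
import uuid

class CanonicalProjects:
    """One stable UUID per real-world project: records from any source
    that resolve to the same normalized identity share the same ID."""
    def __init__(self) -> None:
        self._ids: dict[str, uuid.UUID] = {}

    def resolve(self, name: str) -> uuid.UUID:
        key = " ".join(name.lower().split())   # naive normalization (assumed)
        if key not in self._ids:
            self._ids[key] = uuid.uuid4()      # UUID primary key
        return self._ids[key]
```

A Procore sync and a bid-tab upload naming the same project both call `resolve` and land on one canonical ID, which is what makes cross-domain analytics possible without duplication.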
03 — Import Layer

Four Sources.
One Pipeline.

Every external data source — regardless of format, structure, or origin — passes through the same seven-step import pipeline into the canonical schema. Data doesn't just arrive; it transforms, validates, and becomes operational intelligence.

API
Enterprise Applications
Precon · Projects · Accounting · ERP

Live API connections with change-data capture (CDC) replication. Delta sync on each run — only changed records are pulled. Conflict-free merge on write with versioned audit trail. Supports Procore, Sage, Viewpoint, and custom REST endpoints. Schema mapping is firm-specific and persisted.

Semi-structured
Spreadsheets & Exports
Excel · Bid Tabs · CSV · Schedule exports

Upload-based ingestion with intelligent header detection and column mapping. PriceMappingRules handle bid tab variations across different owners and estimators. Tabular data is normalized to BossCostConcepts and benchmarked against ConceptBenchmarks on load.
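
Header mapping of this kind can be sketched as a simple alias table. The aliases below are hypothetical stand-ins, not the actual PriceMappingRules, which are firm-specific and persisted:

```python
# Hypothetical alias table mapping varied bid-tab headers to canonical names.
COLUMN_ALIASES = {
    "unit cost": "unit_price", "unit price": "unit_price",
    "qty": "quantity", "quantity": "quantity",
    "description": "concept", "item": "concept",
    "total": "extended_price", "extension": "extended_price",
}

def map_headers(raw_headers):
    """Resolve spreadsheet headers to canonical column names; columns
    with no known alias are returned for human mapping review."""
    mapped, unknown = {}, []
    for idx, header in enumerate(raw_headers):
        key = header.strip().lower()
        if key in COLUMN_ALIASES:
            mapped[COLUMN_ALIASES[key]] = idx
        else:
            unknown.append(header)
    return mapped, unknown
```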

Unstructured
Documents & Field Notes
RFIs · Field Notes · RFPs · Emails · Submittals

NLP-based entity extraction identifies canonical references (Company, Person, Project) within free-text documents. Extracted entities link to canonical records; the original document is stored with provenance. Enables search and retrieval across unstructured operational history.

External
Market & Intelligence Data
News · Public Postings · Economic Indices · LinkedIn

External feeds are normalized and linked to canonical Company and Person records. LinkedIn enrichment via Proxycurl surfaces relationship depth and firm affiliations. Economic indices feed into Price Intelligence benchmarks. All external data is timestamped for temporal attribution.

Import Pipeline — 7 Steps
Connect → Load
01
Connect
Authenticate source system. Establish sync protocol (REST, CDC, upload). Record connector state and last-sync timestamp for integration tracking.
02
Map / Normalize
Apply firm-specific field mappings. Translate source schema to canonical schema. Handle column aliases, renamed fields, and source-system versioning.
03
Parse
Extract structured entities from raw records. Resolve canonical IDs for Company, Person, Project. Detect and flag ambiguous references for human review.
04
Canonize
Resolve every imported record — regardless of source, format, or origin — to a stable, firm-wide identity so that a contact in a field note, a client in your CRM, and a name on a bid tab are recognized as the same person.
05
Enrich
Augment with external intelligence: LinkedIn profiles, market benchmarks, economic indices. Apply derived calculations (win probability, data completeness scores).
06
Validate
Type-check all fields. Verify referential integrity against canonical IDs. Flag data quality issues. Update Data Completeness score for affected records.
07
Load
Write to canonical schema. Fire event to Event Spine (import event type). Update version counter. Log to AskLog if AI-triggered. Reconcile against existing records using conflict-free merge.
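
The seven steps above can be sketched as an ordered chain of batch transforms. Every stub here stands in for firm-specific logic (auth, enrichment, event emission) and is illustrative only:

```python
def connect(batch):   return batch   # auth + sync protocol handled upstream
def normalize(batch): return [{k.lower(): v for k, v in r.items()} for r in batch]
def parse(batch):     return batch   # entity extraction in production

def canonize(batch):
    # Naive stand-in for canonical ID resolution.
    for r in batch:
        r["canonical_id"] = " ".join(r.get("name", "").lower().split())
    return batch

def enrich(batch):    return batch   # LinkedIn, benchmarks, indices in production
def validate(batch):  return [r for r in batch if r.get("canonical_id")]
def load(batch):      return batch   # write + fire import event in production

PIPELINE = [connect, normalize, parse, canonize, enrich, validate, load]

def run_pipeline(raw):
    """Every source, regardless of format, passes through the same steps."""
    batch = raw
    for step in PIPELINE:
        batch = step(batch)
    return batch
```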
04 — Data Model: Bitemporal, Digital Twin & Versioning

Hierarchical.
Adaptable. Alive.

The Vertallax data model is not a snapshot — it is a live digital twin of your firm's operational reality, structured hierarchically from raw inputs through derived intelligence, with full temporal precision and version history at every layer.

Architecture / Bitemporal

Two Timelines

Every record carries two timestamps: valid time (when the fact was true in the real world) and transaction time (when it was recorded in the system). This enables exact reconstruction of past states from any future vantage point — including retroactive corrections that preserve the original record.
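
A bitemporal "as-of" lookup can be sketched in a few lines. The `Fact` shape and the tie-breaking rule (later recordings supersede, so retroactive corrections win) are assumptions consistent with the description above:

```python
from dataclasses import dataclass
from datetime import date

@dataclass(frozen=True)
class Fact:
    key: str
    value: str
    valid_from: date    # when the fact became true in the real world
    recorded_at: date   # when the system learned it

def as_of(facts, key, valid, known):
    """What did the system believe on `known` about the value at `valid`?"""
    candidates = [f for f in facts
                  if f.key == key
                  and f.valid_from <= valid
                  and f.recorded_at <= known]
    if not candidates:
        return None
    # Most recent applicable valid time wins; among equal valid times,
    # the latest recording (a retroactive correction) supersedes.
    return max(candidates, key=lambda f: (f.valid_from, f.recorded_at)).value
```

Note that the correction never overwrites the original row — querying with an earlier `known` date still returns what was believed then.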

Architecture / Digital Twin

Live Firm Mirror

The platform maintains a continuously updated digital representation of your firm — every pursuit, project, resource, relationship, and price signal is modeled as a live object with current state and full history. The twin is queryable at any past moment via Pipeline Replay, and projectable forward via the Intelligence Engine.

Architecture / Versioning

No Hard Deletes

Every record carries a version counter. Updates increment the version and preserve the prior state in the event log. Corrections are recorded as new transactions against the original valid-time period. This means the system always knows what was believed at any point in time — not just what is believed now.
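
The version-counter behavior can be illustrated with a small in-memory store; in production the prior states live in the event log rather than a dict:

```python
class VersionedStore:
    """No hard deletes: every write appends a new version, and every
    prior state stays addressable by its version number."""
    def __init__(self):
        self._history = {}   # record_id -> [state_v1, state_v2, ...]

    def write(self, record_id, state):
        versions = self._history.setdefault(record_id, [])
        versions.append(dict(state))   # copy so prior states stay frozen
        return len(versions)           # the incremented version counter

    def current(self, record_id):
        return self._history[record_id][-1]

    def at_version(self, record_id, version):
        return self._history[record_id][version - 1]
```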

Date/Risk Spine · Raw Input · Planned / Estimated · Internal Milestones · Risk Monitoring · Derived / Calculated

The Date/Risk Spine is the central axis of temporal truth — branches represent data layers from raw inputs (top) through derived intelligence (roots). Density reflects computational depth.

05 — Agentic AI & Natural Language

The Overnight PhD
for General Contractors

The AI layer doesn't just answer questions — it reasons across tools, constructs multi-step queries, and creates structured operational data from natural language. Every response is grounded in your firm's canonical records, not approximations.

Just Ask Interface

Natural language access to the entire firm's operational data via the Just Ask interface at /admin/ask/. Ask about pipeline status, resource availability, project health, pricing benchmarks — get precise, sourced answers from live firm data.

Agentic Tool Loop

The AI doesn't search once — it reasons across a loop of Phase A read-only tools: opportunity lookup, stage query, project status, resource capacity, price intelligence, pipeline summary, and firm context. Multi-step inference, not single-pass retrieval.

NL Opportunity Creation

Describe a new pursuit in plain English and the system populates canonical fields via the get_opportunity_defaults() cascade — stage, owner, lead type, budget range, market sector — sourced from firm history and the canonical operating model.

Structured Output Engine

Every AI response is grounded in structured records — not hallucinated summaries. Answers reference canonical IDs, field values, timestamps. The AskLog model captures every query, tool call, and response for audit and analytics. Decisions you can trace back to a record.

Decision Support

The Intelligence Engine surfaces Go/No-Go signals, at-risk pursuits, expiring opportunities, resource overload warnings, and margin anomalies — presented as actionable intelligence in the Just Ask interface and the Daily Plumb feed.

Compounding Data Moat

Every decision, correction, and enrichment captured becomes training signal for the next query. The longer a firm uses Vertallax, the sharper its intelligence — because the system's context is your firm's unique operational history, not generic benchmarks.

06 — Tracking: Activity, Engagement, Integration & Data

Nothing
Goes Unrecorded

Vertallax tracks four distinct dimensions of firm activity — not just what happened operationally, but how the platform itself is being used, how data quality evolves over time, and the health of every connected integration.

Tracking 01

Activity Logging

Every user action that modifies operational data writes to the activity log. The DeliverableActivity and TaskTransaction models capture who did what, when, against which canonical record. Activity feeds are surfaced in the Opportunity Intelligence Panel and Pursuit Hub.

DeliverableActivity model — per-deliverable action log
TaskTransaction model — task state change with actor
AskLog model — AI query audit trail
Event spine — system-level immutable log
Tracking 02

Engagement Tracking

The Firm Health & Engagement score (0–100) measures how actively and completely a firm is using the platform. Sub-scores for time-to-assign, task on-time rate, deliverable on-time rate, and opportunity data staleness roll up to a single adoption health indicator.

Time-to-assign — resource allocation speed
Task on-time rate — execution discipline
Deliverable on-time rate — delivery health
Opportunity data staleness — pipeline hygiene
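
As a sketch, the rollup might be a weighted average of the four sub-scores. The equal weights below are assumed — the actual weighting is not published:

```python
# Assumed equal weights over the four published sub-scores.
WEIGHTS = {
    "time_to_assign": 0.25,
    "task_on_time": 0.25,
    "deliverable_on_time": 0.25,
    "data_freshness": 0.25,
}

def engagement_score(subscores):
    """Roll four 0-100 sub-scores into one 0-100 adoption indicator."""
    return round(sum(WEIGHTS[k] * subscores[k] for k in WEIGHTS))
```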
Tracking 03

Integration Tracking

Every connected data source has a tracked integration state: last-sync timestamp, sync health score, record count, error log, and drift detection. Integration tracking surfaces in the admin dashboard and triggers alerts when source systems change schema or fail to sync.

Connector state per integration
Delta sync audit per run
Schema drift detection
Error log with retry tracking
Tracking 04

Data Quality Tracking

Every canonical record carries a Data Completeness score — computed from required fields, enrichment depth, and freshness. Staleness flags trigger when records haven't been updated within the expected window. Completeness reporting surfaces gaps for remediation.

Field-level completeness scoring
Enrichment depth tracking
Staleness detection per record type
Firm-level data health dashboard
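
One plausible shape for such a score, under an assumed staleness penalty and without the enrichment-depth term the production scoring includes:

```python
from datetime import date, timedelta

def completeness_score(record, required_fields, last_updated, today,
                       max_age=timedelta(days=30)):
    """Share of required fields populated, discounted when stale.
    Illustrative formula only; penalty factor and window are assumed."""
    filled = sum(1 for f in required_fields
                 if record.get(f) not in (None, ""))
    score = 100 * filled / len(required_fields)
    if today - last_updated > max_age:
        score *= 0.8   # staleness penalty (assumed factor)
    return round(score)
```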
07 — Optimization Toolkit & Data Hound
Optimization Toolkit

Sharpen Every Decision

The Optimization Toolkit is a set of decision-support modules that surface improvement opportunities across the firm's core operational dimensions. It draws on the bitemporal event store and the canonical operating model to identify patterns, inefficiencies, and untapped leverage points.

Unlike static reporting, the Toolkit is prescriptive — it tells you not just what happened, but what to change and why.

Bid portfolio optimizer — win rate vs. margin trade-off analysis across pursuit history
Resource allocation optimizer — capacity vs. demand matching across active and projected work
Stage timing optimizer — identifies pursuits moving slower than firm baseline and flags intervention
Subcontractor selection optimizer — surfaces best-fit subs by trade, tier, relationship depth, and price history
Price calibration — benchmark bid assumptions against ConceptBenchmarks and market indices in real time
Go/No-Go scoring — weighted decision criteria aligned to firm strategy and historical win patterns
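
The Go/No-Go item above suggests a weighted-criteria score with a cut line. The criteria names, weights, and threshold in this sketch are hypothetical — firms configure their own:

```python
def go_no_go(criterion_scores, weights, threshold=70):
    """Weighted 0-100 decision score with a GO / NO-GO cut line."""
    total = (sum(weights[k] * criterion_scores[k] for k in weights)
             / sum(weights.values()))
    return round(total), ("GO" if total >= threshold else "NO-GO")
```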
Data Hound

Know Your Data Before You Need It

Data Hound runs at firm onboarding and on-demand thereafter. It performs a comprehensive assessment of all available data — connected integrations, uploaded files, manually entered records — and produces a Data Readiness Report that tells you exactly what you have, what's missing, and what to do next.

It's how Vertallax turns a blank firm into an operational intelligence platform from day one.

Scans all connected data sources and uploaded files on firm startup
Identifies data gaps across all 11 functional domains — field by field
Generates a Data Readiness Score (0–100) with domain-level breakdown
Prioritizes enrichment actions by operational impact — what to fix first
Recommends integration connections based on detected data patterns
Produces a startup checklist surfaced in the Just Ask interface on first login
08 — Security & Data Isolation

Your Data.
Full Stop.

Firm-level isolation is structural — enforced at the ORM layer, not the application layer. No query runs without passing through FirmScopedManager; cross-firm data access is impossible by design.

Security 01

ORM-Level Isolation

Every database query filters through FirmScopedManager at the Django ORM level. There is no application-layer bypass, no raw SQL risk — firm boundaries are enforced at the query construction layer, not at the presentation layer.

Enforced at Query Layer
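
The pattern is equivalent to a Django default manager whose `get_queryset` filters every query by firm. A framework-free sketch of the same guarantee, with all names assumed:

```python
class FirmScopedStore:
    """Every read filters by firm id before any other condition,
    mirroring a default ORM manager filter: there is no unscoped
    query method, so cross-firm rows are unreachable by construction."""
    def __init__(self, rows):
        self._rows = rows   # every row carries a firm_id

    def query(self, firm_id, **filters):
        return [row for row in self._rows
                if row["firm_id"] == firm_id
                and all(row.get(k) == v for k, v in filters.items())]
```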
Security 02

Immutable Audit Trail

118 registered event types capture every state change with actor identity, timestamp, and full payload. Nothing is deleted from the event spine — only superseded. Full reconstruction from any historical point is always possible.

Immutable Event Log
Security 03

AI Data Boundaries

The Just Ask AI operates with firm-scoped read-only tools. The AI cannot query across firm boundaries. All tool calls are logged to the AskLog audit model. API keys are stored as environment variables — never hardcoded.

Tool-Scoped Access

Decision-ready
starts here

See how Vertallax compresses decades of institutional intelligence into day-one operational clarity.

Request a Demo
Explore the Platform