Vertallax is purpose-engineered for commercial GCs — a bitemporal, event-driven intelligence platform built around a canonical operating model that compresses institutional knowledge into daily operational clarity.
All data — structured, semi-structured, unstructured, external — flows through a single import spine into a canonical operating model that surfaces awareness, enables decisions, and captures execution. The loop closes through continuous learning.

Three interlocking architectural primitives give Vertallax its structural integrity. Together they ensure that every piece of data — regardless of origin, type, or timing — connects to a single, consistent operating model.
118 registered event types flow through a single temporal spine. Every state change — whether an Opportunity stage transition, a Task completion, or a Project milestone — writes a typed event record with actor, timestamp, and full payload. Nothing is deleted; history is immutable.
The spine is the single source of truth for everything the firm has ever done, believed, or decided. Pipeline Replay reconstructs any past state by scrubbing the event log.
A single event structure captures any object type. An Opportunity state change, a ResourceCapacity update, and a PriceTransaction all resolve to the same event schema — with a discriminator field for the object type and a JSONB payload for type-specific data.
This means the spine never needs to be extended for new object types — only the payload schema changes. The query interface is uniform across the entire domain model.
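The uniform event shape described above can be sketched in a few lines of Python. Field names, the frozen dataclass, and the sample payloads are illustrative assumptions, not Vertallax's actual schema:

```python
import uuid
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass(frozen=True)  # frozen: events are immutable once written
class Event:
    """One schema for every object type; only the payload varies."""
    object_type: str   # discriminator, e.g. "Opportunity"
    event_type: str    # one of the registered event types
    actor: str
    payload: dict      # type-specific data (JSONB in the database)
    event_id: str = field(default_factory=lambda: str(uuid.uuid4()))
    recorded_at: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat())

# Three very different state changes, one event structure:
events = [
    Event("Opportunity", "stage_transition", "jdoe",
          {"from_stage": "Qualify", "to_stage": "Pursue"}),
    Event("ResourceCapacity", "capacity_update", "sync-bot",
          {"resource_id": "R-17", "available_hours": 32}),
    Event("PriceTransaction", "price_recorded", "import",
          {"concept": "CIP Concrete", "unit_price": 415.0}),
]

# A uniform query interface: filter on the discriminator, not on table shape.
opportunity_events = [e for e in events if e.object_type == "Opportunity"]
print(opportunity_events[0].payload["to_stage"])  # → Pursue
```

Because only `payload` varies, supporting a new object type means defining a new payload shape — the event structure and every query written against it stay unchanged.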
A canonical object layer — Company, Person, Project, Opportunity, Resource — anchors all enrichment, activity, and intelligence to stable, normalized identifiers. Every import, integration, and AI response resolves to a canonical record.
Canonical objects survive source system changes. A Procore project and an imported bid tab both resolve to the same canonical Project ID, enabling cross-domain analytics without duplication.
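One way the "two sources, one canonical ID" behavior can work is deterministic UUID derivation from a normalized key. This is a minimal sketch under assumed match rules (a real resolver would use richer matching than a lowercased name):

```python
import uuid

# A stable namespace makes ID generation deterministic per firm.
# (The namespace string and match key below are illustrative assumptions.)
NAMESPACE = uuid.uuid5(uuid.NAMESPACE_DNS, "projects.example-firm")

def canonical_project_id(name: str) -> uuid.UUID:
    """Resolve a project name to a stable canonical UUID after normalization."""
    return uuid.uuid5(NAMESPACE, name.strip().lower())

# A Procore project and an imported bid tab, formatted differently:
procore = canonical_project_id("Riverside Medical Center ")
bid_tab = canonical_project_id("riverside medical center")

print(procore == bid_tab)  # → True: both resolve to one canonical Project ID
```

Because the ID is derived rather than assigned by any one source system, either source can disappear or change without orphaning the canonical record.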
Every external data source — regardless of format, structure, or origin — passes through the same six-step import pipeline into the canonical schema. Data doesn't just arrive; it is transformed, validated, and turned into operational intelligence.
Live API connections with change-data capture (CDC) replication. Delta sync on each run — only changed records are pulled. Conflict-free merge on write with versioned audit trail. Supports Procore, Sage, Viewpoint, and custom REST endpoints. Schema mapping is firm-specific and persisted.
Upload-based ingestion with intelligent header detection and column mapping. PriceMappingRules handle bid tab variations across different owners and estimators. Tabular data is normalized to BossCostConcepts and benchmarked against ConceptBenchmarks on load.
NLP-based entity extraction identifies canonical references (Company, Person, Project) within free-text documents. Extracted entities link to canonical records; the original document is stored with provenance. Enables search and retrieval across unstructured operational history.
External feeds are normalized and linked to canonical Company and Person records. LinkedIn enrichment via Proxycurl surfaces relationship depth and firm affiliations. Economic indices feed into Price Intelligence benchmarks. All external data is timestamped for temporal attribution.
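The delta-sync step described for live integrations above can be sketched as a timestamp cutoff: only records changed since the last run are pulled. The `fetch_page` callable and `updated_at` field are illustrative assumptions, not a real connector API:

```python
from datetime import datetime, timezone

def delta_sync(fetch_page, last_sync: datetime) -> list[dict]:
    """Pull only records changed since the previous run.

    fetch_page(since) is assumed to return records carrying an
    'updated_at' ISO-8601 timestamp; a real connector (Procore, Sage,
    Viewpoint, custom REST) would paginate and apply the firm-specific
    schema mapping here.
    """
    return [
        r for r in fetch_page(last_sync)
        if datetime.fromisoformat(r["updated_at"]) > last_sync
    ]

# Simulated source: two records, only one changed since the last sync.
records = [
    {"id": 1, "updated_at": "2024-01-01T00:00:00+00:00"},
    {"id": 2, "updated_at": "2024-06-01T00:00:00+00:00"},
]
last_sync = datetime(2024, 3, 1, tzinfo=timezone.utc)
print(delta_sync(lambda since: records, last_sync))  # only record 2 is pulled
```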
The Vertallax data model is not a snapshot — it is a live digital twin of your firm's operational reality, structured hierarchically from raw inputs through derived intelligence, with full temporal precision and version history at every layer.
Every record carries two timestamps: valid time (when the fact was true in the real world) and transaction time (when it was recorded in the system). This enables exact reconstruction of past states from any future vantage point — including retroactive corrections that preserve the original record.
The platform maintains a continuously updated digital representation of your firm — every pursuit, project, resource, relationship, and price signal is modeled as a live object with current state and full history. The twin is queryable at any past moment via Pipeline Replay, and projectable forward via the Intelligence Engine.
Every record carries a version counter. Updates increment the version and preserve the prior state in the event log. Corrections are recorded as new transactions against the original valid-time period. This means the system always knows what was believed at any point in time — not just what is believed now.
The Date/Risk Spine is the central axis of temporal truth — branches represent data layers from raw inputs (top) through derived intelligence (roots). Density reflects computational depth.
The AI layer doesn't just answer questions — it reasons across tools, constructs multi-step queries, and creates structured operational data from natural language. Every response is grounded in your firm's canonical records, not approximations.
Natural language access to the entire firm's operational data via the Just Ask interface at /admin/ask/. Ask about pipeline status, resource availability, project health, pricing benchmarks — get precise, sourced answers from live firm data.
The AI doesn't search once — it reasons across a loop of Phase A read-only tools: opportunity lookup, stage query, project status, resource capacity, price intelligence, pipeline summary, and firm context. Multi-step inference, not single-pass retrieval.
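A minimal sketch of that multi-step loop, for flavor: the tool names are taken from the text above, but the registry, the hardcoded plan, and the return values are illustrative assumptions, not the actual Just Ask implementation.

```python
# Read-only tool registry (stub results for illustration).
TOOLS = {
    "pipeline_summary": lambda: {"active_pursuits": 14},
    "resource_capacity": lambda: {"overloaded_teams": ["Estimating"]},
}

def answer(question: str) -> dict:
    """Chain several read-only tool calls instead of a single retrieval pass."""
    trace, context = [], {}
    # A real planner lets the model pick the next tool from the running
    # context; here the plan is fixed so the loop structure is visible.
    for tool_name in ("pipeline_summary", "resource_capacity"):
        result = TOOLS[tool_name]()   # read-only: no writes anywhere
        trace.append(tool_name)       # every call is logged (cf. AskLog)
        context.update(result)
    return {"question": question, "tool_calls": trace, "context": context}

print(answer("Which teams are overloaded relative to the active pipeline?"))
```

The point of the structure: the final answer is assembled from multiple grounded lookups, and the trace of tool calls is itself an auditable artifact.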
Describe a new pursuit in plain English and the system populates canonical fields via the get_opportunity_defaults() cascade — stage, owner, lead type, budget range, market sector — sourced from firm history and the canonical operating model.
Every AI response is grounded in structured records — not hallucinated summaries. Answers reference canonical IDs, field values, timestamps. The AskLog model captures every query, tool call, and response for audit and analytics. Decisions you can trace back to a record.
The Intelligence Engine surfaces Go/No-Go signals, at-risk pursuits, expiring opportunities, resource overload warnings, and margin anomalies — presented as actionable intelligence in the Just Ask interface and the Daily Plumb feed.
Every decision, correction, and enrichment captured becomes a training signal for the next query. The longer a firm uses Vertallax, the sharper its intelligence — because the system's context is your firm's unique operational history, not generic benchmarks.
Vertallax tracks four distinct dimensions of firm activity — not just what happened operationally, but how the platform itself is being used, how data quality evolves over time, and the health of every connected integration.
Every user action that modifies operational data writes to the activity log. The DeliverableActivity and TaskTransaction models capture who did what, when, against which canonical record. Activity feeds are surfaced in the Opportunity Intelligence Panel and Pursuit Hub.
The Firm Health & Engagement score (0–100) measures how actively and completely a firm is using the platform. Sub-scores across time-to-assign, task on-time rate, deliverable completion, and opportunity data staleness roll up to a single adoption health indicator.
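A roll-up like this is typically a weighted average of the sub-scores. The sketch below uses the four dimensions named above with equal weights — the weights and the staleness inversion are illustrative assumptions, not Vertallax's actual formula:

```python
def firm_health_score(sub_scores: dict[str, float],
                      weights: dict[str, float]) -> float:
    """Roll sub-scores (each 0-100) into one 0-100 adoption indicator."""
    total = sum(weights.values())
    return round(sum(sub_scores[k] * w for k, w in weights.items()) / total, 1)

sub_scores = {
    "time_to_assign": 82.0,
    "task_on_time_rate": 91.0,
    "deliverable_completion": 76.0,
    "data_freshness": 63.0,  # i.e. the inverse of data staleness
}
weights = {k: 1.0 for k in sub_scores}  # equal weighting, assumed

print(firm_health_score(sub_scores, weights))  # → 78.0
```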
Every connected data source has a tracked integration state: last-sync timestamp, sync health score, record count, error log, and drift detection. Integration tracking surfaces in the admin dashboard and triggers alerts when source systems change schema or fail to sync.
Every canonical record carries a Data Completeness score — computed from required fields, enrichment depth, and freshness. Staleness flags trigger when records haven't been updated within the expected window. Completeness reporting surfaces gaps for remediation.
The Optimization Toolkit is a set of decision-support modules that surface improvement opportunities across the firm's core operational dimensions. It draws on the bitemporal event store and the canonical operating model to identify patterns, inefficiencies, and untapped leverage points.
Unlike static reporting, the Toolkit is prescriptive — it tells you not just what happened, but what to change and why.
Data Hound runs at firm onboarding and on-demand thereafter. It performs a comprehensive assessment of all available data — connected integrations, uploaded files, manually entered records — and produces a Data Readiness Report that tells you exactly what you have, what's missing, and what to do next.
It's how Vertallax turns a blank firm into an operational intelligence platform from day one.
Firm-level isolation is structural — enforced at the ORM layer, not the application layer. No query runs without passing through FirmScopedManager. No cross-firm data is possible by design.
Every database query filters through FirmScopedManager at the Django ORM level. There is no application-layer bypass, no raw SQL risk — firm boundaries are enforced at the query construction layer, not at the presentation layer.
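The pattern is easiest to see framework-free. In Vertallax this lives in a Django manager (`FirmScopedManager`); the in-memory "table" and class shapes below are illustrative assumptions that show only the structural idea — no query object can exist without a firm boundary already applied:

```python
class FirmScopedManager:
    """Every query is constructed with the firm filter already applied."""

    def __init__(self, rows: list[dict]):
        self._rows = rows

    def for_firm(self, firm_id: str) -> "ScopedQuery":
        # The only entry point: cross-firm rows are excluded before any
        # application code gets a chance to filter (or forget to).
        return ScopedQuery([r for r in self._rows if r["firm_id"] == firm_id])

class ScopedQuery:
    def __init__(self, rows):
        self._rows = rows

    def filter(self, **kw) -> "ScopedQuery":
        return ScopedQuery([r for r in self._rows
                            if all(r.get(k) == v for k, v in kw.items())])

    def all(self) -> list[dict]:
        return list(self._rows)

table = [
    {"firm_id": "firm-a", "name": "Riverside Medical", "stage": "Pursue"},
    {"firm_id": "firm-b", "name": "Harbor Logistics", "stage": "Pursue"},
]
projects = FirmScopedManager(table)

# Even a broad query only ever sees one firm's rows:
print(projects.for_firm("firm-a").filter(stage="Pursue").all())
```

Because scoping happens at query construction, a bug in presentation or business logic can narrow results but never widen them across firms.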
118 registered event types capture every state change with actor identity, timestamp, and full payload. Nothing is deleted from the event spine — only superseded. Full reconstruction from any historical point is always possible.
The Just Ask AI operates with firm-scoped, read-only tools. The AI cannot query across firm boundaries. All tool calls are logged to the AskLog audit model. API keys are stored as environment variables, never hardcoded.
See how Vertallax compresses decades of institutional intelligence into day-one operational clarity.
Request a Demo
Explore the Platform