Gravitas Module

Gravitas is a dedicated container tracking system for the Gravitas entity, operating independently from the TAI-based shipment pipeline. It provides CSV-based shipment import, multi-container MBL support, and dedicated analytics.


Overview

Unlike the legacy system that depends on TAI webhooks, Gravitas manages its own container lifecycle through:

  • CSV Import: Bulk shipment creation via CSV upload
  • MBL-Centric Tracking: Uses Master Bill of Lading as the primary identifier
  • Multi-Container Support: Automatically discovers all containers under an MBL
  • Separate Data Models: Dedicated tables for shipments, containers, and events
  • Dedicated Scrapers: Terminal scraping specifically for Gravitas containers
  • Independent Dashboard: Separate analytics and KPI tracking

Key Differences from TAI Pipeline

| Feature | TAI Pipeline | Gravitas Module |
| --- | --- | --- |
| Data Source | TAI Webhook | CSV Upload |
| Primary Key | Reference Number | MBL Number |
| Container Discovery | Single container per webhook | All containers per MBL via Cargoes Flow |
| TAI Dependency | Required | None (independent) |
| Data Models | `ShipmentTai`, `Container` | `GravitasShipment`, `GravitasContainer` |
| Scraper Tasks | `trigger_scrapers_for_container_task` | `trigger_gravitas_scrapers_task` |
| Dashboard | Unified with TAI | Separate Gravitas dashboard |
| User Access | Role-based (Admin/Manager/User) | Admin import, role-based viewing |

CSV Import Format

Expected Columns

| Column | Required | Description | Example |
| --- | --- | --- | --- |
| File No. | No | Internal file reference | FIL-2026-001 |
| MB/L No. | Yes | Master Bill of Lading | MAEU1234567890 |
| Office | No | Operating office | Gravitas |
| Consignee | No | Consignee name | ABC Corp |
| Oversea Agent | No | Overseas agent | XYZ Logistics |
| Container No. | No | Primary container (optional) | MSCU1234567 |
| Shipper | No | Shipper name | Global Export Ltd |

Import Behavior

  1. Upsert by MBL: If a GravitasShipment with the same MBL exists, it updates; otherwise creates new
  2. Multi-Container Discovery: Even if CSV lists one container, system queries Cargoes Flow to discover ALL containers under that MBL
  3. Automatic Tracking Sync: Each shipment triggers sync_tracking() to fetch Cargoes Flow data
  4. Scraper Trigger: For US destination containers, terminal scrapers are automatically triggered
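The upsert-by-MBL step can be sketched as follows. This is an illustrative standalone sketch, not the real service code: `GravitasShipmentStub`, `shipments_by_mbl`, and `import_row` are hypothetical names standing in for the actual ORM model and import service.

```python
# Hypothetical sketch of the upsert-by-MBL import step; in the real system
# this would be a database upsert on GravitasShipment, not an in-memory dict.
from dataclasses import dataclass


@dataclass
class GravitasShipmentStub:
    mbl_number: str
    file_no: str = ""
    consignee: str = ""


shipments_by_mbl: dict[str, GravitasShipmentStub] = {}


def import_row(row: dict) -> GravitasShipmentStub:
    """Create or update a shipment keyed by its MBL number."""
    mbl = row["MB/L No."]
    shipment = shipments_by_mbl.get(mbl)
    if shipment is None:
        # No shipment with this MBL yet: create a new one.
        shipment = GravitasShipmentStub(mbl_number=mbl)
        shipments_by_mbl[mbl] = shipment
    # Existing or new, refresh the editable fields from the CSV row.
    shipment.file_no = row.get("File No.", shipment.file_no)
    shipment.consignee = row.get("Consignee", shipment.consignee)
    return shipment
```

Importing the same MBL twice therefore yields one shipment whose fields reflect the latest row.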

Sample CSV

```csv
File No.,MB/L No.,Office,Consignee,Oversea Agent,Container No.,Shipper
FIL-001,MAEU1234567890,Gravitas,ABC Corp,XYZ Logistics,MSCU1234567,Global Export Ltd
FIL-002,MAEU0987654321,Gravitas,DEF Inc,ABC Shipping,MSCU7654321,Asia Manufacturing
```

Data Models

GravitasShipment

The parent entity representing a shipment from CSV import.

```python
class GravitasShipment:
    id: UUID                    # Primary key
    file_no: str                # File reference
    mbl_number: str             # Master Bill of Lading (unique key)
    office: str                 # Operating office (default: "Gravitas")
    consignee: str              # Consignee name
    oversea_agent: str          # Overseas agent
    container_number: str       # Primary container (optional)
    shipper: str                # Shipper name
    created_at: datetime        # Creation timestamp
    updated_at: datetime        # Last update timestamp

    # Relationships
    containers: List[GravitasContainer]  # One-to-many
```

GravitasContainer

Individual container tracking with terminal data and fee calculations.

```python
class GravitasContainer:
    id: UUID                    # Primary key
    gravitas_shipment_id: UUID  # FK to GravitasShipment
    container_number: str       # ISO container number
    mbl_number: str             # Master Bill of Lading

    # Status & Location
    status: str                 # Container status
    pod_terminal: str           # Terminal name
    terminal_status: str        # available/not-available

    # Key Dates
    last_free_day: datetime     # Last Free Day (from scraper)
    last_return_day: datetime   # Last Return Day (from scraper)
    rail_last_free_day: datetime    # Rail LFD
    rail_last_return_day: datetime  # Rail LRD
    eta: datetime               # Estimated arrival

    # Fees
    demurrage_fee: Decimal      # Calculated demurrage
    detention_fee: Decimal      # Calculated detention
    daily_fee_rate: Decimal     # Daily rate (default: $150)

    # Tags & Status
    shipment_tags: List[str]    # Auto-generated: ["LFD", "LRD", "POD awaiting", ...]
    is_archived: bool           # Archive flag

    # Relationships
    shipment_events: List[GravitasContainerEvent]
    gravitas_shipment: GravitasShipment
```
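To make the fee fields concrete, here is a minimal sketch of how demurrage could accrue from the Last Free Day at the documented $150/day default. The helper name and the exact billing rules (e.g. whether the LFD itself is chargeable) are assumptions for illustration only.

```python
# Illustrative demurrage accrual: charge the daily rate for each day past the
# Last Free Day. The real billing rules may differ (weekends, holds, etc.).
from datetime import date
from decimal import Decimal


def demurrage_fee(last_free_day: date, as_of: date,
                  daily_rate: Decimal = Decimal("150")) -> Decimal:
    """Return the accrued demurrage between the LFD and `as_of`."""
    days_over = (as_of - last_free_day).days
    return daily_rate * days_over if days_over > 0 else Decimal("0")
```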

Computed Properties

| Property | Description |
| --- | --- |
| `is_pod_awaiting` | Discharged at destination, awaiting pickup |
| `is_pod_full_out` | Container picked up full from port |
| `is_pod_awaiting_full_out_rail` | Discharged from rail, awaiting pickup |
| `is_completed` | Empty gate-in or status = completed/outdated |
| `is_lfd_needed` | Awaiting at port but missing LFD |
| `is_lrd_needed` | Gated out full but missing LRD |
| `needs_manual_alert_input` | Missing fees after discharge/arrival |
| `is_rail_shipment` | Has rail dates or rail events |
| `is_in_transit` | Active, hasn't arrived at destination |

GravitasContainerEvent

Timeline events from Cargoes Flow API.

```python
class GravitasContainerEvent:
    id: UUID                    # Primary key
    container_id: UUID          # FK to GravitasContainer
    code: str                   # Event code (e.g., "dischargeFromVessel")
    name: str                   # Event name
    actual_time: str            # Actual occurrence time
    estimate_time: str          # Estimated time
    location: str               # Location name
    location_code: str          # Location code
    transport_mode: str         # vessel/rail/truck
    location_role: str          # originPort/destinationPort
    location_terminal_name: str # Terminal name
    carrier_event_name: str     # Carrier description
```

API Endpoints

Import & Management

| Endpoint | Method | Description | Auth |
| --- | --- | --- | --- |
| `/api/v1/gravitas/import` | POST | CSV upload for shipment import | Admin only |
| `/api/v1/gravitas/shipments` | GET | List all Gravitas shipments | JWT |
| `/api/v1/gravitas/shipments/{id}` | GET | Get specific shipment | JWT |
| `/api/v1/gravitas/shipments/{id}/sync` | POST | Force tracking sync | JWT |
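A hedged sketch of what a client call to the import endpoint could look like, using only the Python standard library. The host name, token handling, and multipart boundary are placeholders, and a real client would more likely use a library such as `requests`; the request is built but deliberately not sent here.

```python
# Build (but do not send) a multipart POST for the CSV import endpoint.
# BASE_URL and the token are illustrative placeholders.
import urllib.request

BASE_URL = "https://freightflow.example.com"  # placeholder host


def build_import_request(csv_bytes: bytes, token: str) -> urllib.request.Request:
    """Assemble a multipart/form-data POST carrying the shipment CSV."""
    boundary = "gravitas-csv-boundary"
    body = (
        f"--{boundary}\r\n"
        'Content-Disposition: form-data; name="file"; filename="shipments.csv"\r\n'
        "Content-Type: text/csv\r\n\r\n"
    ).encode() + csv_bytes + f"\r\n--{boundary}--\r\n".encode()
    return urllib.request.Request(
        f"{BASE_URL}/api/v1/gravitas/import",
        data=body,
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": f"multipart/form-data; boundary={boundary}",
        },
        method="POST",
    )
```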

Container Operations

| Endpoint | Method | Description | Auth |
| --- | --- | --- | --- |
| `/api/v1/gravitas/containers` | GET | List containers with pagination | JWT |
| `/api/v1/gravitas/containers/mbl-grouped` | GET | Grouped by MBL with KPI filters | JWT |
| `/api/v1/gravitas/containers/{id}` | GET | Get specific container | JWT |
| `/api/v1/gravitas/containers/{id}` | PATCH | Update container data | JWT |
| `/api/v1/gravitas/containers/{id}/scrape` | POST | Trigger manual scraper | JWT |

Analytics

| Endpoint | Method | Description | Auth |
| --- | --- | --- | --- |
| `/api/v1/gravitas/analytics/dashboard` | GET | Dashboard KPIs and summaries | JWT |

KPI Filter Parameters

The /gravitas/containers/mbl-grouped endpoint supports filtering:

```http
GET /api/v1/gravitas/containers/mbl-grouped?kpiFilter=rail-shipments
```

| Filter Value | Condition |
| --- | --- |
| `rail-shipments` | `is_rail_shipment` = true |
| `manual-input-lfd` | `is_lfd_needed` = true |
| `manual-input-lrd` | `is_lrd_needed` = true |
| `demurrage-alert` | Has LFD, not gated out, not completed |
| `detention-alert` | Has LRD, not completed |
| `lfd-alert` | `needs_manual_alert_input` = true |
| `pod-full-out` | `is_pod_full_out` = true, not completed |
| `pod-awaiting` | `is_pod_awaiting` = true, not completed |
| `pod-need-attention` | `is_pod_awaiting` OR `is_pod_full_out`, not completed |
| `in-transit` | `is_in_transit` = true |
| `arriving-today` | ETA exists and equals today |
| `delayed` | ETA exists and before today |
| `untrackable` | Status in ["untrackable", "UNTRACKABLE"] |
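Server-side, filter values like these can be mapped to predicates over container flags. This is an illustrative sketch only: the dict-based container shape and the `KPI_FILTERS` / `apply_kpi_filter` names are assumptions, not the real implementation.

```python
# Sketch of kpiFilter dispatch: each filter value maps to a predicate.
from datetime import date

KPI_FILTERS = {
    "rail-shipments": lambda c: c["is_rail_shipment"],
    "in-transit": lambda c: c["is_in_transit"],
    "arriving-today": lambda c: c["eta"] is not None and c["eta"] == date.today(),
    "delayed": lambda c: c["eta"] is not None and c["eta"] < date.today(),
}


def apply_kpi_filter(containers, kpi_filter):
    """Return the containers matching the named KPI filter."""
    predicate = KPI_FILTERS.get(kpi_filter)
    if predicate is None:
        # Unknown or absent filter: return everything unfiltered.
        return list(containers)
    return [c for c in containers if predicate(c)]
```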

Celery Tasks

Gravitas-Specific Tasks

| Task Name | Schedule | Description |
| --- | --- | --- |
| `poll_all_latest_updates_for_gravitas` | Every 1h 20min | Sync all active Gravitas containers with Cargoes Flow |
| `trigger_gravitas_scrapers_task` | On-demand | Trigger terminal scrapers for a specific Gravitas container |
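The polling cadence could be wired into Celery beat roughly as follows; the task module path (`app.tasks.…`) and the schedule entry name are assumptions based only on the cadence documented above.

```python
# Hypothetical Celery beat entry for the Gravitas polling task.
from datetime import timedelta

beat_schedule = {
    "poll-gravitas-updates": {
        "task": "app.tasks.poll_all_latest_updates_for_gravitas",  # assumed path
        "schedule": timedelta(hours=1, minutes=20),  # every 1h 20min
    },
}
```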

Task Flow

CSV Import

```
import_csv() ──► Create/Update GravitasShipment
      │
      ▼
sync_tracking() ──► Query Cargoes Flow by MBL
      ├─► Create/Update ALL GravitasContainers for MBL
      └─► Create/Update GravitasContainerEvents
      │
      ▼
For each container at a US destination:
trigger_gravitas_scrapers_task.delay()
      │
      ▼
Terminal Scraper ──► Update LFD, LRD, Holds, Terminal Status
```

Dashboard Analytics

The Gravitas dashboard provides KPIs specific to Gravitas operations:

| Metric | Description |
| --- | --- |
| Total Shipments | Active Gravitas shipments |
| Total Containers | Non-archived, non-completed containers |
| In Transit | Containers not yet arrived at destination |
| Arriving Today | ETA equals current date |
| Delayed | ETA before current date, not arrived |
| POD Awaiting | Discharged but not picked up |
| POD Full Out | Picked up full from port |
| Demurrage Alert | LFD exists, not gated out, not completed |
| Detention Alert | LRD exists, not completed |
| Manual Input LFD | Awaiting at port, missing LFD |
| Manual Input LRD | Gated out, missing LRD |
| LFD Alert | Missing fees at port |
| Rail Shipments | Rail-specific containers |
| Untrackable | Status = untrackable |

Multi-Container MBL Handling

The Problem

Traditional CSV import assumed one row = one container. However, a single MBL can have multiple containers, and Cargoes Flow tracks them all.

The Solution

```python
# During CSV import:
# 1. Create GravitasShipment from the CSV row (MBL as key)

# During sync_tracking():
# 2. Query the Cargoes Flow API by MBL only (not container-specific)
# 3. The API returns ALL containers under that MBL
# 4. Create/update a GravitasContainer for EACH container found
# 5. Every container gets the same gravitas_shipment_id
```

Result: if you import a CSV row with MBL MAEU1234567890 that lists only one container, but the MBL actually covers three containers, the system automatically discovers and tracks all three.
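The discovery step can be sketched as below. This is an in-memory illustration: `fetch_by_mbl` stands in for the real Cargoes Flow client call, and the dict-based store stands in for the `GravitasContainer` table.

```python
# Sketch of multi-container discovery during sync_tracking(): one upsert per
# container the API returns for the MBL, all sharing one parent shipment.
def sync_containers_for_shipment(shipment_id, mbl, fetch_by_mbl, store):
    """Upsert one container record per container returned for the MBL."""
    synced = []
    for payload in fetch_by_mbl(mbl):
        number = payload["container_number"]
        container = store.setdefault(number, {"container_number": number})
        container.update(
            gravitas_shipment_id=shipment_id,  # every container shares one parent
            mbl_number=mbl,
            status=payload.get("status"),
        )
        synced.append(container)
    return synced
```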


Integration with Terminal Scrapers

Gravitas containers use the same terminal scrapers as the legacy system, but with Gravitas-specific webhook handling:

  1. Scraper Client (ScraperClient.trigger_scrapers()) sends requests to scraper microservice
  2. Scraper processes container and returns data via webhook
  3. Webhook Handler (process_scraper_webhook_task) identifies container as Gravitas type
  4. Data Update updates GravitasContainer instead of legacy Container
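Steps 3 and 4 amount to type-based routing in the webhook handler. The sketch below is a guess at the shape of that dispatch: the payload keys (including `container_type`) and the dict-based stores are assumptions, not the real webhook contract.

```python
# Sketch of webhook routing: Gravitas containers update the Gravitas table,
# everything else falls through to the legacy table.
def route_scraper_webhook(payload, gravitas_store, legacy_store):
    """Write scraper results into the store matching the container type."""
    number = payload["container_number"]
    is_gravitas = payload.get("container_type") == "gravitas"  # assumed key
    target = gravitas_store if is_gravitas else legacy_store
    record = target.setdefault(number, {})
    # Scraper results carry the terminal dates and availability status.
    record.update(
        last_free_day=payload.get("last_free_day"),
        last_return_day=payload.get("last_return_day"),
        terminal_status=payload.get("terminal_status"),
    )
    return record
```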

Supported scrapers for Gravitas:

  • Maher Terminal
  • Port Houston
  • Fenix Marine
  • PNCT
  • APM Terminals (LA & Miami)
  • Yusen/YTI
  • eModal
  • POMTOC
  • Conley Terminal
  • Port of Savannah
  • Baltimore Seagirt
  • ITS Terminal

Migration from Legacy to Gravitas

For shipments that need to move from TAI-based tracking to Gravitas:

  1. Export shipment data from TAI or existing system
  2. Format as Gravitas CSV with required columns
  3. Import via /api/v1/gravitas/import
  4. Verify containers appear in Gravitas dashboard
  5. Archive legacy containers if needed (optional)

NOTE

Gravitas and legacy containers can coexist. They use separate tables and dashboards.


Best Practices

For CSV Imports

  • Always include MBL: It's the primary key for Gravitas shipments
  • Container number optional: System will discover all containers via MBL
  • One MBL per row: Even if multiple containers, one CSV row per MBL
  • Consistent office naming: Use standard office names for proper filtering

For Operations

  • Monitor sync status: Check poll_all_latest_updates_for_gravitas task in Flower
  • Manual sync when needed: Use /gravitas/shipments/{id}/sync for urgent updates
  • KPI filters: Use dashboard filters to focus on specific container groups
  • Terminal data: LFD/LRD may take time to populate after initial import (depends on scraper schedule)

For Developers

  • Use transactions: CSV import uses DB transactions for consistency
  • Handle duplicates: MBL-based upsert prevents duplicate shipments
  • Event cleanup: Container sync clears old events before inserting new ones
  • Archival logic: Completed/outdated containers are automatically archived

FreightFlow Platform Documentation