
Terminal Scrapers

The FreightFlow Scraper microservice automates data collection from 15+ terminal portals across major US ports. Each scraper is designed to extract container availability, Last Free Day (LFD), customs holds, and terminal location data.


Terminal Support Matrix

| # | Terminal | Portal | LFD | Notes |
| --- | --- | --- | --- | --- |
| 1 | Maher Terminal | mahercsp.maherterminals.com | Yes | CSP portal, Playwright-based |
| 2 | Port Houston | csp.porthouston.com | Yes | Lynx system |
| 3 | Fenix Marine (CA) | portal.fenixmarineservices.com | Yes | Appointment portal |
| 4 | PNCT | pnct.net | Yes | Port Newark Container Terminal |
| 5 | Charleston Terminal | goport.scspa.com | No | No LFD/LRD data available |
| 6 | APM Terminal LA | apmterminals.com | Yes | Global Track & Trace |
| 7 | TRA PAC (LAX) | losangeles.trapac.com | - | Bot detection prevents access |
| 8 | Yusen (YTI) | lynx.yti.com | Yes | Lynx system |
| 9 | E-Modal | Various | Yes | Multi-terminal platform |
| 10 | Port of Savannah | webaccess.gaports.com | Yes | GPA Express portal |
| 11 | POMTOC | pomtoc.tideworks.io | Yes | Tideworks system |
| 12 | Paul W. Conley (MCT) | mct.tideworks.io | Yes | Tideworks system |
| 13 | Baltimore Seagirt | portsamerica.com | Yes | Ports America inquiry |
| 14 | Port of Virginia | lynx.portofvirginia.com | No | No LFD data available |
| 15 | APM Terminal Miami | apmterminals.com | Yes | Terminal-specific search |
| 16 | ITS Terminal | tms.itslb.com | Yes | Requires MFA verification |
| 17 | ONE Carrier | ecomm.one-line.com | Yes | Carrier portal; provides LFD and terminal name for ONE containers |

Active Terminals (14)

1. Maher Terminal

Portal: https://mahercsp.maherterminals.com/CSP/

| Attribute | Value |
| --- | --- |
| Location | Port Newark, NJ |
| Implementation | Playwright + DynamicSession |
| Auth Method | Username/Password |
| Data Extracted | LFD, Availability, Customs Holds, Yard Location |
| Scraper Class | MaherScraperService |
| Celery Task | scrape_maher_containers |

Features:

  • Handles both single container detail view and multiple container list view
  • Extracts customs hold information (CTF, TMF, etc.)
  • Retrieves yard position/location
  • Validates container number format (11 characters)
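The 11-character check corresponds to the ISO 6346 layout: a four-letter owner/category code followed by seven digits. A simplified validator along those lines (an illustrative sketch, not the MaherScraperService implementation, and without check-digit verification) might look like:

```python
import re

# ISO 6346 container numbers: 4 letters (owner code + category identifier)
# followed by 7 digits (serial number + check digit) -- 11 characters total.
CONTAINER_RE = re.compile(r"^[A-Z]{4}\d{7}$")

def is_valid_container_number(raw: str) -> bool:
    """Return True if the string matches the 11-character container format."""
    return bool(CONTAINER_RE.match(raw.strip().upper()))
```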

Environment Variables:

MAHER_TERMINAL_USERNAME=
MAHER_TERMINAL_PASSWORD=
MAHER_CSP_URL=https://mahercsp.maherterminals.com/CSP/

2. Port Houston

Portal: https://csp.porthouston.com/Lynx/Login.aspx

| Attribute | Value |
| --- | --- |
| Location | Houston, TX |
| Implementation | API-based with session management |
| Auth Method | Username/Password |
| Data Extracted | LFD, Availability, Holds, Equipment Info |
| Scraper Class | PortHoustonScraperService |
| Celery Task | scrape_port_houston_containers |

Features:

  • Lynx CSP platform integration
  • Equipment availability tracking
  • Hold status detection

Environment Variables:

PORT_HOUSTON_USERNAME=
PORT_HOUSTON_PASSWORD=
PORT_HOUSTON_CSP_URL=

3. Fenix Marine Services (Los Angeles)

Portal: https://portal.fenixmarineservices.com/apptmt-app/home

| Attribute | Value |
| --- | --- |
| Location | Los Angeles, CA |
| Implementation | Selenium/Playwright UI automation |
| Auth Method | Username/Password |
| Data Extracted | LFD, Availability, Customs Holds, Appointments |
| Scraper Class | FenixMarineScraperService |
| Celery Task | scrape_fenix_marine_containers |

Features:

  • Appointment system integration
  • Real-time availability checking
  • Customs hold detection

Environment Variables:

FENIX_MARINE_USERNAME=
FENIX_MARINE_PASSWORD=
FENIX_MARINE_URL=

4. PNCT (Port Newark Container Terminal)

Portal: https://www.pnct.net/

| Attribute | Value |
| --- | --- |
| Location | Newark, NJ |
| Implementation | API + Legacy support |
| Auth Method | API Token |
| Data Extracted | LFD, Availability, Terminal Status |
| Scraper Class | PNCTScraperService |
| Celery Task | scrape_pnct_containers |

Features:

  • Direct API integration where available
  • Priority processing in scraper endpoints list
  • Newark terminal-specific data

5. APM Terminals (Los Angeles)

Portal: https://www.apmterminals.com/track-and-trace/search

| Attribute | Value |
| --- | --- |
| Location | Pier 400, Los Angeles, CA |
| Implementation | Playwright with human-like behavior |
| Auth Method | Public access (no login) |
| Data Extracted | LFD, Availability, Customs Holds, Yard Position |
| Scraper Class | ApmScraperService |
| Celery Task | scrape_apm_containers |

Features:

  • Shadow DOM piercing for data extraction
  • Human-like mouse movements and scrolling
  • Cookie consent handling
  • Multi-location search (LA → Elizabeth fallback)

Implementation Details:

  • Uses DynamicSession with anti-bot measures
  • Navigates through Region → Country → Location selection
  • Handles both mc-table and legacy table formats

6. Yusen Terminal (YTI)

Portal: https://lynx.yti.com/Login.aspx

| Attribute | Value |
| --- | --- |
| Location | Various US ports |
| Implementation | API-based |
| Auth Method | Username/Password |
| Data Extracted | LFD, LRD, Availability, Customs Status |
| Scraper Class | YusenScraperService |
| Celery Task | scrape_yusen_containers |

Features:

  • Lynx system integration
  • Equipment availability
  • Hold status tracking

Environment Variables:

YUSEN_USERNAME=
YUSEN_PASSWORD=
YUSEN_LOGIN_URL=
YUSEN_SEARCH_URL=

7. E-Modal Platform

Portal: Various terminal portals via eModal

| Attribute | Value |
| --- | --- |
| Implementation | API-based integration |
| Auth Method | Username/Password |
| Data Extracted | LFD, Availability, Gate Status, Appointments |
| Scraper Class | EmodalScraperService |
| Celery Task | scrape_emodal_containers |

Features:

  • Multi-terminal platform support
  • Appointment scheduling data
  • Gate status tracking

Environment Variables:

EMODAL_USERNAME=
EMODAL_PASSWORD=

8. POMTOC (Port of Miami Terminal Operating Company)

Portal: https://pomtoc.tideworks.io/fc-POM-AWS/default.do

| Attribute | Value |
| --- | --- |
| Location | Miami, FL |
| Implementation | Tideworks API integration |
| Auth Method | Username/Password |
| Data Extracted | LFD, Availability, Customs Holds |
| Scraper Class | PomtocScraperService |
| Celery Task | scrape_pomtoc_containers |

Features:

  • Tideworks system integration
  • Miami port-specific data
  • Customs hold detection

Environment Variables:

POMTOC_USERNAME=
POMTOC_PASSWORD=

9. Paul W. Conley Marine Terminal (MCT)

Portal: https://mct.tideworks.io/fc-MCT/default.do

| Attribute | Value |
| --- | --- |
| Location | Boston, MA |
| Implementation | Tideworks API integration |
| Auth Method | Username/Password |
| Data Extracted | LFD, Availability, Customs Holds |
| Scraper Class | ConleyScraperService |
| Celery Task | scrape_conley_containers |

Features:

  • Tideworks system integration
  • Boston port operations
  • Container availability tracking

Environment Variables:

CONLEY_USERNAME=
CONLEY_PASSWORD=

10. Port of Savannah (GPA)

Portal: https://webaccess.gaports.com/express/displayReport.do

| Attribute | Value |
| --- | --- |
| Location | Savannah, GA |
| Implementation | GPA Express portal automation |
| Auth Method | Username/Password |
| Data Extracted | LFD, Availability, Vessel Info, Customs Status |
| Scraper Class | SavannahScraperService |
| Celery Task | scrape_savannah_containers |

Features:

  • Georgia Ports Authority integration
  • Vessel schedule information
  • Customs status tracking

Environment Variables:

SAVANNAH_USERNAME=
SAVANNAH_PASSWORD=

11. Baltimore - Seagirt Terminal

Portal: https://www.portsamerica.com/resources/inquiries

| Attribute | Value |
| --- | --- |
| Location | Baltimore, MD |
| Implementation | Ports America inquiry system |
| Auth Method | Public inquiry form |
| Data Extracted | LFD, Availability, Terminal Status |
| Scraper Class | BaltimoreSeagirtScraperService |
| Celery Task | scrape_baltimore_seagirt_containers |

Features:

  • Ports America container inquiry
  • Baltimore Seagirt specific
  • Container-by-container lookup

12. APM Terminal Miami

Portal: https://www.apmterminals.com/track-and-trace/import

| Attribute | Value |
| --- | --- |
| Location | Miami, FL |
| Implementation | Playwright with terminal-specific search |
| Auth Method | Public access (no login) |
| Data Extracted | LFD, Availability, Customs Holds |
| Scraper Class | ApmMiamiScraperService |
| Celery Task | scrape_apm_miami_containers |

Features:

  • APM Terminals Miami-specific
  • Same core logic as APM LA with location pre-selection
  • Miami port container tracking

13. ITS Terminal (International Transportation Service)

Portal: https://tms.itslb.com/tms2/Account/Login

| Attribute | Value |
| --- | --- |
| Location | Long Beach, CA |
| Implementation | Playwright with persistent session |
| Auth Method | Username/Password + MFA |
| Data Extracted | LFD, Availability, Customs Status, Location |
| Scraper Class | ITSScraperService |
| Celery Task | scrape_its_containers |

Features:

  • MFA Support: Requires email verification code automation
  • Persistent browser session across tasks
  • Session state saving to minimize re-login
  • Outlook email integration for MFA codes

Environment Variables:

ITS_LOGIN_URL=https://tms.itslb.com/tms2/Account/Login
ITS_AVAILABILITY_URL=https://tms.itslb.com/tms2/Import/ContainerAvailability
ITS_USERNAME=
ITS_PASSWORD=
OUTLOOK_EMAIL=
OUTLOOK_PASSWORD=

MFA Flow:

  1. Scraper logs in with credentials
  2. System sends verification code via email
  3. email_util retrieves code from Redis (populated by PowerAutomate)
  4. Code entered automatically, session established
  5. Session state saved to /app/app/its_session.json
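Step 3 of the flow can be sketched as a polling helper. The Redis key name and the assumption of a 6-digit code are illustrative, not taken from the actual email_util implementation:

```python
import re
import time

def wait_for_mfa_code(redis_client, key: str = "its:mfa_code", timeout: int = 120) -> str:
    """Poll Redis for the verification code written by the email pipeline.

    Illustrative sketch: key name and 6-digit format are assumptions.
    """
    deadline = time.time() + timeout
    while time.time() < deadline:
        raw = redis_client.get(key)
        if raw:
            value = raw.decode() if isinstance(raw, bytes) else raw
            match = re.search(r"\d{6}", value)
            if match:
                return match.group()
        time.sleep(2)
    raise TimeoutError("No MFA code appeared in Redis before the deadline")
```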

14. ONE Carrier

Portal: https://ecomm.one-line.com/one-ecom/manage-shipment/inbound-master

| Attribute | Value |
| --- | --- |
| Location | Global / US Ports |
| Implementation | Scrapling + DynamicSession |
| Auth Method | Username/Password (OIDC Flow) |
| Data Extracted | LFD, Availability, Terminal Name |
| Scraper Class | OneCarrierScraperService |
| Celery Task | scrape_one_carrier_containers |

Features:

  • Session persistence via one_cookies.json to bypass repetitive OIDC flows
  • Scrapes Inbound Demurrage Free Time for LFD tracking
  • Navigates to custom Cargo Tracking portal to dynamically extract terminal names based on text analysis
  • Automatically handles promotional modals and cookie banners during execution

Environment Variables:

ONE_USERNAME=
ONE_PASSWORD=

Inactive/Blocked Terminals (3)

15. Charleston Terminal (SCSPA)

Portal: https://goport.scspa.com/scspa/

| Attribute | Value |
| --- | --- |
| Status | Inactive |
| Reason | No LFD/LRD data available in portal |
| Location | Charleston, SC |

Notes:

  • Portal accessible but lacks critical LFD/LRD information
  • Not included in active scraping schedule
  • May be revisited if data availability changes

16. TRA PAC (Los Angeles)

Portal: https://losangeles.trapac.com/quick-check/

| Attribute | Value |
| --- | --- |
| Status | Blocked |
| Reason | Bot detection prevents automated access |
| Location | Los Angeles, CA |

Notes:

  • Advanced bot detection mechanisms
  • Requires sophisticated anti-detection measures
  • Currently not feasible for automated scraping
  • Manual lookup only

17. Port of Virginia

Portal: https://lynx.portofvirginia.com/Pages/Imports/importHaz.aspx

| Attribute | Value |
| --- | --- |
| Status | Inactive |
| Reason | No LFD data available in portal |
| Location | Norfolk, VA |

Notes:

  • Portal focuses on hazardous cargo information
  • LFD data not exposed through available interfaces
  • Not included in active scraping schedule

Scraper Architecture


Standardized Data Format

All terminal scrapers return data in a standardized format:

```python
{
    "Container": "ABCU1234567",
    "Available": "Yes" | "No",
    "Customs Holds": "Yes" | "No",
    "Freight Released": "Yes" | "No",
    "USDA Hold": "Yes" | "No",
    "Other Agency Hold": "Yes" | "No",
    "Terminal Hold": "Yes" | "No",
    "Location": "Terminal Name / Yard Position",
    "lfd": "YYYY-MM-DD" | "",
    "terminal": "Terminal Name"
}
```

The standardize_container_data() utility function in app/utils/standardize.py ensures consistent formatting across all scrapers.
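A minimal sketch of what such a normalization helper can look like (illustrative only; the real standardize_container_data() in app/utils/standardize.py may use different raw field names and defaults):

```python
def standardize_container_data(raw: dict) -> dict:
    """Map one scraper's raw output onto the shared schema above.

    Illustrative sketch -- raw field names here are assumptions.
    """
    def yes_no(value) -> str:
        return "Yes" if value else "No"

    return {
        "Container": raw.get("container", "").upper(),
        "Available": yes_no(raw.get("available")),
        "Customs Holds": yes_no(raw.get("customs_hold")),
        "Freight Released": yes_no(raw.get("freight_released")),
        "USDA Hold": yes_no(raw.get("usda_hold")),
        "Other Agency Hold": yes_no(raw.get("other_agency_hold")),
        "Terminal Hold": yes_no(raw.get("terminal_hold")),
        "Location": raw.get("location", ""),
        "lfd": raw.get("lfd", ""),
        "terminal": raw.get("terminal", ""),
    }
```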


Security Considerations

CAUTION

Credential Management: All terminal credentials are stored as environment variables and never committed to version control.

IMPORTANT

Rate Limiting: Scrapers implement human-like delays between requests to avoid overwhelming terminal portals.

WARNING

Session Persistence: ITS terminal uses persistent session files. Ensure proper file permissions on /app/app/its_session.json.
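The human-like delays mentioned in the rate-limiting note can be sketched as a jittered pause between portal requests (the interval bounds here are illustrative, not the values the scrapers actually use):

```python
import random
import time

def human_delay(min_s: float = 1.5, max_s: float = 4.0) -> float:
    """Sleep for a random, human-like interval and return its length."""
    pause = random.uniform(min_s, max_s)
    time.sleep(pause)
    return pause
```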


Adding New Terminals

To add support for a new terminal:

  1. Create Service Class in app/services/terminal_port_services/:

    ```python
    from typing import Dict, List

    class NewTerminalScraperService:
        def scrape_container_info(self, container_numbers: List[str]) -> List[Dict]:
            # Implementation
            pass
    ```
  2. Register in __init__.py:

    ```python
    from .new_terminal_service import NewTerminalScraperService

    __all__ = [..., "NewTerminalScraperService"]
    ```
  3. Add Celery Task in app/workers/tasks.py:

    ```python
    from typing import List

    from celery import shared_task

    @shared_task(name="app.workers.tasks.scrape_new_terminal_containers")
    def scrape_new_terminal_containers(container_numbers: List[str]):
        # Implementation
        pass
    ```
  4. Add Environment Variables for credentials in .env files

  5. Update Documentation in this file

  6. Test Thoroughly before enabling in production
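Putting steps 1–3 together, a stubbed version of the service class might look like this (portal interaction replaced by a placeholder, and only a subset of the standard output fields shown):

```python
from typing import Dict, List

class NewTerminalScraperService:
    """Skeleton for the service class from step 1; portal I/O is stubbed."""

    def _fetch_portal_record(self, container: str) -> Dict:
        # A real implementation would authenticate and query the terminal portal.
        return {"available": True, "lfd": "", "terminal": "New Terminal"}

    def scrape_container_info(self, container_numbers: List[str]) -> List[Dict]:
        results = []
        for number in container_numbers:
            record = self._fetch_portal_record(number)
            results.append({
                "Container": number,
                "Available": "Yes" if record["available"] else "No",
                "lfd": record["lfd"],
                "terminal": record["terminal"],
            })
        return results
```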

FreightFlow Platform Documentation