Terminal Scrapers
The FreightFlow Scraper microservice automates data collection from 17 terminal and carrier portals (14 currently active) across major US ports. Each scraper extracts container availability, Last Free Day (LFD), customs holds, and terminal location data.
Terminal Support Matrix
| # | Terminal | Portal | LFD | Notes |
|---|---|---|---|---|
| 1 | Maher Terminal | mahercsp.maherterminals.com | ✅ | CSP portal, Playwright-based |
| 2 | Port Houston | csp.porthouston.com | ✅ | Lynx system |
| 3 | Fenix Marine (CA) | portal.fenixmarineservices.com | ✅ | Appointment portal |
| 4 | PNCT | pnct.net | ✅ | Port Newark Container Terminal |
| 5 | Charleston Terminal | goport.scspa.com | ❌ | No LFD/LRD data available |
| 6 | APM Terminal LA | apmterminals.com | ✅ | Global Track & Trace |
| 7 | TRA PAC (LAX) | losangeles.trapac.com | ❌ | Bot detection prevents access |
| 8 | Yusen (YTI) | lynx.yti.com | ✅ | Lynx system |
| 9 | E-Modal | Various | ✅ | Multi-terminal platform |
| 10 | Port of Savannah | webaccess.gaports.com | ✅ | GPA Express portal |
| 11 | POMTOC | pomtoc.tideworks.io | ✅ | Tideworks system |
| 12 | Paul W. Conley (MCT) | mct.tideworks.io | ✅ | Tideworks system |
| 13 | Baltimore Seagirt | portsamerica.com | ✅ | Ports America inquiry |
| 14 | Port of Virginia | lynx.portofvirginia.com | ❌ | No LFD data available |
| 15 | APM Terminal Miami | apmterminals.com | ✅ | Terminal-specific search |
| 16 | ITS Terminal | tms.itslb.com | ✅ | Requires MFA verification |
| 17 | ONE Carrier | ecomm.one-line.com | ✅ | Carrier portal; used for LFD and terminal name of ONE containers |
Active Terminals (14)
1. Maher Terminal
Portal: https://mahercsp.maherterminals.com/CSP/
| Attribute | Value |
|---|---|
| Location | Port Newark, NJ |
| Implementation | Playwright + DynamicSession |
| Auth Method | Username/Password |
| Data Extracted | LFD, Availability, Customs Holds, Yard Location |
| Scraper Class | MaherScraperService |
| Celery Task | scrape_maher_containers |
Features:
- Handles both single container detail view and multiple container list view
- Extracts customs hold information (CTF, TMF, etc.)
- Retrieves yard position/location
- Validates container number format (11 characters)
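Container numbers follow the ISO 6346 format: four uppercase letters (owner code plus category identifier) followed by six serial digits and a check digit, eleven characters in total. The actual validation in `MaherScraperService` is not shown here; a minimal sketch of such a check might look like:

```python
import re

# ISO 6346 container numbers: 4 uppercase letters followed by 7 digits
# (6 serial digits + 1 check digit) -- 11 characters total.
CONTAINER_RE = re.compile(r"^[A-Z]{4}\d{7}$")

def is_valid_container_number(container: str) -> bool:
    """Return True if the string matches the 11-character container format."""
    return bool(CONTAINER_RE.match(container.strip().upper()))
```

A full implementation could also verify the trailing check digit, but a length/pattern check is usually enough to reject malformed input before hitting the portal.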
Environment Variables:
```
MAHER_TERMINAL_USERNAME=
MAHER_TERMINAL_PASSWORD=
MAHER_CSP_URL=https://mahercsp.maherterminals.com/CSP/
```

2. Port Houston
Portal: https://csp.porthouston.com/Lynx/Login.aspx
| Attribute | Value |
|---|---|
| Location | Houston, TX |
| Implementation | API-based with session management |
| Auth Method | Username/Password |
| Data Extracted | LFD, Availability, Holds, Equipment Info |
| Scraper Class | PortHoustonScraperService |
| Celery Task | scrape_port_houston_containers |
Features:
- Lynx CSP platform integration
- Equipment availability tracking
- Hold status detection
Environment Variables:
```
PORT_HOUSTON_USERNAME=
PORT_HOUSTON_PASSWORD=
PORT_HOUSTON_CSP_URL=
```

3. Fenix Marine Services (Los Angeles)
Portal: https://portal.fenixmarineservices.com/apptmt-app/home
| Attribute | Value |
|---|---|
| Location | Los Angeles, CA |
| Implementation | Selenium/Playwright UI automation |
| Auth Method | Username/Password |
| Data Extracted | LFD, Availability, Customs Holds, Appointments |
| Scraper Class | FenixMarineScraperService |
| Celery Task | scrape_fenix_marine_containers |
Features:
- Appointment system integration
- Real-time availability checking
- Customs hold detection
Environment Variables:
```
FENIX_MARINE_USERNAME=
FENIX_MARINE_PASSWORD=
FENIX_MARINE_URL=
```

4. PNCT (Port Newark Container Terminal)
Portal: https://www.pnct.net/
| Attribute | Value |
|---|---|
| Location | Newark, NJ |
| Implementation | API + Legacy support |
| Auth Method | API Token |
| Data Extracted | LFD, Availability, Terminal Status |
| Scraper Class | PNCTScraperService |
| Celery Task | scrape_pnct_containers |
Features:
- Direct API integration where available
- Priority processing in scraper endpoints list
- Newark terminal-specific data
5. APM Terminals (Los Angeles)
Portal: https://www.apmterminals.com/track-and-trace/search
| Attribute | Value |
|---|---|
| Location | Pier 400, Los Angeles, CA |
| Implementation | Playwright with human-like behavior |
| Auth Method | Public access (no login) |
| Data Extracted | LFD, Availability, Customs Holds, Yard Position |
| Scraper Class | ApmScraperService |
| Celery Task | scrape_apm_containers |
Features:
- Shadow DOM piercing for data extraction
- Human-like mouse movements and scrolling
- Cookie consent handling
- Multi-location search (LA → Elizabeth fallback)
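The LA → Elizabeth fallback described above amounts to trying each configured terminal location in order and keeping the first result that returns data. A hypothetical sketch (function and parameter names are assumptions, not the actual `ApmScraperService` API):

```python
from typing import Callable, Dict, List, Optional

def search_with_fallback(
    container: str,
    locations: List[str],
    search_fn: Callable[[str, str], Optional[Dict]],
) -> Optional[Dict]:
    """Try each location in order; return the first non-empty result."""
    for location in locations:
        result = search_fn(container, location)
        if result:  # first location that yields data wins
            return result
    return None  # container not found at any configured location
```

For example, `search_with_fallback("ABCU1234567", ["Los Angeles", "Elizabeth"], do_search)` would query Elizabeth only when the Los Angeles search comes back empty.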
Implementation Details:
- Uses `DynamicSession` with anti-bot measures
- Navigates through Region → Country → Location selection
- Handles both `mc-table` and legacy table formats
6. Yusen Terminal (YTI)
Portal: https://lynx.yti.com/Login.aspx
| Attribute | Value |
|---|---|
| Location | Various US ports |
| Implementation | API-based |
| Auth Method | Username/Password |
| Data Extracted | LFD, LRD, Availability, Customs Status |
| Scraper Class | YusenScraperService |
| Celery Task | scrape_yusen_containers |
Features:
- Lynx system integration
- Equipment availability
- Hold status tracking
Environment Variables:
```
YUSEN_USERNAME=
YUSEN_PASSWORD=
YUSEN_LOGIN_URL=
YUSEN_SEARCH_URL=
```

7. E-Modal Platform
Portal: Various terminal portals via eModal
| Attribute | Value |
|---|---|
| Implementation | API-based integration |
| Auth Method | Username/Password |
| Data Extracted | LFD, Availability, Gate Status, Appointments |
| Scraper Class | EmodalScraperService |
| Celery Task | scrape_emodal_containers |
Features:
- Multi-terminal platform support
- Appointment scheduling data
- Gate status tracking
Environment Variables:
```
EMODAL_USERNAME=
EMODAL_PASSWORD=
```

8. POMTOC (Port of Miami Terminal Operating Company)
Portal: https://pomtoc.tideworks.io/fc-POM-AWS/default.do
| Attribute | Value |
|---|---|
| Location | Miami, FL |
| Implementation | Tideworks API integration |
| Auth Method | Username/Password |
| Data Extracted | LFD, Availability, Customs Holds |
| Scraper Class | PomtocScraperService |
| Celery Task | scrape_pomtoc_containers |
Features:
- Tideworks system integration
- Miami port-specific data
- Customs hold detection
Environment Variables:
```
POMTOC_USERNAME=
POMTOC_PASSWORD=
```

9. Paul W. Conley Marine Terminal (MCT)
Portal: https://mct.tideworks.io/fc-MCT/default.do
| Attribute | Value |
|---|---|
| Location | Boston, MA |
| Implementation | Tideworks API integration |
| Auth Method | Username/Password |
| Data Extracted | LFD, Availability, Customs Holds |
| Scraper Class | ConleyScraperService |
| Celery Task | scrape_conley_containers |
Features:
- Tideworks system integration
- Boston port operations
- Container availability tracking
Environment Variables:
```
CONLEY_USERNAME=
CONLEY_PASSWORD=
```

10. Port of Savannah (GPA)
Portal: https://webaccess.gaports.com/express/displayReport.do
| Attribute | Value |
|---|---|
| Location | Savannah, GA |
| Implementation | GPA Express portal automation |
| Auth Method | Username/Password |
| Data Extracted | LFD, Availability, Vessel Info, Customs Status |
| Scraper Class | SavannahScraperService |
| Celery Task | scrape_savannah_containers |
Features:
- Georgia Ports Authority integration
- Vessel schedule information
- Customs status tracking
Environment Variables:
```
SAVANNAH_USERNAME=
SAVANNAH_PASSWORD=
```

11. Baltimore - Seagirt Terminal
Portal: https://www.portsamerica.com/resources/inquiries
| Attribute | Value |
|---|---|
| Location | Baltimore, MD |
| Implementation | Ports America inquiry system |
| Auth Method | Public inquiry form |
| Data Extracted | LFD, Availability, Terminal Status |
| Scraper Class | BaltimoreSeagirtScraperService |
| Celery Task | scrape_baltimore_seagirt_containers |
Features:
- Ports America container inquiry
- Baltimore Seagirt specific
- Container-by-container lookup
12. APM Terminal Miami
Portal: https://www.apmterminals.com/track-and-trace/import
| Attribute | Value |
|---|---|
| Location | Miami, FL |
| Implementation | Playwright with terminal-specific search |
| Auth Method | Public access (no login) |
| Data Extracted | LFD, Availability, Customs Holds |
| Scraper Class | ApmMiamiScraperService |
| Celery Task | scrape_apm_miami_containers |
Features:
- APM Terminals Miami-specific
- Same core logic as APM LA with location pre-selection
- Miami port container tracking
13. ITS Terminal (INTL TRANSPORTATION)
Portal: https://tms.itslb.com/tms2/Account/Login
| Attribute | Value |
|---|---|
| Location | Long Beach, CA |
| Implementation | Playwright with persistent session |
| Auth Method | Username/Password + MFA |
| Data Extracted | LFD, Availability, Customs Status, Location |
| Scraper Class | ITSScraperService |
| Celery Task | scrape_its_containers |
Features:
- MFA Support: Requires email verification code automation
- Persistent browser session across tasks
- Session state saving to minimize re-login
- Outlook email integration for MFA codes
Environment Variables:
```
ITS_LOGIN_URL=https://tms.itslb.com/tms2/Account/Login
ITS_AVAILABILITY_URL=https://tms.itslb.com/tms2/Import/ContainerAvailability
ITS_USERNAME=
ITS_PASSWORD=
OUTLOOK_EMAIL=
OUTLOOK_PASSWORD=
```

MFA Flow:
- Scraper logs in with credentials
- System sends verification code via email
- `email_util` retrieves the code from Redis (populated by PowerAutomate)
- Code is entered automatically and the session is established
- Session state is saved to `/app/app/its_session.json`
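The "retrieve code from Redis" step is essentially a poll-until-present loop: the scraper waits for PowerAutomate to write the emailed code, then consumes it. A minimal sketch, assuming the fetch is injected as a callable (the real `email_util` API and Redis key names are not shown in this document):

```python
import time
from typing import Callable, Optional

def wait_for_mfa_code(
    fetch_code: Callable[[], Optional[str]],  # e.g. lambda: redis_client.get("its:mfa_code")
    timeout_s: float = 120.0,
    poll_interval_s: float = 2.0,
) -> str:
    """Poll until a verification code appears or the timeout elapses."""
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        code = fetch_code()
        if code:
            return code
        time.sleep(poll_interval_s)
    raise TimeoutError("MFA code did not arrive in Redis before the timeout")
```

Injecting the fetch callable keeps the polling logic testable without a live Redis instance.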
14. ONE Carrier
Portal: https://ecomm.one-line.com/one-ecom/manage-shipment/inbound-master
| Attribute | Value |
|---|---|
| Location | Global / US Ports |
| Implementation | Scrapling + DynamicSession |
| Auth Method | Username/Password (OIDC Flow) |
| Data Extracted | LFD, Availability, Terminal Name |
| Scraper Class | OneCarrierScraperService |
| Celery Task | scrape_one_carrier_containers |
Features:
- Session persistence via `one_cookies.json` to bypass repetitive OIDC flows
- Scrapes Inbound Demurrage Free Time for LFD tracking
- Navigates to custom Cargo Tracking portal to dynamically extract terminal names based on text analysis
- Automatically handles promotional modals and cookie banners during execution
Environment Variables:
```
ONE_USERNAME=
ONE_PASSWORD=
```

Inactive/Blocked Terminals (3)
15. Charleston Terminal (SCSPA)
Portal: https://goport.scspa.com/scspa/
| Attribute | Value |
|---|---|
| Status | ❌ Inactive |
| Reason | No LFD/LRD data available in portal |
| Location | Charleston, SC |
Notes:
- Portal accessible but lacks critical LFD/LRD information
- Not included in active scraping schedule
- May be revisited if data availability changes
16. TRA PAC (Los Angeles)
Portal: https://losangeles.trapac.com/quick-check/
| Attribute | Value |
|---|---|
| Status | ❌ Blocked |
| Reason | Bot detection prevents automated access |
| Location | Los Angeles, CA |
Notes:
- Advanced bot detection mechanisms
- Requires sophisticated anti-detection measures
- Currently not feasible for automated scraping
- Manual lookup only
17. Port of Virginia
Portal: https://lynx.portofvirginia.com/Pages/Imports/importHaz.aspx
| Attribute | Value |
|---|---|
| Status | ❌ Inactive |
| Reason | No LFD data available in portal |
| Location | Norfolk, VA |
Notes:
- Portal focuses on hazardous cargo information
- LFD data not exposed through available interfaces
- Not included in active scraping schedule
Scraper Architecture
Standardized Data Format
All terminal scrapers return data in a standardized format:
```json
{
  "Container": "ABCU1234567",
  "Available": "Yes" | "No",
  "Customs Holds": "Yes" | "No",
  "Freight Released": "Yes" | "No",
  "USDA Hold": "Yes" | "No",
  "Other Agency Hold": "Yes" | "No",
  "Terminal Hold": "Yes" | "No",
  "Location": "Terminal Name / Yard Position",
  "lfd": "YYYY-MM-DD" | "",
  "terminal": "Terminal Name"
}
```

The `standardize_container_data()` utility function in `app/utils/standardize.py` ensures consistent formatting across all scrapers.
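The actual implementation of `standardize_container_data()` is not reproduced in this document; one plausible sketch is a function that maps every raw scraper result onto the standard keys, filling safe defaults for anything a terminal does not report:

```python
# Illustrative sketch only -- the real standardize_container_data() in
# app/utils/standardize.py may differ. Unknown keys are dropped and
# missing keys receive conservative defaults.
STANDARD_KEYS = {
    "Container": "",
    "Available": "No",
    "Customs Holds": "No",
    "Freight Released": "No",
    "USDA Hold": "No",
    "Other Agency Hold": "No",
    "Terminal Hold": "No",
    "Location": "",
    "lfd": "",
    "terminal": "",
}

def standardize_container_data(raw: dict) -> dict:
    """Project a raw scraper result onto the standard output shape."""
    return {key: raw.get(key, default) for key, default in STANDARD_KEYS.items()}
```

Defaulting holds to "No" rather than omitting them means downstream consumers can rely on every key being present regardless of which terminal produced the record.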
Security Considerations
> [!CAUTION]
> Credential Management: All terminal credentials are stored as environment variables and never committed to version control.

> [!IMPORTANT]
> Rate Limiting: Scrapers implement human-like delays between requests to avoid overwhelming terminal portals.

> [!WARNING]
> Session Persistence: The ITS terminal scraper uses persistent session files. Ensure proper file permissions on `/app/app/its_session.json`.
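The human-like delay used for rate limiting can be as simple as a jittered sleep between requests, so inter-request timing never looks machine-regular. A minimal sketch (the bounds here are illustrative, not the values the scrapers actually use):

```python
import random
import time

def human_delay(min_s: float = 1.5, max_s: float = 4.0) -> float:
    """Sleep a random interval between requests; return the delay used."""
    delay = random.uniform(min_s, max_s)
    time.sleep(delay)
    return delay
```

Returning the chosen delay makes the jitter easy to log and to assert on in tests.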
Adding New Terminals
To add support for a new terminal:

1. Create Service Class in `app/services/terminal_port_services/`:

```python
class NewTerminalScraperService:
    def scrape_container_info(self, container_numbers: List[str]) -> List[Dict]:
        # Implementation
        pass
```

2. Register in `__init__.py`:

```python
from .new_terminal_service import NewTerminalScraperService

__all__ = [..., "NewTerminalScraperService"]
```

3. Add Celery Task in `app/workers/tasks.py`:

```python
@shared_task(name="app.workers.tasks.scrape_new_terminal_containers")
def scrape_new_terminal_containers(container_numbers: List[str]):
    # Implementation
    pass
```

4. Add Environment Variables for credentials in `.env` files
5. Update Documentation in this file
6. Test Thoroughly before enabling in production
