Our client is a fast-growing technology company building data-driven systems that power global commerce workflows. Their platform relies on large-scale data extraction, API integrations, and AI-powered automation to support advanced risk, onboarding, and operational pipelines across international markets.
Fully Remote | 9 AM–5 PM EST
The Backend Engineer will build data-driven systems, scraping pipelines, API integrations, data ingestion services, and backend components for an LLM-powered engine. This role requires 4–5+ years of backend experience, strong data engineering instincts, and familiarity with Node.js, AWS, and Next.js-powered backends.
Data Extraction & Web Scraping
Build systems to scrape external websites for structured and unstructured data.
Automate data pipelines for ingestion, cleaning, and storage.
Ensure scrapers are scalable, resilient to layout changes, and compliant with applicable policies.
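As a rough illustration of the resilience point above, here is a minimal Node.js sketch (the selectors and layout strategies are hypothetical, and a production scraper would use a real HTML parser such as cheerio rather than regexes). The idea is to isolate parsing behind a list of fallback extraction strategies, so a site layout change degrades gracefully instead of crashing the pipeline:

```javascript
// Try several extraction strategies in order; a layout change that
// breaks one strategy falls through to the next instead of throwing.
function extractTitles(html, strategies) {
  for (const strategy of strategies) {
    const matches = [...html.matchAll(strategy)].map((m) => m[1].trim());
    if (matches.length > 0) return matches;
  }
  return []; // empty result signals "layout changed" to monitoring
}

// Hypothetical strategies covering a current and a legacy page layout.
const TITLE_STRATEGIES = [
  /<h2 class="product-title">([^<]+)<\/h2>/g, // current layout
  /<span data-role="title">([^<]+)<\/span>/g, // legacy layout
];

async function scrapePage(url) {
  const res = await fetch(url); // built-in fetch, Node 18+
  if (!res.ok) throw new Error(`HTTP ${res.status} for ${url}`);
  return extractTitles(await res.text(), TITLE_STRATEGIES);
}
```

Keeping fetching and parsing separate also makes the parser unit-testable against saved HTML fixtures, which is how layout regressions are usually caught.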
API Integrations & Ingestion Systems
Build integrations with third-party APIs (KYC/KYB, fraud, merchant data, etc.).
Handle authentication, retries, rate limits, and robust error handling.
Normalize and store data efficiently for downstream systems.
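The retry and rate-limit handling described above commonly looks something like the following Node.js sketch, assuming exponential backoff and HTTP status-based retry rules (the specific thresholds and cap are illustrative; production code would typically add random jitter):

```javascript
// Delay before retry attempt `attempt` (0-based): exponential backoff
// capped at maxMs. Deterministic here for testability.
function backoffMs(attempt, baseMs = 500, maxMs = 30_000) {
  return Math.min(baseMs * 2 ** attempt, maxMs);
}

// Retry rate limits (429) and transient server errors (5xx),
// but not other client errors (4xx), which won't succeed on retry.
function isRetryable(status) {
  return status === 429 || (status >= 500 && status < 600);
}

async function fetchWithRetry(url, options = {}, maxAttempts = 5) {
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    const res = await fetch(url, options); // built-in fetch, Node 18+
    if (res.ok) return res;
    if (!isRetryable(res.status) || attempt === maxAttempts - 1) {
      throw new Error(`Request to ${url} failed with HTTP ${res.status}`);
    }
    await new Promise((r) => setTimeout(r, backoffMs(attempt)));
  }
}
```

Separating the backoff and retryability decisions into pure functions keeps the policy easy to test without mocking the network.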
LLM-Based System Development
Support the development of internal LLM engines for automation, fraud scoring, or data analysis.
Build backend components that feed structured data to AI models.
Collaborate with machine learning teams on data preparation.
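"Feeding structured data to AI models" usually means normalizing raw records into a predictable shape before serialization. A minimal sketch, with hypothetical field names, of the kind of backend component involved:

```javascript
// Keep only whitelisted fields and coerce types, so upstream schema
// drift can't leak unexpected keys into the model's context.
function normalizeRecord(raw) {
  return {
    merchantId: String(raw.merchant_id ?? ""),
    country: String(raw.country ?? "").toUpperCase(),
    monthlyVolumeUsd: Number(raw.monthly_volume_usd) || 0,
  };
}

// Serialize a batch as JSON Lines: one record per line is compact,
// easy for a model to consume, and easy for humans to spot-check.
function toModelInput(rawRecords) {
  return rawRecords.map((r) => JSON.stringify(normalizeRecord(r))).join("\n");
}
```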
Backend & Infrastructure Development
Use Node.js and AWS to develop and deploy scalable, high-performance backend services.
Design efficient data models, manage relational and NoSQL databases, and optimize queries for data consistency and performance.
Architect and implement robust data pipelines to ingest, process, and store large datasets.
Collaborate with frontend teams to integrate backend services with Next.js applications for a seamless full-stack experience.
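One recurring pattern behind "ingest, process, and store large datasets" is bounded batch writes. A sketch under the assumption of a generic bulk-write function (batch size and `writeBatch` are illustrative, standing in for e.g. a bulk INSERT or a DynamoDB BatchWriteItem call):

```javascript
// Split an array into fixed-size chunks so a large ingest becomes a
// series of bounded writes instead of one oversized transaction.
function chunk(items, size) {
  const out = [];
  for (let i = 0; i < items.length; i += size) {
    out.push(items.slice(i, i + size));
  }
  return out;
}

// Write batches sequentially, so a failure leaves a clear resume
// point: the first unwritten batch. Returns the count written.
async function ingestInBatches(records, writeBatch, batchSize = 500) {
  let written = 0;
  for (const batch of chunk(records, batchSize)) {
    await writeBatch(batch); // hypothetical bulk-write callback
    written += batch.length;
  }
  return written;
}
```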
Experience
4–5+ years of backend engineering experience.
Strong experience with Node.js, AWS, REST APIs, and backend architecture.
Experience scraping websites and handling large-scale data ingestion.
Experience storing, normalizing, and querying large datasets.
Skills
Experience integrating with fintech, e-commerce, or payment platforms.
Exposure to LLM integrations or AI model pipelines.
Knowledge of Python for data-heavy tasks is a plus (not mandatory).
Outcomes
Reliable scraping and ingestion systems delivering accurate data.
Stable API integrations powering core business logic.
Backend systems that scale efficiently with large data volumes.
Work in a deeply technical environment, developing data infrastructure and AI-powered backend systems that support global commerce expansion. Apply now!
Fill in the application form
Record a video showcasing your skills