Engineer - Data

Build cross-business-unit data pipelines and analytics for operational decision-making
Appleton, Wisconsin, United States
Mid-Level
Warehouse Specialists

Warehouse Specialists is a provider of third-party logistics, warehousing, and distribution services.

Engineer - Data

WSI is accelerating our modern, AI-centered data platform on Microsoft Fabric. We're seeking a hands-on Data Engineer who executes quickly and reliably, with strong data-architecture instincts. The design patterns and target state are defined; your focus is to build, ship, and harden the pipelines, data models, and analytics that power Operations, Finance, HR, Customer Service, and our direct customers. A core expectation is delivering unified, cross-business-unit data solutions that span Fulfillment, Traditional, Chemical, and Transportation, regardless of the underlying technology stacks. Fabric experience is mandatory; AWS and Terraform (Infrastructure as Code) experience is strongly preferred.

This role is measured on speed to value, data quality, and adoption of reporting & visualizations. You will also embrace AI, from development acceleration to improving the internal user experience and copiloting analytics workflows across the organization.

DUTIES AND ACCOUNTABILITIES:

  • Engineer and maintain Microsoft Fabric data products: Create production-grade data pipelines (Data Factory in Fabric), Lakehouse/Warehouse tables (Delta), notebooks (PySpark/Python), Dataflows Gen2, semantic models, and Power BI datasets & reports.
  • Terraform (Infrastructure as Code): Define and manage Azure/AWS resources in version-controlled repos with CI/CD (plan/apply), ensuring consistent, auditable environments.
  • Cross-cloud ingestion: Build reliable connectors from AWS sources (e.g., S3, RDS/Redshift, APIs) into Fabric OneLake with secure credentials and auditable lineage.
  • Execute the blueprint: Apply WSI's established data architecture (medallion layers, naming conventions, SCD strategy, orchestration patterns) with a bias to deliver quickly and iterate.
  • Operationalize analytics: Partner with Operations, Finance/Accounting, HR, Customer Service, and key customers to turn requirements into curated data products, semantic models, and print‑friendly, decision‑ready visuals.
  • Consolidate BI/reporting tools: Decommission redundant BI/reporting tools and migrate their content to a single platform without disrupting business expectations.
  • AI in the loop: Use AI to speed development (e.g., code assist, doc generation), automate data documentation & lineage summaries, and enhance analyst/end‑user workflows (Copilot for Power BI, natural‑language query, RAG patterns for help & FAQs).
  • Data quality & reliability: Implement validation, unit/integration tests, observability, and SLAs; monitor freshness, completeness, and pipeline success rates.
  • Performance & cost: Optimize storage and compute in Fabric (OneLake, Warehouse/Lakehouse) for performance and cost-efficiency; right-size refresh cadences and model design.
  • Security & governance: Apply role‑based access, row‑level/object‑level security, data masking as appropriate; contribute to cataloging & lineage (e.g., Purview or Fabric-native).
  • Release management: Use Git‑enabled Fabric workspaces and CI/CD to promote artifacts across environments with automated checks.
  • Support & enablement: Create runbooks, data dictionaries, and "How to use this report/model" guides; enable analysts and power users to self‑serve safely.

REQUIRED KNOWLEDGE, SKILLS, AND ABILITIES:

  • Microsoft Fabric experience (must‑have) across several of: Data Factory (pipelines), Lakehouse/Warehouse, OneLake, Notebooks (PySpark/Python), Dataflows Gen2, Power BI semantic models and report lifecycle.
  • Strong SQL (T‑SQL/Delta), dimensional modeling (star/snowflake), incremental load patterns (SCD, CDC), and performance tuning.
  • Experience building clean, usable Power BI datasets (calculations, relationships, RLS) and supporting reporting & visualization best practices.
  • Familiarity with Git workflows and CI/CD for analytics (branching, PRs, environment promotion).
  • Evidence of using AI to accelerate delivery (e.g., code assistants, Copilot, prompt‑assisted documentation/testing) and to enhance the end‑user experience.
  • Excellent communicator with the ability to translate business needs into technical solutions that execute the business strategy while keeping processes safe, secure, and performant across the organization.

PREFERRED EDUCATION AND EXPERIENCE:

  • 2–5+ years of hands-on data engineering, with clear examples of delivering business value in third-party logistics, order management, warehouse management, or transportation management in prior data engineering roles.

PHYSICAL CAPABILITIES AND REQUIREMENTS:

  • Ability to sit for extended periods within an office environment.
  • Ability to use hands and fingers for computer keyboarding and answer phone calls.
  • Ability to communicate via the telephone using speaking and hearing skills.

BENEFITS AND TOTAL REWARDS:

  • Competitive wages and opportunities for advancement.
  • Medical, Dental, Vision, Critical Illness, Accident, and Flexible Spending Plans available.
  • Company-paid Short/Long-term Disability, Life Insurance, and Employee Assistance plans.
  • Company-paid Time-Off (PTO), Sick Leave, and Holiday Pay.
  • Retirement 401(k) Plan with Discretionary Employer Match, and Profit Sharing.
  • Referral Bonus, Wellness Programs, Clothing Allowance, Safety Shoes, and Safety Glasses Reimbursement.