
Data Lakehouse And AI/ML Curriculum Engineer - Remote Eligible

Develop and deliver hands-on training modules for data lakehouse and AI/ML platforms
Remote
Senior
MinIO

Provides high-performance, S3-compatible object storage software for cloud-native, Kubernetes, and enterprise data infrastructure.

Data Lakehouse And AI/ML Curriculum Engineer

MinIO is the industry leader in high-performance object storage and the company behind the world's fastest, most widely deployed object store, powering production infrastructure for more than half of the Fortune 500, including 9 of the 10 largest global automakers and all 10 of the largest U.S. banks. Our enterprise offering, AIStor, is engineered to handle the scale, speed, and pressure of modern AI and analytics, from terabytes to exabytes, all in a single namespace.

The MinIO team is looking for an expert in data lakehouses who also has experience with AI/ML platforms (Kubeflow, H2O, MLflow) to join us as a data-focused Curriculum Engineer. This is a technical role in which you will build hands-on training that teaches how MinIO interacts with other technologies, both at a product-specific level (MongoDB, SQL Server, Snowflake) and more generally (open table formats such as Iceberg, Delta Lake, and Hudi). In addition to creating training content and hands-on labs, you will be responsible for recording videos of the content, which will be edited and published on YouTube and on MinIO Academy, our online learning portal. The training is also delivered live to our enterprise customers and strategic partners.

You will be part of a team of technologists, video editors, and motion graphic designers building a world-class training portfolio for MinIO. This role works hand in hand with MinIO engineering, support, and technical marketing to tell the Kubernetes-native, multi-cloud story of MinIO object storage as it relates to the data store.

Previous experience as a data lakehouse engineer or data architect is required, as is experience building ML pipelines. Previous curriculum development and/or training delivery experience is helpful but not required. You will need strong oral and written communication skills to make our training relevant and easy to consume.

What You Will Do:

  1. Build real-world, use-case-driven labs that establish best practices for using MinIO
  2. Support the labs with clear, concise training content, including guided exercises and practical lab materials
  3. Package the labs and content into short, outcome-oriented training modules
  4. Record videos of modules
  5. Learn new platforms and technologies quickly to become a subject matter expert

Your Skills And Experience:

  • 5+ years as a data engineer or data scientist
  • 3+ years of experience with software-defined storage
  • 3+ years using Amazon AWS, including an understanding of S3 and the S3 API
  • 3+ years building and maintaining workflows in Git
  • 1+ years developing ML pipelines
  • Solid foundation of Linux skills
  • Strong oral and written communication skills
  • Previous experience delivering webinars and/or recorded technical content is a plus

What We Offer:

  • Health Care Plan (Medical, Dental & Vision)
  • 401(k) with 3% Contribution
  • Pre-IPO Stock Options
  • At least 12 Public Holidays
  • Flexible Time Off

Equal Opportunity Policy (EEO):

MinIO is proud to be an equal opportunity workplace and an affirmative action employer. We review applications for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, ancestry, citizenship, age, veteran status, genetic information, physical or mental disability, medical condition, marital status, or any other basis prohibited by law.
