Senior Machine Learning Engineer (Python + SQL) BruntWork


  • Industry Other
  • Category Architect/Interior Designing
  • Location Nepal
  • Expiry date May 24, 2026
Job Description

Role Overview

Our client is seeking a Senior Machine Learning Engineer to architect and build a scalable recommendation and decision engine powered by real-time data pipelines and machine learning services. The role leads development of a Single Customer View platform that integrates transactional APIs, streaming data infrastructure, and cloud-native machine learning capabilities, and requires experience building high-performance ML platforms, automated MLOps pipelines, and real-time decisioning systems in modern cloud environments.


Schedule:

  • Flexible schedule during client business hours (Sydney, NSW); 40 hours per week


Responsibilities

Machine Learning Platform & Architecture

  • Productionize machine learning models in collaboration with Data Scientists, ensuring performance, scalability, and reliability.
  • Design and maintain MLOps pipelines that automate model training, testing, validation, and deployment.
  • Architect a Single Customer View platform that enables unified customer profiling and real-time decisioning.
  • Build scalable cloud-based data warehouse architectures supporting analytics and machine learning workloads.
  • Develop serverless functions and services using Python, Go, and SQL.
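One recurring pattern behind the automated validation and deployment bullets above is a promotion gate: a candidate model is deployed only if it beats the production model by a margin. A minimal sketch — the metric, names, and threshold here are illustrative assumptions, not the client's actual policy:

```python
def should_promote(candidate_auc: float, production_auc: float,
                   min_gain: float = 0.002) -> bool:
    """Promote the candidate model only if it beats production by a margin.

    The metric (AUC) and the minimum-gain threshold are illustrative
    assumptions; a real MLOps pipeline would pull these from its
    evaluation step and deployment policy.
    """
    return candidate_auc >= production_auc + min_gain
```

A gate like this typically runs as the final check in a CI/CD or MLOps pipeline before the deploy step is triggered.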

Data Engineering & Real-Time Processing

  • Design and implement high-performance streaming data pipelines for real-time analytics and decision engines.
  • Develop data ingestion pipelines using API integrations with retry and fault-tolerant mechanisms.
  • Build optimized data models supporting both transactional and analytical workloads.
  • Develop and maintain data orchestration workflows using modern workflow management tools.
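The "retry and fault-tolerant mechanisms" expectation above can be sketched with a generic retry wrapper using exponential backoff and jitter — a minimal stdlib-only illustration, with the fetch callable and retryable exception set left as hypothetical placeholders:

```python
import random
import time
from typing import Callable, TypeVar

T = TypeVar("T")

def with_retries(
    fn: Callable[[], T],
    max_attempts: int = 5,
    base_delay: float = 0.5,
    retryable: tuple = (ConnectionError, TimeoutError),
) -> T:
    """Call fn, retrying transient failures with exponential backoff and jitter.

    `fn` stands in for any ingestion call (e.g. an HTTP fetch); the retryable
    exception types are an assumption and would match the client library used.
    """
    for attempt in range(1, max_attempts + 1):
        try:
            return fn()
        except retryable:
            if attempt == max_attempts:
                raise  # retries exhausted: surface the error to the caller
            # Back off exponentially, with jitter to avoid thundering herds
            # when many workers retry against the same upstream API.
            time.sleep(base_delay * (2 ** (attempt - 1)) * random.uniform(0.5, 1.5))
    raise RuntimeError("unreachable")
```

Wrapping each API call this way keeps transient upstream failures from propagating into the pipeline, while still failing loudly once retries are exhausted.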

Infrastructure, DevOps & Reliability

  • Implement and maintain Infrastructure-as-Code (IaC) using Terraform to provision and manage cloud infrastructure.
  • Build and optimize CI/CD pipelines to enable reliable and automated deployments.
  • Apply Site Reliability Engineering (SRE) practices to maintain system availability and performance.
  • Implement monitoring, alerting, and incident response systems for distributed applications.
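As a hedged illustration of the monitoring-and-alerting bullet above, here is a toy sliding-window error-rate check of the kind an SRE-style alerting rule encodes — window size and threshold are illustrative assumptions, not a stated SLO:

```python
from collections import deque

class ErrorRateAlert:
    """Fire an alert when the error rate over the last N requests exceeds a threshold.

    Window size and threshold are illustrative; a real system would derive
    them from the service's SLO and emit to a pager/alerting backend.
    """

    def __init__(self, window: int = 100, threshold: float = 0.05):
        self.window = deque(maxlen=window)  # 1 = error, 0 = success
        self.threshold = threshold

    def record(self, ok: bool) -> bool:
        """Record one request outcome; return True if the alert should fire."""
        self.window.append(0 if ok else 1)
        error_rate = sum(self.window) / len(self.window)
        return error_rate > self.threshold
```

In production this logic usually lives in a metrics backend (e.g. a Cloud Monitoring alerting policy) rather than application code, but the windowed-threshold idea is the same.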

Security, Compliance & Cost Optimization

  • Implement secure identity and authentication frameworks for distributed cloud workloads.
  • Ensure infrastructure and data pipelines comply with data privacy and security standards, including GDPR.
  • Drive cloud cost optimization initiatives, including serverless architecture adoption and infrastructure rightsizing.
  • Maintain clear and comprehensive technical documentation, including architecture diagrams and data lineage.


Requirements

Technical Skills

  • Expert-level proficiency in Python and SQL
  • Strong scripting experience using Shell
  • Working knowledge of Go

Cloud & Data Platforms

  • Strong experience working with Google Cloud Platform (GCP) services, including:
      • BigQuery
      • Vertex AI
      • Cloud Dataflow
      • Cloud Run
      • Google Kubernetes Engine (GKE)
      • Pub/Sub

Data Engineering

  • Experience building large-scale data pipelines and streaming architectures
  • Experience developing data warehouses and analytical data models
  • Experience building real-time data processing systems

Infrastructure & DevOps

  • Strong experience implementing Infrastructure-as-Code using Terraform
  • Experience managing CI/CD pipelines using GitHub Actions or GitLab
  • Experience with Cloud Build or Jenkins

Databases

  • Hands-on experience with PostgreSQL-based databases
  • Experience with NoSQL databases

Data & Analytics Tools

  • Experience working with Airflow (DAG development)
  • Experience using Looker / LookML
  • Experience developing and managing API integrations
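The dependency-ordering idea behind Airflow DAG development can be sketched without Airflow itself, using the standard library's topological sorter — task names here are hypothetical, standing in for the `extract >> transform >> load` wiring an Airflow DAG would declare:

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline tasks mapped to their upstream dependencies,
# mirroring how an Airflow DAG wires extract >> transform >> load >> report.
deps = {
    "extract": set(),
    "transform": {"extract"},
    "load": {"transform"},
    "report": {"load"},
}

def run_order(graph: dict) -> list:
    """Return one valid execution order that respects upstream dependencies."""
    return list(TopologicalSorter(graph).static_order())
```

An orchestrator like Airflow adds scheduling, retries, and backfills on top, but the core contract is exactly this: no task runs before everything upstream of it has completed.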

Nice-to-Have Skills

  • Experience designing recommendation systems or ML-driven decision engines
  • Experience building real-time personalization platforms
  • Experience with distributed systems and event-driven architectures
  • Experience implementing serverless architectures
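For a feel of the recommendation-system experience listed above, the simplest baseline is item co-occurrence: recommend what is most often seen alongside a target item. A toy stdlib sketch with hypothetical basket data — real systems would use learned embeddings or ranking models, typically served via Vertex AI:

```python
from collections import Counter

def co_occurrence_recs(baskets: list, target: str, k: int = 3) -> list:
    """Recommend the k items most often co-occurring with `target`.

    A deliberately naive baseline: counts co-occurrences across baskets.
    Production decision engines replace this with learned models.
    """
    counts = Counter()
    for basket in baskets:
        if target in basket:
            for item in basket:
                if item != target:
                    counts[item] += 1
    return [item for item, _ in counts.most_common(k)]
```

Even this baseline exposes the core serving questions the role covers: keeping counts fresh from streaming events, and returning recommendations at low latency.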

Preferred Certifications

  • GCP Professional Cloud Architect
  • GCP Professional Data Engineer
  • AWS Certified Solutions Architect
  • AWS Certified Data Engineer


Independent Contractor Perks

  • Permanent work from home
  • Immediate hiring
  • Health Insurance Coverage for eligible locations


Note

  • Please click the "Apply" button to complete your application, including the assessment questions, technical check, and voice recording. Your hourly rate will be set based on your performance in the application process; submissions with all requirements fulfilled receive priority review.
