
Senior ETL Engineer

  • Industry Other
  • Category Production/Maintenance/Quality
  • Location Lalitpur District, Nepal
  • Expiry date Mar 01, 2026
Job Description

Job Title:

Senior ETL Engineer


Job Summary:

We are seeking a talented ETL/Data Engineer with strong experience building and maintaining cloud-based data pipelines and data models. This role will focus on enhancing and supporting ETL workflows primarily on Google Cloud Platform (GCP), while also working with Python-based data pipelines, relational databases such as PostgreSQL, and BigQuery. The ideal candidate is hands-on, detail-oriented, and comfortable collaborating directly with stakeholders to translate reporting and analytics requirements into scalable data solutions.


Job Description:

As a Senior ETL Engineer, your responsibilities will include:

  • Maintaining and improving GCP-hosted Cloud Function ETL workflows that ingest data and load it into BigQuery
  • Designing and implementing event-driven data pipelines using Google Pub/Sub to enable decoupled, resilient data ingestion and processing
  • Designing, building, and optimizing Python-based ETL pipelines for batch and near-real-time data processing
  • Collaborating with client stakeholders to understand both prototype and production environments within GCP infrastructure
  • Mapping and documenting existing ETL flows and data pipelines
  • Working with business teams to understand different data types, sources, and their business context
  • Partnering with reporting teams to understand dashboard requirements and near-term reporting needs
  • Identifying and understanding key data tables in BigQuery that support business reporting and analytics
  • Designing and implementing new BigQuery table structures while maintaining backward compatibility with legacy systems
  • Building and maintaining data transformation models using dbt or Dataform to implement a layered data architecture (raw, cleaned, curated)
  • Implementing dual-write strategies to ensure smooth transitions from legacy to new data models
  • Resolving data discrepancies by comparing outputs against legacy reporting systems
  • Enhancing ETL processes by adding post-processing functions to enrich data and replace reporting layer logic
  • Expanding data models with additional fields based on evolving business requirements
  • Assisting team members in authoring optimized queries to produce accurate results
  • Supporting or building data workflows orchestrated using Apache Airflow (good to have)
  • Documenting data models, processes, and transformations
  • Ensuring data integrity and accuracy throughout the pipeline
  • Collaborating closely with cross-functional teams to translate business requirements into technical solutions
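As a rough sketch of the first two responsibilities above (a Pub/Sub-triggered Cloud Function that decodes an event and shapes rows for a BigQuery load), the snippet below uses only hypothetical names — `handle_event`, the field names, and the `analytics.events_raw` table are illustrative, and the actual BigQuery client call is left as a comment since it depends on the project's setup:

```python
import base64
import json

def handle_event(event: dict) -> list[dict]:
    """Decode a Pub/Sub message envelope and shape rows for a BigQuery load.

    Pub/Sub delivers the payload base64-encoded under event["data"].
    """
    payload = json.loads(base64.b64decode(event["data"]).decode("utf-8"))
    records = payload if isinstance(payload, list) else [payload]
    rows = [
        {
            "event_id": rec["id"],                      # hypothetical source fields
            "event_type": rec.get("type", "unknown"),
            "occurred_at": rec.get("timestamp"),
        }
        for rec in records
    ]
    # In the real Cloud Function these rows would be streamed into BigQuery, e.g.:
    # bigquery.Client().insert_rows_json("project.analytics.events_raw", rows)
    return rows

# A single event shaped the way Pub/Sub would deliver it:
msg = {
    "data": base64.b64encode(
        json.dumps({"id": "e1", "type": "click", "timestamp": "2026-01-01T00:00:00Z"}).encode()
    ).decode()
}
print(handle_event(msg))
```

Keeping the decode/shape logic in a plain function like this (separate from the client call) also makes the transformation unit-testable without touching GCP.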


Job Specification:

Required Skills and Expertise:

  • Education: Bachelor's degree in Computer Science, Engineering, or a related field
  • Experience: 3+ years of experience in data engineering with proven working experience in Google Cloud Platform
  • Event-Driven Architecture: Experience with Google Pub/Sub or equivalent messaging systems for building asynchronous, event-driven data pipelines with dead-letter handling and delivery guarantees
  • ETL Development: Strong experience building, maintaining, and optimizing ETL pipelines in cloud environments
  • SQL Skills: Advanced SQL proficiency, including complex queries, performance optimization, and BigQuery-specific features
  • Programming: Strong experience in Python for building data pipelines and automation
  • Cloud Platforms: Working experience with GCP (BigQuery, Pub/Sub, Cloud Functions) or AWS services such as Lambda and RDS (PostgreSQL)
  • Data Modeling: Proven ability to design and implement efficient and scalable data models that support business reporting needs
  • Data Transformation: Hands-on experience with dbt or Google Dataform for SQL-based data modeling, including writing and maintaining tests, documentation, and incremental materialization strategies
  • Communication: Excellent written and verbal communication skills with ability to work directly with clients and translate technical concepts for non-technical stakeholders
  • Collaboration: Experience working with cross-functional teams including business analysts, reporting teams, and client stakeholders
  • Problem-Solving: Strong analytical skills with experience troubleshooting data discrepancies and pipeline issues
  • Documentation: Ability to create clear technical documentation for data flows, models, and processes
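To make the layered (raw, cleaned, curated) architecture mentioned in the responsibilities concrete: in practice these layers would be SQL models in dbt or Dataform, but the idea can be sketched in plain Python. All field names and the cleaning rules here are hypothetical:

```python
def clean(raw_rows: list[dict]) -> list[dict]:
    """Cleaned layer: drop malformed rows, normalise types and casing."""
    cleaned = []
    for row in raw_rows:
        if not row.get("order_id") or row.get("amount") is None:
            continue  # malformed source record is excluded from downstream layers
        cleaned.append({
            "order_id": str(row["order_id"]).strip(),
            "amount": float(row["amount"]),
            "status": str(row.get("status", "")).lower(),
        })
    return cleaned

def curate(cleaned_rows: list[dict]) -> dict:
    """Curated layer: a business-ready aggregate for reporting."""
    completed = [r for r in cleaned_rows if r["status"] == "complete"]
    return {
        "completed_orders": len(completed),
        "total_revenue": round(sum(r["amount"] for r in completed), 2),
    }

raw = [
    {"order_id": " 1001 ", "amount": "25.50", "status": "COMPLETE"},
    {"order_id": "1002", "amount": None, "status": "complete"},   # dropped: missing amount
    {"order_id": "1003", "amount": "10.00", "status": "Pending"},
]
print(curate(clean(raw)))  # → {'completed_orders': 1, 'total_revenue': 25.5}
```

In dbt or Dataform the same separation shows up as one model per layer (e.g. a raw staging model, a cleaned model selecting from it, and a curated model on top), with tests and documentation attached to each.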


Preferred Skills:

  • Experience with Looker and LookML - a strong plus
  • Experience with CI/CD pipelines for data engineering workflows
  • Experience with version control systems (Git)
  • Understanding of data warehouse design patterns and best practices
  • Experience with real-time data processing and streaming architectures
  • Experience with Apache Airflow for workflow orchestration
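For the Airflow orchestration item above, a minimal DAG definition looks like the following sketch. The DAG id, schedule, and task callables are hypothetical placeholders; a DAG file is essentially declarative pipeline configuration, so no real extract/transform/load logic is shown:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():    # placeholder task callables
    ...

def transform():
    ...

def load():
    ...

with DAG(
    dag_id="daily_orders_etl",        # hypothetical DAG id
    start_date=datetime(2026, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    t_extract >> t_transform >> t_load   # linear dependency chain
```

The `>>` operator declares task ordering, so Airflow runs extract, then transform, then load, retrying and backfilling according to the DAG's settings.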


Soft Skills:

  • Strong analytical thinking and problem-solving abilities
  • Excellent client-facing communication skills - this role works closely with client stakeholders
  • Ability to understand business requirements and translate them into technical solutions
  • Self-motivated with ability to work independently and manage multiple priorities
  • Attention to detail and commitment to data quality
  • Proactive approach to identifying and resolving potential issues
  • Collaborative mindset with ability to work effectively across teams
  • Continuous learning mindset to stay current with evolving technologies


What We Offer

  • Employee Health Insurance Plan – Because your well-being truly matters.
  • Office-Provided Healthy Lunch – Nutritious meals to keep you energised and focused at work.
  • Complimentary Beverages – Coffee and tea available throughout the day.
  • Five-Day Work Week – Maintain a healthy work-life balance with a structured Monday-to-Friday schedule.
  • Company Outings & Team Events – Regular activities that strengthen collaboration and team spirit.
  • Great Infrastructure & Recreational Areas – Work in a modern office with spaces designed to relax, recharge, and collaborate.
  • Celebrations & Recognition Milestones – We celebrate achievements, milestones, and the people who make them possible.
  • Career Growth Opportunities – Work alongside experienced professionals on impactful and exciting projects that help you grow your expertise.
  • Exciting Interest-Based Clubs – Connect, unwind, and grow with communities that match your interests.


Application Procedure:

Interested candidates may apply through LinkedIn Easy Apply or by emailing their resume to [email protected] with "Senior ETL Engineer" in the email subject.


If you’re shortlisted, our team will get in touch with you and guide you through the next steps.
