ETL Expert / Senior ETL Developer

Department: Big Data Analytics
Location: Remote (Work from Home)
Experience Required: 6+ years in ETL design, development, and data integration
Reports To: Head of Data Engineering

Job Summary

The ETL Expert is responsible for designing, developing, and maintaining robust and scalable data integration workflows that extract, transform, and load data from multiple sources into enterprise data warehouses or analytical platforms. This role requires deep technical expertise in ETL tools, strong SQL and data modeling skills, and a solid understanding of business intelligence and data governance principles.

Key Responsibilities

ETL Development & Data Integration
  • Design, develop, and maintain ETL processes to extract, transform, and load data from various structured and unstructured data sources.
  • Implement data pipelines using industry-standard ETL tools (e.g., Informatica, Talend, SSIS, DataStage, Pentaho, or cloud ETL services like AWS Glue / Azure Data Factory).
  • Create and optimize complex SQL queries, stored procedures, and scripts for data transformation.
  • Ensure data integrity, consistency, and reliability throughout the ETL lifecycle.
  • Build reusable and parameterized ETL frameworks for automation and scalability (see the sketch below).
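
For illustration, a minimal sketch of the kind of reusable, parameterized ETL step described in the list above, written in Python with the standard-library sqlite3 module standing in for a real source system and warehouse; the table and column names (src_orders, dw_orders, customer, amount) and the threshold parameter are hypothetical.

```python
# Illustrative only: a parameterized extract-transform-load step using Python's
# built-in sqlite3 as a stand-in for a production source and warehouse.
# All table/column names (src_orders, dw_orders) are hypothetical.
import sqlite3

def run_etl_step(src_conn, dw_conn, src_table, target_table, min_amount):
    """Extract rows above a threshold, normalise them, and load to the target."""
    # Extract: parameterized query keeps the step reusable across runs and tables.
    rows = src_conn.execute(
        f"SELECT id, customer, amount FROM {src_table} WHERE amount >= ?",
        (min_amount,),
    ).fetchall()

    # Transform: trim and upper-case customer names, round amounts to 2 decimals.
    cleaned = [(i, c.strip().upper(), round(a, 2)) for i, c, a in rows]

    # Load: idempotent upsert into the target table.
    dw_conn.execute(
        f"CREATE TABLE IF NOT EXISTS {target_table} "
        "(id INTEGER PRIMARY KEY, customer TEXT, amount REAL)"
    )
    dw_conn.executemany(
        f"INSERT OR REPLACE INTO {target_table} VALUES (?, ?, ?)", cleaned
    )
    dw_conn.commit()
    return len(cleaned)

if __name__ == "__main__":
    src = sqlite3.connect(":memory:")
    dw = sqlite3.connect(":memory:")
    src.execute("CREATE TABLE src_orders (id INTEGER, customer TEXT, amount REAL)")
    src.executemany(
        "INSERT INTO src_orders VALUES (?, ?, ?)",
        [(1, " acme ", 120.456), (2, "globex", 5.0), (3, "initech", 99.999)],
    )
    loaded = run_etl_step(src, dw, "src_orders", "dw_orders", min_amount=50)
    print(f"Loaded {loaded} rows")  # -> Loaded 2 rows
```

In practice the same pattern would be expressed in the team's chosen ETL tool or orchestration framework rather than raw Python.
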
Data Architecture & Modeling
  • Work closely with data architects to design and implement data warehouse and data mart schemas (e.g., star, snowflake); see the schema sketch after this list.
  • Support data modeling, metadata management, and master data management (MDM).
  • Participate in data profiling, cleansing, and quality assurance.
  • Develop and maintain data lineage, mapping, and documentation.
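
As a point of reference for the star-schema work mentioned above, a compact sketch of one fact table with two dimension tables, created here in an in-memory SQLite database; the schema (dim_date, dim_customer, fact_sales) is hypothetical and purely illustrative.

```python
# Illustrative only: a compact star schema (one fact table, two dimensions)
# of the kind referenced above, created in an in-memory SQLite database.
# Table and column names (dim_date, dim_customer, fact_sales) are hypothetical.
import sqlite3

STAR_SCHEMA_DDL = """
CREATE TABLE dim_date (
    date_key     INTEGER PRIMARY KEY,   -- surrogate key, e.g. 20240131
    full_date    TEXT NOT NULL,
    month        INTEGER NOT NULL,
    year         INTEGER NOT NULL
);

CREATE TABLE dim_customer (
    customer_key INTEGER PRIMARY KEY,   -- surrogate key
    customer_id  TEXT NOT NULL,         -- natural/business key
    region       TEXT
);

CREATE TABLE fact_sales (
    sale_id      INTEGER PRIMARY KEY,
    date_key     INTEGER NOT NULL REFERENCES dim_date(date_key),
    customer_key INTEGER NOT NULL REFERENCES dim_customer(customer_key),
    quantity     INTEGER NOT NULL,
    amount       REAL NOT NULL
);
"""

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.executescript(STAR_SCHEMA_DDL)
    tables = conn.execute(
        "SELECT name FROM sqlite_master WHERE type = 'table' ORDER BY name"
    ).fetchall()
    print([t[0] for t in tables])  # -> ['dim_customer', 'dim_date', 'fact_sales']
```
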
Performance & Optimization
  • Monitor ETL workflows and troubleshoot performance bottlenecks.
  • Implement error handling, logging, and recovery mechanisms for robust ETL jobs (see the sketch after this list).
  • Optimize data extraction and transformation logic for large-scale and real-time data processing.
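
A minimal sketch of the error-handling, logging, and retry pattern referred to above, using only the Python standard library; the flaky_load step and its failure behaviour are simulated stand-ins for a real load job.

```python
# Illustrative only: retry-with-backoff and structured logging around an ETL
# step, sketching the error-handling/recovery pattern described above.
# The flaky_load() step and its failure rate are hypothetical.
import logging
import random
import time

logging.basicConfig(level=logging.INFO, format="%(asctime)s %(levelname)s %(message)s")
log = logging.getLogger("etl")

def run_with_retries(step, max_attempts=3, backoff_seconds=1.0):
    """Run an ETL step, retrying transient failures with exponential backoff."""
    for attempt in range(1, max_attempts + 1):
        try:
            result = step()
            log.info("step %s succeeded on attempt %d", step.__name__, attempt)
            return result
        except Exception:
            log.exception("step %s failed on attempt %d", step.__name__, attempt)
            if attempt == max_attempts:
                raise  # surface the failure so the scheduler can alert/recover
            time.sleep(backoff_seconds * 2 ** (attempt - 1))

def flaky_load():
    # Stand-in for a load step that sometimes hits a transient error.
    if random.random() < 0.3:
        raise ConnectionError("simulated transient warehouse outage")
    return 42

if __name__ == "__main__":
    print(run_with_retries(flaky_load))
```
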
Collaboration & Governance
  • Work with data analysts, BI developers, and business users to understand data requirements.
  • Ensure compliance with data governance, privacy, and security standards.
  • Contribute to continuous improvement by identifying opportunities for process automation and tool enhancement.
  • Mentor junior developers and assist in code reviews and technical training.

Required Skills & Qualifications

  • Bachelor’s or Master’s degree in Computer Science, Information Systems, or related field.
  • 6+ years of hands-on experience in ETL design and development.
  • Expertise in one or more ETL tools (Informatica, Talend, SSIS, DataStage, Pentaho, etc.).
  • Strong proficiency in SQL, PL/SQL, and database technologies (Oracle, SQL Server, PostgreSQL, etc.).
  • Solid understanding of data warehousing concepts, data lakes, and dimensional modeling.
  • Experience working with large datasets and optimizing ETL for performance.
  • Familiarity with cloud data platforms (AWS Redshift, Azure Synapse, Snowflake, BigQuery).

Preferred Skills

  • Experience with Python, Shell scripting, or other scripting languages for data automation.
  • Exposure to big data technologies (Hadoop, Spark, Kafka).
  • Understanding of CI/CD for data pipelines and version control (Git).
  • Experience with data visualization tools (Tableau, Power BI) for end-to-end data delivery awareness.
  • Certification in ETL or cloud data engineering (e.g., AWS Data Engineer, Informatica Certified Developer).

Soft Skills

  • Strong analytical and problem-solving skills.
  • Excellent communication and collaboration skills.
  • Ability to manage multiple projects and deadlines.
  • Detail-oriented with a focus on data accuracy and quality.
  • Proactive mindset with a passion for automation and continuous improvement.

Job Type: Full-Time / Part-Time
Job Location: Remote (Work from Home)
