Data ETL Engineer

Work set-up: Full Remote
Experience: Mid-level (2-5 years)
Offer summary

Qualifications:

  • Bachelor's or master's degree in Computer Science, Information Systems, or a related field.
  • At least 5 years of experience in ETL/ELT design and development.
  • Proficiency in SQL, PL/SQL, and Python for data engineering tasks.
  • Experience with cloud-based data warehouses such as Snowflake, Redshift, or similar.

Key responsibilities:

  • Develop and test ELT/ETL solutions using industry-standard tools.
  • Extract and integrate data from multiple sources into a common data model.
  • Collaborate with QA teams to debug and ensure timely delivery of data solutions.
  • Continuously explore new technologies to improve data processes.

IDT Corporation — http://www.idt.net
1001 - 5000 Employees

Job description

IDT (www.idt.net) is a communications and financial services company founded in 1990 and headquartered in New Jersey, US. Today it is an industry leader in prepaid communication and payment services and one of the world’s largest international voice carriers. We are listed on the NYSE, employ over 1500 people across 20+ countries, and have revenues in excess of $1.5 billion.

We are looking for a Mid-level Business Intelligence Engineer to join our global team. If you are highly intelligent, motivated, ambitious, and ready to learn and make a direct impact, this is your opportunity! The individual in this role will perform data analysis and ELT/ETL design and support functions to deliver on strategic initiatives and meet organizational goals across many lines of business.

* The interview process will be conducted in English.


Responsibilities:
  • Develop, document, and test ELT/ETL solutions using industry-standard tools (Snowflake, Denodo Data Virtualization, Looker).
  • Recommend process improvements to increase efficiency and reliability in ELT/ETL development.
  • Extract data from multiple sources, integrate disparate data into a common data model, and load data into a target database, application, or file using efficient ELT/ETL processes.
  • Collaborate with Quality Assurance resources to debug ELT/ETL development and ensure the timely delivery of products.
  • Willingness to explore and learn new technologies and concepts to provide the right kind of solution.
  • Target- and results-oriented with a strong end-user focus.
  • Effective oral and written communication skills with the BI team and user community.

Requirements:
  • 5+ years of experience in ETL/ELT design and development, integrating data from heterogeneous OLTP systems and API solutions, and building scalable data warehouse solutions to support business intelligence and analytics.
  • Demonstrated experience using Python for data engineering tasks, including transformation, advanced data manipulation, and large-scale data processing.
  • Experience in data analysis and root cause analysis, with proven problem-solving and analytical-thinking capabilities.
  • Experience designing complex data pipelines that extract data from RDBMS, JSON, API, and flat-file sources.
  • Demonstrated expertise in SQL and PL/SQL programming, with advanced mastery of business intelligence and data warehouse methodologies, along with hands-on experience in one or more relational database systems and cloud-based database services such as Oracle, MySQL, Amazon RDS, Snowflake, Amazon Redshift, etc.
  • Proven ability to analyze and optimize poorly performing queries and ETL/ELT mappings, providing actionable recommendations for performance tuning.
  • Understanding of software engineering principles, skill working on Unix/Linux/Windows operating systems, and experience with Agile methodologies.
  • Proficiency in version control systems, with experience managing code repositories, branching, merging, and collaborating within a distributed development environment.
  • Excellent English communication skills.
  • Interest in business operations and a comprehensive understanding of how robust BI systems drive corporate profitability by enabling data-driven decision-making and strategic insights.

Pluses:
  • Experience developing ETL/ELT processes within Snowflake and implementing complex data transformations using built-in functions and SQL capabilities.
  • Experience using Pentaho Data Integration (Kettle) or Ab Initio ETL tools for designing, developing, and optimizing data integration workflows.
  • Experience designing and implementing cloud-based ETL solutions using Azure Data Factory, DBT, AWS Glue, Lambda, and open-source tools.
  • Experience with reporting/visualization tools (e.g., Looker) and job scheduler software.
  • Experience in Telecom, eCommerce, or International Mobile Top-up.
  • Education: BS/MS in Computer Science, Information Systems, or a related technical field, or equivalent industry expertise.
  • Preferred Certifications: AWS Solution Architect, AWS Cloud Data Engineer, Snowflake SnowPro Core.
  • Please attach your CV in English.

Only accepting applicants from INDIA.
Required profile

    Experience

    Level of experience: Mid-level (2-5 years)
    Spoken language(s): English

    Other Skills

    • Analytical Thinking
    • Collaboration
    • Communication
    • Problem Solving
