Data Engineer at Atlas Technica

Work set-up: Full Remote
Experience: Senior (5-10 years)

Offer summary

Qualifications:

  • Proven experience as a Data Engineer with 5+ years in the field.
  • Strong proficiency in Python 3 programming, including object-oriented and functional paradigms.
  • Experience designing and implementing ETL workflows and working with SQL databases, especially MSSQL.
  • Knowledge of Azure cloud services, including Azure Functions, Storage Accounts, and SQL Database.

Key responsibilities:

  • Build and maintain efficient ETL workflows using Python.
  • Develop and deploy Azure cloud-native solutions, including Azure Functions and databases.
  • Collaborate with cross-functional teams to ensure data pipeline reliability and security.
  • Implement data quality checks, optimize database schemas, and troubleshoot complex data issues.

Atlas Technica (Scaleup, 201-500 Employees)
https://www.atlastechnica.com/

Job description

Position Name: Data Engineer
Reports to: Cloud DevOps Manager
Location/Type: Full-time Remote

Atlas Technica's mission is to shoulder the burden of IT management, user support, and security compliance for our clients, so they can focus on investments and other areas of business management. Founded in 2016, we have grown 100% year over year through our uncompromising focus on service.

We value ownership, execution, growth, intelligence, and camaraderie. We are looking for people who share our Core Values and who thrive in, and contribute to, this environment while putting the customer first. At Atlas Technica, we offer a competitive salary, comprehensive benefits, and great perks to our global Team. We strive to maintain a professional yet friendly environment while promoting professional and career development for our Team Members. Join Atlas Technica now!

As a Data Engineer, you will be responsible for designing, implementing, and maintaining robust data pipelines and cloud-native solutions that support scalable analytics and operational efficiency. This role requires deep expertise in Python programming, Azure cloud services, and SQL-based data modeling, with a strong emphasis on automation, reliability, and security.

Responsibilities:

  • Build and maintain efficient ETL workflows using Python 3, applying both object-oriented and functional paradigms (a minimal sketch appears after this list).
  • Write comprehensive unit, integration, and end-to-end tests; troubleshoot complex Python traces.
  • Automate deployment and integration processes.
  • Develop Azure Functions; configure and deploy Storage Accounts and SQL Databases.
  • Design relational schemas, optimize queries, and manage advanced MSSQL features including temporal tables, external tables, and row-level security.
  • Author and maintain stored procedures, views, and functions.
  • Collaborate with cross-functional teams to ensure data pipeline reliability and security.
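
For illustration only (this is not part of the role's requirements, and the function name, route, and field names below are hypothetical), the sketch shows the kind of Python 3 ETL step, exposed as an Azure Function, that the responsibilities above describe. It assumes the azure-functions v2 programming model:

    import json
    import logging
    from dataclasses import dataclass

    import azure.functions as func

    app = func.FunctionApp()

    @dataclass(frozen=True)
    class TradeRecord:
        # Hypothetical target shape; real schemas depend on the source system.
        symbol: str
        quantity: int
        price: float

    def transform(raw: dict) -> TradeRecord:
        # The "T" in ETL: coerce raw JSON fields into native Python types.
        return TradeRecord(
            symbol=str(raw["symbol"]).upper(),
            quantity=int(raw["qty"]),
            price=float(raw["price"]),
        )

    @app.route(route="ingest", auth_level=func.AuthLevel.FUNCTION)
    def ingest(req: func.HttpRequest) -> func.HttpResponse:
        # HTTP-triggered entry point: extract the payload, transform it, then hand off to a loader.
        try:
            records = [transform(item) for item in req.get_json()]
        except (ValueError, KeyError, TypeError) as exc:
            logging.warning("Rejected malformed payload: %s", exc)
            return func.HttpResponse("Invalid payload", status_code=400)
        # Loading (for example into Azure SQL Database) would happen here.
        return func.HttpResponse(json.dumps({"ingested": len(records)}), status_code=200)

Only the trigger decorator would change for Blob- or Queue-triggered variants of the same handler.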

Requirements:

  • English level – B2 or higher
  • 5+ years of proven experience as a Data Engineer
  • Programming
    • Proficient in Python 3, with both object-oriented and functional paradigms
    • Design and implement ETL workflows using sensible code patterns
    • Discover, navigate, and understand third-party library source code
    • Author unit, integration, and end-to-end tests for new or existing ETL pipelines (pytest, fixtures, mocks, monkey patching); a testing sketch follows this list
    • Ability to troubleshoot esoteric Python traces encountered in the terminal, logs, or debugger
  • Tooling & Automation
    • Git for version control and branching strategies
    • Unix-like shells (*nix-based OS) in cloud environments
    • Author CI/CD configs and scripts (JSON, YAML, Bash, PowerShell)
  • Cloud & Serverless Patterns
    • Develop Azure Functions (HTTP, Blob, Queue triggers) using the azure-functions SDK
    • Implement concurrency and resilience (thread pools, tenacity, rate limiters); a resilience sketch also follows this list
  • Azure SDKs & Services
    • Deploy and configure:
      • Functions, Web Apps & App Service Plans
      • Storage Accounts, Communication Services
      • SQL Database / Managed Instance
  • Data Security and Reliability
    • Maintain strict secrets and access discipline
    • Implement data quality checks and validation steps
  • Database Administration
    • Relational data modeling & schema design
    • Data partitioning strategies & temporal tables (system-versioned)
    • Query performance tuning (indexes, execution plans)
    • Selection of optimal data types
    • Complex T-SQL (windowing, CTEs, advanced joins)
    • Advanced MSSQL features (External Tables, Row-Level Security)
  • SQL Objects & Schema Management
    • Author and maintain tables, views, Stored Procedures, Functions, and external tables (PolyBase)
  • Strong analytical and problem-solving skills, with meticulous attention to detail
  • Strong technical documentation skills
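
As a purely illustrative testing sketch (pytest fixtures plus monkeypatching, as listed above), assuming a hypothetical etl module that exposes fetch(), extract(), and transform() helpers; none of these names come from the posting itself:

    # test_etl.py
    import pytest

    import etl  # hypothetical module under test

    @pytest.fixture
    def raw_rows():
        # Small, deterministic input shared by several tests.
        return [{"symbol": "abc", "qty": "3", "price": "10.5"}]

    def test_transform_coerces_types(raw_rows):
        record = etl.transform(raw_rows[0])
        assert record.symbol == "ABC"
        assert record.quantity == 3
        assert record.price == pytest.approx(10.5)

    def test_extract_surfaces_api_failures(monkeypatch):
        # Monkeypatch the network call so the test never leaves the process.
        def fake_fetch(url, timeout=10):
            raise TimeoutError("simulated outage")

        monkeypatch.setattr(etl, "fetch", fake_fetch)
        with pytest.raises(TimeoutError):
            etl.extract("https://example.invalid/trades")

Integration and end-to-end variants would typically swap the monkeypatched boundary for a disposable database or a deployed test slot while keeping the same assertions.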
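
Similarly hedged, a resilience sketch of the concurrency pattern named above (a bounded thread pool plus tenacity retries); the URLs, limits, and helper names are illustrative only:

    import concurrent.futures as cf
    import urllib.request

    from tenacity import retry, stop_after_attempt, wait_exponential

    @retry(
        stop=stop_after_attempt(5),                   # give up after five tries
        wait=wait_exponential(multiplier=1, max=30),  # back off 1s, 2s, 4s, ... capped at 30s
        reraise=True,
    )
    def fetch_page(url: str) -> bytes:
        # Hypothetical extraction call; transient timeouts trigger a retry.
        with urllib.request.urlopen(url, timeout=10) as resp:
            return resp.read()

    def fetch_all(urls: list[str], max_workers: int = 8) -> list[bytes]:
        # A bounded thread pool keeps concurrency, and pressure on the upstream API, in check.
        with cf.ThreadPoolExecutor(max_workers=max_workers) as pool:
            return list(pool.map(fetch_page, urls))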


Desirable Qualities:

  • Programming
    • Develop and maintain custom API client modules with robust retry/error-handling/throttling
  • Tooling & Automation
    • Author Dockerfiles and Docker Compose stacks
  • Cloud & Serverless Patterns
    • Use Azure Functions Core Tools for local development and testing
    • Integrate custom logging and monitoring frameworks (Application Insights, custom metrics)
  • Azure SDKs & Services
    • Implement and maintain authentication using Azure identity systems (DefaultAzureCredential and MSAL); see the sketch after this list
    • Work with Key Vault (secrets, certificates), Storage (blob, container, queue), and Communication Services
    • Manage resources via ARM templates and Azure DevOps pipelines
  • Data Security and Reliability
    • Enforce sensible type and schema validation (pydantic)
    • Build reusable transformation utilities (key renaming, field splitting, list unpacking, "string to native" type coercions)
  • SQL Objects & Schema Management
    • Implement multiple types of Slowly Changing Dimensions (SCDs)
    • Define and utilize User-Defined Types (UDTs) and table-valued parameters
  • Microsoft certifications
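
As a hedged illustration of the Azure identity items above (not a requirement of this posting; the vault URL and secret name are placeholders), a minimal sketch of retrieving a connection string with DefaultAzureCredential and the Key Vault secrets client:

    from azure.identity import DefaultAzureCredential
    from azure.keyvault.secrets import SecretClient

    # DefaultAzureCredential tries environment variables, managed identity,
    # and local developer logins (e.g. `az login`) in order.
    credential = DefaultAzureCredential()

    client = SecretClient(
        vault_url="https://<your-vault-name>.vault.azure.net/",  # placeholder vault
        credential=credential,
    )

    # Hypothetical secret name; in practice this would hold the SQL connection string.
    sql_connection_string = client.get_secret("sql-connection-string").value

Keeping credentials out of code and configuration this way is one common reading of the "strict secrets and access discipline" requirement above.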

Required profile

Experience

Level of experience: Senior (5-10 years)
Spoken language(s): English

Other Skills

  • Problem Solving
  • Analytical Skills
