
Data Analyst

75% Flex
Remote: 
Full Remote
Experience: 
Mid-level (2-5 years)
Work from: 
Georgia (USA), North Carolina (USA)

Offer summary

Qualifications:

4+ years' experience with data pipelines; experience with iSeries/IBMi integration; experience with cloud technologies such as ADLS and Azure Databricks.

Key responsibilities:

  • Design and develop data pipelines in Dynamics365
  • Optimize data integration architecture and performance
  • Implement ETL processes and maintain database configuration
Airitos, LLC https://www.airitos.com
2 - 10 Employees

Job description


Your missions

Location: Remote during contracting period; onsite 3x/week in Issaquah, WA or Dallas, TX upon conversion to FTE 

Job Type: 3-month contract-to-hire (CTH) 

Job Description: 


The Data Engineer is responsible for developing and maintaining data pipelines and/or data integrations across the In Warehouse Manufacturing stack. This includes data ingestion, data transformation, data validation/quality, data pipeline optimization, and orchestration, as well as engaging with the DevOps Engineer during CI/CD. This is a new, fast-paced, highly visible team that supports the business goal of being an industry leader in this space. The role is focused on data engineering: building and delivering automated data pipelines to and from various data sources, primarily between iSeries and Microsoft Dynamics 365. The Data Engineer will partner with product owners, engineering, and data platform teams as needed to design, build, test, and automate data pipelines that are relied upon across the company as the single source of truth.
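
By way of illustration only, a minimal PySpark sketch of the kind of ingestion pipeline described above, assuming an iSeries (DB2 for i) source reached over JDBC with the IBM JTOpen driver and a Delta landing zone in ADLS Gen2; every hostname, table name, and path here is hypothetical:

# Hypothetical sketch of a single iSeries -> ADLS ingestion step.
# Assumes a Databricks/Spark cluster with the IBM JTOpen JDBC driver installed
# and credentials for the ADLS Gen2 account already configured.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("iseries_ingest_sketch").getOrCreate()

# Ingest: read a source table from the iSeries (DB2 for i) over JDBC.
orders = (
    spark.read.format("jdbc")
    .option("url", "jdbc:as400://iseries.example.internal;naming=sql")  # hypothetical host
    .option("driver", "com.ibm.as400.access.AS400JDBCDriver")
    .option("dbtable", "WHSLIB.ORDERS")                                 # hypothetical library/table
    .option("user", "svc_etl")
    .option("password", "<secret>")                                     # use a secret scope in practice
    .load()
)

# Transform: light typing and de-duplication before landing the data.
orders_clean = (
    orders
    .withColumn("order_ts", F.to_timestamp("ORDER_TS"))
    .withColumnRenamed("ORDER_ID", "order_id")
    .dropDuplicates(["order_id"])
)

# Load: write a Delta table to the ADLS Gen2 landing zone (hypothetical path).
(
    orders_clean.write.format("delta")
    .mode("overwrite")
    .save("abfss://landing@datalakeacct.dfs.core.windows.net/warehouse_mfg/orders")
)

The onward load into Dynamics 365 would go through whatever integration mechanism the team standardizes on and is not shown here.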

Job Duties/Essential Functions: 
  • Build or extend data models in Dynamics365
  • Integration architecture and performance optimization
  • Integration platform engineering, administration, and support
  • Azure DevOps component to build VMs for Secure Agents (i.e. Gateways)
  • Data mapping (Source to D365 & D365 to source)
  • Upstream/downstream data flows architecture and design (mainly iSeries, AS400 IBMi as source and destination for data)
  • Develops data pipelines that store data in defined data models and structures so it is usable within applications or by other business reporting applications and dashboards
  • Identifies ways to improve data reliability, efficiency, and quality of data management.
  • Assesses the integrity of data from multiple sources (a minimal validation sketch follows this list).
  • Manages database configuration including installing and upgrading software and maintaining relevant documentation.
  • Develops and operationalizes data pipelines to create enterprise certified data sets that are made available for consumption.
  • Designs, develops, & implements ETL/ELT processes
  • Uses Azure services such as Azure SQL DW (Synapse), ADLS, Azure Event Hub, Cosmos DB, Databricks, and Delta Lake to improve and speed up delivery of our data products and services.
  • Implements big data and NoSQL solutions by developing scalable data processing platforms to drive high-value insights to the organization.
  • Identifies, designs, and implements internal process improvements: automating manual processes, optimizing data delivery.
  • Communicates technical concepts to non-technical audiences both in written and verbal form.
  • Regular and reliable workplace attendance at your assigned location.
  • Ability to operate vehicles, equipment or machinery.
  • Uses standard office equipment: computer, phone, printer, copier, fax
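
As a small companion to the data validation/quality duties above, and again only a sketch under assumed names, basic integrity checks on the landed Delta table might look like this:

# Hypothetical data-quality checks against the Delta table landed in the sketch above.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dq_checks_sketch").getOrCreate()

orders = spark.read.format("delta").load(
    "abfss://landing@datalakeacct.dfs.core.windows.net/warehouse_mfg/orders"
)

# Row count: fail fast if the load produced an empty table.
row_count = orders.count()
assert row_count > 0, "orders table is empty"

# Null check on the business key.
null_keys = orders.filter(F.col("order_id").isNull()).count()
assert null_keys == 0, f"{null_keys} rows have a null order_id"

# Uniqueness check on the business key.
duplicates = row_count - orders.select("order_id").distinct().count()
assert duplicates == 0, f"{duplicates} duplicate order_id values found"

print(f"Checks passed: {row_count} rows, order_id is non-null and unique")

In practice such checks would run as a step in the orchestrated pipeline and publish their results rather than simply assert.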
Required:
  • 4+ years’ experience engineering and operationalizing data pipelines with large and complex datasets.
  • 2+ years’ hands-on experience with iSeries / IBMi integration
  • 3+ years’ experience working with Cloud technologies such as ADLS, Azure Databricks, Spark, Azure Synapse, Cosmos DB and other big data technologies.
  • Extensive experience working with various data sources: DB2, SQL, Oracle, flat files (CSV, delimited), APIs, XML, JSON.
  • Experience with middleware solutions, such as IBM ACE
  • Advanced SQL skills required. Solid understanding of relational databases and business data; ability to write complex SQL queries against a variety of data sources (an illustrative query follows this list).
  • 3+ years’ experience with Data Modeling and ETL
  • Strong understanding of database storage concepts (data lake, relational databases, NoSQL, or Realm)
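
For the complex-SQL requirement above, one illustrative (and entirely hypothetical) query run through Spark SQL against the tables from the earlier sketches:

# Hypothetical example of the kind of SQL the role calls for, executed via Spark SQL.
# Assumes the orders table from the earlier sketches plus a customers reference table;
# all table, column, and path names are illustrative.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("sql_example_sketch").getOrCreate()

base = "abfss://landing@datalakeacct.dfs.core.windows.net/warehouse_mfg"
spark.read.format("delta").load(f"{base}/orders").createOrReplaceTempView("orders")
spark.read.format("delta").load(f"{base}/customers").createOrReplaceTempView("customers")

# Monthly order totals per customer region, ranked within each month.
monthly_totals = spark.sql("""
    SELECT c.region,
           date_trunc('month', o.order_ts) AS order_month,
           COUNT(*)                        AS order_count,
           SUM(o.order_amount)             AS total_amount,
           RANK() OVER (PARTITION BY date_trunc('month', o.order_ts)
                        ORDER BY SUM(o.order_amount) DESC) AS rank_in_month
    FROM orders o
    JOIN customers c
      ON c.customer_id = o.customer_id
    GROUP BY c.region, date_trunc('month', o.order_ts)
""")
monthly_totals.show()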
Recommended:
  • Azure Certifications
  • GraphQL
  • D365 data entity models
  • Experience implementing data integration techniques such as event/message-based integration (Azure Event Hub), ETL
  • Experience with Git / Azure DevOps
  • Experience delivering data solutions through agile software development methodologies
  • Exposure to the retail industry
  • Excellent verbal and written communication skills
  • Successful internal candidates will have spent one year or more on their current team

Required profile

Experience

Level of experience: Mid-level (2-5 years)
Spoken language(s):
English

