This is a remote position.
Position Title: Senior Data Engineer
$114,668 to $154,216 annually DOE
Comprehensive health benefits include medical, dental, and vision coverage, a 401(k), a flexible spending account, paid sick leave and paid time off, parental leave, a quarterly performance bonus, training, and career growth and education reimbursement programs.
At Ziply Fiber, our mission is to elevate the connected lives of our communities every day. We are delivering the fastest home internet in the Northwest, with a focus on areas traditionally underserved by mainstream internet companies. And as our state-of-the-art fiber network expands in WA, OR, ID, and MT, so does our need for team members who can help us grow and realize our goals.
We may be building internet, but we are reaching real people. We strive to build relationships and provide customers and communities with refreshingly great experiences.
We emphasize our values in all our interactions:
Genuinely Caring: Our customers and colleagues are people, and quite possibly our neighbors. We put ourselves in their shoes and give them our full attention.
Empowering You: We empower our customers to choose the products that best meet their needs, and we support our employees to implement solutions that elevate the experiences of our customers and coworkers.
Innovation and Improvement: We always look for ways to make the experiences of our customers – and each other – better.
Earning Your Trust: We earn trust by communicating simply and transparently as real people, not as a corporation.
Job Summary
The Senior Data Engineer will be responsible for designing, building, and maintaining scalable data pipelines, data models, and infrastructure that support business intelligence, analytics, and operational data needs. This role involves working with a variety of structured and unstructured data sources, optimizing data workflows, and ensuring high data reliability and quality. The ideal candidate will be proficient in modern data engineering tools and cloud platforms, bringing innovative solutions to a fast-paced and diverse data infrastructure.
Essential Duties and Responsibilities:
The Essential Duties and Responsibilities listed below represent a range of duties performed by the employee and are not intended to reflect all duties performed.
- Design, develop, and maintain scalable data pipelines for ingestion, transformation, and storage of large datasets.
- Optimize data models for analytics and business intelligence reporting.
- Build and maintain data infrastructure, ensuring performance, reliability, and scalability.
- Collaborate with data analysts, data scientists, and business stakeholders to understand data needs and design appropriate solutions.
- Implement best practices for data governance, security, and compliance.
- Work with structured and unstructured data, integrating data from various sources including databases, APIs, and streaming platforms.
- Troubleshoot and resolve data pipeline and ETL failures, implementing robust monitoring and alerting systems.
- Automate data workflows to increase efficiency and reduce manual intervention.
- Mentor and train junior engineers, fostering a culture of learning and innovation.
- Develop and maintain documentation for data engineering processes and workflows.
- Perform other duties as required to support the business and evolving organization.
Qualifications:
- A Bachelor’s degree in Computer Science, Engineering, or a related field is required.
- Minimum of eight (8) years of experience in data engineering, ETL development, or related fields.
- Strong proficiency in SQL and database technologies (PostgreSQL, MySQL, Oracle, SQL Server, etc.).
- Experience with big data processing frameworks such as Spark, Hadoop, Flink, and Apache Hudi.
- Familiarity with Linux/Unix environments and the scripting tools commonly used on them.
- Proficiency in programming languages such as Python, Java, or Scala for data engineering tasks.
- Hands-on experience with cloud platforms such as Microsoft Azure and its data services such as Azure Data Factory, Azure Synapse Analytics, and Azure Databricks.
- Experience working with data warehouses such as Snowflake, Redshift, BigQuery, or Azure SQL Data Warehouse.
- Familiarity with workflow orchestration tools such as Apache Airflow or Azure Data Factory.
- Knowledge of data modeling, schema design, and data architecture best practices.
- Strong understanding of data governance, security, and compliance standards.
- Ability to work independently in a remote environment and collaborate effectively across teams.
- Experience with Infrastructure as Code (IaC) tools such as Terraform, CloudFormation, or Azure Resource Manager (ARM) templates.
- Knowledge of containerization and orchestration technologies such as Docker, Kubernetes, and Azure Kubernetes Service (AKS).
- Exposure to GraphQL and RESTful APIs for data retrieval and integration.
- Familiarity with NoSQL databases such as MongoDB, DynamoDB, Cassandra, or Azure Cosmos DB.
- Experience required with:
  - Real-time analytics databases such as Apache Pinot.
  - Data transformation tools such as dbt, AWS Glue, or Alteryx.
  - Metadata management and data discovery tools such as Apache DataHub.
  - Data visualization tools such as Tableau, Power BI, or Looker.
  - Version control software such as GitLab.
Preferred Qualifications:
- Experience working with real-time data streaming technologies like Kafka, Kinesis, or Azure Event Hubs.
- Knowledge of machine learning pipelines and MLOps best practices, with experience using Azure Machine Learning.
- Experience working with large-scale distributed systems.
- Familiarity with DevOps practices and CI/CD pipelines for data engineering, including Azure DevOps.
- Understanding of data privacy regulations such as GDPR and CCPA.
Knowledge, Skills, and Abilities:
- Strong problem-solving and analytical skills.
- Ability to manage multiple priorities and work in a fast-paced environment.
- Excellent verbal and written communication skills.
- Ability to translate business requirements into scalable technical solutions.
- Strong attention to detail and a commitment to data quality.
- Ability to work with Agile methodologies and tools such as Jira, Confluence, and Azure DevOps.
- Strong collaboration skills with cross-functional teams including product managers, software engineers, and business analysts.
Work Authorization
Applicants must be currently authorized to work in the US for any employer. Sponsorship is not available for this position.
Physical Requirements
The physical demands described here are representative of those that must be met by an employee to perform the essential functions of this job successfully. Reasonable accommodation may be made to enable individuals with disabilities to perform the essential functions.
Essential and marginal functions may require maintaining the physical condition necessary for bending, stooping, sitting, walking, or standing for prolonged periods of time; most of the time is spent sitting in a comfortable position with frequent opportunity to move about. The employee must occasionally lift and/or move up to 25 pounds. Specific vision abilities required by the job include close vision, distance vision, color vision, peripheral vision, depth perception, and the ability to adjust focus.
Work Environment
Work is performed primarily in a modern office setting with exposure to computer screens and requires extensive use of a computer, keyboard, mouse, and multi-line telephone system.
At all times, Ziply Fiber must be your primary employer. Unless otherwise prohibited by law, employees may not hold outside employment nor be self-employed without obtaining approval in writing from Ziply Fiber. In holding outside employment or self-employment, employees should ensure that participation does not conflict with responsibilities to Ziply Fiber or its business interests.
Diverse Workforce / EEO:
Ziply Fiber is an equal opportunity employer. Ziply Fiber will consider all qualified candidates regardless of race, color, religion, national origin, gender, age, marital status, sexual orientation, veteran status, and the presence of a non-job-related handicap or disability or any other legally protected status.
Ziply Fiber requires a pre-employment background check as a condition of employment. Ziply Fiber may require a pre-employment drug screening.
Ziply Fiber is a drug free workplace.