Overview:
QSC is seeking a highly skilled Senior Data Engineer/Data Architect to design, build, and maintain our data infrastructure while also leading strategic architectural initiatives. This individual will play a pivotal role in shaping and executing QSC’s data architecture vision, ensuring scalability, security, and efficiency in managing our data resources.
This role is responsible for architecting robust data solutions while also executing hands-on data engineering tasks to optimize data pipelines, integrate microservices, and improve data accessibility. It will support QSC’s mission of becoming a data-driven enterprise by providing high-quality, actionable data to business units, enhancing data governance, and promoting best practices in data management, middleware, and API integration.
Base Salary Range: $133,000 – $190,000
The above reflects the pay range that QSC reasonably expects to pay for this role. This pay range also depends on various factors such as job duties and requirements, relevant experience and skills, and geographic location. In addition to the base salary range, QSC offers a comprehensive package including, but not limited to, health benefits, 401(k) or Roth retirement plans, generous time off, and profit sharing.
We will be accepting applications until a final candidate is identified.
Responsibilities:
Data Architecture Development:
- Design and implement scalable data architectures that meet the needs of QSC’s growing data landscape, ensuring flexibility, performance, and security.
- Lead efforts to modernize the company’s data architecture, including cloud migration, data lake strategies, and the integration of new technologies.
- Establish and maintain data governance standards and best practices for data architecture, modeling, and management.
- Architect data solutions that leverage microservices to provide modular, scalable, and reusable components across the enterprise.
- Collaborate with cross-functional stakeholders to align data architecture with business objectives and to ensure seamless integration across departments.
Data Engineering Execution:
- Build, optimize, and manage ETL/ELT processes to ensure efficient data flow between systems, applications, and data warehouses.
- Develop and maintain robust data pipelines for both structured and unstructured data sources, ensuring data is processed efficiently and made available to business users.
- Build and deploy APIs for seamless integration and consumption of data across different platforms, ensuring security, performance, and scalability.
- Leverage middleware to integrate legacy systems with modern data platforms, ensuring smooth data flow between services.
- Collaborate with software engineering teams to implement data solutions that meet performance and scalability requirements.
Middleware, API, and Microservices Integration:
- Design, develop, and implement API-driven data solutions that enable secure, scalable, and flexible data access.
- Build and manage microservices architecture to support real-time data processing and enhance data delivery across the enterprise.
- Develop middleware solutions to bridge the gap between data services and business applications, ensuring consistent data flow and integration.
- Ensure secure and efficient communication between services using API styles and protocols such as REST, gRPC, or GraphQL.
- Establish and promote best practices for API design, versioning, monitoring, and security within the data infrastructure.
Data Governance and Quality Assurance:
- Establish data quality metrics and ensure compliance with data governance policies across the organization.
- Ensure the security of data systems through strong architectural designs that comply with regulatory standards.
- Execute on best practices to ensure data quality at ingestion and throughout the pipeline.
- Proactively monitor, troubleshoot, and improve data infrastructure performance and reliability.
- Advocate for and implement best practices in data stewardship and data lifecycle management.
Collaboration and Stakeholder Management:
- Work closely with business analysts, data scientists, and IT teams to understand data needs and translate them into technical requirements.
- Present complex data architecture solutions to both technical and non-technical stakeholders.
- Serve as a thought leader in the data engineering and architecture space, fostering collaboration and promoting a culture of data-driven decision-making across QSC.
Qualifications:
Education:
Minimum: Bachelor's Degree in Computer Science, Engineering, Information Systems, or a related field.
Preferred: Master's Degree in Computer Science, Data Science, or equivalent technical discipline.
Experience:
- 8+ years of hands-on experience in data engineering, data architecture, or related fields.
- Proven experience in designing and managing large-scale data architectures in cloud environments (AWS, Azure, or Google Cloud).
- Deep experience in building and optimizing data pipelines, ETL processes, and data integration workflows.
- Experience in designing and deploying microservices architecture and API-driven solutions.
- Strong experience with both SQL and NoSQL databases, data warehousing, and data lakes.
- Proven track record of successfully leading data architecture initiatives in mid-to-large scale enterprises.
- Experience with data governance and regulatory compliance frameworks.
Knowledge and Skills:
- Deep expertise in database technologies (SQL, NoSQL), data modeling (e.g., star and snowflake schemas), and distributed data systems (e.g., Hadoop, Spark).
- Strong proficiency in cloud-based data solutions (Amazon Redshift, Google BigQuery, Azure Synapse) and related infrastructure.
- Expertise in ETL/ELT frameworks and tools (e.g., Airflow, Talend, or similar).
- Strong coding skills in languages like Python, Scala, or Java for building and optimizing data pipelines.
- Expertise in building and securing RESTful APIs, as well as integrating with APIs from third-party services.
- Strong familiarity with microservices architecture and how to design for scalability, resiliency, and modularity.
- Experience working with middleware solutions and integrating legacy systems into modern data architectures.
- Familiarity with API security best practices (OAuth, JWT, etc.) and monitoring solutions.
- Familiarity with containerization and orchestration tools like Docker and Kubernetes for deploying scalable microservices.
- Excellent problem-solving skills and ability to design creative, scalable data solutions.
- Strong communication skills with the ability to explain complex technical solutions to business stakeholders.
- Experience in developing strategies for data quality, data security, and disaster recovery.