About the Team / Role
WEX is an innovative global commerce platform and payments technology company forging the way in a rapidly changing environment. Our goal is to simplify the business of doing business for customers, freeing them to spend more time, with less worry, on the things they love and care about. We are on a journey to build a consistent, world-class user experience across our products and services and to leverage customer-focused innovations across all our strategic initiatives, including big data, AI, and risk.
How you'll make an impact
Collaborate with stakeholders to understand customer challenges and business requirements, translating them into effective technical solutions that align with organizational goals.
Design, develop, test, and optimize data products, systems, and platforms, focusing on tasks of small to medium complexity. Ensure solutions are high-quality, reliable, and scalable enough to meet the needs of the business.
Build and maintain scalable data pipelines and ETL processes to handle large volumes of data efficiently, ensuring data integrity, performance, and reliability throughout the entire data flow.
Develop and manage CI/CD pipelines using tools like GitHub Actions to streamline the integration and deployment process. Implement Infrastructure as Code (IaC) using Terraform to ensure efficient and automated infrastructure management.
Implement software development best practices, including Test-Driven Development (TDD) and Behavior-Driven Development (BDD), while leveraging Microservices and Vertical Slice Architectures for modular, maintainable codebases.
Support live data products and platforms by promoting proactive monitoring, rapid incident response, and continuous improvement processes to minimize downtime and enhance system performance.
Analyze and optimize existing systems and processes, identifying bottlenecks and opportunities for improvement. Address performance issues in data pipelines, storage systems, and data processing flows to ensure optimal performance.
Mentor peers and foster continuous learning within the team by providing guidance, constructive feedback, and technical expertise. Engage in code reviews, share best practices, and encourage collaboration to improve team performance.
Engage in continuous learning of new technologies, frameworks, and tools, applying this knowledge to enhance workflows, system performance, and overall team productivity. Stay current with industry trends and best practices to drive innovation within the team.
Ensure adherence to team processes and best practices, independently completing tasks of small to medium complexity and proactively seeking feedback from senior engineers to ensure high-quality results.
Lead and participate in technical discussions, ensuring clarity in objectives and solutions. Collaborate with peers to complete tasks and projects efficiently, supporting team goals and ensuring alignment with broader business objectives.
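The pipeline work described above can be sketched in miniature. The function names, CSV layout, and in-memory "sink" below are purely illustrative assumptions for this posting, not WEX's actual stack; a real pipeline would read from APIs or object storage and write to a warehouse:

```python
import csv
import io

# Extract step: parse a flat-file source (here, an in-memory CSV for illustration).
def extract(raw_csv: str) -> list[dict]:
    return list(csv.DictReader(io.StringIO(raw_csv)))

# Transform step: normalize types and skip rows that fail a basic integrity check,
# protecting downstream consumers from malformed input.
def transform(rows: list[dict]) -> list[dict]:
    cleaned = []
    for row in rows:
        try:
            cleaned.append({"id": int(row["id"]), "amount": float(row["amount"])})
        except (KeyError, ValueError):
            continue  # quarantine/skip malformed rows rather than fail the batch
    return cleaned

# Load step: an in-memory sink stands in for a warehouse write.
def load(rows: list[dict], sink: list) -> None:
    sink.extend(rows)

raw = "id,amount\n1,9.99\n2,not-a-number\n3,12.50\n"
sink: list[dict] = []
load(transform(extract(raw)), sink)
print(len(sink))  # → 2: the malformed middle row is dropped
```

Keeping extract, transform, and load as separate pure functions is what makes the TDD/BDD practices mentioned above straightforward: each stage can be unit-tested in isolation.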
Experience you'll bring
Bachelor’s degree in Computer Science, Software Engineering, or a related field, or equivalent practical experience demonstrating technical capabilities and deep understanding.
Solid experience in software engineering with a focus on data engineering, designing and implementing data pipelines and data systems for efficient data processing and storage.
Proficiency in programming languages such as Python or Go, with strong skills in coding, automated testing, debugging, and performance monitoring of data-driven applications.
Experience building scalable data pipelines, extracting data from diverse sources (APIs, flat files, and NoSQL databases), and implementing ETL/ELT processes to ensure data is transformed and loaded accurately and efficiently.
Strong understanding of data modeling techniques, including dimensional modeling and schema design for relational databases, with experience optimizing SQL queries for performance and scalability.
Hands-on experience with big data technologies like Apache Spark, or cloud-based data processing platforms such as AWS Glue, Azure Data Factory, or Google Dataflow, for handling large-scale data processing and analytics workloads.
Proficiency in developing and maintaining CI/CD pipelines using tools such as GitHub Actions, Jenkins, or similar, ensuring seamless integration, testing, and deployment of data systems.
Experience implementing Infrastructure as Code (IaC) using tools like Terraform or CloudFormation to automate the provisioning and management of infrastructure in cloud environments.
Experience optimizing the performance of data pipelines and queries: identifying bottlenecks, improving data throughput, and fine-tuning storage and compute resources for efficient processing.
Familiarity with data governance and quality standards, including implementing data quality checks, data lineage, and data validation processes to ensure data integrity, consistency, and compliance.
Passionate about keeping up with modern technologies and design practices.
Strong willingness and capability to learn new technology and tools quickly when needed.
Passionate about understanding and solving customer and business problems.
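The dimensional modeling and SQL optimization skills listed above boil down to a star schema: a fact table of events joined to descriptive dimension tables. A minimal sketch using Python's built-in sqlite3 module (table and column names are illustrative only):

```python
import sqlite3

# Toy star schema: one payment fact table keyed to a date dimension.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, iso_date TEXT, month TEXT);
    CREATE TABLE fact_payment (
        payment_id INTEGER PRIMARY KEY,
        date_key   INTEGER REFERENCES dim_date(date_key),
        amount     REAL
    );
    -- An index on the foreign key supports the join/group-by below.
    CREATE INDEX idx_fact_date ON fact_payment(date_key);
""")
con.executemany("INSERT INTO dim_date VALUES (?, ?, ?)",
                [(20240101, "2024-01-01", "2024-01"),
                 (20240201, "2024-02-01", "2024-02")])
con.executemany("INSERT INTO fact_payment VALUES (?, ?, ?)",
                [(1, 20240101, 50.0), (2, 20240101, 25.0), (3, 20240201, 10.0)])

# Typical dimensional query: aggregate facts grouped by a dimension attribute.
rows = con.execute("""
    SELECT d.month, SUM(f.amount)
    FROM fact_payment f JOIN dim_date d USING (date_key)
    GROUP BY d.month ORDER BY d.month
""").fetchall()
print(rows)  # → [('2024-01', 75.0), ('2024-02', 10.0)]
```

The same shape scales up: narrow fact tables with surrogate keys keep joins cheap, and indexing (or partitioning, in warehouse engines) the join keys is a common first step when optimizing query performance.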
Plus:
Experience with cloud platforms such as AWS, Azure, or Google Cloud Platform (GCP), including familiarity with cloud services related to computing, storage, and data processing.
Experience with cloudbased data warehousing applications, such as Snowflake, Amazon Redshift, or similar technologies.
Experience building data pipelines with cloud-native ingestion, orchestration, and transformation applications, leveraging tools and services such as Airflow, dbt, AWS Glue, Kafka, AWS Kinesis, etc.
Knowledge of AI and machine learning concepts, with experience in leveraging datadriven technologies and tools to improve system capabilities, automate processes, or enhance product features.
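Orchestrators like Airflow model a pipeline as a dependency graph and run tasks in topological order. The toy below uses the standard library's graphlib to show that core idea; the step names are hypothetical, and this is not Airflow's API (an Airflow DAG expresses the same dependencies with operators and `>>` chaining):

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline steps mapped to their upstream dependencies.
deps = {
    "extract": set(),
    "transform": {"extract"},
    "quality_check": {"transform"},
    "load": {"quality_check"},
    "notify": {"load"},
}

# static_order() yields tasks so that every task runs after its dependencies.
order = list(TopologicalSorter(deps).static_order())
print(order)  # → ['extract', 'transform', 'quality_check', 'load', 'notify']
```

A real orchestrator adds scheduling, retries, and backfills on top of this ordering, but the mental model of "a DAG of idempotent tasks" carries over directly.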