This is a fully remote position.
This role focuses on creating, maintaining, transforming, and decommissioning systems across a wide hybrid IT landscape. It is vital that data flows smoothly, accurately, and completely, both to ensure general performance and to meet internal and security project requirements. The engineer will also optimize (or redesign) existing data workflows and pipelines that multiple internal stakeholders rely on.
A complex IT landscape spanning on-premises systems, SaaS, and multiple cloud providers (primarily AWS) is integrated into business processes that need to be modernized or re-integrated into modern systems and processes. The goal is to increase the resiliency, data quality, observability, and completeness of these integrations to support the organization's needs.
While this role relies on automation and integration between systems, the right candidate will work with internal stakeholders to ensure today's requirements are met with tomorrow's integrations in mind. This person should be able to gather requirements and own the end-to-end delivery of a solution. They should also be able to see the larger data-pipeline picture across several integrations and clearly communicate possible solutions, as well as the potential pitfalls of given technical decisions, to those involved, understanding that some changes may depend on future product enhancements and their implementation priority within a development cycle.
Responsibilities
Requirements
The ideal candidate will be able to write, edit, and maintain code in Python, Bash, PowerShell, and similar languages, and will use DevOps tooling in CI/CD workflows with code checked into source control systems. Experience with Linux, Windows, containers, and cloud infrastructure is required. The ability to integrate, develop, and troubleshoot Splunk (and Splunk apps, which are generally Python-based) is highly desired.
Benefits