Key Responsibilities
Developer with hands-on experience building end-to-end applications. This involves designing and developing
data extraction/loading, processing, integration, data quality layers and UI.
o Write code to deliver functional stories, test cases, infra automation scripts, security scripts,
monitoring tools and other related use cases.
o Strong Oracle/PL-SQL skills and strong analytical skills for working with unstructured datasets.
o Good hands-on experience with ETL tools such as Informatica.
o Hands-on experience with Unix scripting and Python
Good knowledge of reporting platforms (OBIEE, Power BI, Tableau, etc.)
Understanding of cloud concepts and the Snowflake platform. Proven experience working with Snowflake and
migrating data from on-premises systems to Snowflake
Good knowledge and understanding of serverless architecture and the relevant AWS offerings
Core concepts and hands-on experience in the following:
o Expertise with CI/CD pipelines and with DevOps tools such as Jenkins or Bamboo.
o Build and optimize data pipelines, architecture and data sets supporting data transformation, data
structures, metadata, dependency and workload management
o Working knowledge of APIs, caching and messaging
o Experience in software delivery using agile methodologies, applying TDD and pair-programming best
practices to ensure quality-certified deliverables
Experience performing root-cause analysis on internal and external data and processes to answer specific
business questions and continuously identify opportunities for improvement.
Experience delivering on data-related non-functional requirements such as:
o Hands-on experience dealing with large volumes of historical data across markets/geographies.
o Manipulating, processing and extracting value from large disconnected datasets.
o Building watertight data quality gates on investment management data
o Generic handling of standard business scenarios such as missing data, market holidays and
out-of-tolerance errors.
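As an illustration of the kind of data quality gate and missing-data handling described above, a minimal Python sketch; the record fields, holiday calendar and tolerance threshold are hypothetical, not taken from any actual system:

```python
from datetime import date

# Hypothetical holiday calendar and tolerance threshold -- illustrative only.
MARKET_HOLIDAYS = {date(2024, 12, 25)}
MAX_DAILY_MOVE = 0.10  # flag day-on-day price moves greater than 10%

def quality_gate(record, previous_close):
    """Return a list of issues for one price record; an empty list means it passes."""
    issues = []
    if record.get("close") is None:
        issues.append("missing close price")
    if record.get("as_of") in MARKET_HOLIDAYS:
        issues.append("price reported on a market holiday")
    close = record.get("close")
    if close is not None and previous_close:
        move = abs(close - previous_close) / previous_close
        if move > MAX_DAILY_MOVE:
            issues.append(f"out-of-tolerance move: {move:.1%}")
    return issues
```

In a real pipeline a gate like this would sit between the load and publish steps, with failing records routed to an exceptions queue rather than silently dropped.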
Good-to-have knowledge / past experience of:
Message queuing, stream processing, and highly scalable data stores on Cloud
Snowflake SQL: writing SQL queries against Snowflake; developing Unix/Python scripts to perform
Extract, Load and Transform; using Snowpipe for bulk loading
Big data stack, either on cloud or on-premises. Data analytics and data science / machine learning /
quantitative implementation
Functional understanding of Capital Markets & Investment data
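The Extract, Load and Transform pattern referenced above can be sketched in plain Python. SQLite stands in for Snowflake here, and the feed format, table and column names are invented for illustration:

```python
import csv
import io
import sqlite3

# Extract: raw CSV as it might arrive from an upstream feed (inlined for the sketch).
RAW_CSV = "ticker,price\nAAPL,190.5\nMSFT,415.2\n"

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_prices (ticker TEXT, price REAL)")

# Load: copy the raw rows into the database unchanged.
rows = list(csv.DictReader(io.StringIO(RAW_CSV)))
conn.executemany("INSERT INTO raw_prices VALUES (:ticker, :price)", rows)

# Transform: derive a curated table with SQL inside the database (the "T" of ELT).
conn.execute("""
    CREATE TABLE curated_prices AS
    SELECT ticker, price
    FROM raw_prices
    WHERE price IS NOT NULL
""")
curated = conn.execute("SELECT ticker, price FROM curated_prices").fetchall()
```

The key point of ELT (as opposed to ETL) is that raw data lands first and the transformation runs as SQL inside the warehouse; against Snowflake the load step would typically be a COPY INTO or Snowpipe ingestion rather than executemany.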
Behavioural
Learnability - ability to collectively push the boundary and pioneer the adoption and industrialisation
of emerging data technologies in the organisation. Passion for growing your skills and tackling
challenging problems
Self-motivated to rapidly pick up new skills and work directly with senior technologists and fellow
technology teams in a cordial environment
Fungibility - ability to flex into different roles as per project demand, and willingness to move to new
roles
Ready to give and receive feedback in a candid way
Responsibilities
This position requires a strong self-starter with a solid technical engineering background and influencing
skills: someone who loves using technology and engineering to solve business problems, and who can assist
colleagues with design, best practices, troubleshooting and other technical challenges related to
implementing a critical business / customer-facing proposition.
Create and maintain optimal data pipeline design & code. Assemble large, complex data sets that
meet functional / non-functional business requirements.
Work with product owners and business stakeholders to identify, design, and implement internal
process improvements: automating manual processes, optimizing data delivery, re-designing
infrastructure for greater scalability, etc.
Build the infrastructure required for optimal extraction, transformation, and loading of data from a
wide variety of data sources using SQL and ETL tools such as Informatica.
Build analytics tools that utilize the data pipeline to provide actionable insights into customer
acquisition, operational efficiency and other key business performance metrics.
Ensure delivery in a timely, efficient and cost-effective manner, without compromising quality.
Work with support teams to assist with data-related technical issues and support their data
infrastructure needs.
• 4-8+ years of experience with application development on Oracle RDBMS, SQL and PL/SQL
• Strong experience with Oracle Database
• Good understanding of RDBMS concepts
• Hands-on experience writing complex SQL queries
• Good understanding of Oracle architecture
• Good experience and hands-on knowledge of PL/SQL (packages / functions / ref cursors)
• Experience in development and low-level design of warehouse solutions
• Familiarity with Data Warehouse, Data Mart and ODS concepts
• Knowledge of data normalisation and Oracle performance-optimisation techniques
• Strong knowledge of data security concepts in Oracle
• Hands-on experience with frequent code deployment into multiple environments
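For context, the "complex SQL queries" listed above typically involve analytic (window) functions. A small sketch run through Python's sqlite3 module; the same RANK() OVER (PARTITION BY ...) construct is available in Oracle SQL, the trades table is made up, and window functions require SQLite 3.25+:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE trades (trader TEXT, amount REAL);
    INSERT INTO trades VALUES ('a', 100), ('a', 300), ('b', 50), ('b', 250);
""")

# Rank each trade within its trader using an analytic (window) function.
query = """
    SELECT trader, amount,
           RANK() OVER (PARTITION BY trader ORDER BY amount DESC) AS rnk
    FROM trades
"""
result = conn.execute(query).fetchall()
```

The window function computes the rank per partition without collapsing rows, which is what distinguishes it from a GROUP BY aggregate.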
Experience and Qualifications Required
4 - 8 years
A graduate or postgraduate in Engineering or Computer Science from a reputed university.