As a DataStage Developer, you will play a crucial role in designing, building, and implementing ETL (Extract, Transform, Load) solutions using IBM DataStage. You will be responsible for creating and maintaining efficient data integration processes that ensure the smooth flow of data across our organization.
Responsibilities:
1. Design and develop ETL solutions: Collaborate with stakeholders to gather requirements and translate them into scalable and efficient ETL solutions using IBM DataStage.
2. Data integration: Build robust data integration workflows to extract, transform, and load data from various sources into target systems, ensuring data quality and integrity (a simple illustrative sketch follows this list).
3. Data modeling: Design and implement data models and mappings to ensure optimal performance and accuracy of data transformations.
4. Performance optimization: Identify and resolve performance bottlenecks, tune ETL processes, and optimize data loading and transformation for improved efficiency.
5. Error handling and troubleshooting: Implement error handling mechanisms, monitor ETL processes, and troubleshoot issues to ensure the timely resolution of data integration problems.
6. Collaborate with teams: Work closely with cross-functional teams, including data analysts, database administrators, and business stakeholders, to understand requirements and deliver high-quality solutions.
7. Documentation and maintenance: Create technical documentation, including data mappings, process flow diagrams, and user guides. Regularly maintain and update existing ETL processes.
8. Stay updated on industry trends: Keep abreast of the latest trends, tools, and technologies in the field of data integration and ETL to propose innovative solutions and enhance existing processes.
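For context on the extract-transform-load and error-handling pattern described in the responsibilities above, here is a minimal Python sketch. The table names, column layout, and in-memory SQLite database are illustrative assumptions only; in this role the equivalent steps would normally be built as DataStage stages and jobs rather than hand-written code.

```python
import logging
import sqlite3

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("etl_sketch")


def extract(conn):
    # Extract: pull raw rows from a hypothetical source table.
    return conn.execute("SELECT id, amount FROM source_orders").fetchall()


def transform(rows):
    # Transform: reject invalid records and normalise amounts.
    cleaned = []
    for row_id, amount in rows:
        if amount is None or amount < 0:
            log.warning("Rejecting row %s: invalid amount %r", row_id, amount)
            continue
        cleaned.append((row_id, round(float(amount), 2)))
    return cleaned


def load(conn, rows):
    # Load: write the cleaned rows into a hypothetical target table.
    conn.executemany("INSERT INTO target_orders (id, amount) VALUES (?, ?)", rows)
    conn.commit()


def run_job(conn):
    # Basic error handling: log the failure and re-raise so a scheduler can flag it.
    try:
        load(conn, transform(extract(conn)))
        log.info("ETL job finished successfully")
    except Exception:
        log.exception("ETL job failed")
        raise


if __name__ == "__main__":
    # In-memory demo so the sketch runs end to end.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE source_orders (id INTEGER, amount REAL)")
    conn.execute("CREATE TABLE target_orders (id INTEGER, amount REAL)")
    conn.executemany(
        "INSERT INTO source_orders VALUES (?, ?)",
        [(1, 10.5), (2, -3.0), (3, None), (4, 20.0)],
    )
    run_job(conn)
    print(conn.execute("SELECT * FROM target_orders").fetchall())
```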
Requirements:
1. Bachelor's degree in Computer Science, Information Technology, or a related field.
2. Proven experience as a DataStage Developer or in a similar ETL development role.
3. Strong proficiency in IBM DataStage, including experience with its various stages, transformations, and parallel processing.
4. Solid understanding of data integration concepts, ETL principles, and data warehousing.
5. Proficiency in SQL and database technologies (such as Oracle, SQL Server, or DB2) for data extraction and manipulation.
6. Familiarity with data modeling and schema design.
7. Experience with performance tuning and optimization of ETL processes.
8. Strong problem-solving and troubleshooting skills with the ability to analyze complex data issues and provide effective solutions.
9. Excellent collaboration and communication skills to work effectively within a team and interact with stakeholders.
10. Attention to detail, with a focus on delivering high-quality and accurate results.
11. Ability to work on multiple projects simultaneously and meet deadlines in a fast-paced environment.
12. Knowledge of scripting languages (e.g., shell scripting, Python) is a plus (see the brief example below).
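As an illustration of the kind of lightweight scripting this role occasionally calls for, below is a minimal Python sketch of a pre-load feed validation check. The pipe-delimited layout and expected column count are illustrative assumptions, not a prescribed format.

```python
import csv
import sys

EXPECTED_COLUMNS = 5  # hypothetical feed layout for this illustration


def validate_feed(path):
    """Reject the feed if any row has an unexpected column count."""
    with open(path, newline="") as handle:
        reader = csv.reader(handle, delimiter="|")
        for line_no, row in enumerate(reader, start=1):
            if len(row) != EXPECTED_COLUMNS:
                print(
                    f"{path}: line {line_no} has {len(row)} columns, "
                    f"expected {EXPECTED_COLUMNS}",
                    file=sys.stderr,
                )
                return False
    return True


if __name__ == "__main__":
    if len(sys.argv) != 2:
        print("usage: validate_feed.py <feed-file>", file=sys.stderr)
        sys.exit(2)
    sys.exit(0 if validate_feed(sys.argv[1]) else 1)
```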