If you’re passionate about building a better future for individuals, communities, and our country—and you’re committed to working hard to play your part in building that future—consider WGU as the next step in your career.
Driven by a mission to expand access to higher education through online, competency-based degree programs, WGU is also committed to being a great place to work for a diverse workforce of student-focused professionals. The university has pioneered a new way to learn in the 21st century, one that has received praise from academic, industry, government, and media leaders. Whatever your role, working for WGU gives you a part to play in helping students graduate, creating a better tomorrow for themselves and their families.
Job Profile Summary:
The Big Data Developer will design how data flows through hybrid data environments comprising open-source big data platforms and traditional database systems. The core responsibilities of this position include designing data and systems architecture for OLAP (data warehouse/ODS) and OLTP projects, encompassing dimensional and normalized data modeling. The Big Data Developer will improve technical standards in the environment, ensuring optimal use of the data warehouse and other data stores to solve business problems, and will serve as the go-to person for query performance and data quality issues, including data profiling.
Essential Functions and Responsibilities:
1. Establish design and methodology for the database build processes
2. Architect and design complete data model solutions
3. Create security models
4. Create validation mechanisms
5. Create extract processes for access layer
6. Translate business problems and information requirements accurately into logical/physical data models that align with the customer’s data architecture standards
7. Perform research and analysis to develop solutions to complex business problems
8. Monitor query performance on a regular basis and fine-tune queries/PL/SQL as required
9. Regularly profile data, publish data profiles, and take corrective action where required to ensure high-quality data
10. Apply naming standards automatically during script generation
11. Document and describe the data migration techniques used
12. Document, reverse engineer, and analyze data mappings using data integration code/tools
13. Perform impact analysis using Data Integration/Data Virtualization tool repositories, the DB data dictionary, UNIX scripts, and front-end code in the version control system
14. Analyze and research data across multiple platforms as well as multiple heterogeneous databases, including custom-developed databases
15. Positively impact projects by completing assigned tasks on time
Knowledge, Skills, and Abilities:
• Hadoop
• Cloudera or Hortonworks
• Scala
• Impala
• Hive
• AngularJS
• HTML5
• CSS/LESS