Georgia IT, Inc.

Data Modeler

Qualifications

  • Bachelor's degree from an accredited college or university (or four (4) years of related work experience in lieu of a degree).
  • At least seven (7) years of relevant experience.
  • At least two (2) years of experience with Databricks.
  • Experience leading small teams and designing/deploying data applications on cloud platforms such as Azure or AWS.

Responsibilities:

  • Lead development of scalable data ingestion and transformation pipelines using Databricks, ensuring data quality and consistency across sources.
  • Design and implement data ingestion pipelines; convert existing Informatica ETL code to Databricks where feasible.
  • Mentor and lead entry- and mid-level developers; perform code reviews and provide technical guidance in an Agile/DevOps environment.
  • Create and maintain Databricks queries to support dashboards/reports; monitor performance and optimize the ingest pipeline.
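As a hedged illustration of the dashboard-query and optimization duties above, here is a minimal sketch using Python's built-in sqlite3 as a stand-in for Databricks SQL (the `returns` table, its columns, and the data are invented for illustration):

```python
import sqlite3

# In-memory database standing in for a Databricks SQL warehouse.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE returns (tax_year INTEGER, region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO returns VALUES (?, ?, ?)",
    [(2022, "SE", 100.0), (2022, "NE", 250.0), (2023, "SE", 175.0)],
)

# A dashboard-style aggregate query of the kind a report would reuse.
query = """
    SELECT tax_year, SUM(amount) AS total
    FROM returns
    GROUP BY tax_year
    ORDER BY tax_year
"""
rows = conn.execute(query).fetchall()

# Indexing the grouping/filter column is one common optimization for
# repeated dashboard queries over large tables.
conn.execute("CREATE INDEX idx_returns_year ON returns (tax_year)")
```

The same pattern (a maintained query plus targeted physical tuning) applies on Databricks, where the tuning levers would instead be partitioning, clustering, and caching.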

Job description



Client: IRS (Internal Revenue Service)
Role: Data Modeler
Location: REMOTE
Duration: 3 Years


Job Description Summary

The ideal candidate will have a proven track record as a senior, self-starting data engineer implementing data ingestion and transformation pipelines for large-scale organizations. We are seeking someone with technical skills in Databricks development, performance tuning, and optimization. The candidate will assist in the design and development of high-performance data ingestion pipelines from multiple sources using Databricks, and will be involved in all stages of integrating the end-to-end data pipeline, taking data from source systems to target data repositories while ensuring that data quality and consistency are always maintained. The candidate will also have extensive experience with commercial and open-source relational and non-relational data repositories.

• Provide assignments and direction to team members.
• Develop scalable and reusable frameworks for ingestion and transformation of large data sets.
• Design and implement data ingestion pipelines from multiple sources, ensuring that data quality and consistency are always maintained.
• Work with event-based/streaming technologies to ingest and process data.
• Work with other members of the project team to support delivery of additional project components (API interfaces, search).
• Implement stream and batch processes in Databricks.
• Work within an Agile delivery/DevOps methodology to deliver proof-of-concept and production implementations in iterative sprints.
• Convert existing Informatica ETL code to Databricks code wherever feasible/required.
• Create and maintain Databricks queries to support dashboarding and/or reporting activities.
• Develop and optimize SQL queries as required to support various reporting needs.
• Monitor and diagnose the performance of the ingest pipeline and suggest continual improvements.
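On Databricks these duties would normally be implemented in PySpark; purely as an illustration of the "scalable and re-usable framework" idea above, here is a minimal dependency-free Python sketch of composable ingest/transform stages (all function and field names are hypothetical):

```python
from typing import Callable, Iterable

Record = dict
Stage = Callable[[Iterable[Record]], Iterable[Record]]

def pipeline(*stages: Stage) -> Stage:
    """Compose independent stages into one reusable pipeline."""
    def run(records: Iterable[Record]) -> Iterable[Record]:
        for stage in stages:
            records = stage(records)
        return records
    return run

def drop_nulls(records):
    # Quality gate: discard records missing the required key field.
    return (r for r in records if r.get("id") is not None)

def normalize_amount(records):
    # Transformation: coerce string amounts to float.
    for r in records:
        yield {**r, "amount": float(r["amount"])}

# Stages are reusable across sources; only their order is pipeline-specific.
ingest = pipeline(drop_nulls, normalize_amount)
result = list(ingest([{"id": 1, "amount": "10.5"}, {"id": None, "amount": "3"}]))
```

The Spark equivalent would compose DataFrame transformations the same way, so the pattern, not the code, is the reusable asset.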

Position is remote within US.

Project Specific Qualifications:

• Bachelor's degree from an accredited college or university required; an additional four (4) years of related work experience can substitute for a degree.
• At least seven (7) years of relevant experience required.
• At least two (2) years of experience with Databricks.
• Experience leading small teams.
• Expertise in designing and deploying data applications on cloud solutions such as Azure or AWS.
• Comprehensive understanding of data management best practices, including demonstrated experience with data profiling, sourcing, and cleansing routines that use typical data quality functions: standardization, transformation, rationalization, linking, and matching.
• Experience building ETL/data warehouse transformation processes.
• Hands-on experience tuning and optimizing code written in languages such as PySpark and Python.
• Good understanding of SQL, T-SQL, and/or PL/SQL.
• Demonstrated analytical and problem-solving skills, particularly as they apply to a big data environment.
• Experience with Apache Kafka for streaming/event-based data.
• Experience with other open-source big data products such as Hadoop (incl. Hive, Pig, Impala).
• Experience working with structured and unstructured data.
• Experience working in an Agile DevSecOps environment.
• Experience working with relational databases (SQL Server, PostgreSQL).
• Experience with non-relational/NoSQL data repositories (incl. MongoDB, Cassandra, Neo4j).
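The data-quality functions named above (standardization, linking, matching) can be sketched in a few lines of plain Python. This is a simplified illustration, not production cleansing logic; the single abbreviation rule is an invented example:

```python
import re

def standardize_name(name: str) -> str:
    """Standardization: trim, uppercase, collapse whitespace, expand one
    common abbreviation (real routines carry whole rule tables)."""
    name = re.sub(r"\s+", " ", name.strip().upper())
    return name.replace(" CORP.", " CORPORATION")

def match(a: str, b: str) -> bool:
    """Linking/matching: compare records only after standardization,
    so formatting noise does not block a link."""
    return standardize_name(a) == standardize_name(b)

same = match("Acme  Corp.", "ACME CORPORATION")
```

Production cleansing would typically add fuzzy matching (e.g. edit distance) on top of exact post-standardization comparison.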

Preferred experience:
• Knowledge of IRS business systems and data.
• Experience working in a command-line environment and a general understanding of Red Hat Linux or another Unix-like OS.
• Databricks certification.
• Expertise with the ELK stack and/or Splunk.
• Knowledge of FedRAMP, FISMA, and other Federal IT security guidelines.

Additional requirements as per Contract/Client:
• Candidates must meet the requirements to obtain and maintain an IRS Minimum Background Investigation (MBI) clearance (an active IRS Moderate Risk MBI clearance is a plus).
• Candidates must be US Citizens, or Legal Permanent Residents (Green Card status) for at least three (3) years, and must be Federal Tax compliant.


Job Summary
Essential Duties and Responsibilities:
- Lead the development of software solutions that meet or exceed business requirements; the development effort includes designing and implementing modules to system specifications, conducting unit testing, troubleshooting issues, and producing detailed proposals to resolve issues.
- Evaluate new coding techniques, tools, and modules, and implement as appropriate.
- Lead and mentor entry- and mid-level developers.
- Consult on requirements elicitation and definition.
- Design software solutions per system requirements.
- Code software solutions per designs.
- Conduct code reviews; unit-test and integrate coded modules.
- Assist other developers in resolving issues by providing guidance and training.
- Support testing and remediate defects.
- Support users through troubleshooting and analysis of production logs and data.
- Investigate new solutions, tools, products, and techniques to incorporate into coding standards.
- Perform other duties as assigned by management.

Minimum Requirements:
- Bachelor's degree and 7-10 years of relevant experience or equivalent combination of education and experience required.
