
GA-RC2022031306 - Data Engineer


Job description


Visa status: U.S. Citizens and those authorized to work in the U.S. are encouraged to apply.
Tax Terms: W2, 1099
Corp-Corp or 3rd Parties: Yes


Data Engineer - Remote

Job Description: Data Engineers implement efficient data pipelines supporting business intelligence, analytics, and operational needs. They are proficient in investigating, transforming, and combining data of different sources and types. Ideal candidates are prepared to design appropriate architecture to support business data needs and create consumable, analysis-ready data sets.

Data Engineers educate business partners on the use and sources of data, and identify and explore new data sources. Assignments may include supporting data science R&D and model deployment by IT and other technical professionals, overseeing the work of other engineers, and collaborating with analytical, technical, and business partners within and across business teams. This position is remote. Some travel may be required.

Key Responsibilities:

· Creates and maintains optimal data pipeline architecture for extraction, transformation, and loading of data from various data sources – both internal and external

· Builds analytical tools to utilize the data pipeline

· Assembles large, complex sets of data that meet non-functional and functional business requirements

· Builds industrialized analytic datasets and delivery mechanisms that utilize the data pipeline to deliver actionable insights

· Identifies, designs, and implements internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.

· Contributes to the architecture and design for scalable and efficient model deployments

· Investigates data and data sources, and performs data quality analysis

· Communicates and maintains master data, metadata, data management repositories, logical data models, and data standards

· Works with stakeholders including the executive, product, data, and design teams to support their data infrastructure needs while assisting with data-related technical issues

· Establishes and teaches best practices related to data access and queries for data users

· With appropriate direction, plans, implements, manages, and/or contributes to projects of moderate complexity and scale using accepted project management standards

Job Requirements:

· 4+ years of hands-on experience with Big Data Tools: Hadoop, Spark, Kafka, etc.

· Proficient with relational SQL and NoSQL databases, including Postgres and Cassandra

· Experienced with data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc.

· 2+ years of experience with Microsoft Azure

· Experience with on-premises-to-cloud migrations

· Experienced with stream-processing systems: Storm, Spark Streaming, etc.

· 2+ years with object-oriented or functional scripting languages: Python, Java, C++, Scala, etc.

· Familiarity with geospatial data: latitude/longitude, geopoint/vector containers, geohash, H3, shapefiles, etc.

· Familiarity with personal lines insurance policy and claim data, vehicle telematics data, and OEM vehicle data is strongly preferred

· Experienced with data wrangling and preparation for use within data science, actuarial, business intelligence or similar analytical workflows

· A demonstrable understanding of networking/distributed computing environment concepts

· Strong written and verbal communication skills including the ability to effectively collaborate with multi-disciplinary groups

· High-level organizational and project management skills to handle assignments in a timely manner and monitor one's own progress

· Strong decision-making skills

· Bachelor's degree in information technology, computer science, data science, engineering, mathematics, statistics/applied statistics, or related field

