Data Engineer

Work set-up: Full Remote
Experience: Mid-level (2-5 years)
Offer summary

Qualifications:

  • Bachelor's degree in Computer Science or related field.
  • 4-6 years of experience in data engineering roles.
  • Proficiency in Python, SQL, and cloud computing platforms like AWS.
  • Strong understanding of data pipelines, big data technologies, and data modeling.

Key responsibilities:

  • Design, maintain, and improve data infrastructure and pipelines.
  • Collaborate with data science and product teams to integrate data sources.
  • Implement and optimize data processing systems for real-time and batch data.
  • Ensure data quality, security, and compliance across data systems.

STAR4RALL IT SOLUTIONS PRIVATE LIMITED Small startup https://star4rall.com/
2 - 10 Employees

Job description

Job Title: DATA ENGINEER
Location: Remote
Employment Type: Permanent role
Client: Bilge Adam (UK-based client)
Work Time: 2pm-11pm
Experience: 4-6 years

Job Description:
The Role
As a Data Engineer, you'd be working with us to design, maintain, and improve various analytical and operational services and infrastructure that are critical for many other functions within the organization. These include the data lake, operational databases, data pipelines, large-scale batch and real-time data processing systems, and a metadata and lineage repository, all of which work in concert to provide the company with accurate, timely, and actionable metrics and insights to grow and improve our business using data. You may be collaborating with our data science team to design and implement processes to structure our data schemas and design data models, working with our product teams to integrate new data sources, or pairing with other data engineers to bring cutting-edge technologies in the data space to fruition.
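As a flavour of the batch-pipeline work described above, here is a minimal, purely illustrative extract-transform-load sketch in plain Python. In the role itself this kind of job would run on tools from the stack below (e.g. pyspark jobs scheduled by Airflow, writing to S3 or a warehouse); every function and record name here is a made-up placeholder, not part of any real system.

```python
# Minimal ETL sketch: pull raw records, clean them, write them to a sink.
from typing import Iterable

def extract() -> Iterable[dict]:
    # Stand-in for reading from S3, an operational database, or a stream.
    return [{"user": "a", "spend": "10.5"}, {"user": "b", "spend": "3.0"}]

def transform(records: Iterable[dict]) -> list[dict]:
    # Type-cast each record and silently drop malformed rows,
    # a typical data-quality step in a batch pipeline.
    out = []
    for r in records:
        try:
            out.append({"user": r["user"], "spend": float(r["spend"])})
        except (KeyError, ValueError):
            continue
    return out

def load(records: list[dict], sink: list) -> None:
    # Stand-in for writing to a warehouse table (e.g. Redshift or Snowflake).
    sink.extend(records)

sink: list[dict] = []
load(transform(extract()), sink)
```

The same extract/transform/load shape scales up: swap the in-memory list for a Spark DataFrame and let a scheduler such as Airflow own the orchestration and retries.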
Our Ideal Candidate
We expect candidates to have in-depth experience in some of the following skills and technologies, and to be motivated to build up experience and fill any gaps in knowledge on the job. More importantly, we seek people who are highly logical, balance respect for best practices with their own critical thinking, adapt to new situations, are capable of working independently to deliver projects end-to-end, communicate well in English, collaborate effectively with teammates and stakeholders, and are eager to join a high-performing team, taking their careers to the next level with us.
Highly relevant:
• General computing concepts and expertise: Unix environments, networking, distributed and cloud computing
• Python frameworks and tools: pip, pytest, boto3, pyspark, pylint, pandas, scikitlearn, keras
• Agile/Lean project methodologies and rituals: Scrum, Kanban
• Workflow scheduling and monitoring tools: Apache Airflow, Luigi, AWS Batch
• Columnar and big data databases: Athena, Redshift, Vertica, Snowflake, Hive/Hadoop
• Version control: git commands, branching strategies, collaboration etiquette, documentation best practices
• General AWS or Cloud services: Glue, EMR, EC2, ELB, EFS, S3, Lambda, API Gateway, IAM, Cloudwatch, DMS
• Container management and orchestration: Docker, Docker Swarm, ECS, EKS/Kubernetes, Mesos
• CI/CD tools: CircleCI, Jenkins, TravisCI, Spinnaker, AWS CodePipeline
Also good to have:
• JVM languages, frameworks and tools: Kotlin, Java, Scala; Maven, Spring, Lombok, Spark, JDK Mission Control
• Distributed messaging and event streaming systems: Kafka, Pulsar, RabbitMQ, Google Pub/Sub
• RDBMS and NoSQL databases: MySQL, PostgreSQL; DynamoDB, Redis, HBase
• Enterprise BI tools: Tableau, Qlik, Looker, Superset, Power BI, QuickSight
• Streaming data processing frameworks: Spark Streaming, Apache Beam, Apache Flink
• Data science environments: AWS Sagemaker, Project Jupyter, Databricks
• Log ingestion and monitoring: ELK stack (Elasticsearch, Logstash, Kibana), Datadog, Prometheus, Grafana
• Metadata catalogue and lineage systems: Amundsen, Databook, Apache Atlas, Alation, uMetric
• Data privacy and security tools and concepts: Tokenization, Hashing and encryption algorithms, Apache Ranger

Required profile

Experience

Level of experience: Mid-level (2-5 years)
Spoken language(s):
English

Other Skills

  • Critical Thinking
  • Adaptability
  • Collaboration
  • Communication
