
Career Opportunities: Data Engineer (831829)

Remote: Full Remote
Experience: Mid-level (2-5 years)

Offer summary

Qualifications:

  • Bachelor’s in Computer Science or related field
  • Experience in data architecture and management
  • Proficient with SQL and NoSQL databases
  • Knowledge of big data technologies
  • Understanding of cloud computing and security

Key responsibilities:

  • Develops strategies for practical data solutions
  • Ensures efficient data governance and access
  • Creates and maintains optimal data pipeline architecture
  • Supports software architecture and data processing improvements
  • Builds tools to improve data accessibility

Job description

At Bayer we’re visionaries, driven to solve the world’s toughest challenges and striving for a world where ‘Health for all, Hunger for none’ is no longer a dream, but a real possibility. We’re doing it with energy, curiosity and sheer dedication, always learning from the unique perspectives of those around us, expanding our thinking, growing our capabilities and redefining ‘impossible’. There are so many reasons to join us. If you’re hungry to build a varied and meaningful career in a community of brilliant and diverse minds to make a real difference, there’s only one choice.

 

Data Engineer 

 

We are looking for a Data Engineer to join our Warsaw Digital Hub!

 

The Data Engineer is responsible for developing, maintaining, and evaluating big data solutions across functional / process content areas. The role applies data management best practices and innovative, fit-for-purpose technologies to build stable, flexible, high-performing data & analytics products that help reduce the lead time from data collection to insight generation. It involves assembling large, complex data sets (e.g. text, time series, imagery) into a useful format for analysis using fit-for-purpose database technologies, and designing, building and orchestrating ETL pipelines.

 

 

YOUR TASKS AND RESPONSIBILITIES:

 

Strategy:

  • Develops strategies to identify, acquire and use appropriate data sets to develop practical solutions and support decision making.
  • Defines the strategy and engineering guidelines for major data platforms jointly with the data architect.

Governance:

  • Implements efficient data access control mechanisms for different raw data sources in both on-premises and cloud environments.
  • Identifies, designs, and implements internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability.

Execution:

  • Creates and maintains optimal data pipeline architecture.
  • Builds the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using state-of-the-art big data technologies.
  • Architects, builds and manages a robust, scalable data pipeline in a hybrid-cloud environment that provides data processing capabilities to meet functional / non-functional business requirements.
  • Responsible for establishing and continuously improving sustainable, efficient, scalable, adaptable data processes and data processing applications.
  • Supports the development of continuous integration processes and infrastructure.
  • Analyzes large and complex data sets and models to gain insights and build suitable data models.
  • Assembles large, complex data sets (e.g. text, time series, imagery) into a useful format for analysis using fit-for-purpose database technologies.
  • Builds services and tools to make data more accessible to all data consumers (tools such as searchable metadata, API gateway).
  • Ensures end-to-end pipeline orchestration.
  • Supports the further development of existing enterprise data analytics solutions and acts as a contact person for data support.
  • Contributes to the data warehouse by continuously improving ELT and ETL processes for the data lakes and data warehouses.

Digital change:

  • Supports software architecture decisions by writing, maintaining, and reviewing software code, and ensures code quality through automated testing.
  • Builds semantic data layers and knowledge graphs using fit-for-purpose methodologies and technologies (e.g. linked data concepts, RDF, triple stores, graph databases) to make already-discovered core data structures findable, accessible and reusable in an efficient way for data knowledge workers.
  • Works with the platform development team to design & use tools for data logging and repeatable data tasks to accelerate and automate data scientist duties.
  • Responsible for processing, cleaning, structuring, and improving data models, and for building processes to support company-wide analytics.

 

 

WHO YOU ARE:

 

  • Bachelor’s degree in computer science, management information systems, or a related discipline, and relevant working experience.
  • Experience in data architecture development, data asset management, data modelling, and linked data concepts as well as data taxonomy creation.
  • Experience with distributed databases (e.g. Cassandra), NoSQL databases (e.g. HBase, MongoDB, Snowflake), object stores (e.g. Blob, S3), graph databases, etc.
  • Experience with object-oriented / functional scripting languages (e.g., Python, Java, C++, Scala, etc.).
  • Profound experience in information architecture, data science, business intelligence or equivalent areas.
  • Experience in user experience modelling, information design, and concept generation.
  • Strong capability to manage implementation and maintenance of data architecture concepts and related platforms and data pipelines.
  • Able to draw insights from structured and unstructured data.
  • Experience with relational databases and SQL dialects (e.g., T-SQL, PSQL).
  • Experience with big data technologies (Hadoop, Hive, HBase, Spark, etc.).
  • Competent in data systems design and data visualization techniques.
  • Ability to perform business domain analysis and business process modelling.
  • Knowledge of data pipeline and workflow management tools (e.g., Airflow, DataFactory, Azkaban, Luigi, etc.).
  • Knowledge of cloud computing (e.g., AWS, Azure).
  • Knowledge of version control systems (e.g., GitHub, GitLab, etc.).
  • Basic knowledge of network security, enterprise security, cloud security, database security.
  • Knowledge of usability design and data warehousing techniques.
  • Understanding of data quality standards and data cleaning processes.
  • Verbal and written communication skills.
  • Ability to challenge and convince the various stakeholders involved in any project.

 

 

What we offer:

 

  • A flexible, hybrid work model 
  • Great workplace in a new modern office in Warsaw
  • Career development, 360° Feedback & Mentoring programme
  • Wide access to professional development tools, trainings, & conferences
  • Company Bonus & Reward Structure 
  • Increased tax-deductible costs for authors of copyrighted works
  • VIP Medical Care Package (including Dental & Mental health)
  • Holiday allowance (“Wczasy pod gruszą”)
  • Life & Travel Insurance
  • Pension plan
  • Co-financed sport card - FitProfit
  • Meals Subsidy in Office 
  • Budget for Home Office Setup & Maintenance
  • Access to Company Game Room equipped with table tennis, soccer table, Sony PlayStation 5 and Xbox Series X consoles setup with premium game passes, and massage chairs
  • Tailored-made support in relocation to Warsaw when needed

Please send your CV in English.

 

WORK LOCATION: WARSAW AL. JEROZOLIMSKIE 158

YOUR APPLICATION

Bayer welcomes applications from all individuals, regardless of race, national origin, gender, age, physical characteristics, social origin, disability, union membership, religion, family status, pregnancy, sexual orientation, gender identity, gender expression or any unlawful criterion under applicable law. We are committed to treating all applicants fairly and avoiding discrimination.

Bayer is committed to providing access and reasonable accommodations in its application process for individuals with disabilities and encourages applicants with disabilities to request any needed accommodation(s) using the contact information below. 

Bayer offers the possibility of working in a hybrid model. We know how important work-life balance is, so our employees can work from home, from the office or combine both work environments. The details of using the hybrid model are discussed with the manager in each case.
 
Location: Poland : Mazowieckie : Warszawa     
Division: S&DT  
Reference Code: 831829     


Required profile

Experience

Level of experience: Mid-level (2-5 years)
Spoken language(s): English