
Cloud AWS Engineer

Remote: Full Remote
Experience: Entry-level / graduate

Offer summary

Qualifications:

Bachelor's degree in a related field or equivalent experience; 7+ years designing large-scale enterprise solutions; 3+ years designing and building cloud solutions; expertise in AWS RDS, DynamoDB, and DocumentDB; experience with ETL/ELT tools and technologies.

Key responsibilities:

  • Understand business needs and current data infrastructure
  • Proactively identify gaps and make architectural recommendations
  • Collaborate with teams to design and support analytics systems
  • Automate and optimize data processing workloads
  • Develop end-to-end automation for software currency
VARITE INC, Information Technology & Services, Large (1001 - 5000 employees), https://www.varite.com/

Job description

  • Understand the technology vision and strategic direction of the business
  • Understand our current data model and infrastructure, proactively identify gaps, areas for improvement, and prescribe architectural recommendations with a focus on performance and accessibility.
  • Partner across engineering teams to design, build, and support the next generation of our analytics systems.
  • Partner with business and analytics teams to understand specific requirements for data systems to support both development and deployment of data workloads ranging from Tableau reports to ad hoc analyses.
  • Own and develop architecture supporting the translation of analytical questions into effective reports that drive business action.
  • Automate and optimize existing data processing workloads by recognizing patterns of data and technology usage and implementing solutions.
  • Maintain a solid grasp of the intersection between analytics and engineering, taking a proactive approach to ensure solutions demonstrate high levels of performance, privacy, security, scalability, and reliability upon deployment.
  • Provide guidance to partners on effective use of the database management systems (DBMS) platform through collaboration, documentation, and associated standard methodologies.
  • Design and build end-to-end automation to support and maintain software currency
  • Create automation services for builds using Terraform, Python, and OS shell scripts.
  • Develop validation and certification process through automation tools
  • Design integrated solutions in alignment with design patterns, blueprints, guidelines, and standard methodologies for products
  • Participate in developing solutions by incorporating cloud-native and third-party vendor products
  • Participate in research and perform POCs (proofs of concept) with emerging technologies and adopt industry best practices in the data space for advancing the cloud data platform.
  • Develop data streaming, migration and replication solutions
  • Demonstrate leadership, collaboration, exceptional communication, negotiation, strategic and influencing skills to gain consensus and produce the best solutions.
  • Engage with senior leadership, business leaders at the client, and the Board to share the business value.

Qualifications

  • Demonstrates mutual respect, embraces diversity, and acts with authenticity
  • Bachelor's degree in Computer Science, Management Information Systems, Computer Engineering, or a related field, or equivalent work experience; advanced degree preferred
  • Seven or more years of experience designing and building large-scale solutions in an enterprise setting
  • Three or more years designing and building solutions in the cloud
  • Expertise in building and managing Cloud databases such as AWS RDS, DynamoDB, DocumentDB or analogous architectures
  • Expertise in building Cloud Database Management Systems in Databricks Lakehouse or analogous architectures
  • Expertise in Cloud Data Warehouses in Redshift, BigQuery or analogous architectures a plus
  • Deep SQL expertise, data modeling, and experience with data governance in relational databases
  • Experience with the practical application of data warehousing concepts, methodologies, and frameworks using traditional (Vertica, Teradata, etc.) and current (SparkSQL, Hadoop, Kafka) distributed technologies
  • Refined skills using one or more scripting languages (e.g., Python, bash, etc.)
  • Experience using ETL/ELT tools and technologies such as Talend or Informatica a plus
  • Embrace data platform thinking, design and develop data pipelines keeping security, scale, uptime and reliability in mind
  • Expertise in relational and dimensional data modeling
  • UNIX admin and general server administration experience required
  • Experience with Presto, Hive, SparkSQL, Cassandra, Solr, or other big data query and transformation technologies a plus
  • Experience using Spark, Kafka, Hadoop, or similar distributed data technologies a plus
  • Able to expertly express the benefits and constraints of technology solutions to technology partners, business partners, and team members
  • Experience with leveraging CI/CD pipelines
  • Experience with Agile methodologies and able to work in an Agile manner is preferred
  • One or more cloud certifications.

Required profile

Experience

Level of experience: Entry-level / graduate
Industry: Information Technology & Services
Spoken language(s): English

Other Skills

  • Collaboration
  • Communication
  • Leadership
  • Negotiation
