Description
The Associate Solution Architect will be responsible for providing technical leadership, architecture, governance, and oversight for CFRA's next generation of data science and data processing software, built on a modern cloud-native technology stack with Python on AWS infrastructure. This is a rare opportunity to make a significant impact on both the team and the organization by helping design and develop the application frameworks that will serve as the foundation for all future development at CFRA.
The ideal candidate has a passion for solving business problems with technology and can effectively communicate business and technical needs to stakeholders. We are looking for candidates who value collaboration with colleagues and want to make an immediate, tangible impact at a leading global independent financial insights and information company.
Key Responsibilities
- Requirement Analysis: Collaborate with stakeholders to understand business requirements and data sources, and define the architecture and design of data engineering models that meet those requirements.
- Architecture Design: Design scalable, reliable, and efficient data engineering models, including algorithms, data pipelines, and data processing systems, to support business requirements and quantitative analysis.
- Technology Selection: Evaluate (via proofs of concept) and recommend appropriate technologies, frameworks, and tools for building and managing data engineering models, considering factors such as performance, scalability, and cost-effectiveness.
- Data Processing: Develop and implement data processing logic, including data cleansing, transformation, and aggregation, using technologies such as AWS Glue, AWS Batch, and AWS Lambda (see the first sketch after this list).
- Quantitative Analysis: Collaborate with data scientists and analysts to develop algorithms and models for quantitative analysis, using techniques such as regression analysis, clustering, and predictive modeling.
- Model Evaluation: Evaluate the performance of data engineering models using metrics and validation techniques and iterate on models to improve their accuracy and effectiveness.
- Data Visualization: Create visualizations of data and model outputs to communicate insights and findings to stakeholders.
- Monitoring and Logging: Implement monitoring and logging solutions for data engineering models using tools like Amazon CloudWatch to ensure model health and performance (see the second sketch after this list).
- Security and Compliance: Ensure data engineering model architecture complies with security best practices and regulatory requirements, implementing encryption, access controls, and data masking as needed.
- Documentation: Create and maintain documentation for data engineering model architecture, design, and implementation, including diagrams, data flow descriptions, and operational procedures.
- Collaboration: Collaborate with cross-functional teams, including data engineers, data scientists, and business analysts, to understand their requirements and integrate data engineering models into their workflows.
- Continuous Improvement: Stay updated with the latest trends, tools, and technologies in data engineering and quantitative analysis, and continuously improve data engineering model processes and methodologies.
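
To illustrate the Data Processing responsibility above, here is a minimal sketch of a cleansing-and-aggregation step written as an AWS Lambda handler in Python. The event shape, field names, and aggregation logic are hypothetical placeholders chosen only to show the pattern, not CFRA's actual pipelines.

```python
import json
from collections import defaultdict


def handler(event, context):
    """Cleanse, transform, and aggregate a batch of incoming records.

    The event shape and field names ("records", "ticker", "price") are
    hypothetical; a real pipeline would read from S3, Kinesis, Glue,
    or another upstream source.
    """
    records = event.get("records", [])

    # Cleansing: drop records missing required fields and normalize types.
    cleaned = []
    for rec in records:
        if rec.get("ticker") and rec.get("price") is not None:
            cleaned.append(
                {"ticker": rec["ticker"].strip().upper(), "price": float(rec["price"])}
            )

    # Aggregation: compute the average price per ticker.
    totals = defaultdict(lambda: [0.0, 0])
    for rec in cleaned:
        totals[rec["ticker"]][0] += rec["price"]
        totals[rec["ticker"]][1] += 1
    averages = {t: s / n for t, (s, n) in totals.items()}

    return {"statusCode": 200, "body": json.dumps(averages)}
```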
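
For the Monitoring and Logging responsibility, here is a minimal sketch of publishing custom pipeline-health metrics to CloudWatch with boto3. The namespace and metric names are illustrative assumptions, not CFRA's actual monitoring conventions.

```python
import boto3

cloudwatch = boto3.client("cloudwatch")


def publish_pipeline_metrics(records_processed: int, error_count: int) -> None:
    """Publish custom pipeline-health metrics to CloudWatch.

    "DataPipeline/Example" and the metric names below are hypothetical.
    """
    cloudwatch.put_metric_data(
        Namespace="DataPipeline/Example",
        MetricData=[
            {"MetricName": "RecordsProcessed", "Value": float(records_processed), "Unit": "Count"},
            {"MetricName": "ErrorCount", "Value": float(error_count), "Unit": "Count"},
        ],
    )
```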
Desired Skills And Experience
- A minimum of 10 years of development experience on enterprise applications
- Data Engineering: Understanding of data engineering principles and practices, including data ingestion, processing, transformation, and storage, using tools and technologies such as AWS Glue, AWS Batch, and AWS Lambda.
- Quantitative Analysis: Proficiency in quantitative analysis techniques, including statistical modeling, machine learning, and data mining, with experience implementing algorithms for regression analysis, clustering, classification, and predictive modeling (a brief sketch follows this list).
- Programming Languages: Proficiency in programming languages commonly used in data engineering and quantitative analysis, such as Python, R, Java, or Scala, as well as experience with SQL for data querying and manipulation.
- Big Data Technologies: Familiarity with big data technologies and platforms, such as Hadoop, Apache Kafka, Apache Hive, or AWS EMR, for processing and analyzing large volumes of data.
- Cloud Computing: Proficiency in using AWS cloud services for data pipeline architecture and implementation.
- Data Integration: Understanding of data integration techniques and tools for integrating data from various sources, including batch and real-time data integration, and experience with ETL (Extract, Transform, Load) processes.
- Database Systems: Knowledge of database systems, including relational databases (e.g., MySQL, PostgreSQL) and NoSQL databases (e.g., MongoDB, Cassandra), and experience in designing and managing database schemas.
- Problem-solving Skills: Excellent problem-solving skills, with the ability to analyze complex data engineering and quantitative analysis problems, identify solutions, and implement them effectively.
- Communication and Collaboration: Strong communication and collaboration skills, with the ability to effectively communicate technical concepts to non-technical stakeholders and work effectively in a team environment.
- Learning Agility: A commitment to continuous learning and staying updated with the latest trends, tools, and technologies in data engineering, quantitative analysis, and machine learning.
- Bachelor's Degree: A bachelor's degree in Computer Science, Software Engineering, or a related field is preferred, although equivalent experience and certifications are also valued. Financial domain knowledge is a plus.
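
As a concrete illustration of the Quantitative Analysis skills above, here is a minimal Python sketch of regression and clustering using scikit-learn. The data and parameters are synthetic placeholders, not a representation of CFRA's models.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LinearRegression

# Synthetic features and target standing in for real market data.
rng = np.random.default_rng(42)
X = rng.normal(size=(200, 3))
y = X @ np.array([0.5, -1.2, 2.0]) + rng.normal(scale=0.1, size=200)

# Regression: fit a linear model and inspect the recovered coefficients.
reg = LinearRegression().fit(X, y)
print("coefficients:", reg.coef_)

# Clustering: group the observations into four clusters with k-means.
labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(X)
print("cluster sizes:", np.bincount(labels))
```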
Benefits
- 21 days of annual vacation
- 8 sick days
- 6 casual days
- 1 paid volunteer day
- Medical, accidental & term life insurance
- Telehealth and outpatient (OPD) coverage
- Competitive pay
- Annual performance bonus
About CFRA
CFRA is a leading independent investment insights and data analytics company. Through an unmatched multidisciplinary approach to investment research, including expert lenses on forensic accounting, fundamental, policy, legal, fund, and technical research, CFRA provides actionable analytics for making better investment and business decisions. CFRA is results-oriented; we place an unwavering priority on the quality of our research, from the productivity and performance of our analysts to the success of our client relationships. Over 2,000 clients rely on CFRA's proprietary research, conducted by experts who analyze industries, funds, and companies of interest with our time-tested and rigorous research methodology.
On October 1, 2016, CFRA acquired S&P Global's Equity and Fund Research business, a leading provider of independent research and commentary with offerings focused on stocks, ETFs, and mutual funds as well as sectors and industries. The Equity and Fund Research business originated in the 1920s and has amassed a worldwide base of investing clients.
The combined firm is committed to being the world's leading independent investment research firm, with ~90 global analysts authoring in-depth qualitative research on 1,600+ companies. In addition, CFRA offers a comprehensive view of global sector themes, industries, and funds through in-depth qualitative research on 11 sectors, 73 industries, 19,000+ ETFs, and 15,000+ mutual funds, as well as quantitative company research on 20,000+ global companies.
Founded as the “Center for Financial Research and Analysis” in 1994, today our company is simply known as CFRA. However, our mission remains to be the “center” for our global clients by providing independent, differentiated, and actionable analysis to help you make better investment and business decisions.
Our clients are based in the US, Europe, Middle East, Asia, and Australia representing thousands of investment professionals and risk managers at leading hedge funds, mutual funds, pension managers, insurance companies, private equity, investment advisors, banks, regulators, corporations, and professional service organizations.