
PySpark / SQL Engineer

Requirements

  • 5-10+ years of experience as a PySpark / SQL Engineer with a focus on data engineering and analytics
  • Experience building threat detection or log analytics pipelines using PySpark, SQL, and Databricks
  • Hands-on experience with Terraform for deploying data infrastructure
  • Familiarity with cloud platforms such as AWS or Azure

Responsibilities:

  • Design and build threat detection pipelines using PySpark, SQL, and Databricks
  • Support the migration of detection rules and content from a legacy platform to Databricks
  • Create and maintain PySpark log pipelines and associated rule configuration files; write unit tests and perform data validation checks
  • Deploy pipelines and infrastructure using Terraform; optimize data workflows for performance and scalability

Job description


PySpark / SQL Engineer
1 Contract position, Remote (Los Angeles, CA)
Duration: 3 months

Hours per week: 40
Monday to Friday, 9am to 5pm

Skills:
PySpark
SQL
Databricks
Threat Detection
Terraform
Log Analytics for Security

We are seeking a skilled PySpark / SQL Engineer to support our Threat Detection team in building and migrating security analytics pipelines using Databricks. This role will focus on a platform migration project, moving detection rules and associated content from a legacy system into Databricks' native detection framework. You will be responsible for creating equivalent PySpark log pipelines, rule configuration files, unit tests, and data validation checks, and deploying these pipelines using Terraform. A strong background in data engineering, particularly with large-scale log analytics, is essential.

Requirements
5-10+ years of experience as a PySpark / SQL Engineer, with a strong focus on data engineering and analytics.
Prior experience in building threat detection or log analytics pipelines using PySpark, SQL, and Databricks.
Hands-on experience with Terraform for deploying data infrastructure.
Proficient in PySpark for large-scale data processing and transformation.
Familiarity with cloud platforms such as AWS or Azure is preferred.
Strong analytical skills and attention to detail when working with complex datasets.
Proven ability to work effectively in collaborative, cross-functional teams.
Excellent verbal and written communication skills in English.

Responsibilities
Design and build threat detection pipelines using PySpark, SQL, and Databricks.
Support the migration of detection rules and content from a legacy platform to Databricks.
Create and maintain PySpark log pipelines and associated rule configuration files.
Write unit tests to ensure pipeline accuracy and stability.
Perform data validation checks to ensure data integrity.
Deploy pipelines and infrastructure using Terraform.
Optimize existing data workflows and queries for performance and scalability.
Collaborate with cross-functional teams to understand data requirements and ensure alignment with detection objectives.
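To make the rule-configuration, unit-test, and data-validation work concrete, here is a minimal sketch of the kind of detection rule and validation check the role involves. All names are hypothetical, and plain Python stands in for PySpark so the example is self-contained; a real pipeline would express the same logic as PySpark DataFrame transformations running on Databricks.

```python
# Hypothetical detection rule configuration (illustrative names only).
# In a real pipeline this would live in a rule configuration file and
# be applied with PySpark on Databricks.
FAILED_LOGIN_RULE = {
    "name": "excessive_failed_logins",
    "filter": lambda event: event.get("action") == "login_failure",
    "group_by": "user",
    "threshold": 5,  # alert when a group exceeds this count
}

def run_rule(rule, events):
    """Count matching events per group; return groups over the threshold."""
    counts = {}
    for event in filter(rule["filter"], events):
        key = event.get(rule["group_by"])
        counts[key] = counts.get(key, 0) + 1
    return {k: v for k, v in counts.items() if v > rule["threshold"]}

def validate(events, required=("action", "user", "timestamp")):
    """Data validation check: return events missing any required field."""
    return [e for e in events if not all(f in e for f in required)]
```

A unit test for such a rule would feed it synthetic log events and assert on the alerts, e.g. six `login_failure` events for one user should trip the threshold of five, while `validate` should flag any event missing a required field before it reaches the detection stage.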
