Data Operations Engineer

Date: Mar 20, 2026

Location: Greenville, SC, US, 29601

Company: Purpose Financial

Address: 322 Rhett Street, Greenville, South Carolina, United States - 29601   

 

Purpose Financial, Inc. is an innovative consumer financial services company that offers a diverse suite of credit products, promoting financial inclusion and meeting consumers wherever they are. Through its brands, the company is committed to helping customers achieve their version of financial stability in the moment and in the future. Since 1997, Purpose Financial has been a pioneer in the consumer credit and financial services market, offering money solutions through more than 800 storefront locations and online lending. Providing services in more than 23 states, Purpose Financial employs over 2,500 team members.

At Purpose Financial we are always on the lookout for motivated individuals who share in our values of mutual respect to join our team of outstanding professionals.

We offer:

  • Competitive Wages
  • Health/Life Benefits
  • Health Savings Account plus Employer Seed
  • 401(k) Savings Plan with Company Match
  • Paid Parental Leave
  • Company Paid Holidays
  • Paid Time Off including Volunteer Time
  • Tuition Reimbursement
  • Business Casual Environment
  • Rewards & Recognition Program
  • Employee Assistance Program
  • Office in downtown Greenville that offers free parking, onsite gym, free snacks/drinks


To learn more about Purpose Financial visit Purpose Financial Website.

 

Position Summary

We’re looking for a skilled Data Operations Engineer to join our growing data team. This position reports to the Manager of Data Operations and is responsible for ensuring the reliability, scalability, and performance of our data pipelines and platforms. You’ll work closely with data engineering, analytics, and infrastructure teams to support mission‑critical data workflows, with a strong focus on Snowflake, Kafka, and modern data‑ops best practices.

Job Responsibilities

Data Pipeline Operations

  • Monitor, maintain, and optimize data pipelines across Snowflake, Kafka, and related systems
  • Troubleshoot and resolve data ingestion, transformation, and delivery issues
  • Ensure high availability and reliability of streaming and batch data processes

Platform Management

  • Manage Snowflake environments including warehouses, databases, roles, and resource optimization
  • Operate and tune Kafka clusters, topics, partitions, and consumer groups
  • Implement data quality checks, validation rules, and operational alerts

Automation & Tooling

  • Build automation for deployment, monitoring, and recovery of data workflows
  • Develop scripts and tools (Python, SQL, Bash, etc.) to streamline operations
  • Contribute to CI/CD pipelines for data infrastructure

Collaboration & Governance

  • Partner with data engineers to improve pipeline design and performance
  • Work with analytics teams to ensure timely and accurate data delivery
  • Support data governance, security, and compliance initiatives

Accountability

Understand, adhere to, and enforce all corporate policies. 

Education Required

A Bachelor’s degree in Computer Science, Information Technology, Software Engineering, Data Management, or a related field.

Experience Required

  • 5+ years in data operations, data engineering, or similar technical role
  • Hands‑on experience with Snowflake (query optimization, warehouse tuning, RBAC, data sharing)
  • Strong experience with Kafka (producers/consumers, schema registry, monitoring, troubleshooting)
  • Proficiency in SQL and at least one scripting language (Python preferred)
  • Experience with cloud platforms (AWS preferred; Azure or GCP also considered)
  • Familiarity with orchestration tools (Airflow, dbt, etc.)
  • Strong understanding of data modeling, ETL/ELT concepts, and data lifecycle management

Knowledge Required

Excellent written and verbal communication skills; adaptability and flexibility in a changing environment; and comfort working in a dynamic, high-volume, fast-paced environment. Ability to understand and ensure compliance with policies, procedures, and laws governing our industry, business, and products.

Preferred Qualifications:

  • Experience with Snowflake features such as Streams, Tasks, and Snowpipe
  • Knowledge of Kafka Connect, Kafka Streams, or Confluent Platform
  • Experience with writing connectors, both producers and consumers
  • Exposure to observability tools (Datadog, Prometheus, Grafana, Splunk)
  • Background in DevOps or SRE practices applied to data systems
  • Experience working in fast‑paced, cloud‑native environments

Physical Requirements

  • Sitting for long periods of time; standing occasionally; walking; bending; squatting; kneeling; pushing/pulling; reaching; twisting; frequent lifting of less than 10 lbs.; occasional lifting of up to 20 lbs.; driving and having access during the workday to insured and reliable transportation; typing; data entry; grasping; transferring items between hands and/or to another person or receptacle; use of office equipment, including computers; ability to travel to, be physically present at, and complete the physical requirements of the position at any assigned location.

Competencies

OKR
Technical Proficiency/Leadership

Travel

0%-10%

Attire

Business Casual

Other

Must be eligible to work in the USA and able to pass a background check.

All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, or disability.

Requisition ID: 46092
 


Nearest Major Market: Greenville
Nearest Secondary Market: South Carolina