REMOTE Kafka Engineer (1211)

REMOTE Kafka Engineer (1211) | Augusta, GA 30901, US


Referral Eligible 


This position is remote/work from home. The Kafka engineer will assist in the design and development of a modular ETL solution that harvests, ingests, and streams data from source systems to analytics platforms (structured, semi-structured, and unstructured; queue/batch and real-time), using Kafka and NiFi for data pipeline deployment. Must be able to handle discrete tasks in a fast-paced environment, both independently and as part of a team. 

Essential Functions:

  • Maintain documentation and update tasking documents
  • Assist in data modeling, queue design, data pipeline design, and job scheduling and monitoring. Transform data into usable streams by developing data structures, designing a data replay strategy, and implementing the data streaming and flow architecture
  • Implement data pipelines for event-based data replay and streaming, including micro-batch and batch-based data pipelines.
  • Manage a version control system such as Git or SVN.
  • Switch between tasks seamlessly. 

Job Qualifications:

Qualifications and Skills include:

  • Must be fully capable of implementing and administering Apache Kafka.
  • Intermediate-level expertise creating data pipelines, including ETL and streaming data (such as log data or tool/sensor data) to indices.
  • Experience evaluating new methodologies and technologies to meet requirements.
  • Scripting experience in Groovy, Python, C, C++, Java, .NET, Bash, or AWK.
  • Experience supporting successful Kafka data pipelines, batch/queue engineering, ETL, and related tasks.
  • Specialized work experience in environments using Kafka and Groovy scripting.
  • Professional manner and a strong ethical code.


Preferred Qualifications and Skills include:

  • Ability to design and implement data visualizations; experience with the ELK stack preferred 
  • Cloud workload experience (Amazon Web Services, Azure)
  • Experience working with Microsoft Azure, including cloud-based virtual machines and web-based productivity tools (email, calendar, and Teams).
  • Previous experience with distributed data processing pipelines such as Big Data Platform (BDP) using Python, Java, Kafka, S3, AWS (C2S, AC2SP), etc.
  • Experience with microservice architectures and any of the following configuration management and orchestration tools: Puppet, Ansible, Rancher, Kubernetes, Packer, Terraform, and NiFi

Education/Experience includes:

  • 3 to 10 years of experience and a BA/BS, or
  • 7 to 13 years of experience and an Associate's degree, or
  • 9 to 16 years of experience

Working Conditions:

Prolonged periods sitting at a desk and working on a computer. 

Position Type/Expected Hours of Work:

Full-time position

Travel:

Less than 5%

Background screening:

Employment is contingent on successfully passing the required background check, as well as other factors, including, but not limited to, drug screens. Must be willing to obtain a security clearance if compliance requirements dictate. 

AAP/EEO Statement: 

Zapata Technology is an Equal Opportunity/Affirmative Action Employer. All qualified applicants will receive consideration for employment without regard to race, color, genetic information, creed, religion, sex, pregnancy, sexual orientation, gender identity, national origin, age, protected veteran status, or disability status. 

AAP/EEO Employer: Minorities/Females/Disabled/Veterans