ETL Data Wrangler, Senior – LINUX + Strong Scripting Skills 1284

Augusta, GA 30901, US




The Senior Data Flow Engineer, aka Data Wrangler, will constantly interact with Linux-centric data systems via scripting and queries to support customer requests in a fast-paced environment, working both independently and as part of a team. Referral Eligible!


Essential Functions:

  • Conceptualize and build scalable, robust microservices using technologies such as NiFi, Kafka, Hadoop, Spark, Hive, and MapReduce
  • Assist with data modeling, database design, ETL design, job scheduling and monitoring, etc.
  • Design and develop modular ETL solutions which ingest and deliver data from source systems to analytics platforms (structured, semi-structured, and unstructured; queue / batch and real-time)
  • Maintain reference architecture and documentation for the purposes of architectural governance and application roadmap

Job Qualifications:

Qualifications and Skills include:

  • Must have direct experience with successful Apache Server data flow engineering
  • Direct and ongoing experience with Apache Server ETL methodology
  • Experience with ETL scripting in one or more of: Groovy, Python, C, C++, Java, .NET, Bash, AWK
  • Must have an IAT Level II certification of Security+ CE or higher.

Preferred Qualifications and Skills include:

  • Specialized agency experience with Apache NiFi (Niagara Files) is preferred; training will be required if that skill set is not already in your portfolio
  • Microservice architectures and any of the following configuration management and orchestration tools: Puppet, Ansible, Rancher, Kubernetes, Packer, Terraform, Kafka, NiFi
  • Version control systems such as Git or SVN
  • Must have direct experience resolving various Apache Server hardware and software problems
  • Experience working with Microsoft Azure, including working within cloud-based virtual machines and using web-based productivity tools (email, calendar, and Teams)
  • Previous design and development of distributed data processing pipelines, such as the Big Data Platform (BDP), using Python, Java, Kafka, S3, AWS (C2S, AC2SP), etc.


Education/Experience includes:

  • 10 years of experience and a BA/BS

Working Conditions:

Prolonged periods sitting at a desk and working on a computer.


Position Type/Expected Hours of Work:

Full-time position


Background screening:

Employment is contingent on successfully passing the required background check, as well as other factors, including, but not limited to, drug screens. 

*This position requires an active TS/SCI clearance. 


AAP/EEO Statement: 

Zapata Technology is an Equal Opportunity/Affirmative Action Employer. All qualified applicants will receive consideration for employment without regard to race, color, genetic information, creed, religion, sex, pregnancy, sexual orientation, gender identity, national origin, age, protected veteran status, or disability status. 


AAP/EEO Employer: Minorities/Females/Disabled/Veterans