Job Title: DevOps Engineer (50% Kafka, 50% DevOps)
Location: San Antonio, TX (USAA)
Job Summary:
The DevOps Engineer will work with both Apache Kafka and DevOps tools to design, deploy, and manage real-time data streaming solutions, particularly using AWS services and Apache Flink. This role requires a deep understanding of Kafka, cloud infrastructure, and automation tools.
Key Skills & Expertise:
1. AWS Managed Services:
◦ Proficiency with AWS services such as MSK (Amazon Managed Streaming for Apache Kafka), Kinesis, Lambda, EC2, RDS, VPC, and IAM.
◦ Ability to manage infrastructure using CloudFormation or Terraform.
2. Apache Flink:
◦ Expertise in Apache Flink for stream and batch data processing, including its integration with Kafka.
◦ Experience with managing Flink clusters on AWS (EC2, EKS, or managed services).
3. Apache Kafka:
◦ In-depth knowledge of Kafka architecture, including brokers, topics, partitions, producers, and consumers.
◦ Experience with managing Kafka clusters (MSK or self-managed).
4. DevOps & Automation:
◦ Experience with automating deployments and infrastructure using CI/CD tools (e.g., Jenkins, GitLab, GitHub Actions).
◦ Proficiency with Docker, Kubernetes, and container orchestration in cloud environments.
5. Programming & Scripting:
◦ Strong scripting skills in Python, Bash, or Go for automation and integration tasks.
6. Monitoring & Performance Tuning:
◦ Knowledge of monitoring tools like CloudWatch, Prometheus, and Grafana.
◦ Expertise in optimizing data pipelines for performance and scalability.
Key Responsibilities:
1. Infrastructure Design & Implementation:
◦ Design and deploy real-time data pipelines using Apache Flink and Kafka on AWS, ensuring scalability and fault tolerance.
2. Platform Management:
◦ Manage and optimize Kafka clusters and Flink jobs for performance and scaling on AWS.
3. Automation & CI/CD:
◦ Automate infrastructure provisioning and monitoring using Terraform, CloudFormation, and other tools.
4. Collaboration:
◦ Work closely with Data Engineers, Data Scientists, and DevOps teams to ensure smooth integration of data systems.
5. Security & Compliance:
◦ Implement security for Kafka and Flink clusters, including encryption and access controls.
6. Optimization & Troubleshooting:
◦ Optimize Kafka and Flink deployments for performance, troubleshoot issues, and ensure reliability.