Job Title: DevOps Engineer (50% Kafka, 50% DevOps)
Location: San Antonio, TX (USAA)
Job Summary:
The DevOps Engineer will work with both Apache Kafka and DevOps tools to design, deploy, and manage real-time data streaming solutions, particularly using AWS services and Apache Flink. This role requires a deep understanding of Kafka, cloud infrastructure, and automation tools.
Key Skills & Expertise:
1. AWS Managed Services:
◦ Proficiency with AWS services such as MSK (Managed Streaming for Kafka), Kinesis, Lambda, EC2, RDS, VPC, IAM, and more.
◦ Ability to manage infrastructure using CloudFormation or Terraform.
2. Apache Flink:
◦ Expertise in Apache Flink for stream and batch data processing, including its integration with Kafka.
◦ Experience with managing Flink clusters on AWS (EC2, EKS, or managed services).
3. Apache Kafka:
◦ In-depth knowledge of Kafka architecture, including brokers, topics, partitions, producers, and consumers.
◦ Experience with managing Kafka clusters (MSK or self-managed).
4. DevOps & Automation:
◦ Experience with automating deployments and infrastructure using CI/CD tools (e.g., Jenkins, GitLab, GitHub Actions).
◦ Proficiency with Docker, Kubernetes, and cloud-based containerization.
5. Programming & Scripting:
◦ Strong scripting skills in Python, Bash, or Go for automation and integration tasks.
6. Monitoring & Performance Tuning:
◦ Knowledge of monitoring tools like CloudWatch, Prometheus, and Grafana.
◦ Expertise in optimizing data pipelines for performance and scalability.
Key Responsibilities:
1. Infrastructure Design & Implementation:
◦ Design and deploy real-time data pipelines using Apache Flink and Kafka on AWS, ensuring scalability and fault tolerance.
2. Platform Management:
◦ Manage and optimize Kafka clusters and Flink jobs for performance and scaling on AWS.
3. Automation & CI/CD:
◦ Automate infrastructure provisioning and monitoring using Terraform, CloudFormation, and other tools.
4. Collaboration:
◦ Work closely with Data Engineers, Data Scientists, and DevOps teams for smooth integration of data systems.
5. Security & Compliance:
◦ Implement security for Kafka and Flink clusters, including encryption and access controls.
6. Optimization & Troubleshooting:
◦ Optimize Kafka and Flink deployments for performance, troubleshoot issues, and ensure reliability.