Job Title: DevOps Engineer (50% Kafka, 50% DevOps)
Location: San Antonio (USAA)
Job Summary:
The DevOps Engineer will work with both Apache Kafka and DevOps tools to design, deploy, and manage real-time data streaming solutions, particularly using AWS services and Apache Flink. This role requires a deep understanding of Kafka, cloud infrastructure, and automation tools.
Key Skills & Expertise:
1. AWS Managed Services:
◦ Proficiency with AWS services such as MSK (Managed Streaming for Kafka), Kinesis, Lambda, EC2, RDS, VPC, IAM, and more.
◦ Ability to manage infrastructure using CloudFormation or Terraform.
2. Apache Flink:
◦ Expertise in Apache Flink for stream and batch data processing, including its integration with Kafka.
◦ Experience with managing Flink clusters on AWS (EC2, EKS, or managed services).
3. Apache Kafka:
◦ In-depth knowledge of Kafka architecture, including brokers, topics, partitions, producers, and consumers.
◦ Experience with managing Kafka clusters (MSK or self-managed).
4. DevOps & Automation:
◦ Experience with automating deployments and infrastructure using CI/CD tools (e.g., Jenkins, GitLab, GitHub Actions).
◦ Proficiency with Docker, Kubernetes, and cloud-based container orchestration.
5. Programming & Scripting:
◦ Strong scripting skills in Python, Bash, or Go for automation and integration tasks.
6. Monitoring & Performance Tuning:
◦ Knowledge of monitoring tools like CloudWatch, Prometheus, and Grafana.
◦ Expertise in optimizing data pipelines for performance and scalability.
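Much of the monitoring work above comes down to tracking consumer lag (log-end offset minus committed offset per partition), which tools like CloudWatch or Prometheus then alert on. As a minimal illustrative sketch, the helper below (`consumer_lag` is a hypothetical name, and the offset dictionaries would in practice come from a tool such as `kafka-consumer-groups` or an admin client) shows the arithmetic:

```python
def consumer_lag(end_offsets: dict, committed_offsets: dict) -> dict:
    """Compute per-partition consumer lag.

    end_offsets:       partition -> log-end offset (latest produced)
    committed_offsets: partition -> last committed consumer offset
    A partition with no committed offset is treated as fully behind.
    """
    return {
        partition: end - committed_offsets.get(partition, 0)
        for partition, end in end_offsets.items()
    }


if __name__ == "__main__":
    # Example: partition 0 is 10 messages behind, partition 1 is caught up,
    # partition 2 has never been consumed.
    lag = consumer_lag(
        end_offsets={0: 100, 1: 50, 2: 25},
        committed_offsets={0: 90, 1: 50},
    )
    print(lag)
```

Summing the per-partition values gives total group lag, a common scaling signal for both Kafka consumers and Flink sources.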
Key Responsibilities:
1. Infrastructure Design & Implementation:
◦ Design and deploy real-time data pipelines using Apache Flink and Kafka on AWS, ensuring scalability and fault tolerance.
2. Platform Management:
◦ Manage and optimize Kafka clusters and Flink jobs for performance and scaling on AWS.
3. Automation & CI/CD:
◦ Automate infrastructure provisioning and monitoring using Terraform, CloudFormation, and other tools.
4. Collaboration:
◦ Work closely with Data Engineers, Data Scientists, and DevOps teams for smooth integration of data systems.
5. Security & Compliance:
◦ Implement security for Kafka and Flink clusters, including encryption and access controls.
6. Optimization & Troubleshooting:
◦ Optimize Kafka and Flink deployments for performance, troubleshoot issues, and ensure reliability.
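The infrastructure-as-code responsibilities above might look something like the following Terraform fragment for an MSK cluster. This is a sketch only: the cluster name, Kafka version, subnet and security-group IDs, and sizing values are placeholder assumptions, and a production configuration would also cover logging, authentication, and networking in full.

```hcl
# Sketch of an MSK cluster definition; all names and IDs are placeholders.
resource "aws_msk_cluster" "streaming" {
  cluster_name           = "streaming-example"   # placeholder
  kafka_version          = "3.6.0"               # assumed supported version
  number_of_broker_nodes = 3                     # one per AZ

  broker_node_group_info {
    instance_type   = "kafka.m5.large"
    client_subnets  = ["subnet-aaaa", "subnet-bbbb", "subnet-cccc"]  # placeholders
    security_groups = ["sg-example"]                                 # placeholder

    storage_info {
      ebs_storage_info {
        volume_size = 100  # GiB per broker
      }
    }
  }

  encryption_info {
    encryption_in_transit {
      client_broker = "TLS"  # enforce encrypted client traffic
    }
  }
}
```

Keeping the cluster in Terraform (or an equivalent CloudFormation template) lets broker sizing, storage, and encryption settings be reviewed and promoted through CI/CD like any other change.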