Job Title: DevOps Engineer (50% Kafka, 50% DevOps)
Location: San Antonio, TX (USAA)
Job Summary:
The DevOps Engineer will use Apache Kafka alongside DevOps tooling to design, deploy, and manage real-time data streaming solutions, with a focus on AWS services and Apache Flink. This role requires a deep understanding of Kafka, cloud infrastructure, and automation tools.
Key Skills & Expertise:
1. AWS Managed Services:
◦ Proficiency with AWS services such as MSK (Managed Streaming for Apache Kafka), Kinesis, Lambda, EC2, RDS, VPC, and IAM.
◦ Ability to manage infrastructure using CloudFormation or Terraform.
2. Apache Flink:
◦ Expertise in Apache Flink for stream and batch data processing, including its integration with Kafka.
◦ Experience with managing Flink clusters on AWS (EC2, EKS, or managed services).
3. Apache Kafka:
◦ In-depth knowledge of Kafka architecture, including brokers, topics, partitions, producers, and consumers.
◦ Experience with managing Kafka clusters (MSK or self-managed).
4. DevOps & Automation:
◦ Experience with automating deployments and infrastructure using CI/CD tools (e.g., Jenkins, GitLab, GitHub Actions).
◦ Proficiency with Docker, Kubernetes, and container orchestration in the cloud.
5. Programming & Scripting:
◦ Strong scripting skills in Python, Bash, or Go for automation and integration tasks.
6. Monitoring & Performance Tuning:
◦ Knowledge of monitoring tools like CloudWatch, Prometheus, and Grafana.
◦ Expertise in optimizing data pipelines for performance and scalability.
Key Responsibilities:
1. Infrastructure Design & Implementation:
◦ Design and deploy real-time data pipelines using Apache Flink and Kafka on AWS, ensuring scalability and fault tolerance.
2. Platform Management:
◦ Manage and optimize Kafka clusters and Flink jobs for performance and scaling on AWS.
3. Automation & CI/CD:
◦ Automate infrastructure provisioning and monitoring using Terraform, CloudFormation, and other tools.
4. Collaboration:
◦ Work closely with Data Engineers, Data Scientists, and DevOps teams for smooth integration of data systems.
5. Security & Compliance:
◦ Implement security for Kafka and Flink clusters, including encryption and access controls.
6. Optimization & Troubleshooting:
◦ Optimize Kafka and Flink deployments for performance, troubleshoot issues, and ensure reliability.