DevOps Engineer - IBM/USAA
Location: San Antonio, TX
Tools & Technologies:
• Apache Kafka (self-managed or Amazon MSK)
• Amazon Managed Service for Apache Flink
• Amazon EC2, S3, RDS, and VPC
• Terraform/CloudFormation
• Docker, Kubernetes (EKS)
• ELK Stack, CloudWatch
• Python, Bash
Skills & Expertise:
1. AWS Managed Services:
◦ Proficiency in AWS services such as Amazon MSK, Kinesis, Lambda, S3, EC2, RDS, VPC, and IAM.
◦ Experience with infrastructure management using Terraform or AWS CloudFormation.
2. Apache Flink:
◦ Understanding of real-time stream and batch data processing with Apache Flink.
◦ Familiarity with Flink's Kafka integration and with managing Flink clusters on AWS.
3. Kafka Broker (Apache Kafka):
◦ Strong knowledge of Kafka architecture, brokers, topics, partitions, producers, and consumers.
◦ Experience with Kafka management, monitoring, scaling, and optimization, particularly with Amazon MSK or self-managed Kafka clusters (see the producer/consumer sketch after this list).
4. DevOps & Automation:
◦ Expertise in automating deployments and infrastructure provisioning.
◦ Familiarity with CI/CD pipelines using tools such as Jenkins, GitLab CI, GitHub Actions, and CircleCI.
◦ Experience with Docker and Kubernetes for containerizing and orchestrating applications in the cloud.
5. Programming & Scripting:
◦ Strong scripting skills in Python, Bash, or Go for automation tasks and data pipeline integrations.
6. Monitoring & Performance Tuning:
◦ Proficiency with monitoring tools such as CloudWatch, Prometheus, and Grafana (see the CloudWatch sketch after this list).
◦ Expertise in optimizing data pipelines for scalability, fault tolerance, and performance.
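The producer/consumer sketch referenced in the Kafka item above: a minimal Python example using the kafka-python client. The broker address, topic name, and consumer group are placeholders rather than details from this posting; against MSK, the same code applies once the cluster's bootstrap brokers and any TLS/IAM client settings are supplied.

    import json
    from kafka import KafkaConsumer, KafkaProducer

    BROKERS = "b-1.example.kafka.us-east-1.amazonaws.com:9092"  # placeholder bootstrap broker

    # Produce one JSON-encoded event to an illustrative "orders" topic.
    producer = KafkaProducer(
        bootstrap_servers=BROKERS,
        value_serializer=lambda v: json.dumps(v).encode("utf-8"),
    )
    producer.send("orders", {"order_id": 123, "status": "created"})
    producer.flush()

    # Consume the same topic as part of a consumer group, starting from the earliest offset.
    consumer = KafkaConsumer(
        "orders",
        bootstrap_servers=BROKERS,
        group_id="orders-processor",
        auto_offset_reset="earliest",
        value_deserializer=lambda b: json.loads(b.decode("utf-8")),
    )
    for message in consumer:
        print(message.topic, message.partition, message.offset, message.value)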
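The CloudWatch sketch referenced in the monitoring item above: a hedged boto3 example that reads one Amazon MSK broker metric. The region, cluster name, and metric choice are assumptions made for illustration.

    from datetime import datetime, timedelta, timezone

    import boto3

    cloudwatch = boto3.client("cloudwatch", region_name="us-east-1")  # region is an assumption

    # Average user CPU for broker 1 of a placeholder MSK cluster over the last hour.
    resp = cloudwatch.get_metric_statistics(
        Namespace="AWS/Kafka",
        MetricName="CpuUser",
        Dimensions=[
            {"Name": "Cluster Name", "Value": "demo-msk-cluster"},
            {"Name": "Broker ID", "Value": "1"},
        ],
        StartTime=datetime.now(timezone.utc) - timedelta(hours=1),
        EndTime=datetime.now(timezone.utc),
        Period=300,
        Statistics=["Average"],
    )
    for point in sorted(resp["Datapoints"], key=lambda p: p["Timestamp"]):
        print(point["Timestamp"], round(point["Average"], 2))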
Responsibilities:
1. Infrastructure Design & Implementation:
◦ Design and deploy scalable, fault-tolerant real-time data processing pipelines using Apache Flink and Kafka on AWS (see the Flink-Kafka sketch after this list).
2. Platform Management:
◦ Manage and optimize Kafka clusters (MSK or self-managed) and Flink jobs on AWS infrastructure for low-latency, high-throughput processing.
3. Automation & CI/CD:
◦ Automate infrastructure provisioning and deployment using Terraform, CloudFormation, or similar tools (see the CloudFormation sketch after this list).
◦ Integrate new applications into CI/CD pipelines for real-time data processing.
4. Collaboration with Data Engineering Teams:
◦ Work with Data Engineers, Data Scientists, and DevOps teams to ensure smooth integration and high performance of real-time data systems.
5. Security and Compliance:
◦ Implement security controls (authentication, authorization, and encryption in transit and at rest) for Kafka and Flink clusters, and ensure compliance with regulatory standards (e.g., GDPR, HIPAA).
6. Optimization & Troubleshooting:
◦ Optimize Kafka and Flink performance and troubleshoot issues related to message delivery, job failures, and AWS service outages.
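The Flink-Kafka sketch referenced in the design responsibility above: a PyFlink Table API snippet that registers a Kafka topic as a streaming source and runs a trivial continuous aggregation. The broker address, topic, and schema are illustrative, the Kafka SQL connector must be available to the Flink runtime, and on AWS this would normally run inside an Amazon Managed Service for Apache Flink application rather than locally.

    from pyflink.table import EnvironmentSettings, TableEnvironment

    # Streaming Table API environment.
    t_env = TableEnvironment.create(EnvironmentSettings.in_streaming_mode())

    # Register the illustrative "orders" Kafka topic as a source table.
    t_env.execute_sql("""
        CREATE TABLE orders (
            order_id BIGINT,
            status STRING,
            event_time TIMESTAMP(3)
        ) WITH (
            'connector' = 'kafka',
            'topic' = 'orders',
            'properties.bootstrap.servers' = 'b-1.example.kafka.us-east-1.amazonaws.com:9092',
            'properties.group.id' = 'flink-orders',
            'scan.startup.mode' = 'earliest-offset',
            'format' = 'json'
        )
    """)

    # A trivial continuous query; real jobs would window, aggregate, or enrich before writing to a sink.
    t_env.execute_sql("SELECT status, COUNT(*) AS cnt FROM orders GROUP BY status").print()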
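The CloudFormation sketch referenced in the automation responsibility above: a minimal boto3 call that provisions a stack from a template. The stack name, template file, and parameter are placeholders, and in practice this step would typically run from a CI/CD pipeline rather than by hand.

    import boto3

    cfn = boto3.client("cloudformation", region_name="us-east-1")  # region is an assumption

    with open("streaming-platform.yaml") as f:  # hypothetical template defining MSK, Flink, and networking
        template_body = f.read()

    # Create the stack; CAPABILITY_NAMED_IAM is required when the template creates named IAM resources.
    cfn.create_stack(
        StackName="streaming-platform-dev",  # placeholder stack name
        TemplateBody=template_body,
        Capabilities=["CAPABILITY_NAMED_IAM"],
        Parameters=[{"ParameterKey": "Environment", "ParameterValue": "dev"}],
    )

    # Block until the stack reaches CREATE_COMPLETE, raising if creation fails.
    cfn.get_waiter("stack_create_complete").wait(StackName="streaming-platform-dev")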