
Kafka Platform Engineer – remote

Remote, USA

Contract to hire

Posted 09/12/2025

Job Description

OUR GOAL: 
Treat our consultants and clients the way we would like others to treat us!
 
Interested in joining our team? Check out the opportunity below and apply today!
  
A Kafka Platform Engineer is needed for a six-month contract-to-hire opportunity to design, implement, and support scalable, secure Kafka-based messaging pipelines that power real-time communication between critical systems such as credit, loan applications, and fraud services. This role focuses on improving the resiliency, reliability, and operations of the Kafka platform in a highly regulated financial environment. The Kafka Platform Engineer partners closely with engineering and platform teams to support the migration from on-prem to AWS and to ensure seamless integration across systems.

Top Must-Haves:

  • Kafka & Confluent Cloud Expertise
    • Deep understanding of Kafka architecture and Confluent Cloud services.
    • Experience with Kafka Connect, Schema Registry, and stream processing.
  • AWS Infrastructure & Database Management
    • Hands-on experience with AWS services like RDS, Aurora, EC2, IAM, and networking.
    • Ability to integrate Kafka with AWS-hosted databases and troubleshoot cloud-native issues.
  • Terraform & Infrastructure Automation
    • Proficiency in Terraform for provisioning Kafka clusters, AWS resources, and managing infrastructure as code.
    • Familiarity with GitOps workflows and CI/CD pipelines.
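
As a rough illustration of the kind of Kafka administration covered by the must-haves above (the posting calls for Terraform, but the same operations can also be driven programmatically), here is a minimal Python sketch that uses the confluent-kafka AdminClient to create a topic. The broker address, topic name, and partition/replication settings are placeholders, not values from this posting; a Confluent Cloud cluster would additionally need API-key credentials in the client configuration.

    from confluent_kafka.admin import AdminClient, NewTopic

    # Placeholder bootstrap server; Confluent Cloud would also require
    # security settings (SASL_SSL plus API key/secret) in this config.
    admin = AdminClient({"bootstrap.servers": "localhost:9092"})

    # Hypothetical topic for loan-application events.
    topic = NewTopic("loan-applications", num_partitions=6, replication_factor=3)

    # create_topics() returns a dict of topic name -> future; result() raises on failure.
    for name, future in admin.create_topics([topic]).items():
        try:
            future.result()
            print(f"Created topic {name}")
        except Exception as exc:
            print(f"Failed to create {name}: {exc}")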

 
Certification Requirements:

  • Confluent Certified Developer for Apache Kafka
    • Validates deep understanding of Kafka architecture, APIs, and Confluent tooling.
    • Ideal for engineers building and managing Kafka-based data pipelines.
  • AWS Certified Solutions Architect – Associate
    • Demonstrates strong knowledge of AWS services, networking, and architecture best practices.
    • Especially useful for integrating Kafka with AWS-hosted databases and services.
  • HashiCorp Certified: Terraform Associate
    • Confirms proficiency in infrastructure as code, Terraform modules, and cloud provisioning.
    • Valuable for managing Kafka infrastructure and AWS resources declaratively.

 
Essential Job Functions

  • Regularly check cloud services for performance issues, keep them up to date, and optimize as needed. Configure and manage user permissions and roles to ensure secure access to cloud resources. Develop and maintain backup strategies to ensure data integrity and availability. Maintain detailed records of system configurations and changes for compliance and troubleshooting. – (25%)
  • Write and maintain scripts for automated deployment processes. Ensure automated tests are part of the CI/CD pipeline to catch issues early. Track deployment progress and resolve any issues that arise during the process. Work closely with developers to ensure smooth integration of new code into production. Continuously improve deployment processes to reduce downtime and increase efficiency. – (25%)
  • Set up and configure tools to monitor cloud infrastructure and applications. Develop dashboards for real-time monitoring and set up alerts for critical issues. Regularly review monitoring data to identify trends and potential issues. Provide regular reports on system performance and health to stakeholders. Continuously improve monitoring solutions to cover new services and technologies. – (20%)
  • Organize meetings to gather requirements from various teams for cloud projects. Ensure alignment between development, network, and security teams on cloud initiatives. Mediate and resolve any conflicts or discrepancies in requirements or priorities. Keep detailed records of discussions and decisions made during meetings. Ensure that all agreed-upon actions are completed in a timely manner. – (15%)
  • Regularly review resource usage to identify areas for optimization. Predict future resource requirements based on current trends and business growth. Create plans for scaling resources up or down based on demand. Ensure that resources are allocated efficiently to avoid waste and reduce costs. Continuously review and adjust capacity plans to reflect changes in business needs or technology. – (15%)
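
One concrete example of the monitoring work described above: consumer lag is a standard health signal for a Kafka platform. The sketch below assumes the confluent-kafka Python client and uses placeholder broker, consumer-group, and topic names; it compares each partition's committed offset against its high watermark.

    from confluent_kafka import Consumer, TopicPartition, OFFSET_INVALID

    # Placeholder connection details; a real check would read these from config.
    consumer = Consumer({
        "bootstrap.servers": "localhost:9092",
        "group.id": "fraud-service",          # hypothetical consumer group
        "enable.auto.commit": False,
    })

    partitions = [TopicPartition("loan-applications", p) for p in range(6)]
    for tp in consumer.committed(partitions, timeout=10):
        low, high = consumer.get_watermark_offsets(tp, timeout=10)
        # If the group has never committed an offset, fall back to the low watermark.
        committed = tp.offset if tp.offset != OFFSET_INVALID else low
        print(f"partition {tp.partition}: lag={high - committed}")

    consumer.close()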

 
Minimum Qualifications

  • Bachelor’s Degree in Information Technology, Computer Science, Engineering, or a related field, or equivalent relevant work experience.
  • At least one platform-specific certification (AWS, Azure, GCP, DevSecOps, Apache Kafka).
  • 2+ years of relevant experience working across areas of platform engineering.
  • 2+ years of experience with cloud services and an understanding of infrastructure-as-code (IaC) tools such as Terraform or AWS CloudFormation.

Skills:

  • Programming Languages
  • Cloud Services Management
  • CI/CD
  • Configuration Management (CM)
  • Infrastructure as Code (IaC)
  • DevSecOps
  • Monitoring Solutions
  • IT Capacity Planning
  • Security Management
  • Technical Communication
  • Cloud Deployment

 
Top 3 Nice-To-Haves:

  • Monitoring & Observability
    • Experience with tools like Prometheus, Grafana, Datadog, or Confluent Metrics API.
    • Ability to set up alerting and dashboards for Kafka and cloud services.
  • Security & Governance
    • Knowledge of RBAC, encryption, and audit logging in Confluent Cloud and AWS.
    • Experience implementing secure data pipelines and compliance controls.
  • Strong Collaboration & Incident Response
    • Ability to work cross-functionally with data engineers, SREs, and developers.
    • Skilled in communicating during outages, postmortems, and planning sessions.

 
Additional Preferred Qualifications:

  • 5+ years of cloud engineering experience, particularly in designing and implementing cloud platform solutions.
  • 3+ years of experience with Apache Kafka in highly regulated, mission-critical environments (preferably finance or banking).
  • Strong understanding of Kafka internals and distributed systems.
  • Proficiency in Java, Scala, or Python for building Kafka producers, consumers, and stream processors.
  • Experience with Kafka Connect, Schema Registry (Avro), and Kafka Streams.
  • Hands-on experience with containerization (Docker, Kubernetes) and CI/CD pipelines.
  • Familiarity with securing Kafka using Kerberos, SSL, ACLs, and integration with IAM systems.
  • Solid understanding of financial transaction systems, messaging standards, and data privacy regulations (e.g., SOX, PCI-DSS, GDPR).
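
To make the producer/consumer expectation in the list above concrete, here is a minimal, hedged sketch of a Kafka producer using the confluent-kafka Python client (the posting equally allows Java or Scala). The broker address, topic, and payload are hypothetical placeholders; a production pipeline in this environment would also configure TLS/SASL security and, typically, Avro serialization against Schema Registry.

    import json
    from confluent_kafka import Producer

    # Placeholder broker; real deployments here would add security settings
    # (security.protocol, SASL credentials) and Schema Registry serialization.
    producer = Producer({"bootstrap.servers": "localhost:9092"})

    def delivery_report(err, msg):
        # Called once per message to confirm delivery or surface an error.
        if err is not None:
            print(f"Delivery failed: {err}")
        else:
            print(f"Delivered to {msg.topic()}[{msg.partition()}] @ offset {msg.offset()}")

    event = {"application_id": "A-12345", "status": "submitted"}  # hypothetical event
    producer.produce(
        "loan-applications",
        key=event["application_id"].encode("utf-8"),
        value=json.dumps(event).encode("utf-8"),
        callback=delivery_report,
    )
    producer.flush()  # block until outstanding messages are delivered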

 
A typical day might include:

  • Morning check-ins:
    • Reviewing system health dashboards and alerts.
    • Checking in with direct reports or team leads on ongoing issues or overnight incidents.
    • Leading or attending stand-ups with infrastructure, network, and operations teams.
  • Coordinating with cybersecurity, application development, and support teams.
  • Reviewing infrastructure roadmaps and project timelines.
  • Evaluating vendor performance and contract renewals.
  • Approving changes and reviewing architecture proposals.
  • Using Terraform to provision or update Kafka topics, connectors, or AWS resources.
  • Troubleshooting Kafka Connect integrations with AWS databases (e.g., RDS, Aurora); see the status-check sketch after this list.
  • Optimizing throughput, latency, and schema evolution.
  • Updating Confluence pages with:
    • Architecture diagrams.
    • Runbooks for incident response.
    • Kafka topic naming conventions and retention policies.
  • Documenting changes made via Terraform and linking them to Jira tickets.
  • Stakeholder Engagement: Meeting with business units to understand upcoming needs.
  • Handling escalated technical issues or outages.
  • Making decisions on resource allocation and prioritization.
  • High interaction, especially with infrastructure engineers, network admins, project managers, and application owners; expect daily or near-daily engagement.
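
As a sketch of what troubleshooting a Kafka Connect integration often starts with, the snippet below polls the standard Kafka Connect REST API for a connector's status and flags failed tasks. The worker URL and connector name are hypothetical; self-managed Connect workers expose this API directly, while fully managed Confluent Cloud connectors are usually inspected through the Confluent console or CLI instead.

    import requests

    CONNECT_URL = "http://localhost:8083"   # placeholder Connect worker URL
    CONNECTOR = "aurora-loans-sink"         # hypothetical connector name

    resp = requests.get(f"{CONNECT_URL}/connectors/{CONNECTOR}/status", timeout=10)
    resp.raise_for_status()
    status = resp.json()

    print("connector state:", status["connector"]["state"])
    for task in status["tasks"]:
        if task["state"] == "FAILED":
            # The stack trace usually points at the failing table, schema, or credential.
            print(f"task {task['id']} FAILED on {task['worker_id']}:")
            print(task.get("trace", "")[:500])
            # Once the root cause is fixed, a failed task can be restarted in place:
            # requests.post(f"{CONNECT_URL}/connectors/{CONNECTOR}/tasks/{task['id']}/restart")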

Reference: 1036325 
  
Don’t meet every single requirement? Studies have shown that women and people of color are less likely to apply to jobs unless they meet every qualification. At Revel IT, we are dedicated to building a diverse, inclusive, and authentic workplace, so if you’re excited about this role, but your experience doesn’t align perfectly with every qualification in the description, we encourage you to apply anyway. You might be the right candidate for this or our other open roles!  

  
Revel IT is an Equal Opportunity Employer. Revel IT does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, non-disqualifying physical or mental disability, national origin, veteran status or any other basis covered by appropriate law. All employment is decided on the basis of qualifications, merit, and business need.  

#gdr4900

Job ID: 1036325
