DevOps Data Hadoop Engineer

Job Location: Chennai

Education: Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.

Experience: Minimum of 6 years of experience in big data engineering or a related role.


Required Skills:

  • DevOps and CI/CD: Design, implement, and manage CI/CD pipelines using tools like Jenkins and GitOps to automate and streamline the software development lifecycle. Experience with Cloudera Hadoop (CDP).
  • Containerization and Orchestration: Deploy and manage containerized applications using Kubernetes and OpenShift, ensuring high availability and scalability.
  • Infrastructure Management: Develop and maintain infrastructure as code (IaC) using tools like Terraform or Ansible.
  • Big Data Solutions: Architect and implement big data solutions using technologies such as Hadoop, Spark, and Kafka.
  • Distributed Systems: Design and manage distributed data architectures to ensure efficient data processing and storage.
  • Collaboration: Work closely with development, operations, and data teams to understand requirements and deliver robust solutions.
  • Monitoring and Optimization: Implement monitoring solutions and optimize system performance, reliability, and scalability.
  • Security and Compliance: Ensure infrastructure and data solutions adhere to security best practices and regulatory requirements.

📩 Apply Now: [email protected]
