Data Engineer

Location: Offshore (Chennai/Bengaluru)
Experience: 4+ Years
Education: Bachelor’s/Postgraduate degree in Computer Science or related field (or equivalent industry experience)

Role Overview:

We are looking for a skilled Data Engineer to design, build, and maintain scalable, real-time data processing systems. The ideal candidate will have strong experience in event-driven architectures, streaming technologies, and distributed systems, along with a solid understanding of performance optimization, security, and data governance.

You will play a key role in developing robust data pipelines and ensuring high availability, reliability, and compliance across enterprise data platforms.

Key Responsibilities:

Data Engineering & Architecture

  • Design and develop real-time data pipelines using Kafka and streaming frameworks such as Flink, Spark Streaming, or Beam.
  • Build and maintain event-driven architectures for scalable and high-throughput data processing.
  • Implement distributed data systems ensuring performance, resilience, and fault tolerance.
  • Work with SQL and NoSQL databases to manage structured and unstructured data efficiently.

Performance & Optimization

  • Perform JVM tuning and performance optimization for high-volume data processing applications.
  • Monitor and improve system performance, ensuring low latency and high throughput.
  • Optimize resource utilization across distributed environments.

DevOps & Infrastructure

  • Containerize applications using Docker and orchestrate using Kubernetes.
  • Implement and maintain CI/CD pipelines using tools such as Jenkins and GitHub Actions.
  • Work in Linux environments, leveraging shell scripting for automation and operational efficiency.

Security & Compliance

  • Design and implement secure data processing systems, ensuring compliance with enterprise and regulatory standards.
  • Identify security gaps in system design and recommend improvements.
  • Implement data protection mechanisms including encryption, anonymization, and secure data transfer processes.
  • Ensure compliance with data governance policies, including handling of sensitive data and logging standards.

Monitoring & Reliability

  • Implement monitoring and alerting for distributed systems and streaming platforms.
  • Ensure system resiliency and fault tolerance in production environments.
  • Troubleshoot and resolve issues in distributed service ecosystems.

Collaboration & Best Practices

  • Work in an Agile environment, collaborating with cross-functional teams.
  • Follow best practices in coding, testing, documentation, and security.
  • Research and benchmark new technologies to enhance system capabilities.

Technical Skills:

Core Technologies

  • Strong programming skills in Java or Scala
  • Hands-on experience with Kafka and streaming frameworks (Flink, Spark Streaming, or Beam)
  • Experience with event-driven architecture and distributed systems

Infrastructure & DevOps

  • Docker, Kubernetes for containerization and orchestration
  • CI/CD tools such as Jenkins and GitHub Actions
  • Linux OS and shell scripting

Data & Storage

  • Experience with SQL and NoSQL databases
  • Familiarity with caching systems (Redis preferred)

Security & Networking

  • Knowledge of data security, encryption, and compliance standards
  • Understanding of networking basics (DNS, proxies, ACLs)
  • Experience with secure data transfers and ETL processes

Functional Skills:

  • Experience working in Agile development environments
  • Ability to ensure high-quality architecture and system design
  • Strong analytical skills to evaluate and benchmark technologies
  • Experience in Banking, Financial Services, or Fintech domains is preferred

Soft Skills:

  • Strong communication and collaboration skills
  • Ability to work independently and drive initiatives
  • Positive, proactive attitude with strong problem-solving abilities
  • Ability to influence teams and contribute to technical decision-making

📩 Apply Now: [email protected]
