CodeX Tech-IT LLC

Codex Tech - Data Platform Engineer - System Reliability


Job Location

Bangalore, India

Job Description

About the Role:

We are looking for a highly skilled Senior Data Platform Engineer to join our growing team. You will play a critical role in building and scaling our data platform to support complex data processing and analytics needs. In this role, you will lead the design and development of scalable systems that handle large volumes of batch and real-time data, enabling data-driven insights and business decisions across the company. This is an exciting opportunity for someone who is passionate about cutting-edge data technologies, cloud-native environments, and driving business impact through data infrastructure.

Key Responsibilities:

Data Platform Design and Development:
- Architect, build, and maintain a scalable and reliable data platform that supports batch processing, real-time data streams, and advanced analytics.
- Ensure the platform can handle the growing needs of the business in a cost-efficient and performance-optimized manner.

Data Pipeline Management:
- Develop and manage ETL/ELT pipelines to support data ingestion, transformation, and distribution across the organization.
- Ensure that these pipelines are scalable, resilient, and optimized for performance.

Real-Time Data Processing:
- Design and implement real-time data processing solutions using tools like Apache Kafka and Apache Flink, ensuring low-latency, high-throughput data streams for business-critical applications.

Data Storage Optimization:
- Lead the design and optimization of data lakes and data warehouses.
- Ensure efficient data storage, retrieval, and cost management in cloud environments.

System Reliability and Monitoring:
- Implement monitoring solutions and data quality frameworks to ensure the accuracy, reliability, and security of data.
- Proactively identify and resolve performance bottlenecks and data pipeline issues.

Collaboration and Cross-Team Support:
- Collaborate with data scientists, analysts, and product teams to understand business requirements and deliver scalable data solutions.
- Ensure that the platform supports both current and future data use cases.

Data Security and Compliance:
- Implement and champion data security and compliance measures to ensure data integrity and privacy.

Technical Leadership:
- Provide technical leadership and mentorship to junior engineers.
- Promote best practices in system design, coding, and deployment processes across the team.

Automation and Infrastructure as Code (IaC):
- Automate data infrastructure management and deployments using tools like Terraform and Ansible, ensuring consistency, repeatability, and reduced operational overhead.

Required Qualifications:
- 7 years of hands-on experience in data platform engineering, with expertise in architecting and maintaining scalable data platforms.
- Proven experience designing and implementing both batch and real-time data pipelines in cloud environments (AWS preferred).
- Deep expertise in distributed data processing frameworks (e.g., Apache Spark, Presto) and real-time streaming technologies (Kafka, Flink).
- Experience with modern data lake architectures and table formats such as Hudi, Iceberg, or Delta Lake.
- Strong hands-on programming skills in Python and/or Java, with the ability to build robust, scalable, and maintainable code for complex data systems.
- Hands-on SQL and/or NoSQL experience.
- Hands-on experience with Infrastructure as Code (Terraform or Ansible) for managing cloud infrastructure.
- Experience in performance tuning and optimization of large-scale data systems.
- Proficient in scoping and prioritizing tasks, focusing on delivering incremental improvements while remaining flexible and responsive to evolving project needs.
- Excellent communication and leadership skills, with a track record of mentoring junior engineers.
- Bachelor's/Master's degree in Computer Science, Engineering, or a related field.

Preferred Qualifications:
- Experience with DataOps and CI/CD pipelines in data engineering.
- Experience building data platforms from the ground up.
- Knowledge of Kubernetes is a significant plus.
- Experience in a startup environment, with a willingness to innovate and adapt to changing business needs.

Location: Bangalore

Work Policy: Hybrid (startup environment), 2-3 days in office.

(ref:hirist.tech)

Location: Bangalore, IN

Posted Date: 10/27/2024

Contact Information

Contact Human Resources
CodeX Tech-IT LLC

Posted

October 27, 2024
UID: 4914605033
