Job Views: 128
Applications: 21
Recruiter Actions: 1

Posted in: IT & Systems

Job Code: 1526084

We are looking for a hands-on Data Architect who will be responsible for designing, developing, and optimizing our data and analytics architecture. You will play a critical role in defining the data strategy, designing scalable data pipelines, and implementing best practices for real-time and batch analytics solutions. This role requires a strong technical leader who is passionate about data engineering, analytics, and driving data-driven decision-making across the organization.

Key Responsibilities:

- Data Architecture & Design: Define and own the architecture for data processing, analytics, and reporting systems, ensuring scalability, reliability, and performance.

- Data Engineering: Design and implement highly efficient, scalable, and reliable data pipelines for structured and unstructured data.

- Big Data & Real-Time Analytics: Architect and optimize data processing workflows for batch, real-time, and streaming analytics.

- Cross-Functional Collaboration: Work closely with Product Managers, Data Scientists, Analysts, and Software Engineers to translate business requirements into scalable data architectures.

- Code Reviews & Mentorship: Review code, enforce data engineering best practices, and mentor engineers to build a high-performance analytics team.

- Data Governance & Compliance: Ensure data security, integrity, and compliance with regulations (GDPR, CCPA, etc.).

- Optimization & Performance Tuning: Identify performance bottlenecks in data pipelines and analytics workloads, optimizing for cost, speed, and efficiency.

- Cloud & Infrastructure: Lead cloud-based data platform initiatives, ensuring high availability, fault tolerance, and cost optimization.

Technical Skills:

- Experience: 10+ years in data architecture, analytics, and big data processing.

- Proven Track Record: Experience designing and implementing end-to-end data platforms for high-scale applications.

- Strong Data Engineering Background: Expertise in ETL/ELT pipelines, data modeling, data warehousing, and stream processing.

- Analytics & Reporting Expertise: Experience working with BI tools, data visualization, and reporting platforms.

- Deep Knowledge of Modern Data Technologies:

  - Big Data & Analytics: Spark, Kafka, Hadoop, Druid, ClickHouse, Presto, Snowflake, Redshift, BigQuery.

  - Databases: PostgreSQL, MongoDB, Cassandra, ElasticSearch.

  - Cloud Platforms: AWS, GCP, Azure (experience with cloud data warehouses such as AWS Redshift or Snowflake is a plus).

  - Programming & Scripting: Python, SQL, Java, Scala.

- Microservices & Event-Driven Architecture: Understanding of real-time event processing architectures.
