Weekday AI

Data Engineer

Bengaluru, Karnataka, India


Job Overview


Posted Date: 17 August 2025

Job Type: Full Time

Workplace Type: Not Specified

Experience Level: Mid-Senior level

Salary: Competitive & Based on Experience

Experience: 6 - 7 yrs

Job Description


This role is for one of Weekday's clients.

Salary range: Rs 6,00,000 - Rs 17,00,000 (i.e., INR 6-17 LPA)

Min Experience: 3 years in Data Warehouse/ETL projects

Location: Bangalore, Chennai, Pune, Kolkata, Gurugram

Job Type: Full-time

Experience: 6+ years in IT with 3+ years in Data Warehouse/ETL projects

Requirements

Primary Responsibilities:

  • Design and develop modern data warehouse solutions using Snowflake, Databricks, and Azure Data Factory (ADF).
  • Deliver forward-looking data engineering and analytics solutions that scale with business needs.
  • Work with DW/BI leads to gather and implement requirements for new ETL pipelines.
  • Troubleshoot and resolve issues in existing pipelines, identifying root causes and implementing fixes.
  • Partner with business stakeholders to understand reporting requirements and build corresponding data models.
  • Provide technical mentorship to junior team members and assist with issue resolution.
  • Engage in technical discussions with client architects and team members to align on best practices.
  • Orchestrate data workflows using scheduling tools like Apache Airflow.

Qualifications:

  • Bachelor's or Master's degree in Computer Science or a related field.
  • Expertise in Snowflake, including security, SQL, and object design/implementation.
  • Proficient with Snowflake tools such as SnowSQL, Snowpipe, Snowsight, and Snowflake connectors.
  • Strong understanding of Star and Snowflake schema modeling.
  • Deep knowledge of data management principles and data warehousing.
  • Experience with Databricks and a solid grasp of Delta Lake architecture.
  • Hands-on with SQL and Spark (preferably PySpark).
  • Experience developing ETL processes and transformations for data warehousing solutions.
  • Familiarity with NoSQL and open-source databases such as MongoDB, Cassandra, or Neo4j.
  • Exposure to structured and unstructured data, including imaging and geospatial formats.
  • Proficient in DevOps tools and practices, including Terraform, CircleCI, and Git.
  • Strong background in RDBMS, PL/SQL, Unix Shell Scripting, and query performance tuning.
  • Databricks Certified Data Engineer Associate/Professional certification is a plus.
  • Ability to thrive in a fast-paced, dynamic environment managing multiple projects.
  • Experience working within Agile development frameworks.
  • Excellent communication, analytical, and problem-solving skills with strong attention to detail.

Mandatory Skills:
Snowflake, Azure Data Factory, PySpark, Databricks, SQL, Python


Key Skills Required

  • SQL
  • Shell Scripting
  • Architecture
  • Python
  • Apache Airflow
  • Azure
  • CircleCI
  • Data Engineering
  • MongoDB
  • Analytics
  • Attention to Detail
  • Azure Data Factory
  • Communication
  • Computer Science
  • Data Management
  • Data Warehouse
  • Design
  • Development
  • Dynamic Environment
  • Git
  • Implementation
  • Issue Resolution
  • Management
  • Neo4j
  • NoSQL
  • Performance Tuning
  • PL/SQL
  • PySpark
  • Reporting
  • Security
  • Snowflake Schema
  • SnowSQL
  • Spark
  • Terraform
  • Unix Shell Scripting


Company Details


Company Name: Weekday AI

Recruiting People: HR Department

Contact Number: --
