### Job Summary
We are looking for a motivated ETL Data Engineer with 2–3 years of experience in building and maintaining data pipelines. The ideal candidate should have hands-on experience with Snowflake and strong expertise in ETL/ELT processes, data warehousing, and SQL. You will play a key role in transforming raw data into reliable, scalable, and high-quality datasets for analytics and business intelligence.
---
### Key Responsibilities
• Design, develop, and maintain ETL/ELT pipelines using Snowflake
• Design, develop, and maintain modular dbt models (SQL & Python) following best practices such as DRY (Don’t Repeat Yourself) and version control
• Extract data from multiple sources (databases, APIs, flat files, cloud systems)
• Transform and load data into Snowflake data warehouse efficiently
• Write optimized SQL queries for data processing and transformation
• Develop and maintain data models (star schema, snowflake schema)
• Ensure data quality, integrity, and consistency across pipelines
• Monitor, troubleshoot, and optimize ETL workflows and job performance
• Collaborate with data analysts, BI developers, and business stakeholders
• Implement data validation, error handling, and logging mechanisms
• Maintain documentation for data pipelines, workflows, and architecture
---
### Required Skills & Qualifications
• 2–3 years of experience in ETL/Data Engineering
• Hands-on experience with dbt (Core or Cloud), including macros, packages, and hooks
• Hands-on experience with Snowflake (tables, stages, warehouses, data loading)
• Strong proficiency in SQL (joins, window functions, CTEs, performance tuning)
• Experience with ETL/ELT tools (e.g., Azure Data Factory, Informatica, Talend)
• Good understanding of data warehousing concepts
• Knowledge of data modeling techniques (dimensional modeling)
• Familiarity with file formats (CSV, JSON, Parquet)
• Experience handling large datasets and optimizing performance
• Basic scripting knowledge (Python or Shell scripting is a plus)
• Strong analytical and problem-solving skills
---
### Preferred Skills
• Experience with cloud platforms (Azure/AWS/GCP)
• Knowledge of Snowflake features like Time Travel, Cloning, and Data Sharing
• Familiarity with orchestration tools (Airflow, Prefect, etc.)
• Exposure to CI/CD pipelines and version control (Git)
• Understanding of Agile methodologies (Scrum/Kanban)
---
### Education
• Bachelor’s degree in Computer Science, Information Technology, or a related field
---
### Nice to Have
• Snowflake certification (SnowPro Core or equivalent)