### Job Summary
We are looking for a motivated ETL Data Engineer with 2–3 years of experience in building and maintaining data pipelines. The ideal candidate should have hands-on experience with Snowflake and strong expertise in ETL/ELT processes, data warehousing, and SQL. You will play a key role in transforming raw data into reliable, scalable, and high-quality datasets for analytics and business intelligence.
---
### Key Responsibilities
• Design, develop, and maintain ETL/ELT pipelines using Snowflake
• Design, develop, and maintain modular dbt models (SQL & Python) following best practices such as DRY (Don't Repeat Yourself) and version control
• Extract data from multiple sources (databases, APIs, flat files, cloud systems)
• Transform and load data into Snowflake data warehouse efficiently
• Write optimized SQL queries for data processing and transformation
• Develop and maintain data models (star schema, snowflake schema)
• Ensure data quality, integrity, and consistency across pipelines
• Monitor, troubleshoot, and optimize ETL workflows and job performance
• Collaborate with data analysts, BI developers, and business stakeholders
• Implement data validation, error handling, and logging mechanisms
• Maintain documentation for data pipelines, workflows, and architecture
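As a concrete illustration of the validation, error-handling, and logging work described above, here is a minimal Python sketch. The column names and rules are hypothetical, not part of any specific pipeline: invalid rows are logged and skipped rather than failing the whole batch.

```python
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("pipeline")

# Hypothetical rules for illustration: a set of required columns
# and a non-null business key.
REQUIRED_COLUMNS = {"order_id", "amount", "order_date"}

def validate_rows(rows):
    """Yield valid rows; log and skip invalid ones instead of aborting the batch."""
    for i, row in enumerate(rows):
        missing = REQUIRED_COLUMNS - row.keys()
        if missing:
            log.warning("row %d skipped: missing columns %s", i, sorted(missing))
            continue
        if row["order_id"] is None:
            log.warning("row %d skipped: null order_id", i)
            continue
        yield row

rows = [
    {"order_id": 1, "amount": 10.0, "order_date": "2024-01-01"},
    {"order_id": None, "amount": 5.0, "order_date": "2024-01-02"},
    {"amount": 7.5},
]
clean = list(validate_rows(rows))
```

In a real pipeline the same pattern would typically route rejected rows to a quarantine table or dead-letter location rather than only logging them.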
---
### Required Skills & Qualifications
• 2–3 years of experience in ETL/Data Engineering
• Hands-on experience with dbt (Core or Cloud), including macros, packages, and hooks
• Hands-on experience with Snowflake (tables, stages, warehouses, data loading)
• Strong proficiency in SQL (joins, window functions, CTEs, performance tuning)
• Experience in ETL/ELT tools (e.g., Azure Data Factory, Informatica, Talend, etc.)
• Good understanding of data warehousing concepts
• Knowledge of data modeling techniques (dimensional modeling)
• Familiarity with file formats (CSV, JSON, Parquet)
• Experience handling large datasets and optimizing performance
• Basic scripting knowledge (Python or Shell scripting is a plus)
• Strong analytical and problem-solving skills
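The SQL proficiency listed above (CTEs, window functions) can be sketched with a toy example; the table and data below are made up for illustration, run here against SQLite so the query is self-contained:

```python
import sqlite3

# Toy warehouse table; schema and rows are illustrative only.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE orders (customer TEXT, order_date TEXT, amount REAL);
INSERT INTO orders VALUES
  ('alice', '2024-01-01', 10.0),
  ('alice', '2024-01-05', 20.0),
  ('bob',   '2024-01-02', 15.0);
""")

# A CTE plus a window function: running total of spend per customer.
query = """
WITH ordered AS (
  SELECT customer, order_date, amount FROM orders
)
SELECT customer,
       order_date,
       SUM(amount) OVER (
         PARTITION BY customer ORDER BY order_date
       ) AS running_total
FROM ordered
ORDER BY customer, order_date;
"""
rows = con.execute(query).fetchall()
```

The same query shape carries over to Snowflake, where performance tuning would also consider clustering and warehouse sizing.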
---
### Preferred Skills
• Experience with cloud platforms (Azure/AWS/GCP)
• Knowledge of Snowflake features like Time Travel, Cloning, and Data Sharing
• Familiarity with orchestration tools (Airflow, Prefect, etc.)
• Exposure to CI/CD pipelines and version control (Git)
• Understanding of Agile methodologies (Scrum/Kanban)
---
### Education
• Bachelor’s degree in Computer Science, Information Technology, or related field
---
### Nice to Have
• Snowflake certification (SnowPro Core or equivalent)