Job Description
Data Engineer (Python | SQL | Snowflake | ETL)
CosmicFusion Labs is hiring a Data Engineer on behalf of our client, People-Tree, a US-based organization looking for a skilled professional to join their data team remotely from India. You'll work US shift hours, collaborating directly with cross-functional teams to design, build, and optimize data pipelines and infrastructure at scale.
This is a hands-on engineering role — not a support or maintenance position. You'll own critical data workflows end to end, from ingestion and transformation to delivery and monitoring.
What You'll Do
- Design, develop, and maintain robust ETL/ELT pipelines to ingest, transform, and load data from diverse sources
- Build and optimize data models and warehousing solutions on Snowflake
- Write efficient, production-grade Python scripts for data processing, automation, and pipeline orchestration
- Develop and tune complex SQL queries for analytics, reporting, and data validation
- Ensure data quality, integrity, and reliability across the pipeline through testing and monitoring
- Collaborate with analysts, product teams, and stakeholders in a US-timezone working environment
- Troubleshoot pipeline failures, optimize performance bottlenecks, and implement scalable solutions
- Document data flows, schemas, and architecture decisions for team knowledge sharing
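To give a flavor of the day-to-day work described above, here is a minimal extract-transform-load sketch in Python. It is purely illustrative: SQLite stands in for the warehouse, and all table, column, and function names are invented for the example, not taken from the client's stack.

```python
# Minimal ETL sketch: extract rows from a CSV source, apply a
# transformation with a basic data-quality gate, and load into a
# SQLite table (standing in for a warehouse such as Snowflake).
import csv
import io
import sqlite3

RAW_CSV = """order_id,amount,currency
1001,250.00,USD
1002,99.50,USD
1003,, USD
"""

def extract(source: str) -> list:
    """Parse CSV text into a list of row dicts."""
    return list(csv.DictReader(io.StringIO(source)))

def transform(rows: list) -> list:
    """Drop rows with missing amounts and cast types."""
    out = []
    for r in rows:
        amount = (r["amount"] or "").strip()
        if not amount:
            continue  # data-quality gate: skip rows with no amount
        out.append((int(r["order_id"]), float(amount), r["currency"].strip()))
    return out

def load(rows: list, conn: sqlite3.Connection) -> int:
    """Upsert rows into the target table; return the row count."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders "
        "(order_id INTEGER PRIMARY KEY, amount REAL, currency TEXT)"
    )
    conn.executemany("INSERT OR REPLACE INTO orders VALUES (?, ?, ?)", rows)
    conn.commit()
    return conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0]

conn = sqlite3.connect(":memory:")
loaded = load(transform(extract(RAW_CSV)), conn)
print(loaded)  # rows that passed the quality gate
```

In production this shape would typically be split into orchestrated tasks (e.g., Airflow operators or dbt models) rather than a single script.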
Must-Have Qualifications
- 5–6 years of professional experience as a Data Engineer or in a similar data-focused engineering role
- Strong proficiency in Python for data engineering tasks — not just scripting, but well-structured, maintainable code
- Strong SQL skills — you should be comfortable writing complex joins, window functions, CTEs, and performance-optimized queries
- Hands-on experience with Snowflake — schema design, query optimization, storage strategies, and Snowflake-specific features (stages, streams, tasks, pipes)
- Solid experience building and managing ETL/ELT pipelines using tools such as Apache Airflow, dbt, Luigi, Prefect, or similar orchestration frameworks
- Strong problem-solving and algorithmic thinking — comfortable with LeetCode-level challenges involving data structures, algorithms, and complexity analysis
- Familiarity with data quality frameworks, testing strategies, and monitoring best practices
- Excellent communication skills and ability to work effectively in a remote, US-shift setup (typically EST or PST overlap)
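As a concrete illustration of the SQL bar above, here is the kind of CTE-plus-window-function query candidates should be comfortable writing, run against SQLite for self-containment (Snowflake syntax for this particular query is essentially identical). The table and data are made up for the example.

```python
# Illustrative CTE + window-function query: rank sales reps within
# each region by total sales. SQLite stands in for Snowflake here.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE sales (region TEXT, rep TEXT, amount REAL);
INSERT INTO sales VALUES
  ('east', 'asha', 500), ('east', 'ravi', 300),
  ('west', 'meera', 700), ('west', 'dev', 200);
""")

query = """
WITH rep_totals AS (               -- CTE: aggregate per rep
    SELECT region, rep, SUM(amount) AS total
    FROM sales
    GROUP BY region, rep
)
SELECT region, rep, total,
       RANK() OVER (PARTITION BY region ORDER BY total DESC) AS rnk
FROM rep_totals
ORDER BY region, rnk
"""
rows = conn.execute(query).fetchall()
for row in rows:
    print(row)
```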
Good-to-Have Qualifications
- Experience with cloud platforms — AWS (S3, Glue, Redshift, Lambda) or Azure (Data Factory, Synapse)
- Familiarity with dbt for transformation layer management
- Experience with Apache Spark / PySpark for large-scale data processing
- Exposure to CI/CD pipelines for data workflows (e.g., GitHub Actions, Jenkins)
- Knowledge of data governance, cataloging tools, and access control best practices
- Experience working with REST APIs for data ingestion and integration
- Prior experience in a client-facing or staff augmentation setup
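For the REST-ingestion point above, a common pattern is cursor-based pagination. The sketch below keeps the paging logic pure and injects the fetch function, so it runs without a network; in production the lambda would wrap an HTTP client against a real endpoint. The `items`/`next_cursor` response shape is an assumption for illustration.

```python
# Sketch of paginated REST ingestion with an injectable fetcher.
from typing import Callable, Optional

def ingest_all(fetch_page: Callable[[Optional[str]], dict]) -> list:
    """Follow `next_cursor` until exhausted, collecting `items`."""
    records, cursor = [], None
    while True:
        page = fetch_page(cursor)
        records.extend(page["items"])
        cursor = page.get("next_cursor")
        if cursor is None:
            break
    return records

# Fake two-page API standing in for HTTP calls.
PAGES = {
    None: {"items": [{"id": 1}, {"id": 2}], "next_cursor": "p2"},
    "p2": {"items": [{"id": 3}], "next_cursor": None},
}
records = ingest_all(lambda cursor: PAGES[cursor])
print(len(records))  # 3
```

Separating transport from paging logic like this also makes the ingestion step unit-testable, which ties back to the data-quality and testing expectations above.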
Work Arrangement
- Remote from anywhere in India
- US Shift — overlapping working hours with US-based teams (typically evening/night IST)
- Candidates must have a reliable high-speed internet connection and a professional home office setup
Job Type: Permanent
Pay: ₹70,000.00 per month
Experience:
- Snowflake: 4 years (Required)
- Python: 4 years (Required)
- SQL: 4 years (Required)
- ETL: 4 years (Required)
Work Location: Remote