Job Description
Responsibilities:
- Act as the Snowflake data domain expert in a collaborative environment, applying a demonstrated understanding of data management best practices and patterns.
- Design and implement robust data architectures that support business requirements, leveraging Snowflake platform capabilities.
- Develop and enforce data modeling standards and best practices for Snowflake environments.
- Develop, optimize, and maintain Snowflake data warehouses.
- Leverage Snowflake features such as clustering, materialized views, and semi-structured data processing to enhance data solutions.
- Ensure data architecture solutions meet performance, security, and scalability requirements.
- Stay current with the latest developments and features in Snowflake and related technologies, continually enhancing our data capabilities.
- Collaborate with cross-functional teams to gather business requirements, translate them into effective Snowflake data solutions, and provide data-driven insights.
- Provide mentorship and guidance to junior data engineers and architects.
- Troubleshoot and resolve data architecture-related issues effectively.
Skills Required:
- 5+ years of proven experience as a Data Engineer, including 3+ years as a Data Architect.
- Proficiency in Snowflake, with hands-on experience in features such as clustering, materialized views, and semi-structured data processing.
- Experience designing and building manual or auto-ingestion data pipelines using Snowpipe.
- Ability to design and develop automated monitoring processes on Snowflake using a combination of Python, PySpark, and Bash with SnowSQL.
- Experience with SnowSQL, including developing stored procedures and writing queries to analyze and transform data.
- Working experience with ETL/ELT tools such as Fivetran, dbt, and MuleSoft.
- Expertise in Snowflake administration concepts, including setting up resource monitors, RBAC controls, and scalable virtual warehouses; SQL performance tuning; zero-copy cloning; Time Travel; and automating these tasks.
- Excellent problem-solving skills and attention to detail.
- Effective communication and collaboration abilities.
- Relevant certifications (e.g., SnowPro Core / Advanced) are required.
- Must have expertise in the AWS, Azure, and Salesforce Platform as a Service (PaaS) models and their integration with Snowflake to load/unload data.
Educational Qualification Required:
- Master's degree in Business Management (MBA/PGDM) or Bachelor's degree in Computer Science, Information Technology, or a related field.
Place of Posting:
The role is based in Noida (work from office), with posting elsewhere subject to the organization's requirements.