Experience: 4–10 Years
Primary Skills (Mandatory)
Big Data, Data Lake, Cloud Data Warehouse, Cloud Analytics, Advanced Analytics, Snowflake, SnowSQL, Snowpipe, Python, Hadoop, Spark, distributed processing, data ingestion (ETL/ELT), data management (governance, quality, integration, metadata), data prep/modeling, Power BI, Tableau.
Roles & Responsibilities
- Must have experience in Big Data and data migration to Snowflake.
- Must have worked on Snowflake data migration from on-premises systems or other traditional databases to the cloud, including scalability, data storage formats, concurrent requests, and streaming data ingestion.
- Ability to build quick proofs of concept per the requirements.
- Analyze large volumes of historical data to determine suitability for Data Lake/Delta Lake setup, and perform data clean-up, data management, filtering, and loading into Snowflake.
- Experiment with different types of data models and analyze their performance to identify the best approach to employ.
- Expertise in advanced Snowflake concepts such as resource monitors, RBAC controls, virtual warehouse sizing, query performance tuning, zero-copy cloning, and Time Travel, and an understanding of how to apply these features.
- Strong proficiency in RDBMS, complex SQL, PL/SQL, Python, and performance tuning.
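As a quick orientation to the Snowflake features named above, the sketch below shows each one in SnowSQL. All object names (monthly_credits, analytics_wh, analyst, sales, sales_dev) are hypothetical placeholders, not part of any specific environment:

```sql
-- Resource monitor: suspend assigned warehouses when the monthly credit quota is spent.
CREATE RESOURCE MONITOR monthly_credits
  WITH CREDIT_QUOTA = 100
  TRIGGERS ON 100 PERCENT DO SUSPEND;

-- Virtual warehouse sizing: a small, auto-suspending warehouse bound to the monitor.
CREATE WAREHOUSE analytics_wh
  WITH WAREHOUSE_SIZE = 'SMALL'
       AUTO_SUSPEND = 300
       AUTO_RESUME = TRUE
       RESOURCE_MONITOR = monthly_credits;

-- RBAC: grant privileges to a role rather than to individual users.
CREATE ROLE analyst;
GRANT USAGE ON WAREHOUSE analytics_wh TO ROLE analyst;

-- Zero-copy clone: an instant copy that shares storage until the data diverges.
CREATE TABLE sales_dev CLONE sales;

-- Time Travel: query the table as it existed one hour (3600 seconds) ago.
SELECT * FROM sales AT(OFFSET => -3600);
```

In practice, candidates would be expected to combine these: for example, sizing a warehouse against observed query profiles, capping its spend with a resource monitor, and using clones plus Time Travel for low-cost dev/test and recovery workflows.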