Salary: Up to INR 44 LPA
Location: India
Industry: Information Technology and Services
Job Description
Summary role description:
Hiring a Snowflake and ADF Architect for a top-tier global Systems Integration / IT Services major.
Company description:
Our client is a top-tier global Systems Integration, IT Services, Consulting and Digital Solutions company that helps hundreds of customers secure competitive advantage through technology. Their comprehensive Digital Transformation platform drives and accelerates their customers' Mobile, Analytics & AI / ML, IoT / Industry 4.0, Cloud and Social journeys.
Role details:
- Title / Designation: Snowflake and ADF Architect
- Location: Chennai, Bengaluru, Hyderabad, Coimbatore, Mumbai, Pune, New Delhi, Kolkata, Bhubaneswar
- Work Mode: Hybrid
Role & responsibilities:
- Lead the design, development, programming, testing, deployment, and maintenance of data quality procedures within Snowflake and ADF environments.
- Collaborate with stakeholders to ensure the architecture addresses data management and quality standards in line with client requirements.
- Build and implement Data Quality (DQ) rules, standardization, and validation processes to enhance data accuracy, such as phone and customer data validation.
- Assess current installations, domain configurations, and provide recommendations for best practices.
- Profile source data, define or validate metadata, and cleanse and standardize data sources.
- Serve as the single point of contact for clients throughout engagements, from pre-sales to post-delivery, ensuring smooth communication and project alignment.
- Supervise and mentor junior to mid-level team members, providing guidance and training as needed.
- Create and present regular reports to communicate project status to both internal and external stakeholders.
- Drive end-to-end DQ assessment, identifying key challenges, gaps, and issues in the data quality ecosystem.
- Define the data profiling process, review profiling reports, propose DQ rules, and finalize remediation recommendations after stakeholder consultations.
- Own the Data Quality improvement and monitoring strategy, ensuring its implementation and proper documentation.
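To illustrate the kind of DQ rule mentioned above (e.g. phone number validation feeding a profiling report), here is a minimal Python sketch. The regex, function names, and sample records are hypothetical examples, not part of the role description; production rules would be agreed with stakeholders and typically implemented in PySpark or a DQ tool at scale.

```python
import re

# Hypothetical DQ rule: validate Indian mobile numbers
# (optional "+91" prefix, then 10 digits starting with 6-9).
PHONE_RE = re.compile(r"^(?:\+91[\-\s]?)?[6-9]\d{9}$")

def validate_phone(value: str) -> bool:
    """Return True if the value looks like a valid Indian mobile number."""
    return bool(PHONE_RE.match(value.strip()))

def profile_column(values):
    """Minimal profiling step: tally valid vs invalid records for a DQ report."""
    valid = sum(validate_phone(v) for v in values)
    return {"total": len(values), "valid": valid, "invalid": len(values) - valid}

records = ["+91 9876543210", "9123456789", "12345", "98765-43210"]
print(profile_column(records))  # → {'total': 4, 'valid': 2, 'invalid': 2}
```

The same rule-plus-tally pattern extends to other attributes (email, PIN codes, customer IDs), with the tallies rolled up into the profiling reports and remediation recommendations described above.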
Candidate requirements:
- 12+ years of experience in data quality and data integration solutions, with at least 1 year as an architect.
- Proven experience in leading Data Quality and Data Integration projects, specifically within Snowflake and Azure Data Factory (ADF).
- Experience working with Informatica, Talend, and bespoke data quality solutions.
- Strong knowledge of Python and PySpark, with hands-on expertise in building custom DQ solutions.
- Experience with SQL, Azure, and data profiling tools.
- Familiarity with root cause analysis for data quality issues and the automation of data integration and profiling processes.
Selection process:
- 2 Technical Discussions
- HR Discussion
It has come to our attention that clients and candidates are being contacted by individuals fraudulently posing as Antal representatives. If you receive a suspicious message (by email or WhatsApp), please do not click on any links or attachments. We never ask for credit card or bank details to purchase materials, and we do not charge fees to jobseekers.