- Salary: 6 LPA - 15 LPA
- Location: Hyderabad
- Industry: Information Technology and Services
Summary role description:
Hiring for Data Platform Engineer – Python for a HealthTech platform provider
Company description:
Our client is a US Headquartered HealthTech platform provider that focuses on creating electronic health record (EHR) and practice management (PM) software that helps healthcare providers streamline their operations and provide better patient care. The company partners with payers, at-risk provider organizations and government agencies to inform decision-making with actionable data insights. They are also a NCQA-certified HEDIS® vendor and risk adjustment leader, providing dedicated support and staff augmentation to improve quality and optimize risk accuracy. They place a great emphasis on culture and bring together best-in-class talent with diverse backgrounds and different points of view.
Role details:
- Title / Designation: Data Platform Engineer - Python
- Location: Hyderabad
- Work Mode: Work from office
Role & responsibilities:
- Design and build microservices in Python.
- Integrate APIs and data formats (REST, web services, XML, JSON).
- Design and develop APIs and microservices using Python frameworks such as Flask, Django, and FastAPI.
- Provide users with access to datasets using REST and Python (see the FastAPI sketch after this list).
- Extract, transform, and load (ETL/ELT) data from a wide variety of sources (structured, unstructured, SQL, NoSQL, blobs, files, and real-time streams).
- Create and maintain optimal data pipeline architecture.
- Collaborate with data analysts, data scientists, and other stakeholders to understand their data needs and provide data pipeline solutions.
- Develop frameworks necessary to monitor and troubleshoot data pipeline issues.
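For a sense of what "providing users with access to datasets using REST and Python" can look like in practice, below is a minimal FastAPI sketch; the service name, record schema, and in-memory store are illustrative assumptions, not details of the client's platform.

```python
# Hypothetical sketch only: service name, fields, and data are assumptions.
from typing import List

from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI(title="visits-service")

class Visit(BaseModel):
    visit_id: int
    patient_id: str
    provider: str

# In-memory stand-in for whatever data store (SQL, NoSQL, warehouse) sits behind the service.
_VISITS = {
    1: Visit(visit_id=1, patient_id="P-100", provider="Dr. Rao"),
    2: Visit(visit_id=2, patient_id="P-101", provider="Dr. Mehta"),
}

@app.get("/visits", response_model=List[Visit])
def list_visits() -> List[Visit]:
    """Return the full dataset as JSON over REST."""
    return list(_VISITS.values())

@app.get("/visits/{visit_id}", response_model=Visit)
def get_visit(visit_id: int) -> Visit:
    """Return a single record, or 404 if it does not exist."""
    visit = _VISITS.get(visit_id)
    if visit is None:
        raise HTTPException(status_code=404, detail="visit not found")
    return visit
```

Saved as main.py, this runs locally with `uvicorn main:app --reload`.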
Candidate requirements:
- 3+ years of experience with Python, Scala, Apache Kafka, Kubernetes, Docker and data engineering
- 3+ years of experience in cloud ETL/ELT for creating data pipelines (a minimal ETL sketch follows this list)
- Experience with any ETL/ELT tool (Talend, Pentaho, Glue, ADF, GCD, Informatica, SSIS)
- Experience in data warehousing/data modeling in any of the following areas: Dimensional, Data Mart, Data Vault, Data Lake, Data Mesh, GraphDB.
- Experience with the following databases: MSSQL, Azure, MySQL, PostgreSQL, Snowflake, MongoDB, MariaDB, Cassandra, and Oracle.
- Knowledge of Snowflake and SnowSQL
- Knowledge of data warehouse design methodologies (3NF, Kimball, Data Vault)
- Knowledge of Kubernetes, Apache Kafka, data streams, data products, and REST
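As a rough illustration of the batch ETL/ELT experience listed above, here is a minimal extract-transform-load step in plain Python; SQLite stands in for the target warehouse, and the source file name and claims schema are assumptions, not details from the role.

```python
# Hypothetical batch ETL sketch; source file, schema, and table names are assumptions.
import csv
import sqlite3
from pathlib import Path

SOURCE_CSV = Path("claims.csv")   # assumed input columns: claim_id, member_id, amount
TARGET_DB = Path("warehouse.db")  # SQLite standing in for a real warehouse target

def extract(path: Path) -> list[dict]:
    """Read raw rows from the source file."""
    with path.open(newline="") as fh:
        return list(csv.DictReader(fh))

def transform(rows: list[dict]) -> list[tuple]:
    """Normalize types and drop rows that are missing required keys."""
    cleaned = []
    for row in rows:
        if not row.get("claim_id") or not row.get("member_id"):
            continue
        cleaned.append((
            row["claim_id"].strip(),
            row["member_id"].strip(),
            float(row.get("amount") or 0.0),
        ))
    return cleaned

def load(rows: list[tuple], db_path: Path) -> None:
    """Idempotently load the cleaned rows into the target table."""
    with sqlite3.connect(db_path) as conn:
        conn.execute(
            "CREATE TABLE IF NOT EXISTS claims "
            "(claim_id TEXT PRIMARY KEY, member_id TEXT, amount REAL)"
        )
        conn.executemany("INSERT OR REPLACE INTO claims VALUES (?, ?, ?)", rows)

if __name__ == "__main__":
    load(transform(extract(SOURCE_CSV)), TARGET_DB)
```

In a production pipeline the same extract/transform/load stages would typically run inside one of the tools named above (Glue, ADF, Talend, etc.) with a warehouse such as Snowflake as the target.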
Selection process:
- One assessment test
- Technical discussion with the Indian hiring manager
- Technical discussion with the US hiring manager
- HR discussion