Python Databricks Developer
Role Overview
We are seeking a highly skilled Python Databricks Developer to join our team on a contract basis. The ideal candidate will have extensive experience with Databricks, Python, Spark, and AWS services, along with a strong background in ETL pipelines and healthcare data. You will be responsible for designing, developing, and optimizing data workflows that deliver scalable, high-performance solutions.
Experience Required: 7–10 Years
Location: Hybrid – Bangalore, Pune, Mumbai, Hyderabad, Noida
Engagement: 6 Months | Contract
Work Hours: 11:00 AM – 8:00 PM (with 4-hour US overlap)
Key Responsibilities
- Design, develop, and maintain data pipelines on Databricks using Python and Spark (an illustrative sketch follows this list).
- Work with ETL systems (e.g., DataStage) to integrate and transform healthcare data.
- Build scalable solutions leveraging AWS services (S3, Lambda) and Airflow for orchestration.
- Write efficient SQL queries for data analysis and transformation.
- Collaborate with cross-functional teams to deliver secure, optimized, and reliable data solutions.
- Ensure compliance with healthcare data standards and security frameworks.
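To give a rough sense of the day-to-day work described above, here is a minimal PySpark sketch of the kind of pipeline this role involves: reading raw claims files from S3 in a Databricks notebook, applying a light transformation, and writing a curated Delta table. The bucket name, column names, and table name are hypothetical placeholders, not details of any actual client environment.

# Minimal PySpark sketch of an S3-to-Delta pipeline on Databricks.
# Bucket, paths, columns, and table names are hypothetical examples.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()  # provided automatically on Databricks

# Read raw claims files landed in S3 (placeholder bucket/prefix).
raw = (
    spark.read
    .option("header", "true")
    .csv("s3://example-healthcare-bucket/raw/claims/")
)

# Basic cleanup: normalize types and drop records missing a claim ID.
claims = (
    raw.withColumn("claim_amount", F.col("claim_amount").cast("double"))
       .withColumn("service_date", F.to_date("service_date", "yyyy-MM-dd"))
       .filter(F.col("claim_id").isNotNull())
)

# Write the curated output as a Delta table for downstream SQL and reporting.
(
    claims.write
    .format("delta")
    .mode("overwrite")
    .saveAsTable("curated.claims_daily")
)

In practice, a job like this would typically be scheduled through Airflow or Databricks Workflows and parameterized per environment; the sketch above only illustrates the core read-transform-write pattern.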
Required Skills & Qualifications
- 7–10 years of experience in software development and data engineering.
- Strong expertise in Databricks, Python, Spark, and SQL.
- Hands-on experience with AWS services (S3, Lambda) and Airflow.
- Proficiency in working with healthcare datasets and ETL systems (DataStage preferred).
- Strong understanding of data pipelines, performance tuning, and big data best practices.
- Excellent problem-solving skills and ability to work in a hybrid, fast-paced environment.
#Python #Databricks #Spark #SQL #AWS #HybridJobs #Qualcosoft