Python with ADB Developer | 4 to 9 Years | Pune, Hyderabad & Bangalore
Capgemini
Job Description
Capgemini Invent is the digital innovation, consulting and transformation brand of the Capgemini Group, a global business line that combines market-leading expertise in strategy, technology, data science and creative design to help CxOs envision and build what’s next for their businesses.
As a Python Developer with Databricks, you will be responsible for developing and maintaining scalable data pipelines, managing cloud environments on Azure, and ensuring smooth integration with APIs. The ideal candidate will be proficient in Python, Databricks (PySpark), and Azure DevOps, with a strong understanding of cloud services, DevOps practices, and API testing.
Notice Period – 30 to 90 days
Key Responsibilities:
- Develop and Maintain Data Pipelines: Design, develop, and maintain scalable data pipelines using Python and Databricks (PySpark).
- Data Processing: Apply advanced Python concepts to process and manipulate large datasets efficiently.
- API Ingestion: Ingest data from APIs, working with JSON payloads to integrate and automate data workflows.
- Cloud Management: Use the Azure Portal to manage cloud environments and services.
- Databricks PySpark: Work with Databricks and PySpark to build distributed data processing applications.
- DevOps & Agile Methodology: Implement DevOps best practices and work within a Scrum framework to ensure continuous integration and continuous delivery (CI/CD) pipelines.
- API Testing & Automation: Use Postman to test and automate APIs for robust integration and data workflows.
- Collaboration: Work closely with cross-functional teams to implement solutions aligned with business objectives and technical requirements.
Primary Skills
Required Qualifications:
- Programming Skills: Strong proficiency in Python with experience in data processing libraries (e.g., Pandas, NumPy).
- Databricks Experience: Hands-on experience with Databricks (PySpark) for data processing and analysis.
- Cloud Platform: Experience using the Azure Portal to manage cloud environments and services.
- API Handling: Expertise working with APIs, particularly data ingestion and integration using JSON.
- DevOps Methodology: Familiarity with DevOps practices and experience working in Agile/Scrum environments.
- API Testing Tools: Proficiency with Postman for API testing and automation.
- Tools & Version Control: Experience using Visual Studio Code and version control systems such as Git.
Preferred Qualifications:
- Familiarity with Azure DevOps for building and deploying CI/CD pipelines.
- Experience working with large-scale data processing frameworks such as Apache Spark or Hadoop.
- Azure Certifications (e.g., Azure Data Engineer, Azure Developer) are a plus.
Skills & Attributes:
- Excellent problem-solving and troubleshooting skills.
- Strong communication and collaboration abilities.
- Ability to manage multiple priorities and meet deadlines in a fast-paced environment.
- A proactive mindset focused on continuous improvement and automation.