Azure Data Engineer

Full time @Aditi Porwal in Data Science & Big Data
  • Ahmedabad, Gujarat, India
  • Salary: ₹2,000,000.00 - ₹2,200,000.00 / Yearly

Job Detail

  • Experience: 6 - 10 years

Job Description

About the Company – The company is a software development firm that has been helping businesses embrace digital experiences for the past 36 years. It employs more than 400 talented engineers and, with a global presence, provides top-tier services to clients around the world. It has earned ISO 9001:2015 and ISO/IEC 27001:2013 certifications to demonstrate its commitment to quality and security, and it serves enterprise customers in more than 25 countries with quality business solutions.

Roles & Responsibilities – 

  • Design, develop, and deploy scalable data solutions on the Azure platform, leveraging services such
    as Azure Databricks, Azure SQL Database, Azure Data Factory, and others as appropriate.
  • Collaborate with cross-functional teams to understand data requirements and translate them into
    technical solutions.
  • Build and optimize Azure data pipelines for data ingestion, transformation, and loading (ETL)
    processes using Python, SQL, and other relevant technologies.
  • Implement data governance and security best practices to ensure data integrity, privacy, and
    compliance with regulatory standards.
  • Monitor and troubleshoot data pipelines to identify and resolve performance issues, bottlenecks,
    and data quality issues.
  • Develop and maintain documentation for data engineering processes, including data models, data
    flow diagrams, and system architecture diagrams.
  • Stay current with emerging trends and technologies in the field of data engineering and contribute
    to the continuous improvement of processes.

Requirements – 

  • 3-4 years of experience in data engineering, with a focus on cloud-based data platforms and
    technologies.
  • Strong proficiency in Azure cloud services, particularly Azure Databricks, Azure SQL Database,
    Azure Data Factory, and Python notebooks.
  • Hands-on experience with SQL Server and MongoDB, including data modelling, querying, and
    optimization.
  • Proficiency in Python programming for data processing, automation, and scripting.
  • Experience with data integration tools, such as Apache Spark, Kafka, or Azure Data Factory.
  • Solid understanding of data warehousing concepts, dimensional modelling, and ETL processes.
  • Excellent problem-solving skills and attention to detail, with a strong focus on delivering
    high-quality solutions.
  • Effective communication skills and the ability to collaborate with team members and stakeholders
    across different functional areas.

Note: We appreciate every application; however, due to the high volume of applicants, only shortlisted candidates will be contacted.
