Data Engineer
Location - Dubai

Job Summary
We are looking for an experienced Data Engineer. The suitable candidate should have demonstrated experience in designing and implementing ETL/ELT solutions on on-premises and modern platforms to support enterprise data warehouse, data mart, and advanced analytics capabilities. Success in this role comes from marrying a strong data engineering background with product and business expertise to deliver scalable data pipelines.

Role Purpose
Design, implement, and maintain robust, scalable, and high-performance ETL/ELT pipelines to ingest, process, and load data from various sources.
Develop automated processes for deploying, testing, and managing data pipelines, and orchestrate workflows using scheduling and orchestration tools.
Deep understanding of data warehousing and data modelling concepts.
Analyze data requirements, complex source data, and data models, and determine the best methods for extracting, transforming, and loading the data into data staging, the warehouse, and other system integration projects.
Analyze business requirements and outline solutions.
Develop and deploy ETL/ELT job workflows with reliable error/exception handling and rollback.
Proactively communicate innovative ideas, solutions, and capabilities over and above the specific task request.
Work closely with data architects, data analysts, and business stakeholders to understand data requirements and develop solutions that meet business needs.
Work collaboratively within a team as well as independently. Continuously strive for high-performing business solutions.
Perform and coordinate unit and system integration testing.
Optimize data storage and retrieval to enhance performance and efficiency. Improve query performance and reduce latency for analytics and reporting.
Document data pipelines, architecture, and processes. Establish and promote best practices for data engineering within the organization.

Requirements
Key Requirements and Qualifications
Minimum 6 years of relevant data engineering experience.
Proficiency in Advanced SQL and experience with any ETL/ELT tools.
Knowledge of data modelling, data warehousing, and data lake architecture.
Hands-on experience with HTAP databases like SingleStore.
Familiarity with orchestration tools like Stonebranch or similar workflow management tools.
Experience in the healthcare/insurance industry is preferred.
Bachelor's degree in computer science or a related field, with programming exposure.
Skills and Competencies
Strong analytical and problem-solving skills with attention to detail.
Experience with CI/CD tools like Jenkins and Bitbucket.
Experience in developing flows using batch, real-time, and streaming processes with Pysql/PySpark is preferred.
Experience with data streaming technologies like Oracle GoldenGate or similar.
Ability to work on multiple projects and work streams at one time. Must be able to deliver