Data Integration Expert

FiftyFive Technologies, Saudi Arabia
Posted more than 30 days ago

Job Description

This job requires relocation to Saudi Arabia for 12 months.

We are seeking a Data Integration Expert to architect and implement scalable, secure, and robust data platform solutions to support machine learning (ML), business intelligence (BI), and customer value management (CVM) analytics. This role requires deep expertise in data ingestion, ETL pipelines, and on-premise data platforms such as Cloudera or Teradata.

Key Responsibilities:

  • Define and implement scalable architecture for data platforms with a focus on AI and analytics enablement.
  • Ensure the platform supports end-to-end ML pipelines, BI dashboards, and CVM analytics.
  • Design and maintain reference models for data ingestion, transformation, processing, and access.
  • Integrate security, governance, and compliance frameworks (including PDPL and other local regulatory standards).
  • Work closely with data scientists, analysts, and IT infrastructure teams to align architecture with business goals.
  • Manage and optimize ETL pipelines and ensure data quality, lineage, and metadata management.
  • Provide hands-on leadership in setting up, maintaining, and evolving on-premise platforms such as Cloudera or Teradata.

Required Skills:

  • 6–8 years of experience in building ETL pipelines, data integration, and data platform management.
  • Strong understanding of on-premise data ecosystems, preferably Cloudera (Hadoop) or Teradata.
  • Proficiency in data ingestion frameworks, data lakes, and batch and real-time data processing.
  • Experience in data governance, compliance, and security standards, especially PDPL or similar data privacy laws.
  • Strong knowledge of SQL, Spark, Hive, and scripting languages (e.g., Python, Bash).
  • Ability to collaborate across cross-functional teams and work independently in a fast-paced environment.
Preferred Qualifications:

  • Experience working in telecom or large-scale enterprise data environments.
  • Familiarity with Kafka, NiFi, and data orchestration tools like Airflow.
  • Knowledge of DevOps practices, CI/CD pipelines, and containerization (Docker/Kubernetes) is a plus.

