Overview
Established in 2008, Geidea epitomises customer-focused empowerment and commercial success through continuous innovation. Geidea makes best-in-class digital payment solutions available to all by attracting and leveraging the best creative and entrepreneurial talent in the market. Our solutions give any business the chance to get ahead and reach for more, no matter its size or maturity. Our technology mirrors our people - Smart, Innovative, and Forward-Thinking.
To maintain our competitive advantage as we grow, we are currently looking for a new Data Architect Sr. Manager.
Job purpose
We are seeking a highly skilled Data Architect to design and lead the development of scalable, secure, and high-performance data platforms across the enterprise. This role is central to shaping the data ecosystem that supports our Fintech services, including real-time financial transactions, credit scoring models, regulatory reporting, and customer analytics.
The Data Architect will work across engineering, analytics, and product teams to build modern data infrastructure incorporating Big Data platforms, data lakes, ELT / ETL pipelines, and data warehouses. Solutions must ensure high availability, governance, security, and performance, and remain compliant with SAMA and other regulatory frameworks.
Key accountabilities and decision ownership
- Lead the end-to-end architecture of enterprise data platforms including Data Lakes, Lakehouses, and Data Warehouses.
- Design and maintain canonical data models (conceptual, logical, and physical) for structured, semi-structured (JSON, XML), and unstructured data.
- Develop architectural blueprints for hybrid and cloud-native solutions leveraging AWS, Azure, or GCP.
- Standardize data ingestion, transformation, and serving layers for streaming and batch use cases.
- Architect Big Data processing solutions using Apache Spark, Flink, Presto, Trino, or Databricks for large-scale processing of financial and behavioral data.
- Implement distributed file systems (e.g., HDFS, S3, ADLS) and optimize file formats like Parquet, ORC, and Avro (see the sketch after this list).
- Ensure scalable compute using EMR, Dataproc, or AKS / Kubernetes-based platforms.
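For illustration, a minimal PySpark sketch of the batch pattern referenced above: raw JSON events landed on S3 converted into partitioned Parquet. Bucket names, paths, and column names are hypothetical.

```python
# Minimal PySpark sketch: convert raw JSON events into partitioned Parquet
# on S3. Bucket names, paths, and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("raw-to-parquet").getOrCreate()

# Read semi-structured transaction events from the raw zone.
raw = spark.read.json("s3a://example-raw-zone/transactions/2024/")

# Light normalisation: a typed timestamp plus a partition key derived from it.
curated = (
    raw.withColumn("event_ts", F.to_timestamp("event_time"))
       .withColumn("event_date", F.to_date("event_ts"))
)

# Columnar, splittable output; date partitioning keeps downstream scans cheap.
(
    curated.write
           .mode("overwrite")
           .partitionBy("event_date")
           .parquet("s3a://example-curated-zone/transactions/")
)
```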
Data Lakes and Lakehouse Architecture
- Design and operationalize data lakes as central repositories for raw and curated data assets.
- Build Delta Lake, Apache Hudi, or Iceberg-based architectures to support ACID transactions, schema evolution, and time travel (see the sketch below).
- Define governance standards across raw, staged, curated, and analytics layers of the lake architecture.
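A minimal sketch of the lakehouse capabilities named above, using Delta Lake's Python API: an ACID upsert with schema evolution enabled on merge, plus a time-travel read. The paths, table layout, and merge key are assumptions.

```python
# Minimal Delta Lake sketch: ACID upsert, schema evolution, and time travel.
# Paths, columns, and the merge key are hypothetical.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("lakehouse-demo")
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
    # Allow new columns arriving in source data to merge into the target schema.
    .config("spark.databricks.delta.schema.autoMerge.enabled", "true")
    .getOrCreate()
)

updates = spark.read.parquet("s3a://example-staged-zone/customer_updates/")

# ACID upsert into the curated customer table.
target = DeltaTable.forPath(spark, "s3a://example-curated-zone/customers/")
(
    target.alias("t")
          .merge(updates.alias("u"), "t.customer_id = u.customer_id")
          .whenMatchedUpdateAll()
          .whenNotMatchedInsertAll()
          .execute()
)

# Time travel: read an earlier version of the table for audit or backfill.
v0 = (
    spark.read.format("delta")
         .option("versionAsOf", 0)
         .load("s3a://example-curated-zone/customers/")
)
```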
- Develop robust ELT / ETL pipelines using tools like Apache Airflow, dbt, AWS Glue, Azure Data Factory, or Kafka Connect (see the sketch below).
- Optimize data transformations for performance, reusability, and modular design (e.g., using SQL / Scala / Python).
- Ensure orchestration of dependencies, retries, alerting, and logging in a fully observable pipeline ecosystem.
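A minimal Apache Airflow sketch of such a pipeline, showing retries, failure alerting, and explicit task dependencies. The DAG id, task bodies, and alert address are placeholders.

```python
# Minimal Airflow sketch: a daily ELT DAG with retries and failure alerting.
# DAG id, task callables, and the alert address are hypothetical.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    ...  # pull raw files from the source system (placeholder)


def transform():
    ...  # run transformations, e.g. a dbt or Spark job (placeholder)


default_args = {
    "owner": "data-platform",
    "retries": 3,                            # automatic retries on failure
    "retry_delay": timedelta(minutes=5),
    "email_on_failure": True,                # alerting hook
    "email": ["data-alerts@example.com"],    # hypothetical address
}

with DAG(
    dag_id="daily_transactions_elt",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",   # 'schedule_interval' on Airflow < 2.4
    catchup=False,
    default_args=default_args,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    extract_task >> transform_task   # explicit dependency ordering
```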
Data Warehousing & BI Integration
- Architect cloud-based data warehouses such as Snowflake, BigQuery, Redshift, or Synapse Analytics.
- Define dimensional models (Star, Snowflake), facts / dimensions, and materialized views optimized for analytics and dashboarding tools such as Power BI, Tableau, and Looker (see the sketch below).
- Enable self-service analytics by integrating semantic layers, metadata management, and data cataloging tools.
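For illustration, a star-schema query of the kind such models serve, expressed here through Spark SQL. The fact and dimension tables and their columns are hypothetical; the same shape carries over to Snowflake, BigQuery, Redshift, or Synapse.

```python
# Minimal star-schema sketch: one fact table joined to two dimensions and
# aggregated for a dashboard. Table and column names are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("star-schema-demo").getOrCreate()

daily_volume = spark.sql("""
    SELECT d.calendar_date,
           m.merchant_segment,
           SUM(f.amount)  AS total_amount,
           COUNT(*)       AS txn_count
    FROM   fact_transactions f            -- grain: one row per transaction
    JOIN   dim_date     d ON f.date_key     = d.date_key
    JOIN   dim_merchant m ON f.merchant_key = m.merchant_key
    GROUP  BY d.calendar_date, m.merchant_segment
""")
daily_volume.show()
```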
Security, Governance & Compliance
- Implement data encryption (at-rest / in-transit), tokenization, and row / column-level security mechanisms (see the sketch below).
- Define and enforce data governance policies, including data lineage, classification, and auditing aligned with SAMA, NCA, GDPR, and internal policies.
- Integrate with data catalogs (e.g., Alation, DataHub, Apache Atlas) and governance tools.
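A minimal sketch of column-level protection in PySpark: masking a card number and hashing a national identifier before data reaches the curated zone. Column names are hypothetical, and a production system would use a vault-backed tokenization service rather than an in-pipeline hash.

```python
# Minimal column-security sketch: mask the PAN and hash a national ID before
# the data lands in the curated zone. Columns and paths are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("column-security-demo").getOrCreate()

txns = spark.read.parquet("s3a://example-staged-zone/transactions/")

protected = (
    txns
    # Keep only the last four digits of the PAN for analytics.
    .withColumn("pan_masked",
                F.concat(F.lit("************"), F.substring("pan", -4, 4)))
    # One-way hash as a stand-in for tokenization (salt omitted for brevity).
    .withColumn("national_id_hash", F.sha2(F.col("national_id"), 256))
    # Drop the raw sensitive columns so they never reach downstream consumers.
    .drop("pan", "national_id")
)

protected.write.mode("overwrite").parquet("s3a://example-curated-zone/transactions/")
```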
DevOps & Observability
- Support CI / CD for data pipelines and infrastructure as code using Terraform, CloudFormation, or Pulumi.
- Implement observability practices via data quality checks, SLAs / SLIs, and monitoring tools like Prometheus, Grafana, or Datadog (see the sketch below).
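A minimal sketch of a post-load data quality gate in Python: row-count and null-rate assertions that fail the run loudly so the orchestrator's alerting can pick it up. Thresholds and table locations are illustrative.

```python
# Minimal data quality gate: assert on row count and null rate after a load,
# failing the pipeline run instead of letting bad data flow downstream.
# Thresholds, columns, and paths are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("dq-gate").getOrCreate()

df = spark.read.parquet("s3a://example-curated-zone/transactions/")

row_count = df.count()
null_amounts = df.filter(F.col("amount").isNull()).count()
null_rate = null_amounts / row_count if row_count else 1.0

# Completeness assertions; an orchestrator such as Airflow would surface
# these failures through its retry and alerting hooks.
assert row_count > 0, "DQ check failed: table is empty"
assert null_rate < 0.01, f"DQ check failed: null rate {null_rate:.2%} on amount"

print(f"DQ gate passed: rows={row_count}, null_rate={null_rate:.2%}")
```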
Must have technical / professional qualifications
- Bachelor’s or master’s degree in Computer Science, Data Engineering, or a related technical field.
- 8+ years of experience in data architecture, with significant contributions to production-grade data systems.
- Proven track record in designing and deploying petabyte-scale data infrastructure in Fintech, Banking, or RegTech environments.
Technical Expertise
- Strong command of Big Data technologies: Apache Spark, Hive, Hudi, Kafka, Delta Lake, Flink, Beam.
- Proficiency in Python, SQL, and optionally Scala / Java.
- Experience with cloud-native services on AWS (S3, Glue, Redshift, EMR, Lake Formation), Azure (Data Lake, Synapse, ADF), or GCP (BigQuery, Dataproc, Pub/Sub).
- Mastery of data modeling (3NF, dimensional, data vault), data versioning, and schema registries.
- Familiarity with ML feature stores, stream processing, and event-driven architectures is a plus.
Soft Skills
- Strategic thinking and the ability to balance long-term vision with short-term delivery.
- Strong documentation, architecture diagramming (UML, ArchiMate), and presentation skills.
- Excellent English communication (Arabic a plus).
- Experience leading architecture reviews, engaging with senior stakeholders, and mentoring data engineers.
Certifications
- AWS Certified Data Analytics – Specialty
- Microsoft Azure Solutions Architect / Data Engineer
- Google Cloud Professional Data Engineer
- TOGAF or DAMA CDMP
Values
Our values guide how we think and act - they describe what we care about the most.
Customer first - It’s embedded in our design thinking and customer service approach
Open - Openness allows us to constantly improve and evolve
Real - No jargon and no excuses!
Bold - Constantly challenging ourselves and our way of thinking
Resilient - If we fail, we bounce back stronger than before
Collaborative - We know that we can achieve a lot more as a team
We are changing lives by constantly striving for a better solution
Click apply below and become part of the Geidea story
Seniority level
Director
Employment type
Full-time
Job function
Engineering, Strategy / Planning, and Management