
Senior Data Platform Architect (GCP Expert)

SpringCube

Full-time - Senior Associate / Assistant Manager

Hospitality, Travel & Leisure

Singapore (Onsite)

Published 3 weeks ago

Salary: SGD 5,000 - SGD 10,000


Job Description

The SpringCube team has curated the following job opportunity to help you in your job search. Explore the position below to find your next career move.

Company Overview:

The hiring company is a leading online travel and leisure e-commerce platform specializing in experiences and travel services.

Job Title: Senior Data Platform Architect (GCP Expert)

Role Overview:

The company is seeking a seasoned Data Platform Architect with deep expertise in modern data technologies (e.g., Flink, Kafka, Lakehouse, Paimon, Doris, BigQuery) and Google Cloud Platform (GCP). The role involves designing, building, and optimizing scalable data platforms, with seamless integration across data ingestion, warehousing, AI/ML pipelines (Vertex AI), and data governance (Data Catalog). The ideal candidate is a hands-on leader who can bridge technical execution with strategic vision.

What you’ll do:

Architect Modern Data Platforms:

Design and implement scalable, real-time data solutions using Apache Flink, Kafka, Lakehouse architectures, Apache Paimon, Apache Doris, and BigQuery.

Optimize data pipelines for batch and streaming workflows, ensuring low latency and high throughput.

GCP-Centric Data Engineering:

Lead GCP-based data ingestion (Pub/Sub, Dataflow), warehousing (BigQuery, Dataproc), and ML pipelines (Vertex AI, TFX).

Implement DataOps/MLOps practices for CI/CD, monitoring, and governance.

Data Governance & Cataloging:

Deploy and manage data catalogs (e.g., Data Catalog, Collibra, Alation) for metadata management, lineage, and compliance.

Enforce data quality, security, and access controls.

Cross-Functional Leadership:

Collaborate with AI/ML teams to productionize models and embed analytics into business processes.

Mentor engineers and evangelize best practices in cloud-native data architectures.

What you’ll need:

8+ years in data engineering, architecture, or solutions roles, with at least 3 years focused on GCP.

Proficiency in real-time data tools: Flink, Kafka, Paimon, Doris, Spark.

Hands-on experience with GCP services: BigQuery, Dataflow, Pub/Sub, Dataproc, Vertex AI, Cloud Storage.

Strong SQL/Python/Java/Scala skills; familiarity with data lakehouse frameworks (Delta Lake, Iceberg, Hudi).

Experience with data catalogs (e.g., GCP Data Catalog, OpenMetadata) and metadata management.

Proven track record of designing large-scale data platforms (10M+ rows/day).

Ability to translate business needs into technical solutions.

Certifications such as GCP Professional Data Engineer, or AWS/Azure equivalents, are a plus.

Nice-to-Have:

Knowledge of multi-cloud integrations (e.g., AWS S3 + GCP).

Exposure to LLM pipelines, vector databases, or generative AI workflows.

Disclaimer

SpringCube curates tech job listings from various company websites to support tech professionals in Singapore.

  1. No Endorsement: Listing a job ad on SpringCube does not imply endorsement of its authenticity or quality.
  2. No Client Relationship: This company is not a client of SpringCube unless otherwise stated.
  3. To Apply: Click the “Apply” button to be redirected to the hiring company’s application page for this job.
  4. No Liability: SpringCube is not liable for any inaccuracies in this listing.