A leading global technology services and delivery organisation is building out its Modern Data & AI capability in APAC. The team partners with enterprise clients on large-scale digital transformation programmes - designing, building and showcasing scalable data platforms and AI solutions across cloud ecosystems. As a key part of its expansion plans, it is hiring for a newly created permanent Solution Architect (Data & AI) position.
Job responsibilities:
- Run solution workshops, technical demos and PoCs across modern data platforms, AI/ML and GenAI use cases.
- Support RFP/RFI responses and proposal development, contributing to solution scope, feasibility and adoption strategy.
- Architect and deliver solutions on modern data platforms such as Databricks, Snowflake, Microsoft Fabric, BigQuery and Cloudera CDP (lakehouse / distributed data systems).
- Design and implement robust data engineering pipelines (e.g., Spark, Airflow, Delta Lake, Kafka, dbt) with production-grade reliability, observability and data quality.
- Drive data governance and catalogue implementations (e.g., Purview, Collibra, Unity Catalog, Alation, Informatica), including metadata management, lineage, compliance and data quality policies.
- Build and operationalise AI/ML and MLOps workflows (e.g., MLflow, Kubeflow, SageMaker, Azure ML) across training, deployment, registry and monitoring.
- Deliver GenAI / RAG / agentic AI solutions using platforms such as Azure OpenAI, AWS Bedrock, Vertex AI, and frameworks like LangChain/LlamaIndex (plus vector DB integrations).
- Provide architecture leadership on enterprise patterns such as Lakehouse, Data Mesh and Data Fabric, driving best practices and capability roadmaps.
Job requirements:
- Proven hands-on experience delivering end-to-end Data & AI solutions in production (not just PoCs).
- An engineering foundation is strongly preferred (e.g., a background in software engineering before moving into architecture), with proven hands-on depth - not just high-level diagrams.
- Strong experience as a Solution Architect / Technical Architect delivering programmes end-to-end, with the ability to explain designs, trade-offs, and project outcomes clearly.
- Strong modern data platform experience (Databricks/Snowflake/Fabric/BigQuery/Cloudera) and solid distributed systems fundamentals.
- Strong data engineering toolkit: Spark/Airflow/Delta/Kafka/dbt, with a focus on reliability, data quality and observability.
- Practical experience with MLOps (MLflow/Kubeflow/SageMaker/Azure ML) and operationalising ML into CI/CD workflows.
- Hands-on GenAI exposure: prompting, RAG pipelines, vector databases, and (ideally) agent workflows.
Why you should join:
- Work on high-impact modernisation programmes spanning modern data platforms, AI/ML and GenAI across multiple enterprise clients.
- A role with real ownership - covering both architecture and delivery, with exposure to presales, solution strategy and executive stakeholders.
- Build cutting-edge capabilities (RAG, agents, MLOps, governance) in a team that values hands-on execution and practical outcomes.
Reg. No. R1766249
BeathChapman Pte Ltd
Licence No. 16S8112