At IBM, work is more than a job – it’s a calling: To build. To design. To code. To consult. To think along with clients and sell. To make markets. To invent. To collaborate. Not just to do something better, but to attempt things you’ve never thought possible. Are you ready to lead in this new era of technology and solve some of the world’s most challenging problems? If so, let’s talk.

As a Data Lake Architect, you’ll be part of a passionate team building new, powerful applications for IBM Security. As part of the extended team of product stakeholders, offering managers, architects, and product owners, you will help define the new security products for hybrid cloud. We are looking for an experienced and passionate candidate who can coordinate and guide the efforts of multiple scrum teams in a large, complex project. Our current teams span the globe, so we are looking for someone with excellent organizational and communication skills.

Your Role and Responsibilities

In this role, you will build a real-time, multi-tenanted data analytics platform for SaaS and Hybrid Cloud deployment. This platform must support ingestion from hundreds to hundreds of thousands of unique data sources in a reliable, secure, and cost-effective manner.


  • Architect, design, and build a real-time, multi-tenanted data analytics platform for SaaS and Hybrid Cloud deployment.
  • Collaborate with teams across the globe to build an integrated experience for our end users.
  • Build a cost-effective platform focusing on performance, reliability, scalability, and security.

Required Technical and Professional Expertise

  • Fluent English (verbal and written)
  • Proven hands-on experience building and operating an enterprise-grade data analytics platform
  • Proven hands-on experience with distributed columnar data stores
  • Proven hands-on experience with query engines such as SparkSQL, PrestoDB, et cetera
  • Proven hands-on experience with distributed technologies such as Spark, Kafka, Kafka Streams, et cetera
  • 5+ years of development experience with Java and/or Python
  • Familiarity with storage technologies and tiered storage patterns

Preferred Technical and Professional Expertise

  • Kubernetes and/or the Red Hat OpenShift Platform
  • Cloud (AWS, Azure, IBM Cloud, et cetera)
  • SQL, Python, Go
  • PostgreSQL
  • Relational, Graph, and Columnar DB design
  • ETL/ELT Pipelines
  • Open-source usage and involvement
  • Legal Work Status