A global professional services firm is looking for a Databricks Consultant. You will work on capital markets assignments, contributing to large-scale data implementations and modern analytics architecture projects.
Must-have Skills:
You should have:
- Deep expertise with Databricks, Apache Spark, and Data Lake architecture (Delta Lake preferred)
- Strong experience with Kafka, streaming data pipelines, and real-time processing
- Hands-on ability in ETL development, including data ingestion, cleansing, and transformation at scale
- Working knowledge of CI/CD workflows, version control, and production-grade deployment strategies
- Familiarity with machine learning pipelines, particularly using MLflow or integrated Databricks ML tools
- Proficiency in PySpark, SQL, and integration with cloud platforms like Azure, AWS, or Google Cloud Platform
- Capital markets experience, strongly preferred: familiarity with trading data, financial instruments, or regulatory-driven environments
- Excellent client-facing communication and documentation skills
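The ETL bullet above (ingestion, cleansing, transformation at scale) can be sketched in miniature. This is plain Python rather than PySpark, and every record, field, and function name here is invented for illustration only:

```python
import csv
import io
from datetime import datetime

# Hypothetical raw batch: trade records as CSV text, including one
# malformed row that the cleansing step should drop.
RAW_TRADES = """trade_id,symbol,price,quantity,executed_at
T-1001,AAPL,189.25,100,2024-03-01T14:30:00
T-1002,MSFT,not_a_price,50,2024-03-01T14:31:00
T-1003,GOOG,141.80,200,2024-03-01T14:32:00
"""

def cleanse_trades(raw_csv):
    """Ingest CSV text, drop rows with unparseable fields, and
    transform the survivors into typed dicts with a computed notional."""
    cleaned = []
    for row in csv.DictReader(io.StringIO(raw_csv)):
        try:
            price = float(row["price"])
            quantity = int(row["quantity"])
            executed_at = datetime.fromisoformat(row["executed_at"])
        except ValueError:
            continue  # cleansing step: skip malformed records
        cleaned.append({
            "trade_id": row["trade_id"],
            "symbol": row["symbol"],
            "notional": price * quantity,  # transformation step
            "executed_at": executed_at,
        })
    return cleaned

trades = cleanse_trades(RAW_TRADES)
```

In a Databricks engagement the same ingest-cleanse-transform shape would typically be expressed as PySpark DataFrame operations over Delta Lake tables rather than in-memory Python.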
Responsibilities:
You'll be responsible for:
- Leading and delivering end-to-end implementations on the Databricks platform
- Designing and deploying data lake and streaming solutions tailored to capital markets use cases
- Managing data surveillance pipelines, ensuring performance, accuracy, and data governance compliance
- Supporting deployment pipelines and maintaining CI/CD best practices for repeatable delivery
- Collaborating with internal stakeholders and client teams to drive successful, timely delivery
- Participating in regular code reviews, architecture discussions, and hands-on client support
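As a toy illustration of the data-surveillance responsibility above, here is a minimal accuracy check in plain Python: flag trades whose price deviates unusually far from the batch mean. The threshold, field names, and batch are invented for the example; a production pipeline would run such checks in Spark at scale:

```python
from statistics import mean, pstdev

def flag_outlier_trades(trades, z_threshold=3.0):
    """Return trade_ids whose price lies more than z_threshold
    population standard deviations from the batch mean price."""
    prices = [t["price"] for t in trades]
    mu = mean(prices)
    sigma = pstdev(prices)
    if sigma == 0:
        return []  # all prices identical: nothing to flag
    return [t["trade_id"] for t in trades
            if abs(t["price"] - mu) / sigma > z_threshold]

# Hypothetical batch with one anomalous print.
batch = [
    {"trade_id": "T-1", "price": 100.0},
    {"trade_id": "T-2", "price": 101.0},
    {"trade_id": "T-3", "price": 99.5},
    {"trade_id": "T-4", "price": 100.5},
    {"trade_id": "T-5", "price": 250.0},
]
flagged = flag_outlier_trades(batch, z_threshold=1.5)
```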
About the Company:
You'll join a high-performing, collaborative team with a strong pipeline of capital markets work and a reputation for technical excellence. The firm is known for investing in its people, offering real opportunities for career growth, ongoing learning, and working alongside talented, motivated colleagues. Compensation is competitive, with excellent benefits, including healthcare, training budgets, and flexible work options.
Job Features
| Feature | Detail |
| --- | --- |
| Job Category | Finance/Trading |
| Pay | $170,000 – $200,000 |
| Skills | Databricks, Apache Spark, Data Lake architecture, Kafka, ETL, CI/CD, version control, ML, PySpark, cloud, capital markets experience |
| Culture | career growth, learning, talented colleagues, client-focused |