What You Can Expect At BBH:
If you join BBH you will find a collaborative environment that enables you to step outside your role to add value wherever you can. You will have direct access to clients, information and experts across all business areas around the world. BBH will provide you with opportunities to grow your expertise, take on new challenges, and reinvent yourself—without leaving the firm. We encourage a culture of inclusion that values each employee’s unique perspective. We provide a high-quality benefits program emphasizing good health, financial security, and peace of mind. Ultimately we want you to have rewarding work with the flexibility to enjoy personal and family experiences at every career stage. Our BBH Cares program offers volunteer opportunities to give back to your community and help transform the lives of others.
Your responsibilities
- Facilitate the establishment of a secure data platform on BBH's on-premises Cloudera infrastructure
- Develop and document ETL logic and data flows, both batch and real-time streaming, to make data assets easy to use
- Leverage components of the Cloudera distribution (including, but not limited to, Sqoop, Hive, Impala, and Spark) to achieve project objectives
- Follow consistent coding and unit-testing practices
- Work with distributed teams
Our requirements
- Bachelor's degree in Computer Science or related technical field, or equivalent experience
- 8+ years of experience in IT, primarily in hands-on development
- Strong knowledge of architectural principles, frameworks, design patterns, and industry best practices for design and development
- Strong hands-on experience with programming languages such as Java, Scala, or Python
- 4+ years of real project experience as a data wrangler/engineer across design, development, testing, and production implementation for Big Data projects, processing large volumes of structured and unstructured data
- Strong hands-on experience with Snowflake, Spark and Kafka
- Experience with the Oracle database engine, including PL/SQL and performance tuning of SQL queries
- Experience in designing efficient and robust ETL/ELT workflows and schedulers
- Strong written and verbal communication skills, along with strong analytical and problem-solving skills
- Experience working with Git, Jira, and Agile methodology
Optional
- Experience supporting the end-to-end development life cycle and SDLC processes
- Working experience with Data Virtualization tools such as Dremio/Denodo
- Knowledge of Machine Learning libraries and exposure to Data Mining
- Working experience with AWS/Azure/GCP
- Working experience in the financial industry
What we offer
- 2 additional days added to your holiday calendar for Culture Celebration and Community Service
- Private medical care for you and your family
- Life Insurance
- Hybrid Working Opportunities
- Professional training and qualification support
- Thrive Wellbeing Program
- Online benefit platform
- Contract for an indefinite period of time with no probation period