Data Engineer
Business Stream: Tipico

We are Tipico, the leading sports betting provider in Germany and one of the most exciting tech companies in the industry. We are fuelled by our passion for the perfect product, using the newest tech stack to make advances every single day. Our culture is bursting with energy and ambition, generating intense moments of Spannung and thriving through our values of Trust, Progress, and Passion. We push the limits of what is possible because we believe that when adrenaline meets progress, awesome things happen.

 

Job Description 

Being a market leader means we always need to stay ahead. As a Data Engineer, you will be part of our strategic data team, pioneering artificial intelligence concepts and cutting-edge data warehousing techniques while processing millions of transactions every day at low latency. At Tipico, we process terabytes of data daily, spanning all areas of the business – bookmaking data, web analytics, CRM data, and much more.

Are you ready to turn up the power and bring the thrill of Spannung into your working life? Then show us your game face – because we’re looking for a Data Engineer to join the team! 

 

Your playing field: 

  • You will create and maintain both batch (ELT) and real-time data pipelines and architecture. 
  • We are continuously becoming more data-driven – you will assemble large, complex data sets that meet our functional and non-functional business requirements. 
  • Innovation drives us and we are continuously reinventing ourselves. You will identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, and more. 
  • You will build the infrastructure required for optimal extraction, loading, and transformation of data from a wide variety of data sources using SQL and AWS big data technologies. 
  • Our focus is on gaining insights that will change the way we look at sports betting; the tools you build will provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics. 
  • You will collaborate with stakeholders – including data architects, executives, and Product, Data, and IT team members – from the start of your projects through to delivery.

Which skills should you bring to the pitch:

  • Experience. At least 3 years of work experience relevant to this position. 
  • Cloud data services. AWS services such as S3, Athena, EC2, EMR, AWS Batch, and Lambda. 
  • Large-scale production databases and SQL. Familiarity with large cloud data warehouses such as Redshift or Snowflake, including data modeling techniques and the handling of very large datasets. 
  • ELT. Experience with ELT development (extraction, loading, aggregation) using tools such as dbt and Airflow. 
  • Governance. Understanding of and experience with data quality concepts and tools, as well as data catalog tools. 
  • Big Data. You have worked with or are familiar with big data technologies such as NiFi, Kafka, Python, Spark, and Hudi. 
  • Docker/Kubernetes. Familiarity with containerization and orchestration technologies such as Docker and Kubernetes. 
  • Strong analytics skills and the ability to innovate. We need you to come up with the best solutions for us that will push us forward. 
  • Fast learner. Working in a fast-paced environment requires you to pick up new and complex concepts quickly and be resourceful! 
  • Collaborative. Share your ideas! We want you to engage in our interactive discussions within our team and clearly communicate technical concepts. 

APPLY NOW!