We're searching for a Data Engineer
About The Position
We are the leading global provider of sports data, dedicated to revolutionizing the industry through innovative solutions. We excel in sports data collection and analysis, advanced data management, and cutting-edge services like AI-based sports tips and high-quality sports visualization.
As the sports data industry continues to grow, we remain at the forefront, delivering real-time solutions. If you share our passion for sports and technology and have the drive to advance the sports-tech and data industries, we invite you to join our team!
We are looking for a highly motivated Data Engineer.
About the team: Data Integrity
Our Data Integrity team is one of the main pillars of the company's offering and long-term strategy.
We are pushing the boundaries of real-time analysis, utilizing machine learning and artificial intelligence to find the delicate balance between low latency and data accuracy.
Responsibilities:
Building production-grade data pipelines and services
Designing and building our data lake/lakehouse
Taking ownership of major projects from inception to deployment
Architecting simple yet flexible solutions, and then scaling them as we grow
Collaborating with cross-functional teams to ensure data integrity, security, and optimal performance across various systems and applications
Staying current with emerging technologies and industry trends to recommend and implement innovative solutions that enhance data infrastructure and capabilities
Requirements:
3+ years of experience delivering production-grade data pipelines and backend services
2+ years of experience using Spark, Presto, Trino, or similar
Experience building data pipelines and working in distributed architectures
Experience with SQL and NoSQL Databases
Experience with designing and implementing data lake/warehouse
Experience with Snowflake, Databricks, SageMaker, or similar - an advantage
Knowledge and understanding of modern CI environments: Git, Docker, Kubernetes (K8s) - an advantage
Experience with ETL tools such as AWS Glue or Apache Airflow - an advantage
Experience with Kafka - an advantage
Experience with Kafka Connect (Java) - an advantage
This position is open to all candidates.