Data Architect
The CTO Office is responsible for full-stack design and development of the company's products, infrastructure, central data & ML services, and other business platforms. The department owns the overall project design, architecture, and development of large-scale, highly challenging cloud-based Big Data, machine learning, and business application platforms that drive excellence across business areas including game & product design, marketing, retention, and monetization. The department's solutions support optimization of the company's overall business performance.
Responsibilities
Design and build scalable, robust data platforms, data applications, CRM, campaign management, and other back-office applications
Responsible for our Data Platform Architecture and ML Platform
Responsible for multiple projects at once, serving as the main source of technical knowledge and experience, facilitating, building (hands-on) and mentoring R&D teams
Responsible for cloud infrastructure, architecture, microservices, CI/CD, production environments, and new, innovative and complex developments
Responsible for research, analysis and proof-of-concept work on new technologies, tools, design concepts, and AI products and platforms in the data domain
Design and develop core modules in our cloud and data platform infrastructures (hosted on Google Cloud, with services running on Kubernetes; based on Spark Core/Structured Streaming/SQL, Scala, Python, React, Go, Node.js, Kafka, BigQuery, Redis, Benthos, Elasticsearch, Google Cloud Machine Learning Engine and Vertex AI, Databricks, MLflow)
The platform handles huge amounts of data through complex processing in batch and real-time modes and complex data manipulation, using services, UI frameworks and interactive notebooks.
Requirements: You are a hands-on tech leader who knows how to manage engineering teams and drive people to success using Agile methodologies with a value-oriented approach
You are a people person - you know how to plan, execute and maintain projects from design to production, with standards and vision
You come with 5+ years of practical experience with the Scala/Java/Python programming languages and excellent programming skills - functional programming, design patterns, data structures and a TDD approach
Expert SQL knowledge - you know how to write efficient, low-latency queries against modern data-warehouse solutions (BigQuery - preferred, Snowflake, Athena, etc.) and build streaming/batch ETL pipelines using modern processing engines (Spark - preferred, Beam, Flink, Benthos, etc.)
Experienced with machine learning platforms and processes, from the early stages of training a model through tuning, monitoring and serving
You know how to build large-scale (petabytes), low-latency distributed systems using modern cloud computing technologies (GCP - preferred, AWS)
Experience working in microservices environments and building state-of-the-art data-driven solutions
Desired:
Experience working with CI/CD tools and DevOps solutions - big advantage
Experience working with Docker and Kubernetes solutions - big advantage
This position is open to all candidates.