We're looking for a Data Engineer to join Tech19, where you'll be responsible for developing innovative features based on multiple layers of data. These features will power recommendation systems, insights, and more.
This role involves working on diverse data pipelines that tackle challenges related to scale and algorithmic optimization.
Requirements:
3+ years of experience in data engineering, building and optimizing scalable data pipelines.
5+ years of experience as a software developer, preferably in Python.
Algorithmic experience, including developing and optimizing machine learning models and implementing advanced data algorithms.
Experience working with cloud ecosystems, preferably AWS (S3, Glue, EMR, Redshift, Athena) or comparable cloud environments (Azure/GCP).
Expertise in extracting, ingesting, and transforming large datasets efficiently.
Deep knowledge of big data platforms such as Spark, Databricks, Elasticsearch, and Kafka for real-time data streaming.
(Nice-to-have) Hands-on experience with vector databases and embedding techniques, with a focus on search, recommendations, and personalization.
This position is open to all candidates.