Vacancy: Data Engineer
GitLab (regular) PySpark (regular) SQL (regular) Python (regular) Docker (regular) Kubernetes (regular) Terraform (regular) GCP / Azure / AWS (regular)

PROJECT INFORMATION:
Industry: Media
Location: Fully remote
Project length: Long-term
Start: ASAP / flexible
Assignment type: B2B
Rate: Up to 180 PLN/h, depending on experience (negotiable)

PROJECT DETAILS:
A central component of the client's strategy is building a modern cloud data platform on which data-driven solutions are developed using analytics, machine learning and artificial intelligence.

RESPONSIBILITIES:
Implementing data pipelines for the preparation, provision and versioning of data for model training
Advising Data Scientists on the development of machine learning models, especially on operating these models in production
Designing and implementing microservices that serve the models via REST APIs, including functionality for monitoring these models in production
Deploying the microservices to the production cloud environment, accounting for high-availability requirements

REQUIREMENTS:
Experience with: Google Cloud Platform, Azure or AWS; Terraform; GitLab; Kubernetes; Docker; Airflow; MLflow; BigQuery; Bigtable; Python; PySpark; SQL; REST APIs

WE OFFER:
Flexible working hours
Internal referral bonus
Co-financed benefits: Medicover card, Multisport card
Transparent relations built on trust and fair play