Vacancy published 25 November 2025
Senior Data Engineer
Remote (EU)
Salary not specified
Suvoda is seeking a skilled and driven Cloud Data Engineer to help evolve our data platform towards a data mesh architecture. In this role, you’ll design and build domain-oriented data products and support near real-time reporting. You’ll work on building and optimizing ETL/ELT pipelines using AWS Glue and PySpark, ensuring scalable, high-performance data processing across our platform.
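For illustration only (this sketch is not part of the vacancy text): a minimal AWS Glue PySpark job of the kind the role describes, reading a raw table from the Glue Data Catalog, applying a simple transformation, and writing partitioned Parquet back to S3. The database, table, column, and bucket names are hypothetical placeholders.

import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext
from pyspark.sql import functions as F

# Standard Glue job bootstrap: resolve job arguments and build the contexts.
args = getResolvedOptions(sys.argv, ["JOB_NAME"])
sc = SparkContext()
glue_context = GlueContext(sc)
spark = glue_context.spark_session
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read a raw dataset registered in the Glue Data Catalog
# ("raw_db" / "events" are hypothetical names).
raw = glue_context.create_dynamic_frame.from_catalog(
    database="raw_db", table_name="events"
).toDF()

# Example transformation: keep completed events and derive a partition column.
curated = (
    raw.filter(F.col("status") == "completed")
       .withColumn("event_date", F.to_date("event_ts"))
)

# Write curated Parquet to the data lake, partitioned by date
# (the bucket and prefix are placeholders).
curated.write.mode("overwrite").partitionBy("event_date").parquet(
    "s3://example-data-lake/curated/events/"
)

job.commit()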
Responsibilities:
— Contribute to the design and implementation of a data mesh architecture using GraphQL APIs to expose domain-owned data products (an illustrative sketch follows this list).
— Build and maintain a modern AWS-based data lake using S3, Glue, Lake Formation, Athena, and Redshift.
— Develop and optimize ETL/ELT pipelines using AWS Glue and PySpark to support batch and streaming data workloads.
— Implement AWS DMS pipelines to replicate data into Aurora PostgreSQL for near real-time analytics and reporting.
— Support data governance, quality, observability, and API design best practices.
— Collaborate with product, engineering, and analytics teams to deliver robust, reusable data solutions.
— Contribute to automation and CI/CD practices for data infrastructure and pipelines.
— Stay current with emerging technologies and industry trends to help evolve the platform.
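To make the first responsibility above more concrete, here is a minimal sketch (again, not part of the vacancy) of how a domain-owned data product could be exposed through a GraphQL API. It uses the graphene library as one possible choice; the posting does not prescribe a framework, and the type, fields, and values are hypothetical.

import graphene

# Hypothetical domain data product: enrollment metrics owned by one domain team.
class EnrollmentMetric(graphene.ObjectType):
    study_id = graphene.ID()
    site_id = graphene.String()
    enrolled = graphene.Int()
    as_of = graphene.String()

class Query(graphene.ObjectType):
    enrollment = graphene.List(EnrollmentMetric, study_id=graphene.ID(required=True))

    def resolve_enrollment(root, info, study_id):
        # A real data product would query the domain's serving store
        # (e.g. Aurora PostgreSQL or Athena); values are hard-coded for illustration.
        return [EnrollmentMetric(study_id=study_id, site_id="site-001",
                                 enrolled=42, as_of="2025-11-25")]

schema = graphene.Schema(query=Query)

if __name__ == "__main__":
    # Graphene camelCases field and argument names in the exposed schema by default.
    result = schema.execute('{ enrollment(studyId: "ST-100") { siteId enrolled } }')
    print(result.data)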
Requirements:
— Bachelor’s degree in a technical field such as Computer Science or Mathematics.
— At least 4 years of experience in data engineering, with demonstrated ownership of complex data systems.
— Solid experience with AWS data lake technologies (S3, Glue, Lake Formation, Athena, Redshift).
— Understanding of data mesh principles and decentralized data architecture.
— Proficiency in Python and SQL.
— Experience with data modeling, orchestration tools (e.g., Airflow), and CI/CD pipelines.
— Strong communication and collaboration skills.
Preferred Qualifications:
— Master’s degree, especially with a focus on data engineering, distributed systems, or cloud architecture.
— Hands-on experience in infrastructure-as-code tools (e.g., Terraform, CloudFormation).
— Expertise in AWS Glue and PySpark for scalable ETL/ELT development.
— Experience with event-driven architectures (e.g., Kafka, Kinesis).
— Familiarity with data cataloging and metadata management tools.
— Knowledge of data privacy and compliance standards (e.g., GDPR, HIPAA).
— Background in agile development and DevOps practices.

Suvoda
Our advanced patient randomization and trial supply management (RTSM) system. Your clinical trial command and control center.