Senior Data Engineer AWS/Snowflake (Remote)

Other
Salary: Competitive Salary
Job Type: Full time
Experience: Senior Level

inhire.io

Senior Data Engineer AWS/Snowflake | inhire.io | Poland

For our client RITS Professional Services, we are looking for a candidate for the position of Senior Data Engineer AWS/Snowflake.

Join Our Team as a Data Engineer!

Are you passionate about data and looking for an exciting role where you can shape the future of data infrastructure? We’re on the lookout for a talented Data Engineer with DevOps skills to join our innovative team.

What We Offer

  • 100% Remote Work – Enjoy the flexibility of working from anywhere in Poland.
  • Competitive Rate – Up to 150-155 zł/h on a B2B contract.
  • Challenging and Impactful Projects – Work with cutting-edge technologies and make a real impact.

Your Responsibilities

  • Design and maintain robust data pipelines that power business insights.
  • Work with large, complex data sets to meet both functional and non-functional requirements.
  • Implement internal process improvements by automating manual tasks, optimizing data delivery, and enhancing infrastructure scalability.
  • Build and maintain the infrastructure for data extraction, transformation, and loading (ETL) using SQL and AWS big data technologies (see the loading-step sketch after this list).
  • Develop and optimize CI/CD pipelines (GitLab, DataOps).
  • Work with dbt for data transformations and scripting complex algorithms.
  • Manage Snowflake databases, work with advanced SQL features, and tune performance, using tools such as JavaScript, Bash, Docker, Python, GitLab, and JIRA.
  • Collaborate with cross-functional teams to provide data-driven insights and solutions.
  • Ensure data security across national boundaries and multiple AWS regions.
  • Create innovative tools for analytics and data science teams to enhance product functionality and business performance.
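
To illustrate the kind of ETL work described above, here is a minimal, hypothetical sketch in Python of a single loading step that copies staged JSON files from S3 into Snowflake. The connection settings, stage name (raw_stage), and table name (raw_events) are illustrative assumptions, not part of the role description; the sketch assumes the snowflake-connector-python package and a target table with a single VARIANT column.

```python
# Minimal, illustrative ETL loading step: copy JSON files staged in S3 into a
# Snowflake table. Stage, table, warehouse, and schema names are hypothetical.
import os

import snowflake.connector


def load_raw_events() -> None:
    # Credentials come from environment variables rather than being hard-coded.
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="LOAD_WH",
        database="ANALYTICS",
        schema="RAW",
    )
    cur = conn.cursor()
    try:
        # COPY INTO skips files it has already loaded, so re-running this step
        # after a failure is safe. raw_events is assumed to have one VARIANT column.
        cur.execute(
            """
            COPY INTO raw_events
            FROM @raw_stage/events/
            FILE_FORMAT = (TYPE = 'JSON')
            ON_ERROR = 'ABORT_STATEMENT'
            """
        )
    finally:
        cur.close()
        conn.close()


if __name__ == "__main__":
    load_raw_events()
```

In practice, a step like this would typically be wrapped in an orchestrator task and followed by dbt transformations, in line with the responsibilities above.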

Requirements

  • 4+ years of experience with data pipeline-focused programming languages (e.g., Python, R).
  • 4+ years of experience with SQL.
  • Experience with different types of storage solutions (filesystem, relational, MPP, NoSQL) and handling various data types (structured, unstructured, metrics, logs).
  • 3+ years of experience working with data architecture concepts (in any of the following areas: data modeling, metadata management, workflow management, ETL/ELT, real-time streaming, data quality, distributed systems).
  • 3+ years of experience with cloud technologies such as Airflow, Glue, and Dataflow, and with handling data on AWS, Redshift, BigQuery, S3, etc. (see the Airflow sketch after this list).
  • 1+ year of experience with Java/Scala.
  • Very good knowledge of data serialization languages such as JSON, XML, YAML.
  • Excellent knowledge of Git, Gitflow, and DevOps tools (e.g., Docker, Bamboo, Jenkins, Terraform).
  • Knowledge of Unix and strong troubleshooting skills.
  • Bonus: familiarity with pharma data formats (SDTM).
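
As a rough illustration of the workflow-management experience listed above, a minimal Airflow 2.x DAG might look like the sketch below. The DAG id, schedule, and task callables are hypothetical placeholders, not taken from the job description.

```python
# Minimal, hypothetical Airflow 2.x DAG: an extract task followed by a
# transform task. Names and callables are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_to_s3() -> None:
    # Placeholder: pull data from a source system and land it in S3.
    pass


def run_transformations() -> None:
    # Placeholder: trigger downstream transformations (e.g. a dbt run).
    pass


with DAG(
    dag_id="daily_events_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_to_s3", python_callable=extract_to_s3)
    transform = PythonOperator(
        task_id="run_transformations", python_callable=run_transformations
    )

    # The transform runs only after extraction succeeds.
    extract >> transform
```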

Why Join Us?

You’ll work alongside a talented and passionate team, get the opportunity to learn and grow, and tackle meaningful projects that drive real business value. If you’re a data enthusiast who loves solving complex problems and wants to shape the future of data systems, we’d love to hear from you!

Don’t hesitate and apply now!

Tagged as: remote, remote job, virtual, Virtual Job, virtual position, Work at Home, work from home

When applying, state that you found this job on Pangian.com Remote Network.