Senior Data Management Engineer (Remote)

Other
Salary: Competitive Salary
Job Type: Full time
Experience: Senior Level

Netvagas


Senior Data Management Engineer | Netvagas | Brazil

Come with us and become a #Luber

If you love technology, we have some exciting news for you: SO DO WE!

Check out our package and the reasons to become a #Luber:

About Luby

Overview of position:

The primary responsibility of the Senior Data Management Engineer is to build data pipelines, model and prepare data, perform complex data analysis to answer business questions, build and automate a data pipeline and quality framework to enable and promote self-service data pipelines, and assist in operationalizing AI/ML engineering solutions. This role is expected to lead and guide other team members and to evangelize design patterns and coding standards. It also plays an active part in our Data Modernization project, migrating from on-prem platforms such as IBM Netezza to the cloud.

What will be your responsibilities at #Luby?

  • Team up with the engineering teams and enterprise architecture (EA) to define standards, design patterns, accelerators, development practices, DevOps, and CI/CD automation
  • Conduct complex data analysis to answer queries from business users or technology team partners
  • Build and automate data ingestion, transformation, and aggregation pipelines using Azure Data Factory, Databricks/Spark, Snowflake, and Kafka, as well as enterprise scheduler tools such as CA Workload Automation
  • Set up and evangelize a metadata-driven approach to data pipelines to promote self-service
  • Set up and continuously improve data quality and audit monitoring, as well as alerting
  • Continuously evaluate process automation options and collaborate with engineering and architecture to review proposed designs
  • Demonstrate mastery of build and release engineering principles and methodologies, including source control, branch management, build and smoke testing, and archiving and retention practices
  • Adhere to, enhance, and document design principles and best practices by collaborating with Solution Architects and, in some cases, Enterprise Architects
  • Participate in and support the Data Academy and Data Literacy program to train business users and technology teams on data
  • Respond to SLA-driven production data quality or pipeline issues
  • Work in a fast-paced Agile/Scrum environment
  • Identify and assist with implementation of DevOps practices in support of fully automated deployments
  • Document data flow diagrams, data models, technical data mappings, and production support information for data pipelines
  • Follow industry-standard data security practices and evangelize them across the team

Position's Requirements

  • 5+ years of experience in an Enterprise Data Management or Data Engineering role
  • 5+ years of hands-on experience building metadata-driven data pipelines using Azure Data Factory and Databricks/Spark for a cloud data lake
  • 5+ years of hands-on experience using one or more of the following for data analysis and wrangling: Databricks, Python/PySpark
  • 10+ years of hands-on experience with SQL data manipulation on databases
  • 3+ years of hands-on experience with databases such as Snowflake, Netezza, Oracle, SQL Server, MySQL, or Teradata
  • Experience working in a multi-developer environment, with hands-on experience using either Azure DevOps or GitLab
  • Experience in SLA-driven production data pipeline or quality support is preferred
  • Experience with, or a strong understanding of, traditional enterprise ETL platforms such as IBM DataStage, Informatica, Pentaho, Ab Initio, etc.
  • Functional knowledge of some of the following technologies: Terraform, Azure CLI, PowerShell, containerization (Kubernetes, Docker)
  • Functional knowledge of one or more reporting tools such as Power BI, Tableau, or OBIEE
  • Team player with excellent communication skills, able to communicate directly with the customer and explain the status of deliverables in scrum calls
  • Ability to implement Agile methodologies and work in an Agile DevOps environment.

Preferred

  • Bachelor's degree in Computer Science, Engineering, Mathematics, or a related field
  • 5+ years of experience in various cloud technologies within a large-scale organization.

We hire through PJ contracts while also offering many benefits (…so you get the best of both worlds o/):

Health And Well-being

  • HEALTH PLAN: Unimed health plan with co-participation.
  • DENTAL PLAN: Bradesco national coverage.
  • LIFE INSURANCE: Life insurance from MetLife.
  • WELLHUB: Wellness platform, with fitness, mindfulness, therapy, nutrition and sleep solutions.
  • PSICONEXÃO: Referral network for psychology professionals.

AND MORE:

  • +EDUCATION: Hub of partner educational institutions offering exclusive discounts for Lubers!
  • MULTI PARTNERSHIP: 20% discount at Multilaser stores.
  • ADAPTABLE SCHEDULE: Adaptable schedules according to the project.
  • ANNUAL BONUS: 80h/year bonus.
  • REFER A FRIEND: When referring professionals, receive a cash bonus.
  • REFER A BUSINESS: Receive a bonus when you refer new businesses to Luby.

Tagged as: remote, remote job, virtual, Virtual Job, virtual position, Work at Home, work from home

When applying, state that you found this job on the Pangian.com Remote Network.