Nestle Senior DBT Developer Dup Ro (Remote)

IT/Dev
Salary: Competitive Salary
Job Type: Full time
Experience: Senior Level

Luxoft

Nestle Senior DBT Developer Dup Ro | Luxoft | Romania

    Project Description:

    • The DBT developer is pivotal in automating and optimizing data movement between the critical data layers of an Azure-based pipeline: Staging (STG), Integration (INT), Business (BUS), and Presentation (PRS). The role bridges Azure Data Factory (ADF), which orchestrates the pipeline, and DBT, which executes the transformations and ensures data is ready for consumption. Because ADF lacks the permissions to manage data movement directly, it must trigger DBT to carry out data transformation and movement accurately and securely; a minimal sketch of this trigger pattern follows.
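
    As a concrete illustration of this trigger pattern, the sketch below shows an entry point that an ADF custom activity (or any webhook-style trigger) could invoke to run one layer's models. It is a minimal sketch, assuming dbt Core is installed on the compute ADF targets; the layer names and tag selectors are hypothetical.

    # Minimal sketch: an entry point an ADF custom activity could invoke
    # to run DBT for one layer. Assumes dbt Core is installed on the
    # target compute; the layer tags below are hypothetical.
    import subprocess
    import sys

    LAYER_SELECTORS = {
        "stg": "tag:staging",
        "int": "tag:integration",
        "bus": "tag:business",
        "prs": "tag:presentation",
    }

    def run_layer(layer: str) -> None:
        # run only the models matching this layer's selector, so ADF can
        # advance the pipeline one layer at a time
        result = subprocess.run(["dbt", "run", "--select", LAYER_SELECTORS[layer]])
        if result.returncode != 0:
            sys.exit(result.returncode)  # surface the failure to ADF

    if __name__ == "__main__":
        run_layer(sys.argv[1])  # e.g. python run_dbt_layer.py stg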

    Responsibilities:

    • This DBT-driven automation enhances efficiency, minimizes manual intervention, and ensures that data is processed consistently and is ready for end-user consumption in a timely manner. It plays a crucial role in meeting business intelligence and reporting needs by ensuring that data flows seamlessly across layers, maintaining data integrity, security, and availability.

    Mandatory Skills Description:

    DBT (Data Build Tool):

    o Expertise in building, orchestrating, and managing data models using DBT, including deep knowledge of DBT commands, Jinja macros, and configurations.

    o Ability to define and manage dependencies between data models across layers (STG, INT, BUS, PRS) to ensure smooth orchestration and pipeline execution (a minimal model sketch follows this list).

    o Proficiency in developing and managing DBT projects, performing testing, documenting models, and handling data transformation automation workflows.
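
    Most dbt models are SQL with Jinja, where {{ ref('stg_orders') }} declares the dependency; dbt also supports Python models on Snowflake via Snowpark. The minimal sketch below uses the Python form to show a hypothetical integration-layer model depending on a staging model; the model and column names are assumptions.

    # Hypothetical dbt Python model, e.g. models/int/int_orders.py.
    # dbt builds its DAG from ref() calls like the one below, which is
    # how cross-layer ordering (STG -> INT -> BUS -> PRS) is enforced.
    def model(dbt, session):
        dbt.config(materialized="table")    # choose the materialization
        stg_orders = dbt.ref("stg_orders")  # dependency on the STG layer
        # keep only completed orders for the integration layer
        return stg_orders.filter(stg_orders["STATUS"] == "COMPLETED")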

    Snowflake Expertise:

    o Proficient in working with Snowflake as the data warehouse solution, leveraging Snowflake’s architecture to optimize DBT pipelines.

    o Experience in writing and optimizing SQL queries for Snowflake, ensuring efficient use of virtual warehouses, data partitioning, and clustering.

    o Knowledge of Snowflake roles, access control, and permissions management to ensure secure and seamless data operations across environments (see the connection sketch below).
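
    As an illustration, the minimal sketch below opens a Snowflake connection with an explicit role and virtual warehouse, the same security context DBT resolves from profiles.yml. Every identifier and credential here is a hypothetical placeholder.

    # Minimal sketch: connect to Snowflake under a specific role and
    # virtual warehouse. All identifiers are hypothetical placeholders.
    import os

    import snowflake.connector

    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        role="TRANSFORMER",   # role scoped to the transformation schemas
        warehouse="DBT_WH",   # virtual warehouse sized for DBT runs
        database="ANALYTICS",
    )
    try:
        cur = conn.cursor()
        # confirm the effective security context before running anything
        cur.execute("SELECT CURRENT_ROLE(), CURRENT_WAREHOUSE()")
        print(cur.fetchone())
    finally:
        conn.close()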

    SQL Proficiency:

    o Strong SQL skills to write, optimize, and troubleshoot complex queries tailored for Snowflake’s environment (an EXPLAIN sketch follows this list).

    o Ability to design efficient transformations using DBT and SQL, handling large datasets with performance tuning in mind.
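
    A cheap first step when troubleshooting a slow transformation is Snowflake's EXPLAIN, which returns the query plan without executing the statement. The sketch below reuses the same hypothetical connection placeholders as above; the query and table names are made up.

    # Minimal sketch: inspect the plan of a hypothetical query with
    # EXPLAIN before tuning it. Placeholders as in the previous sketch.
    import os

    import snowflake.connector

    QUERY = """
        SELECT c.customer_id, SUM(o.amount) AS total
        FROM analytics.int_orders o
        JOIN analytics.int_customers c ON c.customer_id = o.customer_id
        GROUP BY c.customer_id
    """

    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="DBT_WH",
    )
    try:
        cur = conn.cursor()
        # EXPLAIN returns the plan without running the query
        cur.execute("EXPLAIN USING TEXT " + QUERY)
        for row in cur.fetchall():
            print(row[0])
    finally:
        conn.close()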

    Azure Data Platform:

    o Proficiency with Azure services such as Azure Data Factory (ADF), Azure Synapse Analytics, and Data Lake, ensuring seamless integration between DBT, ADF, and the Snowflake data platform.

    o Ability to use ADF to orchestrate and automate data pipelines and trigger DBT executions through APIs or custom activities.

    Version Control (e.g., Git):

    o Hands-on experience with Git for version control of DBT models, managing branches, collaborating with teams, and resolving merge conflicts efficiently.

    Data Warehousing and ETL Concepts:

    o Deep knowledge of data warehousing architectures and best practices, including experience in layer-based architecture (STG, INT, BUS, PRS) for organizing data flow.

    o Expertise in ETL/ELT processes, data transformation, and ensuring consistent data quality across environments.

    CI/CD Pipelines:

    o Experience with continuous integration and deployment (CI/CD) practices, particularly in automating DBT projects using ADF, Jenkins, or other CI tools.

    o Familiarity with using GitHub Actions, Azure DevOps, or similar tools to automate the build, testing, and deployment of DBT models (a minimal CI gate is sketched below).
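
    For example, a CI step in GitHub Actions or Azure DevOps can be as small as the script below, which builds and tests the project and fails the pipeline on any error; the "ci" target is a hypothetical profiles.yml entry pointing at a disposable schema.

    # Minimal sketch of a CI gate for a DBT project. "ci" is a
    # hypothetical target pointing at a disposable schema.
    import subprocess
    import sys

    STEPS = [
        ["dbt", "deps"],                     # install package dependencies
        ["dbt", "build", "--target", "ci"],  # run models and tests together
    ]

    for step in STEPS:
        print("running:", " ".join(step))
        if subprocess.run(step).returncode != 0:
            sys.exit(1)  # non-zero exit fails the CI job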

    Nice-to-Have Skills Description:

    • Orchestration Frameworks:

    o Proficient in setting up DBT orchestration frameworks using ADF, Airflow, or Prefect to schedule and automate DBT runs, ensuring timely data processing across environments (see the Airflow sketch after this list).

    o Hands-on experience in managing orchestration dependencies between DBT, Snowflake, and ADF pipelines, ensuring reliable data flow.
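
    As one possible setup, the Airflow sketch below (assuming Airflow 2.4+) chains the four layers so each dbt run starts only after the previous layer succeeds. The DAG id, schedule, project path, and layer tags are hypothetical.

    # Minimal sketch: an Airflow DAG running the DBT layers in order.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash import BashOperator

    with DAG(
        dag_id="dbt_layered_run",  # hypothetical DAG id
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        previous = None
        for layer in ["staging", "integration", "business", "presentation"]:
            task = BashOperator(
                task_id=f"dbt_run_{layer}",
                bash_command=f"cd /opt/dbt_project && dbt run --select tag:{layer}",
            )
            if previous:
                previous >> task  # enforce STG -> INT -> BUS -> PRS ordering
            previous = task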

    • API Integration:

    o Experience with integrating DBT Cloud or DBT Core with other systems using APIs, including triggering DBT executions via ADF custom activities or REST API calls (a minimal REST trigger is sketched below).

    o Ability to automate end-to-end data pipelines that span across Snowflake, DBT, and Azure services.
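
    The minimal sketch below triggers a DBT Cloud job over its REST API, the same call an ADF Web activity could make; the account ID, job ID, and token variable are hypothetical placeholders.

    # Minimal sketch: trigger a DBT Cloud job run via the v2 REST API.
    import os

    import requests

    ACCOUNT_ID = 12345  # hypothetical dbt Cloud account
    JOB_ID = 67890      # hypothetical dbt Cloud job

    resp = requests.post(
        f"https://cloud.getdbt.com/api/v2/accounts/{ACCOUNT_ID}/jobs/{JOB_ID}/run/",
        headers={"Authorization": f"Token {os.environ['DBT_CLOUD_TOKEN']}"},
        json={"cause": "Triggered by ADF after staging load"},
        timeout=30,
    )
    resp.raise_for_status()
    print("run id:", resp.json()["data"]["id"])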

    • Problem-Solving:

    o Strong analytical and problem-solving skills, with the ability to troubleshoot and resolve issues in data pipelines, optimize transformations, and manage bottlenecks efficiently.

    • Collaboration:

    o Experience working closely with data engineers, analytics teams, and business stakeholders to ensure alignment on data needs, model design, and data quality.

    o Ability to work cross-functionally, especially with Azure administrators, Snowflake architects, and other DBT developers, to ensure seamless data integration.

    • Documentation:

    o Ability to create and maintain comprehensive documentation for DBT models, workflows, and orchestration setups, ensuring all stakeholders are informed of data dependencies and transformations.

    • Adaptability and Innovation:

    o Willingness to continuously learn and adapt to emerging tools, techniques, and best practices in data orchestration, automation, and cloud-based data platforms.

    Languages:

    • English: B2 Upper Intermediate
