DevOps Engineer for Data Domain AAA (Remote)

Other
Salary: Competitive Salary
Job Type: Full time
Experience: Senior Level

Deutsche Telekom IT Solutions HU

DevOps Engineer for Data Domain AAA | Deutsche Telekom IT Solutions HU | Hungary

Company Description

The largest ICT employer in Hungary, Deutsche Telekom IT Solutions (formerly IT-Services Hungary, ITSH) is a subsidiary of the Deutsche Telekom Group. Established in 2006, the company provides a wide portfolio of IT and telecommunications services with more than 5,000 employees. ITSH was awarded the Best in Educational Cooperation prize by HIPA in 2019, acknowledged as one of the most attractive workplaces in PwC Hungary’s independent survey in 2021, and received the title of Most Ethical Multinational Company in 2019. The company continuously develops its four sites in Budapest, Debrecen, Pécs and Szeged and is looking for skilled IT professionals to join its team.

Job Description

We are seeking a talented and experienced DevOps Engineer with expertise in Kubernetes to join our team. As a DevOps Engineer, you will play a crucial role in managing deployments, automating processes, and facilitating seamless communication with our development team based in India. Your primary responsibility will be to ensure the reliability and efficiency of our Data Integration Layer (DIL) application running on a Kubernetes platform provided as a Container-as-a-Service (CaaS).

Key Responsibilities:

  • Kubernetes Expertise:
    • Leverage your deep knowledge of Kubernetes to effectively manage applications on the CaaS platform.
    • Collaborate with the CaaS provider to optimize cluster configurations.
  • Deployment Management:
    • Oversee the deployment pipeline, ensuring smooth and efficient releases.
    • Coordinate closely with development teams to streamline the deployment process.
  • Automation:
    • Develop and maintain automation scripts and tools for operational efficiency.
  • Collaboration:
    • Work closely with our Indian development team, providing support and maintaining open lines of communication.
    • Promote a collaborative DevOps culture within the organization.
  • Continuous Integration/Continuous Deployment (Magenta CI/CD):
    • Implement and enhance CI/CD pipelines for applications hosted on the CaaS platform.
    • Monitor and optimize the CI/CD process for reliability and speed.
  • Monitoring and Troubleshooting:
    • Implement monitoring solutions (CaaS resources / DIL pipelines).
    • Identify and resolve issues related to application performance and reliability.
  • Security:
    • Ensure the security of applications and data within the Kubernetes clusters.
    • Implement security best practices and conduct security assessments.
    • Knowledge of PSA/SoCs.
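The deployment-management and monitoring duties above revolve around Kubernetes Deployment objects with health probes. As a rough illustration only (the application name, image registry, and probe path below are hypothetical, not taken from this posting), such a manifest can be generated programmatically:

```python
import json

def make_deployment(name: str, image: str, replicas: int = 2) -> dict:
    # Build a minimal Kubernetes Deployment manifest (apps/v1) as a dict.
    # All concrete values here are illustrative assumptions.
    return {
        "apiVersion": "apps/v1",
        "kind": "Deployment",
        "metadata": {"name": name, "labels": {"app": name}},
        "spec": {
            "replicas": replicas,
            "selector": {"matchLabels": {"app": name}},
            "template": {
                "metadata": {"labels": {"app": name}},
                "spec": {
                    "containers": [{
                        "name": name,
                        "image": image,
                        # A readiness probe supports the monitoring and
                        # reliability responsibilities listed above.
                        "readinessProbe": {
                            "httpGet": {"path": "/healthz", "port": 8080},
                            "initialDelaySeconds": 5,
                        },
                    }],
                },
            },
        },
    }

manifest = make_deployment("dil", "registry.example.com/dil:1.0.0")
print(json.dumps(manifest, indent=2))
```

In practice this kind of manifest is usually kept as versioned YAML and rolled out through the CI/CD pipeline rather than generated by hand.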

Qualifications

  • Experience as a DevOps Engineer with a focus on Kubernetes.
  • Proficiency in English, both written and verbal.
  • Knowledge of DevOps principles and best practices.
  • Experience with containerization technologies.
  • Proficiency in automation and configuration management tools.
  • Familiarity with cloud platforms.
  • Excellent problem-solving and communication skills.

If you are a proactive, self-motivated DevOps Engineer with a passion for Kubernetes and a desire to collaborate in a global environment, we would love to hear from you. Join us in optimizing our applications and ensuring their success on our Kubernetes-based CaaS platform.

Additional Information

DIL description: The Data Integration Layer is a user interface-based data integration platform with batch and streaming data processing capabilities. DIL enables any data engineer or data scientist to quickly build data pipelines (data transfers between source and target systems) via a visual exploration tool. DIL captures user inputs and generates the appropriate code to run the pipeline in the Extract, Transform & Load (ETL) engine, with advanced features such as a logs visualizer, governance, collaborative development support and metrics (statistics) to accelerate use cases on one platform.

DIL is also used in the Data Tribe to initiate and foster the migration to GCP (Google Cloud Platform).
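The DIL description above centers on the Extract, Transform & Load pattern: move rows from a source system to a target system, cleaning them along the way. A minimal sketch of that pattern in plain Python (the rows, field names, and in-memory target are all hypothetical; in DIL the equivalent code is generated from the visual tool):

```python
def extract():
    # Stand-in source system: rows with untrimmed string values.
    return [{"id": 1, "value": " 10"}, {"id": 2, "value": "20 "}]

def transform(rows):
    # Clean and type-convert each row.
    return [{"id": r["id"], "value": int(r["value"].strip())} for r in rows]

def load(rows, target):
    # Stand-in target system: an in-memory list.
    target.extend(rows)

target = []
load(transform(extract()), target)
print(target)  # [{'id': 1, 'value': 10}, {'id': 2, 'value': 20}]
```

A real pipeline would swap the in-memory stubs for database or streaming connectors and add the logging and metrics features the platform provides.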

  • Please be informed that our remote working possibility is only available within Hungary due to European taxation regulations.

When applying, state that you found this job on the Pangian.com Remote Network.