Data Engineer (Remote)

Other
Salary: Competitive Salary
Job Type: Full time
Experience: Senior Level

myGwork - LGBTQ+ Business Community

Data Engineer | myGwork – LGBTQ+ Business Community | Italy

This job is with eBay, an inclusive employer and a member of myGwork – the largest global platform for the LGBTQ+ business community. Please do not contact the recruiter directly.

At eBay, we’re more than a global ecommerce leader – we’re changing the way the world shops and sells. Our platform empowers millions of buyers and sellers in more than 190 markets around the world. We’re committed to pushing boundaries and leaving our mark as we reinvent the future of ecommerce for enthusiasts.

Our customers are our compass, authenticity thrives, bold ideas are welcome, and everyone can bring their unique selves to work – every day. We’re in this together, sustaining the future of our customers, our company, and our planet.

Join a team of passionate thinkers, innovators, and dreamers – and help us connect people and build communities to create economic opportunity for all.

About The Team And The Role

You will work in Certilogo’s Delivery & Technology division, alongside a team of highly skilled product managers, software developers, AI specialists, UX designers, and data engineers who share a passion for building great web applications, following Agile methodologies, processes, and tools.

In close collaboration with the Data & Analytics Senior Manager and the Tech Lead, you will focus on the ongoing development and maintenance of our Certilogo services.

In addition, working closely with the Innovation Lead and the Data & Analytics Senior Manager, you will operate within the Innovation team to implement the data components of the various pilots and proofs of concept that help validate ideas, engage partners, and provide insight into market acceptance, ultimately increasing the likelihood of developing and launching a successful, full-fledged feature.

What You Will Accomplish

  • You will design and develop robust, scalable, and efficient data pipelines to manage our Azure-based data lake (see the sketch following this list).
  • You will apply your expertise in writing reusable, testable, and efficient data scripts, working with data scientists and business analysts to architect and implement new data models and data integrations that support business intelligence tools such as Tableau and Domo.
  • You will use your analytical skills to perform deep dives into datasets to identify trends and insights, using Google Cloud tools to enhance data tracking and measure user engagement within our digital applications.
  • You will ensure data quality, maintainability, and scalability through rigorous data validation, testing, and refactoring.
  • You will conduct thorough requirements analysis and solution design, focusing on data modeling and database schema design within our Azure environment, and you will write the documentation supporting these activities.
  • You will troubleshoot and resolve issues related to data pipelines and data quality to maintain high availability and reliability of data services.
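
For illustration only, and not part of the role description itself, the sketch below shows the kind of pipeline work described above: a small Python job that reads a raw export, applies basic quality checks, and lands the cleaned batch as Parquet in an Azure Data Lake Storage Gen2 container. The source file, container, path, and credentials are hypothetical placeholders.

```python
# Minimal sketch of a daily ingestion step, assuming a hypothetical "orders"
# CSV export and an Azure Data Lake Storage Gen2 container named "raw".
# All names, paths, and credentials below are illustrative placeholders.
import pandas as pd
from azure.storage.filedatalake import DataLakeServiceClient


def extract(source_path: str) -> pd.DataFrame:
    """Read the raw export and normalise column names."""
    df = pd.read_csv(source_path)
    df.columns = [c.strip().lower() for c in df.columns]
    return df


def validate(df: pd.DataFrame) -> pd.DataFrame:
    """Basic quality gates: required columns present, no null keys."""
    required = {"order_id", "created_at", "amount"}
    missing = required - set(df.columns)
    if missing:
        raise ValueError(f"Missing columns: {missing}")
    return df.dropna(subset=["order_id"])


def load(df: pd.DataFrame, account_url: str, credential: str) -> None:
    """Write the cleaned batch as Parquet into the data lake's raw zone."""
    service = DataLakeServiceClient(account_url=account_url, credential=credential)
    file_system = service.get_file_system_client("raw")
    file_client = file_system.get_file_client("orders/orders.parquet")
    # to_parquet with no path returns bytes (requires pyarrow).
    file_client.upload_data(df.to_parquet(index=False), overwrite=True)


if __name__ == "__main__":
    batch = validate(extract("orders_export.csv"))
    load(batch, "https://<storage-account>.dfs.core.windows.net", "<access-key>")
```

In practice a job like this would typically be scheduled and monitored by an orchestrator such as Apache Airflow or Azure Data Factory, which appear in the requirements below.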

What You Will Bring

  • Minimum of 3 years of experience in a data engineering role, with hands-on experience maintaining large-scale databases, data warehouses, or data lakes.
  • Proficiency with BI tools such as Tableau and Domo, with a proven track record of developing actionable business insights from sophisticated datasets.
  • Strong experience with Google Tag Manager, Google Analytics, and BigQuery, particularly in monitoring and analyzing user engagement across digital platforms (see the query sketch following this list).
  • Expertise in relational SQL and NoSQL databases, with specific experience in managing and optimizing data within Azure environments.
  • Experience with data pipeline and workflow management tools such as Apache Airflow, AWS Data Pipeline, or Google Cloud Dataflow, with a preference for experience in an Azure context, such as Azure Data Factory or Azure Synapse.
  • Proficiency in programming languages used in data engineering, such as Python, Java, C++, or Scala, with a preference for Python due to its extensive use in data analytics.
  • Solid understanding of data warehousing solutions, particularly within Azure, and experience building ETL pipelines tailored to business needs.
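
As a similarly hedged illustration of the user-engagement analysis mentioned above, the snippet below counts daily active users from a GA4 BigQuery export. The project ID, analytics property ID, and date range are placeholders, not actual Certilogo resources.

```python
# Illustrative only: count daily active users from a GA4 BigQuery export.
# "my-project" and "analytics_123456789" are placeholder identifiers.
from google.cloud import bigquery

client = bigquery.Client()  # uses Application Default Credentials

query = """
    SELECT
      event_date,
      COUNT(DISTINCT user_pseudo_id) AS daily_active_users
    FROM `my-project.analytics_123456789.events_*`
    WHERE _TABLE_SUFFIX BETWEEN '20240101' AND '20240131'
    GROUP BY event_date
    ORDER BY event_date
"""

for row in client.query(query).result():
    print(row.event_date, row.daily_active_users)
```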

Please see the Talent Privacy Notice for information regarding how eBay handles your personal data collected when you use the eBay Careers website or apply for a job with eBay.

eBay is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, national origin, sex, sexual orientation, gender identity, veteran status, disability, or other legally protected status. If you have a need that requires accommodation, please contact us at talent@ebay.com. We will make every effort to respond to your request for accommodation as soon as possible. View our accessibility statement to learn more about eBay’s commitment to ensuring digital accessibility for people with disabilities.

Jobs posted with the location “Remote – United States (Excludes: HI, NM)” exclude residents of Hawaii and New Mexico.



Tagged as: remote, remote job, virtual, Virtual Job, virtual position, Work at Home, work from home

When applying, state that you found this job on the Pangian.com Remote Network.