Software Engineer GCP (Remote)

IT/Dev
Salary: Competitive Salary
Job Type: Full time
Experience: Senior Level

Aexonic


Software Engineer GCP | Aexonic | India

Hi Candidate,

I hope this message finds you well.

We are currently seeking an experienced Senior Big Data Engineer (GCP) to join our dynamic team, and based on your profile, we believe you could be a great fit for the role! 

Please find the job details below:

Position: Senior Big Data Engineer (GCP)

Experience: 5+ years

Shift Time: EST shift, 5:30 PM to 2:30 AM

Work Mode: Remote / Hybrid, Full Time

We are seeking a highly skilled Senior Big Data Engineer with extensive experience in Big Data technologies, including Hadoop, Hive, and PySpark, as well as expertise in Google Cloud Platform (GCP) and BigQuery. The successful candidate will be responsible for designing, building, and maintaining robust data pipelines that support large-scale data processing and analytics. This role requires excellent data analysis skills and the ability to architect solutions for cloud-based environments, with a focus on extracting, processing, and analyzing data from diverse sources.

Key Responsibilities:

  • Design, develop, and optimize Big Data pipelines and workflows using Hadoop, Hive, PySpark, and Google BigQuery.
  • Architect data solutions to facilitate smooth data migration and processing between on-premises systems and Google Cloud Platform (GCP).
  • Create and manage Unix Shell Scripts for process automation and data handling tasks.
  • Perform in-depth data analysis and work comfortably with large datasets to derive actionable insights.
  • Collaborate effectively with cross-functional teams to understand data requirements and translate them into scalable data solutions.
  • Ensure data quality, integrity, and security across all stages of the data lifecycle.
  • Stay updated with the latest industry trends and emerging technologies to continuously improve data solutions.

Mandatory Qualifications:

  • Minimum of 5 years of industry experience working with Big Data technologies.
  • Proficiency in Hadoop, Hive, and PySpark for data processing.
  • Strong experience with Unix Shell Scripting.
  • Expertise in SQL and extracting data from multiple sources.
  • Demonstrated experience in data analysis, specifically querying and analyzing large datasets on Hadoop HDFS using Hive and Spark.
  • Proven experience in architecting Big Data pipelines and data solutions.
  • Proficiency in Google BigQuery and migrating data pipelines to GCP.
  • Exceptional communication skills for clear requirement gathering and collaboration.

Preferred Qualifications:

  • Google Cloud Platform (GCP) certification.
  • Professional experience with cloud hosting platforms, with a preference for GCP.

Interested candidates, please share your profile by EOD with the details below:

Current CTC:

Expected CTC:

Notice Period:

Any offer in hand:

Reason for job change:

Bhagyashri Patil

IT Recruiter

Aexonic Technologies Pvt. Ltd.

Kharadi, Pune, India

Cell: +91-9860490420

b.patil@aexonic.com | www.aexonic.com


Tagged as: remote, remote job, virtual, Virtual Job, virtual position, Work at Home, work from home

When applying state you found this job on Pangian.com Remote Network.