Iitjobs Inc.
Senior Data Engineer (Remote)
Senior Data Engineer | Iitjobs Inc. | Worldwide
Role & responsibilities
Collaborate with stakeholders, data architects, and other team members to understand
data requirements and translate them into scalable data engineering solutions on GCP.
Design and develop data pipelines, ETL (Extract, Transform, Load) processes, and data
integration workflows using GCP services such as Dataflow, Pub/Sub, BigQuery, Cloud
Storage, and others (see the illustrative sketch after this list).
Implement data transformation and data cleansing operations to ensure data quality and
consistency throughout the data pipelines.
Build and manage data storage systems, including databases, data lakes, and data
warehouses, leveraging GCP services like BigQuery, Cloud Storage, and Cloud Spanner.
Optimize data pipelines and data storage systems for performance, scalability, and
cost-effectiveness, considering factors such as data volume, velocity, variety, and quality.
Implement data security measures and ensure compliance with data governance and
privacy policies.
Monitor, troubleshoot, and optimize data pipelines and data infrastructure to ensure the
availability, reliability, and efficiency of data processing.
Collaborate with DevOps teams to design and implement monitoring, logging, and
alerting solutions for data pipelines and infrastructure.
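As a rough illustration of the kind of pipeline work described above, here is a minimal Apache Beam sketch in Python that reads CSV files from Cloud Storage, applies a toy cleansing step, and appends rows to BigQuery via Dataflow. It is not part of the role description, and all project, bucket, dataset, and table names are hypothetical placeholders.

```python
# Minimal illustrative Beam pipeline; all resource names are hypothetical.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_and_clean(line):
    """Split a CSV line and drop malformed or empty-id records (toy cleansing rule)."""
    parts = line.split(",", 1)
    if len(parts) == 2 and parts[0].strip():
        yield {"id": parts[0].strip(), "value": parts[1].strip()}


options = PipelineOptions(
    runner="DataflowRunner",                   # run on GCP Dataflow; use DirectRunner locally
    project="example-project",                 # hypothetical project id
    region="us-central1",
    temp_location="gs://example-bucket/tmp",   # hypothetical bucket
)

with beam.Pipeline(options=options) as p:
    (
        p
        | "Read" >> beam.io.ReadFromText("gs://example-bucket/input/*.csv")
        | "CleanAndTransform" >> beam.FlatMap(parse_and_clean)
        | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
            "example-project:example_dataset.example_table",
            schema="id:STRING,value:STRING",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
        )
    )
```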
Preferred candidate profile
At least 3 years of experience with Google Cloud Platform (especially BigQuery).
Experience with Java, Python, and Google Cloud SDK and API scripting.
Experience with GCP migration activities is an added advantage.
Strong knowledge of Google Cloud Platform (GCP) services and technologies, including
but not limited to Dataflow, Pub/Sub, BigQuery, Cloud Storage, containers (Docker), and
Cloud Spanner.
Proficiency in programming languages such as Python, SQL, or Java for data processing
and scripting tasks.
Experience in designing and building data pipelines and ETL processes using GCP data
engineering tools.
Familiarity with data modeling, schema design, and data integration techniques.
Knowledge of data warehousing concepts and experience with data warehouse solutions
like BigQuery (see the query sketch after this list).
Experience with version control systems (e.g., Git), CI/CD pipelines (e.g., Jenkins), and
infrastructure automation tools (e.g., Terraform).
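For context on the BigQuery experience mentioned above, a minimal sketch using the google-cloud-bigquery Python client; the project, dataset, and table names are hypothetical placeholders.

```python
# Illustrative only: running a query with the google-cloud-bigquery client.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # hypothetical project id

query = """
    SELECT status, COUNT(*) AS n
    FROM `example-project.example_dataset.orders`
    GROUP BY status
    ORDER BY n DESC
"""

for row in client.query(query).result():  # blocks until the query job finishes
    print(f"{row.status}: {row.n}")
```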
Soft Skills
Excellent verbal and written communication skills.
Attention to detail
Strong problem-solving abilities
Ability to adapt to changing requirements and technologies while ensuring accuracy and
precision in data analysis, report design, and development.
Perks and benefits
Remote
Contract Position