GCP Developer (Remote)

IT/Dev
Salary: Competitive Salary
Job Type: Full-time
Experience: Senior Level

Cymetrix

GCP Developer | Cymetrix | India

Job Title: Senior Google Cloud Data Engineer (with API Development Experience)

Location: Remote

Job Type: Full-time/Freelancer

Must-have skills:

1. GCP – GCS, Pub/Sub, Dataflow or Dataproc, BigQuery, Airflow/Composer, Python (preferred)/Java

2. ETL on GCP – building pipelines (Python/Java) plus scripting, best practices, and common challenges

3. Knowledge of batch and streaming data ingestion and of building end-to-end data pipelines on GCP (a minimal sketch follows this list)

4. Knowledge of databases (SQL and NoSQL), on-premises and in the cloud; SQL vs. NoSQL trade-offs; types of NoSQL databases (at least two)

5. Data warehouse concepts – beginner to intermediate level
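
For illustration only (not part of the original listing): a minimal sketch of the kind of GCS-to-BigQuery batch pipeline items 1–3 describe, using the Apache Beam Python SDK on Dataflow. The project, bucket, table, and schema names are placeholders; a streaming variant would read from Pub/Sub instead.

    # Hypothetical batch pipeline: read CSV rows from GCS, parse them,
    # and append them to a BigQuery table. All resource names are placeholders.
    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    def parse_row(line: str) -> dict:
        """Split a CSV line into the columns the assumed schema expects."""
        user_id, event, ts = line.split(",")
        return {"user_id": user_id, "event": event, "ts": ts}

    def run() -> None:
        opts = PipelineOptions(
            runner="DataflowRunner",      # use "DirectRunner" for local tests
            project="my-project",         # placeholder project ID
            region="us-central1",
            temp_location="gs://my-bucket/tmp",
        )
        with beam.Pipeline(options=opts) as p:
            (
                p
                | "Read" >> beam.io.ReadFromText("gs://my-bucket/events/*.csv")
                | "Parse" >> beam.Map(parse_row)
                | "Write" >> beam.io.WriteToBigQuery(
                    "my-project:analytics.events",
                    schema="user_id:STRING,event:STRING,ts:TIMESTAMP",
                    write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                )
            )

    if __name__ == "__main__":
        run()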

Key Responsibilities:

● Data Pipeline Development & Transformation: Design, develop, and maintain efficient and scalable data pipelines using Google Cloud technologies such as BigQuery, Cloud Dataflow, Cloud Composer (Airflow), Data Fusion, Pub/Sub, Cloud Storage, and Dataproc. Transform raw data into actionable insights by applying data cleansing, enrichment, normalization, and aggregation techniques.

● Data Transformation & Processing: Lead complex data transformation projects, including ETL/ELT processes, data validation, and enrichment. Optimize transformation logic for performance, scalability, and maintainability, ensuring that data is structured in a way that facilitates easy analysis and reporting.

● API Development & Integration: Develop and maintain RESTful APIs for data access and integration with internal and external systems, ensuring high availability and low latency. Ensure that APIs are secure, well documented, and easy to integrate. (A sketch of such an API follows this list.)

● Cloud Infrastructure Management: Implement and manage data infrastructure on Google Cloud, optimizing for performance, scalability, and cost-efficiency. Leverage Google Kubernetes Engine (GKE) and Cloud Functions to deploy containerized services and event-driven functions.

● Collaboration & Mentorship: Work closely with cross-functional teams to understand data needs and deliver solutions that address business requirements. Mentor junior engineers, sharing your expertise in best practices for cloud data engineering, API design, and data transformation.

● Documentation & Reporting: Maintain thorough documentation for data pipelines, infrastructure, data transformations, and APIs. Provide regular updates to stakeholders on project status, challenges, and outcomes.
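
The listing names no API framework; purely as an illustration of the RESTful API work described in the list above, here is a minimal FastAPI endpoint with JWT bearer authentication and a versioned path. FastAPI, PyJWT, and every name in the sketch are assumptions.

    # Hypothetical data-access endpoint with JWT bearer auth. The secret,
    # paths, and payloads are placeholders, not details from the listing.
    import jwt  # PyJWT
    from fastapi import Depends, FastAPI, HTTPException
    from fastapi.security import HTTPAuthorizationCredentials, HTTPBearer

    app = FastAPI(title="events-api")
    bearer = HTTPBearer()
    SECRET = "replace-me"  # in practice, load from Secret Manager

    def current_user(creds: HTTPAuthorizationCredentials = Depends(bearer)) -> str:
        """Validate the bearer token; assumes it carries a 'sub' claim."""
        try:
            claims = jwt.decode(creds.credentials, SECRET, algorithms=["HS256"])
        except jwt.InvalidTokenError:
            raise HTTPException(status_code=401, detail="invalid or expired token")
        return claims["sub"]

    @app.get("/v1/events/{event_id}")  # versioned path, per the practices above
    def get_event(event_id: str, user: str = Depends(current_user)) -> dict:
        # Placeholder response; a real handler would query BigQuery or similar.
        return {"event_id": event_id, "requested_by": user}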

Required Skills & Qualifications:

● Experience with Google Cloud Platform (GCP): Proficiency with GCP services, including BigQuery, Cloud Storage, Dataflow, Pub/Sub, Cloud Functions, Cloud Composer (Airflow), and Kubernetes Engine (GKE).

● Data Engineering & Transformation Expertise: Strong experience in building and optimizing ETL/ELT pipelines, performing data cleansing, transformation, and normalization tasks. Expertise in working with large datasets and ensuring that data is structured for efficient analysis and reporting.

● API Development: Extensive experience in designing, developing, and maintaining RESTful APIs. Familiarity with OAuth, JWT, and best practices in API versioning, documentation, and security.

● Distributed Systems & Cloud Architecture: Knowledge of distributed systems, microservices, and cloud-native application architecture. Hands-on experience with cloud infrastructure management (compute, storage, networking).

● Data Modeling & Data Warehousing: Experience in building and maintaining data warehouses, designing optimized schemas, and working with data lakes or large-scale data storage solutions.

● Automation & Orchestration: Familiarity with workflow orchestration tools such as Airflow and automation frameworks. Experience in automating cloud infrastructure deployments using Terraform, Cloud Deployment Manager, or similar tools. (A minimal Airflow sketch follows this list.)

● Collaboration Tools: Comfortable working with Git, Jira, and Confluence for version control, issue tracking, and documentation.

● Problem Solving: Strong analytical skills, with a proven ability to troubleshoot complex issues and deliver effective solutions.
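
As a hedged sketch of the orchestration experience listed above (assuming Cloud Composer with Airflow 2.4+ and the Google provider package installed): a daily GCS-to-BigQuery load. The DAG ID, bucket, and table are hypothetical.

    # Hypothetical Composer/Airflow DAG: load the day's CSV drop into BigQuery.
    from datetime import datetime

    from airflow import DAG
    from airflow.providers.google.cloud.transfers.gcs_to_bigquery import (
        GCSToBigQueryOperator,
    )

    with DAG(
        dag_id="daily_events_load",
        start_date=datetime(2024, 1, 1),
        schedule="@daily",  # Airflow 2.4+; older versions use schedule_interval
        catchup=False,
    ) as dag:
        load_events = GCSToBigQueryOperator(
            task_id="load_events",
            bucket="my-bucket",                        # placeholder bucket
            source_objects=["events/{{ ds }}/*.csv"],  # partitioned by run date
            destination_project_dataset_table="my-project.analytics.events",
            source_format="CSV",
            write_disposition="WRITE_APPEND",
        )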

Preferred Qualifications:

● Cloud Certifications: Google Cloud Professional Data Engineer or Professional Cloud Architect certification is a plus.

● Security & Compliance: Understanding of data security best practices, compliance frameworks (e.g., GDPR, CCPA), and encryption techniques in cloud environments.

● CI/CD Pipeline Experience: Experience setting up continuous integration and deployment pipelines for data solutions and APIs. (See the test sketch after this list.)
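
The listing gives no CI/CD specifics; as one illustration, this is the kind of unit test a continuous-integration pipeline for a data solution might run on every commit, exercising the hypothetical parse_row transform from the batch sketch above with Beam's testing utilities.

    # Hypothetical pytest-style check of pipeline transform logic.
    import apache_beam as beam
    from apache_beam.testing.test_pipeline import TestPipeline
    from apache_beam.testing.util import assert_that, equal_to

    def parse_row(line: str) -> dict:
        user_id, event, ts = line.split(",")
        return {"user_id": user_id, "event": event, "ts": ts}

    def test_parse_row_transform() -> None:
        with TestPipeline() as p:
            rows = (
                p
                | beam.Create(["u1,click,2024-01-01T00:00:00"])
                | beam.Map(parse_row)
            )
            assert_that(
                rows,
                equal_to([{"user_id": "u1", "event": "click",
                           "ts": "2024-01-01T00:00:00"}]),
            )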

When applying, state that you found this job on the Pangian.com Remote Network.