NewsWhip
Backend Engineer (Remote)
Backend Engineer | NewsWhip | Ireland
NewsWhip is the leading provider of predictive media intelligence, tracking and predicting engagement with the world’s news stories each day. Our platform is used each day by journalists at hundreds of top-tier publications, and by communications professionals at all 10 of the top 10 global PR agencies, dozens of international brands, NGOs, international organizations, and not-for-profits.
As an Irish company, our product and engineering functions are central to our business. This presents a great opportunity to have direct input into what we build and how we build it. There is a direct link between your work, our customers, and the continued success of NewsWhip.
As a Backend Engineer on our Sources and Data Platform team, you will play a pivotal role not only in how NewsWhip ingests, transforms, and enriches data at large scale, but also in building and maintaining the systems that extract, enrich, persist, and ultimately expose substantial volumes of social data to enable impactful and actionable insights.
Responsibilities:
- Maintain, extend, and scale a distributed platform that seamlessly handles billions of ingested events per day
- Build and populate data pipelines for collection, storage, processing, and analysis of data
- Build integrations with external APIs, connectors and services for collection of social media/news content and the subsequent scoring and enrichment of this content
- Advocate for improvements to data quality, security, and performance
- Build and maintain the infrastructure required for optimal extraction, transformation and loading of large data sets
- Develop and support endpoints exposing feature value from our Data Warehouse to our customer-facing Application Development teams and internal Analysts
- Interface with internal stakeholders on both technical and non-technical teams (Engineering, Product Management, Data Science)
Requirements:
- 3+ years in software development
- Deep experience with at least one modern programming language such as Java, Scala, Go, or Python
- Experience working with storage systems such as Elasticsearch, Cassandra, Kafka, and MySQL
- Experience with high volume event streaming/queueing systems (Kafka, Kinesis, RabbitMQ etc.)
- Experience building and deploying to any cloud service (GCP, AWS, Azure, etc.)
- Knowledge of software design principles and leading software development practices
- Strong communication & collaboration skills
- Willingness to get things done, learn new things, take initiative and challenge existing assumptions and conventions
- Experience building and operating distributed systems at scale
Nice to have:
- Experience with time-series datastores built on NoSQL systems (Cassandra, DynamoDB, MongoDB, etc.) or dedicated time-series databases (Druid, InfluxDB, etc.)
- Experience with the Lightbend Reactive Platform (Scala and Akka)
- Experience with distributed database and data processing technologies (Spark, Presto, etc.)
- Experience working with schedulers in a distributed, service-oriented environment (Airflow, Step Functions, Argo etc.)
- Experience with observability principles (Instrumentation, Tracing, Telemetry)
- Experience building real-time integrations with a variety of external APIs, connectors, and services (REST and streaming integrations, disparate authentication mechanisms, rate-limit handling, etc.)
- Experience with IaC and DevOps methodologies
- Knowledge of basic Linux administration, Kubernetes and Google Cloud Platform
- Experience working in an agile environment with iterative development and fast feedback
Bonus points:
- Experience with Graph Databases (neo4j, ArangoDB etc.)
- Experience with Natural Language Processing – Implementation of provided models for Named Entity/Semantic Extraction and Linking, Sentiment Analysis, Content Classification, etc.
Benefits:
- Competitive salary
- Health insurance
- Bonus for individual and team performance
- Pension
- Great working environment – remote first or hybrid model
- An opportunity to help define an entirely new industry category
- An excellent opportunity to grow in one of Dublin’s fastest-growing home-grown companies
Our ethos:
We believe in maintaining a friendly work environment, a healthy work-life balance, and compensating our employees fairly for their input. You’ll be part of a team that believes in mutual support and education, and a company where a work week isn’t just the gap between weekends, but an opportunity to do work that is impactful and innovative. We also enjoy eating and socializing together, with annual and seasonal company and team retreats, healthy (and unhealthy) snacks, and other perks.