
Senior Data Engineer

Israel - Ramat-Gan · Full-time

About The Position

The Senior Data Engineer will join the data team and work alongside other data experts (Engineers & Analysts).

In this role, you will be involved in company-wide decisions, making this a crucial position. The role gives different departments important exposure to data sources that shed light on every aspect of our product: user behavior, new feature adoption, and operational, financial, and marketing KPIs, among others.

You will work closely with various departments to address their business needs and gain an in-depth understanding of the business, which is key for any data engineer to fulfill this role properly.

Responsibilities

  • Design, architect, and build Elementor's data architecture from the ground up
  • Implement best-in-class solutions for different use cases
  • Utilize a combination of cloud-based and open source tools and frameworks to solve our most complex data problems
  • Develop tailor-made solutions as part of the data pipeline
  • Implement data pipelines and data architecture that are scalable, fault-tolerant, and support high throughput and low latency, while considering security aspects at all times
  • Consider cost optimization (FinOps) at all stages of implementation
  • Deliver solutions aligned with industry standards and best practices

Requirements

  • Relevant certification in big data architecture in one or more of the public clouds (AWS Solutions Architect / Big Data | GCP Data Engineer) - an advantage
  • 5+ years of industry experience in a similar role - an advantage
  • Familiarity with off-the-shelf ETL tools (Matillion, Stitch, Fivetran) and orchestration frameworks (Airflow, AWS AppFlow, AWS Step Functions & Data Pipeline, AWS Glue, GCP Cloud Composer & Dataflow)
  • Extensive experience in Big Data architecture at scale in the cloud (AWS | GCP) - an advantage
  • Deep understanding and experience with data lakes & data warehouses in the cloud (S3, Redshift, Cloud Storage, BigQuery)
  • Deep understanding and experience with relational (RDS) and NoSQL databases (MySQL, Elasticsearch, MongoDB, Redis)
  • Fluent in Python, with the ability to read other languages as well
  • Familiar with Bash scripting and the Linux OS
  • Experience with near real-time data streaming solutions (queuing and streaming, pub/sub paradigms: Kafka, AWS SNS / SQS, Kinesis Firehose & Streams, GCP Cloud Pub/Sub & Dataflow) - an advantage
  • Accustomed to talking to internal stakeholders, understanding their needs, and building best-in-class data architecture that solves their problems in line with short- and medium-term needs

Apply for this position