Remote Data Engineer needed for a technology company that helps online media businesses get the most from their work.
This Jobot Job is hosted by: Patrick Murray
Are you a fit? Easy Apply now by clicking the "Apply Now" button and sending us your resume.
Salary: $100,000 - $150,000
A bit about us:
We are looking for a Data Engineer to help develop and own data solutions at our fast-paced online media, ad services, and revenue company. The Data Engineer role is an excellent opportunity for someone who wants to join a close-knit, smart team at a rapidly growing business. If you are a highly independent worker with excellent organizational and problem-solving skills, this is the job for you.
If you are a Data Engineer who can work remotely, please apply today!
Why join us?
- Competitive salary
- Great benefits
- Career growth
- Modern, startup culture
- Friendly work environment
- Remote employment
- Long term learning opportunities
Job Details
The position requires extensive hands-on experience building and scaling data pipelines and the corresponding data stores for reporting. We are looking for someone with experience using Airflow, AWS Glue, GCP Dataflow (Beam or Flink), or Spark, and data stores such as Snowflake or Cassandra (or ScyllaDB), to join our fast-paced online media, ad services, and revenue technology company! Expected language competency is Python, Java, Scala, or R; Ruby and/or Go are nice to have.
Responsibilities:
- Develop and maintain ETL/ELT processes at scale, handling hundreds of millions of records per day
- Write advanced SQL for data wrangling, as well as scheduled reporting and automated queries
- Perform data exploration and analytics with the goal of developing new insights and business value
- Help develop APIs and dashboards to surface data insights
Qualifications:
- 3+ years of relevant data engineering experience
- Hands-on implementation and maintenance of a large-scale, high-volume data pipeline
- Knowledge of one or more programming languages for data processing (e.g. Python, Scala, R) and related libraries and tools (e.g. Dataflow)
- Experience with Ruby and/or Go is preferred
- In-depth knowledge of relational databases (e.g. PostgreSQL, MySQL) and NoSQL databases (e.g. Druid, Cassandra), as well as analytics using BigQuery
- Familiarity with containers and related technologies, such as Docker and Kubernetes, and working with public cloud (e.g. AWS or Google Cloud Platform)
- Knowledge of machine learning and time-series forecasting methodologies
Interested in hearing more? Easy Apply now by clicking the "Apply Now" button.