Company Neurio Technology

Name Senior Data Engineer - Clean Energy - Neurio Technology

Req # 55859

Location TBD

Employment Type Full Time

Shift 1st

Are you interested in joining a high-growth company and software team in the Clean Energy industry? 

Do you want to work with massive amounts of real-time energy data, and the latest technologies in IoT, machine learning, big data, and SaaS? 

Come join the Clean Energy Team in our mission to accelerate the adoption of renewable energy and create a more intelligent home!

Neurio, a subsidiary of Generac Power Systems, is looking for Data Engineers to contribute to our technical vision and to design and build our new cloud data store and data intelligence pipeline.

Our software combines intermittent energy sources like solar with residential storage systems in order to build a reliable, environmentally sustainable electricity grid. Our applications help homeowners control their appliances and renewable energy sources for energy savings, help them manage and understand their homes’ energy consumption, help installers manage large fleets of devices, and much more.

As a senior member of the team, you will have significant responsibility and influence in shaping its future direction. We are looking for someone to iterate quickly on all stages of the data pipeline, including bringing data intelligence products to production.

You should have deep expertise in the design, creation, management, and business use of large datasets across a variety of data platforms. You should have excellent business and interpersonal skills, enabling you to work with business owners to understand data requirements and to build highly scalable time-series and data lake systems.

Successful candidates will have strong engineering and communication skills, as well as a belief that data-driven processes lead to great products. You will need a passion for quality and an ability to understand complex systems.

Above all, you should be passionate about working with huge datasets and love bringing data together to answer business questions and drive growth.

As part of this role, you will be required to: 

• Analyze and extract relevant information from large amounts of streaming data to create an event-driven data pipeline.

• Establish scalable, efficient, automated processes for large-scale data analysis, model development, validation, and implementation, specifically productizing ML applications.

• Propose and validate approaches for real-time streaming.

• Work closely with scientists and engineers to create and deploy new features.

What will you be required to have?

Bachelor’s Degree in Computer Science or related field
Production experience with real-time streaming technologies (Flink, Spark, Samza, etc.)
3+ years of production experience with Java/Scala and Python
1 to 3+ years of experience developing big data pipelines using technologies like Kafka or Kinesis
1 to 3+ years of experience with large-scale data warehousing, mining, or analytics systems
1 to 3+ years of experience with various AWS services (EC2, S3, Lambda, Kinesis, EMR, Redshift, etc.)
Experience building scalable infrastructure software or distributed systems for commercial online services

What else will you need to be successful?

Strong knowledge of SQL and NoSQL data stores
Experience with YARN and Kubernetes
Experience with orchestration tools such as Apache Airflow
Background in Machine Learning and productizing Data Intelligence pipelines
Strong communication skills and commitment to teamwork
Sharp analytical abilities and proven design skills
Strong sense of ownership, urgency, and drive
Proven leadership abilities in an engineering environment, driving operational excellence and best practices

“We are an equal opportunity employer and all qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability status, protected veteran status, or any other characteristic protected by law.”