Data Stream Processing Engineer

UNIBERG is a modern, innovative consulting company with outstanding technology expertise in carrier-grade IP and VoIP networks. UNIBERG is extending its fast-growing web-based application services for the telecommunications industry.

We are looking for enthusiastic, technology-driven programmers to join our development and engineering teams in Frankfurt or Darmstadt, Germany. Suitable candidates enjoy developing software in an agile team and are motivated to learn. We offer a permanent (open-ended) contract.

Employees at UNIBERG benefit from working at a company leading in the following technology fields: Performance Management, IP Technology, SDN, Virtualization, Large-Scale Database Technologies, Routing, Test Automation, IMS, and VoIP. You will join a highly motivated team of engineers and developers providing solutions at the leading edge of networking technologies, methodologies, and their scaling requirements.

We provide a working environment that keeps you creative and productive, but also comfortable, so that you will enjoy your everyday tasks. Our social community is built on trust, friendship, and fairness. We take care of nature and support voluntary social engagement. We especially encourage female candidates to apply, because gender equality matters to us.

We are looking for a Data Stream Processing Engineer with the following qualifications:

The engineer should enjoy analyzing massive data, finding patterns, and performing statistical analysis. The data stream processing engineer must be able to set up, adapt, and develop streaming environments for massive data retrieval and processing, including annotation and normalization. They must be able to write concepts and to visualize problems, solutions, and methodologies in presentations, even for a larger audience. The data stream processing engineer must understand KPIs for networking, application, virtualization, or data-center environments. Daily work will include programming individual solutions for analyzing data, normalizing it, producing statistical information, and presenting the results.

Must-have knowledge:
• Strong mathematical and statistical analysis background
• Experience with analysis tools such as RapidMiner, Numenta/NuPIC, or equivalent
• Proven experience with stream processing in Spark (a short illustrative sketch follows this list)
• Database technologies: Hadoop, Cassandra, and/or Solr
• Streaming technologies based on the Kafka message bus
• Programming in C++ and Java
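
To give candidates a feel for this kind of work, here is a minimal, illustrative sketch, not production code: a PySpark Structured Streaming job that reads JSON KPI records from Kafka, normalizes one metric, and computes per-device averages. The broker address, topic name, and record schema are assumptions made up for this example.

# Minimal sketch only: broker, topic, and schema below are illustrative
# assumptions, not a real deployment. Running it requires the
# spark-sql-kafka connector package on the Spark classpath.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import (StructType, StructField, StringType,
                               DoubleType, TimestampType)

spark = SparkSession.builder.appName("kpi-stream-sketch").getOrCreate()

# Illustrative schema for incoming KPI messages.
schema = StructType([
    StructField("device", StringType()),      # e.g. a router or host ID
    StructField("latency_ms", DoubleType()),  # one example KPI
    StructField("ts", TimestampType()),       # event time
])

raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "localhost:9092")  # assumed broker
       .option("subscribe", "network-kpi")                   # hypothetical topic
       .load())

# Kafka delivers raw bytes; parse the JSON payload into typed columns.
parsed = (raw
          .select(F.from_json(F.col("value").cast("string"), schema).alias("r"))
          .select("r.*"))

# Normalize (ms -> s) and aggregate a per-device average over 1-minute
# event-time windows, tolerating up to 2 minutes of late data.
stats = (parsed
         .withColumn("latency_s", F.col("latency_ms") / 1000.0)
         .withWatermark("ts", "2 minutes")
         .groupBy(F.window("ts", "1 minute"), "device")
         .agg(F.avg("latency_s").alias("avg_latency_s")))

query = (stats.writeStream
         .outputMode("update")
         .format("console")  # a real job would write to a durable sink
         .start())
query.awaitTermination()

The sketch only shows the Kafka-to-Spark pattern named in the requirements; a real solution would read from actual topics and write to a durable sink rather than the console.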

Nice to have would be knowledge of these technical frameworks and technologies:
• Highcharts, dygraphs, D3, or equivalent charting engines
• Syslog processing and analysis
• NetFlow collection, processing, and analysis
• Python programming

 
