Specialist Data Engineering – Full-Time – Romania

  • Full-time
  • Multiple cities
  • 7 October 2021
  • 1 position

Description

We are looking for a Data Engineer who will collect, store, process, and manage large and disparate data sets. The role involves designing data solutions and hands-on development of data-driven applications across the Schaeffler Industrial Division.

Typical tasks include the implementation and management of data ingestion and processing pipelines, data modeling and transformation, and the provisioning of data through unified data services.

Responsibilities

  • Design and implement large-scale data solutions using Cloud technologies;
  • Apply corporate data models for integrating, exploring, and retrieving data from heterogeneous data sources and IT systems;
  • Implement and manage ETL processes, data transformations, data flows and service APIs;
  • Work with data virtualization, integration, and analytics technologies on our data platform;
  • Implement optimized data representations (views) and aggregations from multiple data sources;
  • Apply data ingestion, retention, lineage, and access policies that have been defined by our corporate Digitalization & IT units;
  • Work in interdisciplinary, cross-functional teams according to agile methodology;
  • Closely collaborate with divisional business organizations, digitalization, and IT functions.

Skills and Qualifications

  • B.Sc. or M.Sc. in Computer Science or related fields;
  • Very strong language skills in English and/or German required;
  • Proficient understanding of distributed and Cloud computing principles and technologies;
  • Experience with integration of data from multiple data sources and IT systems (e.g., ERP/SAP, DWH/SAP-BW, Web Services, CRM/Salesforce, MES, etc.);
  • Experience working with Cloud environments and services, preferably Microsoft Azure (e.g., Azure Data Factory, Data Lake, Event Hub, IoT Hub, Databricks);
  • Strong programming skills in Java and/or Python; software development experience is a plus;
  • Experience in data modeling (ER, UML) and proficiency in SQL;
  • Experience with relational databases such as Microsoft SQL Server, Oracle, and PostgreSQL;
  • Experience with non-relational databases (e.g., MongoDB), their query languages, and their fields of application;
  • Knowledge of ETL techniques and frameworks, such as ADF, PowerCenter, NiFi, Sqoop.

The following skills are considered a plus and potential candidate differentiators:

  • Proficiency in database and system administration (Linux, Windows);
  • Experience with data virtualization architectures and platforms (e.g., Denodo);
  • Experience with data warehouse design and development;
  • Know-how and practical experience with IoT, machine connectivity, and messaging infrastructure (e.g., nats.io).