Data Engineer (m/f/d)
Contract
Brussels, Belgium
23.04.2025
Data Engineer
The Data Engineer role is highly technical and demands broad expertise across software development and programming. Data Engineers combine a deep understanding of data analysis with end-user and business requirements analysis, enabling them to grasp business needs clearly and translate them into technical solutions. They are well-versed in physical database design principles and the system development life cycle, and they work effectively in team environments.
Primary Responsibilities
- Data Pipelines: Design, develop, construct, test, and maintain comprehensive data management and processing systems.
- Data Transformation: Aggregate and transform raw data from diverse sources to meet both functional and non-functional business requirements.
- Data Ingestion: Identify opportunities for data acquisition and explore innovative ways to utilize existing data.
- Data Architecture & Models: Create data models to simplify system complexity, enhance efficiency, and reduce costs.
- Performance Optimization & Monitoring: Automate processes, optimize data delivery, and redesign the entire architecture to boost performance.
- Data Quality: Suggest improvements to enhance the quality, reliability, and efficiency of the entire system.
- Data Value: Develop solutions by integrating various programming languages and tools.
- Team Collaboration: As a senior member, communicate effectively and support team members by providing technical and architectural solutions.
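To illustrate the data-transformation duties listed above, here is a minimal, hypothetical Python sketch of aggregating raw records from two sources with differing schemas into one common shape. All field names and values are invented for illustration and are not part of the posting:

```python
from collections import defaultdict

# Hypothetical raw records from two upstream sources with different schemas
# (schemas and values are illustrative assumptions, not from the posting).
source_a = [
    {"customer": "acme", "amount_eur": "120.50"},
    {"customer": "globex", "amount_eur": "75.00"},
]
source_b = [
    {"client_name": "ACME", "value_cents": 4300},
]

def normalize(record):
    """Map either source schema onto one (customer, amount_eur) shape."""
    if "amount_eur" in record:
        return record["customer"].lower(), float(record["amount_eur"])
    return record["client_name"].lower(), record["value_cents"] / 100

def aggregate(records):
    """Sum normalized amounts per customer across all sources."""
    totals = defaultdict(float)
    for rec in records:
        key, amount = normalize(rec)
        totals[key] += amount
    return dict(totals)

totals = aggregate(source_a + source_b)
print(totals)  # acme: 120.50 + 43.00 = 163.50; globex: 75.00
```

In a production pipeline this kind of normalize-then-aggregate step would typically run inside a framework such as Spark or be orchestrated by Airflow, both of which appear in the competencies below.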
Competencies
a. Store:
- Data Modelling
- Data Architecture
- Airflow
- AWS
- Big Data Framework / Hadoop: HDFS, Squid, Spark, Conda, YARN, and MapReduce
- NoSQL Databases: Cassandra, HBase, MongoDB
b. Access & Transport – Connectivity:
- ETL (Extract, Transform, Load): Informatica
- Big Data Framework / Hadoop: Flume & Sqoop, YARN, ZooKeeper
c. Enrich:
- Real-time processing framework: Apache Spark
- Big Data Framework / Hadoop: Pig, Hive
- SQL and NoSQL
- Machine learning (nice to have): Python & algorithms
d. Provision:
- Workflow
- Programming: Java, Python, and Scala
e. Development methodologies:
- Agile: SAFe or the Spotify model
- DataOps
f. Ancillary capabilities:
- Strong communication skills
- Problem-solving
- Teamwork
- Innovation