Senior Data Engineer with Scala

Berlin

Diconium

About the Company

Diconium is a global digital business transformation company that delivers innovative strategies and solutions across various industries. The company helps clients, including Fortune 500 firms, global market leaders, and SMEs, maximize the impact of their digital initiatives using software, data, and AI. With a focus on human connections, collaboration, and inclusivity, Diconium fosters a supportive work environment and provides maximum flexibility through a hybrid workplace.

The primary location for this role is Berlin.

About the Role

The Senior Data Engineer role focuses on designing and developing scalable data pipelines, improving data processing efficiency, and supporting clients in creating smart data products.

Key Responsibilities

  • Design and develop scalable data pipelines to process and analyze large volumes of data
  • Evaluate and implement new data engineering technologies and tools
  • Support clients in accessing and utilizing data effectively across multiple domains (mobility, automotive, industrial, consumer, financial, non-profit)
  • Collaborate with Data Scientists and Analysts to identify data sources and develop appropriate solutions
  • Share expertise, mentor team members, and provide technical leadership within the team
  • Develop and maintain documentation for data engineering processes, best practices, and technologies

Requirements

  • At least 5 years of experience in Data Engineering
  • Proficiency in at least one programming language such as Python, Java, or Scala
  • Experience with Big Data technologies such as Hadoop, Spark, or Kafka
  • Knowledge of batch and real-time data processing frameworks (Apache Spark, Apache Flink, or similar)
  • Hands-on experience with cloud-based data platforms (Azure, AWS, GCP)
  • Understanding of data modeling and data architecture
  • Experience providing technical leadership and advising clients on technical concepts

Nice to Have

  • Experience developing machine learning pipelines and models
  • Knowledge of agile project management methodologies (SAFe, Scrum, Kanban)
  • Familiarity with DataOps and DevOps practices
  • Experience implementing data quality and data governance frameworks

Benefits

  • Professional and personal growth through training programs, language courses, competence centers, and an active tech community
  • Flexible work-life balance including hybrid work, flexible hours, workation, parental support, and sabbaticals
  • Opportunities to engage in sustainability initiatives, diverse communities, and after-work activities
  • Comprehensive benefits including public transport tickets, job bikes, health offers, supplementary insurance, pension plan, and various discounts
