Data Engineer with Scala

Kyiv


About the Company

Tietoevry Create is part of a leading technology company delivering innovative digital solutions across multiple industries. The company collaborates with global clients, including BICS, a worldwide telecommunications enabler, providing scalable data platforms and analytics solutions. Tietoevry Create emphasizes high-performance engineering, cloud-native architectures, and cross-functional collaboration.

About the Role

The company is seeking a Data Engineer to design, build, and optimize large-scale data pipelines for the BICS Voice and CC Value Streams. The role focuses on Spark (Scala), Databricks, and AWS cloud services, delivering high-performance data platforms for network analytics, customer insights, real-time monitoring, and regulatory reporting.

Key Responsibilities

  • Design, develop, and maintain scalable batch data pipelines using Scala, Databricks Spark, Databricks SQL, and Airflow
  • Implement optimized ETL/ELT processes to ingest, cleanse, transform, and enrich large volumes of telecom network and operational data
  • Ensure pipeline reliability, observability, and performance tuning of Spark workloads
  • Build and manage data architectures leveraging AWS services such as S3, Lambda, IAM, and CloudWatch
  • Implement infrastructure-as-code using Terraform
  • Ensure security best practices and compliance with telecom regulatory requirements (GDPR, data sovereignty, retention)
  • Collaborate with cross-functional teams including Architecture, DevOps, Network Engineering, and Business Intelligence
  • Document system designs, data flows, and best practices

Requirements

  • 4+ years of experience as a Data Engineer or Big Data Developer
  • Strong proficiency in Scala and functional programming concepts
  • Advanced experience with Apache Spark (batch processing, performance tuning, cluster optimization)
  • Experience with optimized SQL-based data transformations for analytics and machine learning workloads
  • Hands-on experience with Databricks (notebooks, jobs, Delta Lake, Unity Catalog, MLflow is a plus)
  • Solid understanding of CI/CD practices using Git with Jenkins, GitLab CI/CD, or GitHub Actions
  • Strong AWS skills including S3, Lambda, IAM, CloudWatch, and related services
  • Knowledge of distributed systems, data governance, and security best practices
  • Experience with Airflow integration for orchestration of cloud data pipelines
  • Experience with IaC tools such as Terraform or CloudFormation
  • Python, dbt, or Snowflake experience is a plus
  • Experience in the telecom sector and with agile ways of working is an advantage
  • English proficiency

Soft Skills

  • Strong analytical and problem-solving skills
  • High degree of ownership and continuous improvement mindset
  • Quality-oriented and pragmatic solution focus
  • Excellent communication and teamwork abilities
  • Ability to translate business requirements into technical solutions

Benefits

  • Opportunity to work with large-scale data platforms in the telecom industry
  • Exposure to cutting-edge technologies including Spark (Scala), Databricks, and AWS
  • Collaborative, cross-functional team environment
  • Career growth opportunities within a global technology company
  • Inclusive, diverse, and supportive workplace culture
