Senior Data Engineer - Databricks (office/remote)

  • Kraków, Poland
  • Full-Time
  • Remote

Job Description:

We are looking for a skilled and proactive Senior Data Engineer with deep expertise in Databricks to join our data platform team. You will be responsible for designing, building, and optimizing scalable data pipelines and lakehouse architectures that power analytics, reporting, and machine learning across the organization.


Location: Poland (office in Krakow/Wroclaw or 100% remote) 

This role is part of a US-based project and follows the US Central Time schedule. We expect candidates to be available from 4 PM to at least 8 PM CET/CEST. Priority will be given to candidates who can meet these hours.


RESPONSIBILITIES:

  • Develop and maintain robust ETL/ELT pipelines using Databricks and Apache Spark
  • Design and implement Delta Lake architectures for structured and semi-structured data
  • Collaborate with data analysts, scientists, and product teams to deliver clean, reliable datasets
  • Optimize performance of Spark jobs and manage cluster resources efficiently
  • Automate workflows using Databricks Jobs and Workflows
  • Ensure data quality, lineage, and governance using Unity Catalog and monitoring tools
  • Document data models, pipeline logic, and architectural decisions
  • Participate in code reviews and contribute to engineering best practices

REQUIREMENTS:

  • 4+ years of experience as a Data Engineer or in a similar role
  • Strong hands-on experience with Databricks, including Delta Lake and Spark SQL
  • Proficiency in Python and SQL for data manipulation and pipeline development
  • Solid understanding of Apache Spark internals and performance tuning
  • Experience with cloud platforms (Azure, AWS, GCP)
  • Knowledge of data modeling, partitioning, and lakehouse principles
  • Ability to work with large-scale datasets and optimize storage and compute costs
  • Strong communication skills and ability to collaborate across teams
  • English proficiency - at least B2 level
  • Highly desirable: availability to work from 4 PM to at least 8 PM CET/CEST

NICE TO HAVE:

  • Experience with Azure Data Factory or Airflow
  • Proficiency with modern data warehouses and tools, including Snowflake, Synapse, and Red Gate
  • Exposure to data governance frameworks and tools (e.g., Unity Catalog, Purview)
  • Understanding of machine learning workflows and integration with data pipelines
  • Experience with BI tools (Power BI, Tableau) and supporting analytics teams
  • Contributions to open-source projects or technical blogs

WE OFFER YOU:

  • Flexible working hours - agreed within your team
  • Necessary tools and equipment
  • Communication in English - international customers and international teams only
  • Simple structure and 'open door' way of communication
  • Full-time English teachers
  • Medical insurance for employees
  • HiQo University - internal education and training programs
  • HIQO COINS - a system that rewards employees for extracurricular activities