About the Project
Take the next step in your career by joining a team where you will not just build pipelines but design scalable data platforms. We are looking for an experienced Data Engineer who combines strong technical expertise in Python and SQL with a mindset for optimization. You will work directly with clients to build modern Data Warehouses and Lakehouses using industry-standard tools such as Databricks and Azure Data Factory.
Responsibilities
- Design and implement pipelines that ingest, clean, and transform data into Data Lakes and Data Warehouses.
- Actively monitor and optimize pipelines for performance and cost, knowing the difference between a query that merely works and one that is efficient.
- Configure pipeline deployments and manage environment settings using IaC tools (Terraform) and CI/CD.
- Implement data quality checks (e.g., Great Expectations, dbt tests) and ensure data integrity (see the sketch after this list).
- Communicate with stakeholders to clarify requirements, and provide technical guidance and code reviews to junior engineers.
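To make the first and fourth responsibilities concrete, here is a minimal sketch of an ingest-clean-load step with a hand-rolled quality gate. It assumes a PySpark environment with Delta Lake configured; the paths, column names, and the quality rule are hypothetical and merely stand in for tools like Great Expectations or dbt tests:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_ingest").getOrCreate()

# Hypothetical raw source; in practice this could be a landing zone in ADLS or S3.
raw = spark.read.json("s3://example-bucket/raw/orders/")

clean = (
    raw
    .dropDuplicates(["order_id"])                     # de-duplicate on the business key
    .filter(F.col("order_total").isNotNull())         # drop rows failing a basic rule
    .withColumn("order_date", F.to_date("order_ts"))  # normalize the timestamp to a date
)

# A minimal quality gate: abort the run rather than load bad data downstream.
null_keys = clean.filter(F.col("order_id").isNull()).count()
assert null_keys == 0, f"{null_keys} rows arrived with a NULL order_id"

# Hypothetical curated target; "delta" assumes Delta Lake is on the classpath.
(clean.write.format("delta")
      .mode("append")
      .partitionBy("order_date")
      .save("s3://example-bucket/curated/orders/"))
```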
Tech stack
- Core Languages: Python or Scala (Proficient/OOP), SQL (Advanced/Analytical).
- Compute & Storage: Databricks, Snowflake, BigQuery, Synapse, Redshift.
- Orchestration: Azure Data Factory (ADF), Airflow, Dagster.
- Transformation: dbt (Core/Cloud), Spark/PySpark (optimization focus; see the join sketch below).
- Infrastructure: Terraform (Implementation), Docker, Kubernetes (K8s).
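As a taste of what "optimization focus" means in practice, here is a short, hedged PySpark sketch contrasting a default shuffle join with an explicitly broadcast dimension table. The table paths and column names are hypothetical, and the broadcast hint only helps when the dimension is genuinely small:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.functions import broadcast

spark = SparkSession.builder.appName("join_optimization").getOrCreate()

orders = spark.read.parquet("/data/curated/orders")   # large fact table (hypothetical)
regions = spark.read.parquet("/data/dim/regions")     # small dimension table (hypothetical)

# Works, but may sort-merge join: both sides get shuffled across the cluster.
baseline = orders.join(regions, "region_id")

# Often cheaper: filter early on the partition column (enables pruning) and
# broadcast the small dimension so the large fact table is never shuffled.
# Spark auto-broadcasts below spark.sql.autoBroadcastJoinThreshold; the hint
# forces it when statistics are missing or the threshold is set too low.
optimized = (
    orders
    .filter(F.col("order_date") == "2024-01-01")
    .join(broadcast(regions), "region_id")
)

optimized.explain()  # inspect the physical plan for a BroadcastHashJoin
```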
Requirements
- 2+ years of hands-on data engineering experience (ETL/ELT).
- Strong understanding of Data Warehousing concepts (Star/Snowflake Schema) and Data Lake principles (Medallion Architecture).
- Key platforms: proven experience with Databricks (managing clusters, notebooks), Azure Data Factory (pipelines, data flows), and Snowflake.
- Proficiency in Python (OOP, functional programming) and strong SQL skills for complex transformations (see the window-function sketch after this list).
- Practical experience with at least one major cloud (AWS, GCP, Azure) and its data services (e.g., Kinesis/Lambda, Dataflow/BigQuery, Synapse).
- Basic understanding of Infrastructure as Code (Terraform) and the ability to deploy and modify infrastructure resources independently.
- Ability to communicate technical decisions directly to clients and the team.
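For a sense of the "complex transformations" the role expects in SQL, here is a minimal sketch using Spark SQL; the table and column names are made up for illustration:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("analytical_sql").getOrCreate()

# Hypothetical in-memory data standing in for a warehouse table.
spark.createDataFrame(
    [("2024-01-01", "EU", 100.0),
     ("2024-01-02", "EU", 150.0),
     ("2024-01-01", "US", 90.0)],
    ["order_date", "region", "revenue"],
).createOrReplaceTempView("daily_revenue")

# A window function computing a running total per region: the kind of
# analytical SQL this role leans on daily.
spark.sql("""
    SELECT order_date,
           region,
           revenue,
           SUM(revenue) OVER (PARTITION BY region ORDER BY order_date) AS running_revenue
    FROM daily_revenue
    ORDER BY region, order_date
""").show()
```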
Nice to have
- Experience setting up complex modules in Terraform or Pulumi.
- Hands-on experience with Delta Lake or Apache Iceberg features (Time Travel, Schema Evolution); see the sketch after this list.
- Experience modeling data for NoSQL databases (MongoDB, DynamoDB).
- Associate-level cloud certifications (e.g., Azure Data Engineer Associate, Databricks Certified Data Engineer).
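As an illustration of the Delta Lake features mentioned above, here is a brief sketch of Time Travel and schema evolution using standard Delta Lake read/write options; the path is hypothetical and a configured Delta environment is assumed:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("delta_features").getOrCreate()

path = "/data/curated/orders"  # hypothetical existing Delta table location

# Time Travel: read the table as it looked at an earlier version...
v0 = spark.read.format("delta").option("versionAsOf", 0).load(path)

# ...or as of a point in time.
yesterday = (spark.read.format("delta")
             .option("timestampAsOf", "2024-01-01 00:00:00")
             .load(path))

# Schema Evolution: allow an append to add new columns to the table schema.
new_rows = spark.createDataFrame(
    [("o-1001", 42.0, "web")], ["order_id", "order_total", "channel"]
)
(new_rows.write.format("delta")
         .mode("append")
         .option("mergeSchema", "true")   # evolve the schema instead of failing
         .save(path))
```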
NIX is a global supplier of software engineering and IT outsourcing services. NIX teams collaborate with partners from different countries. Our specialists have experience in developing innovative projects, from e-commerce to cloud, for some of the largest companies in the world, including Fortune 500 firms in domains such as internet services and software, and finance and banking. The teams are focused on the steady development of the international IT market, our partners' businesses, and their own professional skills.
What we offer
- Competitive compensation packages.
- Stable employment, based on a full-time employment contract.
- Mentoring program and professional development plan.
- Private health insurance (Medicare Clinic).
- AYCM sport pass, providing discounts at various sports facilities in Hungary.
- Interesting tasks and diverse opportunities for developing your skills.
- Free training courses.
- Participation in internal and external thematic events and technical conferences.
- A spacious office in the heart of Budapest (13th district).
- All necessary devices and tools for your work.
- Friendly, motivating atmosphere.
- Active corporate life.
Become a part of the NIX team.