About the Project
- We are seeking a Senior Data Engineer to lead the design and implementation of scalable data pipelines using Databricks, Snowflake, cloud-native services, and distributed data processing frameworks. You will work across various cloud platforms (AWS, GCP, Azure), contributing to architecture decisions and the adoption of lakehouse patterns. The role involves technical leadership on client projects, ensuring adherence to best practices in data modeling, pipeline orchestration, security, and performance optimization.
Responsibilities
- Architect and maintain end-to-end data pipelines (batch and streaming) using tools such as Databricks, Apache Spark, Snowflake, and dbt.
- Implement lakehouse architectures using technologies like Delta Lake, Apache Iceberg, and Parquet on cloud storage.
- Design and manage ingestion workflows from APIs, event streams, databases, and cloud storage using native services.
- Optimize data solutions for performance, reliability, cost, and scalability across different cloud environments.
- Lead and mentor mid-level and junior engineers, and collaborate with cross-functional teams across engineering, product, and analytics.
Requirements
- 4+ years of hands-on experience in data engineering, with a strong background in cloud-based pipelines and distributed processing.
- Advanced Python skills; Scala proficiency is a plus.
- Deep experience with either Databricks or Snowflake:
  - Databricks: familiarity with Jobs, Workspaces, Notebooks, Unity Catalog, and Spark internals.
  - Snowflake: hands-on experience with data modeling, warehouse optimization, Streams & Tasks, Time Travel, Cloning, and RBAC.
- Strong knowledge of SQL, dimensional modeling, OLTP/OLAP, and SCDs (slowly changing dimensions).
- Deep experience with cloud platforms and their data services:
  - AWS: Glue, Athena, Lambda, DMS, ECS, EMR, Kinesis, S3, RDS.
  - GCP: Dataflow, BigQuery, Cloud Functions, Datastream, Pub/Sub, Dataproc, Dataprep.
  - Azure: Data Factory, Synapse Analytics, Azure Functions, Data Explorer, Event Hubs, Data Wrangler.
- Experience with streaming architectures and real-time data processing.
- Familiarity with NoSQL systems (MongoDB, DynamoDB, Cosmos DB).
- Knowledge of CI/CD for data pipelines, version control, and testing frameworks.
NIX is a global supplier of software engineering and IT outsourcing services.
NIX teams collaborate with partners from different countries. Our specialists have experience delivering innovative projects, from e-commerce to cloud, for some of the largest companies in the world, including members of the Fortune 500. Our teams focus on the steady development of the international IT market, our clients' businesses, and their own professional skills.

What we offer
- Competitive compensation packages.
- Stable employment, based on a full-time employment contract.
- Mentoring program and professional development plan.
- Private health insurance (Medicare Clinic).
- AYCM sport pass, providing discounts at various sports facilities in Hungary.
- Interesting tasks and diverse opportunities for developing your skills.
- Free training courses, including English.
- Participation in internal and external thematic events, technical conferences.
- A spacious office in the heart of Budapest (13th district).
- All necessary devices and tools for your work.
- Friendly, motivating atmosphere.
- Active corporate life.
Become a part of the NIX team.
