- Collaborate with cross-functional teams to identify, design, and implement data processing features to meet dynamic business requirements.
- Architect and maintain scalable systems for extracting, transforming, and loading data from diverse sources such as external APIs, data streams and databases.
- Implement robust solutions adhering to data privacy and security standards, ensuring compliance and data integrity.
- Monitor industry trends and advancements in real-time data processing, proposing and implementing changes aligned with organizational goals.
- Share expertise and collaborate with teams on various data engineering and project-related topics.
- Evaluate, select, and implement tools and strategies for effective real-time data integration scenarios.
What we expect from you
- 3+ years of commercial experience in data engineering, with an emphasis on real-time processing.
- Proficient programming skills in Scala.
- In-depth understanding and practical experience with distributed computing approaches, patterns, and technologies, especially those related to real-time processing (e.g., Spark Streaming, Akka Streams).
- Hands-on experience with any cloud platform (GCP, AWS, Azure) and its data-oriented components.
- Strong proficiency in SQL and query tuning.
- Expertise in data warehousing principles and modeling concepts.
- Competence in using relational databases (PostgreSQL, MSSQL or MySQL).
- Experience orchestrating data flows using tools like Apache Flink, Apache Kafka or similar technologies.
- A collaborative team player with excellent communication skills.
- Minimum English level B2.
- Expertise in stream processing using industry standards (e.g., AWS Kinesis, Kafka Streams, Spark).
- Deep knowledge of data storage design principles and understanding of SQL/NoSQL solutions and their configurations.
- Experience building modern data warehouses using platforms like Snowflake, AWS Redshift or BigQuery.
- Proficiency in Spark internals, including tuning and query optimization for real-time processing.
- Background in data integration and business intelligence architecture with a real-time perspective.
- Experience with data lakes and lakehouses (e.g., Azure Data Lake, Apache Hudi, Apache Iceberg, Delta Lake).
- Familiarity with containerized (Docker, ECS, Kubernetes) or serverless (Lambda) deployments.
- Knowledge of popular data standards and formats (e.g., JSON, Avro, ProtoBuf).
- Exposure to data science and machine learning.
NIX is a global supplier of software engineering and IT outsourcing services.
NIX teams collaborate with partners from different countries. Our specialists have experience developing innovative projects, from e-commerce to cloud, for some of the largest companies in the world, including Fortune 500 members. Our teams are focused on the steady development of the international IT market, our partners' businesses, and their own professional skills.
What we offer
- Competitive compensation packages.
- Stable employment based on a full-time employment contract.
- Private health insurance (Medicover Clinic).
- AYCM sport pass, providing discounts at various sports facilities in Hungary.
- Interesting tasks and diverse opportunities for developing your skills.
- Free training courses, including English.
- Participation in internal and external thematic events and technical conferences.
- A spacious office in the heart of Budapest (13th district).
- All necessary devices and tools for your work.
- A friendly, motivating atmosphere.
- An active corporate life.