
Big Data Engineer with Java @ING Hubs Romania



Apply now

Discover ING Hubs Romania

ING Hubs Romania offers 130 services in software development, data management, non-financial risk & compliance, audit, and retail operations to 24 ING units worldwide, with the help of over 2000 high-performing engineers, risk, and operations professionals.

We started out in 2015 as ING’s software development hub, then steadily expanded our range to include more services and competencies. Now we provide borderless services with bank-wide capabilities and operate from two locations: Bucharest and Cluj-Napoca.

Our tech capabilities remain the core of our business, with more than 1800 colleagues active in Data and Analytics Tech, Tech Foundation and Channels, Retail Core Banking and Architecture, and Global Products and Technology Services. 

We enjoy a flexible way of working and a highly collaborative environment, where fair and constructive feedback is encouraged.  

For us, impact isn't a perk. It's the driver of our work. We are guided and rewarded by a shared desire to make the world a better place, one innovative solution at a time. Our colleagues make it their job to do impactful things and they love doing it in good company. Do you?  

Here’s a sneak peek of what our colleagues say about working within ING Hubs Romania:

  • At ING, we're building the solutions of tomorrow, today | 80% of our colleagues in Romania agree

The Mission

Wholesale Banking Data Ingestion Layer is one of the largest and most complex Data Lakes within ING. As an engineer in this space, you will contribute to the development and evolution of scalable, metadata-driven ingestion solutions that power critical data flows across the bank.

The platform serves as a foundational component for analytics, reporting, and regulatory compliance, and your work will directly impact its reliability, performance, and adaptability.

You’ll be joining a team that operates at the intersection of software and data engineering, in a DevOps-oriented environment, where automation, resilience, and collaboration are key.

Your day to day

In this role, you will design, develop, and maintain microservices using Java, Spark, Scala, Spring Boot, and Python. You will be responsible for building and supporting scalable, generic ingestion pipelines using Hive, NiFi, and HDFS on the Cloudera platform. Collaboration with ETL engineers will be key as you optimize snapshot-based ingestion and metadata-driven architectures. You will also develop and maintain monitoring dashboards using tools such as Kibana, while automating operational tasks through Bash and other scripting tools.
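To give a flavor of what "metadata-driven ingestion" means in practice, here is a minimal, self-contained sketch: a pipeline reads a metadata record describing a source table and generates the corresponding Hive DDL, instead of hard-coding one job per table. The table name, column names, and metadata schema below are hypothetical illustrations, not ING's actual platform.

```python
import json

# Hypothetical metadata record describing one source table to ingest.
# A metadata-driven pipeline iterates over entries like this one,
# so onboarding a new table means adding metadata, not writing code.
metadata = json.loads("""
{
  "source_table": "payments",
  "target_path": "/data/raw/payments",
  "columns": [
    {"name": "payment_id", "type": "BIGINT"},
    {"name": "amount", "type": "DECIMAL(18,2)"},
    {"name": "booking_date", "type": "DATE"}
  ],
  "partition_by": "booking_date"
}
""")

def build_hive_ddl(meta: dict) -> str:
    """Generate a HiveQL CREATE TABLE statement from an ingestion metadata record."""
    # Partition columns are declared in PARTITIONED BY, not in the column list.
    cols = ",\n  ".join(
        f"{c['name']} {c['type']}"
        for c in meta["columns"]
        if c["name"] != meta.get("partition_by")
    )
    ddl = (
        f"CREATE EXTERNAL TABLE IF NOT EXISTS raw_{meta['source_table']} (\n"
        f"  {cols}\n)"
    )
    if meta.get("partition_by"):
        part_type = next(
            c["type"] for c in meta["columns"]
            if c["name"] == meta["partition_by"]
        )
        ddl += f"\nPARTITIONED BY ({meta['partition_by']} {part_type})"
    ddl += f"\nSTORED AS PARQUET\nLOCATION '{meta['target_path']}'"
    return ddl

print(build_hive_ddl(metadata))
```

In a real pipeline, the generated statement would be executed against Hive (e.g. via Spark SQL), and the same metadata record would also drive the NiFi flow and snapshot logic; this sketch only shows the core idea of deriving behavior from metadata.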

Your daily work will involve distributed systems on data platforms including Kafka, Spark, and Linux-based environments. You will integrate with DataStage and Oracle systems to support enterprise data workflows, and participate in code reviews, testing, and CI/CD processes to ensure high-quality delivery. Throughout, you will uphold best practices in software development, security, DevOps, and data engineering.

What you’ll bring to the team

  • Strong Java programming skills with experience designing and developing microservices‑based applications
  • Hands-on experience with Spring Boot and Apache Spark
  • Solid understanding of Big Data technologies, including Spark, Hive, HDFS, Kafka, and NiFi
  • Working knowledge of databases, including Oracle and other RDBMS
  • Comfortable working in Linux environments, with proficiency in shell scripting (Bash)
  • Hands-on experience with monitoring and visualization tools (e.g. Kibana)
  • Strong DevOps mindset, including experience with CI/CD pipelines
  • Strong motivation to design and build scalable, reliable, and maintainable systems
  • Strong data‑driven mindset with high digital fluency and a clear focus on data quality, reliability, and governance
  • Effective communication skills in English and the ability to collaborate confidently with diverse stakeholders
  • Proven ability to design structured, reusable processes and work in a disciplined, production‑focused environment
  • Team‑oriented, proactive, and curious mindset with a continuous focus on learning and process improvement
  • Strong ownership and delivery mindset, covering the full lifecycle from development to production with a customer‑centric approach

Nice to have

  • Experience working with Azure DevOps and Kubernetes
  • Experience with Google Cloud Platform (GCP) is a plus, enabling contribution to cloud‑native and hybrid data initiatives
  • Experience with Python, Scala and/or other JVM‑based languages
  • Familiarity with observability and monitoring tools such as Grafana, Prometheus, and the ELK stack
  • Experience working in an Agile/Scrum development environment

If you want to deep dive into the processing of personal data conducted by ING Hubs Romania during the recruitment process and your rights related to it, read the privacy notices on our website (make sure to scroll until you reach the Data Protection section / Candidates tab).


Questions? Just ask
ING Recruitment team

