Big Data Architect
XCaliber is looking for a Big Data Architect
15+ nationalities · Offices in Malta, Poland and Portugal · 80+ staff
XCaliber is a tight-knit Technology team with offices in Malta and Poland. We collaborate to produce some of the most innovative technology in the iGaming industry.
There are a lot of plus points to working at XCaliber including:
- Flexible working hours
- Central office
- Competitive salary
- Calendar of social events
- Conference / training budget
- Gym benefit
- Free lunch twice per week
- Daily breakfast
The Big Data Architect will be responsible for the successful delivery of Big Data and Data Warehouse solutions and related components.
In this role, you will create workflows and procedures and identify the appropriate technology stacks needed to fulfil the reporting and data management needs of the business. This is not the average data warehouse, however: the aim is to build a modern distributed system in which real-time processing, data streams and cloud data warehouse storage engines are all part of your Swiss Army knife.
You will therefore have considerable autonomy and creative freedom in shaping a successful DWH service for both us and our partners.
The ideal candidate will have broad experience in solution architecture, including a strong background in Data Architecture, Big Data, Data Modelling and Data Governance, as well as a track record of strong technical team leadership.
Responsibilities:
- Design, develop, and maintain a data warehouse and analytics architecture that meets business analysis and reporting needs, with the scale and reliability of a modern distributed system at its heart
- Develop, test, improve and maintain ETL jobs and data flow scripts that populate the data warehouse from production databases and various external data sources
- Manage data quality and develop processes to ensure the highest standards
- Build fast and scalable data flows between databases, the data warehouse, and cloud platforms, ensuring high availability and reliability of the solution
- Propose solutions to optimise and scale the data warehouse and data processing infrastructure
- Contribute to the development of data analytics and self-learning applications
Requirements:
- Knowledge of data integration, data modelling, normalisation and data warehouse design
- Advanced knowledge of ETL development and optimisation
- Experience building ETL as microservices rather than a monolith
- Experience with real-time and stream processing
- Experience working with a variety of data sources: web services, application APIs, XML, relational databases, document-oriented NoSQL databases, etc.
- Good understanding of Big Data technologies, especially open-source tools such as:
  - Apache Flink
  - Apache Spark
  - Apache Druid
  - Kafka and Kafka Connect
- Experience with big data storage and query engines
- Production experience with Amazon AWS, especially for Big Data workloads
- Experience with cloud infrastructure providers
- High integrity and credibility, can-do attitude
- Excellent communication skills in English