About the company:
The IT Camp client is the fastest-growing online consumer lending and FinTech company in Europe, headquartered in Riga, Latvia. The company uses the latest technologies to ensure instant consumer identification and scoring.
Team: The IT team consists of 80+ professionals across 8 teams, working on 10+ products using Agile and Scrum methodologies. The team is led by seasoned professionals who not only look after the business but also take good care of their colleagues. A team in which every piece of the puzzle matters.
Location: Office on Skanstes iela, Riga. Relocation support is provided (documents, tickets, accommodation).
Technology stack: Python, ETL processes, Apache Spark, Apache Kafka, Apache Airflow, RDBMS such as Oracle, SQL Server, and PostgreSQL, data models (Dimensional, Data Vault), cloud platforms (AWS, Azure, GCP, Oracle Cloud), Kubernetes.
Due to growth, the company is looking for a Data Engineer (senior level).
Responsibilities:
– Design and build data pipelines and infrastructure that enable efficient processing, storage, and analysis of a diverse range of data;
– Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and design solutions that meet their needs;
– Lead the development and implementation of data solutions, including data warehousing, ETL pipelines, and data lakes;
– Work with software engineers to integrate data pipelines and infrastructure into larger software systems;
– Optimize data processing and storage systems for performance and scalability;
– Implement and maintain data security and privacy measures to protect sensitive data;
– Develop and maintain documentation for data pipelines, infrastructure, and processes;
– Mentor and train junior data engineers on best practices and new technologies;
– Stay up to date with emerging trends and technologies in data engineering and recommend new tools and techniques to improve our data solutions.
Requirements:
– Experience with Apache Spark or other data processing technologies;
– Expertise in scripting or programming languages such as Python, Java, or others;
– Experience with SQL;
– Good knowledge of the English language.
As an advantage:
– Experience with data warehousing, ETL, and data lake architecture and design principles;
– Experience with Apache Kafka or other streaming technologies;
– Knowledge of database technologies such as PostgreSQL or Oracle;
– Familiarity with cloud platforms such as AWS, Azure, or GCP and Kubernetes;
– Experience with Apache Airflow or other orchestration technologies;
– Experience working in Agile Scrum teams.
What the company offers:
– Be part of a Top-1 European financial company;
– An employer that motivates, appreciates, and cares for you;
– Friendly and collaborative environment and modern office;
– Flexible working hours (start as late as 10 am) and the ability to work from home 2 days per week;
– Training opportunities, paid conference fees, relevant Udemy courses, etc.;
– Well-being activities, surprises and birthday gifts;
– Team buildings, company events, and an active social life;
– Bonuses for long-term cooperation, such as extra holidays;
– Health insurance, private gym, fresh beverages and lunch in the office;
– Salary range from 2900 to 5850 EUR gross (2000 to 4000 EUR net), depending on experience and skills.