

Our IT Camp partner is an advertising technology company that provides dynamic, targeted and automated native in-game advertising for the global video games industry. Its proprietary technology inserts adverts into naturally occurring advertising spaces within video games across multiple platforms (mobile, PC and console).

Team size: 35 employees.

Office location: Brīvības street 214 (VEF); remote work is also an option – choose whichever way is most comfortable for you.

Due to the company's growth, our partner is looking for a Data Engineer to join the team.

Main tasks

• Create and maintain optimal data pipeline architecture
• Work with stakeholders including Product, Data Science and Back-end teams to assist with data-related technical issues and support their data infrastructure needs
• Architect and design data pipelines that are able to handle billions of monthly data events
• Work on the scalability of the internal data platform to ensure greater workloads are handled efficiently
• Improve data quality so that stakeholders can work reliably with the data
• Propose optimizations to the current technology to keep it cutting-edge


Requirements

• Advanced SQL knowledge and experience with relational databases, including query authoring and working familiarity with a variety of database systems
• Experience building and optimizing big-data pipelines, architectures and data sets
• Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement
• Senior-level Python experience and knowledge of the Pandas library
• Understanding of high-load concepts and the ability to design with microservice architectures in mind
• Experience working with cloud platforms (e.g. Google Cloud)
• Good English (spoken and written)

Nice to have:
• Experience building and deploying machine learning pipelines
• Knowledge of BI tools such as Looker and how they work
• Previous experience with queue brokers (RabbitMQ, Kafka, Cloud Pub/Sub)
• Skills related to data aggregation pipelines

Company is offering

• Salary from 1500 to 3000 EUR net, depending on experience level
• The opportunity to work with a rapidly growing team
• An excellent start-up-like atmosphere with short decision paths and a fast pace, while being part of a stable public company with offices in London and Riga
• Parking place
• Training programs and career development opportunities
• Flexible working hours and work-life balance options

Apply for this job
