Vilnius
Big Data Engineer
Your main tasks will be:
- Ensure data quality and integrity through data governance best practices, including validation, transformation, and security protocols, using PySpark, Impala, Hive, HDFS, ClickHouse, Zabbix, and Airflow
- Collaborate with the project management team on the overall implementation plan and contribute throughout the project life cycle, primarily with PySpark and Airflow
- Continuously monitor and tune data systems for speed, reliability, and scalability, mainly with PySpark and HDFS
Job Requirements:
- 3+ years of data engineering or similar technical experience
- Knowledge of SQL, data modelling, and ETL processes with tools such as Apache Airflow
- Proficiency in PySpark, Python, Linux, HDFS, Elasticsearch, ClickHouse, and various SQL databases
- Experience with AWS and NiFi is a plus
- Strong analytical and problem-solving skills
Benefits:
- Freedom to implement your best ideas
- Personal phone with unlimited calls, SMS, and data
- 5G mobile router for your home or travels
- Go3 + Netflix subscription for your downtime
- Comprehensive health and accident insurance
- Tier III pension accumulation plan
- Flexible hybrid setup: 4 days in our Vilnius office, 1 day remote
- Workation options and flexible working hours
- Themed company events, engaging team activities, and a fun, modern work culture
Gross salary: 3000–4200 EUR/month