
Data Engineer
- Poland
- Permanent
- Full-time

Responsibilities
- Develop and maintain CI/CD pipelines to enhance developer productivity, agility, and code quality.
- Design and implement robust, scalable data pipelines using GCP services, Python, DBT, and Airflow.
- Optimize data processes for performance, scalability, and reliability.
- Code and manage BigQuery SQL procedures, functions, and other database components.
- Implement and manage ETL/ELT workflows ensuring high performance and scalability.
- Set up and improve monitoring, alerting, and troubleshooting processes to ensure data integrity and system health.
- Collaborate on architecture design and data modeling (star schema, snowflake schema, normalization).
- Ensure data quality through testing, validation, and automated checks.
- Produce and maintain comprehensive technical documentation.
- Stay updated with industry trends and best practices in data engineering and cloud technologies.

Requirements
- 6+ years of experience in Data Engineering, including ETL architecture and pipeline development.
- Minimum 2 years of hands-on experience with GCP, including services such as BigQuery, Composer, GCS, and Cloud Functions.
- 3+ years of experience with advanced SQL and strong programming skills in Python or Java.
- Proven experience in building and maintaining cloud-based Data Warehouses (preferably BigQuery; Snowflake, AWS, or Azure also valued).
- Experience with DevOps tools (e.g., Git, GitLab) and practices such as automation and CI/CD.
- Familiarity with tools like Apache Airflow and integration platforms such as Celigo.
- Solid understanding of data modeling concepts and data warehouse/lake architectures.
- Strong troubleshooting and performance optimization abilities.
- Bachelor's degree in Computer Science, MIS, CIS, or equivalent experience.
- Excellent communication and teamwork skills.