- Position: Senior Data Engineer
- Location: Cracow, hybrid (3 days per week in the office)
- Type of contract: B2B
Nobody reads books anymore… Nope! Our client, a leading entertainment and technology company, is shaping the future of storytelling, using AI and predictive algorithms to discover untold stories and turn them into major successes. By producing a new $1M ebook every four weeks, they have rapidly become one of the top publishers worldwide, achieving a hit rate 40 times higher than traditional publishers. Their recent expansion into a new streaming platform marks just the beginning of their goal to become a dominant force in next-generation entertainment.
What You’ll Do:
- Build and maintain robust data pipelines using modern tools for data processing and transformation
- Enhance data workflows by optimizing performance, troubleshooting issues, and implementing monitoring systems
- Collaborate with cross-functional teams, including engineers, analysts, and product managers, to align data solutions with business needs
- Establish and uphold data quality standards throughout the data pipeline lifecycle
- Design scalable, efficient, and maintainable data architectures
- Ensure that decision-makers have access to timely, accurate, and actionable data insights
What You’ll Bring:
- Hands-on experience developing and optimizing data infrastructure and pipelines
- Strong proficiency in Python and SQL for data processing and management
- Deep knowledge of data workflow automation and transformation tools
- Ability to excel in a fast-paced, rapidly evolving environment
Who We’re Looking For:
- Independent thinker with a problem-solving mindset
- Data-focused professional who takes initiative and drives results
- Quick to adapt and execute in a dynamic setting
- Highly motivated and eager to advance their career
- Excited about shaping the future of AI-driven experiences in digital content and media
The company’s tech stack:
- JavaScript, React 18, React Native/Expo
- Ruby 3.0.5, Rails 6.1
- PostgreSQL, Redis, Elasticsearch, Airflow, Python, and dbt to support data processing (a brief illustrative sketch of how Airflow and dbt fit together follows this list)
- Kubernetes
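For a flavor of how the Airflow and dbt pieces of this stack are typically wired together, here is a minimal sketch of an orchestration DAG. It is purely illustrative: the DAG name, schedule, and project/profile paths are assumptions for the example, not details of the client's actual setup.

```python
# Minimal, hypothetical Airflow DAG that orchestrates dbt transformations.
# Paths, schedule, and names are illustrative assumptions only.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_dbt_transformations",   # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Run dbt models against the warehouse defined in profiles.yml
    run_dbt_models = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/analytics --profiles-dir /opt/dbt",
    )

    # Validate the results with dbt tests before downstream consumers read them
    test_dbt_models = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt/analytics --profiles-dir /opt/dbt",
    )

    run_dbt_models >> test_dbt_models
```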
What you can expect:
- Compensation between 5,800 USD and 7,500 USD, depending on your experience
- Free access to their premium platforms
- Unlimited vacation policy
- Working hours: 11:00 – 19:00