Data Engineering Intern
Algoan is a fintech founded in 2018 that transforms the way credit decisions are made through its scoring and budget analysis solutions based on Open Banking. Our API-based products enable financial institutions to optimize their decision-making processes by analyzing the banking data of loan applicants.
As a European leader in this field, we collaborate with major players (Sofinco, Cofidis, Revolut, etc.) to enhance user experience and acceptance rates while reducing the risk of over-indebtedness across various use cases (installment payments, consumer credit, car financing, etc.). Algoan strives to make credit more accessible to a broader audience, with security and compliance at the heart of its priorities.
Key figures:
- We are a team of 30 people.
- We have raised over €10 million.
- Our API is available in 6 European countries.
- We are ISO 27001 certified and regulated by the ACPR.
- Our data stack: our core technologies include MongoDB, Dataflow, BigQuery, dbt, Python, and Git.
Tasks
As a Data Engineer Intern, your responsibilities will focus on the following:
- Understanding needs: Work closely with the Data Science, Product, and Business teams to fully understand data analysis requirements and use cases.
- ETL pipeline deployment: Adjust data ingestion and transformation processes based on identified needs, in collaboration with the TechOps team.
- Optimization: Tune pipelines, together with TechOps, to improve performance and minimize execution costs.
- Monitoring: Strengthen the existing tests and alerts that track pipeline performance, and react promptly to anomalies to correct and secure the pipelines (a minimal sketch of this kind of check follows this list).
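To give a concrete flavor of the monitoring work described above, here is a minimal sketch, in Python, of a freshness check on a BigQuery table. It is purely illustrative and not Algoan's actual code: the table name, the ingested_at column, and the six-hour threshold are all assumptions.

```python
# Hypothetical pipeline-monitoring check: alert when a BigQuery table
# has stopped receiving fresh rows. The table ("analytics.transactions"),
# the "ingested_at" column, and the SLA are invented for illustration.
from datetime import timedelta

from google.cloud import bigquery

FRESHNESS_SLA = timedelta(hours=6)  # assumed threshold


def check_freshness(client: bigquery.Client, table: str) -> None:
    """Raise if the newest row in `table` is older than FRESHNESS_SLA."""
    query = f"""
        SELECT TIMESTAMP_DIFF(CURRENT_TIMESTAMP(), MAX(ingested_at), MINUTE)
            AS lag_minutes
        FROM `{table}`
    """
    row = next(iter(client.query(query).result()))
    # A NULL lag means the table is empty, which we also treat as stale.
    if row.lag_minutes is None or timedelta(minutes=row.lag_minutes) > FRESHNESS_SLA:
        # In production this would page the TechOps team rather than raise.
        raise RuntimeError(f"{table} is stale (lag: {row.lag_minutes} min)")


if __name__ == "__main__":
    check_freshness(bigquery.Client(), "analytics.transactions")
```

In practice a check like this could equally be expressed as a dbt test or a BigQuery scheduled query; the Python form is shown simply because Python is part of the stack listed above.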
Requirements
Are you in the final year of a Master’s degree, engineering school, or equivalent program, with a specialization in data? If the following describes you, get in touch with us:
- You have strong proficiency in Python;
- You have solid knowledge of databases and SQL queries;
- You know how to use Git;
- Bonus: you have already worked with GCP (Dataflow/BigQuery), Terraform, or dbt.