They trusted our students
I work in a Belgian consulting company specialised in AI, which has a partnership with Google. We therefore use Google Cloud Platform (GCP) infrastructure and services for nearly all of our projects. I have done research and model development for automatic expense validation: the goal is to automatically predict an expense's category and total amount, in order to facilitate its validation. I compared two types of Deep Learning models: text-based and image-based. In both cases, I performed transfer learning on top of pre-trained models (BERT for NLP, and Inception V3 for image processing), using TensorFlow. In parallel, I decided to focus only on expense categories with low intra-class variance, in order to achieve decent prediction accuracy.
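The transfer-learning setup described above can be sketched in Keras: freeze the pre-trained Inception V3 convolutional base and train only a small classification head on top. This is a minimal illustration, not the author's actual model; the class count and input size are assumptions, and `weights=None` is used here only to keep the sketch light (in practice one would load `weights="imagenet"`).

```python
import tensorflow as tf

NUM_CLASSES = 5  # hypothetical number of low intra-class-variance expense categories

# Load the pre-trained convolutional base without its classification head.
# (weights="imagenet" in practice; weights=None here keeps the sketch light.)
base = tf.keras.applications.InceptionV3(
    include_top=False, weights=None, input_shape=(299, 299, 3)
)
base.trainable = False  # freeze the pre-trained layers

# Add a small trainable head for the expense-category task.
model = tf.keras.Sequential([
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
```

Only the pooling, dropout and dense layers are updated during training, which is what makes transfer learning viable on a modest number of labelled expense images.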
I work as a Site Reliability Engineer for Koodoo.io, an API-driven fintech business which connects mortgage lenders to customers. My mission is to use the latest tools to automate and develop for maximum site reliability. For example, I have automated the provisioning of all the company's infrastructure as part of a deployment pipeline. This allows the developers to make changes much faster while reducing human error. I have also worked on building a monitoring and observability stack using tools like Prometheus and Jaeger, allowing the business to quickly identify performance issues in its microservices-based architecture.
I am working as an NLP Data Scientist at Feedis. Feedis is a real-time user feedback analysis solution for mobile applications. It uses the latest advances in artificial intelligence and natural language processing to deliver high-quality, continuously updated insights, integrated directly with the app stores. I use Python as my main programming language, along with NLTK, spaCy, CoreNLP and a few other libraries for text processing.
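To give a flavour of the kind of text processing such a pipeline involves, here is a dependency-free sketch of tokenising user reviews and counting keywords. In practice libraries like NLTK or spaCy handle these steps far more robustly; the stopword list and the `keyword_counts` helper are illustrative assumptions, not Feedis code.

```python
import re
from collections import Counter

STOPWORDS = {"the", "a", "an", "is", "it", "and", "to", "of"}  # toy list

def tokenize(review: str) -> list[str]:
    """Lowercase a review and split it into word tokens."""
    return re.findall(r"[a-z']+", review.lower())

def keyword_counts(reviews: list[str]) -> Counter:
    """Count non-stopword tokens across a batch of user reviews."""
    counts = Counter()
    for review in reviews:
        counts.update(t for t in tokenize(review) if t not in STOPWORDS)
    return counts

reviews = ["The app crashes on startup", "Crashes after the latest update"]
print(keyword_counts(reviews).most_common(2))
# [('crashes', 2), ('app', 1)]
```

Aggregating token counts like this is the simplest step toward surfacing recurring themes ("crashes") in a stream of store reviews.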
I’m currently doing my internship at Soladis, a company specialised in data analysis and statistics. My internship is about ECG classification using deep learning and neural networks. I mostly program in Python using Amazon SageMaker, and my files are stored in S3. I learned these tools at DSTI and I am happy to use them on a daily basis.
I’m a Data Science Consultant, so I work on projects for various clients. For example, I programmed a forecasting model for the finance department of a large shipping company which needed to budget late penalties in advance. Another type of mission I have had was to study the long-term goals and current work practices of a startup, and make recommendations on how to improve their calculations and reach their objectives.
I am doing my internship at Arago Consulting, a company specialised in the implementation of HRIS. My main task is to design an algorithm able to predict the resignation of an employee in a company. I am using Python, merging several algorithms, and I focus on how to present the results to people who are not data scientists. It’s very interesting because I have to translate a business problem into a data problem, then find and test the tools that can achieve the goal.
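One common way to "merge" several algorithms for a prediction task like attrition is a voting ensemble. The sketch below uses scikit-learn's `VotingClassifier` on synthetic stand-in features; it is a hedged illustration of the general technique, not the actual Arago model, and the features and labels are invented for the example.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
# Synthetic stand-in for HR features (e.g. tenure, salary ratio, overtime).
X = rng.normal(size=(200, 3))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)  # toy "will resign" label

# Soft voting averages the predicted probabilities of the three models.
ensemble = VotingClassifier(
    estimators=[
        ("logreg", LogisticRegression()),
        ("tree", DecisionTreeClassifier(max_depth=3, random_state=0)),
        ("forest", RandomForestClassifier(n_estimators=50, random_state=0)),
    ],
    voting="soft",
)
ensemble.fit(X, y)
print(ensemble.predict_proba(X[:1]))  # per-employee resignation probability
```

Returning a probability rather than a hard yes/no is also what makes the output easier to explain to non-data-scientists: HR can rank employees by risk instead of reacting to a binary flag.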
During my internship, I developed a predictive model for geolocation with IoT data. I was granted the time I needed to test many different models: for instance, EM methods, spatial interpolation with CART, and graph-based machine learning. I also had the opportunity to develop my own approaches with HAC, k-means and topological modelling. It was a great time, and I very much enjoyed the applied research!
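The k-means and HAC approaches mentioned above can be compared on toy data with scikit-learn. The 2-D points below are synthetic stand-ins for device positions, not the internship's actual IoT data; on well-separated clusters like these, both methods recover the same two groups (up to label permutation).

```python
import numpy as np
from sklearn.cluster import AgglomerativeClustering, KMeans

rng = np.random.default_rng(42)
# Two synthetic groups of device positions (stand-ins for IoT geolocations).
points = np.vstack([
    rng.normal(loc=(0.0, 0.0), scale=0.1, size=(30, 2)),
    rng.normal(loc=(5.0, 5.0), scale=0.1, size=(30, 2)),
])

# k-means partitions by minimising within-cluster variance;
# HAC merges the closest pairs of clusters bottom-up.
kmeans_labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(points)
hac_labels = AgglomerativeClustering(n_clusters=2).fit_predict(points)

print(len(set(kmeans_labels)), len(set(hac_labels)))  # 2 2
```

On real geolocation data the two methods can diverge: k-means assumes roughly spherical clusters, while HAC's linkage choice lets it follow more irregular spatial shapes.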
I’m a Data Scientist at Yseop. My work consists of integrating intelligent algorithms with analytical tools in order to generate human-like financial reports with unique styles based on client preferences.
I am a Junior Data Scientist at Vinci Autoroutes. My mission here is focused on deploying intelligence in the highway sector. We use Computer Vision to detect incidents in tunnels, read vehicle license plates, etc.
I am doing my internship as a data scientist at Assystem in Paris, where I do text mining. I have to develop an information-extraction system for plain-text documents. I apply notions from DSTI such as statistics, coding (Python), SQL, the semantic web and deep learning.
The main objective of my end-of-studies project is to be at the heart of R&D innovation by participating in several transversal use cases dealing with the topics of Machine Learning, Deep Learning, Big Data Architecture, Embedded Systems or even Business Applications. During the internship, I also studied the state of the art in these areas, the underlying technologies, and their application in a test/pre-production environment.
I am working with a dataset of hundreds of thousands of illustrated artwork records from French museums, provided by the French Ministry of Culture. The dataset includes structured metadata and a collection of digital images linked to the metadata.