Profile

Senior Software Engineer with over five years of experience in Python-based backend development for scalable web applications and APIs. Skilled in building data-driven solutions focused on performance and reliability, with two published academic papers in AI that bridge research insights and real-world challenges. Enjoys collaborating across teams to deliver high-quality software and is committed to continuous learning and software development best practices.

Technical Stack:

Programming Languages:

Python, Go, JavaScript, C++, Bash

Data Processing & Analytics:

SQL, Pandas, PySpark

AI & Machine Learning:

TensorFlow, Keras, Scikit-learn, OpenCV

Backend Frameworks:

FastAPI, Gin (Go), Flask

Frontend Frameworks:

React, Vue.js

Web Servers & Databases:

Nginx, Apache
PostgreSQL, MySQL, MongoDB, DynamoDB, Elasticsearch

Monitoring & Observability:

Prometheus, Grafana, AWS CloudWatch

DevOps & Cloud:

AWS (Lambda, SQS, SNS, EventBridge, S3, Step Functions, etc.)
Docker, Docker-Compose, Git, Terraform, Jenkins

Experience

Software Developer

AXA XL, Paris

Property Intelligence Platform

July 2023 - Present

Contributed to the development of the Property Intelligence Platform, a solution designed to assess various risks for buildings worldwide, providing critical insights to underwriters and risk assessors.

Key Contributions:
  • Architectural Design & Documentation: Participated in the creation and documentation of technical solutions following Domain-Driven Design (DDD) principles and a Hexagonal Architecture approach, ensuring clear boundaries between core business logic and external services.
  • Technical Spikes & Performance Optimization: Performed spikes to explore new architectures, frameworks, and indexing strategies (e.g., MongoDB indexes) to optimize performance and scalability of data-heavy operations.
  • Backend Development & API Engineering: Developed and maintained RESTful APIs to expose building risk data and assessments, integrating third-party risk services and geospatial data processing.
  • Collaboration Across Teams: Worked closely with frontend developers and QA teams, detailing functional requirements, clarifying acceptance criteria, and ensuring robust testing for newly delivered features.
  • Data Pipelines & Validation: Built pipelines to ingest, validate, and store building information and risk metrics, ensuring a high level of accuracy and long-term scalability.
  • Underwriter Feedback: Worked closely with underwriters to incorporate real-time feedback, reducing manual effort and shortening the quote-to-bind cycle for high-risk properties.

Technologies Used: Python, FastAPI, MongoDB, Terraform, AWS (Lambda, S3, DynamoDB, CloudWatch)



Telematics Data Scoring

Spearheaded the development and optimization of data pipelines to process and score telematics data from insured vehicles, enabling personalized insurance offerings based on driving behavior.

Key Responsibilities:
  • Designed and implemented scalable data processing pipelines using PySpark on AWS Batch and Step Functions.
  • Automated data ingestion, transformation, and scoring processes to enhance real-time analytics.

Technologies Used: Python, PySpark, AWS (AWS Batch, Step Functions, S3, DynamoDB, Lambda, CloudWatch), Terraform

Senior Software Developer

Iliad / Free, Paris

Development of an FTTH Network System

February 2021 - July 2023

Led the development of 'Arobase SI', an FTTH network management system built on PON technology.

Key Achievements:
  • Orchestrated the development of Arobase SI for ILIAD's Play subsidiary in Poland.
  • Innovated an authentication server interface using React, enhancing user access and management capabilities.
  • Engineered a comprehensive application for third-party system integration with the Core API.
  • Designed intuitive graphical interfaces for efficient network and optical route management.
  • Developed robust relational databases using SQL for enhanced data management.
  • Implemented a continuous deployment strategy (CI/CD) with GitLab and Swarm.
  • Integrated AWS S3 for secure and efficient file storage solutions.
  • Created an ETL pipeline with Logstash to improve address search functionality.

Technical Environment: Python, FastAPI, SQL, PostgreSQL, ELK, AWS S3, Docker, Docker Swarm, OpenAPI, Prometheus, Grafana

Legacy System Modernization

Modernized legacy systems to enhance performance and scalability, ensuring seamless integration with modern architectures.

Key Contributions:
  • Transitioned a legacy PHP project to a scalable, modern Golang-based architecture.
  • Redesigned the Flux API in Go for improved performance and seamless integration.
  • Configured and refined a GitLab CI/CD pipeline for optimized deployment using Docker Swarm.

Technical Environment: Go, Gin, SQL, TDD, PostgreSQL, Docker, Docker Swarm, OpenAPI, Prometheus, Grafana

Invauth Platform Development

Developed a microservices-based document authentication system integrated with blockchain technology to enhance security and user experience.

Key Achievements:
  • Led the creation of a microservices-based document authentication system using FastAPI.
  • Integrated the platform with Solana blockchain for secure digital transactions.
  • Designed a user-friendly interface, focusing on enhanced user experience.
  • Automated workflows and data integration processes to streamline operations.
  • Collaborated with security teams to ensure compliance and data protection.

Technical Environment: FastAPI, Python, Docker, Solana Blockchain

Software Developer & AI Engineer

Air&D, Strasbourg, France

Real-Time Pollution Dispersion Prediction

November 2019 - February 2021

Designed and implemented a deep learning model that learns the physical laws of fluid mechanics to predict pollution dispersion in real time over the Bas-Rhin region.

  • Developed a SaaS solution in Flask that forwards user-provided inputs to a computation server.
  • Designed and implemented a deep learning model to learn physical laws for real-time pollution dispersion prediction.
  • Deployed the AI solution online with Dash, Flask, and Docker.
  • Built a computational server in Node.js that processes the studies received from the SaaS and returns the results.
  • Developed multiple APIs that receive data from all Air&D sensor networks and insert it into the database.
  • Developed the frontend in Vue.js to visualize the incoming data flows.

Technical Environment: Flask, Node.js, Vue.js, PostgreSQL, Docker; Machine Learning: CNN, MultiResUnet

Publications and Certifications