Journal article: Transactions on Machine Learning Research, 2023

Neural Networks beyond explainability: Selective inference for sequence motifs

Abstract

Over the past decade, neural networks have been successful at making predictions from biological sequences, especially in the context of regulatory genomics. As in other fields of deep learning, tools have been devised to extract features, such as sequence motifs, that can explain the predictions made by a trained network. Here we intend to go beyond explainable machine learning and introduce SEISM, a selective inference procedure to test the association between these extracted features and the predicted phenotype. In particular, we discuss how training a one-layer convolutional network is formally equivalent to selecting motifs maximizing some association score. We adapt existing sampling-based selective inference procedures by quantizing this selection over an infinite set to a large but finite grid. Finally, we show that sampling under a specific choice of parameters is sufficient to characterize the composite null hypothesis typically used for selective inference, a result that goes well beyond our particular framework. We illustrate the behavior of our method in terms of calibration, power and speed, and discuss its power/speed trade-off compared to a simpler data-split strategy. SEISM paves the way to an easier analysis of neural networks used in regulatory genomics, and to more powerful methods for genome-wide association studies (GWAS).

Dates and versions

hal-03895446, version 1 (12-12-2022)

Identifiers

HAL Id: hal-03895446
DOI: 10.48550/arXiv.2212.12542

Cite

Antoine Villié, Philippe Veber, Yohann de Castro, Laurent Jacob. Neural Networks beyond explainability: Selective inference for sequence motifs. Transactions on Machine Learning Research, 2023. DOI: 10.48550/arXiv.2212.12542. HAL: hal-03895446.