Academic publications

These are my academic publications; see also my ResearchGate and Google Scholar profiles.
Here, I'll leave some informal words about their topics.
Journal Papers
  • M. Fuchs: Conductive Heat Transfer in Thermal Bridges. Encyclopedia, 2022.
    https://doi.org/10.3390/encyclopedia2020067
    BibTeX

    @Article{encyclopedia2020067,
    AUTHOR = {Fuchs, Mathias},
    TITLE = {Conductive Heat Transfer in Thermal Bridges},
    JOURNAL = {Encyclopedia},
    VOLUME = {2},
    YEAR = {2022},
    NUMBER = {2},
    PAGES = {1019--1035},
    URL = {https://www.mdpi.com/2673-8392/2/2/67},
    ISSN = {2673-8392},
    DOI = {10.3390/encyclopedia2020067} }

A complete overview of the finite element modeling steps for the special case of the stationary heat equation in engineering.
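
As a taste of what these modeling steps boil down to, here is a minimal sketch of my own (not taken from the paper): one-dimensional stationary heat conduction with linear elements, assembled and solved in a few lines of Python.

    # Minimal toy sketch: 1D stationary heat conduction -d/dx( k du/dx ) = 0
    # on [0, L] with prescribed end temperatures, linear finite elements.
    import numpy as np

    def solve_1d_heat(n_elements=10, length=1.0, k=1.0, t_left=20.0, t_right=0.0):
        n_nodes = n_elements + 1
        h = length / n_elements                  # element size
        ke = (k / h) * np.array([[1.0, -1.0],    # element conductivity matrix
                                 [-1.0, 1.0]])
        K = np.zeros((n_nodes, n_nodes))         # global matrix, assembled element by element
        for e in range(n_elements):
            K[e:e + 2, e:e + 2] += ke
        f = np.zeros(n_nodes)                    # no internal heat sources
        for node, value in ((0, t_left), (n_nodes - 1, t_right)):
            K[node, :] = 0.0                     # Dirichlet boundary condition by row replacement
            K[node, node] = 1.0
            f[node] = value
        return np.linalg.solve(K, f)             # nodal temperatures

    print(solve_1d_heat())                       # linear profile from 20 down to 0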

  • M. Fuchs, R. Hornung, A.-L. Boulesteix, R. De Bin: On the asymptotic behavior of the variance estimator of a U-statistic. Journal of Statistical Planning and Inference, 2020.
    https://doi.org/10.1016/j.jspi.2020.03.003
    BibTeX

    @article{FUCHS2020101,
    title = {On the asymptotic behaviour of the variance estimator of a U-statistic},
    author = {Mathias Fuchs and Roman Hornung and Anne-Laure Boulesteix and Riccardo {De Bin}},
    journal = {Journal of Statistical Planning and Inference},
    volume = {209},
    pages = {101-111},
    year = {2020},
    issn = {0378-3758},
    doi = {https://doi.org/10.1016/j.jspi.2020.03.003},
    url = {https://www.sciencedirect.com/science/article/pii/S037837582030032X},
    keywords = {Asymptotic test, Best unbiased estimator, Cross-validation},
    }

We show that the error rate of a supervised learning algorithm can be estimated without bias as long as the test set is at least twice as large as the learning set.
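
For illustration only (this is not the paper's construction, merely the basic quantity it studies): the expected error of a learning rule trained on g observations can be estimated by splitting a sample into a learning part of size g and a disjoint test part; a single split is already unbiased for that expectation, and averaging over splits only reduces the variance.

    # Toy sketch: hold-out estimation of the expected error at learning-set size g.
    import numpy as np

    rng = np.random.default_rng(0)

    def draw_sample(n):
        x = rng.normal(size=(n, 5))
        y = x @ np.array([1.0, -1.0, 0.5, 0.0, 0.0]) + rng.normal(size=n)
        return x, y

    def holdout_error(x, y, g):
        idx = rng.permutation(len(y))
        learn, test = idx[:g], idx[g:]           # learning part of size g, disjoint test part
        beta, *_ = np.linalg.lstsq(x[learn], y[learn], rcond=None)
        resid = y[test] - x[test] @ beta
        return np.mean(resid ** 2)               # squared-error loss on the test part

    x, y = draw_sample(150)
    estimates = [holdout_error(x, y, g=50) for _ in range(200)]
    print(np.mean(estimates))                    # estimates the expected error at n = 50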

  • M. Fuchs, R. Neumayr: Agent-Based Semiology for Simulation and Prediction of Contemporary Spatial Occupation Patterns. In: Gengnagel C., Baverel O., Burry J., Ramsgaard Thomsen M., Weinzierl S. (eds) Impact: Design With All Senses. Proceedings of the Design Modelling Symposium, Berlin 2019,
    https://doi.org/10.1007/978-3-030-29829-6_50, pages 648–661
    BibTeX

    @InProceedings{fuchs2019,
    title={Agent-Based Semiology for Simulation and Prediction of Contemporary Spatial Occupation Patterns},
    author={Fuchs, Mathias and Neumayr, Robert},
    booktitle={Impact: Design With All Senses},
    year={2020},
    publisher={Springer International Publishing},
    address={Cham},
    pages={648--661},
    isbn={978-3-030-29829-6},
    doi={https://doi.org/10.1007/978-3-030-29829-6_50},
    }

We show a way to make the results of pedestrian simulation available to the designer. It relies on a spatial regression of the simulated density on distance fields and other architectural features.
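
A highly simplified sketch of the regression step (feature names and numbers are invented for illustration): regress the per-cell density produced by the agent simulation on per-cell architectural features such as distance fields, so that the fitted coefficients tell the designer how each feature drives occupation.

    # Toy spatial regression: simulated density ~ distance fields + visibility.
    import numpy as np

    rng = np.random.default_rng(0)
    n_cells = 400                                       # grid cells of a floor plan

    # assumed precomputed per-cell features (hypothetical names)
    dist_entrance = rng.uniform(0.0, 30.0, n_cells)     # distance field to nearest entrance
    dist_window = rng.uniform(0.0, 15.0, n_cells)       # distance field to nearest window
    visibility = rng.uniform(0.0, 1.0, n_cells)         # isovist / visibility measure

    # stand-in for the pedestrian simulation output
    density = 5.0 - 0.12 * dist_entrance + 0.5 * visibility + rng.normal(0.0, 0.5, n_cells)

    X = np.column_stack([np.ones(n_cells), dist_entrance, dist_window, visibility])
    coef, *_ = np.linalg.lstsq(X, density, rcond=None)
    print(dict(zip(["intercept", "dist_entrance", "dist_window", "visibility"], coef)))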

  • V. Bhooshan, M. Fuchs, S. Bhooshan: 3D-Printing, Topology Optimization and Statistical Learning: A Case Study. Proceedings of the Symposium on Simulation for Architecture and Urban Design (2017),
    https://dl.acm.org/doi/10.5555/3289787.3289799,
    BibTeX

    @inproceedings{10.5555/3289787.3289799,
    author = {Bhooshan, Vishu and Fuchs, Mathias and Bhooshan, Shajay},
    title = {3D-Printing, Topology Optimization and Statistical Learning: A Case Study},
    year = {2017},
    isbn = {9781510870185},
    publisher = {Society for Computer Simulation International},
    address = {San Diego, CA, USA},
    booktitle = {Proceedings of the Symposium on Simulation for Architecture and Urban Design},
    articleno = {12},
    numpages = {8},
    keywords = {statistical learning, edit-and-observe, work-flows & digital tools, topology optimization, interactive editing, 3D-printing},
    location = {Toronto, Canada},
    series = {SIMAUD '17} }

My first architectural paper. We described the outputs of a finite element program in terms of geometrical features via linear regression. This was fun and interesting, but the most important part was finding out that the z component of the surface normal was an important explanatory variable in that regression. We found that out by investigating the Airy approach to structural mechanics more closely.
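
A toy version of the kind of feature construction involved (not the paper's pipeline): compute the z component of each face normal of a triangle mesh and regress a per-face quantity, standing in here for the finite element output, on it and a couple of other geometric features.

    # Toy sketch: per-face geometric features, including the normal's z component,
    # as regressors for a fake per-face response.
    import numpy as np

    rng = np.random.default_rng(0)

    def face_features(vertices, faces):
        a, b, c = (vertices[faces[:, i]] for i in range(3))
        normals = np.cross(b - a, c - a)
        norms = np.linalg.norm(normals, axis=1)
        normal_z = normals[:, 2] / norms                 # z component of the unit normal
        areas = 0.5 * norms                              # triangle areas
        heights = (a[:, 2] + b[:, 2] + c[:, 2]) / 3.0    # centroid height
        return np.column_stack([np.ones(len(faces)), normal_z, areas, heights])

    vertices = rng.normal(size=(30, 3))                  # tiny fake mesh
    faces = np.array([rng.choice(30, size=3, replace=False) for _ in range(60)])
    X = face_features(vertices, faces)
    response = X @ np.array([1.0, 2.5, 0.3, -0.4]) + rng.normal(0.0, 0.1, len(faces))

    coef, *_ = np.linalg.lstsq(X, response, rcond=None)
    print(coef)                                          # a large coefficient on normal_z would flag it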

  • A.-L. Boulesteix, R. De Bin, X. Jiang, M. Fuchs: IPF-LASSO: Integrative L1-Penalized Regression with Penalty Factors for Prediction Based on Multi-Omics Data, Computational and Mathematical Methods in Medicine, 2017.
    https://doi.org/10.1155/2017/7691937
    BibTeX

    @Article{Boulesteix2017,
    author={Boulesteix, Anne-Laure and De Bin, Riccardo and Jiang, Xiaoyu and Fuchs, Mathias},
    title={IPF-LASSO: Integrative L1-Penalized Regression with Penalty Factors for Prediction Based on Multi-Omics Data},
    journal={Computational and Mathematical Methods in Medicine},
    year={2017},
    month={May},
    day={04},
    publisher={Hindawi},
    volume={2017},
    issn={1748-670X},
    doi={https://doi.org/10.1155/2017/7691937},
    url={https://doi.org/10.1155/2017/7691937}
    }

This paper is mostly based on the preprint https://epub.ub.uni-muenchen.de/26551/ (Technical Report 187), Department of Statistics, Ludwig Maximilian University of Munich.

Here, we took a look at the problem of integrating different groups of covariables, the prototypical situation being when mRNA and miRNA data are to be combined. We pursued a fairly immediate line of thought: apply a different lasso penalization to each group. The details, however, are quite nitty-gritty. We show how penalized regression generalises naturally to multiple data sources by penalising each modality differently.
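
A rough Python sketch of the core idea (not the authors' implementation): per-modality penalty factors for the lasso can be emulated by rescaling each modality's columns before a standard lasso fit and scaling the fitted coefficients back afterwards.

    # Toy IPF-style lasso via column rescaling (illustration only).
    import numpy as np
    from sklearn.linear_model import Lasso

    rng = np.random.default_rng(0)
    n, p_mrna, p_mirna = 100, 50, 20
    X_mrna = rng.normal(size=(n, p_mrna))        # first modality, e.g. mRNA
    X_mirna = rng.normal(size=(n, p_mirna))      # second modality, e.g. miRNA
    y = X_mrna[:, 0] - X_mirna[:, 0] + rng.normal(size=n)

    # penalty factors: here the mRNA block is penalized twice as hard (arbitrary choice)
    pf = np.concatenate([np.full(p_mrna, 2.0), np.full(p_mirna, 1.0)])
    X = np.hstack([X_mrna, X_mirna])

    lasso = Lasso(alpha=0.1)
    lasso.fit(X / pf, y)                         # rescaled columns <=> per-feature penalties
    coef = lasso.coef_ / pf                      # map coefficients back to the original scale
    print(np.count_nonzero(coef[:p_mrna]), np.count_nonzero(coef[p_mrna:]))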

  • M. Fuchs, N. Krautenbacher: Minimization and estimation of the variance of prediction errors for cross-validation designs. Journal of Statistical Theory and Practice (2016),
    http://dx.doi.org/10.1080/15598608.2016.1158675,
    freely available at https://epub.ub.uni-muenchen.de/27656/
    BibTeX

    @Article{Fuchs2016,
    author={Fuchs, Mathias and Krautenbacher, Norbert},
    title={Minimization and estimation of the variance of prediction errors for cross-validation designs},
    journal={Journal of Statistical Theory and Practice},
    year={2016},
    month={Jun},
    day={01},
    volume={10},
    number={2},
    pages={420-443},
    issn={1559-8616},
    doi={https://doi.org/10.1080/15598608.2016.1158675},
    url={https://doi.org/10.1080/15598608.2016.1158675}
    }

Yet another topic: theoretical statistics. This is my second favorite paper on this site. We identified and computed certain covariances between evaluations of error estimators, and constructed U-statistics that show how these covariances can be estimated.
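
A brute-force Monte Carlo illustration of the kind of quantity involved (this is not the paper's estimator): the errors obtained from two overlapping splits of the same sample are correlated, and it is precisely such covariances that a variance estimator for resampling-based error estimation has to account for.

    # Empirical covariance between two overlapping hold-out evaluations.
    import numpy as np

    rng = np.random.default_rng(1)
    g, n = 20, 50                                # learning-set size, sample size

    def split_error(x, y, learn_idx, test_idx):
        beta, *_ = np.linalg.lstsq(x[learn_idx], y[learn_idx], rcond=None)
        resid = y[test_idx] - x[test_idx] @ beta
        return np.mean(resid ** 2)

    pairs = []
    for _ in range(2000):                        # independent samples from the population
        x = rng.normal(size=(n, 3))
        y = x @ np.array([1.0, -0.5, 0.0]) + rng.normal(size=n)
        e1 = split_error(x, y, np.arange(0, g), np.arange(g, n))
        e2 = split_error(x, y, np.arange(g // 2, g // 2 + g), np.arange(g // 2 + g, n))
        pairs.append((e1, e2))

    print(np.cov(np.array(pairs).T)[0, 1])       # nonzero covariance of the two evaluations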

  • M. Fuchs: Equivariant K-homology of Bianchi groups in the case of non-trivial class group. Münster Journal of Mathematics 7 (2014), 589–607
    http://doi.org/10.17879/58269758846
    BibTeX

    @article{fuchsequivariant,
    title={Equivariant K-homology of Bianchi groups in the case of nontrivial class group},
    author={Fuchs, Mathias},
    journal={M{\"u}nster Journal of Mathematics},
    volume={7},
    pages={589--607},
    year={2014},
    publisher={Mathematisches Institut (Universit{\"a}t M{\"u}nster)},
    doi={https://doi.org/10.17879/58269758846}
    }

Back to pure mathematics. This paper is, in spirit, not too far from my PhD thesis. I think it is the paper I am most proud of.

  • M. Fuchs, T. Beissbarth, E. Wingender, K. Jung: Connecting high-dimensional mRNA and miRNA expression data for binary medical classification problems, Computer Methods and Programs in Biomedicine (2013), http://dx.doi.org/10.1016/j.cmpb.2013.05.013
    BibTeX

    @article{FUCHS2013592,
    title = {Connecting high-dimensional mRNA and miRNA expression data for binary medical classification problems},
    journal = {Computer Methods and Programs in Biomedicine},
    volume = {111},
    number = {3},
    pages = {592-601},
    year = {2013},
    issn = {0169-2607},
    doi = {https://doi.org/10.1016/j.cmpb.2013.05.013},
    url = {https://www.sciencedirect.com/science/article/pii/S0169260713001703},
    author = {Mathias Fuchs and Tim Beißbarth and Edgar Wingender and Klaus Jung},
    keywords = {Classifier combination, Discriminant analysis, Non-linear classification, MicroRNA, High-dimensional data},
    }

Yet another topic: a general overview of data combination strategies. In retrospect, though, there are better strategies out there than the ones we looked at.

  • V. Halacheva, M. Fuchs, J. Dönitz, T. Reupke, B. Püschel, C. Viebahn: Complex Planar Cell Movement and Oriented Cell Division Build the Early Primitive Streak in the Mammalian Embryo, Developmental Dynamics 240, 1905–1916 (2011),
    https://dx.doi.org/10.1002/dvdy.22687
    BibTeX

    @article{halacheva2011planar,
    title={Planar cell movements and oriented cell division during early primitive streak formation in the mammalian embryo},
    author={Halacheva, Viktoriya and Fuchs, Mathias and D{\"o}nitz, J{\"u}rgen and Reupke, Tobias and P{\"u}schel, Bernd and Viebahn, Christoph},
    journal={Developmental Dynamics},
    volume={240},
    number={8},
    pages={1905--1916},
    year={2011},
    publisher={Wiley Online Library},
    doi={https://dx.doi.org/10.1002/dvdy.22687}
    }

Once again, a completely different topic: biology, and in particular developmental biology. The primitive streak is the first morphogenic feature in the nascent mammalian embryo. We applied circular statistics to show that cell divisions prefer certain directions.
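
As an indication of what such a circular analysis can look like (a generic sketch on fake data, not the paper's analysis): a Rayleigh test checks whether a sample of angles is compatible with a uniform distribution on the circle.

    # Rayleigh test for a preferred direction, from the mean resultant length.
    import numpy as np

    def rayleigh_test(angles_rad):
        n = len(angles_rad)
        c = np.mean(np.cos(angles_rad))
        s = np.mean(np.sin(angles_rad))
        r = np.hypot(c, s)                       # mean resultant length in [0, 1]
        z = n * r ** 2
        # standard series approximation to the p-value of the Rayleigh statistic
        p = np.exp(-z) * (1 + (2 * z - z ** 2) / (4 * n)
                          - (24 * z - 132 * z ** 2 + 76 * z ** 3 - 9 * z ** 4) / (288 * n ** 2))
        return r, min(max(p, 0.0), 1.0)

    rng = np.random.default_rng(0)
    # fake division angles clustered around 30 degrees; for axial data
    # (orientations modulo 180 degrees) one would double the angles first
    angles = np.deg2rad(30.0) + rng.vonmises(0.0, 2.0, size=80)
    print(rayleigh_test(angles))                 # small p-value: directions are not uniform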

  • M. Ante, E. Wingender, M. Fuchs: Integration of gene expression data with prior knowledge for network analysis and validation, BMC Research Notes 4, 520 (2011),
    https://doi.org/10.1186/1756-0500-4-520
    BibTeX

    @Article{Ante2011,
    author={Ante, Michael and Wingender, Edgar and Fuchs, Mathias},
    title={Integration of gene expression data with prior knowledge for network analysis and validation},
    journal={BMC Research Notes},
    year={2011},
    month={Nov},
    day={28},
    volume={4},
    number={1},
    pages={520},
    issn={1756-0500},
    doi={https://doi.org/10.1186/1756-0500-4-520},
    url={https://doi.org/10.1186/1756-0500-4-520}
    }

My first non-mathematical paper: some work on biological databases, of a rather algebraic, non-quantitative flavor. Biological databases are nothing I ever really worked on before or after.

  • A. D. Rahm, M. Fuchs: The integral homology of PSL2 of imaginary quadratic integers with non-trivial class group, Journal of Pure and Applied Algebra (2010),
    https://dx.doi.org/10.1016/j.jpaa.2010.09.005
    BibTeX

    @article{RAHM20111443,
    title = {The integral homology of PSL2 of imaginary quadratic integers with nontrivial class group},
    journal = {Journal of Pure and Applied Algebra},
    volume = {215},
    number = {6},
    pages = {1443-1472},
    year = {2011},
    issn = {0022-4049},
    doi = {https://doi.org/10.1016/j.jpaa.2010.09.005},
    url = {https://www.sciencedirect.com/science/article/pii/S0022404910002082},
    author = {Alexander D. Rahm and Mathias Fuchs},
    }

The title says it all: we compute the group homology of a certain class of groups acting on hyperbolic three-space. Was (mostly) fun. I learned a lot about homological algebra, and about fundamental sets of actions.

Thesis

PhD thesis: M. Fuchs, "Cohomologie cyclique des produits croisés associés aux groupes de Lie" (Cyclic cohomology of crossed products associated with Lie groups). Written at the Institut de Mathématiques de Luminy under the direction of Michael Puschnigg at the Université de la Méditerranée Aix-Marseille,
https://arxiv.org/abs/math/0612023

I give a completely new proof of the fact that the group ring of torsion-free discrete co-compact subgroups of SL(n, C) satisfies the Kadison-Kaplansky conjecture: it is free of non-trivial idempotents. Unfortunately, I never came back to that topic later.
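
Written out as a formula (with A(Γ) standing for the algebra in question), the absence of non-trivial idempotents simply says:

    % no idempotents besides the trivial ones
    e \in A(\Gamma), \quad e^2 = e \quad \Longrightarrow \quad e \in \{0, 1\}.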

Technical Reports
  • M. Fuchs, X. Jiang, A.-L. Boulesteix, 2016. The computationally optimal test set size in simulation studies on supervised learning.
    https://epub.ub.uni-muenchen.de/26870/
    (Technical Report 189), Department of Statistics, Ludwig Maximilian University of Munich
    BibTeX

    @misc{epub26870,
    author = {Mathias Fuchs and Xiaoyu Jiang and Anne-Laure Boulesteix},
    series = {tech},
    volume = {189},
    keyword = {simulation study; supervised learning},
    title = {The computationally optimal test set size in simulation studies on supervised learning},
    year = {2016},
    url = {http://nbn-resolving.de/urn/resolver.pl?urn=nbn:de:bvb:19-epub-26870-1},
    doi={https://doi.org/10.5282/ubm/epub.26870}
    }

A simulation study on supervised learning consists of the following process, carried out repeatedly with independent repetitions. One draws a learning sample from a distribution; the size of the learning sample, i.e., the number of observations it comprises, is kept the same throughout. Each observation consists of predictors (or covariables) and an outcome, i.e., the value of a response variable, for instance a binary prediction target. One fits a predictive model to the learning sample and evaluates its performance on a test set. Averaging across all independent repetitions is guaranteed to converge towards the true expected value of the error estimator - the true error. It might seem a little unintuitive that this holds regardless of the size of the test set; the reason is that the expected value in question is unaffected by the test set size. Therefore, it is just a matter of computational convenience to choose the test set size carefully. The goal of this paper is to make that choice in an educated way, as a function of the machine's computation speed.
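
A schematic version of one such study (illustration only, with an arbitrary toy classifier; the test-set size N_TEST influences the computational cost per repetition, not the limit of the average):

    # Toy simulation study on supervised learning.
    import numpy as np

    rng = np.random.default_rng(0)
    N_LEARN, N_TEST, N_REP = 50, 25, 500         # the test-set size is the knob being chosen

    def draw(n):
        x = rng.normal(size=(n, 4))
        y = (x @ np.array([1.0, -1.0, 0.5, 0.0]) + rng.normal(size=n) > 0).astype(float)
        return x, y

    errors = []
    for _ in range(N_REP):
        x_learn, y_learn = draw(N_LEARN)         # fresh learning sample of fixed size
        x_test, y_test = draw(N_TEST)            # fresh, independent test sample
        beta, *_ = np.linalg.lstsq(x_learn, y_learn, rcond=None)   # a simple linear classifier
        y_hat = (x_test @ beta > 0.5).astype(float)
        errors.append(np.mean(y_hat != y_test))  # misclassification rate on the test set

    print(np.mean(errors))                       # Monte Carlo estimate of the true error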

  • M. Fuchs, R. Hornung, R. De Bin, A.-L. Boulesteix, 2013. A U-statistic estimator for the variance of resampling-based error estimators.
    http://epub.ub.uni-muenchen.de/17654/
    (Technical Report 147), Department of Statistics, Ludwig Maximilian University of Munich
    BibTeX

    @misc{epub17654,
    volume = {148},
    series = {tech},
    author = {Mathias Fuchs and Roman Hornung and Riccardo De Bin and Anne-Laure Boulesteix},
    title = {A U-statistic estimator for the variance of resampling-based error estimators},
    year = {2013},
    keyword = {Unbiased Estimator; Penalized Regression Model; U-Statistic; Cross-Validation; Machine Learning;},
    url = {http://nbn-resolving.de/urn/resolver.pl?urn=nbn:de:bvb:19-epub-17654-2},
    doi={https://doi.org/10.5282/ubm/epub.17654}
    }

We show a bunch of things about the error estimator of a machine learning algorithm on a real data sample of fixed size (typically denoted by n in statistics). Among several statements, we show that the studentized error estimator is asymptotically normally distributed. This is a new contribution.
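
In symbols, the studentization statement is of the schematic form

    % asymptotic normality of the studentized error estimator, where
    % \hat{\theta}_n is the error estimator, \theta its expectation and
    % \hat{\sigma}_n^2 the variance estimator studied in the report
    \frac{\hat{\theta}_n - \theta}{\hat{\sigma}_n} \;\xrightarrow{d}\; \mathcal{N}(0, 1)
    \qquad \text{as } n \to \infty ,

with the precise assumptions spelled out in the report.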