Publications
Published:
Peer-reviewed conference papers:
- MIGS: Multi-Identity Gaussian Splatting via Tensor Decomposition.
- Revisiting character-level adversarial attacks for Language Models.
- Going beyond compositional generalization, DDPMs can produce zero-shot interpolation.
- Learning to Remove Cuts in Integer Linear Programming.
Pol Puigdemont, Stratis Skoulakis, Grigorios Chrysos, Volkan Cevher
International Conference on Machine Learning (ICML), 2024.
We explore a new approach to solving integer linear programs (ILPs) with cutting plane methods: instead of only adding new cuts, we also consider removing cuts introduced at any of the preceding iterations of the method, under a learnable parametric criterion.
- REST: Efficient and Accelerated EEG Seizure Analysis through Residual State Updates.
- Multilinear Operator Networks.
Yixin Cheng, Grigorios Chrysos, Markos Georgopoulos, Volkan Cevher
International Conference on Learning Representations (ICLR), 2024.
We introduce a family of networks that rely on multilinear operations and capture high-degree interactions of the input elements. This family of networks, called MONet, performs on par with modern architectures on image recognition and beyond.
- Generalization of Scaled Deep ResNets in the Mean-Field Regime.
- Robust NAS under adversarial training: benchmark, theory, and beyond.
Yongtao Wu, Fanghui Liu, Carl-Johann Simon-Gabriel, Grigorios Chrysos, Volkan Cevher
International Conference on Learning Representations (ICLR), 2024.
We release a benchmark for searching adversarially robust networks and establish generalization bounds for the searched architectures under multi-objective adversarial training.
- Efficient local linearity regularization to overcome catastrophic overfitting.
- Maximum Independent Set: Self-Training through Dynamic Programming.
Lorenzo Brusca*, Lars C.P.M. Quaedvlieg*, Stratis Skoulakis*, Grigorios Chrysos, Volkan Cevher
Conference on Neural Information Processing Systems (NeurIPS), 2023.
We design a framework for estimating the maximum independent set (MIS) without using supervised samples, inspired by dynamic programming.
- On the Convergence of Encoder-Only Shallow Transformers.
- Benign Overfitting in Deep Neural Networks under Lazy Training.
Zhenyu Zhu, Fanghui Liu, Grigorios Chrysos, Francesco Locatello, Volkan Cevher
International Conference on Machine Learning (ICML), 2023.
In this work, we study the three interrelated concepts of overparameterization, benign overfitting, and the Lipschitz constant of DNNs, and connect them in the lazy training regime.
- Regularization of polynomial networks for image recognition.
Grigorios Chrysos, Bohan Wang, Jiankang Deng, Volkan Cevher
Computer Vision and Pattern Recognition Conference (CVPR), 2023.
We demonstrate how polynomial networks (without elementwise activation functions) can benefit from regularization schemes to reach the performance of standard neural networks. We then introduce a new class of polynomial networks that, using these additional regularization schemes, achieves even higher degrees of expansion.
- Robustness in deep learning: The good (width), the bad (depth), and the ugly (initialization).
Zhenyu Zhu, Fanghui Liu, Grigorios Chrysos, Volkan Cevher
Conference on Neural Information Processing Systems (NeurIPS), 2022.
We explore the interplay of width, depth, and initialization on the average robustness of neural networks, with both theoretical bounds and empirical validation.
- Generalization Properties of NAS under Activation and Skip Connection Search.
Zhenyu Zhu, Fanghui Liu, Grigorios Chrysos, Volkan Cevher
Conference on Neural Information Processing Systems (NeurIPS), 2022.
Using our theoretical guarantees for neural architecture search (NAS) under various activation functions and residual connections, we design an effective training-free algorithm for NAS.
- Sound and Complete Verification of Polynomial Networks.
- Extrapolation and Spectral Bias of Neural Nets with Hadamard Product: a Polynomial Net Study.
- Augmenting Deep Classifiers with Polynomial Neural Networks.
Grigorios Chrysos*, Markos Georgopoulos*, Jiankang Deng, Jean Kossaifi, Yannis Panagakis, Anima Anandkumar
European Conference on Computer Vision (ECCV), 2022.
We express modern architectures (e.g., residual and non-local networks) as polynomials of different degrees of the input. This enables us to design extensions of successful architectures that perform favorably on various benchmarks.
- Cluster-guided Image Synthesis with Unconditional Models.
Markos Georgopoulos, James Oldfield, Grigorios Chrysos, Yannis Panagakis
Computer Vision and Pattern Recognition Conference (CVPR), 2022.
We study controllable generation in unsupervised GAN models by leveraging clusters in the representation space of the generator. We show that these clusters, which capture semantic attributes, can be used for conditioning the generator.
- The Spectral Bias of Polynomial Neural Networks.
- Controlling the Complexity and Lipschitz Constant improves Polynomial Nets.
Zhenyu Zhu, Fabian Latorre, Grigorios Chrysos, Volkan Cevher
International Conference on Learning Representations (ICLR), 2022.
We provide sample complexity results and bounds on the Lipschitz constant of polynomial networks, which we use to construct a regularization scheme that improves the robustness against adversarial noise.
- Conditional Generation Using Polynomial Expansions.
- Poly-NL: Linear Complexity Non-local Layers with Polynomials.
Francesca Babiloni, Ioannis Marras, Filippos Kokkinos, Jiankang Deng, Grigorios Chrysos, Stefanos Zafeiriou
International Conference on Computer Vision (ICCV), 2021.
We cast non-local blocks as special cases of third-degree polynomial functions. In addition, we propose a new non-local block that builds on this polynomial perspective but has more efficient operations, i.e., we aim to retain the expressivity of non-local layers while maintaining linear complexity.
- Unsupervised Controllable Generation with Self-Training.
- Reconstructing the Noise Manifold for Image Denoising.
- Multilinear Latent Conditioning for Generating Unseen Attribute Combinations.
Markos Georgopoulos, Grigorios Chrysos, Maja Pantic, Yannis Panagakis
International Conference on Machine Learning (ICML), 2020.
We extend the conditional VAE to capture multiplicative interactions of the (annotated) attributes in the latent space. This enables generating images with attribute combinations unseen during training.
- Π-nets: Deep Polynomial Neural Networks.
Grigorios Chrysos, Stylianos Moschoglou, Giorgos Bouritsas, Yannis Panagakis, Jiankang Deng, Stefanos Zafeiriou
Computer Vision and Pattern Recognition Conference (CVPR), 2020.
We use a high-order polynomial expansion as a function approximation method. The unknown parameters of the polynomial (i.e., high-order tensors) are estimated using a collective tensor factorization.
- Robust Conditional Generative Adversarial Networks.
Grigorios Chrysos, Jean Kossaifi, Stefanos Zafeiriou
International Conference on Learning Representations (ICLR), 2019.
This work focuses on conditional data generation tasks (e.g., super-resolution). We introduce a new pathway in the encoder-decoder generator to improve the synthesized image.
- Surface Based Object Detection in RGBD Images.
Peer-reviewed journal papers:
- Federated Learning under Covariate Shifts with Generalization Guarantees.
Ali Ramezani-Kebrya, Fanghui Liu, Thomas Pethick, Grigorios Chrysos, Volkan Cevher
Transactions on Machine Learning Research (TMLR), 2023.
We focus on federated learning under covariate shifts, a realistic scenario in many settings. We derive theoretical guarantees and demonstrate how the approach works in imbalanced data settings.
- Linear Complexity Self-Attention with 3rd Order Polynomials.
Francesca Babiloni, Ioannis Marras, Filippos Kokkinos, Jiankang Deng, Matteo Maggioni, Grigorios Chrysos, Philip Torr, Stefanos Zafeiriou
IEEE Transactions on Pattern Analysis and Machine Intelligence (T-PAMI), 2023. (impact factor 2019: 17.861).
We cast self-attention (and non-local blocks) as special cases of third-degree polynomial functions. In addition, we propose a new block that builds on this polynomial perspective but is more computationally efficient, i.e., we aim to retain the expressivity of self-attention/non-local layers while maintaining linear complexity.
- Revisiting adversarial training for the worst-performing class.
Thomas Pethick, Grigorios Chrysos, Volkan Cevher
Transactions on Machine Learning Research (TMLR), 2023.
We propose a new training method, class-focused online learning (CFOL), to reduce the gap between the top-performing and worst-performing classes in adversarial training, resulting in a min-max-max optimization formulation.
- Deep Polynomial Neural Networks.
Grigorios Chrysos, Stylianos Moschoglou, Giorgos Bouritsas, Jiankang Deng, Yannis Panagakis, Stefanos Zafeiriou
IEEE Transactions on Pattern Analysis and Machine Intelligence (T-PAMI), 2021. (impact factor 2019: 17.861)
We propose a new class of architectures that use polynomial expansions to approximate the target functions. We validate the proposed polynomial expansions (i.e., Π-nets) in diverse experiments: data generation, data classification, face recognition, and non-Euclidean representation learning.
- Tensor Methods in Computer Vision and Deep Learning.
Yannis Panagakis*, Jean Kossaifi*, Grigorios Chrysos, James Oldfield, Mihalis A. Nicolaou, Anima Anandkumar, Stefanos Zafeiriou
Proceedings of the IEEE, 2021.
We provide an in-depth review of tensors and tensor methods in the context of representation learning and deep learning, with a particular focus on computer vision applications. We also provide Jupyter notebooks with accompanying code.
- Non-adversarial polynomial synthesis.
- RoCGAN: Robust Conditional GAN.
Grigorios Chrysos, Jean Kossaifi, Stefanos Zafeiriou
International Journal of Computer Vision (IJCV), 2020. (impact factor 2019: 11.042)
We leverage structure in the output domain of a conditional data generation task (e.g., super-resolution) to improve the synthesized image. We experimentally validate that this results in synthesized images that are more robust to noise. Extension of the conference paper.
- Motion Deblurring of Faces.
Grigorios Chrysos, Paolo Favaro, Stefanos Zafeiriou
International Journal of Computer Vision (IJCV), 2019. (impact factor 2019: 11.042)
We introduce a framework for tackling motion blur of faces. Our method simulates motion blur by averaging video frames, and we collect a dataset that contains millions of such frames.
- The Menpo Benchmark for Multi-pose 2D and 3D Facial Landmark Localisation and Tracking.
- A Comprehensive Performance Evaluation of Deformable Face Tracking "In-the-Wild".
Grigorios Chrysos, Epameinondas Antonakos, Patrick Snape, A. Asthana, Stefanos Zafeiriou
International Journal of Computer Vision (IJCV), 2018. (impact factor 2019: 11.042)
We conduct a large-scale study of deformable face tracking "in-the-wild", i.e., with videos captured in unrestricted conditions.
- IPST: Incremental Pictorial Structures for model-free Tracking of deformable objects.
- PD2T: Person-specific Detection, Deformable Tracking.
Workshop papers:
- Self-Supervised Neural Architecture Search for Imbalanced Datasets.
Aleksandr Timofeev, Grigorios Chrysos, Volkan Cevher
International Conference on Machine Learning Workshops (ICMLW), 2021.
We propose a neural architecture search (NAS) framework for real-world tasks: (a) in the absence of labels, (b) in the presence of imbalanced datasets, and (c) under a constrained computational budget.
- Unsupervised Controllable Generation with Self-Training.
- The 3D Menpo Facial Landmark Tracking Challenge.
- Deep Face Deblurring.
- The Menpo Facial Landmark Localisation Challenge.
- The First Facial Landmark Tracking in-the-Wild Challenge: Benchmark and Results.
- Offline Deformable Face Tracking in Arbitrary Videos.
Thesis
- Polynomial function approximation and its application to deep generative models.