Research
In a past life I worked on efficient machine learning for problems in Astrophysics and some of the big-data challenges of handling alert streams from the Vera Rubin Observatory. This led to follow-up work on efficient model deployment for low-latency, high-throughput inference.
Following my PhD I have kept with the theme of efficient data processing and efficient deep learning in my current research. While my previous work focused on the domain of Astrophysics/Astronomy, this mix of efficient data processing and efficient machine learning is now geared towards improving AI deployment strategies for embedded systems and resource-constrained environments more generally, ranging from phones and routers to microcontrollers and, more recently, FPGAs. The overarching problem statement is: how can we optimise a solution for a given task in terms of latency, power consumption and size (be that RAM usage or space on disk)?
Below is a light-touch overview of some areas of focus...
1. A.I. Compilers for Hardware-Agnostic Deployment
This covers a broad range of topics, including Model Compression (Quantization, Pruning, etc.), Graph Optimisation, and Embedded System Design, i.e. Hardware/Software Co-design.
Image credit: Sahib Dhanjal, Medium
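To give a flavour of what model compression involves, here is a minimal sketch of symmetric per-tensor post-training quantization to int8, written with NumPy only. The function names, tensor shapes and clipping range are illustrative assumptions rather than the API of any particular compiler or toolchain.

```python
import numpy as np

def quantize_int8(weights: np.ndarray):
    """Symmetric per-tensor post-training quantization to int8 (illustrative sketch)."""
    scale = np.max(np.abs(weights)) / 127.0           # map the largest magnitude onto 127
    q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """Recover an approximate float32 tensor so we can measure the quantization error."""
    return q.astype(np.float32) * scale

if __name__ == "__main__":
    w = np.random.randn(256, 256).astype(np.float32)  # stand-in weight matrix
    q, scale = quantize_int8(w)
    err = np.abs(w - dequantize(q, scale)).mean()
    print(f"fp32: {w.nbytes / 1024:.0f} KiB, int8: {q.nbytes / 1024:.0f} KiB, "
          f"mean abs error: {err:.5f}")
```

Even this toy version shows the basic trade-off: a 4x reduction in storage and memory traffic in exchange for a small, measurable reconstruction error per weight.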
2. Efficient Machine Learning and Efficient Data Processing
This looks at efficient algorithms for computer vision tasks, such as efficient Image and Video Reconstruction and Compression. It also includes efficient architectures and the use of sparse data representations for optimising the full data pipeline. This is complemented by research into Query Optimisation and Distributed System Design for Query Planning over structured and unstructured data.
Image credit: Wikipedia: Event-based Vision
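To give a sense of why sparse representations matter in this kind of pipeline, the toy sketch below compares a dense frame with a coordinate-list (COO-style) encoding of the same data when only a small fraction of pixels carry events, as is typical for event-based vision. The resolution and sparsity level are made-up illustrative values, not figures from any of the work above.

```python
import numpy as np

# A toy "event frame": most pixels are zero, only a small fraction carry activity.
H, W, sparsity = 480, 640, 0.01                       # illustrative values
frame = np.zeros((H, W), dtype=np.float32)
active = np.random.rand(H, W) < sparsity
frame[active] = np.random.randn(active.sum()).astype(np.float32)

# Dense representation: every pixel is stored, active or not.
dense_bytes = frame.nbytes

# COO-style sparse representation: store (row, col, value) only for active pixels.
rows, cols = np.nonzero(frame)
vals = frame[rows, cols]
sparse_bytes = rows.astype(np.int32).nbytes + cols.astype(np.int32).nbytes + vals.nbytes

print(f"dense: {dense_bytes / 1024:.0f} KiB, sparse (COO): {sparse_bytes / 1024:.0f} KiB, "
      f"ratio: {dense_bytes / sparse_bytes:.1f}x")
```

At 1% sparsity the coordinate list is tens of times smaller than the dense frame, which is exactly the kind of saving that can be propagated through the rest of the data pipeline.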
Publications
- Allam Jr T and McEwen JD (2023) An astronomical Xception: Depthwise-separable convolutions for efficient photometric classification.
- Allam Jr T, Peloton J and McEwen JD (2023) The tiny time-series transformer: Low-latency high-throughput classification of astronomical transients using deep model compression. arXiv preprint arXiv:2303.08951. Available at: https://arxiv.org/abs/2303.08951.
- Alves CS, Peiris HV, Lochner M, et al. (2022) Considerations for optimizing the photometric classification of supernovae from the Rubin Observatory. The Astrophysical Journal Supplement Series 258(2). IOP Publishing: 23.
- Allam Jr T and McEwen JD (2021) Paying attention to astronomical transients: Photometric classification with the time-series transformer. arXiv preprint arXiv:2105.06178. Available at: https://arxiv.org/abs/2105.06178.
- Allam T, Bohdal O, Dong N, et al. (2020) Semantic segmentation of 3D point clouds. Zenodo.
- Ponder K, Hlozek R, Allam T, Bahmanyar A, et al. (2020) The photometric LSST astronomical time series classification challenge (PLAsTiCC): Final results. AAS: 203-15.
- Malz A, Hložek R, Allam T, et al. (2019) The photometric LSST astronomical time-series classification challenge PLAsTiCC: Selection of a performance metric for classification probabilities balancing diverse science goals. The Astronomical Journal 158(5). IOP Publishing: 171.
- Allam Jr T, Biswas R, Hlozek R, et al. (2019) Optimising the LSST observing strategy for supernova light curve classification with machine learning.
- Hlozek R, Kessler R, Allam T, et al. (2019) The photometric LSST astronomical time series classification challenge (PLAsTiCC). AAS 233: 212-01.
- The PLAsTiCC team, Allam Jr T, Bahmanyar A, et al. (2018) The photometric LSST astronomical time-series classification challenge (PLAsTiCC): Data set. arXiv preprint arXiv:1810.00001.
- Allam Jr T (2016) Radio interferometric image reconstruction for the SKA: A deep learning approach. Master's thesis. University College London. Available at: https://tarekallamjr.com/assets/tarekallam-msc-thesis.pdf.