Unsupervised Domain Adaptation for Constraining Star Formation Histories
Abstract
1. Introduction
2. Data
3. Methodology
- First, we create three sets of experiments to ensure a comprehensive evaluation across all simulations. In each set, we use galaxies from two of the three simulations (simba, illustristng, and eagle) as our training and validation sets, and galaxies from the remaining simulation as our test set. This design emulates the real-world scenario in which the ground truth is unknown and we must rely on models trained on different but related data. We split the training and validation sets 9:1, a conventional ratio in machine-learning practice that devotes most of the data to training while reserving enough to validate the model's performance effectively. The choice is somewhat arbitrary: other split ratios or more elaborate cross-validation schemes might affect performance, but such variations are beyond the scope of this initial study and are earmarked for future exploration. Our aim here is to establish a baseline for how well models trained on one set of simulated galaxies generalize to another.
- Second, we normalize each SFH time series (each row of Y) by its sum and store the resultant normalized SFH (SFHnorm) and the sum (SFHsum) separately. This normalization is crucial given the vast dynamic range of star formation histories observed across different galaxies (see Appendix D): SFH curves span a large variety of scales, with some rising above 100 while others remain far below that. Without normalization, galaxies with intrinsically high star formation rates could dominate the learning process, overshadowing subtler trends in galaxies with lower rates. By scaling each SFH time series, we help the model learn the underlying patterns in the SFHs independent of their absolute scale; learning the SFHs is thereby decoupled into learning SFHnorm and SFHsum. However, this approach also introduces considerations that must be acknowledged. First, the normalization step assumes that the shape of the SFH curve is more informative than its absolute scale, which may not always hold, especially if the scale itself carries significant astrophysical information. Second, it may amplify the noise in SFHs with very low sums, potentially impairing the model's ability to learn meaningful patterns from those histories. Third, decoupling SFHnorm from SFHsum assumes that the two components are independent, which may not fully capture the complexities of galaxy evolution, where the total star formation and its distribution over time can be interrelated. While this normalization standardizes the range of the targets and thereby eases learning, these caveats should be kept in mind when interpreting the model's outputs and when refining the approach in future work.
- Third, we further ease the learning of SFHnorm by reducing the curves to their first 3 Kernel-PCA (kernel principal component analysis) components [56,57]. This choice is predicated on a series of comparative analyses in which Kernel-PCA preserved the essential characteristics of the SFH time series better than linear PCA [58,59] and the discrete wavelet transform (DWT) [60,61]. Our evaluation focuses on each method's capacity to reconstruct the original time series while maintaining its significant features. Reconstruction fidelity is quantified with the DILATE loss (DIstortion Loss including shApe and TimE) [62], which combines dynamic time warping (DTW) with the temporal distortion index (TDI) to measure the similarity between the original and reconstructed time series. In our experiments, Kernel-PCA consistently achieves lower DILATE loss than its counterparts, indicating that it retains more of the original series' structural and temporal integrity; in particular, it captures non-linear patterns within the SFH data more robustly, a key requirement given the complex nature of these time series. The number of components (3) is selected by a variance-explained criterion on the validation set, balancing dimensionality reduction against information retention. This process and its outcomes are further detailed in Appendix A and visually represented in Figure A1, where we compare the reconstruction fidelity of SFH time series across methods, clearly illustrating the advantage of Kernel-PCA in our specific application context.
- Fourth, we normalize the input features (the columns of X) by log-scaling, followed by standard scaling. In Figure 2, we visualize the log-scaled flux densities, in units of janskys, for all three simulations in 3 of the 20 filters.
- Finally, we derive KLIEP weights for the training samples in all three experiments. The Kullback–Leibler Importance Estimation Procedure (KLIEP) [16] adjusts the weights of the training samples so as to minimize the Kullback–Leibler (KL) divergence between the source and target distributions. This instance-based domain adaptation approach is chosen for its robustness on regression domain adaptation problems and its effectiveness in mitigating negative transfer [2,13,63,64].
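The preprocessing steps above (train/validation split, SFH normalization, Kernel-PCA compression, and feature scaling) can be sketched as follows. The arrays are toy stand-ins, and the RBF kernel and variable names are illustrative assumptions rather than the paper's exact configuration:

```python
import numpy as np
from sklearn.decomposition import KernelPCA
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Toy stand-ins for the real catalogs: X holds broadband flux densities
# (galaxies x filters), Y holds binned SFHs (galaxies x time bins).
X = rng.lognormal(size=(500, 20))
Y = rng.lognormal(size=(500, 29))

# Step 1: 9:1 train/validation split (the test set would come from the
# held-out third simulation, so it is not drawn from X here).
X_tr, X_val, Y_tr, Y_val = train_test_split(X, Y, test_size=0.1, random_state=0)

# Step 2: decouple each SFH into its normalized shape and its total.
sfh_sum = Y_tr.sum(axis=1, keepdims=True)
sfh_norm = Y_tr / sfh_sum

# Step 3: compress the normalized SFH shapes to 3 Kernel-PCA components.
# fit_inverse_transform=True enables the reconstructions used in the
# DILATE-based comparison described in the text; the RBF kernel is assumed.
kpca = KernelPCA(n_components=3, kernel="rbf", fit_inverse_transform=True)
sfh_codes = kpca.fit_transform(sfh_norm)

# Step 4: log-scale, then standard-scale the input fluxes.
scaler = StandardScaler()
X_tr_scaled = scaler.fit_transform(np.log10(X_tr))
X_val_scaled = scaler.transform(np.log10(X_val))

print(sfh_codes.shape)  # (450, 3)
```

The regression model would then map `X_tr_scaled` to `sfh_codes` and `sfh_sum`, inverting the Kernel-PCA codes and rescaling by the predicted sum to recover full SFHs.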
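The KLIEP reweighting itself can be sketched with a minimal NumPy implementation on a toy mean-shifted Gaussian pair. The kernel width, learning rate, and number of centers below are illustrative hyperparameters, not the values used in our experiments; in practice a maintained implementation such as the adapt package linked in the endnotes would be used:

```python
import numpy as np

def kliep_weights(Xs, Xt, n_centers=50, sigma=1.0, lr=1e-4, n_iter=2000, seed=0):
    """Minimal KLIEP sketch: model the importance weight as
    w(x) = K(x, C) @ alpha and maximize the target log-likelihood subject
    to the constraint that the weights average to 1 over the source sample."""
    rng = np.random.default_rng(seed)
    # Kernel centers are drawn from the target sample, as in Sugiyama et al.
    C = Xt[rng.choice(len(Xt), size=min(n_centers, len(Xt)), replace=False)]

    def rbf(A, B):
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2 * sigma**2))

    A = rbf(Xt, C)               # target-vs-centers kernel matrix
    K_s = rbf(Xs, C)             # source-vs-centers kernel matrix
    b = K_s.mean(axis=0)         # source normalization vector

    alpha = np.ones(C.shape[0])
    for _ in range(n_iter):
        alpha += lr * A.T @ (1.0 / (A @ alpha))  # ascend target log-likelihood
        alpha += b * (1 - b @ alpha) / (b @ b)   # project onto the constraint
        alpha = np.maximum(alpha, 0)             # keep weights non-negative
        alpha /= b @ alpha                       # renormalize after clipping
    return K_s @ alpha

# Source and target differ by a mean shift; source points that resemble
# target points should receive larger weights.
rng = np.random.default_rng(1)
Xs = rng.normal(0.0, 1.0, size=(300, 1))
Xt = rng.normal(1.0, 1.0, size=(300, 1))
w = kliep_weights(Xs, Xt)
print(w.mean())  # 1.0 by construction of the constraint
```

A downstream regressor would then be fit with `sample_weight=w`, so that training emphasizes source galaxies resembling the target simulation.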
4. Results
- The SFH predictions for individual galaxies within the test set are compared against their corresponding true SFHs. This direct comparison provides a granular assessment of the model’s predictive accuracy on a per-galaxy basis.
- The aggregate predicted star formation history, i.e., the sum of SFHs across all galaxies within a simulation, is compared to the true aggregate SFH. The true aggregate SFH serves as an essential benchmark, reflecting the cumulative star formation activity and validating the simulation's input physics. Observational evidence indicates that the universe's star formation peaked approximately 2 billion years after its inception [55], a critical detail that all hydrodynamical simulations must replicate to be considered accurate. Ensuring concordance between the aggregate SFH derived from our model and that from established simulations underpins the robustness of our approach. This alignment not only corroborates our model's efficacy but also guides the fine-tuning of its architecture, loss functions, and hyperparameters in response to any discrepancies. Additionally, this comparison sheds light on potential systematic biases when the model is trained on one simulation and tested on another.
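As a hedged illustration of the two comparison levels, the sketch below scores synthetic predicted SFHs against synthetic truths, per galaxy and in aggregate. The arrays and the 10% scatter are invented for the example; the BE and TDI metrics reported in the appendix tables would slot in alongside these simpler scores, and DTW is shown in its classic dynamic-programming form:

```python
import numpy as np

def dtw_distance(a, b):
    """Classic dynamic-programming DTW between two 1-D series."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

# Toy predicted vs. true SFHs for a set of test galaxies
# (galaxies x time bins); values are illustrative only.
rng = np.random.default_rng(2)
sfh_true = rng.lognormal(size=(100, 29))
sfh_pred = sfh_true * rng.normal(1.0, 0.1, size=sfh_true.shape)

# Per-galaxy comparison: one error score per galaxy.
rmse = np.sqrt(((sfh_pred - sfh_true) ** 2).mean(axis=1))
mae = np.abs(sfh_pred - sfh_true).mean(axis=1)

# Aggregate comparison: sum the SFHs over all galaxies and compare the
# two cumulative histories bin by bin; per-galaxy noise averages down.
agg_true = sfh_true.sum(axis=0)
agg_pred = sfh_pred.sum(axis=0)
agg_frac_err = np.abs(agg_pred - agg_true) / agg_true

print(rmse.shape, mae.shape)  # (100,) (100,)
print(agg_frac_err.mean())
```

The aggregate fractional error is typically much smaller than the per-galaxy errors, which is why the summed SFH is a useful sanity check on the model's overall calibration.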
5. Future Work
Author Contributions
Funding
Conflicts of Interest
Appendix A. Selection of Time Series Reduction Method
Appendix B. KLIEP Reweighing
Appendix C. Extended Results
[Six tables reporting RMSE (↓), MAE (↓), BE (↓), DTW (↓), and TDI (↓) for the Base model and the KLIEP, MDD, DANN, DeepCORAL, KMM, and CORAL adaptation methods; the numerical entries are missing from the source.]
Appendix D. Predictions for Individual SFHs
1. The observed CSFRD (especially at ) is still a topic of active research, as the amount of star formation obscured by dust in the early universe is still unknown.
2. A representative sampling looks like this: (0.0, 0.48), (0.48, 0.95), (0.95, 1.43), (1.43, 1.91), (1.91, 2.38), (2.38, 2.86), (2.86, 3.34), (3.34, 3.81), (3.81, 4.29), (4.29, 4.77), (4.77, 5.24), (5.24, 5.72), (5.72, 6.2), (6.2, 6.67), (6.67, 7.15), (7.15, 7.63), (7.63, 8.1), (8.1, 8.58), (8.58, 9.05), (9.05, 9.53), (9.53, 10.01), (10.01, 10.48), (10.48, 10.96), (10.96, 11.44), (11.44, 11.91), (11.91, 12.39), (12.39, 12.87), (12.87, 13.34), (13.34, 13.82), where all the values are in gigayears (Gyr).
3. https://github.com/antoinedemathelin/adapt (accessed on 6 April 2024).
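The bin edges listed in endnote 2 appear consistent with 29 equal-width lookback-time bins spanning 0 to 13.82 Gyr; this is an inference from the listed values, not a statement from the text, but the following sketch reproduces the pairs to two decimals:

```python
import numpy as np

# 30 equally spaced edges over [0, 13.82] Gyr give 29 bins of ~0.477 Gyr,
# matching the (rounded) pairs listed in endnote 2.
edges = np.round(np.linspace(0.0, 13.82, 30), 2)
bins = [(float(a), float(b)) for a, b in zip(edges[:-1], edges[1:])]
print(bins[0], bins[-1])  # (0.0, 0.48) (13.34, 13.82)
```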
References
- Dai, W.; Yang, Q.; Xue, G.R.; Yu, Y. Boosting for Transfer Learning. In Proceedings of the 24th International Conference on Machine Learning, Vienna, Austria, 21–27 July 2007; Volume 227, pp. 193–200. [Google Scholar] [CrossRef]
- de Mathelin, A.; Richard, G.; Mougeot, M.; Vayatis, N. Adversarial weighting for domain adaptation in regression. arXiv 2020, arXiv:2006.08251. [Google Scholar]
- Motiian, S.; Jones, Q.; Iranmanesh, S.M.; Doretto, G. Few-Shot Adversarial Domain Adaptation. In Proceedings of the 31st International Conference on Neural Information Processing Systems, Red Hook, NY, USA, 4–9 December 2017; pp. 6673–6683. [Google Scholar]
- Motiian, S.; Piccirilli, M.; Adjeroh, D.A.; Doretto, G. Unified deep supervised domain adaptation and generalization. In Proceedings of the IEEE International Conference on Computer Vision, Venice, Italy, 22–29 October 2017; pp. 5715–5725. [Google Scholar]
- Liu, J.; Xuan, W.; Gan, Y.; Zhan, Y.; Liu, J.; Du, B. An End-to-end Supervised Domain Adaptation Framework for Cross-Domain Change Detection. Pattern Recognit. 2022, 132, 108960. [Google Scholar] [CrossRef]
- Hedegaard, L.; Sheikh-Omar, O.A.; Iosifidis, A. Supervised domain adaptation: A graph embedding perspective and a rectified experimental protocol. IEEE Trans. Image Process. 2021, 30, 8619–8631. [Google Scholar] [CrossRef]
- Kumar, A.; Saha, A.; Daume, H. Co-regularization based semi-supervised domain adaptation. Adv. Neural Inf. Process. Syst. 2010, 23, 478–486. [Google Scholar]
- Saito, K.; Kim, D.; Sclaroff, S.; Darrell, T.; Saenko, K. Semi-Supervised Domain Adaptation via Minimax Entropy. In Proceedings of the 2019 IEEE/CVF International Conference on Computer Vision (ICCV), Seoul, Republic of Korea, 27 October–2 November 2019; pp. 8049–8057. [Google Scholar]
- Tzeng, E.; Hoffman, J.; Darrell, T.; Saenko, K. Simultaneous Deep Transfer Across Domains and Tasks. In Proceedings of the 2015 IEEE International Conference on Computer Vision (ICCV), Santiago, Chile, 7–13 December 2015; pp. 4068–4076. [Google Scholar]
- Daumé, H., III; Kumar, A.; Saha, A. Frustratingly easy semi-supervised domain adaptation. In Proceedings of the 2010 Workshop on Domain Adaptation for Natural Language Processing, Uppsala, Sweden, 15 July 2010; pp. 53–59. [Google Scholar]
- Li, K.; Liu, C.; Zhao, H.; Zhang, Y.; Fu, Y. Ecacl: A holistic framework for semi-supervised domain adaptation. In Proceedings of the IEEE/CVF International Conference on Computer Vision, Virtual, 11–17 October 2021; pp. 8578–8587. [Google Scholar]
- Zhou, K.; Ziwei, L.; Yu, Q.; Xiang, T.; Loy, C.C. Domain Generalization: A Survey. IEEE Trans. Pattern Anal. Mach. Intell. 2023, 45, 4396–4415. [Google Scholar] [CrossRef]
- Huang, J.; Gretton, A.; Borgwardt, K.; Schölkopf, B.; Smola, A.J. Correcting Sample Selection Bias by Unlabeled Data. In Advances in Neural Information Processing Systems 19; Schölkopf, B., Platt, J.C., Hoffman, T., Eds.; MIT Press: Cambridge, MA, USA, 2007; pp. 601–608. [Google Scholar]
- Richard, G.; de Mathelin, A.; Hébrail, G.; Mougeot, M.; Vayatis, N. Unsupervised Multi-source Domain Adaptation for Regression. In Lecture Notes in Computer Science, Proceedings of the Machine Learning and Knowledge Discovery in Databases—European Conference, ECML PKDD 2020, Ghent, Belgium, 14–18 September 2020; Proceedings, Part I; Springer: Berlin/Heidelberg, Germany, 2020; Volume 12457, pp. 395–411. [Google Scholar]
- Saito, K.; Watanabe, K.; Ushiku, Y.; Harada, T. Maximum classifier discrepancy for unsupervised domain adaptation. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Salt Lake City, UT, USA, 18–23 June 2018; pp. 3723–3732. [Google Scholar]
- Sugiyama, M.; Nakajima, S.; Kashima, H.; Bünau, P.v.; Kawanabe, M. Direct Importance Estimation with Model Selection and Its Application to Covariate Shift Adaptation. In Proceedings of the 20th International Conference on Neural Information Processing Systems, NIPS’07, Red Hook, NY, USA, 3–6 December 2007; pp. 1433–1440. [Google Scholar]
- Cortes, C.; Mohri, M.; Medina, A.M.n. Adaptation Based on Generalized Discrepancy. J. Mach. Learn. Res. 2019, 20, 1–30. [Google Scholar]
- Robotham, A.S.G.; Bellstedt, S.; Lagos, C.d.P.; Thorne, J.E.; Davies, L.J.; Driver, S.P.; Bravo, M. ProSpect: Generating spectral energy distributions with complex star formation and metallicity histories. Mon. Not. R. Astron. Soc. 2020, 495, 905–931. [Google Scholar] [CrossRef]
- da Cunha, E.; Charlot, S.; Elbaz, D. A simple model to interpret the ultraviolet, optical and infrared emission from galaxies. Mon. Not. R. Astron. Soc. 2008, 388, 1595–1617. [Google Scholar] [CrossRef]
- Da Cunha, E.; Charlot, S.; Dunne, L.; Smith, D.; Rowlands, K. MAGPHYS: A publicly available tool to interpret observed galaxy SEDs. Proc. Int. Astron. Union 2011, 7, 292–296. [Google Scholar] [CrossRef]
- Noll, S.; Burgarella, D.; Giovannoli, E.; Buat, V.; Marcillac, D.; Muñoz-Mateos, J.C. Analysis of galaxy spectral energy distributions from far-UV to far-IR with CIGALE: Studying a SINGS test sample. Astron. Astrophys. 2009, 507, 1793–1813. [Google Scholar] [CrossRef]
- Boquien, M.; Burgarella, D.; Roehlly, Y.; Buat, V.; Ciesla, L.; Corre, D.; Inoue, A.K.; Salas, H. CIGALE: A Python code investigating galaxy emission. Astron. Astrophys. 2019, 622, A103. [Google Scholar] [CrossRef]
- Carnall, A.C.; McLure, R.J.; Dunlop, J.S.; Davé, R. Inferring the star formation histories of massive quiescent galaxies with BAGPIPES: Evidence for multiple quenching mechanisms. Mon. Not. R. Astron. Soc. 2018, 480, 4379–4401. [Google Scholar] [CrossRef]
- Crocker, D.L.; French, K.D.; Tripathi, A.; Verrico, M.E. Modeling Star Formation Histories of Post Starburst Galaxies with BAGPIPES. Res. Notes AAS 2023, 7, 183. [Google Scholar] [CrossRef]
- Johnson, B.; Leja, J. Bd-J/Prospector: Initial Release; Zenodo: Geneva, Switzerland, 2017. [Google Scholar] [CrossRef]
- Johnson, B.D.; Leja, J.; Conroy, C.; Speagle, J.S. Stellar population inference with Prospector. Astrophys. J. Suppl. Ser. 2021, 254, 22. [Google Scholar] [CrossRef]
- Dattilo, A.; Vanderburg, A.; Shallue, C.J.; Mayo, A.W.; Berlind, P.; Bieryla, A.; Calkins, M.; Esquerdo, G.A.; Everett, M.E.; Howell, S.B.; et al. Identifying exoplanets with deep learning. ii. Two new super-earths uncovered by a neural network in K2 data. Astron. J. 2019, 157, 169. [Google Scholar] [CrossRef]
- Jara-Maldonado, M.; Alarcon-Aquino, V.; Rosas-Romero, R.; Starostenko, O.; Ramirez-Cortes, J.M. Transiting exoplanet discovery using machine learning techniques: A survey. Earth Sci. Inform. 2020, 13, 573–600. [Google Scholar] [CrossRef]
- Zucker, S.; Giryes, R. Shallow transits—Deep learning. I. Feasibility study of deep learning to detect periodic transits of exoplanets. Astron. J. 2018, 155, 147. [Google Scholar] [CrossRef]
- Caldeira, J.; Wu, W.K.; Nord, B.; Avestruz, C.; Trivedi, S.; Story, K.T. DeepCMB: Lensing reconstruction of the cosmic microwave background with deep neural networks. Astron. Comput. 2019, 28, 100307. [Google Scholar] [CrossRef]
- Escamilla-Rivera, C.; Carvajal, M.A.; Capozziello, S. A deep learning approach to cosmological dark energy models. J. Cosmol. Astropart. Phys. 2020, 2020, 8. [Google Scholar] [CrossRef]
- Hortúa, H.J.; Volpi, R.; Marinelli, D.; Malagò, L. Parameter estimation for the cosmic microwave background with Bayesian neural networks. Phys. Rev. D 2020, 102, 103509. [Google Scholar] [CrossRef]
- Cuoco, E.; Powell, J.; Cavaglià, M.; Ackley, K.; Bejger, M.; Chatterjee, C.; Coughlin, M.; Coughlin, S.; Easter, P.; Essick, R.; et al. Enhancing gravitational-wave science with machine learning. Mach. Learn. Sci. Technol. 2020, 2, 011002. [Google Scholar] [CrossRef]
- Bayley, J.; Messenger, C.; Woan, G. Robust machine learning algorithm to search for continuous gravitational waves. Phys. Rev. D 2020, 102, 083024. [Google Scholar] [CrossRef]
- Schäfer, M.B.; Ohme, F.; Nitz, A.H. Detection of gravitational-wave signals from binary neutron star mergers using machine learning. Phys. Rev. D 2020, 102, 063015. [Google Scholar] [CrossRef]
- Cavanagh, M.K.; Bekki, K.; Groves, B.A. Morphological classification of galaxies with deep learning: Comparing 3-way and 4-way CNNs. Mon. Not. R. Astron. Soc. 2021, 506, 659–676. [Google Scholar] [CrossRef]
- Barchi, P.H.; de Carvalho, R.R.; Rosa, R.R.; Sautter, R.A.; Soares-Santos, M.; Marques, B.A.D.; Clua, E.; Gonçalves, T.S.; de Sá-Freitas, C.; Moura, T.C. Machine and Deep Learning applied to galaxy morphology—A comparative study. Astron. Comput. 2020, 30, 100334. [Google Scholar] [CrossRef]
- D’Isanto, A.; Polsterer, K.L. Photometric redshift estimation via deep learning-generalized and pre-classification-less, image based, fully probabilistic redshifts. Astron. Astrophys. 2018, 609, A111. [Google Scholar] [CrossRef]
- Hoyle, B. Measuring photometric redshifts using galaxy images and Deep Neural Networks. Astron. Comput. 2016, 16, 34–40. [Google Scholar] [CrossRef]
- Gilda, S.; Ting, Y.-S.; Withington, K.; Wilson, M.; Prunet, S.; Mahoney, W.; Fabbro, S.; Draper, S.C.; Sheinis, A. Astronomical Image Quality Prediction based on Environmental and Telescope Operating Conditions. arXiv 2020, arXiv:2011.03132. [Google Scholar]
- Gilda, S.; Draper, S.C.; Fabbro, S.; Mahoney, W.; Prunet, S.; Withington, K.; Wilson, M.; Ting, Y.-S.; Sheinis, A. Uncertainty-aware learning for improvements in image quality of the Canada–France–Hawaii Telescope. Mon. Not. R. Astron. Soc. 2022, 510, 870–902. [Google Scholar] [CrossRef]
- Dainotti, M.; Petrosian, V.; Bogdan, M.; Miasojedow, B.; Nagataki, S.; Hastie, T.; Nuyngen, Z.; Gilda, S.; Hernandez, X.; Krol, D. Gamma-ray Bursts as distance indicators through a machine learning approach. arXiv 2019, arXiv:1907.05074. [Google Scholar]
- Ukwatta, T.N.; Woźniak, P.R.; Gehrels, N. Machine-z: Rapid machine-learned redshift indicator for Swift gamma-ray bursts. Mon. Not. R. Astron. Soc. 2016, 458, 3821–3829. [Google Scholar] [CrossRef]
- Gilda, S. deep-REMAP: Parameterization of Stellar Spectra Using Regularized Multi-Task Learning. arXiv 2023, arXiv:2311.03738. [Google Scholar]
- Ramachandra, N.; Chaves-Montero, J.; Alarcon, A.; Fadikar, A.; Habib, S.; Heitmann, K. Machine learning synthetic spectra for probabilistic redshift estimation: SYTH-Z. Mon. Not. R. Astron. Soc. 2022, 515, 1927–1941. [Google Scholar] [CrossRef]
- Gilda, S.; Lower, S.; Narayanan, D. mirkwood: Fast and Accurate SED Modeling Using Machine Learning. Astrophys. J. 2021, 916, 43. [Google Scholar] [CrossRef]
- Gilda, S. Beyond mirkwood: Enhancing SED Modeling with Conformal Predictions. Astronomy 2024, 3, 14–20. [Google Scholar] [CrossRef]
- Schaye, J.; Crain, R.A.; Bower, R.G.; Furlong, M.; Schaller, M.; Theuns, T.; Dalla Vecchia, C.; Frenk, C.S.; McCarthy, I.G.; Helly, J.C.; et al. The EAGLE project: Simulating the evolution and assembly of galaxies and their environments. Mon. Not. R. Astron. Soc. 2015, 446, 521–554. [Google Scholar] [CrossRef]
- Nelson, D.; Pillepich, A.; Springel, V.; Weinberger, R.; Hernquist, L.; Pakmor, R.; Genel, S.; Torrey, P.; Vogelsberger, M.; Kauffmann, G.; et al. First results from the IllustrisTNG simulations: The galaxy colour bimodality. Mon. Not. R. Astron. Soc. 2018, 475, 624–647. [Google Scholar] [CrossRef]
- Pillepich, A.; Nelson, D.; Hernquist, L.; Springel, V.; Pakmor, R.; Torrey, P.; Weinberger, R.; Genel, S.; Naiman, J.P.; Marinacci, F.; et al. First results from the IllustrisTNG simulations: The stellar mass content of groups and clusters of galaxies. Mon. Not. R. Astron. Soc. 2018, 475, 648–675. [Google Scholar] [CrossRef]
- Davé, R.; Anglés-Alcázar, D.; Narayanan, D.; Li, Q.; Rafieferantsoa, M.H.; Appleby, S. SIMBA: Cosmological simulations with black hole growth and feedback. Mon. Not. R. Astron. Soc. 2019, 486, 2827–2849. [Google Scholar] [CrossRef]
- Schaller, M.; Dalla Vecchia, C.; Schaye, J.; Bower, R.G.; Theuns, T.; Crain, R.A.; Furlong, M.; McCarthy, I.G. The EAGLE simulations of galaxy formation: The importance of the hydrodynamics scheme. Mon. Not. R. Astron. Soc. 2015, 454, 2277–2291. [Google Scholar] [CrossRef]
- McAlpine, S.; Helly, J.C.; Schaller, M.; Trayford, J.W.; Qu, Y.; Furlong, M.; Bower, R.G.; Crain, R.A.; Schaye, J.; Theuns, T.; et al. The EAGLE simulations of galaxy formation: Public release of halo and galaxy catalogues. Astron. Comput. 2016, 15, 72–89. [Google Scholar] [CrossRef]
- Vogelsberger, M.; Genel, S.; Springel, V.; Torrey, P.; Sijacki, D.; Xu, D.; Snyder, G.; Nelson, D.; Hernquist, L. Introducing the Illustris Project: Simulating the coevolution of dark and visible matter in the Universe. Mon. Not. R. Astron. Soc. 2014, 444, 1518–1547. [Google Scholar] [CrossRef]
- Madau, P.; Dickinson, M. Cosmic Star-Formation History. Annu. Rev. Astron. Astrophys. 2014, 52, 415–486. [Google Scholar] [CrossRef]
- Schölkopf, B.; Burges, C.J.; Smola, A.J. (Eds.) Advances in Kernel Methods: Support Vector Learning; MIT Press: Cambridge, MA, USA, 1999. [Google Scholar]
- Mika, S.; Schölkopf, B.; Smola, A.; Müller, K.-R.; Scholz, M.; Rätsch, G. Kernel PCA and de-noising in feature spaces. Adv. Neural Inf. Process. Syst. 1998, 11. [Google Scholar]
- Jolliffe, I.T.; Cadima, J. Principal component analysis: A review and recent developments. Philos. Trans. R. Soc. Lond. Ser. A 2016, 374, 20150202. [Google Scholar] [CrossRef]
- Greenacre, M.; Groenen, P.J.F.; Hastie, T.; d’Enza, A.I.; Markos, A.; Tuzhilina, E. Principal component analysis. Nat. Rev. Methods Prim. 2022, 2, 100. [Google Scholar] [CrossRef]
- Shensa, M.J. The discrete wavelet transform: Wedding the a trous and Mallat algorithms. IEEE Trans. Signal Process. 1992, 40, 2464–2482. [Google Scholar] [CrossRef]
- Heil, C.E.; Walnut, D.F. Continuous and discrete wavelet transforms. SIAM Rev. 1989, 31, 628–666. [Google Scholar] [CrossRef]
- Le Guen, V.; Thome, N. Deep Time Series Forecasting With Shape and Temporal Criteria. IEEE Trans. Pattern Anal. Mach. Intell. 2023, 45, 342–355. [Google Scholar] [CrossRef]
- Cortes, C.; Mohri, M. Domain adaptation and sample bias correction theory and algorithm for regression. Theor. Comput. Sci. 2014, 519, 103–126. [Google Scholar] [CrossRef]
- Mansour, Y.; Mohri, M.; Rostamizadeh, A. Domain Adaptation: Learning Bounds and Algorithms. In Proceedings of the 22nd Annual Conference on Learning Theory (COLT 2009), Montreal, QC, Canada, 18–21 June 2009. [Google Scholar]
- Kingma, D.P.; Ba, J. Adam: A Method for Stochastic Optimization. In Proceedings of the 3rd International Conference on Learning Representations, ICLR 2015, San Diego, CA, USA, 7–9 May 2015; Conference Track Proceedings. Bengio, Y., LeCun, Y., Eds.; ArXiv: Ithaca, NY, USA, 2015. [Google Scholar]
- Bellstedt, S.; Robotham, A.S.G.; Driver, S.P.; Thorne, J.E.; Davies, L.J.M.; Lagos, C.d.P.; Stevens, A.R.H.; Taylor, E.N.; Baldry, I.K.; Moffett, A.J.; et al. Galaxy And Mass Assembly (GAMA): A forensic SED reconstruction of the cosmic star formation history and metallicity evolution by galaxy type. Mon. Not. R. Astron. Soc. 2020, 498, 5581–5603. [Google Scholar] [CrossRef]
- Lagos, C.d.P.; Tobar, R.J.; Robotham, A.S.G.; Obreschkow, D.; Mitchell, P.D.; Power, C.; Elahi, P.J. Shark: Introducing an open source, free, and flexible semi-analytic model of galaxy formation. Mon. Not. R. Astron. Soc. 2018, 481, 3573–3603. [Google Scholar] [CrossRef]
- Zhang, Y.; Liu, T.; Long, M.; Jordan, M. Bridging theory and algorithm for domain adaptation. In Proceedings of the 36th International Conference on Machine Learning, Long Beach, CA, USA, 9–15 June 2019; pp. 7404–7413. [Google Scholar]
- Ganin, Y.; Ustinova, E.; Ajakan, H.; Germain, P.; Larochelle, H.; Laviolette, F.; March, M.; Lempitsky, V. Domain-adversarial training of neural networks. J. Mach. Learn. Res. 2016, 17, 2096–2130. [Google Scholar]
- Huang, J.; Smola, A.; Gretton, A.; Borgwardt, K.; Scholkopf, B. Correcting sample selection bias by unlabeled data. In Proceedings of the 19th International Conference on Neural Information Processing Systems, Vancouver, BC, Canada, 4–9 December 2006; pp. 601–608. [Google Scholar]
- Sun, B.; Feng, J.; Saenko, K. Return of frustratingly easy domain adaptation. In Proceedings of the AAAI Conference on Artificial Intelligence, Phoenix, AZ, USA, 12–17 February 2016; Volume 30. No. 1. [Google Scholar]
- Sun, B.; Saenko, K. Deep coral: Correlation alignment for deep domain adaptation. In Proceedings of the Computer Vision–ECCV 2016 Workshops, Amsterdam, The Netherlands, 11–14 October 2016; Proceedings, Part III 14. pp. 443–450. [Google Scholar]
- Zhao, H.; Zhang, S.; Wu, G.; Moura, J.M.F.; Costeira, J.P.; Gordon, G.J. Adversarial Multiple Source Domain Adaptation. In Proceedings of the Advances in Neural Information Processing Systems 31, Montreal, QC, Canada, 3–8 December 2018; Bengio, S., Wallach, H., Larochelle, H., Grauman, K., Cesa-Bianchi, N., Garnett, R., Eds.; Curran Associates, Inc.: Nice, France, 2018; pp. 8559–8570. [Google Scholar]
- Richard, G.; Mathelin, A.; Hébrail, G.; Mougeot, M.; Vayatis, N. Unsupervised Multi-source Domain Adaptation for Regression. In Proceedings of the European Conference, ECML PKDD 2020, Ghent, Belgium, 14–18 September 2020. pp. 395–411. [CrossRef]
- Salpeter, E.E. The Luminosity Function and Stellar Evolution. Astrophys. J. 1955, 121, 161. [Google Scholar] [CrossRef]
- Kroupa, P. On the variation of the initial mass function. Mon. Not. R. Astron. Soc. 2001, 322, 231–246. [Google Scholar] [CrossRef]
- Chabrier, G. Galactic Stellar and Substellar Initial Mass Function. Publ. Astron. Soc. Pac. 2003, 115, 763–795. [Google Scholar] [CrossRef]
- Bruzual, G.; Charlot, S. Stellar population synthesis at the resolution of 2003. Mon. Not. R. Astron. Soc. 2003, 344, 1000–1028. [Google Scholar] [CrossRef]
- Tremonti, C.A.; Heckman, T.M.; Kauffmann, G.; Brinchmann, J.; Charlot, S.; White, S.D.M.; Seibert, M.; Peng, E.W.; Schlegel, D.J.; Uomoto, A.; et al. The Origin of the Mass-Metallicity Relation: Insights from 53,000 Star-forming Galaxies in the Sloan Digital Sky Survey. Astrophys. J. 2004, 613, 898–913. [Google Scholar] [CrossRef]
- Jimmy; Tran, K.V.; Saintonge, A.; Accurso, G.; Brough, S.; Oliva-Altamirano, P. The Gas Phase Mass Metallicity Relation for Dwarf Galaxies: Dependence on Star Formation Rate and HI Gas Mass. Astrophys. J. 2015, 812, 98. [Google Scholar] [CrossRef]
- Lara-Lopez, M.A.; Hopkins, A.M.; Lopez-Sanchez, A.R.; Brough, S.; Colless, M.; Bland-Hawthorn, J.; Driver, S.; Foster, C.; Liske, J.; Loveday, J.; et al. Galaxy and mass assembly (GAMA): The connection between metals, specific SFR and hi gas in galaxies: The Z-SSFR relation. Mon. Not. R. Astron. Soc. 2013, 433, L35–L39. [Google Scholar] [CrossRef]
[Two tables comparing Baseline vs. UDA for each test simulation (illustristng, eagle, simba), reported as RMSE (↓), MAE (↓), BE (↓), DTW (↓), and TDI (↓); the numerical entries are missing from the source.]
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Gilda, S.; de Mathelin, A.; Bellstedt, S.; Richard, G. Unsupervised Domain Adaptation for Constraining Star Formation Histories. Astronomy 2024, 3, 189-207. https://doi.org/10.3390/astronomy3030012