Keyword search (4,163 papers available)
"Neural networks" Keyword-tagged Publications:
| # | Title | Authors | PubMed ID | Dept |
|---|---|---|---|---|
| 1 | Tuning Deep Learning for Predicting Aluminum Prices Under Different Sampling: Bayesian Optimization Versus Random Search | Alicia Estefania Antonio Figueroa | 41751647 | CONCORDIA |
| 2 | Distinguishing Between Healthy and Unhealthy Newborns Based on Acoustic Features and Deep Learning Neural Networks Tuned by Bayesian Optimization and Random Search Algorithm | Lahmiri S; Tadj C; Gargour C | 41294952 | ENCS |
| 3 | Efficient neural encoding as revealed by bilingualism | Moore C; Donhauser PW; Klein D; Byers-Heinlein K | 40828024 | PSYCHOLOGY |
| 4 | Personalizing brain stimulation: continual learning for sleep spindle detection | Sobral M; Jourde HR; Marjani Bajestani SE; Coffey EBJ; Beltrame G | 40609549 | PSYCHOLOGY |
| 5 | Parallel boosting neural network with mutual information for day-ahead solar irradiance forecasting | Ahmed U; Mahmood A; Khan AR; Kuhlmann L; Alimgeer KS; Razzaq S; Aziz I; Hammad A | 40185800 | PHYSICS |
| 6 | Large language models deconstruct the clinical intuition behind diagnosing autism | Stanley J; Rabot E; Reddy S; Belilovsky E; Mottron L; Bzdok D | 40147442 | ENCS |
| 7 | MuscleMap: An Open-Source, Community-Supported Consortium for Whole-Body Quantitative MRI of Muscle | McKay MJ; Weber KA; Wesselink EO; Smith ZA; Abbott R; Anderson DB; Ashton-James CE; Atyeo J; Beach AJ; Burns J; Clarke S; Collins NJ; Coppieters MW; Cornwall J; Crawford RJ; De Martino E; Dunn AG; Eyles JP; Feng HJ; Fortin M; Franettovich Smith MM; Galloway G; Gandomkar Z; Glastras S; Henderson LA; Hides JA; Hiller CE; Hilmer SN; Hoggarth MA; Kim B; Lal N; LaPorta L; Magnussen JS; Maloney S; March L; Nackley AG; O'Leary SP; Peolsson A; Perraton Z; Pool-Goudzwaard AL; Schnitzler M; Seitz AL; Semciw AI; Sheard PW; Smith AC; Snodgrass SJ; Sullivan J; Tran V; Valentin S; Walton DM; Wishart LR; Elliott JM | 39590726 | HKAP |
| 8 | A protocol for trustworthy EEG decoding with neural networks | Borra D; Magosso E; Ravanelli M | 39549492 | ENCS |
| 9 | Near-optimal learning of Banach-valued, high-dimensional functions via deep neural networks | Adcock B; Brugiapaglia S; Dexter N; Moraga S | 39454372 | MATHSTATS |
| 10 | Deep neural network-based robotic visual servoing for satellite target tracking | Ghiasvand S; Xie WF; Mohebbi A | 39440297 | ENCS |
| 11 | Generalization limits of Graph Neural Networks in identity effects learning | D'Inverno GA; Brugiapaglia S; Ravanelli M | 39426036 | ENCS |
| 12 | The immunomodulatory effect of oral NaHCO3 is mediated by the splenic nerve: multivariate impact revealed by artificial neural networks | Alvarez MR; Alkaissi H; Rieger AM; Esber GR; Acosta ME; Stephenson SI; Maurice AV; Valencia LMR; Roman CA; Alarcon JM | 38549144 | CSBN |
| 13 | Reinforcement learning for automatic quadrilateral mesh generation: A soft actor-critic approach | Pan J; Huang J; Cheng G; Zeng Y | 36375347 | ENCS |
| 14 | Comparative Evaluation of Artificial Neural Networks and Data Analysis in Predicting Liposome Size in a Periodic Disturbance Micromixer | Ocampo I; López RR; Camacho-León S; Nerguizian V; Stiharu I | 34683215 | ENCS |
| 15 | X-Vectors: New Quantitative Biomarkers for Early Parkinson's Disease Detection From Speech | Jeancolas L; Petrovska-Delacrétaz D; Mangone G; Benkelfat BE; Corvol JC; Vidailhet M; Lehéricy S; Benali H | 33679361 | PERFORM |
| Title: | Parallel boosting neural network with mutual information for day-ahead solar irradiance forecasting |
| Authors: | Ahmed U, Mahmood A, Khan AR, Kuhlmann L, Alimgeer KS, Razzaq S, Aziz I, Hammad A |
| Link: | https://pubmed.ncbi.nlm.nih.gov/40185800/ |
| DOI: | 10.1038/s41598-025-95891-1 |
| Publication: | Scientific Reports |
| Keywords: | Dimensionality reduction; Integrated approach; Neural networks; Parallel computing; Solar irradiance forecasting |
| PMID: | 40185800 |
| Category: | |
| Date Added: | 2025-04-05 |
| Dept Affiliation: | PHYSICS |

Affiliations:

1. Department of Electrical Engineering, Mirpur University of Science and Technology (MUST), Mirpur, 10250, Pakistan.
2. James Watt School of Engineering, University of Glasgow, Glasgow, G128QQ, UK.
3. Department of Data Science and AI, Faculty of Information Technology, Monash University, Room 273, Woodside Building, Clayton Campus, Clayton, Australia.
4. Department of Electrical and Computer Engineering, COMSATS University Islamabad, Islamabad, 45550, Pakistan.
5. Faculty of Information and Technology, Majan University College, Muscat, Sultanate of Oman.
6. Department of Physics and Astronomy, Uppsala University, P.O. Box 75120, Uppsala, Sweden. imran.aziz@physics.uu.se
7. Concordia Institute for Information Systems Engineering, Concordia University, Montreal, QC, Canada.
Description:

The transition to sustainable energy has become imperative due to the depletion of fossil fuels. Solar energy presents a viable alternative owing to its abundance and environmental benefits. However, the intermittent nature of solar energy requires accurate forecasting of solar irradiance (SI) for the reliable operation of photovoltaic (PV)-integrated systems. Traditional deep learning (DL) models and decision tree (DT)-based algorithms have been widely employed for this purpose; however, DL models often demand substantial computational resources and large datasets, while DT algorithms lack generalizability. To address these limitations, this study proposes a novel parallel boosting neural network (PBNN) framework that integrates boosting algorithms with a feedforward neural network (FFNN). The framework leverages three boosting DT algorithms, Extreme Gradient Boosting (XGBoost), Categorical Boosting (CatBoost), and Random Forest (RF) regressors, as base learners operating in parallel. The intermediary forecasts from these base learners are concatenated and input to the FFNN, which assigns optimal weights to generate the final prediction. The proposed PBNN is trained and evaluated on two geographical datasets and compared with state-of-the-art techniques. The mutual information (MI) algorithm is implemented as a feature selection technique to identify the most important features for forecasting. Results demonstrate that, when trained with the selected features, the mean absolute percentage error (MAPE) of PBNN is improved by [Formula: see text] and [Formula: see text] for the Islamabad and San Diego datasets, respectively. Furthermore, a literature comparison of the PBNN is performed for robustness analysis. Source code and datasets are available at https://github.com/Ubaid014/Parallel-Boosting-Neural-Network/tree/main.
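The pipeline the abstract describes (MI-based feature selection, parallel base regressors, and a feedforward meta-network stacked on their concatenated forecasts) can be sketched roughly as follows. This is a minimal illustration on synthetic data, not the authors' implementation: scikit-learn's GradientBoostingRegressor and RandomForestRegressor stand in for the paper's XGBoost/CatBoost/RF base learners, and all hyperparameters here are illustrative assumptions. The authors' actual code is at the GitHub link above.

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor, RandomForestRegressor
from sklearn.feature_selection import mutual_info_regression
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in for the solar-irradiance data: 10 candidate
# features, 5 of which are informative.
X, y = make_regression(n_samples=500, n_features=10, n_informative=5,
                       noise=0.1, random_state=0)

# Mutual-information feature selection: keep the 5 highest-MI features.
mi = mutual_info_regression(X, y, random_state=0)
X_sel = X[:, np.argsort(mi)[-5:]]

X_tr, X_te, y_tr, y_te = train_test_split(X_sel, y, random_state=0)

# Base learners trained independently (hence parallelizable); the paper
# uses XGBoost, CatBoost, and RF.
bases = [GradientBoostingRegressor(random_state=0),
         RandomForestRegressor(random_state=0)]
for m in bases:
    m.fit(X_tr, y_tr)

# Concatenate the intermediary forecasts and feed them to an FFNN
# meta-learner, which learns how to weight the base predictions.
meta_tr = np.column_stack([m.predict(X_tr) for m in bases])
meta_te = np.column_stack([m.predict(X_te) for m in bases])
ffnn = make_pipeline(StandardScaler(),
                     MLPRegressor(hidden_layer_sizes=(16,),
                                  max_iter=3000, random_state=0))
ffnn.fit(meta_tr, y_tr)
pred = ffnn.predict(meta_te)
```

Because each base learner is fit independently on the same training split, the three fits can run concurrently; only the small meta-network depends on their outputs, which is where the "parallel" in PBNN comes from.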



