Showing 1 - 6 results of 6 for search 'Dead OR Alive Xtreme', query time: 0.05s
  1.

    Forecasting mental states in schizophrenia using digital phenotyping data by Thierry Jean, Rose Guay Hottin, Pierre Orban

    Published 2025-02-01
    “…Besides, it remains unclear which machine learning algorithm is best suited for forecast tasks, the eXtreme Gradient Boosting (XGBoost) and long short-term memory (LSTM) algorithms being two popular choices in digital phenotyping studies. …”
    Get full text
    Article
  2.

    Constructing a machine learning model for systemic infection after kidney stone surgery based on CT values by Jiaxin Li, Yao Du, Gaoming Huang, Yawei Huang, Xiaoqing Xi, Zhenfeng Ye

    Published 2025-02-01
    “…All five machine learning models demonstrated strong discrimination on the validation set (AUC: 0.690–0.858). The eXtreme Gradient Boosting (XGBoost) model was the best performer [AUC: 0.858; sensitivity: 0.877; specificity: 0.981; accuracy: 0.841; positive predictive value: 0.629; negative predictive value: 0.851]. …”
    Get full text
    Article
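The abstract above reports sensitivity, specificity, accuracy, and positive/negative predictive values alongside AUC; all but AUC follow directly from a confusion matrix. A minimal sketch with hypothetical counts (illustrative only, not the study's data):

```python
# Hedged sketch: derive the metrics named in the abstract from a
# hypothetical confusion matrix. Counts are invented for illustration.
tp, fn, fp, tn = 90, 10, 20, 80  # hypothetical true/false positives/negatives

sensitivity = tp / (tp + fn)                 # true positive rate
specificity = tn / (tn + fp)                 # true negative rate
accuracy = (tp + tn) / (tp + fn + fp + tn)
ppv = tp / (tp + fp)                         # positive predictive value
npv = tn / (tn + fn)                         # negative predictive value

print(sensitivity, specificity, accuracy, ppv, npv)
```

AUC, by contrast, is computed from the ranking of predicted scores rather than a single threshold, which is why it is reported separately.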
  3.

    Multiple PM Low-Cost Sensors, Multiple Seasons’ Data, and Multiple Calibration Models by S Srishti, Pratyush Agrawal, Padmavati Kulkarni, Hrishikesh Chandra Gautam, Meenakshi Kushwaha, V. Sreekanth

    Published 2023-02-01
    “…The ML models included (i) Decision Tree, (ii) Random Forest (RF), (iii) eXtreme Gradient Boosting, and (iv) Support Vector Regression (SVR). …”
    Get full text
    Article
  4.

    Using Deep Learning to Identify High-Risk Patients with Heart Failure with Reduced Ejection Fraction by Zhibo Wang, Xin Chen, Xi Tan, Lingfeng Yang, Kartik Kannapur, Justin L. Vincent, Garin N. Kessler, Boshu Ru, Mei Yang

    Published 2021-07-01
    “…For comparison, we also tested multiple traditional machine learning models including logistic regression, random forest, and eXtreme Gradient Boosting (XGBoost). Model performance was assessed by area under the curve (AUC) values, precision, and recall on an independent testing dataset. …”
    Get full text
    Article
  5.

    RCE-IFE: recursive cluster elimination with intra-cluster feature elimination by Cihan Kuzudisli, Burcu Bakir-Gungor, Bahjat Qaqish, Malik Yousef

    Published 2025-02-01
    “…Furthermore, RCE-IFE surpasses several state-of-the-art FS methods, such as Minimum Redundancy Maximum Relevance (MRMR), Fast Correlation-Based Filter (FCBF), Information Gain (IG), Conditional Mutual Information Maximization (CMIM), SelectKBest (SKB), and eXtreme Gradient Boosting (XGBoost), obtaining an average AUC of 0.76 on five gene expression datasets. …”
    Get full text
    Article
  6.

    Establishing a radiomics model using contrast-enhanced ultrasound for preoperative prediction of neoplastic gallbladder polyps exceeding 10 mm by Dong Jiang, Yi Qian, Yijun Gu, Ru Wang, Hua Yu, Zhenmeng Wang, Hui Dong, Dongyu Chen, Yan Chen, Haozheng Jiang, Yiran Li

    Published 2025-02-01
    “…This model, derived from machine learning frameworks including Support Vector Machine (SVM), Logistic Regression (LR), Multilayer Perceptron (MLP), k-Nearest Neighbors (KNN), and eXtreme Gradient Boosting (XGBoost) with fivefold cross-validation, showed AUCs of 0.95 (95% CI: 0.90–0.99) and 0.87 (95% CI: 0.72–1.0) in internal validation. …”
    Get full text
    Article
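Several of the abstracts above evaluate their models with fivefold cross-validation. The resampling itself is simple to state; here is a stdlib-only sketch of the fold assignment and scoring loop (the evaluator is a placeholder — in practice a library such as scikit-learn's `cross_val_score` with `scoring="roc_auc"` would wrap model fitting and AUC computation):

```python
# Hedged sketch of fivefold cross-validation (stdlib only).
# Model fitting is stubbed out via the `evaluate` callback.
import random

def five_fold_indices(n_samples, seed=0):
    """Shuffle sample indices and split them into 5 near-equal folds."""
    idx = list(range(n_samples))
    random.Random(seed).shuffle(idx)
    return [idx[k::5] for k in range(5)]

def cross_validate(n_samples, evaluate):
    """Hold out each fold once as validation; score the rest as training."""
    folds = five_fold_indices(n_samples)
    scores = []
    for k, val in enumerate(folds):
        train = [i for j, fold in enumerate(folds) if j != k for i in fold]
        scores.append(evaluate(train, val))
    return scores

# Usage: a dummy evaluator that just reports each validation-fold size.
scores = cross_validate(23, lambda train, val: len(val))
print(scores)  # five fold sizes summing to 23
```

Each sample appears in exactly one validation fold, so the five per-fold scores can be averaged into a single estimate such as the mean AUC reported in these studies.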