CD4+ T lymphocytes are an important component of the human immune system and the main target cells of human immunodeficiency virus (HIV) infection. Acquired immune deficiency syndrome (AIDS) is a chronic infectious disease that usually takes 2–8 years to progress from infection to onset. Research has found that only a small number of CD4+ T cells in HIV-infected individuals are in an HIV-infected state, and the dynamic changes of HIV-infected CD4+ T lymphocytes (HICTLs) at different stages of HIV infection remain unclear. Meanwhile, HICTLs are the source of the HIV reservoir. Therefore, analyzing the percentage and dynamic changes of infected cells in infected individuals is of great significance for AIDS research. However, due to the high variability of the HIV genome and the lack of highly sensitive detection methods, a standard method for distinguishing HIV-infected from uninfected CD4+ T cells has not yet been established. This article summarizes the current markers and detection methods for HICTLs.
Molecular diagnostics has become an integral part of modern clinical oncology. There are several dozen hereditary cancer syndromes; the detection of germline pathogenic variants in tumor-predisposing genes allows for the identification of at-risk subjects as well as guides the administration of cytotoxic and targeted drugs. The development of predictive tests for personalized drug-target matching is the best-known achievement of molecular oncology. For the time being, these assays are routinely utilized for the management of lung, breast, ovarian, colorectal, thyroid, biliary tract, endometrial, urothelial, and other malignancies. We are currently witnessing the emergence of practical applications of liquid biopsy. The detection of circulating tumor DNA (ctDNA) is a highly sensitive and specific procedure, which is currently used for the detection of secondary drug-resistant mutations, and holds great promise for the monitoring of malignant disease in oncological patients and early cancer detection in healthy individuals. While the utilization of molecular tests is currently limited to particular categories of cancer patients, their use is likely to become significantly more widespread in the near future. This trend will affect educational standards, requiring practicing physicians to become more familiar with molecular biology and, vice versa, demanding some fluency in clinical oncology from laboratory specialists.
Cardiovascular diseases (CVDs) are a leading cause of mortality globally, necessitating innovative approaches for improved diagnosis, prognosis, and treatment. Recent advances in artificial intelligence (AI) and machine learning (ML) have revolutionized cardiovascular medicine by leveraging vast multi-modal datasets—including genetic markers, imaging, and electronic health records (EHRs)—to provide patient-specific insights. This review highlights the transformative potential of AI applications, such as AI-enabled electrocardiograms (ECGs) and deep learning (DL)-based analysis, in enhancing diagnostic and prognostic accuracy and personalizing patient care. Notable progress includes predictive models for a variety of CVDs, including ischemic heart disease, atrial fibrillation, and heart failure, with performance metrics significantly surpassing traditional methods. Emerging technologies, such as explainable AI, large language models, and digital-twin technologies, further expand the horizons of precision cardiology. This paper also discusses challenges facing AI and ML applications in CVDs and promising future directions.
There are controversies surrounding indications for prostate-specific membrane antigen (PSMA) positron-emission tomography (PET) and the subsequent management of localized disease. Conventional imaging is not a necessary prerequisite to PSMA PET, which serves as an equally effective, if not more effective, frontline imaging tool. However, research conducted in different countries has shown conflicting results regarding its cost-effectiveness. Following accurate staging using PSMA PET, subsequent management is discussed by our expert team in this review, which incorporates the latest updates: (1) Brief global overview: the sustainability and cost-effectiveness of routine PET, as well as the treatment sequences of neoadjuvant vs. adjuvant androgen deprivation therapy (ADT) with radiotherapy, require further research. (2) Gonadotropin-releasing hormone antagonists demonstrate better response rates, lower recurrence rates, and fewer complications compared to agonists. (3) The unfavorable intermediate-risk group may undergo prostatectomy or radiotherapy combined with 4–6 months of ADT. Radiotherapy alone may be considered for patients with co-morbidities, Gleason score 7 (3 + 4), and positive biopsy cores < 50%, provided an escalated radiation dose is applied. (4) Three Prostate Advances in Comparative Evidence (PACE) studies demonstrated that stereotactic radiotherapy, greatly relying on PSMA PET, is as effective as surgery or conventional radiotherapy. (5) Findings from clinical trials indicate that pelvic nodal radiotherapy coverage provides a survival benefit. (6) A brachytherapy boost provides better outcomes compared to an external beam boost, eliminating the need for ADT in intermediate-risk cancers and reducing ADT duration to 6 months in high-risk cancers; even short-term use (4–6 months) of gonadotropin-releasing hormone agonists can lead to cardiac morbidity.
In summary, localized prostate cancer, as identified through the relatively new PSMA PET, can be managed in various ways. This review highlights significant updates on controversial issues relevant to both cancer patients and researchers.
Hepatitis B virus (HBV) infection is an important global public health issue with several routes of transmission. HBV infection among healthcare workers (HCWs) is of particular concern due to the high transmissibility of the virus compared with other bloodborne viruses, putting infected HCWs at risk of transmitting HBV to patients. These considerations necessitate the implementation of standard recommendations for preventing and managing HBV in HCWs. Over the past four decades, the incidence of HBV transmission from HCWs to patients has decreased significantly, likely owing to enhanced screening, routine vaccination of HCWs, and adherence to general precautions such as double-gloving during exposure-prone procedures. The protocols for monitoring infected HCWs differ based on the guidelines provided by health authorities.
To analyze cancer cases among long-lived males in the Lviv region (Ukraine) during 1991–2019.
Our retrospective study analyzed cancer cases among 312 males aged 90+ from the Lviv region (Ukraine) during 1991–2019 using data from the Cancer Registry of Lviv Oncologic Regional Treatment and Diagnostic Center.
Among 312 long-lived men, 25 types of cancer were diagnosed (median age 92.5 ± 2 years): skin cancer (30.77%), prostate cancer (19.87%), and stomach cancer (7.69%) were the most common. A higher proportion of cancer patients lived in urban areas (72.12%) compared to rural areas (27.88%), indicating better healthcare access in urban areas, which may contribute to better survival rates. A strong negative correlation was found between age and survival duration, with a Pearson’s correlation coefficient (r) of −0.93. As age increases, the number of long-lived individuals and the number of cancer diagnoses decrease, especially after age 93. Early-stage cancers (Stages 0 and I) showed significantly better survival outcomes, with 100% survival for 3–4 years in Stage 0, while Stage IV had a 78.13% mortality rate within one year, with no survivors beyond three years. For patients over 96, survival beyond one year is rare, indicating the need for a palliative care approach focused on symptom management rather than curative treatment.
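The strong negative correlation reported above (r = −0.93 between age and survival duration) is a standard Pearson coefficient; a minimal sketch of the computation is shown below. The numbers are synthetic placeholders for illustration, not the registry data.

```python
# Sketch of a Pearson correlation between age at diagnosis and survival
# duration. The values below are synthetic placeholders, chosen only to
# mimic the reported strong negative trend (r = -0.93 in the study).
import math

ages = [90.0, 91.0, 92.0, 93.0, 94.0, 95.0, 96.0, 97.0]   # years at diagnosis
survival = [4.0, 3.5, 3.1, 2.4, 1.8, 1.2, 0.8, 0.4]       # years survived

def pearson_r(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

r = pearson_r(ages, survival)
print(f"r = {r:.2f}")  # strongly negative for this synthetic series
```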
Future research should focus on improving early cancer detection for elderly patients, developing tailored treatments for aging individuals, and improving healthcare access for rural populations.
Technological breakthroughs over the past quarter century have had a huge impact on the broad area of medicine. Biological processes are now amenable to integrative examination at multiple levels of analysis, ranging from molecular to organismal levels. The Human Genome Project, in particular, paved the way for a new age in medicine that is commonly referred to as Precision (or Personalized) Medicine. These changes in the health sciences world are perceived both from the patient and clinician’s perspectives. The present article focuses on the insulin-like growth factor-1 (IGF1) axis, an important endocrine network with key roles in physiological and pathological conditions. We aim to provide the reader with an overview of the basic and clinical aspects of the IGF system, with a particular emphasis on ongoing efforts to target the IGF axis for therapeutic purposes. The potential impact of precision medicine on IGF1 clinical research is discussed.
Lung cancer is a leading cause of cancer-related deaths globally, where early and accurate diagnosis significantly improves survival rates. This study proposes an AI-based diagnostic framework integrating U-Net for lung nodule segmentation and a custom convolutional neural network (CNN) for binary classification of nodules as benign or malignant.
The model was developed using the Barnard Institute of Radiology (BIR) Lung CT dataset. U-Net was used for segmentation, and a custom CNN, compared with EfficientNet B0, VGG-16, and Inception v3, was implemented for classification. Due to limited subtype labels and diagnostically ambiguous “suspicious” cases, classification was restricted to a binary task. These uncertain cases were reserved for validation. Overfitting was addressed through stratified 5-fold cross-validation, dropout, early stopping, L2 regularization, and data augmentation.
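The stratified 5-fold cross-validation mentioned above preserves the benign/malignant class balance within each fold, which is what makes the fold-wise metrics comparable. A minimal, self-contained sketch of that fold construction follows; the tiny label set is a placeholder, not the study's dataset, and the study itself would have used a library implementation.

```python
# Hedged sketch of stratified k-fold splitting: indices of each class are
# shuffled and dealt round-robin across folds, so every fold preserves
# the overall class proportions. Placeholder labels, not the BIR dataset.
import random
from collections import defaultdict

def stratified_kfold(labels, k=5, seed=0):
    """Return k lists of indices, each preserving class proportions."""
    rng = random.Random(seed)
    by_class = defaultdict(list)
    for idx, y in enumerate(labels):
        by_class[y].append(idx)
    folds = [[] for _ in range(k)]
    for idxs in by_class.values():
        rng.shuffle(idxs)
        for i, idx in enumerate(idxs):
            folds[i % k].append(idx)  # deal indices round-robin per class
    return folds

labels = [0] * 50 + [1] * 50   # benign / malignant placeholder labels
folds = stratified_kfold(labels, k=5)

for fold in folds:
    n_pos = sum(labels[i] for i in fold)
    print(len(fold), n_pos)    # each fold: 20 samples, 10 per class
```

In practice each fold serves once as the validation split while the model trains on the rest, and the per-fold metrics are then reported as mean ± standard deviation, as in the results above.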
EfficientNet B0 achieved ~99.3% training and ~97% validation accuracy. Cross-validation yielded consistent metrics (accuracy: 0.983 ± 0.014; F1-score: 0.983 ± 0.006; AUC = 0.990), confirming robustness. External validation on the LIDC-IDRI dataset demonstrated generalizability across diverse populations.
The proposed AI model shows strong potential for clinical deployment in lung cancer diagnosis. Future work will address demographic bias, expand multi-center data inclusion, and explore regulatory pathways for real-world integration.
Cardiovascular diseases (CVDs) are the leading causes of morbidity and mortality worldwide. Yet, drug discovery for these conditions faces significant challenges due to the complexity and heterogeneity of their underlying pathology. Recently, artificial intelligence (AI) techniques—particularly explainable AI (XAI)—have emerged as powerful multi-omics data analyzing tools to unravel pathological mechanisms and novel therapeutic targets. However, the application of XAI in cardiovascular drug discovery remains in its infancy. This review discusses the potential for the integration of AI with multi-omics data to identify novel therapeutic targets and repurpose existing drugs for myocardial infarction (MI) and heart failure (HF). This review highlights the current gap in leveraging XAI for CVDs and discusses key challenges such as data heterogeneity, model interpretability, and translational validation. This review also describes emerging approaches, including combining AI with mechanistic models, that aim to enhance the biological relevance of AI predictions. By utilizing genomic, transcriptomic, epigenomic, proteomic, and metabolomic datasets, AI-driven methods can uncover new biomarkers and predict drug responses with greater precision. The application of AI in analyzing large-scale clinical and molecular data offers significant promise in accelerating drug discovery, refining therapeutic strategies, and improving outcomes for patients with CVDs. This review highlights recent advancements, challenges, and future directions for AI-guided drug discovery in the context of MI and HF.
Major gaps in hypertension control persist despite effective treatments. Among patients who are diagnosed and treated, adherence to therapy remains a barrier to achieving and maintaining control. Multiple modalities have demonstrated success in facilitating adherence. Fixed-dose combination therapy and home self-monitoring of blood pressure (BP) are two major approaches offering significant advantages to improve adherence, lower BP, and ultimately improve outcomes. While no single modality is universal for all patients, exploring the advantages and challenges of these modalities is a key strategy to identify the ideal approach to achieve better BP control.
To study the effect of body mass index (BMI) on interleukin (IL)-6 and IL-18 levels in children and adolescents with bronchial asthma (BA), taking into account the presence or absence of bronchial obstruction.
A single-center observational cross-sectional pilot study was conducted. Eighty-six patients with BA aged from 8 to 17 years were studied. Anthropometric and spirometric parameters and IL-6, IL-18 levels were assessed. The study participants were divided into 2 groups: 1—patients with normal body weight (BW), 2—overweight/obese.
The serum levels of IL-6 and IL-18 were statistically significantly higher in overweight/obese patients than in normal BW patients, being 1.18 [0.10; 3.27] pg/mL vs. 0.52 [0.10; 1.34] pg/mL, P = 0.036 and 251.0 [207.0; 346.0] pg/mL vs. 208.0 [134.0; 293.0] pg/mL, P = 0.012, respectively. Statistically significant direct correlations of IL-6 and IL-18 with z BMI were obtained in the total group: R = 0.35, P = 0.001; R = 0.37, P = 0.002, respectively. In the overweight/obese group, IL-6 and IL-18 levels were statistically significantly higher in patients with obstruction [z forced expiratory volume in one second (FEV1)/forced vital capacity (FVC) < –1.645 z] than in patients without obstruction (z FEV1/FVC > –1.645 z), respectively: 1.74 [1.10; 5.41] pg/mL vs. 0.59 [0.50; 1.48] pg/mL, P = 0.026 for IL-6 and 298.50 [207.00; 425.00] pg/mL vs. 234.50 [207.00; 300.00] pg/mL, P = 0.046 for IL-18.
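The results above summarize each cytokine as median [Q1; Q3], the usual format for non-normally distributed laboratory values. A minimal sketch of producing such a summary follows; the IL-6 values are synthetic placeholders, and the abstract does not name the statistical test used for the group comparisons.

```python
# Sketch of a median [Q1; Q3] summary, the format used for the IL-6 and
# IL-18 results. The data below are synthetic placeholders (pg/mL), not
# the study's measurements.
import statistics

il6 = [0.10, 0.52, 1.10, 1.18, 1.74, 3.27, 5.41]  # synthetic IL-6, pg/mL

med = statistics.median(il6)
# "inclusive" interpolates quartiles from the sample itself (Tukey-style).
q1, _, q3 = statistics.quantiles(il6, n=4, method="inclusive")
print(f"IL-6: {med:.2f} [{q1:.2f}; {q3:.2f}] pg/mL")
```

Different quartile conventions (inclusive vs. exclusive interpolation) give slightly different [Q1; Q3] bounds on small samples, which is worth noting when comparing such summaries across studies.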
In patients with BA and overweight/obesity, but not in patients with BA and normal BW, the presence of bronchial obstruction is associated with higher serum levels of IL-6 and IL-18. This may indicate the involvement of these ILs in the genesis of bronchial obstruction in patients with BA who are overweight/obese.
To assess the effect of a lipophilic angiotensin-converting enzyme inhibitor (ACEI) in combination with a calcium antagonist (CA) on the 24-hour blood pressure (BP) profile and systemic inflammation in patients with arterial hypertension (AH) and metabolic disorders (MD).
Fifty-eight patients with grade ≥ 2 AH were divided into 3 groups: patients with AH without metabolic syndrome (MS), patients with AH and MS, and patients with AH and diabetes mellitus (DM). Taking into account the BP profile characteristics, therapy with the ACEI perindopril and the CA amlodipine in a fixed combination (FC) was prescribed. The observation period for patients was 12 weeks.
A profile with an insufficient decrease in BP at night was more often detected in persons with MS having DM and nocturnal hypertension. In patients with AH and DM, the values of daily BP variability exceeded those in persons without MS (P < 0.05). Patients with MS had a higher concentration of high-sensitivity C-reactive protein (hsCRP) compared to patients without MS (3.5 mg/L; P < 0.01). Patients with DM and AH achieved target BP in 60% of cases during treatment: office BP decreased to 134.8 ± 8.7/83.2 ± 6.7 mmHg (17.97/11.09 kPa) (∆ = –31/–16 mmHg), and 90% of patients required maximum therapeutic doses of antihypertensive therapy (AHT). A decrease in the hsCRP concentration was detected (P < 0.05) in patients of groups 2 and 3, demonstrating that average/maximum therapeutic doses can influence the activity of systemic inflammation (∆ = –12.8% in group 2 and ∆ = –11.2% in group 3).
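The results above quote office BP in both mmHg and kPa; a one-line check confirms the two sets of figures are consistent under the standard conversion factor (1 mmHg = 0.133322 kPa).

```python
# Consistency check of the dual-unit BP values in the results:
# 134.8/83.2 mmHg quoted alongside 17.97/11.09 kPa.
# Standard conversion: 1 mmHg = 0.133322 kPa.
MMHG_TO_KPA = 0.133322

for mmhg, quoted_kpa in [(134.8, 17.97), (83.2, 11.09)]:
    kpa = round(mmhg * MMHG_TO_KPA, 2)
    print(f"{mmhg} mmHg = {kpa} kPa (quoted {quoted_kpa})")
```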
A combination of a lipophilic ACEI and a vasoselective CA promotes good BP control and decreases the activity of systemic inflammation and hypersympathicotonia in patients with MD.
To assess the effect of a lipophilic angiotensin-converting enzyme inhibitor (ACEI) in combination with a calcium antagonist (CA) on the 24-hour blood pressure (BP) profile, systemic inflammation in patients with arterial hypertension (AH) and metabolic disorders (MD).
Fifty-eight patients with the ≥ 2nd degree of AH was divided into 3 groups: patients with AH without metabolic syndrome (MS), patients with AH and MS, and patients with AH and diabetes mellitus (DM). Taking into account the BP profile characteristics, therapy with ACEI perindopril and CA amlodipine in a fixed combination (FC) was prescribed. The observation period for patients was 12 weeks.
A profile with an insufficient decrease in BP at night was more often detected in persons with MS having DM and nocturnal hypertension. In patients with AH and DM, the values of daily BP variability exceeded those in persons without MS (P < 0.05). Patients with MS had a higher concentration of high-sensitivity C-reactive protein (hsCRP) compared to patients without MS (3.5 mg/L; P < 0.01). Patients with DM and AH achieved target BP in 60% of cases during treatment: office BP decreased to 134.8 (17.97 kPa) ± 8.7/83.2 (11.09 kPa) ± 6.7 mmHg (∆ = –31/–16 mmHg), 90% of patients required maximum therapeutic doses of antihypertensive therapy (AHT). A decrease in the hsCRP concentration was detected (P < 0.05) in patients of groups 2 and 3, which showed practical possibility of average/maximum therapeutic doses influence on the activity of systemic inflammation (∆ = –12.8% in patients group 2 and ∆ = –11.2% in patients of group 3).
A combination of a lipophilic ACEI and a vasoselective CA promotes good BP control and a decrease in both the activity of systemic inflammation and hypersympathicotonia in patients with MD.
Nutrients containing a trimethylamine (TMA) moiety in their structure can be metabolized by the gut microbiota through enzymatic cleavage of the C-N bond, producing TMA. In the liver, TMA is subsequently oxidized to trimethylamine N-oxide (TMAO) by flavin monooxygenases (FMOs). TMAO exerts pro-atherogenic and pro-inflammatory effects that contribute mechanistically to several chronic inflammatory diseases including cardiovascular disease, chronic kidney disease, obesity, non-alcoholic fatty liver disease, and neurodegenerative diseases. Targeting this metaorganismal pathway may offer substantial health benefits in the prevention and treatment of chronic inflammatory conditions.
The interplay between the gut microbiota and the central nervous system is increasingly recognized as a critical factor in the pathogenesis and treatment responsiveness of brain tumors. The brain interacts with microbial communities, both systemically through the gut-brain axis and locally within the tumor microenvironment. The gut microbiota regulates systemic immunity and modulates key processes such as blood-brain barrier integrity, cytokine signaling, and neuroinflammation—all of which influence glioma development and resistance to therapies. Evidence from preclinical models indicates that modulation of the gut microbiota can enhance anti-tumor immunity and improve responses to immune checkpoint inhibitors (ICIs). In parallel, recent discoveries reveal the presence of bacterial DNA and viable microbes within glioma tissue initiating signaling cascades that modulate immune cell recruitment and polarization. These microbial-immune interactions shape the tumor’s immune landscape, favoring either anti-tumor immunity or immune evasion depending on the context. Additionally, microbial-derived metabolites, such as short-chain fatty acids, have been shown to influence gene expression through epigenetic mechanisms, including histone acetylation and regulation by non-coding RNAs. Such effects may contribute to tumor cell plasticity, metabolic reprogramming, and resistance to therapy. The reciprocal influence of glioma and its treatment on gut microbial ecology is also an important consideration. Therapeutic interventions such as antibiotics, corticosteroids, and chemotherapy can significantly disrupt the gut microbiota, potentially diminishing the efficacy of microbiota-driven immunomodulation. Therefore, understanding the bidirectional dynamics of the gut-brain-tumor axis is essential for the development of microbiome-informed therapies. Despite these promising insights, several challenges remain. 
In this review, we synthesize current knowledge on the role of the gut and intratumoral microbiota in glioma biology and treatment, focusing on immune modulation, therapeutic responsiveness, and potential for microbiota-informed interventions. We also discuss existing controversies, methodological limitations, and future research priorities in the context of advancing microbiome-based strategies in neuro-oncology.
Ebola virus (EBOV) infection usually leads to highly lethal EBOV disease (EVD) with associated viraemia. Viraemia is cleared in those who do survive; however, EBOV may remain hidden in the testes and other immune privileged niches (IPNs), where it can persist for years during asymptomatic convalescence. Viral shedding into seminal fluid may result in sexual transmission to naive contacts years after EBOV outbreaks have been declared over. This leads to flare-ups of cases, redefining our understanding of the shaping and origin of EBOV outbreaks. Such delayed sexual transmission eliminates the geographical boundaries which typically constrain EBOV outbreaks, thus posing a significant global health security threat. Despite hints of EBOV persistence dating back over half a century, it was not until the unprecedented scale of the 2013–2016 Western Africa EBOV epidemic that the true importance of this phenomenon was revealed to scientists, public health officials and policy makers alike. This review summarises the evidence for EBOV persistence, suggests the possible underlying molecular mechanisms and proposes future directions for research in the field. A meta-analysis is presented to further investigate the duration of EBOV shedding in seminal fluid. The ultimate aim is to develop therapeutics that clear sites of persistence. Such therapeutics could prevent the re-emergence of the persistent virus, eliminating the chance of new outbreaks whilst alleviating the severe stigmatisation facing the EBOV survivor population.
Parkinson’s disease (PD) is a progressive neurodegenerative disorder marked by dopaminergic neuron loss, leading to motor and non-motor symptoms. Cognitive impairment in PD (PD-CI), ranging from mild cognitive deficits to dementia, significantly reduces quality of life and increases caregiver burden. This study aims to identify predictors of PD-CI, potentially supporting early interventions.
This one-year prospective observational study included 80 idiopathic PD patients recruited from Al-Azhar University Hospitals. Inclusion followed the Movement Disorder Society (MDS) Clinical Diagnostic Criteria for PD, with evaluations conducted at baseline, 3, 6, and 12 months. Patients underwent clinical assessments [Unified Parkinson’s Disease Rating Scale, Part III (UPDRS-III), Hoehn and Yahr (H-Y), Beck Depression Inventory (BDI), and Hamilton Anxiety Rating Scale (HAM-A)], cognitive evaluations [Brief International Cognitive Assessment for MS (BICAMS) and Montreal Cognitive Assessment (MoCA)], magnetic resonance imaging (MRI), and laboratory testing. Exclusion criteria included conditions such as cerebellar abnormalities, early dementia diagnoses, and other Parkinsonism causes.
Cognitive impairment was observed in 41.25% of patients. Those with cognitive impairment were older, had a longer disease duration, and exhibited higher fasting blood glucose (FBS) levels and lower thyroid-stimulating hormone (TSH) levels compared to patients without cognitive impairment (p < 0.05). Brain atrophy was detected in 4 patients (5%), and was particularly pronounced in regions associated with cognitive function, such as the hippocampus and frontal lobe. Higher H-Y, UPDRS-III, and BDI scores correlated with cognitive decline, while lower MoCA and Symbol Digit Modalities Test (SDMT) scores predicted impairment (p < 0.05).
Cognitive impairment in PD is associated with advanced age, longer disease duration, metabolic factors, and structural brain changes. These findings suggest the potential for a predictive model to identify early cognitive decline in PD, enabling timely intervention and improved patient outcomes.
Obesity is a multifactorial chronic disease characterized by an excess of adipose tissue, placing a growing burden on individual health and public health systems worldwide. Here we aim to elucidate how obesity contributes to liver dysfunction and highlight the preventive, diagnostic, and management strategies that are most relevant to healthcare providers, researchers, and policy makers. To this end, a comprehensive literature search using major scientific databases was conducted. Various clinically heterogeneous pathophenotypes, such as android, gynoid, sarcopenic, and metabolically healthy and unhealthy obesity, exhibit variable associations with liver health in the context of chronic liver disease (CLD), including alcohol-related CLD, viral hepatitis B and C, and, particularly, metabolic dysfunction-associated steatotic liver disease (MASLD), which is the prototypic manifestation of obesity-associated CLD. Regardless of the etiology of CLD, obesity is a major risk factor for the progression to cirrhosis and hepatocellular carcinoma through a variety of lipotoxic, proinflammatory, pro-fibrotic, and carcinogenic pathomechanisms involving genetics and epigenetics, an altered adipokine profile, oxidative stress, endoplasmic reticulum stress, apoptosis, intestinal dysbiosis, and an altered gut-liver axis. Various strategies are available to address obesity-associated CLD, including lifestyle changes, endoscopic techniques, and metabolic/bariatric surgery. Integrative approaches bringing together clinicians, basic researchers, and public health experts will be crucial in developing a coherent, holistic framework to address, with a precision medicine approach, the rising tide of obesity-related CLD on a global scale.
Since its emergence in 2019, coronavirus disease 2019 (COVID-19) has evolved into a global pandemic, placing extraordinary strain on healthcare systems and societies worldwide. Accurate forecasting of COVID-19 case trends is essential for effective public health planning and intervention.
This study employs the Susceptible-Infectious-Recovered (SIR) model to predict the progression of COVID-19 in three countries: Belarus, Thailand, and Lithuania. Instead of relying on static or globally derived estimates, the model parameters—infection rate (β) and removal rate (γ)—were dynamically calculated for distinct time periods in each country, using country-specific data extracted from Worldometer. This segmented approach accounts for temporal changes in transmission dynamics and public health responses.
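The segmented approach described above can be sketched as a discrete-time SIR integration in which β and γ switch at phase boundaries. This is a minimal illustration; the parameter values and population size below are hypothetical, not the fitted country-specific estimates from the study.

```python
def simulate_sir(s0, i0, r0, phases, dt=1.0):
    """Integrate a discrete-time SIR model over consecutive phases.

    Each phase is a tuple (duration_days, beta, gamma); beta and gamma
    are held constant within a phase and change at phase boundaries,
    mirroring the phase-based parameter estimation.
    Returns the (S, I, R) trajectory including the initial state.
    """
    s, i, r = float(s0), float(i0), float(r0)
    n = s + i + r  # total population, conserved by the update rules
    history = [(s, i, r)]
    for duration, beta, gamma in phases:
        for _ in range(int(duration / dt)):
            new_infections = beta * s * i / n * dt  # S -> I flow
            new_removals = gamma * i * dt           # I -> R flow
            s -= new_infections
            i += new_infections - new_removals
            r += new_removals
            history.append((s, i, r))
    return history

# Illustrative (not fitted) parameters: a high-transmission phase
# followed by a lower-beta phase, e.g. after public health measures.
traj = simulate_sir(s0=9_990, i0=10, r0=0,
                    phases=[(30, 0.30, 0.10), (60, 0.15, 0.10)])
```

In practice, β and γ for each phase would be estimated from the reported case data for that period before running the forward simulation.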
The country-specific, phase-based parameter estimation improved the model’s alignment with real-world COVID-19 trends observed in Belarus, Thailand, and Lithuania. The refined forecasts closely matched the actual progression patterns in each country, demonstrating the value of adapting parameter estimates to local epidemiological contexts.
The proposed approach enhances the predictive accuracy of the SIR model, providing a practical and adaptable framework for forecasting COVID-19 trends in countries with varying pandemic responses. These findings highlight the importance of dynamic parameter adjustment when applying mathematical models to evolving public health crises, ensuring more reliable projections to guide decision-making.
Lactoferrin (LF), an iron-binding protein, is found in mammalian milk and is also secreted by various cell types. LF shows a wide range of biological activities: many preclinical and clinical studies indicate that LF and its derived peptides serve multiple functions in host defence, including not only antibacterial but also antiviral, antifungal, and antiparasitic effects. These findings suggest that these compounds might affect the composition of the intestinal microbiota. LF is generally recognized as safe (GRAS). In experimental studies, this protein has been shown to exert beneficial effects on intestinal inflammation. This review addresses the beneficial effects of oral LF supplements on the intestinal ecosystem during inflammation, highlights the mechanisms by which LF may contribute to reducing inflammatory flares, and presents perspectives for future research.
This study aimed to identify the predictors of treatment outcomes among children under five years of age hospitalized with severe acute malnutrition.
A hospital-based prospective cohort study was conducted among children under five years diagnosed with severe acute malnutrition. A total of 143 children were recruited using a consecutive sampling method. Univariable and multivariable logistic regression analyses were utilized to identify predictors of treatment outcomes. Survival analyses, including life-table analysis, Kaplan-Meier survival curves, the log-rank test, and the Cox proportional hazards model, were employed to estimate survival probabilities, recovery rates over time, and predictors of time to recovery.
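The nonparametric survival step named above (Kaplan-Meier curves) can be illustrated with a minimal product-limit estimator. The data in the usage line are hypothetical, chosen only to show the mechanics of events versus censoring (e.g. transfer out or default); they are not the study's records.

```python
def kaplan_meier(times, events):
    """Product-limit (Kaplan-Meier) survival estimates.

    times:  follow-up time for each subject.
    events: 1 if the event (here, recovery) was observed at that time,
            0 if the subject was censored (e.g. transferred out).
    Returns a list of (event_time, S(t)) pairs at each event time.
    """
    data = sorted(zip(times, events))  # order by follow-up time
    n_at_risk = len(data)
    s = 1.0
    curve = []
    idx = 0
    while idx < len(data):
        t = data[idx][0]
        d = sum(1 for tt, e in data if tt == t and e == 1)  # events at t
        c = sum(1 for tt, e in data if tt == t and e == 0)  # censored at t
        if d > 0:
            s *= (n_at_risk - d) / n_at_risk  # product-limit update
            curve.append((t, s))
        n_at_risk -= d + c  # those who had the event or were censored leave
        idx += d + c
    return curve

# Hypothetical example: events at days 5, 8, 12; one censoring at day 8.
curve = kaplan_meier([5, 8, 8, 12], [1, 1, 0, 1])
```

The log-rank test and Cox model mentioned above build on the same risk-set bookkeeping, comparing or modeling these event rates across groups.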
Of the 143 enrolled children, 55.2% were male, and 58% were between 6 and 24 months of age. During a total of 1,802 child-days of follow-up, the treatment outcomes were as follows: 60.8% of children recovered, 32.9% were transferred out for other medical reasons, 4.2% defaulted, and 2.1% died. Predictors of poor treatment outcomes included hypothermia [adjusted odds ratio (AOR) = 0.17; 95% confidence interval (CI): 0.03–0.94; p = 0.042], diarrhea (AOR = 0.28; CI: 0.12–0.66; p = 0.004), and edema (AOR = 0.21; CI: 0.07–0.64; p = 0.006), while feeding with ready-to-use therapeutic food (RUTF) showed a non-significant trend toward recovery (AOR = 2.01; 95% CI: 0.92–4.96; p = 0.098). The median recovery time was 14 days (95% CI: 12.9–15.1).
The study highlighted suboptimal recovery rates and average daily weight gain among children treated for severe acute malnutrition. Diarrhea, hypothermia, and edema on admission were associated with lower nutritional recovery rates. These findings underscore the need for targeted interventions to address these factors and improve treatment outcomes in children with severe acute malnutrition.