{ "pmid": 35872090, "language": "eng" }
Quality of Care Among Patients with Diabetes and Cerebrovascular Disease: Insights from The Diabetes Collaborative Registry. Although secondary cardiovascular prevention is a focus among patients with type 2 diabetes (T2D) and coronary artery disease (CAD) or peripheral artery disease (PAD), the application of guideline-recommended therapy in T2D patients with isolated cerebrovascular disease (CeVD) remains unknown. In a US outpatient registry, T2D patients with established cardiovascular disease from 2014-2018 were categorized as: isolated CeVD, CeVD plus CAD or PAD, or isolated CAD/PAD. In each group, we determined the proportion with optimal secondary prevention (hemoglobin [Hb]A1c <8%, blood pressure <130/80 mm Hg, use of antithrombotics, use of statins, non-smoking/cessation counseling, and use of glucose-lowering medications with cardioprotective effects: sodium-glucose cotransporter [SGLT]-2 inhibitors, glucagon-like peptide [GLP]-1 receptor agonists, and thiazolidinediones [TZDs]). Hierarchical Poisson regression was used to estimate the relative rate of achieving each target across groups, adjusted for age and chronic kidney disease (where relevant). Our study included 727,467 T2D outpatients with cardiovascular disease (isolated CeVD [n = 99,777], CeVD plus CAD/PAD [n = 158,361], isolated CAD/PAD [n = 469,329]). Compared with isolated CAD/PAD patients, isolated CeVD patients more often had an HbA1c <8% (adjusted relative risk [aRR] 1.10; 95% confidence interval [CI], 1.08-1.11) but less often had a blood pressure of ≤130/80 mm Hg (aRR 0.93; 95% CI, 0.92-0.94) or were prescribed antithrombotics (aRR 0.84; 95% CI, 0.83-0.85), statins (aRR 0.86; 95% CI, 0.85-0.87), GLP-1 agonists (aRR 0.75; 95% CI, 0.73-0.78), SGLT2 inhibitors (aRR 0.73; 95% CI, 0.71-0.76), or TZDs (aRR 0.76; 95% CI, 0.73-0.78). Among T2D patients, those with isolated CeVD had the lowest rates of secondary cardiovascular prevention goal attainment. More focus is needed on secondary prevention in patients with CeVD.
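The registry analysis above estimates adjusted relative rates of goal attainment with hierarchical Poisson regression. The sketch below illustrates the core of that approach — a modified Poisson model (Poisson GLM on a binary outcome with robust standard errors), whose exponentiated coefficients are adjusted relative risks — on simulated data; the study's hierarchical random effects for practice site are omitted, and all variable names are illustrative, not the registry's.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n = 1000
# Simulated patient-level data (illustrative only): binary goal attainment,
# disease category, and age as the adjustment covariate.
df = pd.DataFrame({
    "met_target": rng.binomial(1, 0.6, n),
    "group": rng.choice(["CeVD", "CeVD_CAD_PAD", "CAD_PAD"], n),
    "age": rng.normal(68, 10, n),
})

# Modified Poisson regression: a Poisson GLM on a binary outcome with robust
# (sandwich) standard errors yields adjusted relative risks directly.
fit = smf.glm("met_target ~ C(group, Treatment('CAD_PAD')) + age",
              data=df, family=sm.families.Poisson()).fit(cov_type="HC1")
print(np.exp(fit.params))      # exponentiated coefficients = adjusted RRs
print(np.exp(fit.conf_int()))  # 95% CIs on the RR scale
```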
{ "pmid": 35872095, "language": "eng" }
Xpert MTB/RIF Ultra outperformed the Xpert assay in tuberculosis lymphadenitis diagnosis: a prospective head-to-head cohort study. Xpert Mycobacterium tuberculosis/rifampin (MTB/RIF) Ultra (Xpert-Ultra) has shown better sensitivity than Xpert MTB/RIF (Xpert) in extrapulmonary tuberculosis (TB), whereas a head-to-head comparison of these methods in TB lymphadenitis has rarely been performed. Patients with undiagnosed lymphadenopathy were recruited prospectively and consecutively, and fine-needle aspiration (FNA) biopsy or lymph node tissue was collected. The specimens were subjected to smear, culture, Xpert, and Xpert-Ultra assays. Culture and/or smear for acid-fast bacilli (AFB), or AFB observed on histopathology, served as the reference standard. A total of 106 participants were recruited, including 41 with confirmed TB, 33 with probable TB, and 32 with non-TB lymphadenopathies. The head-to-head comparison for MTB detection showed that Xpert-Ultra produced the highest sensitivity compared with smear, culture, and Xpert (75.7% vs 5.4%, 13.5%, and 48.7%, respectively). When Xpert-Ultra outcomes were integrated into the diagnosis, the percentage of confirmed TB lymphadenitis cases increased from 55.4% (41/74) to 85.1% (63/74). The sensitivities of Xpert-Ultra and Xpert on tissue were 73.6% (95% CI: 59.4-84.3) and 39.6% (95% CI: 26.8-54.0), respectively. The sensitivity of Xpert-Ultra on FNA samples (81.0%, 95% CI: 57.4-93.7) was higher than that of Xpert (71.4%, 95% CI: 47.7-87.8). Xpert-Ultra detected significantly more TB lymphadenitis cases than Xpert or culture. This superiority was particularly distinct for lymph node tissue compared with FNA samples.
{ "pmid": 35872100, "language": "eng" }
Cardiac autonomic dysfunction in school age children with overweight and obesity. We investigated cardiac autonomic function in overweight and obese school-age children. This quantitative cross-sectional study was conducted with children (n = 110) of both genders, divided into normal weight (NW; n = 54), overweight (OW; n = 24), and obese (OB; n = 32) groups. Systolic (SBP) and diastolic (DBP) blood pressure and electrocardiograms were recorded and analyzed for heart rate and heart rate variability (HRV) in the time (SDRR, RMSSD, PRR50, SD1 and SD2) and frequency domains (HF, LF and LF/HF). The OB group presented higher SBP (p ≤ 0.01) and DBP (p ≤ 0.01). For HRV, the OB group had a lower PRR50 (p ≤ 0.01) and HF (p ≤ 0.01), associated with higher LF (p ≤ 0.01). Moderate negative correlations were found between HF and BMI (r = -0.37; p ≤ 0.01) and WC (r = -0.38; p ≤ 0.01). Moderate positive correlations were found between LF, LF/HF and BMI (LF: r = 0.32; p ≤ 0.01; LF/HF: r = 0.31; p ≤ 0.01) and WC (LF: r = 0.34; p ≤ 0.01; LF/HF: r = 0.34; p ≤ 0.01). Multiple linear regression showed a positive association between body fat and SDRR (β: 0.48; CI: 0.2-4.2; p = 0.02). No differences were observed in cardiac electrical activity. Children with obesity, but not those with overweight, presented higher blood pressure and cardiac autonomic dysfunction, with sympathetic predominance in heart rate control. This dysfunction was positively correlated with BMI and may be considered an important marker of cardiovascular risk in children.
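The HRV indices named above (SDRR, RMSSD, PRR50, LF, HF, LF/HF) have standard definitions, though the abstract does not state the authors' exact processing pipeline. A minimal sketch of the textbook computations from an RR-interval series, offered as an illustration only:

```python
import numpy as np
from scipy.interpolate import interp1d
from scipy.signal import welch

def hrv_time_domain(rr_ms):
    """SDRR, RMSSD, and pRR50 from RR intervals in milliseconds."""
    rr = np.asarray(rr_ms, dtype=float)
    diffs = np.diff(rr)
    sdrr = rr.std(ddof=1)                          # overall variability
    rmssd = np.sqrt(np.mean(diffs ** 2))           # beat-to-beat (vagal) variability
    prr50 = 100.0 * np.mean(np.abs(diffs) > 50.0)  # % successive differences > 50 ms
    return sdrr, rmssd, prr50

def hrv_frequency_domain(rr_ms, fs=4.0):
    """LF (0.04-0.15 Hz) and HF (0.15-0.40 Hz) power via Welch's method."""
    rr = np.asarray(rr_ms, dtype=float)
    t = np.cumsum(rr) / 1000.0                     # beat times in seconds
    grid = np.arange(t[0], t[-1], 1.0 / fs)
    rr_even = interp1d(t, rr, kind="cubic")(grid)  # evenly resampled tachogram
    f, pxx = welch(rr_even - rr_even.mean(), fs=fs, nperseg=min(256, len(grid)))
    lf_band = (f >= 0.04) & (f < 0.15)
    hf_band = (f >= 0.15) & (f < 0.40)
    lf = np.trapz(pxx[lf_band], f[lf_band])
    hf = np.trapz(pxx[hf_band], f[hf_band])
    return lf, hf, lf / hf
```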
{ "pmid": 35872101, "language": "eng" }
Smoking and alcohol consumption influence the risk of cardiovascular diseases in Korean adults with elevated blood pressure. Cardiovascular disease (CVD) and hypertension are leading causes of death worldwide. We aimed to investigate the independent and combined effects of smoking and alcohol consumption on CVD risk among Koreans with elevated blood pressure (BP). Adults aged 20-65 years with elevated BP and without pre-existing CVDs were selected from the National Health Insurance Service-National Sample Cohort version 2.0. We followed up 59,391 men and 35,253 women between 2009 and 2015. The association of CVD incidence with smoking pack-years and alcohol consumption was investigated using the multivariate Cox proportional hazard model. Among women, smokers (10.1-20.0 pack-years) and alcohol drinkers (≥30.0 g/day) had higher CVD risks (hazard ratio [HR] = 1.15, 95% confidence interval [CI] 1.06-1.25 and HR = 1.06, 95% CI 1.00-1.12, respectively) compared with each referent group. However, men who smoked exhibited an increased CVD risk only with pack-years >20.0 (HR = 1.09, 95% CI 1.03-1.14 and HR = 1.18, 95% CI 1.11-1.26 for smokers with 20.1-30.0 and ≥30.1 pack-years, respectively) compared with nonsmokers. In the combined smoking and alcohol consumption groups, only nonsmoking men consuming alcohol 1.0-29.9 g/day had a lower CVD risk than nonsmoking, nondrinking men (HR = 0.90, 95% CI 0.83-0.97). Women smoking 1.0-10.0 pack-years and consuming alcohol ≥30.0 g/day had a higher CVD risk (HR = 1.25, 95% CI 1.11-1.41) than nonsmoking, nondrinking women. Smoking and alcohol consumption, independently and jointly, were associated with CVD risk in men and women. Among Korean adults with elevated BP, women had a greater CVD risk than men.
{ "pmid": 35872102, "language": "eng" }
Abnormalities of T cells in systemic lupus erythematosus: new insights in pathogenesis and therapeutic strategies. Systemic lupus erythematosus (SLE) is an autoimmune disease characterized by loss of immune tolerance and sustained production of autoantibodies. Multiple and profound T cell abnormalities in SLE are intertwined with disease expression. Both numerical and functional disturbances have been reported in the main CD4+ T helper cell subsets, including Th1, Th2, Th17, regulatory, and follicular helper cells. SLE CD4+ T cells are known to provide help to B cells, produce excessive IL-17 but insufficient IL-2, and infiltrate tissues. In the absence of sufficient amounts of IL-2, regulatory T cells do not function properly to constrain inflammation. A complicated series of early signaling defects and aberrant activation of kinases and phosphatases result in complex cell phenotypes by altering the metabolic profile and the epigenetic landscape. All main metabolic pathways, including glycolysis, glutaminolysis, and oxidative phosphorylation, are altered in T cells from lupus-prone mice and patients with SLE. SLE CD8+ cytotoxic T cells display reduced cytolytic activity, which accounts for higher rates of infection and the sustenance of autoimmunity. Further, CD8+ T cells in the context of rheumatic diseases lose the expression of CD8, acquire an IL-17+CD4-CD8- double negative T (DNT) cell phenotype, and infiltrate tissues. Herein we present an update on these T cell abnormalities along with underlying mechanisms and discuss how these advances can be exploited therapeutically. Novel strategies to correct these aberrations in T cells show promise for SLE treatment.
{ "pmid": 35872103, "language": "eng" }
Autoantibodies in systemic lupus erythematosus: From immunopathology to therapeutic target. Systemic lupus erythematosus (SLE) is an autoimmune disease characterized by multiple organ inflammatory damage and a wide spectrum of autoantibodies. The autoantibodies, especially anti-dsDNA and anti-Sm autoantibodies, are highly specific to SLE and participate in immune complex formation and inflammatory damage in multiple end-organs such as the kidney, skin, and central nervous system (CNS). However, the underlying mechanisms of autoantibody-induced tissue damage and systemic inflammation are still not fully understood. Single cell analysis of autoreactive B cells and monoclonal antibody screening from patients with active SLE have improved our understanding of the origin of autoreactive B cells and the antigen targets of the pathogenic autoantibodies. B cell depletion therapies have been widely studied in the clinic, but the development of more specific therapies against the pathogenic B cell subset and autoantibodies, with improved efficacy and safety, remains a major challenge. More comprehensive autoantibody profiling, combined with functional characterization of autoantibodies in disease development, will shed new light on the etiology and pathogenesis of SLE and guide specific treatment for individual SLE patients.
{ "pmid": 35872104, "language": "eng" }
Pregnancy-related complications in systemic lupus erythematosus. Systemic lupus erythematosus (SLE) is a systemic autoimmune inflammatory disease that predominantly affects women of childbearing age and results in various adverse pregnancy outcomes (APOs). Pregnancy was formerly discouraged in patients with SLE because of unstable disease activity during gestation, increased thrombosis risk, severe organ damage, and the inevitable side effects of immunosuppressive agents. Currently, most patients with SLE have successful pregnancies due to preconception counselling, strict monitoring, and improved therapy, with minimised complications for both mother and foetus. Hydroxychloroquine (HCQ) is extensively used and is beneficial for improving pregnancy outcomes. However, pregnant women with SLE have a high risk of APOs, such as disease flare, preterm birth, intrauterine growth restriction (IUGR), preeclampsia, and pregnancy loss. A better understanding of the changes in maternal immunity and serum biomarkers, as well as their relationships with the progression of SLE-related APOs, would facilitate the investigation of molecular mechanisms for triggering and ameliorating APOs. Furthermore, it would enable us to explore and develop novel and effective therapeutic strategies to prevent disease activation. Therefore, this review briefly introduces the interaction between pregnancy outcomes and SLE and elucidates the pathophysiological and immunological changes during SLE pregnancy. It then systematically expounds on effective predictors of APOs and the molecular mechanisms underlying SLE-related APOs to provide a solid foundation for the advanced management of lupus pregnancy.
{ "pmid": 35872105, "language": "eng" }
Voice Problems Among School Teachers Employing the Tele-teaching Modality. To assess the prevalence of voice problems among teachers in Riyadh during tele-teaching, examine the relationship between Voice Handicap Index 10 (VHI10) scores and a variety of risk factors believed to be related to voice problems, and assess teachers' awareness of voice hygiene and therapy. An observational cross-sectional study was conducted using a multistage random sampling method among Riyadh school teachers who had taught by tele-teaching for a minimum of one year. A self-assessment questionnaire, which included demographic information about the teachers, factors related to their teaching backgrounds, tele-teaching settings, effects of tele-teaching on the voice, medical and social histories, reports of voice and reflux symptoms, the VHI10, and general knowledge about voice hygiene, was distributed to school teachers via an SMS link through the Ministry of Education's IT department. A total of 495 teachers were included in the study after exclusions. The prevalence of significant voice problems during tele-teaching (VHI10 >11) was 21.6%. Multiple risk factors significantly increased the risk of voice problems during tele-teaching: being female, teacher age, the presence of background noise from both teachers and students, loud voices, using an open camera during teaching, stress and anxiety, allergies, respiratory disease, reflux, hearing problems, and a family history of voice problems. Only 4.6% of respondents were familiar with voice hygiene and voice therapy, but 65% believed that it is important for teachers to be knowledgeable about them. Given the lower prevalence of voice disorders in tele-teaching compared with traditional teaching methods, tele-teaching may be a viable option for teachers who have voice problems. Several factors still influence voice problems among tele-teachers. To attenuate potential risks, it is crucial that teachers are aware of the concepts of voice hygiene and voice therapy.
{ "pmid": 35872108, "language": "eng" }
Site-specific characteristics of bone and progenitor cells in control and ovariectomized rats. One-third of postmenopausal women experience at least one osteoporotic bone fracture in their lifetime, occurring spontaneously or from low-impact events. However, osteoporosis-associated jaw bone fractures are extremely rare. It has also been observed that jaw bone marrow stem cells (BMSCs) have a higher capacity to form mineralized tissues than limb BMSCs. At present, the underlying causes and mechanisms of variation between jaw bone and limb bone during postmenopause are largely unknown. Thus, the objective of the current study was to examine the site-specific effects of estrogen deficiency using comprehensive analysis of bone quantity and quality and their association with the characterization of the cellular components of bone. Nine rats (female, 6 months old) were obtained for each of the bilateral sham and ovariectomy (OVX) surgery groups and maintained for 2 months after surgery. A hemi-mandible and a femur from each rat were characterized for volume, mineral density, cortical and trabecular morphology, and static and dynamic mechanical properties. Another set of 5 rats (female, 9 months old) was obtained for BMSC assays. Following cytometry to identify BMSCs, bioassays for proliferation; osteogenic, adipogenic, and chondrogenic differentiation; and cell mitochondrial stress tests were performed. In addition, mRNA expression of BMSCs was analyzed. OVX decreased the bone quantity and quality (mineral content, morphology, and energy dissipation) of the femur, while those of the mandible were not affected. Cellular assays demonstrated that mandible BMSCs showed greater differentiation than femur BMSCs. Gene ontology pathway analysis indicated that the mandibular BMSCs showed the most significant differential expression of genes in the regulatory pathways of osteoblast differentiation, SMAD signaling, cartilage development, and glucose transmembrane transporter activity. These findings suggest that active mandibular BMSCs maintain bone formation and mineralization by balancing the rapid bone resorption caused by estrogen deficiency. These characteristics likely help reduce the risk of osteoporotic fracture in the postmenopausal jawbone.
{ "pmid": 35872110, "language": "eng" }
Immune regulatory effects of microRNA9-3. MicroRNAs are known to regulate cell proliferation, differentiation, and apoptosis. However, the immunological mechanism and role of microRNA9-3 (miR9-3) are unknown. This study used CRISPR/cas9 technology to knock out miR9-3 to modulate its expression level. FACS results showed that, compared with the wild-type, the absolute number of total B cells in miR9-3-deficient mice declined to different extents in the spleen (Sp), bone marrow (BM), and lymph node (LN). The absolute numbers of Fo, T1, and T2 cells also decreased in both Sp and LN. The absolute numbers of total T cells in Sp and LN declined sharply; CD4+ and CD8+ T cells showed a dramatic decrease in the Sp, LN, and thymus (Th) of the miR9-3-deficient group. In BM, the numbers of immature B cells, pro-pre-B cells, pro-B cells, and pre-B cells were reduced to different extents, while mature B cells were comparable to the wild-type. These data illustrate that miR9-3 deficiency impaired the development of B cells in BM. The development of T cells was also severely impaired. In the Th, the numbers of DN and DP cells were remarkably reduced in the miR9-3 mutant mice, as were the numbers of DN-1, DN-3, and DN-4 cells. The absolute numbers of cells in the hematopoietic stem cell (HSC) system, including LT-HSC (long-term HSC), ST-HSC (short-term HSC), MPP (multipotent progenitor), GMP (granulocyte-macrophage progenitor), CMP (common myeloid progenitor), MEP (megakaryocyte-erythroid progenitor), and CLP (common lymphoid progenitor), were all decreased in miR9-3-deficient mice. These results show that miR9-3 deficiency damages the entire hematopoietic system. Moreover, the absolute number of myeloid cells in both Sp and BM decreased in mutant mice. The number of NK cells showed a sharp reduction in Sp, whereas the change was not significant in BM. The above results suggest that miR9-3 participates in the immune regulation of B cells, T cells, and the HSC system, highlighting its regulatory roles.
{ "pmid": 35872113, "language": "eng" }
Reappraisal of the clinical role of metronidazole therapy for Clostridioides difficile infection in Taiwan: A multicenter prospective study. Although metronidazole is not recommended to treat Clostridioides difficile infection (CDI) in Western countries, it was still recommended for the treatment of non-severe CDI among Taiwanese adults in 2020. This controversy over the clinical role of metronidazole therapy for CDI was examined in a prospective clinical study. The study was conducted from January 2015 to December 2016 in three hospitals in Taiwan. Metronidazole treatment failure (MTF) was defined as the persistence of diarrhea after six days of treatment, medication modification (shifting to oral vancomycin), or death after five days of therapy. Overall, 325 patients receiving metronidazole for CDI were included. The overall MTF rate was 48.6% (158 patients). Leukocyte counts of >15,000 cells/mL in peripheral blood (odds ratio [OR] 1.81; P = 0.04) and congestive heart failure (OR 3.26; P = 0.02) were independently associated with MTF. The MTF rates for patients with leukocyte counts of ≤15,000 cells/mL and no congestive heart failure, leukocyte counts of >15,000 cells/mL and no congestive heart failure, leukocyte counts of ≤15,000 cells/mL and congestive heart failure, and leukocyte counts of >15,000 cells/mL and congestive heart failure were 44.2%, 51.8%, 73.3%, and 66.7%, respectively. Of note, patients who experienced MTF had a higher recurrence rate of CDI than those with metronidazole treatment success (13.9% vs. 6.0%, P = 0.02). For Taiwanese adults with CDI, the failure rate of metronidazole therapy approached 50%, which suggests the need to reappraise the therapeutic role of metronidazole, especially for patients with leukocytosis or underlying congestive heart failure.
{ "pmid": 35872116, "language": "eng" }
Arthroscopic accessibility of the first metatarsophalangeal joint for osteochondral defects of the metatarsal head by two-portal technique - comparing joint distraction and plantarflexion. Several techniques and approaches for first metatarsophalangeal (MTP1) joint arthroscopy have been reported, in which joint accessibility plays a key role. This study aimed to evaluate differences in arthroscopic accessibility of the first metatarsal head (MTH1) comparing non-invasive distraction and maximum plantarflexion in a two-portal approach. Forty fresh-frozen lower leg specimens were included and divided into a distraction group (D-group) and a plantarflexion group (PF-group). A two-portal technique (1.9-mm, 30° arthroscope) was used for arthroscopy, and the maximum reach at the MTH1 was marked. Following arthroscopy, specimens were dissected and examined for iatrogenic injuries. The reached area of the chondral surface was pinned and accessibility calculated. Accessibility of the MTH1 was 58.03 ± 13.64% (D-group) and 55.93 ± 10.30% (PF-group; p = 0.51). The dorsomedial hallucal nerve was injured in one specimen (2.5%). Maximum plantarflexion showed no difference in arthroscopic MTP1 joint accessibility compared with non-invasive distraction in a two-portal approach. During dorsomedial portal placement, the dorsomedial hallucal nerve is at risk of iatrogenic injury.
{ "pmid": 35872114, "language": "eng" }
The Impact of Optical Impressions on Dog Feeding Practice. According to the literature, as many as 60% of domestic dogs are overweight; obesity is implicated in many serious diseases, and hence a reduction of body weight results in a reduced risk of disease. Approximately 32% of reduction diets are unsuccessful in helping dogs reach their ideal body weight. The likely reasons for this high drop-out rate include, among others, the fear of increased hunger-induced distress or a loss of affection on the part of the pet towards the owner. To alleviate these apprehensions, the use of optical effects that increase the perceived food intake could be useful. To investigate this, a mixed-methods study design was applied and 100 test persons (including dog owners and non-owners) were instructed to fill 11 separate dog bowls with the same amount of dog food. The bowls varied in 5 variables (total height, upper diameter, angulation of sidewall, volume, and color). The influence of the shape and color of the dog bowls on the filling quantity was evaluated. Overall, the body of the inner food bowl, especially its diameter and shape, had a significant impact on filling behavior: the wider the diameter, the more the dog bowl was filled. Moreover, the flatter the sidewall was angulated, the larger the fill-up quantity. Notably, volume on its own did not have a significant impact. A difference of up to 37.6% in fill quantity resulted depending on the type of dog bowl used. Furthermore, the use of inner cones confused the test persons, whereas different colors and the total height of the bowl showed no impact. Dog bowls with a small upper diameter and a steep sidewall, regardless of volume and color, were filled less by the test persons. This tendency could be useful for adapting the feeding of overweight dogs or those with an increased risk of obesity.
{ "pmid": 35872117, "language": "eng" }
Overlapping repair and epitenon healing are more stable biomechanically than side-to-side repair and endotenon healing in Achilles tendon lengthening with Z-plasty. The current study aimed to compare the biomechanical stability and healing process of side-to-side repair with overlapping repair after Achilles tendon lengthening with Z-plasty. In our study, 22 Sprague Dawley male rats were used. Side-to-side repairs were classified as group 1 and overlapping repairs as group 2. The left and right legs of seven rats were used to compare early group 1 and early group 2 biomechanical test results at day 0. Seven rats were used to compare late group 1 and late group 2 biomechanical test results at day 28. Both the right and left tendons of four rats were tested biomechanically as the untreated control group. The remaining four rats were used for histopathological evaluation of tendon repair at 28 days from the index procedure. The ultimate load to failure was compared between groups. At time 0, there were no measurable differences between group 1 (3.8 ± 1.4 N) and group 2 (3.7 ± 1.1 N), and both could endure less than one-tenth of the untreated control value (49 ± 12 N). At 28 days, ultimate load to failure improved significantly in group 1 (16.2 ± 3.5 N) and even more so in group 2 (36 ± 8.1 N). While there was a significant difference between group 1 and group 2, neither reached the untreated control value (49 ± 12 N). Histopathological evaluation in the post-healing period showed that fibrosis, neovascularization, and inflammation increased in both groups. The overlapping suture technique and epitenon healing provide more stability than the side-to-side suture technique and endotenon healing. Human trials have not yet been performed; our study suggests the technique should be considered, and further investigation is needed before clinical application.
{ "pmid": 35872119, "language": "eng" }
The influence of partial weight bearing on plantar peak forces using three different types of postoperative shoes. Therapeutic shoes and partial weight bearing regimes are used after foot surgery to protect the operated region from excessive load. It remains unclear to what extent partial weight bearing reduces plantar peak forces. Therefore, we investigated the correlation of weight bearing and plantar peak forces in commonly used therapeutic shoes. Three different weight bearing regimes (20 kg, 40 kg, full weight) were investigated in 20 healthy volunteers. Sensor insoles were used to measure peak forces of the forefoot, midfoot, heel, and the complete foot using four kinds of shoes (bandage shoe, forefoot relief shoe, short walker, and standard sneaker). Peak forces were compared between shoes using one-way ANOVA. The influence of partial weight bearing on peak forces was examined by linear regression analysis. All therapeutic shoes significantly reduced peak forces of the fore- and midfoot compared with the reference shoe; the largest reduction was achieved by the forefoot relief shoe (-70% at the forefoot). Weight load and the resulting peak force showed a positive linear correlation for all regions and shoe types. Partial weight bearing significantly reduced the forefoot's force ratio compared with full weight bearing for all shoes except the forefoot relief shoe. Partial weight bearing is a powerful tool to reduce plantar peak forces of the forefoot, in addition to the proven offloading effect of therapeutic shoes.
{ "pmid": 35872118, "language": "eng" }
Complications following total ankle arthroplasty: A systematic literature review and meta-analysis. Total ankle arthroplasty (TAA) is increasingly used as a treatment for end-stage ankle arthropathy. However, TAA may be more prone to complications, failure, and subsequent reoperations than ankle arthrodesis. The aim of this systematic review and meta-analysis is to generate an overview of the complications of TAA surgery. PubMed, EMBASE, and the Cochrane library were searched between 2000 and 2020 to identify all papers reporting on complications in TAA surgery. Meta-analysis was conducted based on the type of complication. Pooled estimates of complications were calculated using a random effects model. Risk of bias and quality were assessed using the Cochrane risk of bias and ROBINS-I tools. The confidence in estimates was rated and described according to the recommendations of the GRADE working group. One hundred twenty-seven studies were included in this systematic review. Combined, they reported on 16,964 TAAs with an average follow-up of 47.99 ± 29.18 months. The complications with the highest reported pooled incidence were intraoperative fracture at 0.06 (95% CI 0.04-0.08) (GRADE: very low) and impingement at 0.06 (95% CI 0.04-0.08) (GRADE: low). The reported complication incidence of TAA surgery is still high and remains a significant clinical problem that can severely hamper the long-term survival of the prosthesis. The results of this systematic review and meta-analysis can help guide surgeons in informing their patients about complication risks. Implementation of more stringent patient selection criteria might contribute to diminishing TAA complication rates.
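The pooled incidences above come from a random effects model; the review's exact software and transformation scale are not stated. As a rough illustration only, the classic DerSimonian-Laird estimator for pooling a complication proportion across studies can be written as follows (real analyses typically pool on a logit or double-arcsine scale, and the example data are invented):

```python
import numpy as np

def pooled_incidence_dl(events, totals):
    """Random-effects (DerSimonian-Laird) pooled proportion with a 95% CI.

    events, totals: per-study complication counts and numbers of TAAs.
    Raw proportions are pooled here only to keep the sketch short.
    """
    p = np.asarray(events) / np.asarray(totals)
    v = p * (1 - p) / np.asarray(totals)           # within-study variance
    w = 1 / v                                      # fixed-effect weights
    p_fixed = np.sum(w * p) / np.sum(w)
    q = np.sum(w * (p - p_fixed) ** 2)             # Cochran's Q heterogeneity statistic
    k = len(p)
    tau2 = max(0.0, (q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
    w_star = 1 / (v + tau2)                        # random-effects weights
    p_re = np.sum(w_star * p) / np.sum(w_star)
    se = np.sqrt(1 / np.sum(w_star))
    return p_re, (p_re - 1.96 * se, p_re + 1.96 * se)

# Example with made-up per-study data
print(pooled_incidence_dl(events=[6, 12, 3, 9], totals=[110, 250, 80, 190]))
```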
{ "pmid": 35872120, "language": "eng" }
Enhanced gait variability index and cognitive performance in Asian adults: Results from the Yishun Study. Although gait variability has been linked to cognitive decline among older adults, the lack of a comprehensive composite gait variability score has dampened the application of gait variability. Does the enhanced gait variability index (EGVI), a composite gait variability score, provide differential and useful information on cognitive decline in community-dwelling adults beyond that provided by gait speed? Healthy community-dwelling adults (n = 311) aged 21-90 were individually administered the Repeatable Battery for the Assessment of Neuropsychological Status (RBANS). Habitual gait speed and spatiotemporal parameters were measured using a 6 m instrumented walkway system. The EGVI for each participant was calculated from five spatiotemporal parameters: step length (cm), step time (s), stance time (s), single support time (s), and stride velocity (cm/s). Linear regression models, controlling for age, gender, and education, were built to examine the independent effects of EGVI or gait speed on global cognition and individual domains. Multiple regression revealed that gait speed contributed significantly to performance in the "Attention" domain (p = 0.04), whereas the EGVI contributed significantly to performance in the "Visuospatial" (p = 0.04) and "Delayed Memory" (p = 0.02) domains. The EGVI provides differential and useful information beyond gait speed alone and may offer a solution for measuring or tracking gait variability changes in relation to cognitive changes.
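The published EGVI formula is not reproduced in the abstract, so the sketch below is only a generic illustration of how a composite variability score can be built from the five listed parameters: summarize each as a coefficient of variation, z-score it against a reference cohort, and average. The reference values and the 100 ± 10 scaling are hypothetical, not the EGVI's actual definition.

```python
import numpy as np

# Hypothetical reference cohort means/SDs of each parameter's coefficient of
# variation (CV, %). These numbers are invented for illustration.
REFERENCE = {
    "step_length": (2.5, 0.8),
    "step_time": (2.0, 0.7),
    "stance_time": (2.2, 0.7),
    "single_support_time": (2.8, 0.9),
    "stride_velocity": (3.0, 1.0),
}

def composite_variability_index(strides):
    """strides: dict mapping parameter name -> array of per-stride values."""
    z = []
    for name, (ref_mean, ref_sd) in REFERENCE.items():
        x = np.asarray(strides[name], dtype=float)
        cv = 100.0 * x.std(ddof=1) / x.mean()  # subject's variability for this parameter
        z.append((cv - ref_mean) / ref_sd)     # standardized against the reference
    return 100.0 + 10.0 * np.mean(z)           # rescaled so the reference mean is 100
```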
{ "pmid": 35872122, "language": "eng" }
Leisure-time physical activity and mortality risk in type 2 diabetes: A nationwide cohort study. Physical activity improves insulin sensitivity, inhibits inflammation, and decreases the incidence of cardiovascular disease, a major cause of death in patients with diabetes. The Taiwan National Health Interview Survey collected baseline characteristics of socioeconomic level, education, marriage, and health behaviour, including leisure-time physical activity, in 2001, 2005, 2009, and 2013. The National Health Insurance research dataset 2000-2016 contained detailed information on medical conditions, including all comorbidities. All-cause and cardiovascular deaths were confirmed by the National Death Registry. A total of 4859 adults with type 2 diabetes were included in the analysis; 2389 (49%) were men, and the mean ± SD age was 60 ± 13 years. Kaplan-Meier curves of all-cause (log-rank P < 0.001) and cardiovascular death (log-rank P = 0.038) categorized by leisure-time physical activity showed a significant difference. The multivariable Cox regression model showed that those with more leisure-time physical activity had a significantly lower risk of all-cause death than those with no physical activity (physical activity of 1-800 MET-min/week: HR = 0.66, 95% CI: 0.54-0.81; physical activity of >800 MET-min/week: HR = 0.67, 95% CI: 0.56-0.81). A significant trend was also observed (P < 0.001). Similar results were observed for cardiovascular mortality (physical activity of 1-800 MET-min/week: HR = 0.54, 95% CI: 0.36-0.84; physical activity of >800 MET-min/week: HR = 0.78, 95% CI: 0.55-1.13). For those with diabetes, increased leisure-time physical activity was associated with a significantly reduced risk of all-cause and cardiovascular death. Further research is warranted to determine the proper prescription of physical activity to prolong healthy life.
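The mortality analysis above relies on a multivariable Cox model with physical-activity categories compared against an inactive reference. A minimal sketch with the lifelines library on simulated data follows; the variable names, covariate set, and coding are placeholders, not the study's actual model.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 500
# Simulated cohort: follow-up time in years, death indicator, and dummy-coded
# leisure-time physical activity with "none" as the reference category.
pa = rng.choice(["none", "1-800", ">800"], size=n, p=[0.35, 0.35, 0.30])
df = pd.DataFrame({
    "years": rng.exponential(8.0, n),
    "death": rng.binomial(1, 0.3, n),
    "pa_1_800": (pa == "1-800").astype(int),   # 1-800 MET-min/week vs none
    "pa_gt_800": (pa == ">800").astype(int),   # >800 MET-min/week vs none
    "age": rng.normal(60, 13, n),
})

cph = CoxPHFitter()
cph.fit(df, duration_col="years", event_col="death")
# The exp(coef) column gives the hazard ratio for each covariate
print(cph.summary[["exp(coef)", "exp(coef) lower 95%", "exp(coef) upper 95%"]])
```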
{ "pmid": 35872123, "language": "eng" }
Preoperative heart disease and risk for postoperative complications after pancreatoduodenectomy. Comorbidities increase the risk for postoperative complications after pancreatoduodenectomy. The importance of different categories of heart disease on postoperative outcomes has not been thoroughly studied. Patients aged ≥18 years undergoing pancreatoduodenectomy between 2008 and 2019 at Karolinska University Hospital, Sweden were included. Heart disease was defined as a preoperatively established diagnosis, and subcategorized into ischaemic, valvular, heart failure and atrial fibrillation. Postoperative outcome was analysed by multivariable regression. Out of 971 patients, 225 (23.3%) had heart disease. Heart disease was associated with an increased risk for complications; Clavien-Dindo score ≥ IIIa (Odds Ratio [OR] 1.53, 95% confidence interval [CI] 1.07-2.18; p = 0.019), intensive care unit admissions (OR 3.20, 95% CI 1.81-5.66; p < 0.001) and longer hospitalizations (median 14 vs. 11 days; p < 0.001). Although heart disease was not associated with 90-day mortality, it conferred a shorter median overall survival (22 vs. 32 months; p < 0.001). Atrial fibrillation and heart failure were each associated with increased risk for postoperative complications, whereas ischaemic and valvular heart disease were not. Atrial fibrillation and heart failure were independently associated with increased risk for postoperative complications. Despite no association with early postoperative mortality, heart disease negatively affected long-term survival.
{ "pmid": 35872125, "language": "eng" }
Glycemic status and the association of change in blood pressure with incident cardiovascular disease. The clinical benefit of blood pressure (BP) reduction in individuals with diabetes has not been fully elucidated. We sought to identify the clinical impact of BP reduction on incident cardiovascular disease in people with diabetes and hypertension. We conducted a retrospective cohort study including 754,677 individuals (median age 47 years, 75.8% men) with stage 1/stage 2 hypertension. Participants were categorized using fasting plasma glucose (FPG) at baseline as normal FPG (FPG <100 mg/dL) (n = 517,372), prediabetes (FPG: 100-125 mg/dL) (n = 197,836), or diabetes mellitus (FPG ≥126 mg/dL) (n = 39,469). The primary outcome was heart failure (HF), and the secondary outcomes included ischemic heart disease (IHD), comprising myocardial infarction and angina pectoris, and stroke. Over a mean follow-up of 1111 ± 909 days, 18,429 HF events, 17,058 IHD events, and 8,795 strokes were recorded. Reduction in BP to <120/80 mmHg at 1 year was associated with a lower risk of developing HF (HR: 0.77, 95% CI: 0.72-0.82), IHD (HR: 0.84, 95% CI: 0.79-0.89), and stroke (HR: 0.75, 95% CI: 0.69-0.82) in individuals with normal FPG, whereas it was not associated with the risk of developing HF (HR: 0.98, 95% CI: 0.81-1.17) or stroke (HR: 0.82, 95% CI: 0.62-1.09) in those with DM. Interaction analyses showed that the influence of BP reduction on incident HF was attenuated in people with prediabetes or DM. A multitude of sensitivity analyses confirmed our results. The association of BP reduction with the risk of developing HF was attenuated with deteriorating glucose tolerance. An optimal management strategy for hypertensive people with prediabetes or DM for the prevention of cardiovascular disease (particularly HF) needs to be established.
{ "pmid": 35872126, "language": "eng" }
Predictors of Intensive Care Unit Stay in Patients with Acute Traumatic Spinal Cord Injury Above T6. The objective of this study was to identify factors associated with the intensive care unit (ICU) length of stay (LOS) of patients with an acute traumatic spinal cord injury above T6. We performed a retrospective, observational study of patients admitted to an ICU between 1998 and 2017 (n = 241). The LOS was calculated using a cumulative incidence function, with death considered a competing event. Factors associated with the LOS were analyzed using both a cause-specific Cox proportional hazards regression model and a competing risk model. A multistate approach was also used to analyze the impact of nosocomial infections on the LOS. A total of 211 patients (87.5%) were discharged alive from the ICU (median LOS = 23 days), and 30 (12.4%) died (median LOS = 11 days). In the multivariate analysis adjusting for variables collected 4 days after ICU admission, a higher American Spinal Injury Association motor score (subdistribution hazard ratio [sHR] = 1.01), neurological level C5-C8 (HR = 0.64), and a lower Sequential Organ Failure Assessment score (sHR = 0.82) and fluid balance (sHR = 0.95) on day 4 were linked to a shorter LOS in this unit. In the multivariate analysis, the onset of an infection was significantly associated with a longer LOS when adjusting for variables collected both at ICU admission (adjusted sHR = 0.62; 95% confidence interval = 0.50-0.77) and on day 4 (adjusted hazard ratio = 0.65; 95% confidence interval = 0.52-0.80). After adjusting the data for conventional variables, we identified a lower American Spinal Injury Association motor score, injury level C5-C8, a higher Sequential Organ Failure Assessment score on day 4, a more positive fluid balance on day 4, and the onset of an infection as factors independently associated with a longer ICU LOS.
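The LOS analysis above estimates a cumulative incidence function with death as a competing event. A compact sketch of that idea using the Aalen-Johansen estimator in lifelines, on simulated data (the event codes and rates are invented for illustration):

```python
import numpy as np
from lifelines import AalenJohansenFitter

rng = np.random.default_rng(7)
n = 241
# Simulated ICU stays: event 1 = discharged alive, 2 = died in the ICU
# (competing event), 0 = censored. Rates are invented for illustration.
days = rng.exponential(20.0, n).round() + 1
event = rng.choice([1, 2, 0], size=n, p=[0.85, 0.12, 0.03])

# Aalen-Johansen estimator: cumulative incidence of live discharge with death
# treated as a competing event (naively censoring deaths would bias the LOS).
ajf = AalenJohansenFitter()
ajf.fit(durations=days, event_observed=event, event_of_interest=1)
print(ajf.cumulative_density_.tail())
```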
{ "pmid": 35872127, "language": "eng" }
Equipoise for Lateral Access Surgery. To investigate the use of lateral access surgery among surgeons from the Asia-Pacific region and determine equipoise for areas of contentious use. A questionnaire was distributed to members of the Asia Pacific Spine Society. Surgeons were asked about their past experiences with lateral access surgery, including its advantages and disadvantages, specific surgical strategies, choices in implant-related factors, order of levels to operate on in multilevel reconstruction surgery, and postoperative complications. A total of 69 of 102 surgeons (67.6%) had performed lateral access surgery previously. In total, 56 participating surgeons (54.9%) agreed that anterior column reconstruction via lateral access is, most of the time, superior to transforaminal lumbar interbody fusion and other techniques. Surgeons would consider laminectomy instead of indirect decompression in the presence of severe central or lateral recess stenosis, thickened ligamentum flavum, and facet joint hypertrophy. For the order of levels to operate on in multiple-level reconstruction for deformity, where 1 stands for L3-L4 or higher, 2 stands for L4-L5, and 3 stands for L5-S1, the order 2-1-3 (28/95, 29.5%) was most common, followed by 1-2-3 (26/95, 27.4%) and 3-2-1 (21/95, 22.1%). Lateral access surgery is seeing greater use in the Asia-Pacific region, especially in upper middle- to high-income countries, whereas the keenness of surgeons practicing in lower middle- to low-income countries could be improved with more training, resources, and reasonable costs. A high percentage of surgeons do not consider indirect decompression for spinal stenosis. There was no consensus on the order of levels in multiple-level reconstruction for deformity.
{ "pmid": 35872128, "language": "eng" }
Influence of Spinal Deformity Construct Design on Adjacent-Segment Biomechanics. Adjacent level degeneration is a precursor to construct failure in adult spinal deformity surgery, but whether construct design affects adjacent level degeneration risk remains unclear. Here we present a biomechanical profile of common deformity correction constructs and assess adjacent level biomechanics. Standard nondestructive flexibility tests (7.5 Nm) were performed on 21 cadaveric specimens: 14 pedicle subtraction osteotomies (PSOs) and 7 anterior column realignment (ACR) constructs. The ranges of motion (ROM) at the adjacent free level in flexion, extension, axial rotation, and lateral bending were measured and analyzed. ACR constructs had a lower ROM change on flexion at the proximal adjacent free level than constructs with PSO (1.02 vs. 1.32, normalized to the intact specimen, P < 0.01). Lateral lumbar interbody fusion adjacent to PSO and 4 rods limits ROM at the free level more effectively than transforaminal interbody fusion and 2 rods in correction constructs with PSO. Use of 2 screws to anchor the ACR interbody further decreased ROM at the proximal adjacent free level on flexion, but adding 4 rods in this setting added no further limitation to adjacent segment motion. ACR constructs have less ROM change at the adjacent level compared to PSO constructs. Among constructs with ACR, anchoring the ACR interbody with 2 screws reduces motion at the proximal adjacent free level. When PSOs are used, lateral lumbar interbody fusion adjacent to the PSO level has a greater reduction in adjacent-segment motion than transforaminal interbody fusion, suggesting that deformity construct configuration influences proximal adjacent-segment biomechanics.
{ "pmid": 35872130, "language": "eng" }
Development of Consensus-Based Recommendations to Prevent/Minimize Medication Errors in the Perioperative Care of Patients with Epilepsy: A Mixed-Methods Study. This study explored medication errors in the perioperative care of patients with epilepsy and developed consensus-based recommendations to prevent/minimize these errors. A mixed-methods design was used. Medication error situations were explored in semi-structured in-depth interviews with nurses (n = 12), anesthesiologists (n = 5), anesthesia technicians (n = 5), surgeons (n = 4), neurologists (n = 4), and patients with epilepsy (n = 10). The qualitative data were analyzed using the qualitative interpretive description approach. A two-round Delphi technique was used among nurses (n = 22), anesthesiologists (n = 9), anesthesia technicians (n = 7), surgeons (n = 7), and neurologists (n = 5). A total of 1400 minutes of interview time was analyzed in this study. Of the panelists, 39 (78.0%) agreed that patients with epilepsy present unique challenges to providers of perioperative care that make them prone to medication errors. The interviewees described 32 different medication error situations that occurred while providing perioperative care services to patients with epilepsy. In this study, 35 consensus-based recommendations to prevent/minimize medication errors in the perioperative care of patients with epilepsy were developed. The findings of this study are informative to decision-makers in health care facilities and other stakeholders in health regulatory authorities who need to design measures to prevent/minimize medication errors and improve the perioperative outcomes of patients with epilepsy. Studies are needed to investigate whether these recommendations can be effective in preventing/reducing medication errors in the perioperative care of patients with epilepsy.
{ "pmid": 35872131, "language": "eng" }
Determination of Patient Acceptable Symptom State for the Oswestry Disability Index Score in Patients Who Underwent Minimally Invasive Discectomy for Lumbar Disc Herniation: 2-Year Follow-up Data from a Randomized Controlled Trial. We aim to determine the patient acceptable symptom state (PASS) for the Oswestry Disability Index (ODI) score in patients undergoing minimally invasive discectomy for the treatment of lumbar disc herniation (LDH). A post hoc analysis of prospectively collected, 2-year follow-up data was conducted. The anchor for determination of PASS was the European Quality of Life Visual Analog Scales question, and the Pearson correlation test was performed to evaluate its validity. Receiver operating characteristic (ROC) curve analysis was conducted to determine the PASS threshold for the ODI and to assess its discriminative ability. Sensitivity analyses were also carried out for an alternative definition of PASS, different follow-up periods, and different subgroups. A total of 222 patients (92.1%) completed the 2-year follow-up, 92.8% of whom considered their state to be acceptable. The areas under the ROC curve (AUC) were all >0.8, indicating high discriminative ability. The PASS threshold for the ODI was suggested to be 5 at 6 months (AUC: 0.80; sensitivity: 79.0%; specificity: 73.7%) and 2 years (AUC: 0.98; sensitivity: 90.3%; specificity: 100%) postoperatively. Despite some variation across body mass index and baseline ODI subgroups, sensitivity analyses showed that the above-mentioned threshold was robust. An ODI of 5 was noted to be the PASS threshold for patients who received minimally invasive discectomy for the treatment of LDH. This ODI threshold was robust and is therefore recommended as the ultimate goal of minimally invasive treatment for LDH, which can help present the results of clinical research at an individual level.
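The abstract does not state which ROC criterion selected the threshold; a common anchor-based choice is the cutoff maximizing the Youden index (sensitivity + specificity - 1). A sketch under that assumption:

```python
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

def pass_threshold(odi, acceptable):
    """Anchor-based PASS cutoff for the ODI via the Youden index (assumed criterion).

    odi        : ODI scores at follow-up (lower = less disability)
    acceptable : 1 if the patient reports an acceptable symptom state, else 0
    """
    odi = np.asarray(odi, dtype=float)
    y = np.asarray(acceptable, dtype=int)
    # roc_curve expects higher scores for the positive class, so negate the ODI
    fpr, tpr, thresholds = roc_curve(y, -odi)
    best = thresholds[np.argmax(tpr - fpr)]  # maximize sensitivity + specificity - 1
    return -best, roc_auc_score(y, -odi)     # cutoff on the ODI scale, and the AUC
```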
{ "pmid": 35872132, "language": "eng" }
Decompressive Craniectomy for Pediatric Traumatic Brain Injury in Low-and-Middle Income and High Income Countries. Traumatic brain injury is one of the leading causes of mortality and morbidity in children worldwide. In severe cases, high intracranial pressure is the most frequent cause of death. When first-line medical management fails, the neurosurgical procedure of decompressive craniectomy (DC) has been proposed for controlling intracranial pressure and improving long-term outcomes in children with severe traumatic brain injury. However, the use of this procedure is controversial. The evidence from clinical trials shows some promise for the use of DC as an effective second-line treatment, but it is limited by conflicting trial results, a lack of trials, and a high risk of bias. Furthermore, most research comes from retrospective observational studies and case series. This narrative review considers the strength of evidence for the use of DC in both high-income country and low-and-middle income country settings and examines how we can improve study design to better assess the efficacy of this procedure and increase the clinical translatability of results to centers worldwide. Specifically, we argue for further studies with higher pediatric participant numbers, multicenter collaboration, and the use of a more consistent methodology to enable comparability of results among settings.
{ "pmid": 35872134, "language": "eng" }
Anterior Thoracic Diskectomy and Fusion: Surgical Technique and Anatomic Considerations. Ossification of the posterior longitudinal ligament (OPLL) is a rare condition that can lead to progressive spinal cord compression.1 Currently, surgical decompression remains the optimal treatment in symptomatic patients.2,3 In cases with significant thoracic stenosis and concern for ventral erosion of the dura, an anterior approach may be necessary for direct decompression.4 In Video 1, we demonstrate the successful application of a multidisciplinary approach for surgical resection of a large OPLL lesion located at the T2-3 disk space. A 37-year-old female with medical history significant for rickets presented a year after a fall with bilateral lower extremity paraparesis and saddle anesthesia. Exposure consisted of a manubrial window, followed by thoracic diskectomy and fusion with drilling of the calcified posterior longitudinal ligament. Major steps within this video include 1) a summary of the patient presentation and preoperative imaging, 2) exposure of thoracic vertebrae via a manubrial window approach, 3) thoracic diskectomy and fusion with take-down of calcified posterior longitudinal ligament, and 4) a review of the postoperative imaging. The patient tolerated the procedure well with immediate relief of symptoms and was subsequently discharged on postoperative day 1 with no complications. This operative video illustrates the technical steps and capabilities of an anterior approach, achieving near-complete gross total resection of an OPLL lesion using a multidisciplinary approach. The patient consented to this procedure.
{ "pmid": 35872133, "language": "eng" }
Recent Trends in Medicare Utilization and Reimbursement for Spinal Cord Stimulators: 2000-2019. Spinal cord stimulators (SCS) allow spine surgeons to provide relief for patients who suffer from chronic pain due to several disorders, such as failed back surgery syndrome, complex regional pain syndrome, and neuropathy. Despite this, there remains a paucity of data regarding the utilization and reimbursement of SCS. Therefore, the purpose of this study is to evaluate the monetary and procedural trends for spinal cord stimulators in the Medicare database from 2000 to 2019. Medicare Part B National Summary Data files, which are publicly available, were used. These files contain data from 2000 to 2019 on all services billed to Medicare within that time frame. Each service is given a Current Procedural Terminology (CPT) code; the number of times that service was performed, as well as the total physician Medicare charges and reimbursements for each service annually, are included in the data set. The CPT codes for percutaneous and open placement of spinal cord stimulators were identified: 63650 and 63655, respectively. The total allowed services, allowed charges, and actual payments were isolated from the data set for each year for each CPT code. The total allowed charges and actual payments for each year were then divided by the total allowed services to find and trend the allowed charge and actual payment per individual service performed for both percutaneous and open placement of spinal cord stimulators. There were 992,372 Medicare-approved percutaneous spinal cord stimulator operations and 99,736 Medicare-approved open spinal cord stimulator operations from 2000 to 2019. Medicare paid $1.02 billion (2019 U.S. dollars) in reimbursement to physicians for percutaneous spinal cord stimulator operations and nearly $145 million (2019 U.S. dollars) for open spinal cord stimulator operations. From 2000 to 2019, there was an average 21.9% annual increase in Medicare-approved percutaneous spinal stimulator placement operations and an 18.4% annual increase in Medicare-approved open spinal stimulator placement operations. During this time, there was also an average 8.7% annual increase in Medicare reimbursement per percutaneous spinal stimulator placement operation and a 9.1% annual increase in Medicare reimbursement per open spinal stimulator placement operation. The results of this study show that the numbers of percutaneous and open procedures steadily increased from 2000 to 2019. Reimbursement per procedure also increased steadily over this time. Identifying these trends is important to promote research into the costs of these surgeries and ensure adequate resource allocation.
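The per-service figures described above reduce to dividing each year's total charges and payments by that year's allowed services, then trending the quotient. A pandas sketch of that arithmetic follows; the file and column names are hypothetical, since the real Part B National Summary files are published per year and would need to be combined into one table first.

```python
import pandas as pd

# Hypothetical file/column names -- the real Part B National Summary files are
# per-year and must be combined into one table with these fields beforehand.
df = pd.read_csv("partb_summary_2000_2019.csv")
scs = df[df["hcpcs_code"].isin(["63650", "63655"])].copy()  # percutaneous / open SCS

# Per-service values: each year's totals divided by that year's allowed services
scs["charge_per_service"] = scs["allowed_charges"] / scs["allowed_services"]
scs["payment_per_service"] = scs["payments"] / scs["allowed_services"]

def mean_annual_growth(g):
    """Average year-over-year percent change in payment per service."""
    g = g.sort_values("year")
    return 100.0 * g["payment_per_service"].pct_change().mean()

print(scs.groupby("hcpcs_code").apply(mean_annual_growth))
```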
{ "pmid": 35872135, "language": "eng" }
A hybrid return to baseline imputation method to incorporate MAR and MNAR dropout missingness. Missing data are inevitable in longitudinal clinical trials due to intercurrent events (ICEs) such as treatment interruption or premature discontinuation for different reasons. The missing at random (MAR) assumption is usually unverifiable, and sensitivity analyses are often requested under the missing not at random (MNAR) assumption. Return to baseline (RTB) imputation is a commonly used MNAR method. In practice, not all dropout missingness can be assumed MNAR. For example, missingness or dropouts due to COVID-19 can reasonably be assumed MAR. Therefore, traditional RTB is not applicable when there is both MAR and MNAR dropout missingness. Here we propose a hybrid strategy for RTB imputation that can handle missing data due to MAR and MNAR dropouts at the same time. A standard multiple imputation approach is proposed, and an analytic likelihood-based approach is derived to improve efficiency.
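As a rough illustration of the hybrid strategy (not the authors' derivation), the sketch below multiply imputes a single post-dropout visit: MAR dropouts are imputed from the completers' change-from-baseline distribution, while MNAR dropouts return to baseline (zero mean change). A proper implementation would draw the imputation-model parameters from their posterior (e.g., within an MMRM) before each imputation and combine the analyses with Rubin's rules.

```python
import numpy as np

rng = np.random.default_rng(0)

def hybrid_rtb_impute(baseline, observed, dropout_type, n_imp=100):
    """Hybrid RTB multiple imputation for one post-dropout visit (sketch only).

    baseline     : baseline values, shape (n,)
    observed     : post-baseline values with np.nan where missing
    dropout_type : 'mar' (e.g., COVID-19-related dropout) or 'mnar' per subject
    """
    baseline = np.asarray(baseline, dtype=float)
    observed = np.asarray(observed, dtype=float)
    change = observed - baseline
    seen = ~np.isnan(observed)
    mu = np.mean(change[seen])          # completers' mean change (MAR model)
    sd = np.std(change[seen], ddof=1)   # shared residual SD (simplification)

    imputed_sets = []
    for _ in range(n_imp):
        filled = observed.copy()
        for i in np.where(~seen)[0]:
            drift = mu if dropout_type[i] == "mar" else 0.0  # RTB: zero drift
            filled[i] = baseline[i] + rng.normal(drift, sd)
        imputed_sets.append(filled)
    return imputed_sets  # analyze each set, then pool with Rubin's rules
```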
{ "pmid": 35872140, "language": "eng" }
Intravascular ultrasound evaluation during iliofemoral venous stenting is associated with improved midterm patency outcomes. Intravascular ultrasound (IVUS) examination is increasingly used in the treatment of iliofemoral venous disease and provides more sensitive and specific detection of stenotic lesions than traditional multiplanar venography alone. Correlations with deep venous stent patency, however, have not yet been investigated. The objective of this study was to evaluate the impact of using IVUS examination in addition to multiplanar venography on iliofemoral venous stent patency. Consecutive patients who underwent stenting for symptomatic thrombotic or nonthrombotic iliofemoral venous lesions (NIVLs) between 2014 and 2020 at a single institution were identified and divided into two groups based on whether IVUS examination was used in addition to multiplanar venography before stent deployment, or venography alone was used. A retrospective review of demographic, operative, and follow-up data was performed. Thirty-day and 2-year stent patency were measured as primary end points. χ2 analysis, logistic regression models, and Kaplan-Meier survival analysis were used to determine outcomes. Technical details and outcomes were additionally examined among patients treated for acute deep venous thrombosis, post-thrombotic syndrome, or NIVLs separately on subgroup analysis. We identified 150 patients (173 limbs, 23 bilateral) who underwent iliofemoral stenting during the study period at our institution (mean age: 48.8 ± 16.8 years, 61% female). Adjunctive IVUS utilization before stent deployment was reported in 69 of 173 (39.9%) treated limbs. IVUS examination was more likely to be used in patients who underwent stenting for NIVLs than for thrombotic disease (41.0% vs 11.2%, P < .01). There was no difference in the number of stents deployed between the IVUS and non-IVUS cohorts. However, IVUS examination was associated with an increased total stent length (126 ± 56 vs 112 ± 48 mm, P = .04) and a higher rate of infrainguinal stent extension (17.4% vs 6.7%, P = .03). In addition, mean stent diameter was significantly higher when IVUS examination was performed before stent placement (16.3 ± 3.7 vs 15.2 ± 1.9 mm, P < .01). Both 30-day (98.5% vs 89.4%, P = .02) and 2-year (90.3% vs 78.7%, P = .03) primary patency were significantly higher in the IVUS cohort. Adjunctive IVUS use was found to significantly protect against stent reintervention at 2 years on adjusted Cox regression analysis (hazard ratio: 0.22, 95% confidence interval: 0.07-0.71, P = .01). Adjunctive IVUS utilization is associated with differences in stent diameter and length selection as well as landing segments in the treatment of thrombotic and nonthrombotic iliofemoral venous disease. IVUS examination before stent deployment significantly protects against 30-day and 2-year stent reintervention compared with multiplanar venography alone. These data provide stronger evidence for routine IVUS use in addition to venography before iliofemoral venous stenting.
{ "pmid": 35872142, "language": "eng" }
Treatment and contemporary outcomes associated with adjunct tourniquet use during phlebectomy of complex, voluminous truncular varicosities. Phlebectomy of large voluminous varicose veins comes with a risk of substantial blood loss. The purpose of the present study was to investigate the outcomes associated with the use of an adjunct tourniquet during varicose vein surgery of complex and large truncular varicosities. The prospectively collected registry data included anatomic and outcome details for patients who presented with complex and large truncular varicosities with a CEAP clinical class of C2 or higher (indicating more serious venous disease) from December 2014 to December 2021. Of all patients, those treated with an adjunct tourniquet for large complex varicosities (largest diameter varicosity ≥1 cm by visual inspection) were selected for analysis. The venous clinical severity scores (VCSSs) and patient-reported outcomes (PROs) were obtained. Additional parameters, including operative time, tourniquet time, and blood loss, were obtained retrospectively via a review of the medical records. Univariate descriptive statistics of the demographic and procedural data were performed pre- and postoperatively, with comparisons performed using the Student two-tailed t test. The data from 19 patients (22 limbs; 7 women and 12 men) were analyzed. Of the 22 limbs, 11 (50%) had advanced venous disease of C4 or higher preoperatively. A review of the preoperative duplex ultrasound scans confirmed the presence of large varicosities (average, 1.0 ± 0.54 cm; n = 18). All the limbs were treated using radiofrequency ablation of axial reflux and phlebectomy (a combination of powered and stab) under tourniquet control (82%) or phlebectomy under tourniquet control alone (18%). The average tourniquet time was 40 ± 12 minutes, with a median blood loss of 50 mL (interquartile range, 30-100 mL). The average follow-up was 332 ± 422 days after 19 procedures for 16 patients (2 patients moved out of state during the immediate postoperative period and 1 patient was lost to follow-up). Of the limbs with >3 months of follow-up, 14 experienced improvement in CEAP class, 5 had no change, and 3 belonged to patients who moved or were lost to follow-up. The VCSSs improved significantly (8.8 ± 2.8 vs 3.9 ± 1.9; P < .0001). The PROs also improved significantly (16.1 ± 5.0 vs 2.2 ± 2.3; P < .0001). Tourniquet use in the treatment of varicosities has previously been described only in the setting of high ligation and stripping. Our data suggest that in the modern era of minimally invasive endovenous treatment of axial reflux and phlebectomy, adjunct tourniquet use during the treatment of large complex varicosities can result in significant improvements in the VCSSs and PROs, with minimal blood loss.
{ "pmid": 35872138, "language": "eng" }
Computed tomography reference dimensions for identification of stented surgical mitral bioprostheses valve size. Selection of the transcatheter heart valve size for a mitral valve-in-valve procedure is based on the type and the manufacturer's labeled size. However, accurate information on surgical heart valve (SHV) size may not be available in the patient's medical record. The purpose of this study is to establish reference data for computed tomography (CT) dimensions of commonly used mitral SHVs in order to determine the manufacturer's labeled size from a cardiac CT data set. CT datasets of 105 patients with a surgical mitral bioprosthesis and an available manufacturer's labeled size were included in the analysis. CT-derived valve dimensions were assessed by two observers using multiplanar reformats aligned with the basal sewing ring. A circular region of interest was used in a standardized fashion to minimize the influence of image acquisition and reconstruction parameters. Interobserver variability was assessed by Bland-Altman analysis. The CT-derived dimensions were stratified by valve size and type, and SHV properties were demonstrated for 5 common valve types. Variability of measurements was small, and interobserver limits of agreement were narrow. Stratified by SHV type, no overlap was noted for CT-derived dimensions among different SHV sizes. A reference table of CT characteristics of surgical mitral bioprosthesis types was created. The study provides reference CT data for determining the manufacturer's labeled SHV size across a range of commonly used mitral SHVs. The findings will be important to help identify types of surgical mitral bioprostheses using CT characteristics for patients without SHV size documentation.
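As a companion to the Bland-Altman interobserver analysis mentioned above, the following sketch shows how the bias and 95% limits of agreement are conventionally computed; the two observer arrays are hypothetical example measurements in millimeters, not study data.

```python
# Bland-Altman bias and 95% limits of agreement for two observers.
import numpy as np

obs1 = np.array([24.1, 26.8, 29.3, 31.0, 27.5])  # observer 1 (mm)
obs2 = np.array([24.4, 26.5, 29.6, 30.7, 27.9])  # observer 2 (mm)

diff = obs1 - obs2
bias = diff.mean()                   # systematic difference between observers
half_width = 1.96 * diff.std(ddof=1)
print(f"bias = {bias:.2f} mm, 95% limits of agreement = "
      f"[{bias - half_width:.2f}, {bias + half_width:.2f}] mm")
```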
{ "pmid": 35872143, "language": "eng" }
Direct oral anticoagulant agents might be safe for patients undergoing endovenous radiofrequency and laser ablation. Studies assessing the effect of the use of anticoagulant agents on endovenous thermal ablation (ETA) have been limited to patients taking warfarin. Thus, the aim of the present study was to assess the efficacy and safety of ETA for patients taking direct oral anticoagulants (DOACs). We hypothesized that the outcomes of ETA for patients taking DOACs would not differ from the outcomes for patients not taking DOACs. We performed a retrospective review to identify patients who had undergone radiofrequency ablation or endovenous laser ablation with 1470-nm diode laser fibers for symptomatic great or small saphenous venous reflux from 2018 to 2020. The patients were dichotomized into those who had received a therapeutic dose of DOACs periprocedurally and those who had not (control group). The outcomes of interest included the rates of treated vein closure at 7 days and 9 months and the incidence of deep vein thrombosis (DVT), endothermal heat-induced thrombosis (EHIT), and bleeding periprocedurally. Of the 301 patients (382 procedures), 69 patients (87 procedures) had received DOACs and 232 control patients (295 procedures) had not received DOACs. The patients receiving DOACs were older (mean age, 65 years vs 55 years; P < .001) and more often male (70% vs 37%; P < .001), with a higher prevalence of venous thromboembolism and more severe CEAP (clinical, etiologic, anatomic, pathophysiologic) classification (5 or 6), than were the control patients. Those receiving DOACs were more likely to have had a history of DVT (44% vs 6%; P < .001), pulmonary embolism (13% vs 0%; P < .001), and phlebitis (32% vs 15%; P < .001). Procedurally, radiofrequency ablation had been used more frequently in the control group (92% vs 84%; P = .029), with longer segments of treated veins (mean, 38 mm vs 35 mm, respectively; P = .028). No major or minor bleeding events, nor any EHIT, occurred in either group. Two patients in the control group (0.7%) developed DVT; however, no DVT was observed in the DOAC group (P = .441). At 9 months, the treated vein had remained ablated after 94.4% of procedures for patients receiving DOACs and after 98.4% of procedures in the control group (P = .163). On multivariable analysis, DOAC usage was not associated with an increased risk of vein recanalization (hazard ratio, 5.76; 95% confidence interval, 0.57-58.64; P = .139). An increased preprocedural vein diameter and the use of endovenous laser ablation were associated with an increased risk of recanalization. In our study of patients who had undergone ETA for symptomatic saphenous venous reflux, the periprocedural use of DOACs did not adversely affect the efficacy of endovenous ablation through ≥9 months. Furthermore, DOAC use did not confer an additional risk of bleeding, DVT, or EHIT periprocedurally. DOACs may be safely continued without affecting the efficacy and durability of ETA.
{ "pmid": 35872141, "language": "eng" }
Venous thromboembolism as the first sign of malignancy. Venous thromboembolism (VTE) is commonly associated with hypercoagulability in patients with cancer; however, there have been few investigations of VTE as the first sign of malignancy and even fewer performed in the United States. The aim of our study was to evaluate the incidence and predictors of unrecognized malignancy in patients presenting with VTE. We performed a 1-year retrospective analysis of the Nationwide Readmission Database, including patients aged 18 years or older presenting with a primary diagnosis of deep vein thrombosis (DVT) or a pulmonary embolism (PE). Patients known to have preexisting malignant diseases were excluded. Outcomes included the rate of newly diagnosed malignancy within 6 months from the discovery of VTE and demographic or associated illness predictors for the diagnosis of malignancy. A regression analysis was performed, based on which a VTE Malignancy score was developed. A total of 116,048 patients were identified with VTE (49.8% DVT, 41.7% PE, 8.6% both DVT and PE), of whom 16% (n = 18,294) had a known malignancy. Of the remaining 97,754 patients, 31% were readmitted within 6 months. The incidence of newly diagnosed malignancy within 6 months was 2.4% (n = 2354). The most common malignancies were gastrointestinal in origin (29.2%). Demographic and diagnostic predictors for malignancy included age 65 years or older, female sex, inferior vena cava (IVC) thrombus, upper extremity thrombus, and a Charlson Comorbidity Index score of 5 or more. Receiver operating characteristic curve analysis found a cutoff VTE Malignancy score of 3 (sensitivity, 86%; specificity, 89%) to be predictive of an increased risk of a newly discovered malignancy within 6 months. VTE can be a risk indicator of underlying malignancy. Validation of a patient risk stratification score using multiple demographic or comorbid predictors for VTE on index admission may offer an opportunity for earlier diagnosis of occult malignancy.
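To make the reported cutoff concrete, here is a hedged sketch of evaluating a composite risk score against a binary outcome at a fixed threshold, in the spirit of the VTE Malignancy score analysis above; the scores and outcomes are fabricated toy values, not registry data.

```python
# Sensitivity, specificity, and AUC for a risk score at a fixed cutoff.
import numpy as np
from sklearn.metrics import roc_auc_score

score = np.array([1, 4, 2, 5, 0, 3, 6, 2, 4, 1])            # toy composite scores
new_malignancy = np.array([0, 1, 0, 1, 1, 1, 1, 0, 0, 0])   # toy outcomes

cutoff = 3
pred = (score >= cutoff).astype(int)
tp = ((pred == 1) & (new_malignancy == 1)).sum()
fn = ((pred == 0) & (new_malignancy == 1)).sum()
tn = ((pred == 0) & (new_malignancy == 0)).sum()
fp = ((pred == 1) & (new_malignancy == 0)).sum()

print("sensitivity:", tp / (tp + fn))   # true-positive rate at the cutoff
print("specificity:", tn / (tn + fp))   # true-negative rate at the cutoff
print("AUC:", roc_auc_score(new_malignancy, score))
```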
{ "pmid": 35872137, "language": "eng" }
Association of left ventricular diastolic function with coronary artery calcium score: A Project Baseline Health Study. Coronary artery calcium (CAC) and left ventricular diastolic dysfunction (LVDD) are strong predictors of cardiovascular events and share common risk factors. However, their independent association remains unclear. In the Project Baseline Health Study (PBHS), 2082 participants underwent cardiac-gated, non-contrast chest computed tomography (CT) and echocardiography. The association between left ventricular (LV) diastolic function and CAC was assessed using multidimensional network and multivariable-adjusted regression analyses. Multivariable analysis was conducted on continuous LV diastolic parameters and categorical classification of LVDD and adjusted for traditional cardiometabolic risk factors. LVDD was defined using reference limits from a low-risk reference group without established cardiovascular disease, cardiovascular risk factors, or evidence of CAC (n = 560). We also classified LVDD using the American Society of Echocardiography recommendations. The mean age of the participants was 51 ± 17 years, with 56.6% female and 62.6% non-Hispanic White. Overall, 38.1% had hypertension; 13.7% had diabetes; and 39.9% had CAC >0. An intertwined network was observed between diastolic parameters, CAC score, age, LV mass index, and pulse pressure. In the multivariable-adjusted analysis, e', E/e', and LV mass index were independently associated with CAC after adjustment for traditional risk factors. For both e' and E/e', the effect size and statistical significance were higher across increasing CAC tertiles. Other independent correlates of e' and E/e' included age, female sex, Black race, height, weight, pulse pressure, hemoglobin A1C, and HDL cholesterol. The independent association with CAC was confirmed using categorical analysis of LVDD, which occurred in 554 participants (26.6%) using population-derived thresholds. In the PBHS study, the subclinical coronary atherosclerotic disease burden detected using CAC scoring was independently associated with diastolic function. ClinicalTrials.gov identifier: NCT03154346.
{ "pmid": 35872144, "language": "eng" }
Comparison between a dedicated venous stent and standard composite Wallstent-Z stent approach to iliofemoral venous stenting: Intermediate-term outcomes. Dedicated venous stents have not been used in the management of symptomatic chronic iliofemoral venous obstruction (CIVO) until recently. The Bard Venovo stent (Becton, Dickinson, and Co, Franklin Lakes, NJ) is one such stent noted to have an increased chronic outward force and radial resistive force compared with the Wallstent (Boston Scientific, Marlborough, MA). In the present study, we compared the outcomes following use of the Bard Venovo stent with those of a matched cohort of limbs that had undergone stenting with the Wallstent-Zenith (Z) stent (Cook Medical Inc, Bloomington, IN) composite configuration. A review of contemporaneously entered electronic medical record data for 167 patients (167 limbs) with initial iliofemoral stents placed from 2019 to 2020 for quality of life (QOL)-impairing CIVO that had failed conservative therapy was performed. The visual analog scale for pain score (score, 0-10), grade of swelling (score, 0-4), venous clinical severity score (score, 0-27), and the 20-item chronic venous insufficiency quality of life questionnaire instrument for QOL were evaluated before and after intervention to assess the effects of stenting. A Kaplan-Meier analysis was used to examine primary, primary-assisted, and secondary stent patency, and analysis of variance with repeated measures was used to compare clinical outcomes. A total of 167 limbs underwent stenting (56 men and 111 women). Their median age was 61 years. The laterality was right and left in 70 and 97 limbs, respectively. Post-thrombotic syndrome was seen in 84 limbs and nonthrombotic iliac vein lesions/May-Thurner syndrome in 83 limbs. At 6 months, the venous clinical severity score had improved from 7 to 4 in the limbs with a unilateral Venovo (UV) stent and from 5 to 4 in the composite Wallstent-Z stent group (P = .9). The grade of swelling had improved from 3 to 1 in the UV group and from 3 to 1 in the composite group (P = .6), and the visual analog scale for pain score had improved from 7 to 2 in the UV group and from 5 to 0 in the composite group (P = .007). At 12 months, ulcers had healed in 53% (8 of 15) of the UV group and 56% (5 of 9) of the composite group (P = .7). The global 20-item chronic venous insufficiency quality of life questionnaire scores had improved from 58 to 28 in the UV group and from 59 to 40 in the composite group (P = .6). The cumulative primary, primary-assisted, and secondary patency at 18 months was 81%, 97%, and 98% in the UV group and 87%, 98%, and 100% in the composite group, respectively (P > .4). No difference in the reintervention rates was noted between the two groups (P = .5). For patients who had undergone stenting for QOL-impairing CIVO, the results with the Bard Venovo venous stent were comparable to those with the composite Wallstent-Z stent configuration for clinical outcomes, QOL improvement, and stent patency. Further study is, however, required to confirm this improvement in the long term.
{ "pmid": 35872145, "language": "eng" }
Pulmonary embolism in pregnancy and the puerperium. Pulmonary embolism (PE) in pregnant women appears to be increasing. This could be related in part to improved health care allowing more women with risk factors to conceive, as well as to an increase in high-risk groups, which include pregnancies conceived with assisted reproductive technology, advancing maternal age, obesity, and caesarean deliveries. Prevention and early diagnosis with prompt effective treatment can reduce maternal mortality and improve pregnancy outcomes, so obstetricians should be on the lookout for venous thrombosis and PE, especially because, in the majority of cases, risk factors only start to emerge or develop in the course of pregnancy and delivery. Management includes accurate diagnosis with ventilation/perfusion scanning and CT pulmonary angiography, followed by effective anticoagulation and more aggressive measures such as thrombolysis as indicated, together with general supportive measures. Postpartum management should cover subsequent health issues, including breastfeeding, contraception, mood changes, and recurrence in subsequent pregnancies.
{ "pmid": 35872147, "language": "eng" }
Analysis of the information contained within TikTok videos regarding orthodontic retention. To analyze the content, reliability, and quality of relevant TikTok videos regarding orthodontic retention. Six relevant terms were searched on the TikTok social media Web site. Uploaded videos satisfying inclusion/exclusion criteria were assessed for the presence of nine predetermined content domains and categorized as "high-content" (≥5) or "low-content" (<5). Reliability was determined using the DISCERN instrument and quality by the Global Quality Score (GQS). Layperson videos were evaluated for themes using discourse analysis. Intraclass correlation coefficients for content, DISCERN scores, and GQS were calculated for intrarater reliability. Statistical analysis was performed with IBM SPSS Statistics (version 27.0.0.0; SPSS Inc, Chicago, IL). Orthodontists uploaded 37.8% and laypersons 34% of the 209 assessed videos. Just 22.1% of videos were considered to be 'high-content.' Quality-of-life issues related to retainer wear constituted the domain present least often in the videos provided by orthodontists (13.9%) and most often in those uploaded by laypersons (53.5%). High-content videos had greater mean GQS and DISCERN scores compared with low-content videos (P < 0.001). Strong positive associations existed between the number of domains present and DISCERN scores (rho = 0.808; P < 0.01) and between GQS and DISCERN scores (rho = 0.67; P < 0.01). Intraclass correlation coefficients ranged between 0.92 and 0.98. Dissatisfaction with the need for long-term retainer wear was a prevalent negative theme in layperson videos. The content, reliability, and quality of TikTok videos regarding orthodontic retention were poor. The orthodontic profession must use TikTok effectively to ensure it delivers high-quality information relevant to laypersons' concerns.
{ "pmid": 35872136, "language": "eng" }
Impact of membranous septum length on pacemaker need with different transcatheter aortic valve replacement systems: The INTERSECT registry. New permanent pacemaker implantation (new-PPI) remains a compelling issue after transcatheter aortic valve replacement (TAVR). Previous studies reported a relationship between a short membranous septum (MS) length and new-PPI after TAVR with a self-expanding transcatheter heart valve (THV). However, this relationship has not been investigated across the currently available THVs. Therefore, the aim of this study was to investigate the association between MS length and new-PPI after TAVR with different THV platforms. We included patients with a successful TAVR procedure and an analyzable pre-procedural multi-slice computed tomography scan. MS length was measured using a standardized methodology. The primary endpoint was the need for new-PPI within 30 days after TAVR. In total, 1811 patients were enrolled (median age 81.9 years [IQR 77.2-85.4], 54% male). PPI was required in 275 patients (15.2%): 14.2%, 20.7%, and 6.3% for the Sapien3, Evolut, and ACURATE THVs, respectively (p < 0.01). Median MS length was significantly shorter in patients with a new-PPI (3.7 mm [IQR 2.2-5.1] vs. 4.1 mm [IQR 2.8-6.0], p < 0.01). Shorter MS length was a predictor for PPI in patients receiving a Sapien3 (OR 0.87 [95% CI 0.79-0.96], p < 0.01) and an Evolut THV (OR 0.91 [95% CI 0.84-0.98], p = 0.03), but not an ACURATE THV (OR 0.99 [95% CI 0.79-1.21], p = 0.91). By multivariable analysis, first-degree atrioventricular block (OR 2.01 [95% CI 1.35-3.00], p < 0.01), right bundle branch block (OR 8.33 [95% CI 5.21-13.33], p < 0.01), short MS length (OR 0.89 [95% CI 0.83-0.97], p < 0.01), annulus area (OR 1.003 [95% CI 1.001-1.005], p = 0.04), NCC implantation depth (OR 1.13 [95% CI 1.07-1.19]), and use of an Evolut THV (OR 1.54 [95% CI 1.03-2.27], p = 0.04) were associated with new-PPI. MS length was an independent predictor for PPI across different THV platforms, except for the ACURATE THV. Based on our study observations within the total cohort, we identified 3 risk groups by MS length: MS length ≤3 mm defined a high risk for PPI (>20%), MS length 3-7 mm an intermediate risk (10-20%), and MS length >7 mm a low risk (<10%). Anatomy-tailored THV selection may mitigate the need for new-PPI in patients undergoing TAVR.
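The three MS-length risk groups proposed above amount to a simple classification rule. The sketch below encodes it directly; the function name is ours, and the thresholds and risk bands are taken verbatim from the abstract.

```python
# Classify 30-day new-PPI risk from membranous septum (MS) length in mm.
def ppi_risk_group(ms_length_mm: float) -> str:
    if ms_length_mm <= 3.0:
        return "high risk (>20%)"
    if ms_length_mm <= 7.0:
        return "intermediate risk (10-20%)"
    return "low risk (<10%)"

for ms in (2.2, 4.1, 8.0):
    print(f"MS length {ms} mm -> {ppi_risk_group(ms)}")
```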
{ "pmid": 35872148, "language": "eng" }
A longitudinal study of bidirectional associations between frequent pain and insomnia symptoms in adolescents. Chronic pain and insomnia symptoms are prevalent in adolescents. This study examined the prospective associations between pain and insomnia symptoms in a large sample of adolescents. A total of 7072 adolescents (mean age = 14.6 years) participated in a longitudinal study of behavior and health in Shandong, China. A baseline survey was conducted in November-December of 2015, and a follow-up survey was conducted one year later. A self-administered questionnaire was used to assess headache, stomachache, other nonspecific pain, depression, substance use, and family environment. The Youth Self-Rating Insomnia Scale was used to measure insomnia symptoms. At baseline and 1-year follow-up, frequent pain was reported by 8.4% and 7.8% of the sample, respectively; moderate to severe insomnia symptoms were reported by 15.2% and 14.8%, respectively. Logistic regression analyses showed that frequent pain at baseline was significantly associated with increased odds of incident insomnia symptoms at 1-year follow-up (odds ratio [OR] = 1.70, 95% confidence interval [CI] = 1.23-2.34) while adjusting for adolescent and family covariates. On the other hand, insomnia symptoms at baseline were significantly associated with increased odds of incident frequent pain at 1-year follow-up (OR = 2.00, 95% CI = 1.50-2.68). The 3 types of pain (ie, headache, stomachache, and other nonspecific pain) had similar associations with insomnia symptoms. The findings suggest that the associations between frequent pain and insomnia are bidirectional, independent of multiple adolescent and family covariates. These findings stress the importance of assessment and management of both pain and insomnia symptoms in adolescents in routine clinical practice and school-based intervention programs.
{ "pmid": 35872149, "language": "eng" }
Circadian, light, and sleep skills program: Efficacy of a brief educational intervention for improving sleep and psychological health at sea. Military service poses unique threats to sleep and circadian health, and the shipboard environment presents further challenges. Disrupted sleep and circadian rhythms are linked to myriad health and safety issues that compromise readiness, including negative psychological health outcomes. Thus, one advantage of mitigating sleep problems includes the possibility of also enhancing mental health. We evaluated the efficacy of the Circadian, Light, and Sleep Skills program for shipboard military personnel in improving sleep, and examined the impact of sleep on mental health in participating sailors. Questionnaires were administered to US sailors (N = 150) assigned to three ships (one control, two intervention) before the program (T1), immediately afterward (T2), and 2-4 months later, after a period at sea (T3). Outcomes included motivation to improve sleep; sleep and circadian knowledge; frequency of sleep-promoting behaviors; sleep quality (Pittsburgh Sleep Quality Index); and mental health symptoms. Satisfaction with specific program elements and perceived relevance were also examined. Sleep and circadian knowledge, frequency of sleep-promoting behaviors, and sleep quality improved from T1 to T3 in the intervention versus control group. Sleep quality also mediated the effects of the underway (at sea) period on mental health. The intervention was well received, with high satisfaction and perceived relevance ratings. A brief 30-minute intervention before an underway period improved sleep, circadian, and psychological health outcomes in shipboard sailors, even months later. Broader dissemination of this program may provide significant positive impact with minimal investment of resources.
{ "pmid": 35872150, "language": "eng" }
Beneficial effects of sleep extension on daily emotion in short-sleeping young adults: An experience sampling study. Short sleep duration has been linked to disrupted emotional experiences and poor emotion regulation. Extending sleep opportunity might therefore offer a means to improve emotion functioning. This study used experience sampling to examine the effect of sleep extension on daily emotion experiences and emotion regulation. Participants were young adults (n = 72) aged 18-24 years who reported consistently sleeping less than 7 hours per 24-hour period during the past 2 weeks. For 14 consecutive days, participants completed experience sampling questions related to sleep, emotion, and emotion regulation via a smartphone application. Procedures were identical for all participants for the first 7 days ("baseline" assessments). From days 8-14, participants were randomly assigned to either a "sleep extension" condition, in which they were instructed to increase their sleep opportunity by 90 minutes, or a "sleep as usual" condition. Duration and quality of the previous night's sleep were reported each morning, and daytime experiences of positive and negative emotion and emotion regulation were measured at pseudorandom timepoints 6 times a day. Multilevel modeling demonstrated that participants in the sleep extension condition reported significantly longer sleep times and improved sleep quality, as well as higher positive and lower negative daily emotion, compared with those in the sleep as usual condition. A brief experimental paradigm to extend sleep length has the potential to improve sleep quality and, to a minor extent, mood among young adults with short sleep.
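The multilevel modeling referenced above handles daily observations nested within participants. A minimal sketch of such a model in Python with `statsmodels` follows; the variable names, the simulated data, and the effect size are hypothetical illustrations, not study data.

```python
# Random-intercept multilevel model: daily mood nested within participants.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
rows = []
for pid in range(30):                       # 30 simulated participants
    condition = pid % 2                     # 0 = sleep as usual, 1 = extension
    baseline = rng.normal(5, 1)             # participant-level intercept
    for day in range(14):
        phase = int(day >= 7)               # 0 = baseline week, 1 = manipulation week
        mood = baseline + 0.5 * condition * phase + rng.normal(0, 1)
        rows.append({"pid": pid, "condition": condition,
                     "phase": phase, "mood": mood})
df = pd.DataFrame(rows)

model = smf.mixedlm("mood ~ condition * phase", data=df, groups=df["pid"])
print(model.fit().summary())  # the condition:phase interaction is the effect of interest
```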
{ "pmid": 35872153, "language": "eng" }
The role of neuronavigation in TMS-EEG studies: Current applications and future perspectives. Transcranial magnetic stimulation combined with electroencephalography (TMS-EEG) allows measuring non-invasively the electrical response of the human cerebral cortex to a direct perturbation. Complementing TMS-EEG with a structural neuronavigation tool (nTMS-EEG) is key for accurately selecting cortical areas, targeting them, and adjusting the stimulation parameters based on relevant anatomical priors. This step, together with the employment of visualization tools designed to perform a quality check of TMS-evoked potentials (TEPs) in real time during TMS-EEG data acquisition, is pivotal for maximizing the impact of the TMS pulse on the cortex and for ensuring highly reproducible measurements within sessions and across subjects. Moreover, storing stimulation parameters in the neuronavigation system can help in replicating the stimulation parameters within and across experimental sessions and sharing them across research centers. Finally, the systematic employment of neuronavigation in TMS-EEG studies is also critical to standardize measurements in clinical populations in the search for reliable diagnostic and prognostic TMS-EEG-based biomarkers for neurological and psychiatric disorders.
{ "pmid": 35872155, "language": "eng" }
Prognostic value of complex glandular patterns in invasive pulmonary adenocarcinomas. Prognostic stratification of patients with surgically resected invasive pulmonary adenocarcinoma must be improved. Previous studies reported that complex glandular patterns (CGPs), namely cribriform and fused gland growth patterns, are associated with an unfavorable prognosis. The goal of this study was to evaluate the prognostic value of CGPs in patients with resected stage I-IV lung adenocarcinoma. The presence of CGPs as a minor to predominant component was tested for association with overall survival (OS, n = 676) and relapse-free survival (RFS, n = 463) after surgery. CGPs were observed in 284 tumors (42.0%). Cribriform and fused gland were the predominant patterns in 35 and 37 cases, respectively. The presence of cribriform pattern was associated with worse RFS, but not OS. The fused gland pattern alone or grouped into CGPs with the cribriform pattern was not associated with OS or RFS. As a predominant pattern, cribriform was associated with worse survival compared with the 5 recognized histologic patterns. Patients with fused gland-predominant tumors had 5-year survival intermediate between that of patients with papillary- and micropapillary-predominant tumors. We conclude that cribriform-predominant, but not fused gland-predominant, is a subtype with poor prognosis similar to the solid and micropapillary subtypes. In contrast, the presence of a minor component of fused gland or CGPs (cribriform + fused gland) is not associated with survival. The cribriform pattern alone improves prognostic stratification, but this effect is attenuated when it is combined into CGPs to define a subset of acinar-predominant tumors with poor prognosis. This argues against combining cribriform and fused gland into CGPs to summarize high-grade patterns.
{ "pmid": 35872154, "language": "eng" }
A MEG compatible, interactive IR game paradigm for the study of visuomotor reach-to-target movements in young children and clinical populations: The Target-Touch Motor Task. The conventional focus on discrete finger movements (i.e., index finger flexion or button-box key presses) has been an effective method to study neuromotor control using magnetoencephalography (MEG). However, this approach is challenging for young children and not possible for some people with physical disability. We have developed a novel, interactive MEG compatible reach-to-target task to investigate neuromotor function, specifically for use with young children. We used an infrared touch-screen frame to detect responses to targets presented using custom software. The game can be played using a conventional computer monitor or during MEG recordings via projector. We termed this game the Target-Touch Motor Task (TTMT). We demonstrate that the TTMT is a feasible motor task for use with young children, including children with physical impairments. TTMT response-to-target trial counts are also comparable to those of conventional methods. Artifacts from the touch screen, while present above 100 Hz, did not affect MEG source analysis in the beta band (14-30 Hz). MEG responses during TTMT game play reveal robust cortical activity from expected areas of motor cortex, as typically observed following movements of the upper limb. The TTMT paradigm allows participation by individuals with a broad range of motor abilities on a 'reach-to-target' functional task rather than conventional tasks focusing on discrete finger movements. The TTMT is well suited for young children and successfully activates expected motor cortical areas. The TTMT opens up new opportunities for the assessment of motor function across the lifespan, including for children with physical limitations.
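Since the beta band (14-30 Hz) does the analytic work above, here is a hedged sketch of isolating that band with a zero-phase Butterworth band-pass filter; the sampling rate and the synthetic signal are assumptions for illustration only.

```python
# Band-pass filtering a synthetic signal to the beta band (14-30 Hz).
import numpy as np
from scipy.signal import butter, filtfilt

fs = 1000.0                                   # assumed sampling rate (Hz)
t = np.arange(0, 2.0, 1.0 / fs)
signal = (np.sin(2 * np.pi * 20 * t)          # beta-band component
          + np.sin(2 * np.pi * 120 * t)       # artifact-like component above 100 Hz
          + 0.5 * np.random.randn(t.size))    # broadband noise

b, a = butter(4, [14.0, 30.0], btype="bandpass", fs=fs)
beta = filtfilt(b, a, signal)                 # zero-phase filtering preserves timing
```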
{ "pmid": 35872156, "language": "eng" }
Critical aspects of microsatellite instability testing in endometrial cancer: a comparison study. The identification of mismatch repair deficient (dMMR) and microsatellite unstable (MSI) endometrial cancers (ECs) is important in the screening, diagnosis, and therapeutic stratification of patients. We compared the diagnostic performance of 4 MSI molecular tests, based on fragment length analysis by capillary electrophoresis (OncoMate™ MSI assay, Promega) or microcapillary electrophoresis (TapeStation 4200, Agilent) and on high-resolution melting (HRM) approaches (Idylla™ MSI Test, Biocartis; EasyPGX® ready MSI, Diatech Pharmacogenetics), on a series of 56 ECs that was well characterized for MMR status with an immunohistochemical approach (IHC, the nonmolecular reference test). The concordance of fluorescence capillary electrophoresis with IHC (AUC 0.98) was higher than that of the other molecular methodologies. In contrast, the HRM approaches and the microcapillary electrophoresis platform failed to detect MSI ECs showing minimal microsatellite shifts. In conclusion, in the colorectal setting several technologies are suitable for MSI testing, whereas in ECs MSI testing should be based on fluorescent capillary electrophoresis, as it identifies a higher proportion of cases that could be misdiagnosed with other strategies.
{ "pmid": 35872157, "language": "eng" }
The diagnostic and prognostic utility of incorporating DAXX, ATRX, and alternative lengthening of telomeres to the evaluation of pancreatic neuroendocrine tumors. Pancreatic neuroendocrine tumors (PanNETs) are a heterogeneous group of neoplasms with increasing incidence and an ill-defined pathobiology. Although many PanNETs are indolent and remain stable for years, a subset may behave aggressively and metastasize widely. Thus, the increasing and frequent detection of PanNETs presents a treatment dilemma. Current prognostic systems are susceptible to interpretation errors and sampling issues and do not accurately reflect the clinical behavior of these neoplasms. Hence, additional biomarkers are needed to improve the prognostic stratification of patients diagnosed with a PanNET. Recent studies have identified alterations in death domain-associated protein 6 (DAXX) and alpha-thalassemia/mental retardation X-linked (ATRX), as well as alternative lengthening of telomeres (ALT), as promising prognostic biomarkers. This review summarizes the identification, clinical utility, and specific nuances in testing for DAXX/ATRX by immunohistochemistry and for ALT by telomere-specific fluorescence in situ hybridization in PanNETs. Furthermore, a discussion on diagnostic indications for DAXX, ATRX, and ALT status is provided, including the distinction between PanNETs and pancreatic neuroendocrine carcinomas (PanNECs) and the determination of pancreatic origin for metastatic neuroendocrine tumors in the setting of an unknown primary.
{ "pmid": 35872158, "language": "eng" }
Folic acid-targeted Pluronic F127 micelles improve oxidative stress and inhibit fibrosis for increased therapeutic efficacy in AKI. Oxidative stress and activation of the fibrosis pathway are essential pathological mechanisms of acute kidney injury (AKI). In this article, we designed a drug delivery system that could effectively ameliorate oxidative stress and relieve fibrosis by combining precise targeting, solubilization, and reduced toxicity of the nano-transport system to strengthen efficacy against AKI. Folic acid (FA) was used as the targeting molecule, and curcumin (Cur) and resveratrol (Res), which are Chinese medicine monomers with anti-inflammatory and antioxidant effects, were used as model drugs. Here, the targeting nanosystem (Cur/Res@FA-F127/TPGS) co-loaded with Cur and Res was successfully synthesized. Finally, the comprehensive therapeutic effect of the nanosystem was evaluated through targeting and pharmacodynamic studies in AKI models induced by cisplatin (CDDP) in vitro and in vivo. The in vitro studies proved that the nanosystem could not only specifically target HK-2 cells and promote the effective accumulation of Cur and Res in the kidney, but also effectively ameliorate oxidative stress by eliminating reactive oxygen species (ROS), stabilizing mitochondrial membrane potential (MMP), and reducing the expression of apoptosis-related proteins. The in vivo studies showed that the nanosystem could exert antioxidant and anti-inflammatory effects and alleviate fibrosis, thereby reducing the apoptosis and necrosis of renal tubular cells. The nanosystem could coordinately repair damaged HK-2 cells by ameliorating oxidative stress and inhibiting inflammation and tissue fibrosis, which provides a new approach for the treatment of AKI.
{ "pmid": 35872160, "language": "eng" }
Behavioural reactivity testing in sheep indicates the presence of multiple temperament traits. Temperament in sheep is commonly presented as unidimensional, with a 'nervous' temperament indicative of fear and reactivity towards humans and novel environments. However, temperament is multidimensional, with some traits expressed only under certain conditions (context-specific). There is evidence that a common temperament test in sheep, the isolation-box (IB) test, measures level of activity and not fearfulness as intended, and that behaviours measured in the IB test are indicative of different traits. To investigate this, 16 behavioural responses to a human, to being startled, and to confinement (IB test) were measured in 89 lambs, twice, three months apart. Our results agree with previous studies in that vocalisations in all tests, and locomotion in two, showed high repeatability over time. A principal component analysis identified that vocalisations are domain-general and indicative of the trait 'sociability', whereas locomotion is context-specific and captures the traits 'exploration-avoidance', 'boldness-shyness', and 'general activity'. A cluster analysis identified four behavioural profiles, indicating that the trait 'boldness-shyness' captures reactivity towards humans. This suggests that the IB test, which measures 'general activity', is unsuitable for measuring reactivity towards humans in sheep, and that, when studying the impact of temperament on other factors, multiple test conditions should be used to identify temperament traits.
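For readers unfamiliar with the dimensionality-reduction step above, the following is a minimal sketch of a principal component analysis over a lambs-by-measures matrix; the matrix here is random placeholder data, and the measure layout is an assumption, not the study's dataset.

```python
# PCA over standardized behavioural measures (rows = lambs, cols = measures).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

X = np.random.default_rng(1).normal(size=(89, 6))  # placeholder behavioural data

pca = PCA(n_components=3)
scores = pca.fit_transform(StandardScaler().fit_transform(X))
print(pca.explained_variance_ratio_)
# Loadings in pca.components_ show which measures (e.g., vocalisations vs
# locomotion per test) load together, separating domain-general from
# context-specific traits.
```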
{ "pmid": 35872161, "language": "eng" }
Does a high social status confer greater levels of trust from groupmates? An experimental study of leadership in domestic horses. In collective movements, specific individuals may emerge as leaders. In this study on the domestic horse (Equus ferus caballus), we conducted experiments to establish whether an individual is successfully followed because of its social status (including hierarchical rank and centrality). We first informed one horse about a hidden food location and recorded how many groupmates followed it when it went back to this location. In this context, all horses led their groupmates successfully. In a second step, we tested whether group members would trust some leaders more than others by removing the food before the informed individual led the group back to the food location. In addition, two control initiators with intermediate social status, for which the food was not removed, were tested. The results, confirmed by simulations, demonstrated that the proportion of followers was greater for the unreliable initiator with the highest social status than for the unreliable initiator with the lowest social status. Our results suggest a relationship between high social status and a leadership role. Indeed, the status of a leader sometimes prevails to the detriment of the accuracy of the information, because an elevated social status apparently confers a high level of trust.
{ "pmid": 35872162, "language": "eng" }
Investigating the nature of depressive experiences in adults who self-medicate low mood with alcohol. This study sought to explore whether individuals who self-medicate with alcohol experience higher levels of depression, and whether symptom-level experiences are affected by the behavior of self-medication. Data were from Wave I (2001-2002) of the National Epidemiologic Survey on Alcohol and Related Conditions (NESARC). Only participants who answered affirmatively to one or both of the two stem questions highlighting the key symptoms of depression were included (n = 13,753). A one-factor model of depression was supported. Experiences of suicidality were more likely to be endorsed by people who self-medicated, compared with those with low mood who did not use alcohol in this way. More typical experiences of depression, such as appetite difficulties, were less likely to be reported by those who self-medicated than by those who did not. The findings aid understanding of the drinking patterns and other mental health correlates of those who engage in the behavior of self-medication. Findings indicate that those who self-medicate are at a higher risk for suicidality, given the same level of depression. These findings highlight the importance of identifying these potentially problematic health behaviors as early as possible, due to these risks.
{ "pmid": 35872163, "language": "eng" }
Secondary metabolites of the genus Nigrospora from terrestrial and marine habitats: Chemical diversity and biological activity. Secondary metabolites produced by the ascomycetes have attracted wide attention from researchers. Their diverse chemical structures and rich biological activities are essential in medicine, food, and agriculture. The monophyletic Nigrospora genus belongs to the Apiosporaceae family and is a rich source of novel and diverse bioactive metabolites. It occurs as a common plant pathogen, endophyte, and saprobe distributed in many ecosystems worldwide. Researchers have focused on discovering new species and secondary metabolites in the past ten years, and the host diseases caused by Nigrospora species have also been investigated. This review covers 50 references from Web of Science, CNKI, Google Scholar, and PubMed related to the secondary metabolites of Nigrospora. In total, 231 compounds isolated from five known species and 21 unidentified species of Nigrospora from January 1991 to June 2022 are summarized. Their structures belong to polyketides, terpenoids, steroids, N-containing compounds, and fatty acids. Of these, 77 metabolites exhibited various biological activities, including cytotoxic, antifungal, antibacterial, antiviral, antioxidant, anti-inflammatory, antileukemic, antimalarial, phytotoxic, and enzyme-inhibitory activities. Notably, this review presents a comprehensive literature survey focusing on the chemistry and bioactivity of secondary metabolites from Nigrospora.
{ "pmid": 35872164, "language": "eng" }
A smartphone-based portable fundus camera for retinal photography in infants with suspected nonaccidental trauma. We describe a novel, do-it-yourself smartphone-based fundus camera to help with documentation of retinal hemorrhages in infant patients with suspected nonaccidental trauma. This device can be easily assembled from commercially available and inexpensive materials. We discuss the advantages and limitations of our described fundus camera and provide representative images.
{ "pmid": 35872165, "language": "eng" }
Unilateral cataract and congenital stationary night blindness in a child with novel variants in TRPM1. Unilateral cataract can cause pediatric vision impairment. Although the majority of unilateral cataracts are idiopathic in nature, genetic causes have been reported. We present the case of a 4-week-old child of nonconsanguineous parents who was affected with unilateral cataract. Whole-genome sequencing using DNA extracted from blood and the lens epithelial cells following cataract surgery revealed two presumed pathogenic variants in the TRPM1 gene, the founding member of the melanoma-related transient receptor potential (TRPM) subfamily. TRPM1 is responsible for regulating cation influx to hyperpolarized retinal ON bipolar cells, and mutations in this gene are a major cause of autosomal recessive congenital stationary night blindness (CSNB). Electroretinography revealed findings consistent with CSNB, a phenotype that was not initially suspected, and which would likely have been missed without genome sequencing. It remains unclear whether the TRPM1 variants are associated with the cataract phenotype.
{ "pmid": 35872166, "language": "eng" }
Dissecting the clinicopathologic, genomic, and immunophenotypic correlates of KRASG12D-mutated non-small-cell lung cancer. Allele-specific KRAS inhibitors are an emerging class of cancer therapies. KRAS-mutant (KRASMUT) non-small-cell lung cancers (NSCLCs) exhibit heterogeneous outcomes, driven by differences in underlying biology shaped by co-mutations. In contrast to KRASG12C NSCLC, KRASG12D NSCLC is associated with low/never-smoking status and is largely uncharacterized. Clinicopathologic and genomic information was collected from patients with NSCLCs harboring a KRAS mutation at the Dana-Farber Cancer Institute (DFCI), Memorial Sloan Kettering Cancer Center, MD Anderson Cancer Center, and Imperial College London. Multiplexed immunofluorescence for CK7, programmed cell death protein 1 (PD-1), programmed death-ligand 1 (PD-L1), Foxp3, and CD8 was carried out on a subset of samples with available tissue at the DFCI. Clinical outcomes to PD-(L)1 inhibition ± chemotherapy were analyzed according to KRAS mutation subtype. Of 2327 patients with KRASMUT NSCLC, 15% (n = 354) harbored KRASG12D. Compared to KRASnon-G12D NSCLC, KRASG12D NSCLC had a lower pack-year (py) smoking history (median 22.5 py versus 30.0 py, P < 0.0001) and was enriched in never smokers (22% versus 5%, P < 0.0001). KRASG12D had a lower PD-L1 tumor proportion score (TPS) (median 1% versus 5%, P < 0.01) and a lower tumor mutation burden (TMB) compared to KRASnon-G12D (median 8.4 versus 9.9 mt/Mb, P < 0.0001). Of the samples which underwent multiplexed immunofluorescence, KRASG12D had lower intratumoral and total CD8+PD1+ T cells (P < 0.05). Among 850 patients with advanced KRASMUT NSCLC who received PD-(L)1-based therapies, KRASG12D was associated with a worse objective response rate (ORR) (15.8% versus 28.4%, P = 0.03), progression-free survival (PFS) [hazard ratio (HR) 1.51, 95% confidence interval (CI) 1.45-2.00, P = 0.003], and overall survival (OS; HR 1.45, 1.05-1.99, P = 0.02) to PD-(L)1 inhibition alone but not to chemo-immunotherapy combinations [ORR 30.6% versus 35.7%, P = 0.51; PFS HR 1.28 (95% CI 0.92-1.77), P = 0.13; OS HR 1.36 (95% CI 0.95-1.96), P = 0.09] compared to KRASnon-G12D. KRASG12D lung cancers harbor distinct clinical, genomic, and immunologic features compared to other KRAS-mutated lung cancers and worse outcomes to PD-(L)1 blockade. Drug development for KRASG12D lung cancers will have to take these differences into account.
{ "pmid": 35872168, "language": "eng" }
Does the osteoarthritic shoulder have altered rotator cuff vectors with increasing glenoid deformity? An in silico analysis. A transverse force couple (TFC) functional imbalance has been demonstrated in osteoarthritic shoulders by recent 3-dimensional (3D) muscle volumetric studies. Altered rotator cuff vectors may be an additional factor contributing to a muscle imbalance and the propagation of glenoid deformity. Computed tomography images of 33 Walch type A and 60 Walch type B shoulders were evaluated. The 3D volumes of the entire subscapularis, supraspinatus, and infraspinatus-teres minor (ISP-Tm) and the scapula were manually segmented. The volume masks and scapular landmarks were imported into MATLAB to create a coordinate system, enabling calculation of muscle force vectors. The direction of each muscle force vector was described in the transverse and vertical planes, calculated with respect to the glenoid. Each muscle vector was then resolved into compression and shear force across the glenoid face. The relationship between muscle force vectors, glenoid retroversion or inclination, compression/shear forces on the glenoid, and Walch type was determined using linear regression. In the transverse plane with all rotator cuff muscles combined, increasing retroversion was significantly associated with increasing posterior drag (P < .001). Type B glenoids had significantly more posterior drag than type A (P < .001). In the vertical plane, for each individual muscle group and in combination, superior drag increased as superior inclination increased (P < .001). Analysis of individual muscle groups showed that the anterior thrust of the ISP-Tm and supraspinatus switched to a posterior drag at 8° and 10° of retroversion, respectively. The compression force on the glenoid face by the ISP-Tm and supraspinatus did not change with increasing retroversion for type A shoulders (P = .592 and P = .715, respectively), but it did for type B shoulders (P < .001 for both). The glenoid shear force ratio in the transverse plane for the ISP-Tm and supraspinatus moved from anterior to posterior shear with increasing glenoid retroversion, crossing zero at 8° and 10° of retroversion, whereas the subscapularis exerted a posterior shear force at every retroversion angle. Increased glenoid retroversion is associated with increased posterior shear and decreased compression forces on the glenoid face, explaining some of the pathognomonic bone morphometrics that characterize the osteoarthritic shoulder. Although the subscapularis always maintains a posterior shear force, the ISP-Tm and supraspinatus together showed an inflection at 8° and 10° of retroversion, changing from an anterior thrust to a posterior drag. This finding suggests that, in anatomic total shoulder arthroplasty (TSA), rotator cuff functional balance might be better restored by correcting glenoid retroversion to less than 8°.
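The resolution of a muscle force vector into compression and shear across the glenoid face, as described above, is a standard vector projection. The sketch below illustrates it; the glenoid normal and the force vector are hypothetical example values, not study measurements.

```python
# Resolve a muscle force vector into compression and shear on the glenoid face.
import numpy as np

glenoid_normal = np.array([1.0, 0.0, 0.0])   # unit normal to the glenoid face
muscle_force = np.array([40.0, -12.0, 5.0])  # example resultant muscle force (N)

compression = np.dot(muscle_force, glenoid_normal)       # component into the face
shear_vec = muscle_force - compression * glenoid_normal  # in-plane component
shear = np.linalg.norm(shear_vec)

print(f"compression = {compression:.1f} N, shear = {shear:.1f} N, "
      f"shear/compression ratio = {shear / compression:.2f}")
```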
{ "pmid": 35872167, "language": "eng" }
Three weeks of indomethacin is not superior to 1 week of meloxicam as prophylaxis for heterotopic ossifications after distal biceps tendon repair with a single-incision technique. The aim of this study was to assess the efficacy of 3 weeks of indomethacin, a nonselective nonsteroidal anti-inflammatory drug, in comparison to 1 week of meloxicam as prophylaxis for heterotopic ossifications (HOs) after distal biceps tendon repair. A single-center retrospective study was performed on 78 patients undergoing distal biceps tendon repair between 2008 and 2019. From 2008 to 2016, patients received meloxicam 15 mg daily for a period of 1 week as usual care. From 2016 onward, the standard protocol was changed to indomethacin 25 mg 3 times daily for 3 weeks. All patients underwent a single-incision repair with a cortical button technique. The postoperative rehabilitation protocol was similar for all patients. The postoperative radiographs at 8-week follow-up were assessed blindly by 7 independent assessors. If HOs were present, they were classified according to the Ilahi-Gabel classification for size and according to the Gärtner-Heyer classification for density. Statistical analysis was performed to analyze the difference in HO between patients treated with indomethacin and those treated with meloxicam. Seventy-eight patients, with a mean age of 48.8 years (range 30-72), were included. The mean follow-up after surgery was 12 months (range 2-45). Indomethacin (21 days, 25 mg 3 times per day) was prescribed to 26 (33%) patients. The 52 other patients (67%) were prescribed meloxicam 15 mg daily for 7 days. HOs were seen in 19 patients 8 weeks postoperatively. Five of 26 patients treated with indomethacin developed HO, and 14 of 52 patients treated with meloxicam developed HO (P = .5). Two patients had symptomatic HO with minor restrictions in movement; neither patient was treated with indomethacin. Significantly more HOs were seen in patients with a longer time from injury to surgery (P = .01). The intraclass correlation coefficient for reliability between assessors for HO scoring on postoperative radiographs was good to excellent for both classifications. In this study, HOs were seen in 24% of postoperative radiographs. Three weeks of indomethacin was not superior to 1 week of meloxicam for the prevention of HO after single-incision distal biceps tendon repair.
{ "pmid": 35872169, "language": "eng" }
The reliability of revision rates following primary shoulder arthroplasty as a quality indicator to rank hospital performance: a national registry analysis including 13,104 shoulders and 87 hospitals. To assess the extent of between-hospital variation in revision following primary shoulder arthroplasty (SA), both overall and for specific revision indications, to guide quality improvement initiatives, and to assess whether revision rates are suitable as quality indicators to reliably rank hospital performance. All primary SAs performed between 2014 and 2018 were included from the Dutch Arthroplasty Register to examine 1-year revision, and all primary SAs performed between 2014 and 2016 were used to examine 1- and 3-year revisions. For each hospital, the observed number (O) of revisions was compared with that expected (E) based on case-mix and depicted in funnel plots with 95% control limits to identify outlier hospitals. The rankability (ie, the reliability of ranking hospitals) was calculated as the percentage of total hospital variation due to true between-hospital differences rather than chance and categorized as low (<50%), moderate (50%-75%), and high (>75%). A total of 13,104 primary SAs (87 hospitals) performed in 2014-2018 were included, of which 7213 were performed between 2014 and 2016. Considerable between-hospital variation was found in 1-year revision in 2014-2016 (median 1.6%, interquartile range 0.0%-3.1%), identifying 3 outlier hospitals having overall significantly more revisions than expected (O/E range 1.9-2.3) and for specific indications (cuff pathology and infection). Results for 2014-2018 were similar. For 3-year revision, 3 outlier hospitals were identified (O/E range 1.7-3.3). Rankabilities for all outcomes were low. Considerable between-hospital variation was observed for 1- and 3-year revision rates following primary SA, where outlier hospitals could be identified based on large differences in revision for specific indications to direct quality improvement initiatives. However, rankabilities were low, meaning that much of the other (smaller) variation in performance could not be detected, rendering revision rates unsuitable for ranking hospital performance following primary SA.
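The observed-versus-expected comparison above is typically drawn as a funnel plot. The sketch below computes O/E ratios with approximate 95% control limits from a normal approximation to the Poisson distribution; the hospital counts are invented, and the exact limits used by the registry study may differ.

```python
# O/E ratios per hospital with approximate 95% funnel-plot control limits.
import numpy as np

observed = np.array([3, 8, 1, 12, 5])           # invented revision counts
expected = np.array([4.1, 4.9, 2.0, 5.2, 4.6])  # invented case-mix expectations

oe = observed / expected
upper = 1 + 1.96 / np.sqrt(expected)            # Var(O/E) ~ 1/E under Poisson
lower = np.clip(1 - 1.96 / np.sqrt(expected), 0, None)

for h, (ratio, lo, hi) in enumerate(zip(oe, lower, upper)):
    flag = "outlier" if (ratio > hi or ratio < lo) else "in control"
    print(f"hospital {h}: O/E = {ratio:.2f} ({flag})")
```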
{ "pmid": 35872170, "language": "eng" }
Arthroscopic lateral collateral ligament imbrication of the elbow: short-term clinical results. Chronic posterolateral rotatory instability (PLRI) of the elbow results from an insufficient lateral collateral ligament (LCL) complex. Arthroscopic LCL imbrication may prove a minimally invasive alternative to open lateral ulnar collateral ligament (LUCL) reconstruction with a quicker rehabilitation. The purpose of this study was to analyze the validity of a modified arthroscopic imbrication technique. We hypothesized that arthroscopic LUCL imbrication would yield stable elbows in patients with grade 1 or 2 chronic PLRI at a minimum of 2 years of follow-up. We retrospectively assessed the data of all PLRI patients who underwent arthroscopic LUCL imbrication from 2010 to 2013 (n = 20). Grade 3 PLRIs (frank ulnohumeral dislocations) were excluded from this treatment. After confirmation of PLRI during standard elbow arthroscopy, a doubled absorbable suture is shuttled through as much LCL tissue as possible (from the lateral ulnar border to the area proximal to the lateral epicondyle) and the sutures are tied. This results in a plication of the entire LCL complex. Objective elbow stability was assessed using a combination of the pivot shift, table top, and posterior drawer tests. Of the 20 included patients, 18 were stable subjectively and objectively at a minimum of 2 years of follow-up. The mean Mayo Elbow Performance Score improved from 48 preoperatively to 88.9 at final follow-up (P < .001). The mean Quick-Disabilities of the Arm, Shoulder, and Hand score improved from 53 preoperatively to 10.3 at final follow-up (P < .001). One patient developed elbow stiffness. Two patients reported tenderness of the subcutaneous PDS knots. As a less invasive alternative to open LCL reconstruction using a graft, arthroscopic LCL imbrication has demonstrated acceptable rates of perceived elbow stability among patients with grade 1 or 2 PLRI.
{ "pmid": 35872171, "language": "eng" }
Does glenohumeral offset affect clinical outcomes in a lateralized reverse total shoulder arthroplasty? Reverse total shoulder arthroplasty (rTSA) exhibits high rates of success and low complication rates. rTSA has undergone numerous design adaptations over recent years, and lateralization of implant components provides theoretical and biomechanical benefits in stability and range of motion (ROM) as well as decreased rates of notching. However, the magnitude of implant lateralization and its effect on these outcomes is less well understood. The purpose of this study was to evaluate how increasing glenohumeral offset affects outcomes after rTSA, specifically in a lateralized humerus + medialized glenoid implant model. Primary rTSAs using a lateralized humeral + medialized glenoid implant model performed at a single academic institution between 2012 and 2018 were retrospectively reviewed. Patient-reported outcome (PRO) parameters and clinical outcomes, including ROM, were evaluated both pre- and postoperatively. Pre- and postoperative radiographs were analyzed for measurement of glenohumeral offset, defined as the acromial-tuberosity offset (ATO) distance on the anteroposterior radiograph. A total of 130 rTSAs were included in the analysis, with a mean follow-up of 35 months. The mean postoperative absolute ATO was 16 mm, and the mean delta ATO (the difference from pre- to postoperative values) was 4.6 mm of further lateralization. Among all study patients, improvements in all ROM parameters and all PROs were observed from pre- to postoperative assessments. When assessing for the effects of lateralization on these outcomes, multivariate analysis failed to reveal a significant effect of the absolute postoperative ATO or the delta ATO on any outcome parameter. rTSA using a lateralized humeral + medialized glenoid implant model exhibits excellent clinical outcomes in ROM and PROs. However, the magnitude of lateralization as measured radiographically by the ATO did not significantly affect these outcomes; patients exhibited universally good outcomes irrespective of the degree of offset restoration.
{ "pmid": 35872172, "language": "eng" }
Low rates of serious complications after open Latarjet procedure at short-term follow-up. To report on intraoperative and short-term postoperative adverse events after the open Latarjet procedure in patients with recurrent anterior shoulder instability, with complications classified into different grades of severity based on the treatment required, and to assess the effect of the learning curve of the procedure. Ninety-six patients (102 shoulders) underwent an open Latarjet procedure for recurrent post-traumatic anterior glenohumeral instability between 2012 and 2020. The minimum duration of patients' follow-up was 6 months. Adverse events were classified into 3 classes based on severity and subsequent treatment. The complications in the first 50% of all cases were compared with those in the latter 50% to evaluate the role of the learning curve on complication rates. The mean follow-up was 7.2 ± 2.8 months. The patients' mean age was 26.7 ± 8.9 years, and the cohort consisted of 83 (86.4%) male and 13 (13.6%) female patients. The total adverse event rate was 18.6%. Adverse events requiring no additional treatment (class 1) occurred in 6 cases (5.8%), including fibrous union (3.9%) and asymptomatic resorption of the graft (1.9%). Adverse events requiring additional or extended nonoperative management (class 2) occurred in 8 cases (7.8%), including coracoid fracture (2.9%), musculocutaneous nerve palsy (1.9%), axillary nerve palsy (0.9%), suprascapular nerve palsy (0.9%), and stiffness (0.9%). All the nerve palsies recovered without long-term sequelae. Adverse events requiring secondary operative procedures (class 3) occurred in 5 cases (4.9%), including symptomatic hardware (1.9%), medial healing of the graft (0.9%), screw loosening (0.9%), and deep infection (0.9%). The rate of adverse events was higher in revision cases than in primary cases (11.7% vs 6.8%; P = .119). The complication rate was significantly higher in the first half of the surgeons' practice (14.7%) than in the second half (3.9%) (P ≤ .05). The overall complication rate reported in this open Latarjet series is 18.6%; however, the rate of class 3 adverse events that required additional surgery or long-term medical treatment was only 4.9%. Revision cases had a higher rate of complications than primary cases, and the learning curve had a significant impact on the rate of adverse events.
{ "pmid": 35872173, "language": "eng" }
Concurrent external validation of bloodstream infection probability models. Accurately estimating the likelihood of bloodstream infection (BSI) can help clinicians make diagnostic and therapeutic decisions. Many multivariate models predicting BSI probability have been published. This study measured the performance of BSI probability models within the same patient sample. We retrieved validated BSI probability models, included in a recently published systematic review, that return a patient-level BSI probability for adults. Model applicability, discrimination, and accuracy were measured in a simple random sample of 4485 admitted adults having blood cultures ordered in the emergency department or the initial 48 hours of hospitalization. Ten models were included (publication years 1991-2015). Common methodological threats to model performance included overfitting and categorization of continuous variables. Restrictive inclusion criteria caused seven models to apply to <15% of validation patients. Model discrimination was lower than originally reported in the derivation groups (median c-statistic 60%, range 48-69%). The observed BSI risk frequently deviated from the expected risk (median integrated calibration index 4.0%, range 0.8-12.4%). Notable disagreement in expected BSI probabilities was seen between models (median relative difference between expected risks 68.0%; 25th-75th percentile, 28.6-113.6%). In a large, randomly selected external validation population, many published BSI probability models had restricted applicability, limited discrimination and calibration, and extensive inter-model disagreement. Direct comparison of model performance is hampered by dissimilarities between model-specific validation groups.
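As a rough illustration of the two headline validation metrics, the sketch below computes a c-statistic and an integrated calibration index (ICI) on synthetic data; the variable names, the toy sample, and the lowess span are assumptions, not the study's code.

```python
# Sketch: external-validation metrics for a probability model.
# y_true (0/1 outcomes) and p_hat (predicted BSI probabilities) are
# synthetic stand-ins for a validation sample of 4485 patients.
import numpy as np
from sklearn.metrics import roc_auc_score
from statsmodels.nonparametric.smoothers_lowess import lowess

rng = np.random.default_rng(0)
p_hat = rng.uniform(0.01, 0.6, 4485)       # toy predicted risks
y_true = rng.binomial(1, p_hat * 0.8)      # toy, deliberately miscalibrated

# Discrimination: c-statistic (area under the ROC curve).
c_stat = roc_auc_score(y_true, p_hat)

# Calibration: ICI, the mean absolute difference between predicted risk
# and a loess-smoothed observed risk (Austin-Steyerberg style).
smoothed = lowess(y_true, p_hat, frac=0.75, return_sorted=False)
ici = np.mean(np.abs(smoothed - p_hat))

print(f"c-statistic: {c_stat:.2f}, ICI: {ici:.3f}")
```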
{ "pmid": 35872174, "language": "eng" }
Adults with symptoms of pneumonia: a prospective comparison of patients with and without infiltrates on chest radiography. Most studies on patients hospitalized with community-acquired pneumonia (CAP) require confirmation of an infiltrate by chest radiography, but in practice admissions are common among patients with symptoms of pneumonia without an infiltrate (SPWI). The aim of this research was to compare clinical characteristics, microbial etiology, and outcomes among patients with CAP and SPWI. Adults suspected of CAP were prospectively recruited at Landspitali University Hospital over a 1-year period, 2018 to 2019. The study was population based. Those admitted with two or more of the following symptoms were invited to participate: temperature ≥38°C or ≤36°C, sweating, shaking/chills, chest pain, a new cough, or new onset of dyspnea. Primary outcomes were mortality at 30 days and 1 year. Six hundred twenty-five cases were included, 409 with CAP and 216 with SPWI; median age was 75 (interquartile range [IQR] 64-84) and 315 (50.4%) were female. Patients with CAP were more likely to have fever ≥38.0°C (66.9% [273/408] vs. 49.3% [106/215], p < 0.001), a higher CRP (median 103 [IQR 34-205] vs. 55 [IQR 17-103], p < 0.001), identification of Streptococcus pneumoniae (18.0% [64/355] vs. 6.3% [10/159] of tested, p = 0.002), and to receive antibacterial treatment (99.5% [407/409] vs. 87.5% [189/216], p < 0.001), but less likely to have a respiratory virus detected (25.4% [33/130] vs. 51.2% [43/84] of tested, p < 0.001). The adjusted odds ratios for 30-day and 1-year mortality of SPWI compared to CAP were 0.86 (95% CI 0.40-1.86) and 1.46 (95% CI 0.92-2.32), respectively. SPWI is a common cause of hospitalization, and despite having fever less frequently, lower inflammatory markers, and a lower detection rate of pneumococci than patients with CAP, mortality is not significantly different.
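The adjusted odds ratios reported here come from covariate-adjusted logistic regression; a minimal sketch of that kind of model is shown below, with invented column names (died_30d, spwi, age, female) and synthetic data standing in for the cohort.

```python
# Sketch: adjusted odds ratio for 30-day mortality, SPWI vs. CAP.
# Columns and covariates are illustrative, not the study's exact model.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
df = pd.DataFrame({
    "died_30d": rng.binomial(1, 0.08, 625),
    "spwi": rng.binomial(1, 216 / 625, 625),  # 1 = SPWI, 0 = CAP
    "age": rng.normal(75, 12, 625),
    "female": rng.binomial(1, 0.5, 625),
})

fit = smf.logit("died_30d ~ spwi + age + female", data=df).fit(disp=0)
or_ci = np.exp(fit.conf_int().loc["spwi"])  # CI on the odds-ratio scale
print(f"adjusted OR (SPWI vs. CAP): {np.exp(fit.params['spwi']):.2f} "
      f"(95% CI {or_ci[0]:.2f}-{or_ci[1]:.2f})")
```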
{ "pmid": 35872175, "language": "eng" }
Spatial-topographic nestedness of interoceptive regions within the networks of decision making and emotion regulation: Combining ALE meta-analysis and MACM analysis. Prominent theories propose that interoception modulates our behavioral and emotional responses involving decision-making and emotion regulation. Are the regions implicated in interoception also spatially related to, and possibly nested within, the networks of decision making and emotion regulation? To address this question, we performed three meta-analyses of functional magnetic resonance imaging studies to identify the regions that are commonly activated by the three domains using activation likelihood estimation (ALE). Additionally, we assessed the coactivation pattern of the identified common regions using meta-analytic connectivity modeling (MACM). The results showed major overlaps of interoception with both decision making and emotion regulation, specifically in the right dorsal anterior insula. The pairwise contrast analyses confirmed this finding and revealed conjunction-based activity for decision making and emotion regulation in the dorsal anterior cingulate cortex (dACC). MACM based on the identified insula revealed a widespread convergent coactivation pattern with the left anterior insula, dACC, and bilateral thalamus which, together, constitute the salience network. Among these co-activated regions, the bilateral insula and the dACC were shared among all three domains. These results suggest that the regions mediating interoception, including intero-exteroceptive integration and salience attribution, are contained and thus spatially nested within the more extensive networks recruited during decision making and emotion regulation.
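A hedged sketch of the overlap step described above (identifying voxels common to all three domains) follows; the arrays are random stand-ins for real ALE maps and the threshold is a placeholder, so this illustrates only the minimum-statistic conjunction, not the ALE algorithm itself.

```python
# Sketch: minimum-statistic conjunction across three thresholded
# meta-analytic maps (interoception, decision making, emotion regulation).
# Toy arrays stand in for real ALE maps; values and threshold are placeholders.
import numpy as np

rng = np.random.default_rng(2)
shape = (91, 109, 91)  # an MNI 2 mm grid, for illustration only
ale_intero, ale_dm, ale_er = (rng.random(shape) for _ in range(3))

thresh = 0.95  # placeholder significance threshold
conjunction = np.minimum.reduce([ale_intero, ale_dm, ale_er]) > thresh
print(f"voxels common to all three domains: {conjunction.sum()}")
```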
{ "pmid": 35872179, "language": "eng" }
miR-429 negatively regulates the progression of hypoxia-induced retinal neovascularization by the HPSE-VEGF pathway. Heparanase (HPSE) and vascular endothelial growth factor (VEGF) are believed to play a vital role in hypoxia-induced retinal neovascularization (RNV). HPSE is a target gene of miR-429. Our study aimed to investigate the effect of the miR-429-HPSE-VEGF pathway on hypoxia-induced RNV. The gene and protein expression of miR-429, HPSE and VEGF in human retinal endothelial cells (HRECs) and retinas was determined by real-time PCR and Western blot assays. The effects of miR-429 on HRECs and retinal neovascularization under hypoxic conditions were verified by in vitro and in vivo experiments. First, we studied the effect of the miR-429-HPSE-VEGF pathway in HRECs under hypoxic conditions. HREC functions such as migration and tube formation were enhanced under hypoxic conditions, and overexpression of miR-429 in HRECs reversed these changes. Then, we investigated the effect of miR-429 on hypoxia-induced RNV in vivo. When miR-429 agomirs were injected into the vitreous cavity of mice with oxygen-induced retinopathy to overexpress miR-429, the mRNA and protein expression of VEGF was significantly reduced. In addition, indicators of retinal neovascularization, such as the retinal avascular area and abnormal vessel morphology, were significantly reduced in the miR-429 overexpression group. In this study, our data showed that miR-429 plays an important role by inhibiting the HPSE-VEGF pathway in hypoxia-induced retinopathy.
{ "pmid": 35872180, "language": "eng" }
Hydroxypropyl methylcellulose acetate succinate as an exceptional polymer for amorphous solid dispersion formulations: A review from bench to clinic. Amorphous solid dispersions (ASDs) are a proven system for achieving a supersaturated state of a drug, in which the concentration of the drug is greater than its crystalline solubility. The use of hydroxypropyl methylcellulose acetate succinate (HPMCAS) in the development of ASDs has grown significantly, as evidenced by the fact that the majority of commercially approved ASD formulations are based on HPMCAS. HPMCAS has been widely utilized as a solubility enhancer and precipitation inhibitor or stabilizer to achieve supersaturation and inhibit crystallization of drugs in the gastrointestinal tract. The characteristics of HPMCAS ASDs, such as low hygroscopicity, strong drug-polymer hydrophobic interactions, high solubilization efficiency, and a greater potential to generate and maintain drug supersaturation and to inhibit crystallization, allow them to outperform other polymeric carriers in ASD development. Furthermore, combining HPMCAS with other polymers or surfactants as ternary ASDs could be a viable approach for enhancing oral absorption of poorly soluble drugs. This review discusses the concepts of supersaturation maintenance or precipitation inhibition by HPMCAS in ASD formulations. In addition, the mechanisms underlying the improved dissolution performance, oral bioavailability and stability of HPMCAS ASDs are explored.
{ "pmid": 35872181, "language": "eng" }
Challenges for the application of EGFR-targeting peptide GE11 in tumor diagnosis and treatment. Abnormal regulation of cell signaling pathways governing cell survival, proliferation and migration contributes to the development of malignant tumors. Among the implicated receptors, epidermal growth factor receptor (EGFR) is one of the most important biomarkers in many types of malignant solid tumors. Its over-expression and mutation status can serve as biomarkers to identify patients who can benefit from EGFR tyrosine kinase inhibitors and anti-EGFR monoclonal antibody (mAb) therapy. For decades, research on EGFR-targeting ligands has been actively carried out to identify potent candidates for cancer therapy. An ideal EGFR ligand can competitively inhibit the binding of endogenous growth factors, such as epidermal growth factor (EGF) and transforming growth factor-α (TGF-α), to EGFR, thus blocking the EGFR signaling pathway and downregulating EGFR expression. Alternatively, conjugation of EGFR ligands to drug delivery systems (DDS) can facilitate targeted delivery of therapeutics or diagnostic agents to EGFR-overexpressing tumors via EGFR-mediated endocytosis. The GE11 peptide is one of the potent EGFR ligands screened from a phage display peptide library. It is a dodecapeptide that specifically binds to EGFR with high affinity and selectivity. GE11 has been widely used in diagnosis and in the targeted delivery of drugs for radiotherapy, gene therapy and chemotherapy against EGFR-positive tumors. In this review, the critical factors affecting the in vivo and in vitro targeting performance of the GE11 peptide, including ligand-receptor intermolecular forces, linker bond properties and the physicochemical properties of carrier materials, are interpreted in detail. This review provides a valuable vision for the rational design and optimization of GE11-based active targeting strategies for cancer treatment and will promote translational studies of GE11 from lab research to clinical application.
{ "pmid": 35872183, "language": "eng" }
The regulation of circadian rhythm by insulin signaling in Drosophila. Circadian rhythm is well conserved across species and relates to numerous biological functions. Circadian misalignment impairs metabolic function. Insulin signaling is a key modulator of metabolism in the fruit fly as well as in mammals, and its defects cause metabolic disease. Daily diet timing affects both circadian rhythmicity of behavior and metabolism. However, the relationship between the circadian clock and insulin signaling is still elusive. Here, we report that insulin signaling regulates circadian rhythm in Drosophila melanogaster. We found that the insulin receptor substrate mutant, chico1, showed a shorter free-running circadian period. The knockdown of the insulin receptor (InR) or of another signaling molecule downstream of InR, dp110, or the expression of a dominant-negative form of InR, shortened the circadian period and diminished its amplitude. The impairment of insulin signaling both in all neurons and in restricted circadian clock neurons altered circadian period length, indicating that insulin signaling plays a role in the regulation of circadian rhythm in clock cells. Among the 3 insulin-like ligands expressed in the brain, dilp5 showed the largest effect on the circadian phenotype when deleted. These results suggest that insulin signaling contributes to the robustness of the circadian oscillation and coordinates metabolism and behavior.
{ "pmid": 35872182, "language": "eng" }
Management of difficult hepatic artery reconstructions to reduce complications through continual technical refinements in living donor liver transplantations. Hepatic artery reconstruction (HAR) for liver transplantation is crucial for successful outcomes. We evaluated transplantation outcome improvement through continual technical refinements. HAR was performed in 1448 living donor liver transplants by a single plastic surgeon from 2008 to 2020. Difficult HARs were defined as graft or recipient hepatic artery ≤2 mm, size discrepancy (≥2 to 1), multiple hepatic arteries, suboptimal quality, intimal dissection of the graft or recipient hepatic artery (HA), and immediate redo during transplantation. Technical refinements included early vessel injury recognition, precise HA dissection, the use of clips to ligate branches, an oblique cut for all HARs, a modified funneling method for size discrepancy, liberal use of an alternative artery to replace a pathologic HA, and reconstruction of a second HA for grafts with dual hepatic arteries. Difficult HARs comprised small HA (21.35%), size discrepancy (12.57%), multiple hepatic arteries (11.28%), suboptimal quality (31.1%), intimal dissection (20.5%), and immediate redo (5.18%). The overall hepatic artery thrombosis (HAT) rate was 3.04% in this series. The average HAT rate during the last 4 years (2017-2020) was 1.46% (6/408), significantly lower than the average HAT rate from 2008 to 2016 (3.8%, 39/1040; p = 0.025). Treatment for posttransplant HAT included anastomosis after trim-back (9), reconstruction using alternatives (19), and nonsurgical treatment with urokinase (9). Careful examination of the HA under the surgical microscope and selection of the appropriate recipient HA are key to successful reconstruction. Through continual technical refinements, HA complications can be reduced to the lowest degree.
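The era comparison of HAT rates can be recomputed directly from the counts given in the abstract; a minimal sketch (using scipy, without continuity correction) is below.

```python
# Sketch: testing the drop in hepatic artery thrombosis rates between
# eras using the counts reported in the abstract (6/408 vs. 39/1040).
from scipy.stats import chi2_contingency

table = [[6, 408 - 6],      # 2017-2020: HAT vs. no HAT
         [39, 1040 - 39]]   # 2008-2016: HAT vs. no HAT
# correction=False roughly reproduces the abstract's p = 0.025
chi2, p, dof, _ = chi2_contingency(table, correction=False)
print(f"chi-square = {chi2:.2f}, p = {p:.3f}")
```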
{ "pmid": 35872184, "language": "eng" }
Choosing Wisely Africa: Insights from the front lines of clinical care. A multidisciplinary Task Force of African oncologists and patient representatives published the Choosing Wisely Africa (CWA) recommendations in 2020. These top 10 recommendations identify low-value, unnecessary, or harmful practices that are frequently used in Sub-Saharan Africa (SSA). In this study, we describe agreement and concordance with the recommendations from front-line oncologists across SSA. An electronic survey was distributed to members of the African Organization for Research & Training in Cancer (AORTIC) and oncology groups within SSA using a hierarchical snowball method; each primary contact distributed the survey through their personal networks. The survey captured information about awareness of the CWA list, agreement with recommendations, and concordance with clinical practice. Descriptive statistics were used to summarize study results. Fifty-two individuals responded to the survey; 64% (33/52) were female and 58% (30/52) were clinical oncologists. Respondents represented 15 countries in SSA; 69% (36/52) practiced exclusively in the public system. Only 46% (24/52) were aware of the CWA list, and 89% (46/52) agreed it would be helpful if the list were displayed in their clinic. There was generally high agreement with the recommendations (range 84-98%); the highest agreement related to staging/defining treatment intent (98%). The proportion of oncologists who implemented these recommendations in routine practice was somewhat lower (range 68-100%). The lowest rates of concordance related to: the use of shorter schedules of radiotherapy (67%); discussion of active surveillance for low-risk prostate cancer (67%); only performing breast surgery for a mass that was proven to be malignant (70%); and seeking multidisciplinary input for curative-intent treatment plans (73%). While most front-line SSA oncologists agree with the CWA recommendations, efforts are needed to disseminate the list. Agreement with the recommendations is high, but there are gaps in implementation in routine practice. Further work is needed to understand the barriers to and enablers of implementation.
{ "pmid": 35872185, "language": "eng" }
Systematic data-driven exploration of Austrian wastewater and sludge treatment - implications for phosphorus governance, costs and environment. Within the new policy framework shaped by the EU Green Deal and the Circular Economy Action Plans, the field of wastewater and sludge treatment in Europe is subject to high expectations and new challenges related to mitigation of greenhouse gas emissions, micropollutant removal and resource recovery. With respect to phosphorus recovery, several technologies and processes have been thoroughly investigated. Nevertheless, a systemic and detailed understanding of the existing infrastructure and of the related environmental and economic implications is missing. Such a basis is essential to avoid unwanted consequences in designing new strategies, given the long lifespan of any infrastructural change. This study couples a newly collected and highly detailed database of all wastewater treatment plants in Austria larger than 2000 population equivalents with a combination of analyses, namely Substance Flow Analysis with a focus on nutrient and metal distribution in different environmental and anthropogenic compartments, Energy Flow Analysis, Life Cycle Assessment and cost estimation. The case study of Austria is of special interest, given its highly autonomous administration in federal states and its contrasting traits, ranging from flat metropolitan areas like Vienna to low-populated alpine areas. The significant impact of the electricity demand of wastewater treatment on the overall Cumulative Energy Demand (CED) shows the importance of optimization measures. Further, the current system of wastewater and sludge disposal has a low efficiency in recovering nutrients and in directing pollutants such as heavy metals into final sinks. Sludge composting with subsequent use in landscaping not only shows an unfavorable environmental balance, but is also the only relevant route leading to additional CED and Global Warming Potential emissions and to the highest transport volume. Altogether, the outcomes of this study provide a sound basis to further develop national strategies for resource recovery aimed at optimizing trade-offs between different economic and environmental objectives.
{ "pmid": 35872186, "language": "eng" }
Neurocognitive function in adult residents of a mining district in Mexico after reducing manganese exposure: Follow-up after 11 years. Little is known about the neurotoxic effects of chronic exposure to airborne Mn once exposure has been reduced. The environmentally exposed and the reference adult populations evaluated in 2002 were followed, after an environmental management program (EMP) was implemented to reduce the exposure in a mining district in Mexico. The aim of this study was to compare the association between exposure to Mn and neurocognitive performance in environmentally exposed and reference groups of adults before and after EMP implementation. In 2013, the same battery of neurocognitive tests used in the initial study (2002) was applied to 58 adults exposed to airborne Mn and 30 adults from the reference community. A cumulative exposure index (CEI) was estimated for the study population before and after the EMP. Categorical outcomes were analyzed using logistic regression, and the resulting ORs were compared between studies. Continuous outcomes were analyzed using linear regression. All models were adjusted for age, years of education, socioeconomic status and blood lead levels. Exposed adults from the post-EMP study showed an improvement in fine motor and verbal regulation of motor skills (OR < 1) compared to the exposed adults from the pre-EMP study (OR > 1). In both pre- and post-studies, the exposed adults showed a deterioration in their dynamic organization of motor activity compared to the reference group (p < 0.05); however, they showed no significant change in attention and working-memory performance. After four years of a significant reduction in airborne Mn levels resulting from EMP implementation, chronically exposed adults showed an improvement in fine motor and verbal regulation of motor skills; however, the remaining areas of their motor and cognitive functions remained impaired.
{ "pmid": 35872188, "language": "eng" }
Microwave-based soil moisture improves estimates of vegetation response to drought in China. The increased frequency and severity of drought have heightened concerns over the risk of hydraulic vegetative stress and the premature mortality of ecosystems globally. Unfortunately, most land surface models (LSMs) continue to underestimate ecosystem resilience to drought - which degrades the credibility of model-predicted ecohydrological responses to climate change. This study investigates the response of vegetation gross productivity to water-stress conditions using microwave-based vegetation optical depth (VOD) and soil moisture retrievals. Based on the estimated isohydric/anisohydric spectrum, we find that vegetation in an isohydric state exhibits a larger decrease in gross primary productivity and higher water use efficiency than anisohydric vegetation due to its more rigorous stomatal control and higher tolerance of carbon starvation risk. In addition, the introduction of microwave soil moisture improves the accuracy of isohydricity/anisohydricity estimates compared to those obtained using microwave VOD alone (i.e., it increases their Spearman rank correlation versus the benchmark of the Global Biodiversity Information Facility dataset from 0.12 to 0.63). The results of this study provide clear justification for the use of microwave-based soil moisture retrievals to enhance stomatal conductance parameterization within LSMs.
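A minimal sketch of the benchmark comparison described above, Spearman rank correlations of two competing estimates against a reference, is given below; the arrays are synthetic stand-ins for the isohydricity estimates and the GBIF-derived benchmark.

```python
# Sketch: comparing two estimates against a benchmark by Spearman rank
# correlation, mirroring the reported jump from 0.12 to 0.63.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(3)
benchmark = rng.normal(size=500)                        # GBIF-style benchmark
est_vod_only = benchmark * 0.1 + rng.normal(size=500)   # weakly related
est_vod_sm = benchmark * 0.8 + rng.normal(size=500)     # more strongly related

rho_vod, _ = spearmanr(benchmark, est_vod_only)
rho_sm, _ = spearmanr(benchmark, est_vod_sm)
print(f"VOD only: rho = {rho_vod:.2f}; VOD + soil moisture: rho = {rho_sm:.2f}")
```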
{ "pmid": 35872189, "language": "eng" }
Pilot-scale hydrolysis of primary sludge for production of easily degradable carbon to treat biological wastewater or produce biogas. Organic compounds in wastewater are required for the biological removal of nitrogen, but they can also be used for biogas production. Distribution of the internal organic carbon at the plant is therefore critical to ensure high quality of the treated water, reduce greenhouse gas emissions, and optimize biogas production. We describe a wastewater treatment plant designed to focus equally on energy production, water quality, and reduced emissions of greenhouse gases. A disk filter was installed to remove as much carbon as possible during primary treatment. Primary sludge was then hydrolyzed and centrifuged. The hydrolysate centrate contained volatile fatty acids and was used either for the secondary wastewater treatment or to produce biogas. The yield during hydrolysis was 30-35 g volatile fatty acid per kg dry material or 40-65 g soluble COD per kg total solid. The specific denitrification rate was 20-40 g/(g·min), which is on the same order of magnitude as that for commonly used external carbon sources. Hydrolysis at around 35 °C and pH 7 gave the best results. The hydrolysate centrate can be stored and added to the biological treatment to improve water quality and reduce emissions of nitrous oxide or it can be used to produce biogas to optimize the operation of the plant.
{ "pmid": 35872190, "language": "eng" }
Comparison of the performance of hydrochar, raw biomass, and pyrochar as precursors to prepare porous biochar for the efficient sorption of phthalate esters. In this study, three high-performance porous biochars were synthesized by the cocarbonization of Pistia stratiotes-derived precursors (raw biomass, hydrochar and pyrochar) with potassium hydroxide and utilized for the sorption of diethyl phthalate from aqueous solution. The developed pore structure, surface functional groups, high hydrophobicity and graphene structure of the porous biochars contributed to an excellent sorption capacity of up to 813 mg g-1 (Ce, 25 mg L-1). Among the three precursors, hydrochar-derived porous biochar showed better properties in terms of specific surface area and hydrophobicity, and it displayed the highest sorption capacity. The sorption kinetics and isotherm experiments confirmed that pore filling and partitioning dominated the sorption capacity, while mass transfer, hydrogen bonding and π-π stacking in the hydrochar limited the sorption rate. These findings suggest a feasible method for the efficient utilization of invasive aquatic plants and provide novel insight into the selection of precursors for preparing porous biochars.
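Isotherm experiments of the kind mentioned are typically summarized by fitting a model such as the Langmuir equation; the sketch below shows that fit on invented data points, so the fitted qmax and KL here are purely illustrative.

```python
# Sketch: fitting a Langmuir isotherm to batch sorption data, the kind of
# analysis behind a reported maximum uptake. Data points are invented.
import numpy as np
from scipy.optimize import curve_fit

def langmuir(ce, qmax, kl):
    """Langmuir isotherm: qe = qmax * KL * Ce / (1 + KL * Ce)."""
    return qmax * kl * ce / (1 + kl * ce)

ce = np.array([1.0, 2.5, 5.0, 10.0, 25.0])           # mg L-1, illustrative
qe = np.array([150.0, 320.0, 500.0, 650.0, 810.0])   # mg g-1, illustrative

(qmax, kl), _ = curve_fit(langmuir, ce, qe, p0=(800.0, 0.1))
print(f"qmax = {qmax:.0f} mg g-1, KL = {kl:.3f} L mg-1")
```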
{ "pmid": 35872192, "language": "eng" }
Influence of atmospheric patterns on soil moisture dynamics in Europe. Soil moisture (SM) plays a key role in the water cycle, and its variability is intimately linked to coupled land-atmosphere processes. Having a good knowledge of soil-atmosphere interactions is thus essential to assess the impact of climate change on SM; however, many aspects of how water and energy exchanges occur in the soil-atmosphere continuum are still uncertain. In particular, it is known that atmospheric circulation patterns influence climate conditions over Europe, but their impact on SM has only rarely been studied. This study provides insight into how atmospheric patterns influence soil moisture dynamics in Europe, where an increase in temperature and agricultural droughts are expected as an impact of climate change. To do so, we analysed the influence of the North Atlantic Oscillation (NAO), the Arctic Oscillation (AO), and the El Niño Southern Oscillation (ENSO) on European SM, including lagged responses, for the period 1991-2020 at a monthly scale. Two methods were used: a lagged correlation analysis and a more sophisticated causality approach using PCMCI (the PC method combined with the momentary conditional independence (MCI) test). SM series from two different databases were considered: the hydrological model LISFLOOD and the reanalysis dataset ERA5-Land. The results from the correlation analysis showed significant, predominantly negative relationships of SM with NAO and AO over almost all of Europe and no significant relation with ENSO. With the causality analysis, similar patterns were obtained for NAO and AO; however, the PCMCI analysis revealed clear patterns of ENSO influencing SM with a delayed response of one to two months in central and northwest Europe. The results obtained in this work highlight that there are causal relations between the main modes of interannual climate oscillations and SM variations in Europe, underlining the importance of accounting for global atmospheric circulations to study current changes in regional soil water-related processes.
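A minimal sketch of the simpler of the two methods, the lagged correlation analysis, is shown below on synthetic monthly series; in the study this would be repeated per grid cell and per climate index, with PCMCI applied separately.

```python
# Sketch: lagged correlation between a monthly climate index (e.g., ENSO)
# and a soil-moisture series, testing responses delayed by 0-3 months.
import numpy as np
import pandas as pd

rng = np.random.default_rng(4)
n = 360                                   # 30 years of monthly values
index = pd.Series(rng.normal(size=n))
# synthetic soil moisture responding to the index two months later
sm = index.shift(2).fillna(0) * 0.5 + rng.normal(scale=0.8, size=n)

for lag in range(4):
    r = sm.corr(index.shift(lag))         # pairwise-complete correlation
    print(f"lag {lag} months: r = {r:.2f}")
```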
{ "pmid": 35872191, "language": "eng" }
Multi-faceted analyses of seasonal trends and drivers of land surface variables in Indo-Gangetic river basins. The Indo-Gangetic river basins feature a wide range of climatic, topographic, and land cover characteristics providing a suitable setting for the exploration of multivariate time series. Here, we collocated a comprehensive feature space for these river basins including Earth observation time series on the normalized difference vegetation index (NDVI), surface water area (SWA), and snow cover area (SCA) in combination with driving variables between December 2002 and November 2020. First, we evaluated changes using multi-faceted trend analyses. Second, we employed the causal discovery algorithm Peter and Clark Momentary Conditional Independence (PCMCI) to disentangle interactions within the feature space. PCMCI quantifies direct and indirect relationships between variables and has rarely been applied to remote sensing applications. The results showed that vegetation greening continues significantly. Irrigated croplands in the Indus basin indicated the highest trend magnitude (0.042 NDVI decade-1). At annual and basin scale, positive trends were also identified for SWA in the Indus (837 km2 decade-1) and Ganges basin (677 km2 decade-1). Annual trends in SCA were insignificant at basin scale. Considering elevation zones, negative SCA trends were found in high altitudes of the Ganges and Brahmaputra river basins. Similarly, NDVI and SWA showed positive trends in high elevations. Furthermore, the causal analysis revealed that NDVI was controlled by water availability. SWA was directly influenced by river discharge and indirectly by precipitation. In high altitudes, SWA was controlled by SCA and temperature. Precipitation and temperature were identified as important drivers of SCA with spatio-temporal variations. With amplified climate change, the joint exploitation of time series will be of increasing importance to further enhance the understanding of land surface change and complex interplays across the spheres of the Earth system. The insights of this study and the methods used could greatly support the development of climate change adaptation strategies for the investigated region.
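Trend analyses like those behind the NDVI and SWA figures are commonly nonparametric; the sketch below pairs a Mann-Kendall-style significance test (Kendall's tau) with a Theil-Sen slope on a synthetic annual series, as one plausible reading of "multi-faceted trend analyses", not the paper's exact workflow.

```python
# Sketch: nonparametric trend estimation on an annual NDVI series.
import numpy as np
from scipy.stats import kendalltau, theilslopes

rng = np.random.default_rng(5)
years = np.arange(2003, 2021)
ndvi = 0.004 * (years - 2003) + 0.45 + rng.normal(scale=0.01, size=years.size)

tau, p = kendalltau(years, ndvi)             # Mann-Kendall-style test
slope, intercept, lo, hi = theilslopes(ndvi, years)  # robust slope
print(f"tau = {tau:.2f}, p = {p:.3f}, slope = {slope * 10:.3f} NDVI decade-1")
```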
{ "pmid": 35872193, "language": "eng" }
Avoidance responses by Danio rerio reveal interactive effects of warming, pesticides and their mixtures. Temperature variations and extreme thermal events caused by climate change can have profound implications for the toxicity of pesticides to aquatic organisms. Using an innovative system (Heterogeneous Multi-Habitat Test System - HeMHAS) that allows the simulation of different scenarios within a spatially heterogeneous landscape, the effects of the pesticides fipronil and 2,4-D, as single compounds and in mixture, on habitat selection by Danio rerio were studied under different air temperatures (20, 24 and 28 °C). As a result, D. rerio detected and avoided both pesticides at air temperatures of 20 and 24 °C; however, at 28 °C no significant difference was observed in habitat choice by fish. Additionally, when the pesticides were mixed in a heterogeneously contaminated landscape, D. rerio detected the contamination and preferred the clean zone at 20 and 24 °C; however, at 28 °C the potential to escape from the most contaminated areas was impaired. Thus, contamination by both pesticides had a more noticeable effect on the habitat selection behavior of fish at 20 and 24 °C. In addition, the combination of pesticides and elevated temperature negatively affected the ability of fish to detect and escape from contaminated environments, suggesting that temperature alters the capacity of the organism to provide an efficient response to stress.
{ "pmid": 35872194, "language": "eng" }
Bioavailability of potentially toxic elements influences antibiotic resistance gene and mobile genetic element abundances in urban and rural soils. Antibiotic resistance genes (ARGs) that can encode resistance traits in bacteria are found across the environment. While it is often difficult to discern their origin, their prevalence and diversity depend on many factors, one of which is exposure to potentially toxic elements (PTE, i.e., metals and metalloids) in soils. Here, we investigated how ambient ARGs and mobile genetic elements (MGEs) relate to the relative bioavailability of different PTEs (total versus exchangeable and carbonate-bound PTE) in rural and urban soils in northeast England. The average relative abundances of ARGs in rural sites varied over a 3-log range (7.24 × 10-7 to 1.0 × 10-4 genes/16S rRNA), and relative ARG abundances in urban sites varied over four orders of magnitude (1.75 × 10-6 to 2.85 × 10-2 genes/16S rRNA). While beta-lactam and aminoglycoside resistance genes dominated rural and urban sites, respectively, non-specific ARGs, also called multidrug-resistance genes, were significantly more abundant in urban sites (p < 0.05). Urban sites also had higher concentrations of total and exchangeable forms of PTE than rural sites, whereas rural sites were higher in carbonate-bound forms. Significant positive Spearman correlations between PTEs, ARGs and MGEs were apparent, especially for bioavailable PTE fractions and at urban sites. This study found significant positive correlations between ARGs and beryllium (Be), which has not previously been reported. Overall, our results show that PTE bioavailability is important in explaining the relative selection of ARGs in soil settings and must be considered in future co-selection and ARG exposure studies.
{ "pmid": 35872195, "language": "eng" }
Effects of black carbon aerosol on air quality and vertical meteorological factors in early summer in Beijing. Black carbon (BC) aerosols affect the atmospheric thermal vertical structure due to their radiation absorption characteristics, thereby influencing boundary layer characteristics and pollutant diffusion. This study focuses on the effects of BC under different atmospheric conditions on air quality and vertical meteorological conditions. Four days of flight observations combined with surface wind-profiler radar data were used to investigate the vertical profiles of BC and wind speed over the Beijing urban area in early summer. The vertical profiles of BC concentration and wind speed in the boundary layer were negatively correlated, both showing abrupt changes near the boundary layer height under stagnant weather conditions. The chemical transport model showed that the increase of BC under stagnant conditions could aggravate the stability of the boundary layer, thereby increasing the accumulation of pollutants. In particular, BC leads to changes in the temperature profile, which modify relative humidity and indirectly lead to changes in the vertical profile of aerosol optical properties. However, when the early accumulation of BC was absent under more turbulent conditions, the effects of BC on air quality and meteorological conditions were limited.
{ "pmid": 35872196, "language": "eng" }
Enzymatic regulation of N2O production by denitrifying bacteria in the sludge of biological nitrogen removal process. This study analyzed the activities of all denitrifying enzymes involved in the denitrification process under different organic loads in a continuously operated sequencing batch reactor (SBR), to reveal how the denitrifying enzymes performed as the denitrifying bacteria faced changes in organic load, leading to nitrous oxide (N2O) production through fine-tuned enzyme activities. Results show that the activities of nitrate reductase (Nar), nitrite reductase (Nir), nitric oxide reductase (Nor) and nitrous oxide reductase (N2OR) increased with increasing organic load, and the load-driven increases in activity ranked as Nar > Nir > Nor > N2OR. Compared with Nar and Nir, the catalytic processes of Nor and N2OR were more susceptible to the influence of the substrate concentration and the content of internal and external carbon sources. The Nor usually maintained "excess" catalytic activity to ensure the smooth reduction of nitric oxide when the electron donor and substrate were sufficient; otherwise, it dropped to a relatively lower catalytic activity and remained stable. The activities of the N2OR were generally weaker than those of the other denitrifying enzymes. More N2O was produced in periods fed with low organic loads (COD/NO3--N ≤ 4.9). The mechanism by which the enzyme activities (Nor and N2OR) regulate the total concentration of N2O was clarified. When the organic load was relatively low (COD/NO3--N ≤ 2.5), the N2OR activity was inhibited because it could not acquire enough electrons, resulting in the production of N2O. When the organic load was moderate (2.5 < COD/NO3--N ≤ 4.9), the N2OR activity was lower than the Nor activity due to the different activation rates of Nor and N2OR by the substrate in bacteria, resulting in the production of N2O.
{ "pmid": 35872197, "language": "eng" }
Response of forage nutritional quality to climate change and human activities in alpine grasslands. The impacts of climate change and human activities on forage nutritional quality will affect nutrient capacity, livestock development and wildlife conservation in alpine regions. However, the response of forage nutritional quality to climate change and human activities remains indistinguishable across the whole of Tibet. Here, six forage variables (i.e., crude protein, CP; ether extract, EE; crude ash, Ash; acid detergent fiber, ADF; neutral detergent fiber, NDF; water-soluble carbohydrates, WSC) together represented forage nutritional quality. We estimated potential forage CP, EE, Ash, ADF, NDF and WSC contents from growing-season mean air temperature, total precipitation and total radiation using random forest models, and actual contents from the same climate variables plus the maximum normalized difference vegetation index, also using random forest models. Climate change had nonlinear effects on potential forage CP, EE, Ash, ADF, NDF and WSC contents. Radiation change dominated the variation in potential forage nutritional quality. Human activities altered the sensitivities of forage nutritional quality to climate change. The effects of human activities on forage nutritional quality increased with increasing longitude and precipitation and with decreasing elevation and radiation. Consequently, attention should be paid to radiation change in addition to climate warming and precipitation change, at least for forage nutritional quality in alpine grasslands. The effects of human activities on forage nutritional quality can vary with longitude, elevation, precipitation and radiation in alpine grasslands.
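A minimal sketch of the random forest setup described above, predicting one forage-quality variable from the three climate drivers, follows; the features, coefficients, and sample size are invented for illustration.

```python
# Sketch: estimating a forage-quality variable (crude protein) from
# climate drivers with a random forest, mirroring the potential-CP models.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(6)
n = 400
X = np.column_stack([
    rng.normal(8, 3, n),       # growing-season mean air temperature
    rng.normal(450, 120, n),   # growing-season total precipitation
    rng.normal(3000, 300, n),  # growing-season total radiation
])
y = 0.8 * X[:, 0] + 0.01 * X[:, 1] - 0.002 * X[:, 2] + rng.normal(0, 1, n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
rf = RandomForestRegressor(n_estimators=500, random_state=0).fit(X_tr, y_tr)
print(f"R^2 on held-out sites: {rf.score(X_te, y_te):.2f}")
print(f"feature importances: {rf.feature_importances_.round(2)}")
```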
{ "pmid": 35872198, "language": "eng" }
High importance of coupled nitrification-denitrification for nitrogen removal in a large periodically low-oxygen estuary. The coupling between nitrification and denitrification/anammox (nitrate/nitrite used in denitrification/anammox derives from nitrification) is a significant process of reactive nitrogen (N) removal that has attracted much attention. However, the dynamics of coupled nitrification-denitrification/anammox in the periodically low-oxygen estuaries and coasts remain unclear. Here, continuous-flow experiments combined with isotope tracing techniques were conducted in periodically low-oxygen areas of the Yangtze Estuary to reveal the changes in benthic sediment denitrification and anammox as well as their coupling with nitrification. Our results showed that denitrification increased but anammox decreased during low-oxygen summer. The occurrence of low oxygen also promoted coupled nitrification-denitrification but decreased coupled nitrification-anammox. These results implied that decreased dissolved oxygen in summer did not largely restrict nitrification activity, and anaerobic denitrification/anammox regulated the magnitude of coupled nitrification-denitrification/anammox rates. Denitrification (74.95-100 %) was the dominant process in total N removal, while coupled nitrification-denitrification accounted for a higher proportion (45.68-97.05 %) of denitrification, indicating that coupling between nitrification and denitrification played a dominant role in N removal. In addition to dissolved oxygen levels, carbon and N substrate availabilities were also important variables to regulate N transformations. Overall, this study advanced our knowledge of the distribution patterns and controlling factors of N removal processes and highlighted that coupled nitrification-denitrification might have a significant but neglected role in N removal from periodically low-oxygen estuaries.
{ "pmid": 35872199, "language": "eng" }
Groundwater travel times predict DOC in streams and riparian soils across a heterogeneous boreal landscape. Dissolved organic carbon (DOC) in surface waters is an important component of the boreal landscape carbon budget and a critical variable in water quality. A dominant terrestrial DOC source in the boreal landscape is the riparian zone. These near stream areas play a key role in regulating DOC transport between land and aquatic ecosystems. The groundwater dynamics at this interface have been considered a major controlling variable for DOC export to streams. This study focuses on the regulating role of groundwater levels and mean travel times (MTT) on riparian DOC concentrations and, subsequently, stream DOC. This is done by comparing them as explanatory variables to capture the spatial and intra-annual variability of the stream and riparian groundwater DOC. We used a physically based 3D hydrological model, Mike SHE, to simulate DOC concentrations of the riparian zones for 14 sub-catchments within the Krycklan catchment (Sweden). The model concept assumes that DOC concentrations will be higher in groundwater moving through shallow flow paths. In the model, this can be linked to the position of the groundwater table at a point of observation or the travel time, which will generally be shorter for water that has travelled through shallow and more conductive soil layers. We compared the results with both observed stream and groundwater concentrations. The analysis revealed that the correlation between modelled and observed annual averages of stream DOC increased from r = 0.08 to r = 0.87 by using MTT instead of groundwater level. MTT also better captured the observed spatial variability in riparian DOC concentrations and more successfully represented seasonal variability of stream DOC. We, therefore, suggest that MTT is a better predictor than groundwater level for riparian DOC concentration because it can capture a greater variety of catchment heterogeneities, such as variation in soil properties, catchment size, and input from deep groundwater sources.
{ "pmid": 35872200, "language": "eng" }
Enhancing phosphorus recovery from efficient acidogenic fermentation of waste activated sludge with acidic cation exchange resin pretreatment: Insights from occurrence states and transformation. Achieving phosphorus (P) recovery during treatment and disposal of waste activated sludge (WAS) by anaerobic-based processes has received increasing attention. To solve the problem of low phosphorus release efficiency, anaerobic fermentation (AF) combined with acidic cation exchange resin (ACER) pretreatment was first proposed in this study. Results showed that the isoelectric point pretreatment with ACER increased the recoverable phosphorus content by 2.3 times compared to that without ACER pretreatment. Phosphorus transformation was systematically analyzed from a whole-process perspective, and the results visually revealed that the release of phosphorus during the conventional AF process (without ACER pretreatment) was limited by insufficient phosphorus release from extracellular polymeric substances (EPS) and mineral precipitation, as well as the reprecipitation of soluble phosphorus with metals. ACER enabled effective dissolution of mineral phosphorus by acidifying WAS. On the other hand, ACER adsorbed metals to promote EPS disintegration and hydrolysis, thereby enhancing the release of EPS-bound P, which also reduced the reprecipitation of soluble phosphorus during AF. Furthermore, ACER pretreatment increased volatile fatty acids production by >2-fold with enhanced sludge hydrolysis. This finding has important implications for both non-renewable phosphorus recovery and sludge resource recovery.
{ "pmid": 35872202, "language": "eng" }
Artificial neural network modeling in environmental radioactivity studies - A review. The development of nuclear technologies has directed environmental radioactivity research toward continuously improving existing models and developing new ones for different interpolation, optimization, and classification tasks. Due to their adaptability to new data without knowledge of the actual modeling function, artificial neural networks (ANNs) are extensively used to resolve tasks for which the application of traditional statistical methods has not provided an adequate response. This study presents an overview of ANN-based modeling in environmental radioactivity studies, including identifying and quantifying radionuclides, predicting their migration in the environment, mapping their distribution, optimizing measurement methodologies, monitoring processes in nuclear plants, and real-time data analysis. Special attention is paid to highlighting the scope of the different case studies and discussing the techniques used in model development over time. The performance of ANNs is evaluated from the perspective of prediction accuracy, emphasizing the advantages and limitations encountered in their use. The most critical elements in model optimization were identified as the network structure, the selection of input parameters, the properties of the input data set, and the applied learning algorithm. The analysis of strategies and methods for improving the performance of ANNs has shown that developing integrated and hybrid artificial intelligence tools could provide a new path in environmental radioactivity modeling toward more reliable outcomes and higher-accuracy predictions. The review highlights the potential of neural networks and the challenges in their application in environmental radioactivity studies and proposes directions for future research.
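For readers unfamiliar with the basic tool under review, the sketch below trains a small feed-forward ANN on synthetic spectral features; the architecture, inputs, and target are assumptions for illustration, not any specific study's model.

```python
# Sketch: a minimal feed-forward ANN mapping gamma-spectrum features to an
# activity concentration. Real studies tune architecture and learning rules.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(7)
X = rng.random((300, 16))                           # e.g., 16 spectral windows
y = X @ rng.random(16) + rng.normal(0, 0.05, 300)   # synthetic activity

model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0),
)
model.fit(X, y)
print(f"training R^2: {model.score(X, y):.2f}")
```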
{ "pmid": 35872204, "language": "eng" }
Direct measurements of dissolved N2 and N2O highlight the strong nitrogen (N) removal potential of riverine wetlands in a headwater stream. Increasing levels of nitrogen (N) in aquatic ecosystems due to intensified human activities are focusing attention on N removal mechanisms as a means to mitigate environmental damage. Important N removal processes such as denitrification can resolve this issue by converting N to gaseous emissions. Here, the spatiotemporal variability of N removal rates in China's Zhongtian River, a headwater stream that contains wetlands, was investigated by quantifying gaseous emissions of the main end products, N2 and N2O, using a water-air exchange model. Excess concentrations of these gases relative to their saturation values in the water column generally varied within 1.4-8.7 μmol L-1 and 8.7-20.3 nmol L-1, with mean values of 4.5 μmol L-1 and 13.7 nmol L-1, respectively, demonstrating significant N removal in the river. The reach with wetlands was characterized by higher in-stream N2 production than the non-wetland reach, especially in July, when aquatic vegetation is most abundant. High N2O emissions during the same period in the non-wetland reach indicate that environmental conditions associated with vegetation are conducive to N2 production and likely constrain N2O emission. Changes in dissolved oxygen, pH, temperature, and carbon-to-nitrogen ratios are correlated with the observed spatiotemporal variability in gaseous N production. The mean N removal rate in the wetland reach was roughly twice that in the non-wetland reach, i.e., 22.4 vs. 10.3 mmol N m-2 d-1, while the corresponding efficiency was about five times as high, i.e., 15 % vs. 3 %. This study reveals the spatiotemporal patterns of in-stream N removal in a headwater stream and highlights the efficacy of wetlands in N removal. The data provide a strong rationale for constructing artificial wetlands as a means to mitigate N pollution and thereby optimize riverine environmental conditions.
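The water-air exchange estimate reduces to F = k(Cw - Ceq); the sketch below runs that arithmetic with the mean excess N2 from the text and a placeholder gas transfer velocity chosen only to land near the reported order of magnitude, not the study's derived k.

```python
# Sketch: water-air exchange flux, F = k * (Cw - Ceq).
# k is a placeholder gas transfer velocity; excess_n2 is the mean reported.
k = 2.5                    # gas transfer velocity, m d-1 (placeholder)
excess_n2 = 4.5            # mean excess N2, umol L-1 == mmol m-3
flux_n2 = k * excess_n2    # mmol N2 m-2 d-1
flux_n = 2 * flux_n2       # mmol N m-2 d-1 (2 N atoms per N2 molecule)
print(f"~{flux_n:.1f} mmol N m-2 d-1, same order as the wetland-reach rate")
```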
{ "pmid": 35872205, "language": "eng" }
Experimental evidence for the impact of phages on mineralization of soil-derived dissolved organic matter under different temperature regimes. Microbial mineralization of dissolved organic matter (DOM) plays an important role in regulating C and nutrient cycling. Viruses are the most abundant biological agents on Earth, but their effect on the density and activity of soil microorganisms and, consequently, on the mineralization of DOM under different temperatures remains poorly understood. To assess the impact of viruses on DOM mineralization, we added a soil phage concentrate (active vs. inactivated phage control) to four DOM extracts containing inoculated microbial communities and incubated them at 18 °C and 23 °C for 32 days. Infection with active phages generally decreased DOM mineralization on day one and accelerated DOM mineralization later (especially from day 5 to 15) compared with the inactivated-phage control. Overall, phage infection increased microbially driven CO2 release. Notably, while the higher temperature increased total CO2 release, the cumulative CO2 release induced by phage infection (the difference between active phages and the inactivated control) was not affected. However, the higher temperature advanced the response time of the phages but shortened their active period. Our findings suggest that bacterial predation by phages can significantly affect soil DOM mineralization, and that higher temperatures may accelerate host-phage interactions and thus shorten the duration of C recycling.
{ "pmid": 35872203, "language": "eng" }
Exposure to ultrafine particles and childhood obesity: A cross-sectional analysis of the Seven Northeast Cities (SNEC) Study in China. Studies on the obesogenic effect of air pollution on children have been mixed and sparse. Moreover, due to insufficient air monitoring, few studies have investigated the role of smaller but unregulated particles (ambient particles with a diameter of 0.1 μm or less, ultrafine particles). We sought to explore the associations between long-term exposure to ambient ultrafine particles (UFPs) and childhood obesity in Chinese children. In this cross-sectional study, we randomly recruited 47,990 children, aged 6-18 years, from seven cities in Northeastern China between 2012 and 2013. Child age- and sex-specific z-scores for body mass index (BMI Z-score) and weight status were generated using the World Health Organization growth reference. Four-year average concentrations of UFPs and airborne particulates of diameter ≤ 1 μm (PM1), ≤2.5 μm (PM2.5), and ≤10 μm (PM10) were estimated at each child's home, using a neural network-simulated WRF-Chem model and a spatiotemporal model, respectively. Confounder-adjusted generalized linear mixed models examined the associations of air pollution with BMI Z-score and the prevalence of childhood obesity. We found that UFP exposure was associated with a greater childhood BMI Z-score and a higher likelihood of obesity. Compared with the lowest quartile, higher quartiles of UFPs were associated with greater odds of obesity in children (i.e., the adjusted OR was 1.25; 95 % CI, 1.12-1.39; 1.43; 95 % CI, 1.27-1.61; and 1.41; 95 % CI, 1.25-1.58 for the second, third, and fourth quartile, respectively). Similar associations were observed for PM1, PM2.5, and PM10 and were greater in boys and in children living close to roadways. Long-term UFP exposure was associated with a greater likelihood of childhood obesity, and stronger associations with BMI Z-score were observed in boys and children living close to roadways. This study indicates that more attention should be paid to the health effects of UFPs, and routine monitoring of UFPs should be considered.
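WHO growth references convert a raw BMI into an age- and sex-specific z-score through the LMS transformation; a minimal sketch is below, with hypothetical L, M, S values standing in for the real lookup tables.

```python
# Sketch: the LMS transformation behind WHO BMI z-scores.
# The L (skewness), M (median), S (coefficient of variation) values below
# are placeholders; real ones come from WHO age- and sex-specific tables.
import math

def lms_zscore(x, L, M, S):
    """z = ((x/M)**L - 1) / (L*S) for L != 0, else ln(x/M)/S."""
    if L == 0:
        return math.log(x / M) / S
    return ((x / M) ** L - 1) / (L * S)

# e.g., a child with BMI 22 against hypothetical reference values
print(f"BMI z-score: {lms_zscore(22.0, L=-1.6, M=16.4, S=0.12):.2f}")
```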
{ "pmid": 35872206, "language": "eng" }
Understanding source terms of anthropogenic uranium in the Arctic Ocean - First 236U and 233U dataset in Barents Sea sediments. This work reports the first dataset of 236U and 233U in sediment cores taken from the Barents Sea, with the aim of better understanding the source terms of anthropogenic uranium in the Arctic region. Concentrations of 236U and 233U, along with 137Cs and the 233U/236U atomic ratio, were measured in six sediment profiles. The cumulative areal inventories of 236U and 233U obtained in this work are (3.50-12.7) × 1011 atom/m2 and (4.92-21.2) × 109 atom/m2, with average values of (8.08 ± 2.93) × 1011 atom/m2 and (1.08 ± 0.56) × 1010 atom/m2, respectively. The total quantities of 236U and 233U deposited in the Barents Sea bottom sediments were estimated to be 507 ± 184 g and 7 ± 3 g, respectively, which are negligible compared to the total direct deposition of 236U (6000 g) and 233U (40-90 g) from global fallout in the Barents Sea. The integrated 233U/236U atomic ratios, ranging within (0.98-1.57) × 10-2, reflect the predominant global fallout signal of 236U in the Barents Sea sediments; the highest reactor-236U contribution among the six sediment cores accounts for 30 ± 14 %. The reactor-236U input in the Barents Sea sediments is most likely transported from the European reprocessing plants rather than related to any local radioactive contamination. These results provide a better understanding of the source term of anthropogenic 236U in the Barents Sea and prompt the oceanic tracer application of 236U for studying the dynamics of the Atlantic-Arctic Ocean and associated climate changes. The 236U-233U benchmarked age-depth profiles seem to match reasonably well with the reported input history of radioactive contamination in the Barents Sea, indicating the high potential of the anthropogenic 236U-233U pair as a useful tool for sediment dating.
{ "pmid": 35872207, "language": "eng" }
Reassessment of carbon emissions from fires and a new estimate of net carbon uptake in Russian forests in 2001-2021. Russia has the largest forest area on Earth. Its boreal forests officially store about 97 Pg C, which significantly affects the global carbon cycle. In recent years, forest fires have been intensifying on the planet, leading to increased carbon emissions. Here we review how differences in fire control management of Russian forests affect fire-related emissions. Carbon emissions due to fire were estimated using satellite data and compared with official reports for 2001-2021. We found that the relative areas affected by fire differed between fire protection zones, and 89 % of the area burnt was in forests protected by fire-fighting aircraft or in areas without protection. As a result, 417.7 Mha of poorly protected or unprotected Russian forests (42 % of the total) account for about half of total carbon emissions. According to our estimates, the average area of burnt forest in Russia was about 8.3 Mha per year between 2016 and 2021, resulting in annual carbon emissions of 193 million metric tons (Mt) C, 53 % of which came from unprotected forest. These estimated carbon emissions are significantly higher than official national reports (79 Mt C yr-1). We estimated that the net carbon uptake of Russian forests for 2015-2021 was about 333 ± 37 Mt C, which is roughly double the official estimates. Our results highlight large spatial differences in fire protection and prevention strategies in fire-related emissions. The so-called control zone, which stretches across large parts of Eastern Russia, has no fire control and is the region of major recent fires. Our study shows that to estimate the Russian forest carbon balance it is critical to include this area. Implementation of some forest management in the remote areas (i.e., the control zone) would help to decrease forest loss and the resulting carbon emissions.
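Bottom-up fire emission estimates of this kind typically follow E = A × B × CC (burnt area × fuel carbon density × combustion completeness); the sketch below shows that arithmetic with placeholder fuel parameters chosen only to reproduce the ~193 Mt C scale, not the paper's calibrated inputs.

```python
# Sketch: bottom-up fire carbon emissions, E = A * B * CC.
# Fuel load and combustion completeness are placeholders for illustration.
burnt_area_ha = 8.3e6            # mean annual burnt forest area (from text)
fuel_load_t_c_per_ha = 60.0      # available carbon per hectare (placeholder)
combustion_completeness = 0.39   # fraction actually combusted (placeholder)

emissions_mt_c = (burnt_area_ha * fuel_load_t_c_per_ha
                  * combustion_completeness) / 1e6
print(f"~{emissions_mt_c:.0f} Mt C per year")
```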
{ "pmid": 35872210, "language": "eng" }
Association of Intraoperative and Perioperative Transfusions with Postoperative Cardiovascular Events and Mortality After Infrainguinal Revascularization. Patients undergoing open or endovascular infrainguinal revascularization are at an elevated risk for postoperative cardiovascular complications due to high rates of comorbidities and the physiologic stress of surgery. Transfusions are known to be associated with adverse events, but the specific risks associated with transfusion timing, product type, and long-term outcomes, while accounting for preoperative cardiovascular risk factors, are not well understood in this population. This study aimed to characterize the association of intraoperative and perioperative transfusion, anemia, and cardiovascular risk factors with cardiovascular events and mortality in patients undergoing infrainguinal revascularization. A single-center retrospective study was performed on 564 infrainguinal revascularization procedures, including both open (n = 250) and endovascular (n = 314) approaches (2016-2020). Comprehensive clinical data were collected, including patient demographics, cardiovascular risk factors, preoperative hemoglobin, and detailed transfusion data. Multivariable logistic regression tested the association of transfusions with composite 30-day outcomes of cardiac complications (postoperative myocardial infarction [postop-MI], congestive heart failure, or dysrhythmia) and with major adverse cardiovascular events (MACE; postop-MI or death). Kaplan-Meier analysis and Cox proportional hazards modeling examined the association of transfusions, anemia, and cardiovascular risk factors with mortality up to 1 year. Intraoperative transfusion was performed in 15% of cases, and 13% underwent transfusion in the early postoperative period. Intraoperative transfusion was associated with a higher Revised Cardiac Risk Index (RCRI), lower preoperative hemoglobin, increased blood loss, and open procedures (all P < 0.05). Within each RCRI score, intraoperative transfusion was associated with a 2-4-fold increase in MACE at 30 days. Intraoperative and early postoperative packed red blood cell transfusion were each associated with more than 2-fold adjusted odds of any cardiovascular complication, and intraoperative transfusion was also associated with MACE (all P < 0.05). Intraoperative transfusion was associated with mortality at 1 year on unadjusted analysis, but after adjustment for RCRI, age, and preoperative hemoglobin, only RCRI scores of 2 and 3+ and preoperative hemoglobin remained significant risk factors for mortality. Intraoperative and early perioperative transfusions are strongly associated with worse cardiovascular outcomes after infrainguinal revascularization. These findings may have prognostic value for further risk-stratifying patients perioperatively at high risk for complications. However, prospective studies are needed to elucidate whether optimizing transfusion strategies mitigates these risks.
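A minimal sketch of the survival workflow named above, Kaplan-Meier estimation plus a Cox proportional hazards model, is shown below using the lifelines package on synthetic data; the column names and distributions are invented, not the study's dataset.

```python
# Sketch: Kaplan-Meier and Cox PH modeling of 1-year mortality.
# Requires the lifelines package; all data here are synthetic placeholders.
import numpy as np
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

rng = np.random.default_rng(8)
n = 564
df = pd.DataFrame({
    "months": rng.exponential(24, n).clip(max=12),  # follow-up to 1 year
    "died": rng.binomial(1, 0.15, n),
    "transfused": rng.binomial(1, 0.15, n),
    "rcri": rng.integers(0, 4, n),
    "age": rng.normal(68, 10, n),
    "preop_hgb": rng.normal(12, 2, n),
})

kmf = KaplanMeierFitter()
kmf.fit(df["months"], event_observed=df["died"], label="all patients")
print(f"KM survival at 12 months: {kmf.predict(12):.2f}")

cph = CoxPHFitter()  # uses all non-duration/event columns as covariates
cph.fit(df, duration_col="months", event_col="died")
print(cph.summary[["exp(coef)", "p"]])  # hazard ratios per covariate
```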
{ "pmid": 35872211, "language": "eng" }
Outcomes of Popliteal Endarterectomy and Infrapopliteal Angioplasty for Short Atherosclerotic Popliteal Artery Occlusion: Single-Center Case Series. Localized popliteal artery occlusion (LPAO) is a rare entity that is challenging to treat. In selected patients, open popliteal endarterectomy (OPE) with infrapopliteal balloon angioplasty (IPA) can be limb-saving. The aim of this retrospective study from Iraq-Kurdistan is to assess the outcomes of the procedure. Over 5 years, ending in 2020, 28 patients with atherosclerotic LPAO unsuitable for femoropopliteal bypass or endovascular intervention received OPE + IPA through a medial approach under spinal anesthesia. Perioperative data were obtained from patients' records and entered into an Access database. Results were retrieved and statistically analyzed. There were 18 (64.3%) males. The mean age was 66.4 ± 6.53 years (range 52-79 years). Seventy-five percent of patients had obesity, diabetes mellitus, and smoking. Twenty-six (92.9%) patients were in Rutherford category 5 or 6 with an ankle-brachial index <0.40. Popliteal and pedal pulses were absent in 23 of 24 (95.8%) patients. Doppler ultrasound showed good distal runoff in 1 (3.6%) patient. Computed tomography angiography revealed 20 (71.4%) femoropopliteal and 16 (57.1%) infrapopliteal lesions of types A and B as per the Trans-Atlantic Inter-Society Consensus II document. The mean endarterectomy length was 5.1 cm (range 3-7 cm); the artery was patched with a vein in 17 (60.7%) patients and closed primarily in 11 (39.3%). There were no early complications in 18 (64.3%) patients, whereas 1 leg was amputated on day 30. On average, follow-up lasted 3.4 years. Twenty-seven (96.4%) vessels remained patent and 2 (7.1%) patients died. Our study confirms the safety and efficacy of OPE + IPA for selected patients with critical limb ischemia due to LPAO.
{ "pmid": 35872212, "language": "eng" }
Distinct Characteristics and Chronology of Amoxicillin-Associated Reactions in Pediatric Acute Care Settings. Amoxicillin-associated reactions (AARs) in children presenting as rashes are common, and recent data suggest that >90% of affected children tolerate amoxicillin on re-exposure. However, additional data would help pediatricians and allergists gain confidence in referring and testing children who experienced systemic symptoms perceived as "worrisome," which often prompt urgent medical evaluations. By characterizing the entire spectrum of AAR symptoms in pediatric patients presenting to emergency department (ED)/urgent care (UC) settings, we sought to increase our diagnostic acumen to guide subsequent allergy evaluations. To fully characterize the clinical features of rash and systemic symptoms in children presenting to the ED/UC with AARs, a retrospective chart review of children seen in the ED/UC from July 1, 2015, to June 30, 2017, was conducted. Clinical features, chronology, and seasonality were detailed, and cases were classified into 3 previously described AAR phenotypes: maculopapular exanthem (MPE), urticaria, and, when joint symptoms were present, serum sickness-like reactions (SSLRs). Children (n = 668; median age: 1.8 years) presented to the ED/UC with urticaria (44%), MPE (36%), and SSLRs (11%), typically on days 7 to 10 of amoxicillin therapy. Although children with SSLRs were more frequently treated with corticosteroids (28%, P < .0001) and exhibited higher rates of "worrisome" features (fever, angioedema, or gastrointestinal symptoms; 73%, P < .0001), delayed-onset systemic symptoms were identified frequently in all 3 groups. ED/UC reutilization was unexpectedly high, with 66 children (10%) returning to the ED/UC for re-evaluation. "Worrisome" symptoms are common in children presenting to the ED/UC with AARs. Future studies are needed to determine the impact on subsequent referral and allergy testing.
{ "pmid": 35872215, "language": "eng" }
Predictors of Acute Care Reutilization in Pediatric Patients With Amoxicillin-Associated Reactions. Amoxicillin-associated reactions (AARs) contribute to substantial health care utilization, with a reutilization rate of 10% in pediatric emergency department (ED) and urgent care (UC) settings. We aimed to identify predictors of ED/UC reutilization by examining patients' clinical features and providers' management of AARs. Through a retrospective chart review of 668 patients presenting with AARs over 2 years to the pediatric ED/UC, we examined clinical features associated with ED/UC reutilization, including rash phenotype, systemic symptoms (fever, angioedema, joint involvement, gastrointestinal symptoms), and providers' management (pharmacologic treatment and counseling). We then constructed a statistical model to predict ED/UC reutilization using stepwise backward model selection. ED/UC reutilizers were more likely to be male (P = .008) and to have fever (P = .0001), angioedema (P < .0001), joint involvement (P < .0001), and gastrointestinal symptoms (P = .0001) during their AAR course. Rash phenotypes differed between groups (P < .0001), as ED/UC reutilizers more frequently exhibited urticaria. However, there were no differences in clinical management between groups, including pharmacologic recommendations, at the initial ED/UC encounter. In addition, our statistical model identified patients younger than 2 years of age as more likely to reutilize ED/UC resources if providers did not document specific return precautions (odds ratio, 3.6; 95% confidence interval, 1.7-7.7). Recognition of the clinical features and treatment gaps associated with ED/UC reutilization will guide interventions to optimize care in children presenting with AARs, such as improved anticipatory guidance and early allergy consultation. Prospective studies are needed to determine whether these interventions will reduce ED/UC reutilization and facilitate timely allergy testing.
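A minimal sketch of p-value-based stepwise backward elimination for a logistic model, assuming a DataFrame `df` with a hypothetical binary outcome `reutilized` and binary candidate predictors; the abstract does not specify the exact selection criteria, so this only illustrates the general procedure:

```python
# Sketch: backward elimination over a candidate predictor set using
# statsmodels' formula interface (hypothetical column names throughout).
import statsmodels.formula.api as smf

def backward_select(df, outcome, predictors, alpha=0.05):
    """Iteratively drop the least significant predictor until all p < alpha."""
    kept = list(predictors)
    while kept:
        formula = f"{outcome} ~ " + " + ".join(kept)
        fit = smf.logit(formula, data=df).fit(disp=False)
        pvals = fit.pvalues.drop("Intercept")
        worst = pvals.idxmax()            # least significant remaining term
        if pvals[worst] <= alpha:
            return fit                    # all remaining terms are significant
        kept.remove(worst)
    return None                           # no predictor survived elimination

# Hypothetical usage:
# fit = backward_select(df, "reutilized",
#                       ["male", "fever", "angioedema", "joint_involvement",
#                        "gi_symptoms", "age_under_2", "return_precautions"])
# print(fit.summary())
```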
{ "pmid": 35872219, "language": "eng" }
Brain-derived neurotrophic factor expression in serotonergic neurons improves stress resilience and promotes adult hippocampal neurogenesis. The neurotrophin brain-derived neurotrophic factor (BDNF) stimulates adult neurogenesis and also influences the structural plasticity and function of serotonergic neurons. Both BDNF/TrkB signaling and the serotonergic system modulate behavioral responses to stress and can lead to pathological states when dysregulated. The two systems have been shown to mediate the therapeutic effect of antidepressant drugs and to regulate hippocampal neurogenesis. To elucidate the interplay of the two systems at the cellular and behavioral levels, we generated a transgenic mouse line that overexpresses BDNF in serotonergic neurons in an inducible manner. Besides displaying enhanced hippocampus-dependent contextual learning, transgenic mice were less affected by chronic social defeat stress (CSDS) than wild-type animals. In parallel, we observed enhanced serotonergic axonal sprouting in the dentate gyrus and increased neural stem/progenitor cell proliferation, which was uniformly distributed along the dorsoventral axis of the hippocampus. In the forced swim test, BDNF-overexpressing mice behaved similarly to wild-type mice treated with the antidepressant fluoxetine. Our data suggest that BDNF released from serotonergic projections exerts this effect partly by enhancing adult neurogenesis. Furthermore, independently of genotype, enhanced neurogenesis correlated positively with social interaction time after CSDS, a measure of stress resilience.
{ "pmid": 35872220, "language": "eng" }
Mice in translational neuroscience: What R we doing? Animal models play a pivotal role in translational neuroscience, but recurrent problems in data collection, analysis, and interpretation, a lack of biomarkers, and a tendency to over-rely on mice have marred progress in neuroscience, contributing to one of the highest attrition rates in drug translation. Global initiatives to improve reproducibility and model selection are being implemented. Notwithstanding, mice are still the preferred animal species for modeling human brain disorders, even where translation has been shown to be limited. Non-human primates are better positioned to provide relevant translational information because of their higher brain complexity and homology to humans. Among other factors, a lack of resources and formal training, strict legislation, and ethical issues may impede broad access to large animals. We propose that instead of increasingly restrictive legislation, more resources for training, education, husbandry, and data sharing are urgently needed. The creation of multidisciplinary teams, in which veterinarians must play a key role, would be critical to improving translational efficiency. Furthermore, researchers and regulators do not usually acknowledge the value of comparative studies in lower species, which are instrumental in toxicology, target identification, and mechanistic studies. Overall, we highlight here the need for a conceptual shift in neuroscience research and policy to reach patients.
{ "pmid": 35872217, "language": "eng" }
Electronic Cigarettes: A Pro-Con Review of the Current Literature. Electronic cigarettes (e-cigarettes, e-cigs, or electronic nicotine delivery systems) are battery-operated devices that typically contain glycerol- and/or propylene glycol-based solutions with varying nicotine content, known as e-liquids. Although e-cigarettes were originally developed as a potentially less harmful alternative for smokers of traditional combustible tobacco cigarettes, several factors have driven their popularity among smokers and nonsmokers alike, including sleek product designs, innumerable appealing flavors, the lack of combustible smoke and odor, and high achievable nicotine concentrations. Furthermore, many advocates have promoted the idea that e-cigarettes are safe to use, or at least safer than conventional tobacco, despite limited longitudinal data to support these claims. Here, we examine what is known about the impact of e-cigarette use on traditional cigarette smoking cessation, lung health, and youth and young adult tobacco product exposure. Upon review of the currently available literature, the negative effects of e-cigarette use appear to outweigh any potential benefit: the available evidence does not confirm that e-cigarettes are an effective strategy for supporting traditional combustible tobacco cigarette smoking cessation, particularly given the emerging adverse effects on lung health and the potential future public health effects of e-cigarette adoption among a burgeoning new generation of tobacco product users.
{ "pmid": 35872222, "language": "eng" }
The potential 'blue light hazard' from LED headlamps. Many dental personnel use light-emitting diode (LED) headlamps for hours every day. The potential retinal 'blue light hazard' from these white-light headlamps is unknown. The spectral radiant powers received from direct and indirect viewing of an electronic tablet, an LED curing light, a halogen headlamp, and 6 brands of LED headlamps were measured using integrating spheres attached to fiberoptic spectroradiometers. The spectral radiant powers were measured both directly and indirectly at a distance of 35 cm, and the maximum daily exposure times (tMAX) were calculated from the blue-weighted irradiance values. The headlamps emitted very different radiant powers, emission spectra, and color temperatures. The total power emitted at zero distance ranged from 47 mW for the halogen headlamp to 378 mW for the most powerful LED headlamp. The color temperatures of the headlamps ranged from 3098 K to 7253 K. The tMAX exposure times in an 8 h day when the headlamps were viewed directly at a distance of 35 cm were 810 s for the halogen headlamp, 53 to 220 s for the LED headlamps, and 62 s for the LED curing light. Light from the LED headlamps that was reflected back from a white reference tile 35 cm away did not exceed the maximum permissible exposure limits for healthy adults. Using a blue dental dam increased the amount of reflected blue light, but tMAX remained greater than 24 h. White-light LED headlamps emit very different spectra, and they all increase the retinal 'blue light hazard' compared with a halogen source. When the headlamps were viewed directly at a distance of 35 cm, the 'blue light hazard' from some headlamps was greater than that from the LED curing light (tMAX = 62 s); depending on the brand, tMAX could be reached after only 53 s. Because reflected light from a white surface 35 cm away did not exceed the maximum permissible ocular exposure limits, reflected white light from dental headlamps does not pose a blue light hazard for healthy adults. Direct viewing may be hazardous, but the hazard can be prevented by wearing appropriate blue-light-blocking glasses.
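A minimal sketch of the blue-light-hazard arithmetic implied above, assuming the ICNIRP small-source dose limit of 100 J/m^2 for blue-weighted irradiance; the spectrum and the B(lambda) weighting points below are illustrative placeholders, not the paper's measured data (tabulated B(lambda) values should be used in practice):

```python
# Sketch: tMAX from a blue-light-hazard-weighted irradiance, E_B.
import numpy as np

wavelengths_nm = np.arange(380, 781, 5.0)

# Illustrative spectral irradiance at the cornea, W m^-2 nm^-1 (placeholder shape)
spectral_irradiance = np.interp(wavelengths_nm, [380, 450, 550, 780],
                                [0.0, 0.004, 0.002, 0.0])

# Coarse stand-in for the blue-light hazard function B(lambda), which peaks
# near 435-440 nm
b_lambda = np.interp(wavelengths_nm, [380, 435, 500, 600, 780],
                     [0.01, 1.0, 0.1, 0.001, 0.0])

# Blue-weighted irradiance: E_B = integral of E(lambda) * B(lambda) dlambda
e_b = np.trapz(spectral_irradiance * b_lambda, wavelengths_nm)

# Maximum daily exposure time under the assumed 100 J/m^2 dose limit
t_max_s = 100.0 / e_b
print(f"E_B = {e_b:.4f} W/m^2, tMAX = {t_max_s:.0f} s")
```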
{ "pmid": 35872221, "language": "eng" }
Astrocyte energy and neurotransmitter metabolism in Alzheimer's disease: Integration of the glutamate/GABA-glutamine cycle. Astrocytes contribute to the complex cellular pathology of Alzheimer's disease (AD). Neurons and astrocytes function in close collaboration through neurotransmitter recycling, collectively known as the glutamate/GABA-glutamine cycle, which is essential to sustain neurotransmission. Neurotransmitter recycling is intimately linked to astrocyte energy metabolism. In the course of AD, astrocytes undergo extensive metabolic remodeling, which may profoundly affect the glutamate/GABA-glutamine cycle. The consequences of altered astrocyte function and metabolism for neurotransmitter recycling remain to be fully understood. Metabolic alterations of astrocytes in AD deprive neurons of metabolic support, thereby contributing to synaptic dysfunction and neurodegeneration. In addition, several astrocyte-specific components of the glutamate/GABA-glutamine cycle, including glutamine synthesis and synaptic neurotransmitter uptake, are perturbed in AD. Integrating the complex biology of astrocytes into the context of AD is essential for understanding the fundamental mechanisms of the disease, and restoring astrocyte metabolism may serve as an approach to arrest or even revert the clinical progression of AD.