Are Pregnancy Intentions Associated with Transitions Into and Out of Marriage?
In addition to having associations with health outcomes, pregnancy intentions may be associated with social outcomes, including marital transitions. Linked data from the 2004-2008 Oklahoma Pregnancy Risk Assessment Monitoring System and The Oklahoma Toddler Survey for 2006-2010 on 3,617 women who were married and 2,123 who were unmarried at conception were used to examine the relationship between pregnancy intention status (intended, mistimed by less than two years, mistimed by two or more years, or unwanted) and marital formation or dissolution by the time of the birth and two years later. Logistic regression analyses were conducted, and propensity score methods were used to adjust for confounding characteristics. Intention status was associated with marital transition two years after the birth, but not between conception and birth. In adjusted models, among women married at conception, those with a birth resulting from an unwanted pregnancy were more likely than those with a birth resulting from an intended pregnancy to transition out of marriage by the time their child was two years old (odds ratio, 2.2). Among women unmarried at conception, those with a birth following an unwanted pregnancy were less likely than those with a birth following an intended pregnancy to marry by the time their child was two (0.5). Births following mistimed pregnancies were not associated with marital transition.
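The reported associations (odds ratio 2.2 for dissolution, 0.5 for marriage) are odds ratios from adjusted logistic models; the underlying arithmetic of an unadjusted odds ratio can be sketched from a 2×2 table. The counts below are hypothetical, not taken from the study:

```python
def odds_ratio(exposed_event, exposed_no_event, unexposed_event, unexposed_no_event):
    """Unadjusted odds ratio from a 2x2 contingency table:
    (a/b) / (c/d) = (a*d) / (b*c)."""
    return (exposed_event * unexposed_no_event) / (exposed_no_event * unexposed_event)

# Hypothetical counts: 20 of 100 unwanted-pregnancy births vs. 10 of 100
# intended-pregnancy births ending in marital dissolution by age two
print(round(odds_ratio(20, 80, 10, 90), 2))  # 2.25
```

The study's reported estimates additionally adjust for confounders via propensity-score methods, which a raw 2×2 table cannot capture.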
The findings should motivate researchers to broaden the scope of research on the consequences of unintended childbearing. Future research should distinguish between mistimed and unwanted births.
closed_qa
Do Treatment Policies for Proximal Humerus Fractures Differ among Three Nordic Countries and Estonia?
Proximal humerus fractures are common fragility injuries. The incidence of these fractures has been estimated at 82-105 per 100,000 person-years. Treatment of this fracture, especially in the elderly, is controversial. Our study group published a systematic review of the available literature and concluded that non-operative methods are favored over operative methods in three- and four-part fractures. The aim of this multinational study was to compare treatment policies for proximal humerus fractures among the Nordic countries and Estonia. The study was conducted as a questionnaire-based survey, using the Internet-based program Webropol(®) (webropol.com). The questionnaire link was sent to the surgeons responsible for treating proximal humerus fractures in major public hospitals in Estonia, Finland, Norway, and Sweden. The questionnaire included questions regarding the responder's hospital, patient characteristics, and examinations performed before decision making. The clinical part included eight example patient cases with treatment options. Of the 77 recipients of the questionnaire, 59 responded; consequently, the response rate was 77%. Based on the eight presented displaced fracture examples, 41% in both Estonia and Norway and 38% in Finland preferred surgical treatment with a locking plate. In Sweden, the percentage was 28%. The pre- and post-operative protocols were similar in all participant countries.
Our survey revealed a remarkable uniformity in the current practice of operative treatments and rehabilitation for proximal humerus fractures in the participant countries.
closed_qa
Can We Prevent a Postoperative Spinal Epidural Hematoma by Using Larger Diameter Suction Drains?
Epidural hematoma is a rare but serious complication. According to previous studies, it is not prevented by suction drains. This study evaluated the following alternative hypothesis: the larger the diameter of a suction drain, the smaller the remaining epidural hematoma after spinal surgery. This was a randomized prospective study. Patients who underwent posterior lumbar decompression and instrumented fusion were divided into two groups according to the diameter of the suction drains: the large drain (LD, 2.8-mm-diameter tube) and small drain (SD, 1.6-mm-diameter tube) groups. All patients were consecutive and were allocated alternately according to the date of operation. Suction drains were removed on day 3 and magnetic resonance imaging was performed on day 7 postoperatively. The size of remaining hematomas was measured by the degree of thecal sac compression in cross section using the following 4-point numeric scale: G1, less than one quarter; G2, between one quarter and half; G3, more than half; and G4, subtotal obstruction or more. There were 39 patients with LDs and 38 with SDs. The groups did not differ significantly in terms of sex, number of fusion segments, revision status, antiplatelet medication, or intraoperative injection of tranexamic acid. However, patient age differed significantly between the two groups (LD, 63.3 years vs. SD, 68.6 years; p = 0.007). The two groups did not differ significantly in terms of prothrombin time, activated partial thromboplastin time, platelet number, blood loss, or operation duration. However, platelet function analysis exhibited a significant difference (LD, 164.7 seconds vs. SD, 222.3 seconds; p = 0.002). The two blinded readers showed high consistency (kappa = 0.740; p < 0.001). The results of reader 1 were as follows: LD and SD had 21 and 21 cases of G1, 9 and 11 cases of G2, 6 and 6 cases of G3, and 3 and 0 cases of G4, respectively.
The results of reader 2 were as follows: LD and SD had 22 and 23 cases of G1, 7 and 9 cases of G2, 7 and 6 cases of G3, and 3 and 0 cases of G4, respectively. There was no difference between the two groups (reader 1, p = 0.636; reader 2, p = 0.466).
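The inter-reader consistency quoted above is a Cohen's kappa (0.740), i.e. agreement corrected for chance. A minimal sketch of the statistic, with hypothetical gradings rather than the study's data:

```python
def cohens_kappa(ratings1, ratings2):
    """Cohen's kappa: chance-corrected agreement between two raters."""
    assert len(ratings1) == len(ratings2)
    n = len(ratings1)
    categories = set(ratings1) | set(ratings2)
    # Observed agreement: fraction of items where the raters match
    p_observed = sum(a == b for a, b in zip(ratings1, ratings2)) / n
    # Expected agreement under independence of the two raters
    p_expected = sum(
        (ratings1.count(c) / n) * (ratings2.count(c) / n) for c in categories
    )
    return (p_observed - p_expected) / (1 - p_expected)

r1 = ["G1", "G1", "G2", "G2"]  # hypothetical gradings, reader 1
r2 = ["G1", "G1", "G2", "G1"]  # hypothetical gradings, reader 2
print(round(cohens_kappa(r1, r2), 2))  # 0.5
```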
The alternative hypothesis was rejected. Therefore, a postoperative spinal epidural hematoma would not be prevented by using a larger-diameter drain.
closed_qa
Child Passenger Safety Training for Pediatric Interns: Does it Work?
To evaluate the efficacy of a child passenger safety (CPS) educational intervention on the CPS-related knowledge, attitudes, and anticipatory guidance behaviors of pediatric interns. All subjects were surveyed at baseline and at 6 months. Intervention interns attended a CPS training module, which included viewing an educational video, observing a car seat inspection appointment, hands-on practice, and completion of a post-intervention survey. All 16 intervention interns completed the initial survey, the intervention, and the immediate post-intervention questionnaire. Thirteen (81%) completed the 6-month follow-up. The baseline survey was completed by 27/40 (67%) of control interns; 28/40 (70%) submitted a follow-up. The proportion of intervention interns who self-reported giving CPS guidance at all well-child visits increased by 31.3% (95% CI 6.1-56.5%); the control group had no change. Similar results were seen with self-reported knowledge and attitudes.
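The 31.3% change with its 95% CI of 6.1-56.5% is a difference in proportions. A normal-approximation (Wald) interval for two independent proportions is sketched below; the counts are hypothetical, and the study's small paired sample would usually call for a paired method, so this is illustrative only:

```python
import math

def diff_proportions_ci(x1, n1, x2, n2, z=1.96):
    """Wald 95% CI for the difference between two independent proportions."""
    p1, p2 = x1 / n1, x2 / n2
    diff = p2 - p1
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return diff, diff - z * se, diff + z * se

# Hypothetical: 5/16 interns giving guidance at baseline vs. 10/16 at follow-up
diff, lo, hi = diff_proportions_ci(5, 16, 10, 16)
```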
A CPS training module increases pediatric interns' knowledge and improves their attitudes and self-reported behaviors regarding CPS-related anticipatory guidance.
closed_qa
Do younger women with elevated basal follicular stimulating hormone levels undergoing gonadotropin-stimulated intrauterine insemination cycles represent compromised reproductive outcomes?
To compare stimulation characteristics and reproductive outcomes between women presenting with elevated and normal day 3 FSH levels, and to evaluate the prognostic significance of day 3 FSH on the reproductive outcomes of gonadotropin-stimulated IUI (GS-IUI) cycles in women <35 years. A cross-sectional study was designed. Patients with unexplained infertility, aged ≤36 years, who underwent IUI following gonadotropin stimulation (GS) were investigated. From 105 women with a day 3 FSH ≥10 U/L, 170 GS-IUI cycles were assigned to Group EF, whereas a control group (Group NF, normal FSH) comprised 170 cycles with day 3 FSH levels <10 U/L. Demographic and stimulation characteristics as well as reproductive outcomes were compared. The primary outcome measures of this study were the biochemical, clinical, and ongoing pregnancy rates. Secondary outcome measures were total gonadotropin dose, duration of gonadotropin stimulation, and multiple pregnancy, miscarriage, and cycle cancellation rates. β-hCG positivity and clinical and ongoing pregnancy rates did not differ between women with normal and elevated FSH levels (p = 0.234, 0.282, and 0.388, respectively). Total gonadotropin dose, multiple pregnancy rate, and miscarriage rate were not significantly different between the groups (p = 0.181, 0.652, and 0.415, respectively). Duration of stimulation was significantly longer and the cycle cancellation rate significantly higher in Group EF than in Group NF (p = 0.005 and 0.021, respectively).
Younger women with elevated day 3 FSH have reproductive outcomes in GS-IUI cycles comparable to those of women with normal FSH levels, although they may require longer stimulation and are at higher risk of cycle cancellation. Thus, GS-IUI is a viable treatment option in this patient group and should not be neglected.
closed_qa
Can Staining of Damaged Proteins in Urine Effectively Predict Preeclampsia?
To assess the Congo red urine test in the first trimester for preeclampsia (PE) prediction. A Congo red test was developed with a cohort of 81 pregnant women at Bnai Zion hospital, Israel, at 26-41 weeks of gestation (12 PE cases). The test was then applied to a first-trimester cohort of 642 women at King's College Hospital, UK (105 subsequently developed PE, 21 early, i.e., <34 weeks; 537 controls). Urine samples were spotted onto nitrocellulose membranes, stained with Congo red, de-stained, dried, and quantified with an imager and densitometry. At the time of PE signs and symptoms, the detection rate (DR) was 93% and the false-positive rate (FPR) 4%. With first-trimester urine samples, however, the DR was 33.3%, 16.1%, and 20% for early, late, and all PE cases, respectively, at a 12.8% FPR. The odds ratio (OR) for PE by Congo red alone (including adjusted OR) was superior to body mass index and mean arterial blood pressure (MAP) but inferior to previous PE and black ethnicity. Combining all five parameters generated an adjusted OR of 13.92 for PE (p < 0.001).
The Congo red urine test verifies the disorder at the time of PE presentation. In the first trimester, it adds accuracy to PE prediction in obese black women with previous PE and above-average MAP.
closed_qa
Preeclampsia and superimposed preeclampsia: The same disease?
We aimed to compare sFlt-1 and placental growth factor (PlGF) levels and the sFlt-1/PlGF ratio in women with preeclampsia and superimposed preeclampsia against normotensive and chronic hypertensive women, respectively. We performed a prospective two-armed cohort study in a tertiary teaching hospital in Sao Paulo, Brazil, including 37 normotensive and 60 chronic hypertensive pregnant women. We assessed the serum levels of sFlt-1 and PlGF at 20, 26, 32, and 36 gestational weeks by enzyme-linked immunosorbent assay. The main outcomes were the development of preeclampsia and superimposed preeclampsia. Among normotensive and chronic hypertensive pregnancies, 4 (10.8%) and 14 (23.3%) women developed preeclampsia and superimposed preeclampsia, respectively. Compared with those who remained normotensive, the women with preeclampsia presented higher sFlt-1 levels at 32 gestational weeks (4323.45 pg/mL vs. 2242.04 pg/mL, p = 0.019); lower PlGF levels at 20 (183.54 pg/mL vs. 337.38 pg/mL, p = 0.034), 32 (169.69 pg/mL vs. 792.53 pg/mL, p = 0.001), and 36 gestational weeks (252.99 pg/mL vs. 561.81 pg/mL, p = 0.029); and higher sFlt-1/PlGF ratios at 26 (9.02 vs. 1.84, p = 0.004), 32 (23.61 vs. 2.55, p = 0.001), and 36 gestational weeks (49.02 vs. 7.34, p = 0.029). By contrast, compared with those who remained chronic hypertensive, the women with superimposed preeclampsia presented only a higher sFlt-1/PlGF ratio at 32 gestational weeks (9.98 vs. 2.51, p = 0.039).
Although angiogenic imbalance is clearly related to preeclampsia, it seems to play a more modest role in superimposed preeclampsia, in which other mechanisms should also be investigated.
closed_qa
Does Sitagliptin Affect the Rate of Osteoporotic Fractures in Type 2 Diabetes?
Type 2 diabetes and osteoporosis are both common and chronic and increase with age, and type 2 diabetes is also a risk factor for major osteoporotic fractures (MOFs). However, different treatments for type 2 diabetes can affect fracture risk differently, with meta-analyses showing that some agents increase risk (eg, thiazolidinediones) and some reduce risk (eg, sitagliptin). To determine the independent association between new use of sitagliptin and MOF in a large population-based cohort study. A sitagliptin new-user study design employing a nationally representative United States claims database of 72 738 insured patients with type 2 diabetes. We used 90-day time-varying sitagliptin exposure windows and controlled confounding by using multivariable analyses that adjusted for clinical data, comorbidities, and time-updated propensity scores. We compared the incidence of MOF (hip, clinical spine, proximal humerus, distal radius) in new users of sitagliptin vs nonusers over a median 2.2 years of follow-up. At baseline, the median age was 52 years, 54% were men, and median A1c was 7.5%. There were 8894 new users of sitagliptin and 63 834 nonusers with a total of 181 139 person-years of follow-up. There were 741 MOF (79 hip fractures), with 53 fractures (4.8 per 1000 person-years) among new users of sitagliptin vs 688 fractures (4.0 per 1000 person-years) among nonusers (P = .3 for difference). In multivariable analyses, sitagliptin was not associated with fracture (adjusted hazard ratio 1.1, 95% confidence interval 0.8-1.4; P = .7), whereas insulin (P<.001), sulfonylureas (P<.008), and thiazolidinediones (P = .019) were each independently associated with increased fracture risk.
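The incidence figures quoted above are events per 1000 person-years, a simple ratio. The event counts below are from the abstract, but the per-group person-year denominators are illustrative back-calculations, since the abstract reports only the combined 181 139 person-years:

```python
def rate_per_1000_py(events, person_years):
    """Incidence rate expressed per 1000 person-years of follow-up."""
    return 1000 * events / person_years

# 53 MOF among sitagliptin new users, 688 among nonusers; denominators
# below are illustrative, chosen to reproduce the reported rates
print(round(rate_per_1000_py(53, 11000), 1))    # 4.8
print(round(rate_per_1000_py(688, 172000), 1))  # 4.0
```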
Even in a young population with type 2 diabetes, osteoporotic fractures were not uncommon. New use of sitagliptin was not associated with fracture, but other commonly used second-line agents for type 2 diabetes were associated with increased risk. These data should be considered when making treatment decisions for those with type 2 diabetes at particularly high risk of fractures.
closed_qa
"Can't We Just Have Some Sazón?"
In September 2013, a Massachusetts high school launched a nutrition program in line with 2013 United States Department of Agriculture requirements. We sought to understand stakeholders' attitudes toward the new program. We employed community-based participatory research methods in a qualitative evaluation of the food program at the school, where 98% of students are students of color and 86% qualify for free/reduced-price lunch. We conducted 4 student focus groups (N = 32), 2 parent focus groups (N = 10), 1 faculty/staff focus group (N = 14), and interviews with school leadership (N = 3). A total of 10 themes emerged from the focus groups and interviews, in 3 categories: impressions of the food (insufficient portion size, dislike of the taste, appreciation of the freshness, increased unhealthy food consumption outside school), impact on learning (learning what's healthy, the program's innovativeness, control versus choice), and concerns about stakeholder engagement (lack of student/family engagement, culturally incompatible foods). A representative comment was: "You need something to hold them from 9 to 5, because if they are hungry, McDonald's is right there."
Stakeholders appreciated the educational value of the program, but stakeholder dissatisfaction may jeopardize its success. Action steps could include incorporating culturally appropriate recipes into the school's menus and working with local restaurants to promote healthier offerings.
closed_qa
Anxiety and Depressive Disorders in Functional Dyspepsia: Cause or Consequence?
The aim of this study was to evaluate the frequency and importance of anxiety and depression in patients with functional dyspepsia (FD) and the relationship between these psychological characteristics, symptom severity, and quality of life. We performed a cross-sectional study. 125 patients with FD according to the Rome III criteria, as well as a control group of 30 healthy volunteers, were investigated. All study participants completed the HADS scale to identify anxiety-depressive disorders and an overall assessment of quality of life using the SF-8 questionnaire (standard 4-week form). FD patients were asked to rate the severity of epigastric pain (burning) or abdominal discomfort (early satiation or postprandial fullness) with the LPDS (Leuven postprandial distress scale). All statistical analyses were performed in the Medstat program. Results with p<0.05 and 95% CI were considered statistically significant. Anxiety and depression were observed in 50.4% and 42.4% of FD patients, respectively, and in 13.3% and 6.66% of healthy subjects, respectively (p<0.001 for both). The mean HADS scores for anxiety and depression in FD patients were 7.93 ± 3.75 and 6.94 ± 3.78, respectively. Both anxiety and depression were associated with self-reported symptom severity (LPDS) (p<0.05). Correlation analysis determined that symptom severity was the most important factor in the prediction of anxiety and depression. Self-reported symptom severity, anxiety, and depression were clearly and independently associated with overall health-related quality of life (HRQOL).
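The HADS instrument referenced above comprises two 7-item subscales (anxiety and depression), each item scored 0-3; a subscale score of 8 or more is a commonly used cutoff for a possible disorder. A minimal scoring sketch under those assumptions:

```python
def hads_subscale(items, cutoff=8):
    """Sum a 7-item HADS subscale (each item 0-3) and flag possible cases.
    The cutoff of 8 is a commonly used convention, assumed here."""
    assert len(items) == 7 and all(0 <= i <= 3 for i in items)
    score = sum(items)
    return score, score >= cutoff

score, possible_case = hads_subscale([2, 1, 2, 1, 1, 2, 1])
print(score, possible_case)  # 10 True
```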
The biopsychosocial model of FD accounts for the complexity of the pathogenesis of this disease. Anxiety and depression were frequently observed in FD patients and were related to the severity of their symptoms and the impairment of the patients' HRQOL. Our data suggest that assessing anxiety and depression is important when evaluating FD patients.
closed_qa
Should we reframe how we think about physical activity and sedentary behaviour measurement?
The measurement of physical activity (PA) and sedentary behaviour (SB) is fundamental to health-related research, policy, and practice, but there are well-known challenges to these measurements. Within the academic literature, the terms "validity" and "reliability" are frequently used when discussing PA and SB measurement to reassure the reader that they can trust the evidence. In this paper we argue that a lack of consensus about the best way to define, assess, or utilize the concepts of validity and reliability has led to inconsistencies and confusion within the PA and SB evidence base. Where possible, we propose theoretical examples and solutions. Moreover, we present an overarching framework (the Edinburgh Framework), which we believe will provide a process or pathway to help researchers and practitioners consider validity and reliability in a standardized way.
Further work is required to identify all necessary and available solutions and generate consensus in our field to develop the Edinburgh Framework into a useful practical resource. We envisage that ultimately the proposed framework will benefit research, practice, policy, and teaching. We welcome critique, rebuttal, comment, and discussion on all ideas presented.
closed_qa
Measuring self-reported quality of life in 8- to 11-year-old children born with gastroschisis: Is the KIDSCREEN questionnaire acceptable?
Children born with gastroschisis have a good prognosis but require surgical correction and long-term follow-up. There has been little research on the impact of gastroschisis on children's health-related quality of life (QoL). The aim was to assess the face and content validity of the KIDSCREEN-52 questionnaire as a measure of self- and proxy-reported QoL in children born with gastroschisis and to evaluate self-reported QoL in these children compared with the reference population. In this cross-sectional exploratory study, we used the validated KIDSCREEN-52 questionnaire and individual interviews with 8- to 11-year-old children born with gastroschisis who were identified from the Northern Congenital Abnormality Survey. Self-reported QoL scores were compared with age-matched UK norms by using the two-sample t test. Ten children (median age 9.6 years, interquartile range 8.3-11.0) and their parents participated. Children found KIDSCREEN a helpful tool for exploring their feelings and said it covered the aspects of life important to them. Parents believed that all priority areas were represented and that it was straightforward for their children to complete. In nine KIDSCREEN domains, children with gastroschisis had QoL scores similar to those of the reference population, and in one (psychological well-being) the mean score was significantly better (p = 0.03). All children described their health as good/very good or excellent; eight said they would not like to change anything about their body.
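The norm comparison above used a two-sample t test; a minimal pooled-variance version is sketched below with arbitrary toy data (the study compared against published age-matched norms, so this is illustrative only):

```python
import math
from statistics import mean, variance

def two_sample_t(a, b):
    """Pooled-variance two-sample t statistic for two independent samples."""
    na, nb = len(a), len(b)
    # Pooled estimate of the common variance
    sp2 = ((na - 1) * variance(a) + (nb - 1) * variance(b)) / (na + nb - 2)
    return (mean(a) - mean(b)) / math.sqrt(sp2 * (1 / na + 1 / nb))

t = two_sample_t([1, 2, 3, 4], [2, 3, 4, 5])  # toy data, not study scores
```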
The KIDSCREEN questionnaire has adequate face and content validity as a measure of QoL in children with gastroschisis and is acceptable to both children and parents.
closed_qa
Integrated health service delivery networks and tuberculosis avoidable hospitalizations: is there a relation between them in Brazil?
The early identification of people with respiratory symptoms within the scope of Primary Health Care is recommended and is one of the strategies of national health authorities for achieving the elimination of tuberculosis. The purpose of this study is to identify which attributes and which territories have shown the most significant progress in Primary Health Care, in terms of coordination of Health Care Networks, and to check whether the areas of Primary Health Care that are most critical regarding coordination had more or fewer cases of avoidable hospitalizations for tuberculosis. This is an ecological study that uses primary and secondary data. For the analysis, choropleth maps were developed with the ArcGIS software, version 10.2. Crude annual and Bayesian rates of hospitalization for tuberculosis were also calculated for each Primary Health Care territory. There were satisfactory results for attributes such as Population (n = 37; 80.4%), Primary Health Care (n = 43; 93.5%), and Support System (n = 45; 97.8%); the exceptions were the Logistics System (n = 32; 76.0%) and the Governance System, which had fewer units in good condition (n = 31; 67.3%). There was no evidence of any connection between networks' coordination by Primary Health Care and avoidable tuberculosis admissions.
The results show that progress has been made regarding the coordination of the Health Care Networks, with a positive trend, even though the levels are not excellent. No relationship was found between the critical areas of Primary Health Care and avoidable tuberculosis hospitalizations, possibly because other variables are needed to understand the phenomenon.
closed_qa
Is Waist-to-Height Ratio a Better Obesity Risk-Factor Indicator for Puerto Rican Children than is BMI or Waist Circumference?
Puerto Rican children may have a higher prevalence of obesity than US children generally or even US Hispanic children. Obese youths are more likely to have risk factors for cardiovascular conditions, such as hypertension. Although BMI provides a simple, convenient measurement of obesity, it does not measure body fat distribution, which is associated with mortality and morbidity. Waist circumference (WC) and waist-to-height ratio (WHtR) have been suggested for estimating obesity health risks. This study aimed to explore the association of a single blood pressure reading with 3 different obesity indicators (WC, BMI, and WHtR). A representative sample of students (first to sixth grade) from public and private schools in Puerto Rico was selected. The sample consisted of 249 students, representing a 63% response rate. According to sex-specific BMI, approximately 38.1% of the children were obese or overweight. The prevalence of obesity was slightly higher when determined using WHtR but lower when using WC as the overweight indicator. The prevalence of high blood pressure among students was 12.5%; an additional 11.3% of the students were classified as possibly prehypertensive. Regardless of the weight indicator used, overweight children were shown to have a higher risk of pre-hypertension/hypertension (as defined by a single BP measure) than non-overweight children. The odds for high blood pressure were almost 3 times higher using WHtR. Logistic regression showed a stronger relationship between WHtR and the risk of pre-hypertension/hypertension than between either BMI or WC and that risk.
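The WHtR used above is simply waist circumference divided by height; 0.5 is the widely cited screening threshold (assumed here, since the study's own cutoffs are not given in the abstract):

```python
def waist_to_height_ratio(waist_cm, height_cm):
    """WHtR: waist circumference divided by height (same units)."""
    return waist_cm / height_cm

def whtr_flag(waist_cm, height_cm, threshold=0.5):
    """Flag elevated central adiposity; the 0.5 cutoff is a common convention."""
    return waist_to_height_ratio(waist_cm, height_cm) > threshold

print(waist_to_height_ratio(60, 120))  # 0.5
print(whtr_flag(70, 130))              # True
```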
This study suggests a higher prevalence of high blood pressure in obese Puerto Rican children. The waist-to-height ratio could be the best indicator for measuring obesity and potential hypertension in Puerto Rican children.
closed_qa
Neuroendocrine tumors (NETs) of unknown primary: is early surgical exploration and aggressive debulking justifiable?
Neuroendocrine tumors (NETs) are rare tumors that often present with vague symptoms. Identification and localization of the primary NET can be challenging and the true incidence remains unclear. These patients have been thought to have a poor prognosis compared with patients with a known primary. Therefore, the treatment of patients with unknown primaries has traditionally been passive and directed toward symptom control and/or cytoreduction of metastatic disease. We hypothesized that NETs of unknown primary are predominantly low-grade and easily located surgically, and are therefore amenable to surgical debulking and cytoreduction, which will likely increase survival in these patients. The charts for all 342 surgical patients seen in our clinic at Ochsner-Kenner between 1/2009 and 9/2012 were retrospectively reviewed to determine which patients had a pre-operative diagnosis of a "NET with unknown primary." Twenty-two patients (6.4%) were identified. For these patients, the rate of successful surgical exploration in which a primary site was identified was recorded. Survival for these "unknown primary" patients was compared with that of a large, similar group of NET patients from a recent study collected from the same Ochsner clinic group. Twenty-two (22/342, 6.4%) NET patients with a pre-operative diagnosis of an unknown primary were explored and cytoreduced. The primary tumor site was identified in all 22 patients (100%). The primary sites identified were 19 small intestinal (86.4%) and 3 pancreatic (13.6%). All 22 patients had low-grade tumors and all were still alive as of 9/2012, not allowing a survival curve to be generated.
Unknown primary NETs are not associated with a poor prognosis as previously reported. Timely surgical exploration and debulking always resulted in identification of the primary and a maximum cytoreduction. Early surgical exploration with aggressive debulking is indicated for the treatment of these patients, as it is for their known-primary counterparts.
closed_qa
Gender influence on clinical presentation and high-resolution ultrasound findings in primary carpal tunnel syndrome: do women only differ in incidence?
High-resolution ultrasound is increasingly used in the diagnosis of carpal tunnel syndrome (CTS), yet little is known about gender differences in clinical presentation and ultrasound findings. In this high-resolution ultrasound-based retrospective study of 170 cases, we assessed the influence of gender in CTS in terms of the severity of neural alterations, measured by the wrist-to-forearm ratio (WFR), epineural thickening, and loss of fascicular anatomy, as well as classical signs and symptoms. The control group consisted of 42 wrists. Women present with a greater WFR at first admission, are affected more often bilaterally, and report less subjective pain intensity, while men report fewer nightly pain episodes at higher WFR. Loss of fascicular anatomy is three times more frequent in women. An increase in epineural thickness, loss of fascicular anatomy, and involvement of more than 1.5 fingers correlate significantly with WFR regardless of sex.
Women differ significantly from men in terms of clinical presentation and ultrasound findings upon first diagnosis of CTS, which should be included in further diagnostic considerations.
closed_qa
Handedness, sexual orientation, and somatic markers for prenatal androgens: Are southpaws really that gay?
Some evidence suggests that prenatal androgens influence both handedness and sexual orientation. This study sought to clarify how androgens, handedness, and sexual orientation are interrelated. Data were obtained from large samples of students enrolled at universities in Malaysia and the US, including self-reported information on handedness, sexual orientation, and five somatic markers of prenatal androgen exposure (2D:4D, height, strength, muscularity, and athletic ability). Factor analysis of these somatic markers yielded two factors: a muscular coordination factor and a bone growth factor. In women, but not in men, ambidextrousness was more prevalent among those with homosexual tendencies. Modest and often complex associations were found between the androgen factors and handedness. Clear links between the androgen factors and sexual orientation were found, especially for muscular coordination. For both males and females, intermediate sex-typical androgen exposure was associated with heterosexual preferences.
Ambidextrousness appears to be somewhat more common among females with homosexual tendencies, but left-handedness is nearly as strongly associated with heterosexual preferences, particularly in males, as is right-handedness. Factors indicative of prenatal androgen exposure are associated with sexual orientation in theoretically predictable ways, especially for muscular coordination, but associations between prenatal androgens and handedness are complex.
closed_qa
Sinus Computed Tomography Imaging in Pediatric Cystic Fibrosis: Added Value?
To evaluate the prevalence of computed tomography (CT) sinus imaging in a pediatric cystic fibrosis (CF) population, determine changes in Lund Mackay (LM) scores over time, and estimate radiation exposure. Case series with chart review. Tertiary care children's hospital. In total, 202 pediatric patients with CF who underwent endoscopic sinus surgery (ESS) were included. The total number of CT scans was calculated for each patient, with specific focus on the indications for and subsequent outcomes of the sinus CT scan subgroup. Patients underwent a total of 1718 CT scans, 832 of which were sinus CT scans (mean of 4.2 sinus scans per patient). Disease evaluation (54%) and preoperative planning (35%) were the most common indications. Otolaryngologists were more likely to order imaging for preoperative evaluation, and those scans were more likely to result in surgery compared with those requested by other physicians (P<.001). Ninety CT scans (10.8%) led to no change in management. There was no significant difference in LM scores between patients admitted to the hospital or prescribed antibiotics and those who were not. There was also no significant change in LM score following ESS after adjusting for age and sex (P = .23).
Based on LM scores, all sinus CT scans in patients with CF reveal moderate to severe sinus disease. Effort should be made to minimize radiation exposure in patients with CF by limiting sinus CT scans to the preoperative context or for evaluation of potential sinusitis complications.
closed_qa
2D shear-wave ultrasound elastography (SWE) evaluation of ablation zone following radiofrequency ablation of liver lesions: is it more accurate?
To evaluate the usefulness of two-dimensional quantitative ultrasound shear-wave elastography (2D-SWE) [i.e. virtual touch imaging quantification (VTIQ)] in assessing the ablation zone after radiofrequency ablation (RFA) in ex vivo swine livers. RFA was performed in 10 pieces of fresh ex vivo swine liver with a T20 electrode needle and 20-W output power. Conventional ultrasound, conventional strain elastography (SE) and VTIQ were performed to depict the ablation zone 0 min, 10 min, 30 min and 60 min after ablation. On VTIQ, the ablation zones were evaluated qualitatively by inspecting the shear-wave velocity (SWV) map and quantitatively by measuring the SWV. The ultrasound, SE and VTIQ results were compared against gross pathological and histopathological specimens. VTIQ SWV maps gave more detail about the ablation zone: from inner to outer, the central necrotic zone appeared red, the lateral necrotic zone green and the transitional zone light green, while the peripheral unablated liver appeared blue. Conventional ultrasound and SE, however, only marginally depicted the whole ablation zone. The volumes of the whole ablation zone (central necrotic zone + lateral necrotic zone + transitional zone) and the necrotic zone (central necrotic zone + lateral necrotic zone) measured by VTIQ showed excellent correlation (r = 0.915, p < 0.001, and r = 0.856, p = 0.002, respectively) with those measured on gross pathological specimens, whereas both conventional ultrasound and SE underestimated the volume of the whole ablation zone. The SWV values of the central necrotic zone, lateral necrotic zone, transitional zone and unablated liver parenchyma were 7.54-8.03 m s(-1), 5.13-5.28 m s(-1), 3.31-3.53 m s(-1) and 2.11-2.21 m s(-1), respectively (p < 0.001 for all comparisons). The SWV value for each ablation zone did not change significantly at different observation times within an hour after RFA (all p > 0.05).
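Because the reported SWV ranges of the four zones do not overlap, a simple threshold rule can illustrate the discrimination. The thresholds below are midpoints between the reported ranges, an assumption for illustration rather than the authors' method:

```python
def classify_zone(swv_m_per_s):
    """Classify ablation-zone tissue by shear-wave velocity (m/s).
    Thresholds are midpoints between the SWV ranges reported in the study."""
    if swv_m_per_s >= 6.4:   # between 5.28 (lateral) and 7.54 (central)
        return "central necrotic"
    if swv_m_per_s >= 4.3:   # between 3.53 (transitional) and 5.13 (lateral)
        return "lateral necrotic"
    if swv_m_per_s >= 2.8:   # between 2.21 (unablated) and 3.31 (transitional)
        return "transitional"
    return "unablated"

print(classify_zone(7.8))   # central necrotic
print(classify_zone(2.15))  # unablated
```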
Quantitative 2D-SWE with VTIQ is useful for depicting the ablation zone after RFA and facilitates qualitative and quantitative discrimination of the different areas within the ablation zone. This elastography technique might be useful for immediate evaluation of therapeutic response after RFA.
closed_qa
Does comorbid anxiety counteract emotion recognition deficits in conduct disorder?
Previous research has reported altered emotion recognition in both conduct disorder (CD) and anxiety disorders (ADs), but these effects appear to be of different kinds. Adolescents with CD often show a generalised pattern of deficits, while those with ADs show hypersensitivity to specific negative emotions. Although these conditions often co-occur, little is known regarding emotion recognition performance in comorbid CD+ADs. Here, we test the hypothesis that in the comorbid case, anxiety-related emotion hypersensitivity counteracts the emotion recognition deficits typically observed in CD. We compared facial emotion recognition across four groups of adolescents aged 12-18 years: those with CD alone (n = 28), ADs alone (n = 23), co-occurring CD+ADs (n = 20) and typically developing controls (n = 28). The emotion recognition task we used systematically manipulated the emotional intensity of facial expressions as well as fixation location (eye, nose or mouth region). Conduct disorder was associated with a generalised impairment in emotion recognition; however, this may have been modulated by group differences in IQ. AD was associated with increased sensitivity to low-intensity happiness, disgust and sadness. In general, the comorbid CD+ADs group performed similarly to typically developing controls.
Although CD alone was associated with emotion recognition impairments, ADs and comorbid CD+ADs were associated with normal or enhanced emotion recognition performance. The presence of comorbid ADs appeared to counteract the effects of CD, suggesting a potentially protective role, although future research should examine the contribution of IQ and gender to these effects.
closed_qa
Pitfalls and Key Features of a Case of Sclerosing Pneumocytoma: A Cytological Challenge?
The aim of the current case report is to re-evaluate the key features and pitfalls of fine-needle aspiration cytology (FNAC) in the diagnosis of sclerosing pneumocytoma (previously named sclerosing hemangioma) and to establish the importance of FNAC in addressing a proper surgical strategy. Herein we document a case of a 70-year-old man with a lung nodule which showed hypermetabolic uptake on positron emission tomography. He therefore underwent FNAC under computed tomography scan guidance with a 22-gauge needle. The cytopathological examination allowed a diagnosis of sclerosing pneumocytoma. A wedge surgical excision was performed and the histological examination confirmed the cytological diagnosis.
FNAC is a fundamental tool for distinguishing sclerosing pneumocytoma from a malignant lung tumour and together with clinical, radiological and pathological multidisciplinary assessment is indispensable in planning appropriate surgical management. Cytopathologists should be aware of the pitfalls and key features of the cytopathological diagnosis of sclerosing pneumocytoma, which can significantly change the surgical approach to the patient and protect him from aggressive overtreatment.
closed_qa
Can Pacing Be Regulated by Post-Activation Potentiation?
Given the co-existence of post-activation potentiation (PAP) and fatigue within muscle, it is not known whether PAP could influence performance and pacing during distance running by moderating fatigue. The aim of this study was to assess the influence of PAP on pacing, jumping and other physiological measures during a self-paced 30 km trial. Eleven male endurance-trained runners (half-marathon runners) volunteered to participate in this study. Runners participated in a multi-stage 30 km trial. Before the trial started, determination of baseline blood lactate (bLa) and countermovement jump (CMJ) height was performed. The self-paced 30 km trial consisted of 6 × 5 km splits. At the end of each 5 km split (60 s break), data on time to complete the split, CMJ height, Rating of Perceived Exertion (RPE) and blood lactate were collected while heart rate was continuously monitored. There was a significant decrease in speed (i.e. a positive pacing strategy after the 4th split, p<0.05) with a progressive increase in RPE throughout the trial. CMJ height was significantly (p<0.05) greater than baseline and was maintained until the end of the trial, with an increase after the 5th split, concomitant with a significant reduction in speed and an increase in RPE. Significant correlations were found between ΔCMJ and ΔSPEED (r = 0.77 to 0.87, p<0.05) at different time points as well as between RPE and speed (r = -0.61 to -0.82, p<0.05).
Our results indicate that fatigue and potentiation co-exist during long-lasting endurance events, and that the observed increase in jump performance towards the end of the trial may reflect greater potentiation, potentially counteracting the effects of fatigue and preventing further reductions in speed.
closed_qa
First trimester cystic hygroma: does early detection matter?
To describe the association of abnormal outcomes with fetal cystic hygroma detected when crown-rump length measures less than 45 mm, and to compare them to outcomes among fetuses with cystic hygroma detected when crown-rump length measures 45-84 mm. We performed a retrospective cohort study of fetuses with first trimester nuchal cystic hygroma from 2005 to 2015. A total of 212 fetuses were included. Abnormal karyotype was found in 20 of 46 (43.4%) fetuses with cystic hygroma detected when crown-rump length measured below 45 mm, compared to 108 of 148 (73%) fetuses with cystic hygroma detected at crown-rump lengths of 45-84 mm (p = 0.001). There were no differences in rates of major structural anomaly (27% vs 36%; p = 0.53) or pregnancy loss (23% vs 7%; p = 0.22) among fetuses with normal karyotype. Those with cystic hygroma diagnosed at crown-rump lengths below 45 mm were more likely to have a normal neonatal outcome compared to cases diagnosed with crown-rump lengths of 45-84 mm (25% vs 11%; p = 0.02).
Cystic hygromas detected when crown-rump length measures below 45 mm have lower rates of chromosomal abnormalities and a higher proportion of normal birth outcomes compared to those detected later in the first trimester.
closed_qa
Does Real-Time Monitoring of Patient Dose With Dose Management Software Increase CT Technologists' Radiation Awareness?
Dose management software can be used to increase patient safety. The purpose of the current study was to evaluate whether real-time monitoring of patient dose in CT examinations increases CT technologists' dose awareness. Dose data of two scanners (clinical routine CT scanner, mainly outpatients; emergency CT scanner, predominantly emergency department and ICU patients) were analyzed before (period 1) and after (period 2) dose management software was implemented in clinical routine and technologists were advised to check for dose notifications (dose values above reference levels) after each examination (i.e., real-time monitoring). To assess statistically significant differences between both the scanners and the study periods, we used chi-square tests. A total of 6413 examinations were performed (period 1 = 3214 examinations, period 2 = 3199 examinations). Dose notifications were mainly because of patient miscentering (period 1 = 45% of examinations, period 2 = 23%), overweight patients (period 1 = 35%, period 2 = 49%), and scanning repetition (period 1 = 10%, period 2 = 15%). Overall, the number of dose notifications significantly declined in period 2 (period 1, n = 210; period 2, n = 120; p<0.001). Miscentering was more often seen on the clinical routine CT examinations (period 1 = 46%, period 2 = 23%) than on the emergency CT examinations (period 1 = 44%, period 2 = 22%) and occurred significantly less frequently on both scanners in period 2 (period 1: n = 94; period 2: n = 27; p<0.001). The relative values of dose notifications due to overweight patients or scanning repetition were higher in period 2, but these differences did not reach statistical significance (p>0.05).
Real-time monitoring of patient dose with dose management software increases CT technologists' dose awareness and leads to a reduced number of dose notifications due to human error.
closed_qa
Are medical students satisfied with rural community posting?
The aim of the study was to determine whether final year medical students in medical schools of south-east Nigeria were satisfied with rural community posting. A cross-sectional descriptive study design was used. All final year medical students in the six medical schools in south-east Nigeria who had completed their rural community posting and were willing to participate were included in the study. The students were interviewed using a pretested, self-administered questionnaire. A total of 457 medical students participated in the study, representing a response rate of 86.7%. Only a minor proportion of the students (22.5%) were satisfied with rural community posting. The most common reason for dissatisfaction among the students was lack of interest in rural communities. Most students (68.7%) were of the opinion that a good rural community posting could influence the students to practise in a rural area after graduation. Factors associated with satisfaction with rural community posting included being a student in a federal institution (adjusted odds ratio (AOR)=0.6, 95% confidence interval (CI)=0.4-0.9), being a male student (AOR=2.4, 95%CI=1.5-3.9) and intention to specialize in community medicine after graduation (AOR=2.7, 95%CI=1.2-6.0).
Most students were dissatisfied with rural community postings and the major reason for dissatisfaction was lack of interest in rural communities. A properly organized rural community posting is capable of changing the negative attitude of the students towards life and medical practice in the rural area. Adequate orientation of the students on the relevance of the posting, good community exposure and enhanced student lecturer interactions during the posting period could ensure satisfaction of the students. There should be a targeted evaluation of the rural community posting at the various medical schools in the country with the aim of strengthening and modifying the posting where necessary so as to ensure its purpose is realized.
closed_qa
Do Long-Term Survivor Primary Glioblastoma Patients Harbor IDH1 Mutations?
Approximately 3 to 16% of glioblastoma multiforme (GBM) patients are considered long-term survivors (LTS: 3+ years). Given the improved survival conferred by IDH1 mutations and the fact that these mutations are detected in 12% of newly diagnosed GBM cases, could long-term survivorship be explained by IDH1 mutation status? Our aim was to describe GBM LTS with IDH1 mutations and explore their association with overall survival (OS). Records of 453 newly diagnosed adult GBM patients treated at a single institution from 2004 to 2010 were reviewed retrospectively for patients who survived at least 36 months postsurgery. Descriptive statistics for clinical characteristics, treatments received, and tumor biomarkers were reported. Estimates for progression-free survival (PFS) and OS were provided. Forty (8.8%) LTS GBM patients were identified, with a median age of 50 years and a median preoperative Karnofsky Performance Score (KPS) of 80. Most patients underwent near-total/gross-total resection (72.5%), postoperative radiation (97.5%), and adjuvant temozolomide (95%). PFS rates at 12, 36, 48, and 72 months were 67.5%, 40%, 32.7%, and 26.2%, respectively. Median OS has not yet been reached; however, the survival rate at 48 months was 62.1%. Among 35 patients with available tumor samples, only 8 (22.9%) had IDH1 mutations. No significant difference in median PFS was found between IDH1 mutation and wild-type patients (46.6 versus 26.3 months; p = 0.45).
Less than a quarter of our patients' long-term survivorship was associated with favorable IDH1 status. Therefore, IDH1 status does not explain most of the long-term survivorship in the temozolomide era.
closed_qa
Does the choice of suture material matter in anterior and posterior colporrhaphy?
The optimal suture material in traditional prolapse surgery is still controversial. Our aim was to investigate the effect of using sutures with rapid (RA) or slow (SA) absorption, on symptomatic recurrence after anterior and posterior colporrhaphy. A population-based longitudinal cohort study with data from the Swedish National Quality Register for Gynecological Surgery. A total of 1,107 women who underwent primary anterior colporrhaphy and 577 women who underwent primary posterior colporrhaphy between September 2012 and September 2013 were included. Two groups in each cohort were created based on which suture material was used. Pre- and postoperative prolapse-related symptoms and patient satisfaction were assessed. We found a significantly lower rate of symptomatic recurrence 1 year after anterior colporrhaphy in the SA suture group compared with the RA suture group, 50 out of 230 (22 %) vs 152 out of 501 (30 %), odds ratio 1.6 (CI 1.1-2.3; p = 0.01). The SA group also had a significantly higher patient satisfaction rate, 83 % vs 75 %, odds ratio 1.6 (CI 1.04-2.4), (p = 0.03). Urgency improved significantly more in the RA suture group (p < 0.001). In the posterior colporrhaphy cohort there was no significant difference between the suture materials.
This study indicates that the use of slowly absorbable sutures decreases the odds of having a symptomatic recurrence after an anterior colporrhaphy compared with the use of rapidly absorbable sutures. However, the use of RA sutures may result in less urgency 1 year postoperatively. In posterior colporrhaphy the choice of suture material does not affect postoperative symptoms.
closed_qa
Is There an Increased Arterial Stiffness in Patients with Primary Sjögren's Syndrome?
Primary Sjögren's syndrome (pSS) is a common chronic autoimmune disease that primarily affects the salivary and lacrimal glands. Arterial stiffness is one of the earliest detectable manifestations of adverse structural and functional changes within the vessel wall. The aim of this study was to evaluate the relationship between arterial stiffness and pSS. In this study, 75 female patients with pSS who fulfilled the American-European Consensus Criteria for Sjögren's syndrome were included. A total of 68 age-, sex- and body mass index-matched subjects were recruited as the control population. Arterial stiffness was assessed by measurement of the carotid-femoral pulse wave velocity (PWV). The mean age of the patients was 54.0±9.3 years and the median duration of the disease was 10 years. Compared with the control subjects, patients with pSS had a higher mean PWV (8.2±1.5 m/s vs. 7.5±1.4 m/s; p=0.01). Correlation analysis showed that the PWV was positively correlated with age, body mass index, serum cholesterol, low-density lipoprotein (LDL) and C-reactive protein levels, blood pressure, mean arterial pressure (MAP), pulse pressure and left ventricular mass index. A multiple linear regression analysis revealed that arterial stiffness was associated with age, MAP and LDL levels in pSS patients.
Although patients with pSS appear to have increased arterial stiffness, risk factors associated with arterial stiffness in these patients are similar to the general population. However, we cannot exclude the possibility that a higher PWV in pSS patients is caused, not by pSS itself, but by the use of steroids, hypertension and dyslipidemia.
closed_qa
Ultrasound Guided Fine-Needle Aspiration Biopsy of Thyroid Nodules: Does Radiologist Assistance Decrease the Rate of Unsatisfactory Biopsies?
Ultrasound guided fine-needle aspiration biopsy (UG-FNAB) is the main presurgical, minimally invasive, accurate and generally safe procedure for the diagnosis of thyroid pathology. At present it is recommended as a valuable diagnostic tool for the management of thyroid nodules. This study aimed to evaluate if a radiologist's assistance in the UG-FNAB procedure decreased the rate of unsatisfactory biopsies. Over a 3-year period, 385 (100%) patients were enrolled in the study. All individuals had UG-FNAB performed for the first time due to multiple nodules of the thyroid gland. Patients with a family history of thyroid cancer, radioactive iodine treatment or other predispositions to thyroid malignancy were excluded. 184 (47.79%) patients were examined using UG-FNAB with a radiologist's assistance (group 1) and 201 (52.21%) without such support (group 2). All biopsies were performed by the same surgeon. All specimens obtained were examined by two cytologists experienced in thyroid pathology. The specimens from the UG-FNAB were more frequently diagnostic when obtained from procedures performed with a radiologist's assistance (77.8% vs. 56.8%, p<0.0001). The cellularity of the specimens obtained from the UG-FNAB performed with a radiologist's assistance was higher than those obtained without such support (66.7% vs. 56.9%, p<0.0001).
UG-FNAB of the thyroid nodules performed with a radiologist's assistance makes it possible to obtain more valuable specimens, which may improve diagnostic accuracy in the preoperative management of thyroid pathology.
closed_qa
Are homografts superior to conventional prosthetic valves in the setting of infective endocarditis involving the aortic valve?
Surgical dogma suggests that homografts should be used preferentially, compared with conventional xenograft or mechanical prostheses, in the setting of infective endocarditis (IE), because they have greater resistance to infection. However, comparative data that support this notion are limited. From the prospective databases of 2 tertiary academic centers, we identified 304 consecutive adult patients (age ≥17 years) who underwent surgery for active IE involving the aortic valve (AV), in the period 2002 to 2014. Short- and long-term outcomes were evaluated using propensity scores and inverse-probability weighting to adjust for selection bias. Homografts, and xenograft and mechanical prostheses, were used in 86 (28.3%), 139 (45.7%), and 79 (26.0%) patients, respectively. Homografts were more often used in the setting of prosthetic valve endocarditis (58.1% vs 28.8%, P = .002) and methicillin-resistant Staphylococcus (25.6% vs 12.1%, P = .002), compared with conventional prostheses. Early mortality occurred in 17 (19.8%) in the homograft group, and 20 (9.2%) in the conventional group (P = .019). During follow-up (median: 29.4 months; interquartile-range: 4.7-72.6 months), 60 (19.7%) patients died, and 23 (7.7%) experienced reinfection, with no significant differences in survival (P = .23) or freedom from reinfection rates (P = .65) according to the types of prostheses implanted. After adjustments for baseline characteristics, using propensity-score analyses, use of a homograft did not significantly affect early death (odds ratio 1.61; 95% confidence interval [CI], 0.73-3.40, P = .23), overall death (hazard ratio 1.10; 95% CI, 0.62-1.94, P = .75), or reinfection (hazard ratio 1.04; 95% CI, 0.49-2.18, P = .93).
No significant benefit to use of homografts was demonstrable with regard to resistance to reinfection in the setting of IE. The choice among prosthetic options should be based on technical and patient-specific factors. Lack of availability of homografts should not impede appropriate surgical intervention.
closed_qa
A contemporary analysis of pulmonary hypertension in patients undergoing mitral valve surgery: Is this a risk factor?
Pulmonary hypertension (PHT) has been considered a risk factor for mortality in cardiac surgery. Among mitral valve surgery (MVS) patients, we sought to determine if severe PHT increases mortality risk and if patients who undergo concomitant tricuspid valve surgery (TVS) incur additional risk. Preoperative PHT was assessed in 1571 patients undergoing MVS, from 2004 to 2013. Patients were stratified into PHT groups as follows (mm Hg): none (<35); moderate (35-49); severe (50-79); and extreme (≥80). Propensity-score matching resulted in a total of 430 patients, by PHT groups, and 384 patients, by TVS groups. Patients with severe PHT had higher mortality, both 30-day (4% PHT vs 1% no PHT, P<.02) and late (defined as survival at 5 years): 75.5% severe versus 91.9% no PHT (P<.001). In propensity-score-matched groups, severe PHT was not a risk factor for 30-day (3% each, P = 1.0) or late mortality (86.2% severe vs 87.1% no PHT; P = .87). TVS did not increase 30-day (4.7% TVS vs 4.2% no TVS, P = .8) or late mortality (78.7% TVS vs 75.3% no TVS, P = .90). Late survival was lower in extreme PHT (75.4% vs no PHT 91.5%, P = .007), and a trend was found in 30-day mortality (11% extreme vs 3% no PHT, P = .16).
Mortality in MVS is unaffected by severe PHT or the addition of TVS, yet extreme PHT remains a risk factor. Severe PHT (50-79 mm Hg) should not preclude surgery; concomitant TVS does not increase mortality.
closed_qa
The recommended treatment algorithms of the BCLC and HKLC staging systems: does following these always improve survival rates for HCC patients?
Several staging systems have been proposed for hepatocellular carcinoma (HCC). Among them, only the Barcelona Clinic Liver Cancer (BCLC) and Hong Kong Liver Cancer (HKLC) staging systems also recommend treatment modality. This study was designed to see whether BCLC and HKLC staging can guide treatment strategy, so we analyzed whether survival was better for patients who received the therapy recommended by each staging system. A total of 3515 treatment-naïve, newly diagnosed HCC patients at a single centre were analyzed. Five-year survival rates according to BCLC stages: 0 = 79.1%, A = 62.9%, B = 40.3%, C = 21.3% and D = 27.0%; 5-year survival rates according to HKLC stages: I = 72.3%, IIa = 54.9%, IIb = 50.6%, IIIa = 21.3%, IIIb = 10.2%, IVa = 16.7%, IVb = 7.2%, Va = 47.1% and Vb = 11.3%. The C-indices of the BCLC and HKLC staging systems were 0.708 and 0.732 respectively. Patient survival was better when patients received the recommended treatment in stages 0 or A; survival was worse if treatment began at stage B, C or D. For the HKLC staging system, survival was better when patients received the recommended treatment in stages I, IIa, IIb, IIIa or Va but was worse when treatment began in stages IIIb, IVa, IVb or Vb.
Both the BCLC and HKLC staging systems effectively stratified patient prognosis, but neither could direct therapy for a large proportion of patients; for some stages, recommended therapy was associated with worse prognosis.
closed_qa
Does Immunosuppressive Therapy Affect Markers of Kidney Damage?
Markers currently used to detect kidney damage are effective in both early (KIM-1, NGAL) and late (MCP-1, MMP, TIMP) stages of renal tubular damage, indicating the progression of chronic kidney disease. Immunosuppressive drugs may damage the transplanted organ through their direct toxic effects and by contributing to the development of chronic fibrosis and tubular atrophy. The aim of this study was to determine if immunosuppressive drugs per se affect the concentration of kidney damage markers, by using concentrations and doses of immunosuppressive drugs within therapeutic, not toxic, levels in rat blood. The study involved 36 rats grouped according to the immunosuppressive regimen used (tacrolimus, mycophenolate mofetil, cyclosporin A, rapamycin, and prednisone). The rats were treated with a 3-drug protocol for 6 months. No drugs were administered to the control group. The blood samples were collected to determine the concentration of kidney damage markers by using enzyme-linked immunosorbent assay (ELISA). 1. In the groups receiving regimens based on cyclosporin A (CyA), significantly higher concentrations of KIM-1 in plasma were observed compared to cases not treated with drugs. 2. The use of tacrolimus was associated with increased concentrations of MCP-1 in plasma and rapamycin was associated with decreased concentrations of MCP-1 in plasma. 3. Rapamycin induces an unfavorable, profibrotic imbalance between metalloproteinase-9 and its inhibitor, TIMP-1.
Commonly used immunosuppressive drugs influence the concentration of blood markers of kidney damage. This fact should be taken into account when analyzing the association between the concentration of these markers and pathological processes occurring in the transplanted kidney.
closed_qa
Is the HAS-BLED score useful in predicting post-extraction bleeding in patients taking warfarin?
Unexpected post-extraction bleeding is often experienced in clinical practice. Therefore, determining the risk of post-extraction bleeding in patients receiving anticoagulant therapy prior to surgery is beneficial. This study aimed to verify whether the HAS-BLED score was useful in predicting post-extraction bleeding in patients taking warfarin. Retrospective cohort study. Department of Oral and Maxillofacial Surgery, Tokyo Women's Medical University. Participants included 258 sequential cases (462 teeth) who had undergone tooth extraction between 1 January 2010 and 31 December 2012 while continuing warfarin therapy. Post-extraction risk factors for bleeding. The following data were collected as the predicting variables for multivariate logistic analysis: the HAS-BLED score, extraction site, tooth type, stability of teeth, extraction procedure, prothrombin time-international normalised ratio value, platelet count and the use of concomitant antiplatelet agents. Post-extraction bleeding was noted in 21 (8.1%) of the 258 cases. Haemostasis was achieved with localised haemostatic procedures in all the cases of post-extraction bleeding. The HAS-BLED score was found to be insufficient in predicting post-extraction bleeding (area under the curve=0.548, p=0.867, multivariate analysis). The risk of post-extraction bleeding was approximately three times greater in patients taking concomitant oral antiplatelet agents (risk ratio=2.881, p=0.035, multivariate analysis).
The HAS-BLED score alone could not predict post-extraction bleeding. The concomitant use of oral antiplatelet agents was a risk factor for post-extraction bleeding. No episodes of post-extraction bleeding required more than local measures for haemostasis. However, because this was a retrospective study conducted at a single institution, large-scale prospective cohort studies, which include cases of outpatient tooth extraction, will be necessary in the future.
closed_qa
Survival From Childhood Hematological Malignancies in Denmark: Is Survival Related to Family Characteristics?
Due to diverse findings as to the role of family factors for childhood cancer survival even within Europe, we explored a nationwide, register-based cohort of Danish children with hematological malignancies. All children born between 1973 and 2006 and diagnosed with a hematological malignancy before the age of 20 years (N = 1,819) were followed until 10 years from diagnosis. Kaplan-Meier curves and Cox proportional hazards models estimating hazard ratios (HR) and 95% confidence intervals (CI) were used to assess the impact of family characteristics on overall survival in children with hematological malignancies. Having siblings and increasing birth order were associated with reduced survival from acute lymphoblastic leukemia (ALL) and acute myeloid leukemia (AML). Associations with AML were strongest and statistically significant. HRs of 1.62 (CI 0.85; 3.09) and 5.76 (CI 2.01; 16.51) were observed for the fourth or later born children with ALL (N = 41) and AML (N = 9), respectively. Children with older parents showed a tendency toward inferior ALL survival, while for AML young maternal age was related to poorer survival. Based on small numbers, a trend toward poorer survival from non-Hodgkin lymphoma was observed for children having siblings and for children of younger parents.
Further research is warranted to gain further knowledge on the impact of family factors on childhood cancer survival in other populations and to elaborate potential underlying mechanisms and pathways of those survival inequalities.
closed_qa
Beneficial Effects of Early Enteral Nutrition After Major Rectal Surgery: A Possible Role for Conditionally Essential Amino Acids?
To investigate direct postoperative outcome and plasma amino acid concentrations in a study comparing early enteral nutrition versus early parenteral nutrition after major rectal surgery. Previously, it was shown that a low plasma glutamine concentration represents poor prognosis in ICU patients. A preplanned substudy of a previous prospective, randomized, open-label, single-centre study, comparing early enteral nutrition versus early parenteral nutrition in patients at high risk of postoperative ileus after surgery for locally advanced or locally recurrent rectal cancer. Early enteral nutrition reduced postoperative ileus, anastomotic leakage, and hospital stay. Tertiary referral centre for locally advanced and recurrent rectal cancer. A total of 123 patients with locally advanced or recurrent rectal carcinoma requiring major rectal surgery. Patients were randomized (ALEA web-based external randomization) preoperatively into two groups: early enteral nutrition (intervention) by nasojejunal tube (n = 61) or early parenteral nutrition (control) by jugular vein catheter (n = 62). Eight hours after the surgical procedure artificial nutrition was started in hemodynamically stable patients, stimulating oral intake in both groups. Blood samples were collected to measure plasma glutamine, citrulline, and arginine concentrations using a validated ultra performance liquid chromatography-tandem mass spectrometric method. Baseline concentrations were comparable for both groups. Directly after rectal surgery, a decrease in plasma amino acids was observed. Plasma glutamine concentrations were higher in the parenteral group than in the enteral group on postoperative day 1 (p = 0.027) and day 5 (p = 0.008). Arginine concentrations were also significantly increased in the parenteral group at day 1 (p<0.001) and day 5 (p = 0.001).
Lower plasma glutamine and arginine concentrations were measured in the enteral group, whereas a better clinical outcome was observed. We conclude that plasma amino acids do not provide a causal explanation for the observed beneficial effects of early enteral feeding after major rectal surgery.
closed_qa
Do the Threshold Limit Values for Work in Hot Conditions Adequately Protect Workers?
We evaluated core temperature responses and the change in body heat content (ΔHb) during work performed according to the ACGIH threshold limit values (TLV) for heat stress, which are designed to ensure a stable core temperature that does not exceed 38.0°C. Nine young males performed a 120-min work protocol consisting of cycling at a fixed rate of heat production (360 W). On the basis of the TLV, each protocol consisted of a different work-rest (WR) allocation performed in different wet-bulb globe temperatures (WBGT). The first was 120 min of continuous (CON) cycling at 28.0°C WBGT (CON[28.0°C]). The remaining three protocols were intermittent work bouts (15-min duration) performed at various WR and WBGT: (i) WR of 3:1 at 29.0°C (WR3:1[29.0°C]), (ii) WR of 1:1 at 30.0°C (WR1:1[30.0°C]), and (iii) WR of 1:3 at 31.5°C (WR1:3[31.5°C]) (total exercise time: 90, 60, and 30 min, respectively). The change in rectal (ΔTre) and mean body temperature (ΔTb) was evaluated with thermometry. ΔHb was determined via direct calorimetry and also used to calculate ΔTb. Although average rectal temperature did not exceed 38.0°C, heat balance was not achieved during exercise in any work protocol (i.e., rate of ΔTre > 0°C·min⁻¹; all P values ≤ 0.02). Consequently, it was projected that if work was extended to 4 h, the distribution of participant core temperatures higher and lower than 38.0°C would be statistically similar (all P values ≥ 0.10). Furthermore, ΔHb was similar between protocols (P = 0.70). However, a greater ΔTb was observed with calorimetry relative to thermometry in WR3:1[29.0°C] (P = 0.03), WR1:1[30.0°C] (P = 0.02), and WR1:3[31.5°C] (P<0.01) but not CON[28.0°C] (P = 0.32).
The current study demonstrated that heat balance was not achieved and ΔTb and ΔHb were inconsistent, suggesting that the TLV may not adequately protect workers during work in hot conditions.
closed_qa
Preoperative experience for public hospital patients with gynecologic cancer: Do structural barriers widen the gap?
Widespread disparities in care have been documented in women with gynecologic cancer in the United States. This study was designed to determine whether structural barriers to optimal care were present during the preoperative period for patients with gynecologic cancer. A retrospective review was conducted for patients undergoing surgery for a gynecologic malignancy at a public hospital or a private hospital staffed by the same team of gynecologic oncologists between July 1, 2013 and July 1, 2014. Two hundred fifty-seven cases were included for analysis (public hospital, 69; private hospital, 188). Patients treated at the private hospital were older (58 vs 52 years; P = .004) and had similar medical comorbidities (median Charlson comorbidity index at both hospitals, 6) but required fewer hospital visits in preparation for surgery (2 vs 4; P<.001). Public hospital patients had a longer wait time from the diagnosis of disease to surgery (63 vs 34 days; P<.001). According to a multiple linear regression model, the public hospital setting was associated with a longer interval from diagnosis to surgery with adjustments for the insurance status, age at diagnosis, cancer stage, and number of preoperative hospital visits (P<.001).
Patients at the public hospital were subject to a greater number of preoperative visits and had to wait longer for surgery than patients at the private hospital. Attempts to reduce health care disparities should focus on improving efficiency in health care delivery systems once contact has been established.
closed_qa
Corneal Transplantation in Disease Affecting Only One Eye: Does It Make a Difference to Habitual Binocular Viewing?
Clarity of the transplanted tissue and restoration of visual acuity are the two primary metrics for evaluating the success of corneal transplantation. Participation of the transplanted eye in habitual binocular viewing is seldom evaluated post-operatively. In unilateral corneal disease, the transplanted eye may remain functionally inactive during binocular viewing due to its suboptimal visual acuity and poor image quality, vis-à-vis the healthy fellow eye. This study prospectively quantified the contribution of the transplanted eye towards habitual binocular viewing in 25 cases with unilateral transplants [40 yrs (IQR: 32-42 yrs)] and 25 age-matched controls [30 yrs (25-37 yrs)]. Binocular functions including visual field extent, high-contrast logMAR acuity, suppression threshold and stereoacuity were assessed using standard psychophysical paradigms. Optical quality of all eyes was determined from wavefront aberrometry measurements. Binocular visual field expanded by a median 21% (IQR: 18-29%) compared to the monocular field of cases and controls (p = 0.63). Binocular logMAR acuity [0.0 (0.0-0.0)] almost always followed the fellow eye's acuity [0.00 (0.00 to -0.02)] (r = 0.82), independent of the transplanted eye's acuity [0.34 (0.2-0.5)] (r = 0.04). Suppression threshold and stereoacuity were poorer in cases [30.1% (13.5-44.3%); 620.8 arc sec (370.3-988.2 arc sec)] than in controls [79% (63.5-100%); 16.3 arc sec (10.6-25.5 arc sec)] (p<0.001). Higher-order wavefront aberrations of the transplanted eye [0.34 μm (0.21-0.51 μm)] were greater than those of the fellow eye [0.07 μm (0.05-0.11 μm)] (p<0.001), and their reduction with RGP contact lenses [0.09 μm (0.08-0.12 μm)] significantly improved the suppression threshold [65% (50-72%)] and stereoacuity [56.6 arc sec (47.7-181.6 arc sec)] (p<0.001).
In unilateral corneal disease, the transplanted eye does participate in gross binocular viewing but offers limited support to fine levels of binocularity. Improvement in the transplanted eye's optics enhances its participation in binocular viewing. Current metrics of this treatment success can expand to include measures of binocularity to assess the functional benefit of the transplantation process in unilateral corneal disease.
closed_qa
Mechanic valve prosthesis and pregnancy: Is Phenprocoumon replaceable?
We report the case of a 30-year-old pregnant patient with mechanical valve replacements in the mitral and aortic position. She had discontinued her Phenprocoumon treatment on her own at 5+4 weeks of gestation. Because of rheumatic fever she had undergone mechanical aortic and mitral valve replacement 12 years earlier. Due to a thrombosis of the mitral valve, an acute reoperation had to be performed 5 years later. Two years ago, a partial re-thrombosis of the mechanical mitral valve was treated by intravenous thrombolysis. These complications were probably due to noncompliance. The patient had previously experienced 3 abortions. Vaginal sonography confirmed an intact gestation. Laboratory testing revealed an INR of 1.2. Transesophageal echocardiography showed a partially thrombosed mechanical mitral valve. Abdominal ultrasonography detected an embolic splenic infarction. These findings were consistent with a partially thrombosed mechanical mitral valve with thromboembolic splenic infarction in the setting of inadequate oral anticoagulation. After initial heparinization with twice-daily monitoring of the partial thromboplastin time, the joint decision was made to restart Phenprocoumon (target INR 2.5 to 3.5, plus acetylsalicylic acid 100 mg/day). Nine days later the patient had a missed abortion. An uncomplicated curettage was performed under therapeutic intravenous heparinization.
The use of coumarins in pregnancy carries a fetal risk. However, they provide the most reliable anticoagulation after mechanical valve replacement, especially in high-risk patients. Heparins are an alternative: they do not cross the placenta but are associated with a slightly elevated risk of thromboembolism.
closed_qa
Does OCT morphology provide indications for prognosis of visual acuity after venous occlusion?
Even though macular edema (ME) in patients with retinal vein occlusion (RVO) is resolved after intravitreal treatment with anti-vascular endothelial growth factor (VEGF), impairment of visual acuity (VA) often persists. A qualitative and quantitative evaluation of spectral domain optical coherence tomography (SD-OCT) images was carried out in patients with RVO and resolved ME to investigate a correlation between retinal morphology and functional results. Foveal SD-OCT scans of 13 patients with RVO and resolved ME after treatment were retrospectively evaluated. The thickness of the inner retinal layers up to the external limiting membrane (ELM) and up to the photoreceptors in the retinal pigment epithelium (RPE) was measured by automatic segmentation software. Foveal continuity of the four outer hyperreflective bands (the ellipsoid zone of the inner segments (ISe), the ELM, the interdigitation zone (IZ) and the RPE) and the location of the initial ME were evaluated. Patients with good (≤ 0.3 logMAR, n = 10) and poor VA (≥ 1.0 logMAR, n = 3) were compared. The inner retinal layers up to the ELM were thinner in the poor VA group. In the good VA group the initial ME was significantly more often above the ISe, and after resolution of the ME the ISe tended to be intact more frequently.
In patients with poor VA despite resolved ME the inner retinal layers up to the ELM were significantly thinner, which could be a sign of atrophy. Qualitative differences were seen at the photoreceptor level, which could be explained by ischemia or an involvement of the outer retina during initial ME that leads to permanent destruction of the ISe.
closed_qa
Do artisanal fishers perceive declining migratory shorebird populations?
This paper discusses the results of ethno-ornithological research conducted on the local ecological knowledge (LEK) of artisanal fishers in northeast Brazil between August 2013 and October 2014. The present study analyzed the LEK of 240 artisanal fishermen in relation to Nearctic shorebirds and the factors that may be affecting their populations. We examined whether differences occurred according to the gender and age of the local population. The research instruments included semi-structured and check-list interviews. We found that greater knowledge of migratory birds and the areas where they occur was retained by the local men compared with the local women. Half of the male respondents stated that the birds are always in the same locations, and most of the respondents believed that changes in certain populations were caused by factors related to habitat disturbance, particularly to increases in housing construction and visitors to the island. The main practices affecting the presence of migratory birds mentioned by the locals were boat traffic and noise from bars and vessels. According to the artisanal fishermen, the population of migratory birds that use the area for foraging and resting has been reduced over time.
Changes in the local landscape related to urbanization and tourism are most likely the primary causes underlying the reduced migratory shorebird populations as reported by local inhabitants. Thus, managing and monitoring urbanization and tourism are fundamental to increasing the success of the migration process and improving the conservation of migratory shorebird species.
closed_qa
The orientation of transcription factor binding site motifs in gene promoter regions: does it matter?
Gene expression is to a large degree regulated by the specific binding of protein transcription factors to cis-regulatory transcription factor binding sites in gene promoter regions. Despite the identification of hundreds of binding site sequence motifs, the question as to whether motif orientation matters with regard to the gene expression regulation of the respective downstream genes appears surprisingly underinvestigated. We pursued a statistical approach by probing 293 reported non-palindromic transcription factor binding site and ten core promoter motifs in Arabidopsis thaliana for evidence of any relevance of motif orientation based on mapping statistics and effects on the co-regulation of gene expression of the respective downstream genes. Although positional intervals closer to the transcription start site (TSS) were found with increased frequencies of motifs exhibiting orientation preference, a corresponding effect with regard to gene expression regulation, as evidenced by increased co-expression of genes harboring the favored orientation in their upstream sequence, could not be established. Furthermore, we identified an intrinsic orientational asymmetry of sequence regions close to the TSS as the likely source of the identified motif orientation preferences. By contrast, motif presence irrespective of orientation was found associated with pronounced effects on gene expression co-regulation, validating the pursued approach. Inspecting motif pairs revealed statistically preferred orientational arrangements, but no consistent effect with regard to arrangement-dependent gene expression regulation was evident.
Our results suggest that for the motifs considered here, either no specific orientation rendering them functional across all their instances exists with orientational requirements instead depending on gene-locus specific additional factors, or that the binding orientation of transcription factors may generally not be relevant, but rather the event of binding itself.
closed_qa
Cardiomyopathy in children: Can we rely on echocardiographic tricuspid regurgitation gradient estimates of right ventricular and pulmonary arterial pressure?
Agreement between echocardiography and right heart catheterisation-derived right ventricular systolic pressure is modest in the adult heart failure population, but is unknown in the paediatric cardiomyopathy population. All patients at a single centre from 2001 to 2012 with a diagnosis of cardiomyopathy who underwent echocardiography and catheterisation within 30 days were included in this study. The correlation between tricuspid regurgitation gradient and catheterisation-derived right ventricular systolic pressure and mean pulmonary artery pressure was determined. Agreement between echocardiography- and catheterisation-derived right ventricular systolic pressure was assessed using Bland-Altman plots. The analysis was repeated for patients who underwent both procedures within 7 days. Haemodynamic data from those with poor agreement and good agreement between echocardiography and catheterisation were compared. A total of 37 patients who underwent 48 catheterisation procedures were included in our study. The median age was 11.8 years (range 0.1-20.6), and 22 patients (58%) were male. There was a modest correlation (r=0.65) between echocardiography- and catheterisation-derived right ventricular systolic pressure, but agreement was poor. Agreement between tricuspid regurgitation gradient and right ventricular systolic pressure showed wide 95% limits of agreement. There was a modest correlation between the tricuspid regurgitation gradient and mean pulmonary artery pressure (r=0.6). A shorter time interval between the two studies did not improve agreement. Those with poor agreement between echocardiography and catheterisation had higher right heart pressures, but this difference became insignificant after accounting for right atrial pressure.
Transthoracic echocardiography estimation of right ventricular systolic pressure shows modest correlation with right heart pressures, but has limited agreement and may underestimate the degree of pulmonary hypertension in paediatric cardiomyopathy patients.
closed_qa
Implementation of tuberculosis infection control measures in designated hospitals in Zhejiang Province, China: are we doing enough to prevent nosocomial tuberculosis infections?
Tuberculosis (TB) infection control measures are very important to prevent nosocomial transmission and protect healthcare workers (HCWs) in hospitals. The TB infection control situation in TB treatment institutions in southeastern China has not been studied previously. Therefore, the aim of this study was to investigate the implementation of TB infection control measures in TB-designated hospitals in Zhejiang Province, China. Cross-sectional survey using observation and interviews. All TB-designated hospitals (n=88) in Zhejiang Province, China in 2014. Managerial, administrative, environmental and personal infection control measures were assessed using descriptive analyses and univariate logistic regression analysis. The TB-designated hospitals treated a median of 3030 outpatients (IQR 764-7094) and 279 patients with confirmed TB (IQR 154-459) annually, and 160 patients with TB (IQR 79-426) were hospitalised in the TB wards. Most infection control measures were performed by the TB-designated hospitals. Measures including regular monitoring of TB infection control in high-risk areas (49%), shortening of wait times (42%), and provision of a separate waiting area for patients with suspected TB (46%) were sometimes neglected. N95 respirators were available in 85 (97%) hospitals, although only 44 (50%) hospitals checked that they fit. Hospitals with more TB staff and higher admission rates of patients with TB were more likely to designate a dedicated sputum collection area and to conduct annual respirator fit testing.
TB infection control measures were generally implemented by the TB-designated hospitals. Measures including separation of suspected patients, regular monitoring of infection control practices, and regular fit testing of respirators should be strengthened. Infection measures for sputum collection and respirator fit testing should be improved in hospitals with lower admission rates of patients with TB.
closed_qa
Does the duration of symptoms influence outcome in patients with sciatica undergoing micro-discectomy and decompressions?
Early surgical treatment for back and leg pain secondary to disc herniation has been associated with very good outcomes. However, there are conflicting data on the role of surgical treatment in cases of prolonged radicular symptomatology. We aimed to evaluate whether the duration of symptoms at presentation affects the subjective outcome. Study design: This is a retrospective review of prospectively collected data from a single surgeon, including micro-discectomies and lateral recess decompressions in patients younger than 60 years, using patient medical notes, radiology imaging, operation notes, and Patient Reported Outcome Measures (PROMs) including the Oswestry Disability Index (ODI) and visual analogue scales for back and leg pain (VAS-BP and VAS-LP). The final follow-up was carried out through postal questionnaire or telephone consultation. Demographic information, duration of symptoms, type and incidence of complications, length of hospital stay, and follow-up were analyzed. Data were categorized into four subgroups by symptom duration: 0 to ≤6 months, 6 months to ≤1 year, 1 year to ≤2 years, and >2 years. A clinically significant result was an average improvement of 2 or more points on the VAS and of 20% or more on the ODI. The level of statistical significance was p<0.05. A total of 107 patients who underwent 109 operations were included. The level of surgery was L5/S1 (50), L4/L5 (43), L3/L4 (3), L2/L3 (2), and two levels (11). The mean improvement was, for 0 to ≤6 months: VAS-LP 5.21±2.81, VAS-BP 3.04±3.15, ODI 35.26±19.25; for 6 months to ≤1 year: VAS-LP 4.73±2.61, VAS-BP 3.30±3.05, ODI 26.92±19.49; for 1 year to ≤2 years: VAS-LP 3.78±3.68, VAS-BP 3.00±2.78, ODI 19.03±20.24; and for >2 years: VAS-LP 4.77±3.61, VAS-BP 3.54±3.43, ODI 28.36±20.93. The length of hospital stay and complication rate were comparable between groups. Average follow-up was 15.69 months.
Our study showed significant improvement in patients whose symptoms had persisted for more than 1 year, and even more than 2 years, before surgery; surgery therefore remains a viable option in selected patients.
closed_qa
Is There a Consensus when Physicians Evaluate the Relevance of Retrieved Systematic Reviews?
A significant challenge associated with practicing evidence-based medicine is to provide physicians with relevant clinical information when it is needed. At the same time it appears that the notion of relevance is subjective and its perception is affected by a number of contextual factors. To assess to what extent physicians agree on the relevance of evidence in the form of systematic reviews for a common set of patient cases, and to identify possible contextual factors that influence their perception of relevance. A web-based survey was used where pediatric emergency physicians from multiple academic centers across Canada were asked to evaluate the relevance of systematic reviews retrieved automatically for 14 written case vignettes (paper patients). The vignettes were derived from prospective data describing pediatric patients with asthma exacerbations presenting at the emergency department. To limit the cognitive burden on respondents, the number of reviews associated with each vignette was limited to three. Twenty-two academic emergency physicians with varying years of clinical practice completed the survey. There was no consensus in their evaluation of relevance of the retrieved reviews and physicians' assessments ranged from very relevant to irrelevant evidence, with the majority of evaluations being somewhere in the middle. This indicates that the study participants did not share a notion of relevance uniformly. Further analysis of commentaries provided by the physicians allowed identifying three possible contextual factors: expected specificity of evidence (acute vs chronic condition), the terminology used in the systematic reviews, and the micro environment of clinical setting.
There is no consensus among physicians with regards to what constitutes relevant clinical evidence for a given patient case. Subsequently, this finding suggests that evidence retrieval systems should allow for deep customization with regards to physician's preferences and contextual factors, including differences in the micro environment of each clinical setting.
closed_qa
Are current case-finding methods under-diagnosing tuberculosis among women in Myanmar?
Although there is a large increase in investment for tuberculosis control in Myanmar, there are few operational analyses to inform policies. Only 34% of nationally reported cases are from women. In this study, we investigate sex differences in tuberculosis diagnoses in Myanmar in order to identify potential health systems barriers that may be driving lower tuberculosis case finding among women. From October 2014 to March 2015, we systematically collected data on all new adult smear positive tuberculosis cases in ten township health centres across Yangon, the largest city in Myanmar, to produce an electronic tuberculosis database. We conducted a descriptive cross-sectional analysis of sex differences in tuberculosis diagnoses at the township health centres. We also analysed national prevalence survey data to calculate additional case finding in men and women by using sputum culture when smear microscopy was negative, and estimated the sex-specific impact of using a more sensitive diagnostic tool at township health centres. Overall, only 514 (30%) out of 1371 new smear positive tuberculosis patients diagnosed at the township health centres were female. The proportion of female patients varied by township (from 21% to 37%, p = 0.0172), month of diagnosis (37% in February 2015 and 23% in March 2015 p = 0.0004) and age group (26% in 25-64 years and 49% in 18-25 years, p<0.0001). Smear microscopy grading of sputum specimens was not substantially different between sexes. The prevalence survey analysis indicated that the use of a more sensitive diagnostic tool could result in the proportion of females diagnosed at township health centres increasing to 36% from 30%.
Our study, which is the first to systematically compile and analyse routine operational data from tuberculosis diagnostic centres in Myanmar, found that substantially fewer women than men were diagnosed in all study townships. The sex ratio of newly diagnosed cases varied by age group, month of diagnosis and township of diagnosis. Low sensitivity of tuberculosis diagnosis may lead to a potential under-diagnosis of tuberculosis among women.
closed_qa
Can the follow-up of patients with papillary thyroid carcinoma of low and intermediate risk and excellent response to initial therapy be simplified using second-generation thyroglobulin assays?
In view of the low probability of recurrence, the cost-effective follow-up of patients with papillary thyroid carcinoma (PTC) of low or intermediate risk and excellent response to initial therapy represents a challenge. This study evaluated the cases of structural recurrence among these patients. The sample comprised 578 patients with PTC of low or intermediate risk, who underwent total thyroidectomy with or without (131)I therapy and exhibited an excellent response to initial therapy, defined based on nonstimulated thyroglobulin (Tg) ≤0·2 ng/ml and negative neck ultrasonography (US). Twelve patients (2%) showed structural recurrence. At the time when recurrence was 'confirmed', Tg elevation had not occurred in only two patients, one with lymph node metastases<1 cm detected by US and the other with pulmonary metastases. Antithyroglobulin antibodies (TgAb) were undetectable in both patients. The first alteration observed in patients with recurrence was Tg elevation in six patients, Tg elevation associated with suspicious US in three, and suspicious US in two. An increase in TgAb was not the first alteration in any of the patients. Among the 560 patients who continued to have Tg ≤ 0·2 ng/ml, US permitted the detection of only one neck recurrence. Measurement of TgAb did not detect any recurrence.
Our results confirm that in patients with PTC of low or intermediate risk an excellent response to initial therapy can be defined based on nonstimulated Tg ≤ 0·2 ng/ml. Follow-up consisting only of clinical examination and periodic measurement of Tg with a second-generation assay may be sufficient.
closed_qa
Abdominal compartment syndrome in traumatic hemorrhagic shock: is there a fluid resuscitation inflection point associated with increased risk?
The volume of fluid administered during trauma resuscitation correlates with the risk of abdominal compartment syndrome (ACS). The exact volume at which this risk rises is uncertain. We established the inflection point for ACS risk during shock resuscitation. Using the Glue Grant database, patients aged ≥16 years with ACS were compared with those without ACS (no-ACS). Stepwise analysis of the mean total fluid volume per body weight (TV/kg) (μ) plus or minus incremental standard deviations (σ) vs % ACS at each point was used to determine the fluid inflection point. A total of 1,976 patients were included, of which 122 (6.2%) had ACS. Compared with no-ACS, ACS patients had a higher emergency room lactate (5.8 ± 3.0 vs 4.5 ± 2.8, P<.001), international normalized ratio (1.8 ± 1.5 vs 1.4 ± .8, P<.001), and mortality (37.7% vs 14.6%, P<.001). The ACS group received a higher TV/kg (498 ± 268 mL/kg vs 293 ± 171 mL/kg, P<.001) than no-ACS. The % ACS increased exponentially with the sum of μ and incremental σ, with the sharpest increase occurring at TV/kg = μ + 3σ, or 1,302 mL/kg.
There is a dramatic rise in ACS risk after 1,302 mL/kg of fluid has been administered. This threshold could serve as a guide for limiting ACS risk during resuscitation.
closed_qa
Could transesophageal echocardiography be useful in selected cases during liver resection surgery?
Although only limited scientific evidence supports the use of transesophageal echocardiography (TEE) in non-cardiac surgery, several recent studies have documented its usefulness during liver surgery. In the present case study, color Doppler TEE clearly demonstrated compression of the inferior vena cava and the right hepatic vein, as well as the restoration of their patency after surgery.
TEE should be encouraged in patients undergoing liver resection, not only for hemodynamic monitoring, but also for its ability to provide information about the anatomy of the liver, its vessels, and inferior vena cava patency.
closed_qa
Is 3-hour cyclosporine blood level superior to trough level in early post-renal transplantation period?
Cyclosporine dose is traditionally based on trough blood levels. The cyclosporine trough blood level correlates poorly with acute rejection and cyclosporine nephrotoxicity after renal transplantation. We determined whether the cyclosporine blood level at any other time point is superior to the trough blood level as a predictor of acute rejection and cyclosporine nephrotoxicity. Cyclosporine blood level was measured before (trough), and 1, 2, 3 and 4 hours after the dose in 156 initial renal transplant cases 2 to 4 days after the initiation of cyclosporine micro-emulsion formula administration. The cyclosporine micro-emulsion dose was based on the cyclosporine trough blood level, targeting 250 to 400 microg./l. Regression analysis revealed that only delayed graft function (p = 0.007) and cyclosporine blood level after 3 hours (p = 0.008) predicted acute rejection. Mean cyclosporine trough blood level plus or minus standard error was not significantly different in patients with and without acute rejection (293+/-21 versus 294+/-11 microg./l.). Mean cyclosporine blood level after 3 hours was significantly lower in patients with acute rejection (1,156+/-90 versus 1,421+/-50, p = 0.008). Cases were divided into tertiles at levels after 3 hours (1,100 and 1,500 microg./l.). The group in which the level after 3 hours was less than 1,100 microg./l. had the highest acute rejection rate (22 of 50 patients, 44%) and a cyclosporine nephrotoxicity rate of 13% (7 of 52 patients). The group in which the level after 3 hours was 1,100 to 1,500 microg./l. had the lowest acute rejection rate (5 of 46 patients, 11%) without increased cyclosporine nephrotoxicity (7 of 52 patients, 13%). A level after 3 hours of greater than 1,500 microg./l. was associated with a rejection rate of 15% (7 of 47 patients) but significantly higher cyclosporine nephrotoxicity (16 of 52 patients, 30%).
Cyclosporine blood level after 3 hours in the early post-transplantation period is associated with acute rejection and cyclosporine nephrotoxicity. A cyclosporine blood level range after 3 hours of 1,100 to 1,500 microg./l. is associated with an optimal outcome. Our data suggest that cyclosporine blood level after 3 hours may represent a better method of monitoring cyclosporine micro-emulsion dose than cyclosporine trough blood level. This hypothesis must be further studied in randomized trials.
closed_qa
Lack of diagnostic tools to prove erectile dysfunction: consequences for reimbursement?
Oral medications for treatment of erectile dysfunction may drastically increase health care expenses. Therefore, reimbursement for treatment will be limited in many countries, and proof of erectile dysfunction on an individual basis may be required. We determined whether erectile dysfunction can be proved by pharmacostimulation tests. We prospectively evaluated 77 consecutive patients with a median age of 54 years (range 25 to 75) who presented with previously untreated erectile dysfunction. Assessment included patient-reported semiquantitative data on sexual erections (rigidity, ability for vaginal intromission, duration), standard clinical and laboratory tests, and an intracavernous injection test and color duplex sonography with 10 microg. intracavernous prostaglandin E1. Data were compared on the basis of the most important complaint, namely whether vaginal intromission was impossible, feasible only with manual assistance, or possible but not long enough for satisfactory sexual performance. Of the 77 patients 36 (47%) were unable to perform vaginal intromission, 28 (37%) needed manual help and 13 (17%) had erections sufficient for penetration but were not satisfied with sexual performance. Patient reports were reliable as shown by the significant correlation of items (r = 0.77) and significant discriminating power among categories for penetration (analysis of variance p<0.001). In contrast, the clinical response to intracavernous pharmacostimulation and flow parameters assessed by color duplex sonography could not discriminate among the groups.
Erectile dysfunction could not be proved by pharmacostimulated erections, whereas relevant erectile dysfunction was honestly reported by patients. New and reliable tests for clinical assessment are required to support applications for reimbursement of treatment expenses for erectile dysfunction.
closed_qa
Should vasectomy reversal be performed in men with older female partners?
An assumption exists that men with older female partners who seek treatment of post-vasectomy infertility should undergo in vitro fertilization (IVF) with intracytoplasmic sperm injection (ICSI) rather than vasectomy reversal. Although several studies have reviewed ICSI success rates with advancing maternal age, to our knowledge none has compared them to outcomes for vasectomy reversal in men with older partners. The records of all patients with ovulating partners older than 37 years who underwent vasectomy reversal from 1994 through 1998 were reviewed. Patients were contacted to establish pregnancy and birth rates. Costs of vasectomy reversal, testicular sperm extraction, IVF and ICSI were obtained from the financial office of our institution. A total of 29 patients underwent vasectomy reversal with a followup of 3 to 59 months (median 25). Median male age was 46 years (range 37 to 67) and median female age was 40 years (range 38 to 48). A total of 5 pregnancies and 4 live births were achieved. In the 23 patients followed for more than 1 year the pregnancy rate was 22% and live birth rate was 17%. Using this 17% birth rate at our $4,850 cost for vasectomy reversal the cost per newborn was $28,530. In comparison, using the 8% birth rate per cycle of ICSI for women older than 36 years at a cost of $8,315 for testicular sperm extraction and 1 cycle of IVF with ICSI, the cost per newborn was estimated at $103,940.
Vasectomy reversal appears to be cost-effective to achieve fertility in men with ovulating partners older than 37 years.
closed_qa
Should the age specific prostate specific antigen cutoff for prostate biopsy be higher for black than for white men older than 50 years?
Investigators who have examined age specific reference ranges recommend a higher prostate specific antigen (PSA) cutoff for biopsy for black than for white men older than 50 years. We controlled for PSA to determine whether age specific reference range cutoffs for diagnosis defined by the Walter Reed Army Medical Center group (Walter Reed group) would improve the disproportionate prostate cancer prognosis between black and white men. We studied 651 consecutive patients who underwent radical prostatectomy at Wayne State University between 1991 and 1995 with a mean followup of 34 months (range 1.5 to 75). Log rank tests were used to determine the homogeneity of survival functions between black and white men with similar PSA ranges, and between groups defined by age specific PSA reference ranges for each race. Disease stage and grade were similar or worse in black men for any PSA range, and biochemical disease-free survival was similar or worse within each range. Black men had a higher percentage of high grade prostate cancer than white men 60 to 69 years old who would not have undergone biopsy using the Walter Reed group proposed PSA cutoff.
Black men have similar or worse prostate cancer severity and outcome than white men with similar PSA ranges. Using age specific reference ranges for the PSA test defined by the Walter Reed group, black men have worse outcome than white men after radical prostatectomy. Therefore, we recommend that the PSA cutoff for biopsy should not be higher for black men at any age range.
closed_qa
Is surgery for large hepatocellular carcinoma justified?
Most hepatocellular carcinomas are still discovered at an advanced stage and are left untreated as large hepatocellular carcinomas are contraindications to liver transplantation and percutaneous ethanol injection and are usually considered as poor indications for liver resection. The aim of this study was to reassess the results of surgery in these patients. Between 1984 and 1996, 256 patients underwent resection of biopsy-proven, non-fibrolamellar hepatocellular carcinoma. Of these, 121 had a tumour diameter of less than 5 cm (small hepatocellular carcinomas) and 94 a tumour diameter of more than 8 cm (large hepatocellular carcinomas). The short- and long-term outcome of patients with small and large hepatocellular carcinomas were compared. The in-hospital mortality rate following resection of small and large hepatocellular carcinomas was comparable (11.5 vs. 10.6%), even after stratifying for the presence and severity of an underlying liver disease. In patients with a chronic liver disease, large hepatocellular carcinomas were associated with a greater risk of death and recurrence during the first 2 operative years. In the long term, however (3-5 years), survival and disease-free survival following resection of small and large hepatocellular carcinomas were comparable (34 vs. 31% and 25 vs. 21% at 5 years). Similarly, treatment of and survival after the onset of recurrence were not influenced by the size of the initial tumour.
Patients with large hepatocellular carcinomas should not be abandoned and should be considered for liver resection as this treatment may be associated with an in-hospital mortality rate and a long-term survival comparable to that observed after resection of small hepatocellular carcinomas.
closed_qa
DSM-IV substance abuse and dependence: are there really two dimensions of substance use disorders in adolescents?
Data were collected using the 1995 Minnesota Student Survey. Survey items were designed to correspond to DSM-IV diagnostic criteria for substance abuse and dependence. Public schools, alternative schools and area learning centers. Of the 78,800 students between the ages of 14 and 18 years who completed the survey, 18,803 reported substance use and at least one substance use disorder diagnostic criterion during the previous 12 months and were used for the analyses. The sample was divided randomly into two groups in order to conduct data analyses on one group (n = 9490) and confirm the findings in the other group (n = 9313). Confirmatory factor analyses were conducted to test three competing factor structure models consisting of a single factor model, a two-factor model of distinct dimensions and a two-factor model with interrelated dimensions. The single factor and correlated two-factor models had similar parameter estimates and fit the data better than the competing two-factor model with distinct dimensions. Findings were confirmed in a second sample.
The study findings indicate that DSM-IV substance abuse and dependence criteria may be more optimally structured as a unidimensional construct rather than as bidimensional constructs for adolescents.
closed_qa
Complications in the first year of laparoscopic gastric banding: is it acceptable?
From December 1997 to December 1998, 25 laparoscopic adjustable silicone gastric banding (LASGB) procedures were done without previous experience in bariatric surgery. Body mass index (BMI) ranged from 37 to 57 kg/m2 (average 45.5 kg/m2). Retrospective analysis of the 1-year experience was done. Operating time was measured, and BMI and complications were reviewed. Five complications were observed, a complication rate of 20%. On two occasions, it was gastric wall slippage, and both were corrected laparoscopically. In one patient, intussusception of the gastric wall through the band occurred after profuse vomiting. Removal of the band was necessary, with conversion to an open procedure. On two occasions, infection of the port-site was observed; in one of these patients, port removal was necessary. No antibiotic prophylaxis was used.
Despite the lack of experience in bariatric surgery of these laparoscopic surgeons, the complication rate with LASGB appears to be acceptable, although prior bariatric surgical experience is preferable.
closed_qa
Fingerstick Helicobacter pylori antibody test: better than laboratory serological testing?
Antibody testing is the recommended method to screen for Helicobacter pylori (H. pylori) infection. Whole-blood fingerstick antibody tests are simple, in-office tests providing rapid results, but the accuracy of first-generation tests was lower than other diagnostic tests. We assessed a new whole-blood antibody test, using endoscopic biopsy tests as a "gold standard," and compared it with a laboratory quantitative serological test. Two hundred-one patients not previously treated for H. pylori who were undergoing endoscopy had gastric biopsies for rapid urease test and histological examination; whole-blood antibody tests and quantitative serological tests were also performed. Two separate gold standards for H. pylori infection were employed: either rapid urease test or histological exam positive; and both rapid urease test and histological exam positive. Sensitivities for whole-blood test versus quantitative serology with gold standard 1 (either biopsy test positive) were 86% versus 92% (95% confidence interval [CI] of difference, -2 to 14%; p = 0.19) and specificities were 88% versus 77% (95% CI of difference, 0.4 to 22%; p = 0.052). Sensitivities with gold standard 2 (both biopsy tests positive) were 90% versus 94% (95% CI of difference, -4 to 12%; p = 0.41) and specificities were 79% versus 67% (95% CI of difference, 1 to 24%; p = 0.048).
New generation in-office, whole-blood antibody tests that can achieve a sensitivity and specificity similar to or better than those of widely used quantitative laboratory serological tests may be used as the initial screening tests of choice for H. pylori.
closed_qa
Mortality in rheumatoid arthritis: have we made an impact in 4 decades?
To evaluate trends in survival among patients with rheumatoid arthritis (RA) over the past 4 decades. Three population based prevalence cohorts of all Rochester, Minnesota, residents age ≥35 years with RA (1987 American College of Rheumatology criteria) on January 1, 1965, January 1, 1975, and January 1, 1985; and an incidence cohort of all new cases of RA occurring in the same population between January 1, 1955 and January 1, 1985, were followed longitudinally through their entire medical records (including all inpatient and outpatient care by any provider) until death or migration from the county. Mortality was described using the Kaplan-Meier method and the influence of age, sex, rheumatoid factor (RF) positivity, and comorbidity (using the Charlson Comorbidity Index) on mortality was analyzed using Cox proportional hazards models. Mortality was statistically significantly worse than expected for each of the cohorts (overall p<0.0001). A trend toward increased mortality in the 1975 and 1985 prevalence cohorts compared to the 1965 prevalence cohort was present, even after adjusting for significant predictors of mortality (age, RF positivity, and comorbidity). Survival for the general population of Rochester residents of similar age and sex improved in 1975 compared to 1965, and in 1985 compared to 1975.
The excess mortality associated with RA has not changed in 4 decades. Moreover, people with RA have not enjoyed the same improvements in survival experienced by their non-RA peers. More attention should be paid to mortality as an outcome measure in RA.
closed_qa
Does social integration confound the relation between alcohol consumption and mortality in the Multiple Risk Factor Intervention Trial (MRFIT)?
It has been proposed that social integration would act as a confounder in the relationship between alcohol consumption and all-cause mortality. This study tested the assumption that the J-shaped relationship between drinking and all-cause mortality may partly reflect a protective effect of social integration, to the extent that moderate drinkers are more socially integrated than either abstainers or heavy drinkers, and to the extent that social integration offers direct protection from mortality. This hypothesis was tested using data from 10,832 of the 12,866 men in the Multiple Risk Factor Intervention Trial (MRFIT). Indicators of social integration were derived from an exploratory factor analysis of 25 relevant items in the MRFIT data and from a scale of six items selected by the investigators. We failed to confirm a direct protective effect of social integration. Nondrinkers had the highest rates of all-cause mortality. Compared with heavy drinking, relative risks of all-cause mortality for abstinence, light and moderate drinking were unaffected by inclusion of social integration variables in the proportional hazards models.
The MRFIT data fail to confirm a confounding effect of social integration.
closed_qa
Prostaglandin E1: a new agent for the prevention of renal dysfunction in high risk patients caused by radiocontrast media?
Acute renal failure following the administration of radiocontrast media (RCM) is a complication found especially in patients with impaired renal function. Within the limits of a pilot study, the objective was to (a) show the effectiveness and compatibility of prostaglandin E1 (PGE1, alprostadil) in preventing acute renal failure in patients with elevated levels of serum creatinine and (b) to identify the most appropriate PGE1 dose. 130 patients with renal impairment (serum creatinine ≥1.5 mg/dl) were included in the study prior to intravascular RCM injection. The patients received one of three different doses of PGE1 (10, 20, or 40 ng/kg bodyweight/min) or placebo (physiologic sodium chloride solution) intravenously over a time period of 6 h (beginning 1 h prior to RCM application). Serum creatinine was measured 12, 24, and 48 h post RCM-application and creatinine clearance was determined with two 12 h collection periods, as well as one 24 h collection within 48 h post RCM administration. Adverse events during PGE1 administration were recorded. In the placebo group, the mean elevation of serum creatinine was markedly higher (0.72 mg/dl) 48 h after RCM administration compared with the three PGE1 groups (0.3 mg/dl in the 10 ng/kg/min group, 0.12 mg/dl in the 20 ng/kg/min group, and 0.29 mg/dl in the 40 ng/kg/min group). No clinically relevant changes were seen regarding the creatinine clearance in the four groups examined.
Results from this pilot study suggest that intravenous PGE1 may be used efficaciously and safely to prevent RCM-induced renal dysfunction in patients with pre-existing impaired renal function.
closed_qa
Can nurses screen all outpatients?
This paper outlines and evaluates a nurse based model for screening outpatients that is utilized in our free standing Surgical Day Care Centre (SDCC). For 668 outpatients presenting at our SDCC, the attending anesthesiologist completed a study survey that was designed to identify: completeness of history; important concerns as judged by the pre-admission nurse; whether the patient was seen in the anesthesia preadmission clinic (PAC) for a consultation; if there was a delay in SDCC, the duration and reasons for the delay; whether in the opinion of the attending anesthesiologist the patient should have had an anesthetic consultation; whether the patient was canceled and the reason for cancellation. A nurse based model for screening all outpatients in a university affiliated tertiary hospital day care unit had an accuracy of 81%, specificity of 86%, sensitivity of 46% and a negative predictive value of 92%. The cancellation rate with this model was 1.4%(8/551) and the case delay rate was 3.4%(19/551). The referral rate to anesthesiology staff was 17.5%(117/668) and the referral rate to the PAC for anesthetic consultation was 5.1%(34/668).
The use of the nurse based model allowed for the efficient use of anesthesia and surgical day care centre resources. The model was better at 'ruling out' patients who do not need to be seen by anesthesiology ahead of the day of surgery rather than 'ruling in' patients who need to be seen by anesthesiology.
closed_qa
Surgical management of renal trauma: is vascular control necessary?
To assess in a randomized prospective manner nephrectomy rate, transfusion rate, blood loss, and time of operation in penetrating renal trauma patients randomized to vascular control or no vascular control before opening Gerota's fascia. During a 53-month period from January of 1994 to May of 1998, 56 patients with penetrating renal injuries were entered into a randomized prospective study at an urban Level I trauma center. The patients were randomized to a preliminary vascular control group or no vascular control group. Randomization was performed intraoperatively before opening Gerota's fascia. All renal injuries were identified and diagnosed intraoperatively. Intravenous pyelography was not performed preoperatively. If the patient was randomized to the no control group and significant bleeding ensued after opening of Gerota's fascia, the renal hilum was cross-clamped. All injuries were included regardless of patient age, associated injuries, blood loss, severity of renal injury, or other abdominal organs injured. All injuries that required renorrhaphy or partial nephrectomy underwent drainage with closed Jackson-Pratt drainage. Twenty-nine patients were randomized to the preliminary vascular control group, and 27 patients were randomized to the no vascular control group. The average age in the vascular control group was 25.3 years (SD, 10.9) and 23.4 years (SD, 8.2) in the no control group. The average penetrating abdominal trauma index in the vascular control group was 22.9 (SD, 10.9) and in the no control group 23.7 (SD, 13.7). Nine nephrectomies (31%) were performed in the vascular control group, and eight nephrectomies (30%) were performed in the no vascular control group (p>0.05). The average operative time for the vascular control group was 127 minutes and for the no control group was 113 minutes (p>0.05). 
Eleven patients (38%) required intraoperative blood transfusion in the vascular control group (average, 5.5 U/patient transfused) versus eight patients (30%) in the no vascular control group (average, 5.2 U/patient transfused) (p>0.05). The average blood loss in the vascular control group was 1.06 liters versus 0.91 liters in the no control (p>0.05). There was one mortality in the study population.
Vascular control of the renal hilum before opening Gerota's fascia has no impact on nephrectomy rate, transfusion requirements, or blood loss. Operative time may be increased with the vascular control technique.
closed_qa
A preliminary investigation into the use of virtual environments in memory retraining after vascular brain injury: indications for future strategy?
In a preliminary investigation of the use of Virtual Environments (VEs) in neurorehabilitation, this study compares the effects of active and passive experience of a VE on two types of memory in vascular brain injury patients and controls. Forty-eight patients with vascular brain injury and 48 non-impaired control participants were randomly assigned to active and passive VE conditions. The active participants explored a virtual bungalow seeking a particular object; the passive participants observed, but did not control movement through the VE, also seeking the object. Afterwards, both active and passive participants completed spatial recognition and object recognition tests. Expectedly, the patients were impaired relative to the controls but were able to perform the virtual tasks. Active participation in the VE enhanced memory for its spatial layout in both patients and controls. On object recognition, active and passive patients performed similarly, but passive controls performed better than active controls.
The findings are discussed in relation to their implications for memory rehabilitation strategies.
closed_qa
Should the patient with an interatrial defect recognized in adulthood always be operated on?
Atrial septal defect (ASD) can be recognized in adult age, mostly in asymptomatic or scarcely symptomatic patients. These patients differ from patients in "historical" clinical series, in whom diagnosis was done on the basis of clinical evidence, and their natural history is probably different. Our aim was to verify retrospectively results of surgery versus medical follow-up in an adult population with ASD with age at first diagnosis ≥30 years. Seventy-two patients with ASD, 52 females (72%), observed at our Institution since 1978, were considered. Mean age at diagnosis was 48 +/- 12 years (range 30-79); 36 patients (50%, group A) are still on medical therapy, 36 patients (group B) were operated. As groups A and B did not differ significantly in any demographic, clinical or echocardiographic parameter, they were compared for the incidence of complications. During follow-up (100 +/- 70 months, range 12-240), the incidence of major clinical events showed no significant differences in the two groups, as cardiac death or cardiovascular complications (cerebral ischemic events, severe mitral insufficiency, reoperation) occurred in 4 patients in group A (11%) and in 4 patients in group B (11%). Worsening of NYHA class was observed in 3 patients from group A (8%) and 2 patients from group B (5.5%; p = ns). New onset of supraventricular arrhythmias occurred more frequently in group B (14 patients, 39%) than in group A (5 patients, 14%) (p = 0.01; OR = 3.9; CI 95%: 1.2-12.6).
In an adult population affected with asymptomatic or mildly symptomatic ASD and age at first diagnosis ≥30 years, surgical closure of the defect did not modify morbidity and mortality at a mid-term follow-up. We suggest that, mostly in older asymptomatic patients, surgery should not be a routine choice and clinical decision-making should be individualized in each case.
closed_qa
Parathyroid adenomas: is bilateral neck exploration necessary?
The traditional surgical treatment for primary hyperparathyroidism is bilateral neck exploration with identification of all parathyroid glands. Multiple investigators who recommend initial unilateral neck exploration based on more advanced localization studies have recently challenged this approach. We reviewed our experience with primary hyperparathyroidism to determine if localization study-aided unilateral neck exploration is sufficient for a cure. Retrospective chart review of patients with primary hyperparathyroidism. Sixty-eight patients underwent surgery for primary hyperparathyroidism. Forty-four patients were treated with localization study-aided unilateral neck exploration, and 24 patients were treated with bilateral neck exploration without preoperative localization studies. The most successful preoperative localization study was the technetium 99m sestamibi (T99mS) scan which correctly identified the location of adenomas in all cases in which it was used (n = 15). All patients were treated with unilateral neck exploration and were cured. This success was matched only by surgical exploration (n = 24).
Unilateral neck exploration based on the results of a T99mS scan can be used as an initial approach for primary hyperparathyroidism if the scan identifies a solitary lesion. The second gland on the same side of the lesion should be biopsied, and if it is normal, the opposite side of the neck may be left undisturbed. If the second gland is not normal, or if the T99mS scan shows multiple lesions, bilateral neck exploration should be performed.
closed_qa
Does axillary dissection affect prognosis in T1 breast tumors?
The treatment of patients with breast cancer has undergone many revisions over recent decades. The current trend is toward limited resections and breast conservation. Some authors advocate the abandonment of axillary lymph node dissection (ALND) for small tumors. While it is accepted that ALND has no therapeutic effect in breast cancer patients, its prognostic significance for small tumors is debated. Eligibility criteria for surgical treatment without axillary dissection are evolving. Considering that problem, we retrospectively reviewed the charts of 100 patients with T1 invasive carcinoma of the breast treated at Hippokration Hospital of Athens between 1986 and 1987. Patients were divided into two groups: those that underwent ALND (n=76) and those that did not (n=24). The following data were recorded: age, tumor size, grade, hormone receptor status and postoperative treatment. The ten-year overall and disease-free survival were analysed. A multivariate analysis was used to identify prognostic variables. There was no statistically significant difference in the ten-year overall and disease-free survival between the two groups. The univariate analysis showed that tumor size predicts both recurrence and survival. In the multivariate analysis tumor size was found to be an independent prognostic factor for overall survival.
ALND did not influence the ten-year survival or the recurrence rate. Tumor size was the only statistically significant and independent prognostic factor for T1 breast cancer patients.
closed_qa
Pure stress leakage symptomatology: is it safe to discount detrusor instability?
To determine whether the combination of a urological history and urinary diary, with rigorous selection criteria, can be used to define a group of women on whom urodynamic assessment is unnecessary prior to offering surgery for urinary stress incontinence. Retrospective review of the urodynamic records of women attending for assessment between January 1992 and December 1996. Urodynamic Department, Southmead Hospital, Bristol. 5193 women who attended the urodynamic clinic during the five year study period. Self-completion of a urinary diary in the preceding week before urodynamic assessment and a detailed urological history before undergoing cystometry by all women in the study period. Data were entered onto a computer database. Women reporting stress incontinence in the absence of bladder filling symptoms, with a normal urinary diary showing daytime frequency of seven times or less and nocturia of no more than once, had the results of their filling cystometry analysed. Of 5193 women, 555 had symptoms of pure stress incontinence and a normal urinary diary. Incontinence was confirmed objectively in 81%, with 9% having incontinence secondary to detrusor instability; 5% had detrusor instability as the sole cause of their incontinence with 4% having a mixed picture of detrusor instability incontinence and urethral sphincter weakness.
Genuine stress incontinence cannot be diagnosed reliably from a urological history, even when rigorous selection criteria are used in combination with a normal urinary diary. Without cystometry, incontinence secondary to detrusor instability will be missed.
closed_qa
Is an ultrasound assessment of gestational age at the first antenatal visit of value?
To assess the efficacy of an ultrasound scan at the first antenatal visit. Randomised clinical trial. Women's and Children's tertiary level hospital, Adelaide, Australia. Six hundred and forty-eight women attending for their first antenatal visit at less than 17 weeks of gestation who had no previous ultrasound scan in the pregnancy, who were expected to give birth at the hospital, and for whom there was no indication for an ultrasound at their first visit. Eligible consenting women were enrolled by telephone randomisation into either the ultrasound at first visit group, who had an ultrasound at the time of their first antenatal visit, or the control group in whom no ultrasound assessment was done at their first antenatal visit. Both groups of women completed a questionnaire at the end of the first visit on their feelings towards the pregnancy and anxiety levels. Data were collected on details of any ultrasound assessments, including the 18 to 20 weeks morphology scan, and pregnancy outcome. All primary analyses were on an intention-to-treat basis. The number of women who needed adjustment in dates of 10 days or more on the basis of their 18 to 20 weeks ultrasound morphology scan, who were booked for their morphology scan at sub-optimal gestations, who had a repeat of their maternal serum screening test, or who felt worried about their pregnancy at the end of the first antenatal visit. Fewer women (9%) in the ultrasound at first visit group needed adjustment of their expected date of delivery as a result of the 18 to 20 week ultrasound, compared with 18% of women in the control group (RR 0.52, 95% CI 0.34-0.79; P = 0.002). The number of women who had the 18 to 20 week ultrasound assessment timed suboptimally was similar to that in the control group (16% vs. 21%), as was the number of women who had a repeat blood sample taken for maternal serum screening (6% vs. 6%). 
Fewer women in the ultrasound at first visit group reported feeling worried about their pregnancy (RR 0.80, 95% CI 0.65-0.99; P = 0.04) or not feeling relaxed about their pregnancy (RR 0.73, 95% CI 0.56-0.96; P = 0.02), compared with women in the control group.
A routine ultrasound assessment for dating offered to women at the first antenatal visit provides more precise estimates of gestational age and reduces the need to adjust the estimate of the date of delivery in mid-gestation. Women who had an ultrasound at the first visit reported more positive feelings about their pregnancy, compared with women in the control group at that time.
closed_qa
Does an inflatable obstetric belt facilitate spontaneous vaginal delivery in nulliparae with epidural analgesia?
To assess whether an inflatable obstetric belt, synchronised to apply uniform fundal pressure during a uterine contraction, reduces operative delivery rates when used in the second stage of labour. Randomised controlled trial. Five hundred nulliparae with a singleton cephalic pregnancy at term and with an epidural in labour were recruited during the first stage and randomised at full dilatation. Standard care involved one hour passive second stage and one hour active pushing after which instrumental delivery was performed if delivery was not imminent. Those randomised to the belt group, in addition to standard care, had the inflatable obstetric belt for the whole second stage of labour. Mode of delivery. One hundred and eleven of the 260 women in the belt group (42.7%) compared with 94 of the 240 in the control group (39.2%) had a spontaneous vertex delivery (P = 0.423). The lift-out instrumental delivery rate was similar between the two groups: 108 (41.5%) in the belt group compared with 101 (42.1%) in the control group (P = 0.902), whereas rotational instrumental deliveries occurred in 26 (10%) in the belt group compared with 36 (15%) in the control group (P = 0.09). Fifteen women (5.8%) in the belt group and nine women (3.8%) in the control group had a caesarean section in the second stage (P = 0.292). An intact perineum was more likely in the belt group (16.5% compared with 9.6%, P = 0.022) as was a third degree tear (6.5% compared with 0.4%, P = 0.001).
The inflatable obstetric belt did not significantly reduce operative delivery rates when used in this clinical setting in the second stage of labour.
closed_qa
Isolated fetal echogenic intracardiac foci or golf balls: is karyotyping for Down's syndrome indicated?
To determine the prevalence of isolated echogenic intracardiac foci and the subsequent risk for Down's syndrome at 18-23 weeks in an unselected obstetric population. Prospective study. A district general hospital serving a routine obstetric population. 16,917 pregnant women who underwent a routine ultrasound screening at 18-23 weeks of gestation between November 1994 and August 1998. All women were offered screening for Down's syndrome by nuchal translucency or maternal serum biochemistry. The prevalence of isolated echogenic intracardiac foci was determined and the relative risk for Down's syndrome was calculated for different ultrasound findings. The combined sensitivity of age, nuchal translucency and maternal serum biochemistry for Down's syndrome was 84% (27/32). The relative risk for Down's syndrome was 0.17 (95% CI 0.07-0.41) for the women with normal scan findings at 18-23 weeks. The prevalence of isolated echogenic intracardiac foci at 18-23 weeks was 0.9% (144/16,917). None of these pregnancies were affected by Down's syndrome.
The significance of the association between isolated echogenic intracardiac foci and Down's syndrome is a matter of ongoing debate. The data of this study suggest that in an unselected obstetric population with prior, effective, routine Down's syndrome screening, the association between isolated echogenic intracardiac foci and Down's syndrome is no longer significant.
closed_qa
Preparation of the internal thoracic artery by vasodilator drugs: is it really necessary?
The internal thoracic artery has become the conduit of choice for coronary artery bypass grafting. To avoid spasm of the artery, and to increase its diameter and flow, various vasodilators have been used either intraluminally or by topical application by different surgeons. In order to define the best vasodilating agent for preparation of the internal thoracic artery, a randomized double-blind placebo-controlled clinical study was performed in a group of patients submitted for elective coronary artery bypass grafting. Eighty consecutive patients submitted for elective first time coronary artery bypass grafting were randomly subdivided into five treatment groups. Free flow of the left internal thoracic artery was measured using an electromagnetic flow meter. The first measurement was performed shortly after the internal thoracic artery was dissected from the chest wall and the second just prior to performing distal anastomosis to the left anterior descending coronary artery. During the time interval between the two measurements the internal thoracic artery was immersed in a special applicator tube containing 20 ml solution of one of the following drugs: papaverine 2 mg/ml, nitroglycerin 1 mg/ml, verapamil 0.5 mg/ml, nitroprusside 0.5 mg/ml, normal saline 0.9%. No statistically significant differences were found between the groups in respect to age, body surface area, bypass time, cross clamping time, and time interval between the two flow measurements. Mean arterial pressure at the time of the first and second internal thoracic artery flow measurements did not show statistically significant differences either within or between the groups. In all five groups, the free flow of the internal thoracic artery increased significantly with time. However, no statistically significant differences were shown between the five groups with respect to second flow (P = 0.2).
Within the limits of our study design, we suggest that preparation of the LITA by topical vasodilator drugs using a special applicator tube does not result in a significantly superior free flow than placebo.
closed_qa
The medical review article revisited: has the science improved?
The validity of a review depends on its methodologic quality. To determine the methodologic quality of recently published review articles. Critical appraisal. All reviews of clinical topics published in six general medical journals in 1996. Explicit criteria that have been published and validated were used. Of 158 review articles, only 2 satisfied all 10 methodologic criteria (median number of criteria satisfied, 1). Less than a quarter of the articles described how evidence was identified, evaluated, or integrated; 34% addressed a focused clinical question; and 39% identified gaps in existing knowledge. Of the 111 reviews that made treatment recommendations, 48% provided an estimate of the magnitude of potential benefits (and 34%, the potential adverse effects) of the treatment options, 45% cited randomized clinical trials to support their recommendations, and only 6% made any reference to costs.
The methodologic quality of clinical review articles is highly variable, and many of these articles do not specify systematic methods.
closed_qa
Does a glass of red wine improve endothelial function?
To examine the acute effect of red wine and de-alcoholized red wine on endothelial function. High frequency ultrasound was used to measure blood flow and percentage brachial artery dilatation after reactive hyperaemia induced by forearm cuff occlusion in 12 healthy subjects, less than 40 years of age, without known cardiovascular risk factors. The subjects drank 250 ml of red wine with or without alcohol over 10 min according to a randomized procedure. Brachial artery dilatation was measured again 30 and 60 min after the subjects had finished drinking. The subjects were studied a second time within a week of the first study in a cross-over design. After the red wine with alcohol the resting brachial artery diameter, resting blood flow, heart rate and plasma-ethanol increased significantly. After the de-alcoholized red wine these parameters were unchanged. Flow-mediated dilatation of the brachial artery was significantly higher (P<0.05) after drinking de-alcoholized red wine (5.6+/-3.2%) than after drinking red wine with alcohol (3.6+/-2.2%) and before drinking (3.9+/-2.5%).
After ingestion of red wine with alcohol the brachial artery dilated and the blood flow increased. These changes were not observed following the de-alcoholized red wine and were thus attributable to ethanol. These haemodynamic changes may have concealed an effect on flow-mediated brachial artery dilatation which did not increase after drinking red wine with alcohol. Flow-mediated dilatation of the brachial artery increased significantly after de-alcoholized red wine and this finding may support the hypothesis that antioxidant qualities of red wine, rather than ethanol in itself, may protect against cardiovascular disease.
closed_qa
Does a completely accomplished duplex-based surveillance prevent vein-graft failure?
To assess the benefits of duplex-based vein-graft surveillance over clinical surveillance with distal pressure measurements. Prospective randomised comparative trial. Three hundred and forty-four patients with 362 consecutive infrainguinal vein bypasses were prospectively randomised to a follow-up regime with or without duplex scanning (ABI group and DD group) at 1, 3, 6, 9, and 12 months postoperatively. One hundred and eighty-three grafts were enrolled to the ABI group and 179 to the DD group. The primary assisted patency, secondary patency and limb salvage rates were 67%, 74%, 85% for the ABI group and 67%, 73%, 81% for the DD group. Ninety grafts in the ABI group and 57 in the DD group had surveillance that completely adhered to the protocol. The outcome was also similar for these groups at one year (77%, 87%, 94% and 77%, 83%, 93%, respectively), although grafts were revised more frequently in the DD group.
Intensive surveillance with duplex scanning did not improve the results for any of the outcome criteria examined. To demonstrate any potential benefit of duplex scanning for vein-graft surveillance, a multicentre study with enough patients to ensure sufficient power is needed.
closed_qa
Do prostaglandins have a salutary role in skeletal muscle ischaemia-reperfusion injury?
The effects of prostaglandins (PG) E1 and E2 and the prostacyclin analogue iloprost, with and without the addition of the free-radical scavengers catalase and superoxide dismutase, on gastrocnemius blood flow and oedema were studied in a rodent model of hindlimb ischaemia-reperfusion. Male Sprague-Dawley rats underwent 6 h of hindlimb ischaemia with 4 h of reperfusion. Prostaglandins were infused prior to reperfusion and their effects on limb blood flow and oedema were examined. Control animals exhibited a triphasic pattern of muscle blood flow during reperfusion compared to normal animals. PGE1 did not abolish low reflow at 10 min; relative reperfusion was preserved and reperfusion injury was abolished at 120 min. Muscle blood flow was increased at 240 min compared to controls. Increased limb swelling was also seen. Addition of free-radical scavengers caused the abolition of low reflow. Similar results were seen with iloprost. PGE2 abolished low reflow at 10 min and increased perfusion at 120 min but did not prevent reperfusion injury at 240 min.
PGE1 and iloprost enhanced muscle blood flow at 4 h of reperfusion, though neither abolished low reflow; PGE2 improved flow at 10 and 120 min but not at 240 min. This study demonstrates a potentially beneficial role for prostaglandins in improving muscle blood flow in skeletal muscle ischaemia-reperfusion injury.
closed_qa
Occurrence of hippocampal sclerosis: is one hemisphere or gender more vulnerable?
We analyzed a large group of patients investigated for suspected seizures to test whether gender or side are important factors in the origins of hippocampal sclerosis (HS). We studied 996 consecutive patients (48% men, 52% women) by using standard hippocampal T2-relaxometry methods. HS was associated with a highly abnormal T2 time (≤113 ms). Categoric analysis showed that hippocampal T2 time was independent of gender and side. T2 time was bilaterally normal in 81% of men and in 79% of women; it was unilaterally abnormal in 15% of both men and women; and bilaterally abnormal in 4% of men and in 6% of women. Highly abnormal T2 relaxometry, suggesting HS, occurred with equal frequency in men and women and on the right and left sides. Quantitative analysis of hippocampal T2 times showed values not differing significantly between men and women or between the right and left hemispheres. There was no significant interaction between gender and side.
In patients with seizure disorders, hippocampal T2 relaxometry is not different in adult men and women and in the right and left hemispheres.
closed_qa
Can interpectoral nodes be sentinel nodes?
This study was designed to determine if interpectoral nodes could be sentinel nodes for some breast cancers. Thirty-five consecutive breast cancer patients undergoing axillary node dissection had a dissection of the interpectoral nodes. These were sent to pathology as a separate specimen. Three patients were identified with isolated interpectoral nodal metastasis.
In upper quadrants or deep breast cancers the interpectoral nodes may be the earliest site of nodal metastasis. This may lead to false negative results in some sentinel node biopsies.
closed_qa
Do subjects with asthma have greater perception of acute bronchoconstriction than smokers with airflow limitation?
Smokers who develop chronic airflow limitation (CAL) do not usually present for medical attention until their lung disease is well advanced. In contrast, asthmatic subjects experience acute symptoms and present for care early in the course of their disease. The aim of this study was to determine whether subjects with asthma differ from smokers with CAL in their ability to perceive acute methacholine-induced bronchoconstriction. Thirteen subjects with diagnosed asthma and 10 current smokers with CAL, defined as forced expiratory volume in 1 s (FEV1) <75% predicted and FEV1/forced vital capacity <80%, with no previous diagnosis of asthma, were challenged with methacholine. Symptom severity was recorded on a Borg scale. Lung volumes were measured before challenge and after the FEV1 had fallen by 20%. After methacholine, falls in FEV1 were similar in the asthmatic subjects and smokers. The regression lines relating change in FEV1 to symptom score were significantly steeper in asthmatic subjects than in smokers (0.13 +/- 0.04 v. 0.03 +/- 0.04, respectively; P<0.01). At a 20% fall in FEV1 there were no significant differences between asthmatic subjects and smokers in the magnitude of change of lung volumes.
In asthmatic subjects, symptoms are closely related to change in FEV1. In smokers with CAL, symptoms change little during bronchial challenge despite large changes in FEV1. The differences in perception between the two subject groups are not due to differences in acute hyperinflation during challenge. We propose that heavy smokers may adapt to poor lung function, or may have damaged sensory nerves as a result of prolonged cigarette smoking.
closed_qa
Is there a relationship between serum S-100beta protein and neuropsychologic dysfunction after cardiopulmonary bypass?
Over the past decade, the glial protein S-100beta has been used to detect cerebral injury in a number of clinical settings including cardiac surgery. Previous investigations suggest that S-100beta is capable of identifying patients with cerebral dysfunction after cardiopulmonary bypass. Whether detection of elevated levels of S-100beta reflects long-term cognitive impairment remains to be shown. The present study evaluated whether perioperative release of S-100beta after coronary artery operations with cardiopulmonary bypass could predict early or late neuropsychologic impairment. A total of 100 patients undergoing elective coronary bypass without a previous history of neurologic events were prospectively studied. To exclude noncerebral sources of S-100beta, we did not use cardiotomy suction or retransfusion of shed mediastinal blood. Serial perioperative measurements of S-100beta were performed with the use of a new sensitive immunoluminometric assay up to 8 hours after the operation. Patients underwent cognitive testing on a battery of 11 tests before the operation, before discharge from the hospital, and 3 months later. No significant correlation was found between S-100beta release and neuropsychologic measures either 5 days or 3 months after the operation.
Despite using a sensitive immunoluminometric assay of S-100beta, we found no evidence to support the suggestion that early release of S-100beta may reflect long-term neurologic injury capable of producing cognitive impairment.
closed_qa
Is herniography an effective and safe investigation?
The records of all patients undergoing herniography within one unit over a 1-year period were studied retrospectively. A follow-up postal questionnaire was sent to all patients enquiring about outcome and any complications of herniography. Of a total of 64 patients undergoing a herniogram, 36% were found to have a positive result and 64% a negative result. This study showed a sensitivity of 0.94 and a specificity of 0.95. There was a 5% major complication rate leading to hospital admission, and 42% of patients described minor complications occurring within 24 hours of herniography.
Herniography is a useful diagnostic tool for identification of clinically occult hernias, with good rates of sensitivity and specificity. In most cases it is a safe investigation but it is not without a significant complication rate.
closed_qa
Does exchanging comments of Indian and non-Indian reviewers improve the quality of manuscript reviews?
The quality of peer reviewing in developing countries is thought to be poor. To examine whether this was so, we compared the performance of Indian and non-Indian reviewers who were sent original and review articles submitted to The National Medical Journal of India. We also tested whether informing reviewers that their comments would be exchanged improved the quality of their reviews. In a prospective, randomized, blinded study, we sent 100 manuscripts to pairs of peer reviewers (Indian and non-Indian), of which 78 pairs of completed replies were available for analysis. Thirty-eight pairs of reviews were exchanged and 40 were not. The quality of the reviews was assessed by two editors who were unaware of the reviewers' nationality and whether they had been told that their reviews would be exchanged. The quality of the reviews was scored out of 100 (based on a predesigned evaluation proforma). We also measured the time taken to return a manuscript. Overall, non-Indian reviewers scored higher than Indians (mean scores, non-Indians v. Indians: 56.7 v. 48.6; p<0.001), especially those in the non-exchanged group (58.4 v. 47.3, p<0.001) but not the exchanged group (54.8 v. 50.0, p<0.06). Being informed that reviews would be exchanged did not affect the quality of reviews by non-Indians (54.8 exchanged v. 58.4 non-exchanged) or of reviews by Indians (50.0 exchanged v. 47.3 non-exchanged). The editors' assessments of the reviewers matched well (r = 0.59, p<0.001). Non-Indians took the same amount of time as Indians to return their reviews, although the postage time was at least eight days longer.
We found that non-Indian peer reviewers were better than Indians and informing them that their views would be exchanged did not seem to affect the quality of their reviews. We suggest that Indian editors should also use non-Indian reviewers and start training programmes to improve the quality of peer reviews in India.
closed_qa
Radiation exposure during fluoroscopy: should we be protecting our thyroids?
Recent reports on thyroid cancer among Australian orthopaedic surgeons prompted the present study, which sought to evaluate the effectiveness of lead shielding in reducing radiation exposure (RE) to the thyroid region during endo-urological procedures. Radiation exposure to the thyroid region of the surgeon and scrubbed nurse was monitored for 20 consecutive operations over a 6-week period by thermoluminescent dosimeters (TLD). A TLD was placed over and underneath a thyroid shield of 0.5 mm lead-equivalent thickness to monitor the effect of shielding. Eight percutaneous nephrolithotomies, seven retrograde pyelograms and ureteric stentings and five ureteroscopies for calculous disease were monitored. Total exposure time was 63.1 min. For the surgeon, the total cumulative RE over and under the lead shield was 0.46 and 0.02 mSv, respectively, equating to a 23-fold reduction in RE when shielding was used. This effectively reduced RE to almost background levels, as represented by the control TLD exposure (0.01 mSv).
Although RE without thyroid shields did not exceed current standards set by radiation safety authorities, no threshold level has been set below which thyroid carcinogenesis is unlikely to occur. Because lead shields are easy to wear and can effectively reduce RE to the thyroid region to near-background levels, they should be made easily available and used by all surgeons to avoid the harmful effects of radiation on the thyroid.
closed_qa
Gender differences in musculoskeletal injury rates: a function of symptom reporting?
This study determined gender differences in voluntary reporting of lower extremity musculoskeletal injuries among U.S. Marine Corps (USMC) recruits, and it examined the association between these differences and the higher injury rates typically found among women trainees. Subjects were 176 male and 241 female enlisted USMC recruits who were followed prospectively through 11 wk (men) and 12 wk (women) of boot camp training. Reported injuries were measured by medical record reviews. Unreported injuries were determined by a questionnaire and a medical examination administered at the completion of training. Among female recruits the most commonly reported injuries were patellofemoral syndrome (10.0% of subjects), ankle sprain (9.1%), and iliotibial band syndrome (5.8%); the most common unreported injuries were patellofemoral syndrome (2.1%), metatarsalgia (1.7%), and unspecified knee pain (1.7%). Among male recruits iliotibial band syndrome (4.0% of subjects), ankle sprain (2.8%), and Achilles tendinitis/bursitis (2.8%) were the most frequently reported injuries; shin splints (4.6%), iliotibial band syndrome (4.0%), and ankle sprain (2.8%) were the most common unreported diagnoses. Female recruits were more likely to have a reported injury than male recruits (44.0% vs 25.6%, relative risk (RR) = 1.72, 95% confidence interval (CI) 1.29-2.30), but they were less likely to have an unreported injury (11.6% vs 23.9%, RR = 0.49, 95% CI 0.31-0.75). When both reported and unreported injuries were measured, total injury rates were high for both sexes (53.5% women, 45.5% men, RR = 1.18, 95% CI 0.96-1.44), but the difference between the rates was not statistically significant.
Our results indicate that the higher injury rates often found in female military trainees may be explained by gender differences in symptom reporting.
closed_qa
Minimally invasive saphenous vein harvesting: is there an improvement of the results with the endoscopic approach?
In the postoperative course after conventional open removal of the greater saphenous vein, wound healing disturbances are common and often painful. Therefore the primary goal of this investigation was to prove the safety and practicability of this new less invasive technique for saphenous vein harvesting and its effect on complications and morbidity. The study comprised 103 coronary artery bypass grafting (CABG) patients in whom an endoscopic approach was used to harvest the saphenous vein (minimally invasive vein harvesting, MIVH). We used the VasoView II system developed by Origin, and compared the intraoperative procedure time and the clinical results with those of 105 equivalent patients in whom a conventional open technique was used. In 101 patients endoscopic vein harvesting was successful; conversion to the open technique was necessary in two patients. On average, 2.6 vein segments could be harvested in the endo group versus 2.9 segments in the open group. The mean procedure time was 13.2 min per segment in the endo group compared to 12.2 min per segment in the open group. Relevant hematomas were found in 29 patients (27.6%) of the open group, whereas only nine patients (8.7%) of the endo group revealed severe hematomas. Infection was apparent in nine patients (8.5%) after conventional vein harvesting. Two infections were found after endoscopic intervention.
Endoscopic saphenous vein harvesting, as part of a less invasive concept in cardiac surgery, is a safe and, after the learning curve, fast alternative for harvesting the saphenous graft. The cosmetic result is excellent and the complication rate seems to be lower. It must be noted, however, that the cost-effectiveness of the method has to be proved and that further histological and functional studies are needed in order to check the intimal structure of the vein.
closed_qa
Endogenous mediators in emergency department patients with presumed sepsis: are levels associated with progression to severe sepsis and death?
We sought to determine whether levels of the endogenous mediators tumor necrosis factor (TNF)-alpha, interleukin (IL) 6, and nitric oxide (NO) measured in patients with presumed sepsis (systemic inflammatory response syndrome [SIRS] and infection) are different from levels in patients with presumed noninfectious SIRS, whether levels are associated with septic complications, and whether there are potential relationships between mediators. A prospective, observational, tricenter study of a convenience sample of adults presenting to the emergency department who met Bone's criteria for SIRS (any combination of fever or hypothermia, tachycardia, tachypnea, or WBC count aberration) was performed. Mediator levels were determined and associated with deterioration to severe sepsis (hypotension, hypoperfusion, or organ dysfunction) and death in subjects admitted to the hospital with presumed sepsis. One hundred eighty subjects with SIRS were enrolled and classified into 3 groups: group 1 (SIRS, presumed infection, admitted; n=108), group 2 (SIRS, presumed infection, discharged; n=27), and group 3 (SIRS, presumed noninfectious, admitted; n=45). Group 1 TNF-alpha and IL-6 levels were significantly higher than those found in the other groups. NO levels for groups 1 and 2 were significantly lower than those for group 3. TNF-alpha and IL-6 levels were higher in the group 1 subjects who had bacteremia or progressed to severe sepsis or death. NO levels were not associated with these outcomes.
ED patients admitted with presumed sepsis have elevated cytokine levels compared with patients with sepsis who are discharged and with those patients with presumed noninfectious SIRS. An association appears to exist between cytokines and subsequent septic complications in these patients. The importance of these measures as clinical predictors for the presence of infection and subsequent septic complications needs to be evaluated.
closed_qa
Angel trumpet: a poisonous garden plant as a new addictive drug?
Angel's trumpet (genus Brugmansia) is widely used as a garden plant because it is easily kept and flowers luxuriantly. Belonging to the family Solanaceae, it contains a large amount of alkaloids (parasympatholytics). Because of its hallucinogenic action, its leaves and flowers are increasingly used by young people as a substitute for the hallucinogen LSD (lysergic acid diethylamide). In the summer of 1997, one of a group of youths died after they had ingested its flowers, which they had gathered from front gardens. An investigation was undertaken to identify the alkaloids and measure their concentration in the various parts of the plant. Four young plants and one eight-year-old plant were kept outdoors from May until October, and flowers and leaves were removed for analysis weekly. All samples were deep-frozen at -20 degrees C and later, at the same time, thawed out, weighed and extracted in methanol. The alkaloids were identified by high-pressure liquid chromatography (HPLC) with a diode array detector, separated by means of a Hypersil HyPurity cartridge, and measured at a wavelength of 220 nm. All 66 flowers, 32 leaves and 2 seed capsules contained tropane alkaloids, mainly scopolamine. The highest concentrations were found in the seed capsules, lower ones in the flowers, while the leaves contained only small amounts. Total alkaloid content per flower averaged 0.94 mg in the younger plants and 1.81 mg in the older one. The flowers of the old plant contained up to 3 mg of scopolamine.
The ingestion of even a few flowers of Angel's trumpet can cause symptoms of poisoning. Easy availability of the plant thus presents a danger. Because of the increasing incidence of deliberate ingestion by young people, poisoning by Angel's trumpet should be included in the differential diagnosis in patients with confusion and hallucinations of uncertain origin, especially during the summer months.
closed_qa
Malacoplakia: a possible complication of poorly controlled diabetes mellitus?
A 47-year-old woman with poorly controlled diabetes mellitus (HbA1c 9.2%, fasting blood glucose >200 mg/dl) complained of moderately severe stabbing pain in the left abdomen. On admission there were no abnormal findings on abdominal palpation. Abdominal ultrasound and computed tomography (CT) revealed a partly solid, partly cystic, well-circumscribed space-occupying lesion, about 15 cm in diameter, in the left abdomen, extending from the lower third of the kidney into the pelvis. Biopsy of the lesion showed chronic granulating inflammation with foamy histiocytes (Hansemann macrophages) as the characteristic substrate of extensive malakoplakia. Despite the size of the lesion it was not excised; instead, long-term treatment with ciprofloxacin was undertaken. At the same time, the diabetes was carefully controlled with regular insulin. Ten months later there was no longer any evidence of the lesion by ultrasound and CT.
Even extensive malakoplakia can be successfully treated with ciprofloxacin. Poorly controlled diabetes together with a weak immune status (CD4/CD8 ≤ 1) may have favoured the occurrence of malakoplakia.
closed_qa
Iodine deficiency in ambulatory participants at a Sydney teaching hospital: is Australia truly iodine replete?
To assess iodine status in four separate groups--pregnant women, postpartum women, patients with diabetes mellitus and volunteers. Prospective cross-sectional study at a tertiary referral hospital in Sydney. 81 pregnant women attending a "high risk" obstetric clinic; 26 of these same women who attended three months postpartum; 135 consecutive patients with diabetes mellitus attending the diabetes clinic for an annual complications screen; and 19 volunteers. There were no exclusion criteria. Spot urine samples were obtained, and urinary iodine was measured by inductively coupled plasma mass spectrometry. Iodine status, based on urinary iodine concentration, was categorised as normal (>100 micrograms/L), mild deficiency (51-100 micrograms/L) or moderate to severe deficiency (<50 micrograms/L). Moderate to severe iodine deficiency was found in 16 pregnant women (19.8%), five postpartum women (19.2%), 46 patients with diabetes (34.1%) and five volunteers (26.3%). Mild iodine deficiency was found in an additional 24 pregnant women (29.6%), nine postpartum women (34.6%), 51 patients with diabetes (37.8%) and 9 normal volunteers (47.4%). Median urinary iodine concentration was 104 micrograms/L in pregnant women, 79 micrograms/L in postpartum women, 65 micrograms/L in patients with diabetes mellitus and 64 micrograms/L in volunteers.
The high frequency of iodine deficiency found in our participants suggests that dietary sources of iodine in this country may no longer be sufficient. Further population studies are required.
closed_qa
Co-infection with malaria and HIV in injecting drug users in Brazil: a new challenge to public health?
To describe AIDS and malaria geography in Brazil, highlighting the role of injecting drug users (IDUs) in malaria outbreaks occurring in malaria-free regions, and the potential clinical and public health implications of malaria/HIV co-infection. Review of the available literature and original analyses using geoprocessing and spatial analysis techniques. Both HIV/AIDS and malaria distribution are currently undergoing profound changes in Brazil, with mutual expansion to intersecting geographical regions and social networks. Very recent reports describe the first clinical case of AIDS in a remote Amazonian ethnic group, as well as malaria cases in Rio de Janeiro state (hitherto a malaria-free area for 20 years); in addition, two outbreaks of both infections occurred at the beginning of the 1990s in the most industrialized Brazilian state (São Paulo), due to the sharing of needles and syringes by drug users. Spatial data point to: (a) the expansion of HIV/AIDS towards malarigenic areas located in the centre-west and north of Brazil, along the main cocaine trafficking routes, with IDU networks apparently playing a core role; and (b) the possibility of new outbreaks of secondary malaria in urban settings where HIV/AIDS is still expanding, through the sharing of needles and syringes.
New outbreaks of cases of HIV and malaria are likely to occur among Brazilian IDUs, and might conceivably contribute to the development of treatment-resistant strains of malaria in this population. Health professionals should be alert to this possibility, which could also eventually occur in IDU networks in developed countries.
closed_qa
Does the subspecialty of the surgeon performing primary colonic resection influence the outcome of patients with hepatic metastases referred for resection?
To compare resection rates and outcome of patients subsequently referred with hepatic metastases whose initial colon cancers were resected by surgeons with different specialty interests. Variation in practice among noncolorectal specialist surgeons has led to recommendations that colorectal cancers should be treated by surgeons trained in colorectal surgery or surgical oncology. The resectability of metastases, the frequency and pattern of recurrence after resection, and the length of survival were compared in patients referred to a single center for resection of colorectal hepatic metastases. The patients were divided into those whose colorectal resection had been performed by general surgeons (GS) with other subspecialty interests (n = 108) or by colorectal specialists (CS; n = 122). No differences were observed with respect to age, sex, tumor stage, site of primary tumor, or frequency of synchronous metastases. Comparing the GS group with the CS group, resectable disease was identified in 26% versus 66%, with tumor recurrence after a median follow-up of 19 months in 75% versus 44%, respectively. Recurrences involving bowel or lymph nodes accounted for 55% versus 24% of all recurrences, with respective median survivals of 14 months versus 26 months.
Fewer patients referred by general surgeons had resectable liver disease. After surgery, recurrent tumor was more likely to develop in the GS group; their overall outcome was worse than that of the CS group. This observation is partly explained by a lower local recurrence rate in the CS group.
closed_qa
Linked production of antibodies to mammalian DNA and to human polyomavirus large T antigen: footprints of a common molecular and cellular process?
To test whether the presence of antibodies to human polyomavirus large T antigen, a viral DNA-binding protein essential for productive polyomavirus replication, correlates with the presence of antibodies to single-stranded DNA (ssDNA), double-stranded DNA (dsDNA), or the autologous TATA-binding protein (TBP). Sera from patients with various diagnosed or suspected autoimmune syndromes were analyzed for the presence of antibodies to T antigen, DNA, or TATA-binding protein, and correlations were determined. Rheumatoid factor (RF) was studied as a control antibody. A highly significant correlation between antibodies to T antigen and antibodies to ssDNA or TATA-binding protein, but not between anti-T antigen antibodies and RF, was found in all patient groups. Of all sera that were positive for antibodies to dsDNA, 62% were positive for antibodies to T antigen (P<0.03).
A non-self DNA-binding protein such as human polyomavirus large T antigen may render DNA immunogenic upon binding to nucleosomes when expressed in vivo. This is indicated by the strong correlation between antibodies to T antigen and antibodies to DNA or TBP and is consistent with a hapten-carrier model. This model implies cognate antigen-selective interaction of T antigen-specific T helper cells and DNA-specific B cells or B cells specific for other components of nucleosomes, consistent with the results of previous experiments.
closed_qa
Radicular pain caused by synovial cyst: an underdiagnosed entity in the elderly?
Synovial cyst is a recognized but infrequent cause of nerve root or spinal canal compression. The authors undertook a review of 839 decompressive spinal procedures performed over a 5-year period. They found seven cases in which the symptoms were caused by synovial cysts. Six of these cases were in a subgroup of 80 patients who were older than 60 years of age, which represents 7.5% of the total for this age group. More than 200 cases of this abnormality have been reported in the world literature, but the incidence, prevalence, and natural history remain unknown.
The authors propose that the incidence of synovial cysts may be more common than recognized in the elderly and suggest that preoperative diagnosis may help limit the extent of the surgical approach.
closed_qa
Risk factors for the acquisition of genital warts: are condoms protective?
To characterise risk factors for the acquisition of genital warts and specifically to determine whether condoms confer protection from infection. A retrospective case-control study comparing demographic, behavioural, and sexual factors in men and women with and without newly diagnosed genital warts, who attended Sydney Sexual Health Centre (SSHC), an inner city public sexual health centre, in 1996. Data were extracted from the SSHC database. Crude odds ratios (ORs) were calculated to compare cases and controls, and significant factors were then controlled for using multivariate logistic regression to obtain adjusted ORs. 977 patients with warts and 977 controls matched by sex and date of attendance were included. In both sexes, univariate analysis revealed that younger age, more lifetime sexual partners, failure to use condoms, and greater cigarette smoking and alcohol consumption were associated with warts, and there was a negative association with previous infection with Chlamydia trachomatis, Neisseria gonorrhoeae, hepatitis B, and genital herpes. In males, on multivariate analysis, factors which remained significant were younger age, more lifetime sexual partners, failure to use condoms, greater cigarette smoking, and previous chlamydia. In women, factors which remained significant were younger age, more lifetime sexual partners, condom use, marital status, and previous infections with Chlamydia trachomatis and herpes.
Independent risk factors for genital warts include younger age, greater number of lifetime sexual partners, and smoking. Consistent condom use significantly reduces the risk of acquiring genital warts.
closed_qa
Cervical cytology: are national guidelines adequate for women attending genitourinary medicine clinics?
To study whether all women attending a genitourinary medicine (GUM) clinic warrant a cervical smear as part of a routine screen for infection, or whether this "at risk" population is adequately covered by the national screening programme. A cervical smear and a screen for sexually transmitted infections (STI) were taken from 900 women attending a GUM clinic between May 1996 and April 1997. Of 812 smears available for analysis, 613 (75.5%) were normal, 176 (21.7%) were mildly abnormal, and 23 (2.8%) were moderately or severely abnormal. In the absence of an STI there was a 14% (37/273) risk of having an abnormal cervical smear. In the presence of cervicitis the risk was 26% (22/84) and with genital warts the risk was 34% (75/215).
The national screening programme guidelines for cervical cytology should be followed in the GUM clinic. There is no benefit in performing extra smears outside the programme nor in adopting a policy of universal screening.
closed_qa
Can children with autistic spectrum disorders perceive affect in music?
Children with autistic spectrum disorders typically show impairments in processing affective information within social and interpersonal domains. It has yet to be established whether such difficulties persist in the area of music, a domain that is characteristically rich in emotional content. Fourteen children with autism and Asperger syndrome and their age- and intelligence-matched controls were tested for their ability to identify the affective connotations of melodies in the major or minor musical mode. They were required to match musical fragments with schematic representations of happy and sad faces. The groups did not differ in their ability to ascribe the musical examples to the two affective categories.
In contrast to their performance within social and interpersonal domains, children with autistic disorders showed no deficits in processing affect in musical stimuli.
closed_qa
Is there a role for pneumonectomy in pulmonary metastases?
Although sublobar and lobar resections are accepted operations for pulmonary metastases, pneumonectomy is viewed as a major incursion on Stage IV patients. We considered it important to ascertain the current results of pneumonectomy for pulmonary metastases since little information is available. Of the 5,206 patients with pulmonary metastasectomy reported by the International Registry of Lung Metastases, 133 (3%) underwent primary and 38 (1%) underwent completion pneumonectomy between 1962 and 1994. Data were analyzed to determine the operative mortality rates, survival rates, and determinants of survival. Primary pneumonectomy was performed for metastatic disease mainly from epithelial (49%, 65 of 133) and sarcomatous (33%, 43 of 133) tumors. Indications were central lesions, eg, proximal endobronchial or hilar nodal metastases. Operative mortality was 4% (4 of 112) and a 5-year survival rate of 20% was achieved following complete resection (R0) in 112 patients. In contrast, the 21 incompletely resected patients had an operative mortality rate of 19% (4 of 21), and the majority did not survive beyond 2 years (p = 0.02). Survival was determined by the completeness of resection and not by histology of the primary tumor, number of metastases, nodal status, or disease-free interval. Of the 38 completion pneumonectomy patients, 35 were operated on for recurrent disease and 3 for residual disease. Sarcomatous secondaries predominated in 28 patients. Complete resection was achieved in 31 patients (82%). The operative mortality rate was 3% (1 of 38 patients) and the 5-year survival rate was 30%.
Pneumonectomies for pulmonary metastases, albeit infrequently performed, were associated with acceptable operative mortality and long-term survival when performed in selected patients amenable to complete resection.
closed_qa
Cardiac surgery in octogenarians: can elderly patients benefit?
Increasing numbers of the very old are presenting for cardiac surgical procedures. There is little information about quality of life after hospital discharge in this group. From March 1995 to February 1997, 127 patients older than 80 years at operation (mean age, 83+/-2.5 years; range, 80 to 92 years) were entered into the cardiac surgery database and analyzed retrospectively. The RAND SF-36 Health Survey and the Seattle Angina Questionnaire were used to assess quality of life by telephone interview (mean follow-up, 15.7+/-6.9 months). No patient was lost to follow-up. Operations included coronary artery bypass grafting (65.4%), coronary artery bypass grafting plus valve replacement (15.8%), and isolated valve replacement (14.2%). Preoperatively, 63.8% were in New York Heart Association class IV. Thirty-day mortality was 7.9%, and actuarial survival was 83% (70% confidence interval, 79% to 87%) at 1 year and 80% (70% confidence interval, 75% to 85%) at 2 years. Preoperative renal failure significantly increased the risk of early death (relative risk, 3.96) as did urgent or emergent operation (relative risk, 6.70). In addition, cerebrovascular disease (relative risk, 3.54) and prolonged ventilation (relative risk, 3.82) were risk factors for late death. Ninety-five patients (92.2%) were in New York Heart Association class I or II at follow-up. Seattle Angina Questionnaire scores for anginal frequency (92.3+/-18.9), stability (94.4+/-16.5), and exertional capacity (86.8+/-25.1) indicated good relief of symptoms. SF-36 scores were equal to or better than those for the general population of age greater than 65 years. Of the survivors, 83.7% were living in their own home, 74.8% rated their health as good or excellent, and 82.5% would undergo operation again in retrospect.
Octogenarians can undergo cardiac surgical procedures at a reasonable risk and show remarkable improvement in their symptoms. Elderly patients benefit from improved functional status and quality of life.
closed_qa
Are older patients with mechanical heart valves at increased risk?
Controversy exists regarding the use of mechanical valves in older patients. Many authorities believe that the use of anticoagulants in the elderly is associated with an increased risk of warfarin-related complications. Therefore, we compared the results with mechanical valves in older patients to a cohort of younger patients. Aortic (AVR) or mitral valve replacement (MVR) with a mechanical valve was performed in 1,245 consecutive patients who were followed prospectively. They were grouped by age (group 1, ≤65 years; group 2, >65 years). The study groups consisted of AVR (group 1, 459 patients; group 2, 323 patients) and MVR (group 1, 313 patients; group 2, 150 patients). The average age for the groups was: AVR (group 1, 51 years; group 2, 70 years; p = 0.03) and MVR (group 1, 53 years; group 2, 70 years; p = 0.03). For AVR the incidence of thromboembolism was 0.050 (group 1) and 0.038 (group 2) (p = 0.37), and the actuarial freedom from thromboembolism was 83.0%+/-3.0% and 86.5%+/-1.0%, respectively (p = 0.13). The incidence of bleeding after AVR was 0.021 for group 1 and 0.028 for group 2 (p = 0.49). For MVR the incidence of thromboembolism was 0.059 for group 1 and 0.051 for group 2 (p = 0.75), and the actuarial freedom from thromboembolism was 78.8%+/-3.0% and 75.4%+/-8.7%, respectively (p = 0.71). The incidence of bleeding after MVR was 0.020 for group 1 and 0.027 for group 2 (p = 0.62).
Mechanical valves perform well in selected older patients with no increased risk of bleeding or thromboembolism.
closed_qa