The objective was to assess whether idiopathic normal-pressure hydrocephalus (iNPH) has a worse prognosis than other forms of hydrocephalus, as has been suggested. A total of 125 patients with chronic hydrocephalus, 75 of whom suffered from iNPH and the remainder (non-iNPH) from secondary NPH (sNPH) or non-communicating hydrocephalus, were shunted using gravitational valves. Clinical state was assessed with our clinical grading (KI) and a co-morbidity index (CMI). Average follow-up was 5.1 ± 1.6 years. Spearman, Kruskal-Wallis, ANOVA, chi-square and Wilcoxon U tests at a significance level of p<0.05 were used. Shunt responder rates for iNPH and non-iNPH were 72% and 86%, respectively. With a shorter history (≤1 year) or preoperative KI<6 points, iNPH patients had a similar or even better outcome than non-iNPH patients with a longer history or a worse KI. Most impressive was the influence of co-morbidity: 86% of iNPH patients with a low CMI (≤3 points) experienced clinical improvement after shunting, contrasted with a responder rate of 64% for non-iNPH patients with a worse CMI.
The diagnosis of iNPH does not by itself mean a worse prognosis, and iNPH patients with favorable preconditions may have a similar or better prognosis than patients with any other kind of hydrocephalus. The worse overall clinical results in iNPH stem from late recognition and, in most instances, worse preconditions.
Does idiopathic normal pressure hydrocephalus always mean a poor prognosis?
This study tested whether coordinated care management, a continuity-of-care intervention for substance-use disorders, improved employment among men and women on public assistance compared with usual welfare management. Participants were 421 welfare applicants identified via substance-use-disorder screening and assigned via a computerized allocation program to coordinated care management (CCM; n = 232) or referral and monitoring practices in usual care (UC; n = 189). Substance use, treatment attendance, job training and search activities, and employment outcomes were assessed for 1 year after baseline. Men were more likely to be working than women overall. Among women, CCM clients increased their employment over time, whereas UC clients remained stable at very low employment levels. There were no treatment effects on employment for men. Also among women only, greater substance-use-disorder treatment attendance and abstinence in the first 6 months of CCM predicted higher rates of later employment. Job training activities were low and did not differ by condition in either gender.
Findings are consistent with previous research supporting the effectiveness of case management for improving abstinence, which leads to employment gains, among substance-using women on public assistance. In contrast, various mandated elements of welfare-to-work programs for substance users (treatment attendance, case management, job training) did not improve employment rates for men. Implications of study results for designing effective welfare-to-work interventions in a post-welfare-reform era are discussed.
Does coordinated care management improve employment for substance-using welfare recipients?
The effects of thoracolumbar spinal cord stimulation (SCS) are confined to restricted microcirculatory areas. This limitation is generally attributed to a predominantly segmental mode of action on the autonomic nervous system. The goal of this study was to determine whether SCS applied close to supraspinal autonomic centers would induce generalized hemodynamic changes that could explain its alleged antianginal properties. Invasive hemodynamic tests were performed in 15 anesthetized Göttingen minipigs subjected to iterative cervical SCS of various durations and intensities. Hemodynamic changes exceeding 10% were observed in 59 of 68 SCS sessions (87%). Their extent and time to peak varied with SCS intensity. At 2, 5, and 10 V, significant (t-test, p<0.05) peak changes occurred in cardiac output (+34%, +29%, and +28%, respectively), stroke volume (+19%, +16%, +15%), mean pressure (+9%, +27%, +40%), heart rate (+14%, +23%, +14%), and systemic (-17%, NS, NS) and pulmonary vascular (25%, NS, NS) resistances. Strikingly, at 2 V, the increase in cardiac output (+34%) was higher than the synchronous rise in rate-pressure product (+22%), indicating efficient cardiac work. At 10 V, however, cardiac work was inefficient (rate-pressure product +53% vs cardiac output +28%).
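For readers unfamiliar with the rate-pressure product used in this argument: it is the standard bedside index of myocardial oxygen demand, the product of heart rate and systolic blood pressure. A minimal sketch (illustrative only; the abstract reports percentage changes, not raw values, and the function name is ours):

    def rate_pressure_product(heart_rate_bpm, systolic_bp_mmhg):
        # Standard index of myocardial oxygen demand: HR x systolic BP
        return heart_rate_bpm * systolic_bp_mmhg

    # The efficiency argument above: at 2 V, cardiac output rose more (+34%)
    # than the rate-pressure product (+22%), i.e., more output per unit of
    # oxygen demand; at 10 V the relation reversed (+53% RPP vs +28% output).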
Low-voltage cervical neuromodulation reduces afterload and improves cardiac work efficiency. The resulting reduction in myocardial oxygen demand may account for decreased anginal pain.
Spinal cord stimulation treatment for angina pectoris: more than a placebo?
A total of 733 medical students from the Zagreb and 102 from the Mostar University Schools of Medicine filled out an anonymous questionnaire during their enrollment into the next academic year. The questionnaire consisted of 10 Likert-type questions with a 1-5 answer scale, designed to illustrate students' attitudes towards war. The test score was calculated as the sum of all answers × 2 + 20. The total score ranged from a minimum of 40 to a maximum of 120 points, with a higher score indicating a stronger inclination toward peaceful resolution of conflicts. There was no difference between the mean total scores of Zagreb and Mostar students (66±17 and 67±18, respectively; p=0.744). The mean score of female students was higher than that of male students (71±19 vs 63±16; p<0.001) for the whole sample as well as for the Zagreb and Mostar samples separately (p<0.001 for both). The average score of 2.3±0.9 per question indicated that the students' choice was mostly undecided on war-prone activities. Younger students were more war-prone than older ones (p=0.008 for age, and p=0.024 and 0.013 for comparisons between students in earlier and later academic years). Students from cities that were affected by war but not severely damaged seemed less war-prone than students from cities that were either seriously damaged or not directly affected by war (p=0.032).
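The scoring rule above is simple enough to verify directly; a minimal sketch (assuming 10 items each scored 1-5, as stated; the function name is ours):

    def peace_test_score(answers):
        # answers: the 10 Likert responses, each an integer from 1 to 5
        assert len(answers) == 10 and all(1 <= a <= 5 for a in answers)
        return sum(answers) * 2 + 20

    # Extremes match the stated range: all 1s -> 10*2 + 20 = 40 points;
    # all 5s -> 50*2 + 20 = 120 points.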
Women, older students, and students from cities that were under war threat but not seriously damaged were shown to be more morally engaged towards peace.
Peace test: is war sometimes a better solution?
Large national trials may influence surgical practice. In this study the relation between the successful national randomized trial on the management of rectal cancer (the Dutch TME trial) and the national ratio of abdomino-perineal resection to low anterior resection and anastomosis was analysed. In the study period, 1994-99, 15,978 patients underwent either abdomino-perineal resection (n = 2,575) or low anterior resection and anastomosis (n = 13,403). The Dutch TME trial started in 1996; a total of 1,530 patients were included by 83 hospitals, and 82.1% of these patients were treated from 1997 to 1999. Teaching sessions, tutor-assisted surgery and quality control formed an integral and important part of the TME trial. The ratio of abdomino-perineal resection to low anterior resection was compared between period I (1994-96) and period II (1997-99). The ratio decreased from 0.19 to 0.13 between periods I and II (95% CI, -0.08 to -0.04, P<0.001). In-hospital mortality did not change between periods I and II (3.5% vs. 3.7%, 95% CI, -0.08 to 0.03, P=0.385).
Significant changes in surgical practice may accompany successful national randomized trials in which the investigated surgical procedures are specified, taught, and controlled. The abdomino-perineal resection (APR) ratio declined by 32% in the Netherlands during and following the Dutch TME trial, without a rise in the hospital mortality rate for rectal resections.
Nationwide decline in annual numbers of abdomino-perineal resections: effect of a successful national trial?
To determine whether circulating levels of cell adhesion molecules, markers of endothelial damage and leucocyte activation, were increased in pre-eclampsia. Serum was prepared from peripheral venous blood and stored at -70°C. The cell adhesion molecules VCAM-1, E-Selectin and ICAM-1 were measured by ELISA. The study was conducted at the Department of Obstetrics and Gynaecology, Royal Infirmary, Glasgow. Sixteen primigravid women with pre-eclampsia were recruited for the study and compared with 18 healthy primigravid women with uncomplicated pregnancies. The pre-eclamptic group had significantly higher serum levels of the cell adhesion molecule VCAM-1 (t = 3.673; P<0.001). There were no significant differences in the adhesion molecules ICAM-1 or E-Selectin.
Endothelial damage and dysfunction are common to all the pathological features of pre-eclampsia. This study shows that concentrations of cell adhesion molecules, which indicate leucocyte-endothelial attachment and activation, are elevated in the serum of patients with pre-eclampsia. Such increases in soluble circulating cell adhesion molecules may reflect increased expression of these molecules on the endothelium and thereby explain the mechanism for leucocyte activation in pre-eclampsia.
The cell adhesion molecule, VCAM-1, is selectively elevated in serum in pre-eclampsia: does this indicate the mechanism of leucocyte activation?
The recommended interval for colorectal cancer screening with flexible sigmoidoscopy (FS) was recently lengthened from 3 to 5 yr. Direct evidence supporting the longer interval is lacking, and the appropriateness of the longer interval has been questioned. Our aim was to compare the incidence of neoplasia detected on FS in individuals who had undergone an FS either 3 yr or 5 yr after a normal examination. Subjects were drawn from 5,359 individuals who underwent two FS examinations performed for colorectal cancer screening. Examinations were performed by gastroenterologists at a single academic medical center between 1987 and 2002. A total of 2,146 subjects with a normal baseline examination and a follow-up examination 3 or 5 yr later were included. The main outcome was the incidence of neoplasia, including advanced neoplasia, detected 3 yr versus 5 yr after a normal FS. A total of 915 subjects underwent FS at 3 yr and 1,231 subjects at 5 yr after a normal examination. Neoplasia was detected in 3.2% of the 3-yr and 4.3% of the 5-yr subjects (p=0.17). No significant differences were detected in the pathology, multiplicity, or size of neoplasms between the 3- and 5-yr groups. Advanced neoplasms occurred in 0.9% (including one adenocarcinoma) of subjects at 3 yr and 1.1% of subjects at 5 yr (p=0.67).
Few individuals will develop rectosigmoid neoplasms 3 or 5 yr after a normal FS. The majority of neoplasms detected are low-risk lesions. A screening interval of 5 yr after a normal FS does not portend an increased risk of advanced neoplasms including cancer. This direct evidence supports the current recommendations of a 5-yr interval for colorectal cancer screening with FS.
Screening for colorectal cancer with flexible sigmoidoscopy: is a 5-yr interval appropriate?
The aims were to establish a gestational-age-specific curve for serum total thyroxine (T4) levels and to compare pregnancy outcomes of euthyroid women with those identified to have subclinical hypothyroidism (SCH), defined by an elevated thyroid-stimulating hormone (TSH) level in conjunction with either total T4 or free T4 determinations. Over a 2.5-year period, serum thyroid analytes were measured in all women presenting for prenatal care. After exclusion of women with overt thyroid disorders, the normal distribution of serum total T4 levels was determined by quantile curves for those screened in the first 20 weeks and who were delivered of a singleton infant weighing at least 500 g. Pregnancy outcomes for women with an elevated TSH and normal total T4 concentrations were analyzed and compared with those of women identified to have SCH defined by normal free T4 levels. Among the 17,298 women tested, serum total T4 increased into the second trimester and plateaued around 16 weeks. The upper threshold for total T4 ranged from 12.6 to 16.4 μg/dL, and the lower threshold ranged from 5.3 to 8.0 μg/dL. Women identified to have SCH defined by serum free T4, total T4, or both were at risk for preterm delivery (P = .007) and placental abruption (P = .013) when compared with euthyroid women.
When combined with elevated TSH levels, free or total T4 determinations are equally sensitive to identify women with SCH who are at increased risk for preterm birth and placental abruption when compared with euthyroid women.
Is total thyroxine better than free thyroxine during pregnancy?
This prospective, randomized, controlled study included 124 short children (33 girls) who received GH treatment (Genotropin®; Pfizer Inc.) from a mean age of 11 years until near adult height [intent-to-treat (ITT) population]. Children were randomized into three groups: controls (n = 33), GH 33 μg/kg/day (n = 34) or GH 67 μg/kg/day (n = 57). Prepubertal children at study start constituted the per-protocol (PP) population (n = 101). Auxological measurements were made and puberty was staged every 3 months. Serum sex-steroid concentrations were assessed every 6 months. In both the PP and ITT populations, no significant differences were found between the groups in the time from start of treatment until onset of puberty, age at onset of puberty, or age at final pubertal maturation in either sex. In the ITT population, pubertal duration was significantly longer in GH-treated girls, and maximum mean testicular volume was significantly greater in GH-treated boys than in controls, but there were no differences in testosterone levels between the groups.
GH treatment did not influence age at onset of puberty and did not accelerate pubertal development. In boys, GH treatment appeared to increase testicular volume.
Does growth hormone treatment influence pubertal development in short children?
Many randomized controlled trials (RCTs) collect cost-effectiveness data. Without appropriate sample size calculations, patient recruitment may cease before the cost-effectiveness of the intervention can be established or continue after the cost-effectiveness of the intervention is established beyond doubt. We determined the frequency with which cost-effectiveness is considered in sample size calculations and whether RCT-based economic evaluations are likely to come to inconclusive results at odds with the clinical findings. We searched the National Health Service Economic Evaluation Database (NHS EED) to identify RCT-based cost-utility analyses. RCTs that collected individual patient data on costs and quality-adjusted life years (QALYs) were eligible. Studies using models to extrapolate the results of RCTs or with insufficient information on incremental costs and QALYs were excluded. In total, 38 trials met the eligibility criteria. Only one considered cost-effectiveness in sample size calculations. RCTs were less likely to reach definitive conclusions based on the cost-effectiveness results than on the primary clinical outcome (15.8% vs. 42.1%; McNemar's test, p = 0.01). In trials that provided sufficient data, exploratory analysis indicated that the median power to detect important differences was 29.5% for QALYs, 94.1% for costs, and 78.7% for the primary clinical outcome. In three trials (7.9%), a definitely more effective intervention was found to be expensive and probably not cost-effective. Our results reflect trials where the authors considered within-trial estimates of cost-effectiveness to be meaningful. In focusing on one primary clinical outcome from each RCT, we have simplified the clinical effectiveness results, although the primary outcome will usually be the one that policy makers use in judging the 'success' of the intervention.
Economic evaluations conducted alongside RCTs are valuable, but often present inconclusive evidence. Trial results may lead to discordant messages when the most effective intervention is probably not the most cost-effective. Despite methodological advances, trialists rarely assessed the extent to which their trial might resolve the key uncertainties about the cost-effectiveness of interventions. We recommend that grant funders should do more to encourage trialists to include economic end points in sample size calculations, particularly when the majority of costs and benefits of the intervention occur within the time frame of the trial.
Cost-utility analysis conducted alongside randomized controlled trials: are economic end points considered in sample size calculations and does it matter?
True regeneration of the dental pulp-dentin complex in immature teeth with necrotic pulps has not been shown histologically, and it is not known to what extent such true tissue regeneration is necessary to achieve clinically acceptable outcomes. This case report describes the treatment of a patient with an immature maxillary right central incisor with a history of impact trauma and enamel-dentin crown fracture. A diagnosis of pulp necrosis with acute apical abscess was established. A regenerative endodontic protocol was used, with a paste containing Augmentin placed for 5 weeks as an intracanal medicament. Follow-ups at 9, 12, 17, and 31 months revealed complete osseous healing of the periapical lesion and formation of the root apex, but no increase in root length. Clinically, the tooth was functional, asymptomatic, and nonresponsive to pulp vitality tests. The crown discolored over time. On reentering the root canal, no tissues were observed under magnification inside the root canal space. The root canal treatment was completed with mineral trioxide aggregate obturation.
Augmentin might be an acceptable choice for root canal disinfection in regenerative endodontic procedures. The protocol for regenerative endodontic treatment is not predictable for pulp-dentin regeneration. Formation of the root apex is possible without pulp regeneration.
Is pulp regeneration necessary for root maturation?
There are multiple nationally representative databases that support epidemiologic and outcomes research, and it is unknown whether an otolaryngology-specific resource would prove indispensable or superfluous. Therefore, our objective was to determine the feasibility of analyses in the National Ambulatory Medical Care Survey (NAMCS) and National Hospital Ambulatory Medical Care Survey (NHAMCS) databases as compared with the otolaryngology-specific Creating Healthcare Excellence through Education and Research (CHEER) database. Parallel analyses in 2 data sets. Ambulatory visits in the United States. To test a fixed hypothesis that could be directly compared between data sets, we focused on a condition with expected prevalence high enough to substantiate availability in both. This query also encompassed a broad span of diagnoses to sample the breadth of available information. Specifically, we compared an assessment of suspected risk factors for sensorineural hearing loss in subjects 0 to 21 years of age, according to a predetermined protocol. We also assessed the feasibility of 6 additional diagnostic queries among all age groups. In the NAMCS/NHAMCS data set, the number of measured observations was not sufficient to support reliable numeric conclusions (percentage standard error among risk factors: 38.6-92.1). Analysis of the CHEER database demonstrated that age, sex, meningitis, and cytomegalovirus were statistically significant factors associated with pediatric sensorineural hearing loss (P<.01). Among the 6 additional diagnostic queries assessed, NAMCS/NHAMCS usage was also infeasible; the CHEER database contained 1585 to 212,521 more observations per annum.
An otolaryngology-specific database has added utility when compared with already available national ambulatory databases.
Does an Otolaryngology-Specific Database Have Added Value?
To determine whether the hill of vision for Short-Wavelength Automated Perimetry (SWAP) is shallower for women who consume phyto-oestrogen-rich foods than for women who do not. Visual field data were compared for two groups of healthy amenorrhoeic women, 48-69 years old, with normal vision and not using hormone replacement: (1) 24 subjects who reported consuming soy and/or flax products and (2) 20 subjects who reported not consuming these products. Two types of 24-2 visual fields were measured: (1) Full Threshold SWAP and (2) a white-on-white (W/W) field obtained using a Swedish Interactive Threshold Algorithm (SITA Standard). The reduction of SWAP sensitivity from the centre of the field (4 loci, mean eccentricity = 4.2°) to the periphery (20 loci, mean eccentricity = 21.9°) was less for soy/flax consumers than for nonconsumers, both with age-referencing (mean difference = 1.7 dB, p = 0.018) and without (p = 0.012). Corresponding distinctions existed for the SWAP - W/W difference, and there was minimal effect for W/W fields alone. The peripheral age-referenced SWAP sensitivities averaged 2.5 dB higher for consumers than nonconsumers (p = 0.022).
The between-group distinctions are consistent with the possibility (derived from the women's health literature) that phyto-oestrogens may counteract a decline of short-wavelength-sensitive cone-mediated response among postmenopausal women. These results suggest another potential application for SWAP outside its original intended purpose as a glaucoma test. Future studies should assess whether phyto-oestrogen consumption is most beneficial for women who are sufficiently young and/or not too far beyond menopause.
Variability in short-wavelength automated perimetry among peri- or postmenopausal women: a dependence on phyto-oestrogen consumption?
Two hundred and twenty-six outpatients (pts) scheduled for a first-time non-emergency EGD were randomly assigned to 4 groups: Co-group (62 pts): throat anaesthesia only; Mi-group (52 pts): conscious sedation (CS) with i.v. midazolam; Re-group (58 pts): presence of a relative throughout the procedure; Vi-group (54 pts): additional information provided with a videotape. Anxiety was measured using the Spielberger State and Trait Anxiety Scales. The patients assessed the overall discomfort during the procedure on a 100-mm visual analogue scale, and their tolerance of EGD by answering a questionnaire. The endoscopist evaluated the technical difficulty of the examination and the tolerance of the patients on a 100-mm visual analogue scale and by answering a questionnaire. Pre-endoscopy anxiety levels were higher in the Mi-group than in the other groups (P<0.001). On the basis of the patients' evaluation, EGD was well tolerated by 80.7% of patients in the Mi-group, 43.5% in the Co-group, 58.6% in the Re-group, and 50% in the Vi-group (P<0.01). The discomfort caused by EGD, evaluated by either the endoscopist or the patients, was lower in the Mi-group than in the other groups. Discomfort was correlated with age (P<0.001) and group (P<0.05) in the patients' evaluation, and with gender (females tolerated the procedure better than males, P<0.001) and group (P<0.05) in the endoscopist's evaluation.
Conscious sedation can improve tolerance of EGD. Male gender and young age are predictive factors of poor tolerance of the procedure.
Upper gastrointestinal endoscopy: are preparatory interventions or conscious sedation effective?
There are few reports comparing the variety and frequency of postoperative complications between patients with a major clinical leak requiring emergency abdominal reoperation and those with a minor leak diagnosed from clinical signs and managed expectantly without reoperation. This study examined the association between severity of leakage and 18 other postoperative complications, postoperative mortality, and length of postoperative hospital stay. Data were drawn from a comprehensive, prospective hospital registry of 1,507 colorectal cancer resections involving an anastomosis from January 1995 to December 2006. Differences were evaluated by two-tailed Fisher's exact test, Student's t-test, or Mann-Whitney U test. Leaks occurred in 54 patients (3.6%; 95% CI, 2.7% to 4.7%), comprising 21 major (1.4%; 95% CI, 0.9% to 2.1%) and 33 minor leaks (2.2%; 95% CI, 1.5% to 3.2%). Patients with a leak were significantly (p<0.01) more likely than those without to have 11 of 18 other surgical and medical complications considered, although with few differences in complication rates between those with major and minor leaks. As compared with patients without leak, those with a leak (major or minor) had several of these complications rather than just one (p<0.001) and greatly prolonged hospital stay (p<0.001). Postoperative mortality was higher after major leaks than after minor leaks (4 of 21 and 0 of 33, respectively, p = 0.019).
A minor leak is not trivial. Apart from the fact that major clinical leakage necessitates urgent reoperation, there were few other differences between major and minor clinical leaks in the frequency of other complications.
Is a minor clinical anastomotic leak clinically significant after resection of colorectal cancer?
Ipsilateral central compartment node dissection has been proposed to reduce the morbidity of prophylactic bilateral central compartment node dissection in papillary thyroid carcinoma (PTC), but it carries the risk of contralateral metastases being overlooked in approximately 25% of patients. We aimed to verify whether frozen section examination (FSE) can identify patients who could benefit from bilateral central compartment node dissection. All consenting patients with clinically unifocal PTC, without any preoperative evidence of lymph node involvement, observed between September 2010 and September 2011, underwent total thyroidectomy plus bilateral central compartment node dissection. Ipsilateral central compartment nodes were sent for FSE. Forty-eight patients were included. The mean number of removed nodes was 13.2 ± 6.8. Final histology showed lymph node metastases in 21 patients: ipsilateral in 15, bilateral in 6. FSE accurately predicted lymph node status in 43 patients (27 node negative, 16 node positive). Five nodal metastases were not detected at FSE: three were micrometastases (≤2 mm). Sensitivity, specificity and overall accuracy of FSE in the definition of N status were 80.7%, 100%, and 90%, respectively.
FSE is accurate in predicting node metastases in clinically unifocal, node-negative PTC and can be useful in determining the extension of central compartment node dissection. False-negative results are reported mainly in cases of micrometastases, which usually have limited clinical implications.
Can intraoperative frozen section influence the extension of central neck dissection in cN0 papillary thyroid carcinoma?
We hypothesized that novices would be able to use the McGrath MAC (Aircraft Medical Ltd, Edinburgh, UK) as well as the GlideScope Ranger (Verathon, Inc, Bothell, WA) for intubation in regular simulated airways. We performed a prospective, randomized crossover study of 39 medical students using the McGrath MAC, GlideScope Ranger, and Macintosh in a manikin with 2 normal airways. The primary outcome was the intubation time. Secondary outcomes included the success rates and the overall glottic view of the 3 laryngoscopes. The mean intubation times for each attempt with the McGrath MAC were 30.8 ± 16.9 seconds or less and did not differ significantly from those obtained with the GlideScope Ranger or the Macintosh in both airway scenarios (P = .18; P = .49). The mean success rates at each attempt with the McGrath MAC were 82.0% ± 38.8% or more, equal to the Macintosh and the GlideScope Ranger in both scenarios (P = .026; P = .72) except during the first intubation attempt in a normal airway (P = .008). The median grade of the glottic view visible at each intubation attempt with the McGrath MAC was Cormack-Lehane grade 1 (scenario 1: interquartile range, 1-1; scenario 2: interquartile range, 1-2), which was significantly better than with the Macintosh laryngoscope in both scenarios. However, the McGrath MAC did not produce a better glottic view than the GlideScope Ranger in either scenario.
The intubation performance of novices using the McGrath MAC was equal to their performance using the GlideScope Ranger in regular simulated airways.
Can the new McGrath laryngoscope rival the GlideScope Ranger portable video laryngoscope?
To determine the effect of a voucher for free mammography on compliance with recommended mammography screening guidelines. Vouchers for free mammography, valid for one year, were distributed to a random sample of women over the age of 50 in two rural southern Minnesota counties. Baseline and follow-up data were collected, and rates of compliance with current mammography guidelines were observed for the voucher group and a control group of women living in the same counties. Logistic regression models were used to estimate the effect of the voucher on compliance with mammography guidelines and the impact of factors potentially influencing the effectiveness of the voucher. The voucher improved mammography rates primarily through increasing screening among women who were out of compliance at baseline.
Vouchers, even when distributed randomly within a population of rural Midwestern women, can significantly improve compliance rates. Vouchers are no less effective a means of increasing screening among vulnerable women than among other women.
Do vouchers improve breast cancer screening rates?
To estimate whether discordant growth is associated with adverse perinatal outcomes in twins after adjusting for growth restriction. This was a retrospective, hospital-based cohort study of twin gestations with 2 live births delivered at 24 weeks or later from 1992 to 2001. Twin gestations were classified as small for gestational age (SGA) if one or both infants was less than the 10th percentile at birth by singleton Brenner norms and discordant if there was a 20% or more weight discordance. Of 1318 twin pairs, 856 were appropriate for gestational age (AGA) and concordant, 70 pairs were AGA and discordant, 254 pairs were SGA and concordant, and 138 pairs were SGA and discordant. The 4 groups had similar maternal demographics and medical comorbidity. When adjusting for chorionicity, antenatal steroid use, oligohydramnios, preeclampsia, and gestational age at delivery, discordant twins were more likely to have a cesarean delivery (odds ratio 1.87; 95% confidence interval 1.22, 2.87) and to be associated with some adverse neonatal outcomes (low and very low birthweight, neonatal intensive care unit admission, neonatal oxygen requirement and hyperbilirubinemia) independent of SGA status. A statistically nonsignificant trend (odds ratio 2.4; 95% confidence interval 0.99, 6.01) toward higher rates of intraventricular hemorrhage was noted in discordant twins, and no difference was seen for ventilator requirement, respiratory distress syndrome, or necrotizing enterocolitis.
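The abstract does not spell out how the 20% discordance cutoff is computed; a minimal sketch using the conventional definition (weight difference expressed as a percentage of the larger twin's weight; the function name is ours):

    def twin_discordance_pct(weight_larger_g, weight_smaller_g):
        # Conventional birthweight discordance: (larger - smaller) / larger * 100
        return (weight_larger_g - weight_smaller_g) / weight_larger_g * 100.0

    # Example: twins of 2600 g and 2000 g give (600 / 2600) * 100 = 23.1%,
    # which would classify the pair as discordant under the 20% cutoff.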
Discordance places twins at increased risk for some adverse perinatal outcomes, whether they are AGA or SGA. Discordance was not an independent risk factor for serious neonatal morbidity or mortality; however, this study was underpowered to detect those differences.
Is discordant growth in twins an independent risk factor for adverse neonatal outcome?
The aim of this study was to analyze whether age alone is a risk factor in major pancreatic surgery. From September 1, 1985 to December 31, 1997, 806 patients underwent surgery for malignant and benign diseases of the pancreas in a prospective case-control study performed at the Department of Surgery, Johannes Gutenberg University Hospital Mainz. In 228 patients (men: n = 139; women: n = 89; mean age: 61 years; range: 23-83 years) we performed partial (n = 178) or total (n = 50) pancreaticoduodenectomy, which was combined with portal vein resection in 16 cases. Left pancreatic resection was carried out in 72 patients (men: n = 40; women: n = 32; mean age: 65 years; range: 28-86 years). Surgical complications after pancreaticoduodenectomy occurred in 22.1% of patients ≤70 years and in 30.2% of patients >70 years, although fewer than half of these were severe complications. General complications developed in 16.1% of patients ≤70 years and in 27.9% of patients >70 years (p<0.001). The mortality rates 30 and 90 days after surgery were 3.2% (≤70 years) and 2.3% (>70 years), and 6.0% (≤70 years) and 6.9% (>70 years), respectively. Regression analysis showed the following factors to exert an independent influence on mortality: pre-operative serum bilirubin, the diameter of the pancreatic duct, intra-operative blood loss and the occurrence of surgical and nonsurgical complications. Age did not exert an independent influence on the prognosis of either morbidity or mortality. However, general complications developed significantly more often in elderly patients. After left pancreatic resection, surgical complications developed in 29.3% (≤70 years) and 21.4% (>70 years) of patients, and the rate of severe complications was below 10%. General complications occurred in 10.3% (≤70 years) and 28.6% (>70 years) (p<0.001). Mortality rates 30 and 90 days after operation were 1.7% (≤70 years) and 14.2% (>70 years), and 3.4% (≤70 years) and 14.2% (>70 years), respectively (p = n.s.). Regression analysis showed intra-operative blood loss to exert an independent influence on post-operative morbidity and mortality. Age had no independent influence on either morbidity or mortality.
Results obtained by this study show that, although general complications develop significantly more often in elderly patients, age is not an independent risk factor for post-operative mortality after major pancreatic resection. Factors of importance in improving the outcome of this operation include the surgeon's experience in selecting patients eligible for the procedure, operative skill in performing major pancreatic resections, and better anticipation and management of post-operative complications.
Is age a risk factor for major pancreatic surgery?
Hypothyroidism often remains undetected because of the difficulty associating symptoms with disease. To determine the relation between symptoms and biochemical disease, we assessed symptoms and serum thyroid function tests, concurrently, for patients with and without hypothyroidism. This was a cross-sectional study of 76 newly diagnosed case patients with overt hypothyroidism and 147 matched control patients, identified through outpatient laboratories in Michigan and Colorado. Patient symptoms were assessed by questionnaire. Case patients reported a higher proportion of hypothyroid symptoms than did control patients (30.2% vs 16.5%, p<.0001). Univariate analysis identified three significant predictors of an elevated level of thyroid-stimulating hormone (TSH) (p<.05), and 13 symptoms which, when they had changed in the past year, were reported more often by case patients with hypothyroidism than by control patients (p<.005). Individuals reporting changes in 7 or more symptoms were significantly more likely to have hypothyroidism (likelihood ratio [LR] = 8.7, 95% confidence interval [CI] 3.8, 20.2); those reporting changes in 2 or fewer symptoms were less likely to have hypothyroidism (LR = 0.5, 95% CI 0.4, 0.7).
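To see what a likelihood ratio of this size means at the bedside, the standard Bayes updating step can be sketched as follows (the 5% pre-test probability in the example is purely illustrative and not from the study):

    def post_test_probability(pre_test_prob, likelihood_ratio):
        # Standard Bayes updating: probability -> odds, apply LR, back to probability
        pre_odds = pre_test_prob / (1 - pre_test_prob)
        post_odds = pre_odds * likelihood_ratio
        return post_odds / (1 + post_odds)

    # With the study's LR of 8.7, an assumed 5% pre-test probability of
    # hypothyroidism rises to about 31% when a patient reports changes in
    # 7 or more symptoms.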
In this sample, the number of hypothyroid symptoms reported was directly related to the level of TSH. The association was stronger when more symptoms were reported. Symptoms that had changed in the past year were more powerful than symptoms reported present at the time of testing. This suggests that traditional symptoms are valuable when deciding which patients to test for hypothyroidism.
Do traditional symptoms of hypothyroidism correlate with biochemical disease?
Postconditioning by brief episodes of ischaemia performed just at the time of reperfusion has been shown to reduce the size of infarcts in animal models, and in the clinical setting of percutaneous cardiac intervention. The clinical applicability of postconditioning in cardiac surgery remains to be determined. We investigated the effect of postconditioning on myocardial protection in children undergoing cardiac surgery. We randomly assigned 40 patients scheduled for surgical correction of congenitally malformed hearts under cold blood cardioplegic arrest to postconditioning or control treatment. Postconditioning was performed by two cycles of 30 seconds of ischaemia and 30 seconds of reperfusion using aortic reclamping and declamping, starting 30 seconds after the end of cardioplegic arrest. We assayed creatine kinase-MB, troponin I, transcardiac release of lactate and neutrophil counts. The types of procedure, age, bypass and aortic cross-clamping times were similar in both groups. The postoperative peaks of creatine kinase-MB and troponin I were lower after aortic de-clamping in the postconditioned patients compared with their controls (128 ± 48 units per liter as opposed to 199 ± 79 units per liter, p = 0.016, and 0.34 ± 0.21 nanograms per milliliter as opposed to 0.61 ± 0.53 nanograms per milliliter, p = 0.05), with reduced inotropic scores in those submitted to postconditioning compared with their controls (2.3 ± 1.5 versus 4.8 ± 3.1, p = 0.036). Transcardiac release of lactate was reduced in the postconditioned patients compared with their controls (0.10 ± 0.27 as opposed to 0.37 ± 0.43 millimoles per liter, p = 0.048). No differences between groups were found for transcardiac neutrophil count during reperfusion (10.8 ± 6.3% for postconditioning versus 14.0 ± 8.7% for controls, p = 0.48).
Our study demonstrates that postconditioning may protect the myocardium of children undergoing cold blood cardioplegic arrest. These data support the need for a larger clinical trial of postconditioning in children undergoing cardiac surgery.
Does cardioplegia leave room for postconditioning in paediatric cardiac surgery?
This study was done to assess testosterone deficiency in males with metabolic syndrome (MetS) and the effect of testosterone replacement on insulin resistance and biochemical parameters of this syndrome with hypogonadism. Sixty-three males fulfilling the International Diabetes Federation 2005 MetS guidelines as cases and 32 healthy males as controls, with mean ages of 35.29±8.16 and 34±6.76 years respectively, were enrolled in the study. Fasting blood samples were collected for gonadal profile and insulin assay. The homeostasis model assessment for insulin resistance (HOMA-IR) and free testosterone index were calculated. Hypogonadism was defined as a calculated free testosterone value <0.225 nmol/L. Total and calculated free testosterone and sex hormone binding globulin (SHBG) were lower in cases than controls (P<0.001). Hypogonadism was seen in 19 (30%) cases with MetS versus 1 (3.1%) control. MetS cases with hypogonadism had significantly higher HOMA-IR than eugonadal cases. Hypogonadotropic hypogonadism was observed in 16 (84%) cases. Treatment with oral testosterone 40 mg twice a day for 3 months led to significant improvement of HOMA-IR (P≤0.001) in the MetS males with hypogonadism.
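The abstract does not reproduce the formulas behind these derived measures; a minimal sketch using the standard definitions (HOMA-IR per Matthews et al.; the free testosterone index taken as total testosterone over SHBG; function names and units are ours):

    def homa_ir(fasting_insulin_uU_per_ml, fasting_glucose_mmol_per_l):
        # Standard homeostasis model assessment of insulin resistance
        return fasting_insulin_uU_per_ml * fasting_glucose_mmol_per_l / 22.5

    def free_testosterone_index(total_testosterone_nmol_per_l, shbg_nmol_per_l):
        # Commonly reported as a percentage: total T / SHBG x 100
        return total_testosterone_nmol_per_l / shbg_nmol_per_l * 100.0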
This study revealed that males with MetS, with or without diabetes, had lower serum testosterone than age-matched healthy subjects. Hypogonadism was also common in subjects with MetS. Testosterone therapy to correct hypoandrogenemia improved insulin sensitivity and other biochemical parameters, except high-density lipoprotein cholesterol, in MetS. This emphasizes that all men with MetS should be evaluated for hypogonadism, and all men with hypogonadism for MetS, with subsequent management as needed.
Is hypoandrogenemia a component of metabolic syndrome in males?
Bile duct injuries (BDI) have been reported to occur more frequently during laparoscopic cholecystectomy (LC) than during open cholecystectomy (OC). Several studies have demonstrated various potential predisposing factors for BDI; however, there is controversy as to whether gallbladder inflammation is a significant predictor of BDI. Therefore, our primary aim was to investigate the relationship between inflammation and BDI at LC, and secondarily to present the management and clinical outcome of BDI. We recorded all consecutive LC performed between 1993 and 2005 in our institution by nine staff surgeons. BDI were classified according to Strasberg's classification. Simple and multivariate logistic regression analysis was performed to evaluate the association between inflammation and BDI occurrence during LC. There were 2,184 patients. Among those, 344 had inflammation (16%). The conversion rate was 5% and was higher among male patients, older patients, and those with inflammation. The BDI incidence was 0.69% (0.14% for major and 0.55% for minor injuries) and it was significantly higher in those with inflammation compared to those without inflammation (p = 0.01). In particular, the risk for BDI was almost 3.5 times higher in those with inflammation (OR = 3.61, 95% CI 1.27-10.21). Inflammation remained an independent risk factor for BDI even after adjustment for potential confounders. Among patients sustaining injury, one died and two developed recurrent cholangitis. No association was observed between clinical outcome and management of BDI, time of diagnosis, sex, and inflammation.
Our results show that inflammation is an independent predictor of BDI occurrence during LC. Surgeons should therefore not hesitate to convert an LC to an OC in the presence of inflammation.
Is inflammation a significant predictor of bile duct injury during laparoscopic cholecystectomy?
Recent retrospective studies have suggested that patients with T1a,bN0M0 human epidermal growth factor receptor 2 (HER2)-positive breast cancer are at a higher risk for recurrence and might benefit from adjuvant trastuzumab. The absolute benefits associated with treating this subgroup are uncertain. We reviewed recent studies examining the prognostic value of HER2 in patients with node-negative T1a,b HER2-positive breast cancer. We calculated the number needed to treat (NNT) using baseline risk estimates for untreated T1a,bN0M0 breast cancer and the number needed to harm (NNH) using the incidence of cardiac events in each of the adjuvant trastuzumab clinical trials. Several studies were identified, each with limitations inherent to retrospective database analyses: small cohort sizes, lack of systematic HER2 testing in older specimens, variations in the use of adjuvant therapy and definitions of study end points, and lack of information relating to comorbidities. The 5-year disease-free survival in the pre-trastuzumab era ranged from 77% to 95%. Comparisons between small HER2-positive and small HER2-negative cancers showed numerically worse outcomes for the HER2-positive cohort in some but not all studies. In many instances, the NNH was larger (26-250) than the NNT (13-35); however, in a subset of patients, the NNH was lower (6) than the NNT (13-35).
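For readers unfamiliar with these quantities: both are reciprocals of absolute risk differences. A minimal sketch (the event rates in the example are illustrative, not the study's figures):

    def number_needed_to_treat(event_rate_untreated, event_rate_treated):
        # NNT = 1 / absolute risk reduction
        return 1.0 / (event_rate_untreated - event_rate_treated)

    def number_needed_to_harm(adverse_rate_treated, adverse_rate_untreated):
        # NNH = 1 / absolute risk increase
        return 1.0 / (adverse_rate_treated - adverse_rate_untreated)

    # Example: reducing recurrence from 10% to 6% gives NNT = 1 / 0.04 = 25;
    # a cardiac-event rate rising from 1% to 3% gives NNH = 1 / 0.02 = 50.
    # Treatment is attractive when the NNH comfortably exceeds the NNT.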
Better prediction tools to estimate more precisely the risk for death due to comorbid illness versus breast cancer are needed. In some patients, the risks of therapy could outweigh the benefits. Treatment selection for T1a,bN0 HER2-positive cancers remains in the transition area between evidence- and subjective judgment-based medicine.
Coping with uncertainty: T1a,bN0M0 HER2-positive breast cancer, do we have a treatment threshold?
Several studies have assumed a parabolic velocity profile through the umbilical vein (UV) to derive the mean spatial velocity that is indispensable for flow rate calculations. However, the structure and arrangement of the umbilical cord suggest that velocity profiles may vary. The aim of this study was to evaluate UV spatial flow velocity profiles at different sites along the umbilical cord. Ten singleton pregnancies with a gestational age between 26 and 34 weeks were included in the study. Ultrasound equipment with an inbuilt function for analysis of the spatial velocity profile along a line located in a fixed plane was used to obtain UV velocity profiles. Velocity profiles were obtained at the placental insertion and in a free intra-amniotic loop of the cord. Two-dimensional (2D) velocity distribution coefficients were evaluated as ratios between mean and maximum velocities along the investigated lines. 2D velocity distribution coefficients at the placental insertion (0.85 +/- 0.03) were significantly higher (P<0.00001) than those obtained from a free loop of cord (0.76 +/- 0.03). Values indicated that velocity profiles are approximately flat at the placental insertion and become more parabolic moving downstream. Moreover, profiles become skewed in association with cord curvature and show peculiar biphasic shapes immediately downstream from the placenta.
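The reported coefficients can be anchored against the two idealized extremes: along a sampled line across the lumen, a fully flat (plug) profile gives a mean-to-maximum ratio of 1.0, while a fully parabolic (Poiseuille) profile gives 2/3. A minimal numerical check (ours, not the authors' software):

    import numpy as np

    x = np.linspace(-1, 1, 2001)   # normalized positions across the lumen
    parabolic = 1 - x**2            # Poiseuille-type profile along a diameter
    flat = np.ones_like(x)          # plug-type profile

    print(parabolic.mean() / parabolic.max())  # ~0.667: fully parabolic
    print(flat.mean() / flat.max())            # 1.0: fully flat

    # The measured coefficients (0.85 at the placental insertion, 0.76 in a
    # free loop) fall between these extremes, closer to flat near the placenta.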
Flow velocity profiles in the UV are not perfectly parabolic and change along the cord. These characteristics may affect the evaluation of the UV blood flow rate.
Spatial velocity profile changes along the cord in normal human fetuses: can these affect Doppler measurements of venous umbilical blood flow?
Although recent reviews suggest few gender differences in smoking-cessation outcomes, it is important to establish whether gender differences exist in response to the brief interventions increasingly recommended as part of routine medical care. We used data from an efficacious primary care-based smoking intervention to examine gender differences in smoking characteristics, use of intervention components, self-reported quitting activities, and cessation outcomes among all smokers randomized to receive clinician advice and nurse-assisted intervention (n = 1,978, 58% female). Although female and male smokers differed on a number of sociodemographic and smoking-related characteristics, they were equally likely to participate in each step of the recommended intervention. Female and male smokers were also equally likely to report quit attempts and cessation at 3 months, at 12 months, and at both 3 and 12 months (a combined long-term cessation endpoint). Similarly, no gender difference in relapse at 12 months was seen. Women attempting to quit used a greater number and variety of smoking-cessation strategies, suggesting that, although outcomes were similar, the processes of cessation may vary by gender.
Since this brief intervention in primary care was equally efficacious and acceptable to female and male smokers, broader implementation in medical settings of this population-based approach to reducing tobacco use is warranted. Indeed, widespread implementation of smoking-cessation programs in medical settings may particularly benefit women, who are more likely than men to have contacts with the medical care system.
Does gender affect response to a brief clinic-based smoking intervention?
We hypothesized that segmental wall motion abnormalities (WMAs) are related to cardiac resynchronization therapy (CRT) response. We studied 108 patients who received CRT, 69 with ischemic and 39 with nonischemic heart disease. A wall motion score index (WMSI) was analyzed using a 17-segment model and calculated as the total score divided by the number of segments analyzed. A decrease of left ventricular end-systolic volume ≥15% after CRT was defined as a positive response to CRT. Of 108 patients, 1,054/1,836 segments (57%) had WMAs. The mean WMSI was 2.06 in patients with ischemic heart disease and 1.04 in patients with nonischemic heart disease (P<0.0001). The area under the receiver operating characteristic curve for the WMSI predicting a positive response to CRT was 0.70 (P = 0.0001). The cutoff point was a WMSI ≤2 for prediction of a positive response to CRT. After adjustment for age, gender, and clinical features, the WMSI remained related to CRT response (P = 0.01). During 15-month follow-up, the percentage of CRT nonresponders among patients with a WMSI >2 was significantly higher (82%) compared to patients with a WMSI ≤2 (47%, P = 0.005) and those with nonischemic heart disease (36%, P<0.001). In 59 patients with left ventricular mechanical dyssynchrony, the percentages of negative responders to CRT in patients with a WMSI >2, a WMSI ≤2, and nonischemic heart disease were 53% (8 of 15), 16% (3 of 19) and 0% (0 of 25), respectively (P<0.001).
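A minimal sketch of the index calculation described above (the per-segment scoring scale is assumed to follow the conventional echocardiographic scheme, 1 = normal with higher values for worse motion, which the abstract does not spell out):

    def wall_motion_score_index(segment_scores):
        # segment_scores: one score per segment of the 17-segment model;
        # segments that cannot be assessed are passed as None and skipped.
        scored = [s for s in segment_scores if s is not None]
        return sum(scored) / len(scored)

    # A heart with all 17 segments normal (score 1) gives WMSI = 1.0, in line
    # with the mean of 1.04 reported here for the nonischemic group.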
A large extent of WMAs and a WMSI>2 predicted a poorer CRT response.
Are the extent, location, and score of segmental wall motion abnormalities related to cardiac resynchronization therapy response?
Cardiac operation for severe aortic stenosis after previous mitral valve replacement is a surgical challenge in older patients with multiple morbidities. Transcatheter aortic valve implantation (TAVI) after previous mechanical mitral valve replacement has been considered a high-risk procedure, owing to possible interference with the mitral valve prosthesis. Since August 2008, 5 female high-risk patients with severe aortic stenosis and previous mitral valve replacement (mean ± SD age, 80 ± 5.1 years; logistic EuroSCORE, 39.3% ± 20.5%) underwent TAVI with a pericardial xenograft valve that was fixed with a stainless steel, balloon-expandable stent (Edwards Lifesciences SAPIEN). We used a transapical approach in 4 patients and a transfemoral approach in 1 patient. Transesophageal echocardiography and multidetector computed tomography were used for preoperative planning and assessment of operation feasibility. The mean distance between the aortic annulus and the mitral valve prosthesis was 10 ± 1 mm (range, 9-11 mm). TAVI was performed successfully in all 5 patients. There was no direct or functional interference with the mechanical mitral valve prostheses. Echocardiography revealed good valve function with no more than mild paravalvular incompetence early in the postoperative period and during routine follow-up. There were no neurologic events. After an initially uneventful course with good aortic valve function at the most recent echocardiography evaluation, however, 2 of the patients died from fulminant pneumonia on postoperative days 4 and 48.
TAVI is technically feasible in high-risk patients after previous mechanical mitral valve replacement; however, careful patient selection is mandatory with respect to preoperative clinical status and anatomic dimensions regarding the distance between aortic annulus and mitral valve prosthesis.
Transcatheter aortic valve implantation after previous mechanical mitral valve replacement: expanding indications?
Indicators of health-system outputs, such as Emergency Obstetric Care (EmOC) density, have been proposed for monitoring progress towards reducing maternal mortality, but are currently underused. We seek to promote them by demonstrating their use at subnational level, evaluating whether they differentiate between a high-maternal-mortality country (Zambia) and a low-maternal-mortality country (Sri Lanka), and assessing whether benchmarks are set at the right level. We compared national and subnational density of health facilities, EmOC facilities and health professionals against current benchmarks for Zambia and Sri Lanka. For Zambia, we also examined geographical accessibility by linking health facility data to population data. Both countries performed similarly in terms of EmOC facility density, implying that this indicator, as currently used, fails to discriminate between high- and low-maternal-mortality settings. In Zambia, the WHO benchmarks for doctors/midwives were met overall, but distribution between provinces was highly unequal. Sri Lanka overshot the suggested benchmarks by three times for midwives and over 30 times for doctors. Geographical access in Zambia, which is much less densely populated than Sri Lanka, was poor: less than half the population lived within 15 km of an EmOC facility.
Current health-system output indicators and benchmarks on EmOC need revision to enhance discriminatory power and should be adapted for different population densities. Subnational disaggregation and assessing geographical access can identify gaps in EmOC provision and should be routinely considered. Increased use of an improved set of output indicators is crucial for guiding international efforts towards reducing maternal mortality.
Tracking progress towards safe motherhood: meeting the benchmark yet missing the goal?
Despite its potential, the validity of the flow-mediated dilation (FMD) test has been questioned because of lack of normalization to the vasodilatory stimulus. The hemodynamic conditions inside blood vessels lead to the development of superficial stress near the vessel walls, which can be divided into 2 categories: (1) circumferential stress (CS) and (2) shear stress (SS). Although SS is thought to be the primary governing stimulus, to the best of our knowledge, the degree to which CS contributes to FMD has not been reported in the literature. The purpose of this study was to determine the importance of CS to FMD. We defined FMD as the SS-diameter dose-response slope. Fourteen physically active, young [mean (SD) age, 26 (5) years], male subjects were tested. Progressive forearm heating and handgrip exercise elicited steady-state increases in shear rate. Hierarchical linear modeling was used to estimate change in diameter with repeated measures of SS and CS nested within each subject. Circumferential stress was found to positively promote FMD in addition to SS (β = 0.019, P = 0.019). However, the variance explained by CS was less than 1%.
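Both stresses have simple closed forms under the usual idealizations, which the abstract does not restate; a minimal sketch (thin-walled Laplace hoop stress and Poiseuille wall shear stress; function names and unit choices are ours):

    def circumferential_stress(pressure_pa, radius_m, wall_thickness_m):
        # Thin-walled (Laplace) approximation: sigma = P * r / h
        return pressure_pa * radius_m / wall_thickness_m

    def wall_shear_stress(viscosity_pa_s, mean_velocity_m_s, diameter_m):
        # Poiseuille flow: tau = 8 * mu * V / d (equivalently 4 * mu * V / r)
        return 8.0 * viscosity_pa_s * mean_velocity_m_s / diameter_m

Note how the two stimuli scale oppositely with vessel size: hoop stress grows with radius, whereas, for a given mean velocity, wall shear stress falls as the vessel widens.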
The physiologic significance of CS to FMD was minimal. However, physically active, young men were recruited; it remains to be determined whether CS has a more pronounced effect in subjects exhibiting cardiovascular risk factors.
Does circumferential stress help to explain flow-mediated dilation?
Optimal management of infective endocarditis (IE) depends on the early detection of IE-causing pathogens and on appropriate antimicrobial and surgical therapy. The current guidelines of the European Society of Cardiology (ESC) recommend histopathological examination as the gold standard for diagnosing IE (Habib et al., Eur Heart J 2009;30:2369-2413). We hypothesize that histopathological findings do not provide additional information relevant to clinical decision-making. We retrospectively reviewed a cohort of patients who had undergone surgery for native valve endocarditis (NVE) at the University Hospital Regensburg between September 1994 and February 2005. All episodes of intraoperatively confirmed endocarditis during this period were included in the study. Data were retrieved from surgical records, microbiological and histopathological reports, and medical files of the treating as well as the admitting hospital. Pathogens were correlated with the site of manifestation on the affected heart valve and with clinical and histopathological findings. A total of 163 episodes of NVE were recorded and entered into our study for analysis. The valves affected were the aortic valve (45%), the mitral valve (28%), the aortic and mitral valves (22%), and other valves (5%). IE-causing pathogens were Staphylococcus aureus (22%), viridans streptococci (18%), enterococci (10%), streptococci other than viridans streptococci (9%), coagulase-negative staphylococci (5%), and miscellaneous pathogens (4%); 33% were culture-negative endocarditis. Infection with S. aureus was associated with high rates of sepsis, septic foci, and embolic events, while patients with enterococcal IE showed the highest rate of abscesses. The mortality rate in all subgroups was low, without significant differences. However, histopathological findings correlated poorly with the pathogen involved and showed only a few significant associations, all without clinical relevance.
The clinical presentation of IE depends on the pathogen involved. Among the episodes of NVE examined, the histopathological examination of resected heart valves did not show any pathogen-specific morphological patterns and therefore did not provide any additional information of clinical value. Based on our findings, we recommend complementary cultures of the resected materials (valve tissue, thrombotic material, pacer wire) and implementation of molecular diagnostic methods (e.g., broad-range PCR amplification techniques) instead of histopathological analyses of resected valve tissue.
Are histopathological findings of diagnostic value in native valve endocarditis?
Proteinuria and dilatation of the urinary tract are both relatively common in pregnancy, the latter with a spectrum of symptoms from none to severe pain and infection. Proteinuria is a rare occurrence in acute obstructive nephropathy; it has been reported in pregnancy, where it may pose a challenging differential diagnosis with pre-eclampsia. The aim of the present study is to report on the incidence of proteinuria (≥0.3; ≥0.5 g/day) in association with symptomatic-severe urinary tract dilatation in pregnancy. This was a case series from a nephrological-obstetric unit dedicated to pregnancy and kidney diseases (January 2000-April 2011), whose database has been prospectively updated since the start of the unit. Clinical charts identified as relevant on the database were retrospectively reviewed by a nephrologist and an obstetrician. From January 2000 to April 2011, 262 pregnancies were referred. Urinary tract dilatation with or without infection was the main cause of referral in 26 cases (predominantly monolateral in 19 cases): 23 singletons, 1 lost to follow-up, 1 twin and 1 triplet. Patients were referred for urinary tract infection (15 cases) and/or renal pain (10 cases); 6 patients were treated by urologic interventions ("JJ" stenting). Among them, 11 singletons and 1 triplet pregnancy developed proteinuria ≥0.3 g/day (46.1%). Proteinuria was ≥0.5 g/day in 6 singletons (23.1%). Proteinuria resolved after delivery in all cases. No patient developed hypertension; in none was an alternative cause of proteinuria evident. No significant demographic difference was observed between patients with renal dilatation who developed proteinuria and those who did not. An association with the presence of "JJ" stenting was present (5/6 cases with proteinuria ≥0.5 g/day), which may reflect both more severe obstruction and a role for vesico-ureteral reflux induced by the stent.
Symptomatic urinary tract dilatation may be associated with proteinuria in pregnancy. This association should be kept in mind in the differential diagnosis with other causes of proteinuria in pregnancy, including pre-eclampsia.
Excessive urinary tract dilatation and proteinuria in pregnancy: a common and overlooked association?
Sleeve gastrectomy is being performed with increasing frequency in Australia for the treatment of morbid obesity. The aims of this study were to determine whether sleeve gastrectomy can be performed safely, with a low rate of complications, and with effective short- to medium-term weight loss. A retrospective review was performed of prospectively collected data from a single-surgeon series between 2006 and 2009. A total of 185 patients were treated with laparoscopic sleeve gastrectomy (LSG) over a three-year period. The percentage excess weight loss (%EWL) was 47.2% at one year, 60.7% at two years and 66% at three years. There were no leaks; there were two staple-line bleeds requiring reoperation via laparoscopy, one port-site infection and one port-site incisional hernia. There were no deaths. The average operating time was 111 min and the average hospital length of stay was 2.35 days.
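For readers unfamiliar with the metric, percentage excess weight loss is conventionally defined as below; this is the standard convention rather than a formula stated in the abstract, and the ideal-weight reference used by the authors is an assumption (the weight corresponding to a BMI of 25 kg/m2 is typical).

\[
\%\text{EWL} = \frac{W_{\text{preoperative}} - W_{\text{current}}}{W_{\text{preoperative}} - W_{\text{ideal}}} \times 100
\]

On this definition, a %EWL of 66% at three years means patients had lost about two thirds of the excess separating their preoperative weight from their ideal weight.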
Laparoscopic sleeve gastrectomy can be performed safely and with excellent weight loss. Longer-term follow-up is required.
Outcomes of sleeve gastrectomy for morbid obesity: a safe and effective procedure?
This study aimed to determine the acute responses of breathing oxygen-enriched air during the recovery periods of a simulated 3 × 3-min cross-country skiing team sprint competition at simulated low altitude. Eight well-trained male endurance athletes performed two 3 × 3-min team sprint simulations on a double-poling ergometer at a simulated altitude of ∼1800 m. During the recovery periods between the 3 × 3-min sprints, all the athletes inhaled either hyperoxic (FiO2 = 1.00) or hypoxic (FiO2 ∼ 0.165) air in a randomized, single-blind order. The mean total power output (P(mean tot)) and the mean power output of each sprint (P(mean) 1,2,3) were determined. Ratings of perceived exertion (RPE), capillary oxygen saturation of hemoglobin, partial pressure of oxygen, and blood lactate concentration were measured before and after all the sprints. No differences in P(mean tot) were found between hyperoxic (198.4 ± 27.1 W) and hypoxic (200.2 ± 28.0 W) recovery (P = 0.57, effect size [d] = 0.07). P(mean) 1,2,3 (P>0.90, d = 0.04-0.09) and RPE (P>0.13, d = 0.02-0.63) did not differ between hyperoxic and hypoxic recovery. The partial pressure of oxygen (P<0.01, d = 0.06-5.45) and oxygen saturation (P<0.01, d = 0.15-5.40) during hyperoxic recovery were higher than those during hypoxic recovery. The blood lactate concentration was also lower directly after the third sprint (P = 0.03, d = 0.54) with hyperoxic recovery.
Results indicate that trained endurance athletes who inhale 100% oxygen during recovery periods in a cross-country skiing team sprint at low altitude do not exhibit enhanced performance despite the improvement in the key physiological variables of endurance performance.
Does hyperoxic recovery during cross-country skiing team sprints enhance performance?
Retinal dialysis is a frequent cause of retinal detachment in infants and young adults. The authors report long-term results obtained with conventional detachment surgery in a large consecutive series. Fifty-two eyes of 50 patients with retinal detachment due to dialysis underwent a segmental buckling procedure between January 1990 and December 1998. Patient characteristics and surgical results at 1 year of follow-up were evaluated. In 2007, 40 eyes from this group were reexamined for long-term results (follow-up: 9 to 17 years; median: 13.4 years). The mean age of the patients was 12.8 years (range = 6 to 28 years). The dialyses were located most often inferotemporally (72%) and superonasally (16%). The macula was detached in 82%. At 1 year of follow-up, the retina was completely reattached after one surgical procedure in 87% and after two procedures in 97%. Long-term follow-up of 40 of 52 eyes revealed no retinal redetachment, although additional surgeries had been performed. Visual acuity improved in 70% of the eyes, but only 40% reattained reading vision, owing to the high rate of macula-off retinal detachment preoperatively.
Scleral buckling for retinal detachment due to dialysis yields good results, even in the long term, and remains the treatment of choice for these usually young patients despite the increasing popularity of primary vitrectomy.
Is buckle surgery still the state of the art for retinal detachments due to retinal dialysis?
Cardiac surgery carries a high risk of neurological complications; therefore, these patients would be an appropriate target population for neuroprotective strategies. In this study, we evaluated postoperative diffusion-weighted imaging (DWI) as a potential surrogate marker for brain embolism and its relationship to neurobiochemical markers of brain injury. Of a total of 45 consecutive patients undergoing aortic valve replacement, 37 completed preoperative and postoperative MRI. At the time of the MRI studies, serum S100beta and neuron-specific enolase concentrations were determined. Preexisting T2 and postoperative DWI lesion volumes were quantified. All patients had a blinded neurological examination before and after operation. New perioperative DWI lesions were present in 14 patients (38%), of whom only 3 developed focal neurological deficits. Eighteen small lesions were found in the white matter or vascular border zones in all patients except 2 with territorial stroke. The appearance of new DWI lesions correlated with age, preexisting T2 lesion volume, and postoperative S100beta concentrations on days 2 to 4 after surgery. In a forward stepwise canonical discrimination model, only T2 lesion volume was selected as a relevant variable.
The incidence of postoperative DWI lesions in aortic valve replacement is high, and a suitable marker for neuroprotective trials would be a reduction in the number of such lesions. The volume of preexisting T2 lesions is related to the development of perioperative DWI lesions.
Diffusion-weighted magnetic resonance imaging and neurobiochemical markers after aortic valve replacement: implications for future neuroprotective trials?
Population ageing may threaten the sustainability of future health care systems. Strengthening primary health care, including long-term care, is one of several measures being taken to handle future health care needs and budgets. There is limited and inconsistent evidence on the effect of long-term care on hospital use. We explored the relationship between the total use of long-term care within public primary health care in Norway and the use of hospital beds when adjusting for various effect modifiers and confounders. This national population-based observational study consists of all Norwegians (59% women) older than 66 years (N = 605676) (13.2% of total population) in 2002-2006. The unit of analysis was defined by municipality, age and sex. The association between total number of recipients of long-term care per 1000 inhabitants (LTC-rate) and hospital days per 1000 inhabitants (HD-rate) was analysed in a linear regression model. Modifying and confounding effects of socioeconomic, demographic and geographic variables were included in the final model. We defined a difference in hospitalization rates of more than 1000 days per 1000 inhabitants as clinically important. Thirty-one percent of women and eighteen percent of men were long-term care users. Men had higher HD-rates than women. The crude association between LTC-rate and HD-rate was weakly negative. We identified two effect modifiers (age and sex) and two strong confounders (travel time to hospital and mortality). Age and sex stratification and adjustments for confounders revealed a positive statistically significant but not clinically important relationship between LTC-rates and hospitalization for women aged 67-79 years and all men. For women 80 years and over there was a weak but negative relationship which was neither statistically significant nor clinically important.
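A minimal sketch of the adjusted analysis described, assuming the linear form implied by the abstract; the exact covariate coding is not reported, so the symbols below are illustrative. For each municipality-age-sex unit i,

\[
\text{HD}_i = \beta_0 + \beta_1\,\text{LTC}_i + \boldsymbol{\gamma}^{\top}\mathbf{x}_i + \varepsilon_i
\]

where x_i collects the socioeconomic, demographic and geographic confounders (for example, travel time to hospital and mortality), and the effect modification by age and sex is handled by stratifying the model.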
We found a weak positive adjusted association between LTC-rates and HD-rates. Contrary to common belief, we found that an increased volume of LTC did not by itself reduce pressure on hospitals. There is still a need to study integrated care models for the elderly in the Norwegian setting and to explore further why municipalities far away from hospitals achieve lower use of hospital beds.
Does long-term care use within primary health care reduce hospital use among older people in Norway?
Working while ill has been found to predict coronary heart disease. We tested whether this association was due to triggering. We used a nested case-control study in an occupational cohort to examine sickness absences during a 2-year period immediately before the first coronary event for 133 cases and 928 matched controls without a history of coronary events. Working while ill was defined as no absence despite being unhealthy (suboptimal self-rated health or psychological distress). The odds of a coronary event were not higher for cases who worked while ill than for correspondingly unhealthy controls who took >0 to 14 days of absence per year (OR = 0.62; 95% CI = 0.28 to 1.38). These results were little affected by multiple adjustments.
We found no evidence that working while ill acts as a short-term trigger for coronary events.
Does working while ill trigger serious coronary events?
Raman spectroscopy allows immediate analysis of stone composition. In vivo stone analysis during endoscopic treatment may offer advantages concerning surgical strategy and metaphylaxis. Urinary stone components were evaluated utilizing an experimental setup of a Raman system coupled to commercial laser fibers. Samples of paracetamol (acetaminophen) and human urinary stones with known Raman spectra were analyzed using an experimental Raman system coupled to common commercial lithotripsy laser fibers (200 and 940 µm). Two different excitation lasers were used, at wavelengths of 532 and 785 nm. The numerical aperture of the fibers, the proportion of reflected light reaching the CCD chip, and integration times were calculated. Mathematical signal correction was performed. Both the laser beam profile and the quality of light reflected by the specimens were significantly impaired when commercial fibers were used. Acquired spectra could no longer be assigned to a specific stone composition. Subsequent measurements revealed a strong intrinsic fluorescence of the fibers and poor light acquisition properties, leading to a significant decrease in the Raman signal in comparison with a free-beam setup. This was true for both investigated fiber diameters and both wavelengths. Microscopic examination showed highly irregular fiber tip surfaces (in both new and used fibers).
Our results suggest that the laser excitation and light acquisition properties of commercial lithotripsy fibers significantly impair detectable Raman signals in a fiber-coupled setting. This study provides essential physical and technological information for the development of an advanced fiber-coupled system suitable for immediate stone analysis during endoscopic stone therapy.
Is in vivo analysis of urinary stone composition feasible?
Observational studies suggest that body mass index (BMI) is inversely associated with esophageal squamous cell carcinoma (ESCC). However, questions remain regarding reverse causation and confounding, especially by smoking, as alternative explanations. The authors examined the association between BMI and measures of weight history and the risk of ESCC in a population-based Australian case-control study (from 2002 to 2005) comprising 287 patients with ESCC (cases) and a control group of 1544 individuals who were sampled from a population registry. Stratified analyses were performed specifically to explore whether this association was influenced by smoking. Multivariable logistic regression models were used to derive odds ratios (ORs). After adjusting for smoking, significant inverse associations with ESCC for BMI and weight 1 year before diagnosis, maximum adult BMI, and weight gain since age 20 years were observed (all P(trend)<.001). The risk of ESCC was reduced by 35% (range, 23%-44%) per 5-unit increase in recent BMI. Participants who gained weight after age 20 years had a lower risk than those who maintained their weight during adult life (OR for gain of >20 kg, 0.51; 95% confidence interval [CI], 0.33-0.77). In stratified analyses, higher BMI was associated with a decreasing risk of ESCC both in never-smokers (OR, 0.32; 95% CI, 0.13-0.76) and in smokers (OR, 0.22; 95% CI, 0.07-0.67), comparing the highest versus the lowest BMI quintile.
In this study, the inverse associations between BMI, long-term weight gain, and other body measures and ESCC appeared to be robust and could not be explained by smoking status or potential confounding factors.
Body mass index, long-term weight change, and esophageal squamous cell carcinoma: is the inverse association modified by smoking status?
To evaluate the suitability of alpha-1-microglobulin as a marker for cadmium-induced renal dysfunction. alpha-1-Microglobulin was studied in a cross-sectional survey in relation to the body burden of cadmium. Concentrations of alpha-1-microglobulin in the 24 h urine of 831 people aged 2-87 years were analysed in association with urinary cadmium excretion, cadmium blood concentration, age, sex, occupational and smoking history, and estimated creatinine clearance. Participants came from a population residentially exposed to cadmium and from two control populations matched for socioeconomic status. The excretion of alpha-1-microglobulin/24 h ranged from 0.1 mg to 176.3 mg, and 44.4% of samples showed concentrations near the detection limit. Ordinal logistic regression analysis of people of all ages identified a high risk only for males compared with females (odds ratio (OR) 2.14; 95% confidence interval (95% CI) 1.56 to 2.94), age group, and duration of living on contaminated soil (OR 1.03/year; 95% CI 1.02 to 1.04), but not urinary cadmium excretion (OR 1.30; 95% CI 0.96 to 1.77), as significant predictors. For people ≤50 years of age, a weaker effect of sex (OR 1.76; 95% CI 1.13 to 2.73) and age group and an effect of similar magnitude for the duration of soil exposure (OR 1.03; 95% CI 1.01 to 1.04) were found. Also, urinary cadmium excretion (OR 2.26; 95% CI 1.38 to 3.70) and occupational exposure (OR 1.71; 95% CI 1.03 to 2.83) were found to be significant in this younger age group. The estimated creatinine clearance had no significant impact on alpha-1-microglobulin excretion.
alpha-1-Microglobulin is a suitable marker for early tubular changes only in people ≤50 years of age. It may not be sufficiently specific for cadmium, and therefore not a suitable surrogate for cadmium exposure in epidemiological studies.
alpha-1-Microglobulin: epidemiological indicator for tubular dysfunction induced by cadmium?
The evaluation of peritoneal transport characteristics is mandatory in peritoneal dialysis (PD) patients. In routine clinical practice, this is usually performed with a peritoneal equilibration test (PET) using conventional dialysates, with low pH and high glucose degradation product (GDP) concentrations. An increasing proportion of patients are now treated with biocompatible dialysates, i.e. with physiological pH and lower GDP concentrations. This calls into question the appropriateness of performing a PET with conventional solutions in those patients. The aim of our study is to compare the results of the PET using biocompatible and conventional dialysates, respectively. Nineteen stable PD patients (13 males, 6 females; mean age: 67.95±2.36 years, mean body surface area: 1.83±0.04 m2, dialysis vintage: 2.95±0.19 years) were included, among whom 10 were usually treated with biocompatible and 9 with conventional solutions. Two PETs were performed, within a 2-week interval, in each patient. The PET sequence (conventional solution first or biocompatible solution first) was randomized in order to avoid 'time bias'. Dialysate/plasma (D/P) concentration ratios and clearances of small (urea, creatinine and glucose), middle (beta-2-microglobulin) and large (albumin and alpha-2-macroglobulin) molecules were measured during each PET. Ultrafiltration (UF) and sodium filtration were also recorded. Results of both tests were compared by the Wilcoxon paired test. No statistical difference was found between the two dialysates for small-molecule transport rates or for sodium filtration and UF. However, a few patients were not classified identically for small-solute transport characteristics within the PET categories. Beta-2-microglobulin and albumin D/P ratios at different time points of the PET were significantly higher with the biocompatible than with the conventional solutions: 0.10±0.03 versus 0.08±0.02 (P<0.01) and 0.008±0.003 versus 0.007±0.003 (P=0.01), respectively. A similar difference was observed for beta-2-microglobulin clearance, which was higher with biocompatible dialysates (1.04±0.32 versus 0.93±0.32 mL/min, respectively).
Peritoneal transport of water and small solutes is independent of the type of dialysate used. This is not the case for the transport of beta-2-microglobulin and albumin, which is higher with biocompatible dialysates. Modification of vascular tonus could potentially explain such differences. The PET should therefore always be carried out with the same dialysate to make longitudinal comparisons possible.
Peritoneal equilibration test with conventional 'low pH/high glucose degradation product' or with biocompatible 'normal pH/low glucose degradation product' dialysates: does it matter?
Abnormal activations of neural networks implicated in auditory stimuli processing are hypothesized to generate auditory hallucinations (AH) in schizophrenia spectrum disorders. Because repetitive transcranial magnetic stimulation (rTMS) has the potential to modulate neural network activity, several studies have explored its use in treating medication-resistant AH, with mixed results in small-to-medium patient samples. Our aim is to apply a meta-analytic approach to exploring the efficacy of rTMS in treating medication-resistant AH. A search of the electronic databases for studies comparing low-frequency (1 Hz) rTMS over the left temporoparietal cortex to sham stimulation in patients suffering from medication-resistant AH was performed. Our search was completed by cross-referencing the articles, searching the Current Controlled Trials website, and direct contact with relevant researchers. From 265 possible abstracts, 6 parallel-arm, double-blind placebo-controlled and 4 crossover controlled trials, all randomized, matched the inclusion and exclusion criteria (n = 232). The primary outcome measure (effect of active treatment on AH at the end of treatment) was tested with a random effects model and reached a significant homogeneous effect size (ES) estimate (Hedges' g = 0.514; P = 0.001; 95% CI, 0.225 to 0.804; Q = 13.022; P = 0.162).
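For reference, Hedges' g is the bias-corrected standardized mean difference; the formulas below are the standard definitions, not computations reproduced from the primary trials.

\[
g = J\,\frac{\bar{x}_1 - \bar{x}_2}{s_p}, \qquad s_p = \sqrt{\frac{(n_1 - 1)s_1^2 + (n_2 - 1)s_2^2}{n_1 + n_2 - 2}}, \qquad J \approx 1 - \frac{3}{4(n_1 + n_2) - 9}
\]

A pooled g of 0.514 therefore corresponds to roughly half a pooled standard deviation of improvement in hallucination scores under active rTMS relative to sham.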
We found that low-frequency rTMS over the left temporoparietal cortex has a medium-sized effect on medication-resistant AH. This result has implications for understanding the pathophysiology of psychotic symptoms (specifically AH) and supports the use of rTMS as a complementary treatment approach in patients suffering from treatment-resistant AH.
Should we treat auditory hallucinations with repetitive transcranial magnetic stimulation?
The 'gateway' pattern of drug initiation describes a normative sequence, beginning with alcohol and tobacco use, followed by cannabis, then other illicit drugs. Previous work has suggested that 'violations' of this sequence may be predictors of later problems but other determinants were not considered. We have examined the role of pre-existing mental disorders and sociodemographics in explaining the predictive effects of violations using data from the US National Comorbidity Survey Replication (NCS-R). The NCS-R is a nationally representative face-to-face household survey of 9282 English-speaking respondents aged 18 years and older that used the World Health Organization (WHO) Composite International Diagnostic Interview (CIDI) to assess DSM-IV mental and substance disorders. Drug initiation was estimated using retrospective age-of-onset reports and 'violations' defined as inconsistent with the normative initiation order. Predictors of violations were examined using multivariable logistic regressions. Discrete-time survival analysis was used to see whether violations predicted progression to dependence. Gateway violations were largely unrelated to later dependence risk, with the exception of small increases in risk of alcohol and other illicit drug dependence for those who initiated use of other illicit drugs before cannabis. Early-onset internalizing disorders were predictors of gateway violations, and both internalizing and externalizing disorders increased the risks of dependence among users of all drugs.
Drug use initiation follows a strong normative pattern, deviations from which are not strongly predictive of later problems. By contrast, adolescents who have already developed mental health problems are at risk for deviations from the normative sequence of drug initiation and for the development of dependence.
Does the 'gateway' matter?
Farm-to-School programs (FTSPs) connect schools with locally grown food. This article examines whether FTSPs are more common in public elementary schools (ESs) in states with a formal FTSP law or with a related, locally grown procurement law. A pooled, cross-sectional analysis linked nationally representative samples of 1872 public ESs (across 47 states) for the 2006-2007, 2007-2008, and 2008-2009 school years with state laws effective as of the beginning of September of each year that were collected and analyzed for all states. Multivariate logistic regression models examined the impact of state law on school FTSP participation, controlling for year and school-level race/ethnicity, region, locale, free/reduced-price lunch participation, and school size. The percentage of schools located in a state with an FTSP-specific law increased from 7.3% to 20.4% over the 3-year period, while the percentage of schools located in a state with a locally grown procurement law was approximately 30% across all years. The percentage of schools with FTSPs more than tripled over the 3-year period (from 4.9% to 17.7%). After adjusting for all covariates except year, FTSPs were significantly more likely in states with an FTSP-specific law (OR = 2.45, 95% CI = 1.28-4.67); once adjusting for year, the results were marginally significant (OR = 1.72, 95% CI = .91-3.25). School-level FTSPs were not related to state locally grown procurement laws.
Although the percentage of schools with FTSPs is relatively small, these programs are becoming more common, particularly in states with FTSP-specific laws.
Are farm-to-school programs more common in states with farm-to-school-related laws?
The introduction of diagnosis-related groups (DRGs) in Germany carries the risk of reimbursement that does not cover costs in complex medical treatments. The aim of this study was to compare reimbursement under the DRG system with that under the system of hospital per diem charges in effect until now. The G-DRG (version 2004) reimbursement was calculated for 1,030 polytrauma patients (average ISS 26.4) treated at the BGU Murnau from 2000 to 2004, using a base value of 2900 euros, and compared with the reimbursement from hospital per diem charges. Fewer than half of all polytrauma patients were classified by the DRG system as polytrauma (18.7%) or as requiring artificial respiration (29.1%). The average G-DRG reimbursement was 27,157 euros, versus 36,387 euros under the per diem system (74.6%). Patients with minor trauma, increasing age, high GCS, an ICU stay without artificial respiration, or trauma of the upper extremity, and patients who survived, showed the greatest discrepancy.
A revision of the G-DRG definition of polytrauma is necessary to ensure adequate reimbursement for management of patients with multiple injuries. The severity of a trauma has to be considered in the DRG system.
Is polytrauma affordable these days?
Our aim was to examine the outcomes of patients with tertiary hyperparathyroidism (3-HPT) who had limited resection of 1 or 2 parathyroids. We reviewed 140 patients with 3-HPT who underwent parathyroidectomy (PTX) at a single institution. Patients were analyzed according to their operation: limited PTX versus subtotal or total PTX. The limited PTX group consisted of 29 patients who underwent resection of 1 (n = 12) or 2 (n = 17) parathyroids. The other 111 patients had subtotal (n = 104), total (n = 3), and/or reoperative PTX (n = 12). The mean +/- SEM follow-up was 79 +/- 5 months. Eucalcemia was achieved in 94% of the patients. All patients with persistent (n = 2) hypercalcemia had undergone subtotal PTX (P = not significant [NS] vs limited PTX). In a logistic regression model, the extent of operation was not associated with the development of recurrent disease. Additionally, the incidence of permanent hypocalcemia was 7% after subtotal or total PTX versus 0% after limited resection (P = NS).
Long-term outcomes in patients with 3-HPT appear to be similar after appropriate limited resection of 1 or 2 parathyroid glands compared to subtotal or total PTX. Therefore, a strategy of limited parathyroid resection seems appropriate for patients with 3-HPT when the disease is limited to 1 or 2 glands.
Tertiary hyperparathyroidism: is less than a subtotal resection ever appropriate?
As for any manual procedure, the learning curves for medical interventions can have undesirable phases, occurring mostly in the early experience of applying a technique. There have been impressive advances in endoscopic procedures in recent years, and there is an emerging trend for the number of procedures to increase in parallel with these. In addition, the introduction of screening programs for colorectal cancer will also increase the number of procedures needed. Recent developments in medical simulation seem promising with regard to the possibility of "training out" undesirable parts of the learning curve outside the operating room. The aim of this study was to investigate whether the use of the AccuTouch flexible endoscopy simulator improves the early part of the learning curve in colonoscopy training. Twelve endoscopy trainees (10 surgeons and 2 medical gastroenterologists), all with experience in gastroscopy but with no specific colonoscopy experience, were randomly assigned to either simulator training or a control group. They all received the same theoretical study package, and the training group practiced with the AccuTouch colonoscopy simulator until a predefined expert level of performance was reached. All trainees performed their first ten individual colonoscopies, described in detail in a separate protocol. Trainees in the simulator-trained group performed significantly better (P=0.0011), managed to reach the cecum in 52% of their cases (vs. 19% in the control group), and were 4.53 times more likely to succeed compared with the controls. Additionally, there was a significantly shorter procedure time and less patient discomfort in the hands of the simulator-trained group.
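As a rough plausibility check (assuming the reported 4.53 comes from the study's own model, which may account for repeated procedures per trainee), the crude odds ratio implied by the cecal intubation rates is

\[
\text{OR} = \frac{0.52 / (1 - 0.52)}{0.19 / (1 - 0.19)} \approx \frac{1.08}{0.23} \approx 4.6
\]

which is close to the published figure.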
Skills acquired using the AccuTouch simulator transfer well into the clinical colonoscopy environment. The results of this trial clearly support the plan to integrate simulator training into endoscopic education curricula.
Virtual reality colonoscopy simulation: a compulsory practice for the future colonoscopist?
Many people will consult a medical practitioner about lower bowel symptoms, and the demand for access to general practitioners (GPs) is growing. We do not know whether people recognise the symptoms of lower bowel cancer when advising others about the need to consult a doctor. A structured vignette survey was conducted in Western Australia. Participants were recruited from the waiting rooms at five general practices. Respondents were invited to complete self-administered questionnaires containing nine vignettes chosen at random from a pool of 64 based on six clinical variables. Twenty-seven vignettes described high-risk bowel cancer scenarios. Respondents were asked if they would recommend a medical consultation for the case described and whether they believed the scenario was a cancer presentation. Logistic regression was used to estimate the independent effects of each variable on the respondents' judgements. Two hundred and sixty-eight completed responses were collected over eight weeks. The majority of respondents (61%) were female and aged 40 years or older. A history of rectal bleeding, six weeks of symptoms, and weight loss independently increased the odds of recommending a consultation with a medical practitioner, by factors of 7.64, 4.11 and 1.86, respectively. Most cases that were identified as cancer (75.2%) would not be classified as such on current research evidence. Factors that predicted recognition of cancer presentations included rectal bleeding, weight loss and diarrhoea.
Within the limitations of this study, respondents recommended that most symptomatic people present to their GP. However, we found no evidence that they recognised a cancer presentation, and duration of symptoms was not a significant variable in this regard. Cases identified as 'cancer' could not be classified as high risk on the available evidence.
Advice to consult a general medical practitioner in Western Australia: could it be cancer?
This study aimed to determine whether the type and duration of therapy for human immunodeficiency virus (HIV) infection attenuate liver fibrosis in patients with HIV and hepatitis C virus (HCV) coinfection. Patients with HCV monoinfection (group 1) and HIV-HCV coinfection were retrospectively selected; the latter patients were classified into the following 3 groups: group 2, patients who received no therapy or only nucleoside reverse-transcriptase inhibitors (NRTIs); group 3, those who received highly active antiretroviral therapy (HAART); and group 4, those who initially received NRTIs followed by HAART. Fibrosis stage (scale, 0-6) and necroinflammatory score (scale, 0-18) were assessed according to the Ishak system. Data are presented as mean +/- standard deviation. Three hundred eighty-one patients (296 HCV-monoinfected patients and 85 HIV-HCV-coinfected patients) were recruited. The duration of HIV therapy before liver biopsy for groups 2, 3, and 4 was 3.8 +/- 2.8, 3.3 +/- 1.8, and 6.6 +/- 2.2 years, respectively. The time from HIV diagnosis to HAART initiation was shorter for group 3 than for group 4 (9.1 +/- 7.3 vs. 34.1 +/- 13.1 months; P<.0001). Groups 1 and 3 had similar fibrosis stages (3.1 +/- 2 vs. 3.4 +/- 2.4), rates of fibrosis progression (0.13 +/- 0.09 vs. 0.16 +/- 0.11 per year), and necroinflammatory scores (6.1 +/- 1.8 vs. 6.1 +/- 2.0). Groups 2 and 4 had significantly more advanced liver disease than group 1, as determined by fibrosis stage (4.6 +/- 1.8 and 4.3 +/- 2.0, respectively; P<.0009), rate of fibrosis progression (0.24 +/- 0.11 and 0.20 +/- 0.10 per year; P<.0001), and prevalence of cirrhosis (68% and 55%; P<.006).
HIV-HCV-coinfected subjects who receive HAART as their sole form of therapy have liver histology findings comparable to those of HCV-monoinfected patients. A similar degree of benefit is not observed for HIV-HCV-coinfected patients who receive no therapy, NRTIs only, or HAART after NRTIs, despite a longer duration of therapy in the last group.
Do type and duration of antiretroviral therapy attenuate liver fibrosis in HIV-hepatitis C virus-coinfected patients?
Reported urine cytology accuracy, particularly sensitivity, is highly variable. We evaluated the accuracy of urinary cytology for primary bladder cancer using population data linkage to provide valid estimates. Consecutive cytology tests processed through a major service between January 2000 and December 2004 were linked to a regional population cancer registry (allowing outcome ascertainment). Sensitivity and specificity were calculated using different thresholds, based on standardized reporting categories (C1 = negative, C2 = reactive, C3 = atypical, C4 = suspicious, C5 = malignant, Cx = inadequate). Cancer registry matching of 2,594 tests revealed 130 incident bladder cancers, of which 97 occurred within 12 months of cytology and were included in calculating accuracy. Sensitivity (with C3-C5 considered positive) ranged between 40.2 and 42.3%, and specificity was 93.7-94.1%. If C3 results were counted as negative, sensitivity estimates fell to 24.7-26.0%. The positive predictive value of a C3, C4 or C5 report was 11.7, 39.2, and 66.6%, respectively. High tumor grade was associated with significantly higher sensitivity compared with low and intermediate grades combined (p = 0.02).
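The threshold dependence of these estimates follows directly from the standard definitions (TP = true positives, FP = false positives, FN = false negatives, TN = true negatives; the abstract reports rates rather than the full 2 × 2 counts):

\[
\text{Sensitivity} = \frac{TP}{TP + FN}, \qquad \text{Specificity} = \frac{TN}{TN + FP}, \qquad \text{PPV} = \frac{TP}{TP + FP}
\]

Reclassifying C3 ('atypical') reports from positive to negative moves the C3 true positives out of the numerator of sensitivity, which is why sensitivity falls from roughly 41% to roughly 25% while specificity rises.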
Urine cytology is highly specific but has intermediate sensitivity, indicating that it has a role in adjunct diagnosis, but not in screening for primary bladder cancer. C3 results should be considered 'positive' and further investigated, and all positive results should prompt further intervention.
Is conventional urinary cytology still reliable for diagnosis of primary bladder carcinoma?
To assess whether the association between birth weight and blood pressure (BP) increases with age, using three different statistical methods. A representative sample of 1232 study participants born between 1974 and 1978 in Limache, Chile, was assessed in 2000-2002, of whom 796 were reassessed in 2010-2012. An 'amplification effect' was assessed by the change in the β coefficient between the two periods, the association between birth weight and the difference in BP over time, and the interaction between birth weight and BP in the two periods. Birth weight was negatively associated with SBP in 2000-2002 (β = -2.46, 95% confidence interval (CI) -3.77 to -1.16) and in 2010-2012 (β = -3.64, 95% CI -5.20 to -2.08), and with DBP in 2000-2002 (β = -1.26, 95% CI -2.23 to -0.29) and 2010-2012 (β = -1.64, 95% CI -2.84 to -0.45), after adjustment for sex, physical activity, and BMI. There was no association between birth weight and the difference in BP between the two periods, nor any interaction between birth weight, BP, and time interval.
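Of the three methods, the interaction test can be written as a single pooled model; the sketch below assumes the linear specification implied by the abstract (BW = birth weight, t = indicator for the 2010-2012 wave, x = the adjustment covariates), and the exact parameterization used by the authors may differ.

\[
\text{BP}_{it} = \beta_0 + \beta_1\,\text{BW}_i + \beta_2\,t + \beta_3\,(\text{BW}_i \times t) + \boldsymbol{\gamma}^{\top}\mathbf{x}_{it} + \varepsilon_{it}
\]

Amplification corresponds to β3 < 0, i.e. a more negative birth-weight slope at the later wave; the abstract reports no such interaction.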
Birth weight is a factor associated with BP in adults. This association increased with age, but amplification was shown only with one of the three methods.
Does the association between birth weight and blood pressure increase with age?
Patients with sickle cell disease (SCD) often present with abdominal pain, usually attributable to vasoocclusion. Experience at a single institution suggested that appendicitis was a rare cause of abdominal symptoms in this population. We sought to determine whether the incidence of appendicitis was significantly lower in patients with SCD than in the population at large. A 17-year retrospective chart review was performed at Rainbow Babies and Children's Hospital, Cleveland, OH, to determine the approximate incidence of acute appendicitis (AA) in patients with SCD. In addition, we performed a statistical analysis comparing the incidence of AA among SCD patients enrolled in the Cooperative Study of Sickle Cell Disease with that in the general population. Only two patients with SCD with pathologically confirmed AA were identified among approximately 200 patients followed at our institution during a 17-year period (approximately 3500 patient-years), yielding an incidence rate of 5.7 cases per 10 000 patient-years. Among 3765 patients with SCD enrolled in the Cooperative Study of Sickle Cell Disease followed for a mean of 5.3 years (19 886 patient-years), a maximum of 9 cases of AA were identified, yielding an incidence rate of 4.5 cases per 10 000 patient-years. Based on data from the National Hospital Discharge Survey of 1978 to 1981, the incidence rate of AA in the general population (0 to 44 years of age) is approximately 16 per 10 000 patient-years. Paired t test analysis demonstrated a highly significant difference (P<.001) when comparing the incidence of AA among patients enrolled in the Cooperative Study of Sickle Cell Disease and the population at large.
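The reported rates can be reproduced directly from the counts given:

\[
\frac{2}{3500} \times 10\,000 \approx 5.7, \qquad \frac{9}{19\,886} \times 10\,000 \approx 4.5, \qquad \frac{4.5}{16} \approx 0.28
\]

all in cases per 10 000 patient-years; the last ratio is what underlies the conclusion that the likelihood of appendicitis in SCD is less than one third of that in the general population.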
AA is an unusual event in patients with SCD. The likelihood of developing appendicitis in SCD patients is less than one third of that for the population at large. Conservative therapy is warranted in the large majority of patients with SCD who present with acute abdominal pain. Surgical exploration is best limited to patients with clear evidence of potential surgical pathology or progressive findings during a period of observation. The biologic basis of our findings remains unknown.
Is the incidence of appendicitis reduced in patients with sickle cell disease?
To evaluate the association of inferior mesenteric arterial (IMA) type II endoleaks in patients undergoing endovascular aortic aneurysm repair (EVAR) for infrarenal abdominal aortic aneurysm with several morphologic parameters. Approval of the institutional ethical review committee was not required. This was a retrospective review of 322 computed tomographic angiographic studies that were performed in patients before they underwent elective EVAR for infrarenal abdominal aortic aneurysm. Morphologic parameters evaluated were IMA patency, origin of the IMA in relation to the aneurysm sac, diameter of the IMA, the cross-sectional area of the contrast material-enhanced aortic lumen at the level of the IMA ostium, and the number of additional patent aortic side branches from the aneurysm sac. The association of IMA type II endoleaks with each variable was analyzed by using univariate and multivariate logistic regression models. The diameter of the IMA did not influence the development of IMA type II endoleaks (P = .51). The incidence of these endoleaks was significantly higher in patients with greater cross-sectional area of the aortic lumen at the IMA ostium (P<.001). Patients with an IMA type II endoleak had significantly more patent aortic side branches before EVAR than did patients without an endoleak (3.6 ± 1.7 vs 2.2 ± 1.4; P<.001). According to the final logistic regression model that included cross-sectional area of the aortic lumen at the IMA and the number of aortic side branches as independent predictors, risk for IMA type II endoleaks was determined with a sensitivity of 78% (39 of 50) and a specificity of 79% (92 of 116).
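The reported operating characteristics follow directly from the counts given in parentheses:

\[
\text{Sensitivity} = \frac{39}{50} = 78\%, \qquad \text{Specificity} = \frac{92}{116} \approx 79\%
\]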
Cross-sectional area of the contrast-enhanced aortic lumen at the level of the IMA ostium and the number of additional patent aortic side branches are associated with the development of IMA type II endoleaks.
Inferior mesenteric arterial type II endoleaks after endovascular repair of abdominal aortic aneurysm: are they predictable?
The study evaluates the clinical and pathological findings of 16 patients with locally advanced penile carcinoma (PC) submitted to emasculation, and discusses questions related to the usefulness of bilateral orchiectomy. Between 1999 and 2010, 172 patients with PC were treated; sixteen (9%) underwent emasculation. Data were retrieved from the institution's database, including age, ethnicity, date of surgery, residential setting, level of schooling, time to diagnosis, type of reconstruction, complications, tumor stage and grade, vascular and perineural invasion, and invasion of the corpus cavernosum, corpus spongiosum, testicles, scrotum and urethra. A total of 16 patients (average age: 63.1 years) with locally advanced PC were included. All were illiterate or semiliterate rural dwellers, and 87% were white. The time to diagnosis was 8-12 months. The mean follow-up time was 31.9 months (range, 1-119). By the time of the last follow-up, only seven patients (43.75%) were alive. Tumors were pT4 (n = 6), pT3 (n = 8), pT2 (n = 2), grade I (n = 5) and grade II (n = 11). The histopathological examination revealed invasion of the urethra (n = 13), scrotum (n = 5) and testicles (n = 1). The surgical margin was positive in one patient. Six patients (37.5%) had vascular invasion and 11 (68.7%) had perineural invasion. Currently, only one of the patients with vascular invasion is alive.
The finding of focal microscopic testicular infiltration in only one of 32 testicles, even in the presence of clinically apparent scrotal invasion, suggests that emasculation without bilateral orchiectomy is a safe treatment option for patients with locally advanced PC.
Locally advanced penile carcinoma: classic emasculation or testis-sparing surgery?
Deep venous thrombosis (DVT) and pulmonary embolus (PE) remain common surgical complications, often affecting patients without any prior warning. Postoperative spinal epidural hematomas (SEHs) may have a devastating impact on a patient's recovery from a routine procedure. The effect of preoperative DVT prophylaxis administration on elective spinal patients has not previously been studied. Retrospective cohort analysis. To correlate the incidence of preoperative DVT prophylaxis administration with the rates of postoperative DVT, PE, and SEH after elective spinal surgery. Earlier studies have reported a postoperative DVT rate in elective spinal patients of between 0.3% and 31%, a PE rate of 0.2% to 0.9%, and an SEH rate of approximately 0.1%. The notes of 3870 patients who underwent elective spinal procedures between 2004 and 2008 were reviewed. DVT, PE, and SEH rates were compared between patients receiving and not receiving preoperative DVT prophylaxis. Overall, 36.9% of patients received preoperative DVT prophylaxis, and 19 patients suffered a DVT and/or PE. Nine of these had received preoperative prophylaxis, giving an odds ratio of 0.91. Sixteen patients suffered an SEH, giving an odds ratio of 1.33. The SEHs presented at a median of 4 days postoperatively.
Preoperative DVT prophylaxis does not influence the rate of postoperative DVT or PE among elective spinal patients. It probably does not influence the SEH rate, and it is noted that SEHs may present quite late, in contrast to currently accepted time courses.
Does preoperative DVT chemoprophylaxis in spinal surgery affect the incidence of thromboembolic complications and spinal epidural hematomas?
To investigate baseline demographics and disease characteristics as predictors of the analgesic effect of duloxetine and pregabalin on diabetic peripheral neuropathic pain (DPNP). Based on data from the COMBO-DN study, a multinational clinical trial in DPNP, the potential impact of baseline characteristics on pain relief after 8 weeks of monotherapy with 60 mg/day duloxetine or 300 mg/day pregabalin was assessed using analyses of covariance. Subgroups of interest were characterized with regard to their baseline characteristics and efficacy outcomes. A total of 804 patients were evaluated at baseline. A significant interaction with treatment was observed in the mood-symptom subgroups, with a larger pain reduction in duloxetine-treated patients having no mood symptoms [Hospital Anxiety and Depression Scale (HADS) depression or anxiety subscale score <11; -2.33 (duloxetine) vs. -1.52 (pregabalin); p = 0.024]. There were no significant interactions between treatment and subgroups defined by age (<65 or ≥65 years), gender, baseline pain severity [Brief Pain Inventory Modified Short Form (BPI-MSF) average pain <6 or ≥6], diabetic neuropathy duration (≤2 or >2 years), baseline haemoglobin A1c (HbA1c) (<8% or ≥8%), presence of comorbidities, or concomitant medication use.
Our analyses suggest that the efficacy of duloxetine and pregabalin for initial 8-week treatment in DPNP was consistent across examined subgroups based on demographics and disease characteristics at baseline except for the presence of mood symptoms. Duloxetine treatment appeared to be particularly beneficial in DPNP patients having no mood symptoms.
Are there different predictors of analgesic response between antidepressants and anticonvulsants in painful diabetic neuropathy?
Asthmatic children and adolescents attending outpatient clinics often have a history of pneumonia. Whether respiratory symptoms, lung function, and airway inflammation differ in asthmatic patients with and without a history of pneumonia remains controversial. To compare clinical, lung functional, and inflammatory variables in asthmatic outpatients with and without a history of pneumonia. In 190 asthmatic outpatients, aged 6-18 years, we assessed respiratory symptoms, lung function (flows, volumes, and pulmonary diffusion capacity, DLCO/VA), and atopic-airway inflammation as measured by the fractional concentration of exhaled nitric oxide (FE(NO)). A previous medical and radiological diagnosis of pneumonia was defined as "recurrent pneumonia" if subjects had had at least three pneumonia episodes, or two episodes within a year. Of the 190 outpatients studied, 38 (20%) had a history of pneumonia. These patients had more frequent upper-respiratory symptoms, nighttime awakenings in the past 4 weeks, and daily use of inhaled corticosteroids, and lower FE(NO), than the 152 asthmatic children without previous pneumonia (FE(NO): 20.6 ppb, 95% CI: 15.2-28.0 vs. 31.1 ppb, 95% CI: 27.0-35.8; p<.05). Of the 38 patients with previous pneumonia, 14 had recurrent pneumonia. Despite comparable lung volumes and flows, these patients also had lower DLCO/VA than asthmatic children with non-recurrent previous pneumonia and asthmatic children without previous pneumonia (DLCO/VA%: 91.2 ± 11.3 vs. 108.5 ± 14.7 vs. 97.9 ± 18.6, p<.05).
Respiratory assessment in asthmatic children and adolescents with a history of pneumonia, especially recurrent pneumonia, often discloses symptoms needing corticosteroid therapy and, despite normal lung volumes and flows, mild reductions in the variables reflecting gas diffusion and atopic-airway inflammation (DLCO/VA and FE(NO)). Whether these respiratory abnormalities persist into adulthood remains an open question.
Does a parent-reported history of pneumonia increase the likelihood of respiratory symptoms needing therapy in asthmatic children and adolescents?
The 6-minute walk test is a way of assessing exercise capacity and predicting survival in heart failure. The effort during the 6-minute walk test has been suggested to be similar to that of daily activities. We investigated the effect of motivation during the 6-minute walk test in heart failure. We studied 12 males, age 45 +/- 12 years, ejection fraction 23 +/- 7%, and functional class III. Patients underwent the following tests: a maximal cardiopulmonary exercise test on the treadmill (max), a cardiopulmonary 6-minute walk test with the walking rhythm maintained between relatively easy and slightly tiring (levels 11 and 13 on the Borg scale) (6EB), and a cardiopulmonary 6-minute walk test using the usual recommendations (6RU). The 6EB and 6RU tests were performed on a treadmill with zero inclination and with the velocity controlled by the patient. The values obtained in the max, 6EB, and 6RU tests were, respectively, as follows: O2 consumption (ml.kg-1.min-1) 15.4 +/- 1.8, 9.8 +/- 1.9 (60 +/- 10%), and 13.3 +/- 2.2 (90 +/- 10%); heart rate (bpm) 142 +/- 12, 110 +/- 13 (77 +/- 9%), and 126 +/- 11 (89 +/- 7%); distance walked (m) 733 +/- 147, 332 +/- 66, and 470 +/- 48; and respiratory exchange ratio (R) 1.13 +/- 0.06, 0.9 +/- 0.06, and 1.06 +/- 0.12. Significant differences were observed in these variables between the max and 6EB tests, the max and 6RU tests, and the 6EB and 6RU tests (p<0.05).
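The percentages of maximum quoted in parentheses are simple ratios of the submaximal to the maximal test values, for example for oxygen consumption:

\[
\frac{9.8}{15.4} \approx 0.64, \qquad \frac{13.3}{15.4} \approx 0.86
\]

consistent, within the stated spreads, with the reported 60 +/- 10% and 90 +/- 10% of maximal O2 consumption for the 6EB and 6RU tests, respectively.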
Patients who undergo the cardiopulmonary 6-minute walk test and are motivated to walk as much as they possibly can usually walk almost at their maximum capacity, which may not correspond to that of their daily activities. The use of the Borg scale during the cardiopulmonary 6-minute walk test seems to correspond better to the metabolic demand of the usual activities in this group of patients.
Can the cardiopulmonary 6-minute walk test reproduce the usual activities of patients with heart failure?
The objective of this study was to determine the proportion of radiologists in three different radiology organizations who report using the American College of Radiology (ACR) musculoskeletal appropriateness criteria. Radiologists from the Society of Skeletal Radiology, Georgia Radiological Society, and Utah Radiological Society were surveyed regarding their use of the ACR musculoskeletal appropriateness criteria. The surveys were carried out during 1998 and data were collected using written survey forms, telephone, and fax. The overall survey response rate was 298 (64%) of 465. Overall, 30% of respondents reported using the musculoskeletal appropriateness criteria. The proportion of respondents who used the musculoskeletal criteria was not different across the three organizations or for private practice compared with academic radiologists.
The proportion of radiologists who report using the ACR musculoskeletal radiology appropriateness criteria is low. This result is consistent with other reports in the literature that show little impact on the practice of physicians after the distribution of written practice guidelines.
Do radiologists use the American College of Radiology Musculoskeletal Appropriateness Criteria?
Evidence that bicarbonate haemofiltration and dialysate fluids are superior to lactate in patients with acute kidney injury treated by continuous renal replacement therapy (CRRT) remains controversial. We prospectively audited acid-base status during the first 48 h of CRRT in 62 patients, using bicarbonate and lactate fluids. Baseline lactate was greater in the bicarbonate group (4.76 ± 0.77 vs. 2.92 ± 0.5 mmol/l, p<0.01), but pH, bicarbonate, chloride and base excess were similar. Lactate fell significantly in the bicarbonate group, to 2.88 ± 0.3 mmol/l at 24 h and 2.39 ± 0.2 mmol/l at 48 h, but not in the lactate group. Base excess improved more with bicarbonate: the median increase in the first 24 h was 51.6% (29.1-96.9) versus 18.5% (-5 to 55) with lactate, and at 48 h 74.2% (38.5-123) versus 36.1% (-3.6 to 68) (p<0.05). However, there were no significant differences in bicarbonate, chloride, pH, blood pressure or vasopressor requirements. Overall, 13.3% of patients were switched from lactate to bicarbonate fluids because of failure to correct acidosis. Subgroup analysis of 19 patients with liver failure showed similar results.
Bicarbonate fluids led to a more rapid fall in lactate and a greater improvement in base excess during CRRT, but did not improve overall control of acidosis.
Do bicarbonate-based solutions for continuous renal replacement therapy offer better control of metabolic acidosis than lactate-containing fluids?
To present the results of a comprehensive dietary review of a group of women with a recurrence of gestational diabetes mellitus (GDM), compared with a group of women with no recurrence of GDM during a subsequent pregnancy. The dietary intake of 14 women with a recurrence of GDM was compared with that of 21 women with no recurrence of GDM. Women with GDM in one pregnancy have a recurrence rate of only 30-50%. While the reasons for this have not been determined, dietary factors have been considered probable. The women with a recurrence of GDM consumed 38.4% (by diet history) and 41.4% (by food record) of their total energy intake as fats, compared with 34.1% (P<0.01) and 33.1% (P<0.001), respectively, for women with no recurrence. The percentage intake of polyunsaturated, monounsaturated, and saturated fatty acids was similar in both groups. There was a proportionate reduction in carbohydrate intake as a percentage of total energy, and in fiber intake in grams, among the women with a recurrence of GDM.
When the relationship between saturated fat intake and insulin resistance is considered, the possibility exists that dietary modification of fat intake before and during pregnancy may reduce the recurrence rate of GDM.
The recurrence of gestational diabetes: could dietary differences in fat intake be an explanation?
All patients discussed at the regional multidisciplinary meeting between June 2007 and August 2011 were included. Data were collated prospectively from multidisciplinary team records, while the site of the tumour was documented from radiology, endoscopy, operative and pathology reports. Comparative statistics (χ²) were performed using SPSS 19. Of the 1487 patients included, 255 were detected via the screening programme and 1232 from symptomatic presentation. More left-sided tumours (splenic flexure to rectosigmoid) were detected via screening (P=0.005). Of the non-screened patients (n=1232), 456 (37%) tumours were right-sided (caecum to distal transverse colon), 419 (34%) were left-sided and 357 (29%) were in the rectum. This compares with the screened group (n=255): right-sided 74 (29%), left-sided 113 (44%) and rectal 68 (27%).
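The comparison rests on a 2 × 3 contingency table (screened versus symptomatic by right-sided/left-sided/rectal); the test statistic is the usual Pearson form, where O and E are the observed and expected cell counts:

\[
\chi^2 = \sum_{i,j} \frac{(O_{ij} - E_{ij})^2}{E_{ij}}, \qquad E_{ij} = \frac{(\text{row}_i \text{ total}) \times (\text{column}_j \text{ total})}{N}
\]

Here the screened group contributes proportionally more left-sided tumours (44% versus 34%) and fewer right-sided ones (29% versus 37%).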
More left-sided tumours appear to be detected in screened patients than in those presenting symptomatically, contrary to previously published work. These results may be worthy of further consideration given the ongoing debate on the optimal means of screening.
Does the location of colorectal carcinoma differ between screened and unscreened populations?
To investigate the possible association between chronic noise-induced hearing loss and the volume of mastoid pneumatization. The study involved 46 subjects employed in the press and montage department of a gun factory: 28 in the study group with noise-induced hearing loss and 18 in the control group with no hearing loss. The volume of mastoid pneumatization was measured with computed tomography. Student's t test was used to compare the mastoid volumes of the study and control groups. The intergroup evaluations showed no significant difference between the study and control groups with regard to age, use of substances or ototoxic drugs, systemic diseases, use of personal hearing protectors, or duration of occupational and non-occupational noise exposure (P>0.05). The mean volumes of mastoid pneumatization in the study and control groups were 9717.6 mm3 and 11005.8 mm3, respectively. Although the volume of mastoid pneumatization was smaller in the study group than in the control group, this difference was not statistically significant (P>0.05).
This preliminary study showed that there was no significant correlation between mastoid pneumatization volume and chronic noise-induced hearing loss. However, this correlation could be significant in further studies with a larger number of subjects.
Is there any correlation between chronic noise-induced hearing loss and mastoid pneumatization volume?
Home-made preparation of heroin is common in countries of the former Soviet Union (FSU), and the addition of blood during its preparation and the use of contaminated syringes to distribute it may play a role in the rapid spread of HIV-1 among injecting drug users (IDUs). This study was designed to determine the viability of HIV-1 during these procedures. Field observations of home-made opiate manufacture in four FSU countries were used to develop a consensus protocol to replicate manufacture in the laboratory that included the addition of human blood contaminated with HIV-1. Following the addition of HIV-1-contaminated blood during manufacture or storage, we attempted to recover viable HIV-1. Recovery was measured by propagation of the virus in stimulated white blood cells from uninfected donors. In experiments in which HIV-1-contaminated blood was added during manufacture, no viable HIV-1 was recovered. In experiments in which chornaya (the home-made liquid opiate) was introduced into HIV-contaminated syringes, the percentage of syringes containing viable HIV-1 was reduced. The reduction appeared to be related to the interaction of HIV-1-contaminated blood with a component of the poppies. While HIV-contaminated syringes used to dispense or inject home-made opiates might transmit HIV, the ability of chornaya to reduce HIV viability seems to make this route of transmission less efficient.
The epidemic of HIV-1 among IDUs in the FSU more probably resulted from recognized injection risk behaviors, including sharing syringes and drug solutions, than from opiate solutions harboring viable HIV-1.
Can home-made injectable opiates contribute to the HIV epidemic among injection drug users in the countries of the former Soviet Union?
Various characteristics of floors and floor coverings are well established as injury hazards. Loose carpeting, such as rugs, is often cited as a hazard leading to injury. To describe the epidemiology and patterns of rug-, mat-, and runner-related injuries in patients seeking emergency treatment. Data from the National Electronic Injury Surveillance System from 1990 through 2009 were investigated. Sample weights were used to calculate national estimates. US Census Bureau data were used to calculate injury rates per 100 000 individuals. Linear regression and computation of relative risks (RRs) with 95% confidence intervals (CIs) were performed. An estimated 245 605 patients were treated in US emergency departments for rug-related injuries during the study period, with an average of 12 280 cases per year. Females (72.3%) and individuals older than 64 years (47.1%) sustained the largest number of injuries. Patients younger than 6 years were more likely to injure the head or neck region (RR, 3.52 [95% CI, 3.26-3.81]) compared with all other groups. Patients older than 18 years were more likely to experience a fracture or dislocation (RR, 2.52 [95% CI, 2.13-2.88]) and to sustain an injury as a result of tripping or slipping on a rug (RR, 1.36 [95% CI, 1.26-1.41]) compared with other age groups. Increasing age was associated with an increased risk of hospitalization in this study. Patients who sustained an injury from a rubber or plastic mat/rug were significantly less likely to be admitted (RR, 0.67 [95% CI, 0.55-0.83]). Injuries occurring in kitchens or bathrooms resulted in significantly higher admission rates (RR, 1.45 [95% CI, 1.34-1.54]).
Rug-related injuries are an important source of injury for individuals of all ages.
Throw rug-related injuries treated in US EDs: are children the same as adults?
Performing anal endosonography in complex fistula-in-ano allows us to design a personalized surgical strategy in each case, thereby improving results. However, there are doubts in the literature as to its utility in recurrent complex fistulas. The aim of this study was to compare the utility of anal ultrasonography in the study of primary versus recurrent complex fistula-in-ano. This was a prospective study of patients diagnosed and treated for complex fistula-in-ano. Physical examination and anal ultrasonography provided data on the primary track, internal opening, horseshoe extension and the presence of secondary tracks or cavities in a protocol designed specifically for the study. These assessments were subsequently contrasted with operative findings. We included 35 patients, 19 (54.3%) with primary complex anal fistulas and 16 (45.7%) with recurrent fistulas. According to the operative findings, fistulas were classified as high transsphincteric in 28 patients (80%), suprasphincteric in 6 (17.1%) and extrasphincteric in one patient (2.9%), with no differences between groups. Physical examination correctly classified 28 of the 35 fistulous tracks (80%), in contrast to the 32 (91.4%) correctly described on ultrasonography. We did not find any statistically significant differences between the primary and the recurrent fistula groups with regard to sensitivity, positive predictive value and accuracy of anal ultrasonography for any of the parameters studied.
The accuracy of anal ultrasonography does not decrease in recurrent complex fistula-in-ano.
Is anal endosonography useful in the study of recurrent complex fistula-in-ano?
Theoretically, video-assisted mediastinoscopy (VAM) offers improved staging of subcarinal lymph nodes (LNs) compared with standard cervical mediastinoscopy (SCM). Between 2006 and 2011, 553 patients (SCM, n = 293; VAM, n = 260) with non-small cell lung carcinoma who underwent mediastinoscopy were investigated. Mediastinoscopy was performed only in select patients based on computed tomography (CT) or positron emission tomography CT scans in our center. The mean number of LNs and stations sampled per case was significantly higher with VAM (n = 7.65 ± 1.68 and n = 4.22 ± 0.83) than with SCM (n = 6.91 ± 1.65 and 3.92 ± 0.86; p < 0.001). The percentage of patients sampled in station 7 was significantly higher with VAM (98.8%) than with SCM (93.8%; p = 0.002). Mediastinal LN metastasis was observed in 114 patients by mediastinoscopy. The remaining 439 patients (203 patients in VAM and 236 in SCM) underwent thoracotomy and systematic mediastinal lymphadenectomy (SML). SML showed mediastinal nodal disease in 23 patients (false-negative [FN] rate, 5.2%). The FN rate was higher with SCM (n = 14, 5.9%) than with VAM (n = 9, 4.4%), although this difference was not statistically significant (p = 0.490). Station 7 was the most predominant station for FN results (n = 15). The FN rate of station 7 was higher with SCM (n = 9, 3.8%) than with VAM (n = 6, 2.9%; p = 0.623).
FN results were most common for subcarinal LNs. VAM allows higher rates of sampling of mediastinal LN stations, including station 7, although it did not improve staging of subcarinal LNs.
Does video-assisted mediastinoscopy offer lower false-negative rates for subcarinal lymph nodes compared with standard cervical mediastinoscopy?
According to the 2002 Surgical Infection Society Guidelines on Antimicrobial Therapy for Intra-abdominal Infections, antimicrobial therapy is not recommended beyond 24 hours postoperatively for acute or gangrenous appendicitis without perforation. However, clinicians commonly consider gangrenous appendicitis to pose a greater risk of post-operative infectious complications, such as surgical site infections and intra-abdominal abscesses. This study examines the relative risk of post-operative infection between patients with simple and gangrenous appendicitis. A retrospective review of patients with either non-perforated gangrenous or simple appendicitis from 2010 to 2012 was performed at a large urban teaching hospital. The rate of post-operative intra-abdominal abscess formation, which was diagnosed on patient readmission to the hospital, was significantly greater in patients with non-perforated gangrenous appendicitis than in those with simple non-perforated appendicitis. Also, patients with non-perforated gangrenous appendicitis received extended courses of post-operative antibiotics, despite the SIS recommendations.
The role of peri-operative antibiotics for non-perforated gangrenous appendicitis merits further study.
Simple acute appendicitis versus non-perforated gangrenous appendicitis: is there a difference in the rate of post-operative infectious complications?
We wanted to explore possible associations between characteristics of carers, dementia sufferers and the caring situation and the presence of abuse that was acknowledged by carers. Eighty-two carers of dementia sufferers were interviewed in their homes about three types of abuse (verbal abuse, physical abuse and neglect) using a structured questionnaire. Fifty-two percent (n = 43) of carers admitted to having carried out some form of abuse. Verbal abuse was the most common form (n = 42, 51%), while 20% (n = 16) of carers admitted to physical abuse and 4% (n = 3) to neglect. Significant associations were found between verbal abuse and psychological ill health in the carer and behavioural problems in the dementia sufferer. Physical abuse was significantly associated with higher levels of self-reported good health by the carer. High expressed emotion measured in carers was highly correlated with all types of abuse.
It is possible to identify situations where people with dementia may be at high risk of abuse from their carers. Any effective intervention strategy should address psychological health problems in the carer, behavioural problems in the dementia sufferer and a strategy to manage high levels of expressed emotion in these situations.
Abuse of vulnerable people with dementia by their carers: can we identify those most at risk?
Despite the superior patency of internal thoracic artery (ITA) grafting compared with saphenous veins, the frequency of bilateral ITA (BITA) grafting in Europe is still approximately 10%. The aim of the present study was to compare the early outcome of patients receiving either BITA or single ITA (SITA) grafting. A total of 11,496 patients with isolated coronary artery bypass grafting (CABG), operated between January 1996 and December 2012, were analyzed retrospectively; 6,476 patients (mean age 65.2 years, 81.3% males) received BITA and 5,020 patients (mean age 66.6 years, 76.7% males) SITA grafting. Mean body mass index (BMI) was 27.2 versus 27.4, p = 0.017. Incidence of diabetes was 28.9 versus 28.4%, p = 0.08. Ejection fraction (EF) > 50% was seen in 71.3% (BITA) versus 66.3% (SITA), p < 0.001. Elective operations were performed in 88.4% (BITA) versus 83.3% (SITA), and urgent/emergent surgery was necessary in 11.6% (BITA) versus 16.7% (SITA), p < 0.001. Number of grafts was 3.76 (BITA) versus 3.06, p < 0.001. Duration of surgery (194.4 vs. 180.4 minutes) as well as cross-clamp time (60.4 vs. 51.7 minutes) was prolonged for BITA, p < 0.001. Perioperative infarction rate was 3.2% (BITA) versus 3.6%, p = 0.54. Frequency of rethoracotomy due to bleeding was higher in the BITA group (3.8 vs. 2.1%), p < 0.001. Sternal instabilities occurred in 2.3% (BITA) versus 2.2%, p = 0.749. Duration of mechanical ventilation < 12 hours was achieved in 74.6 versus 77.1%, p = 0.09, and duration of in-hospital stay was 10.5 versus 10.4 days, p = 0.68. Thirty-day mortality was 2.4% (BITA) versus 3.0%, p = 0.09. Multivariate analysis identified prolonged duration of surgery, BMI > 30, emergent operations, advanced age, and BITA grafting as predictors of sternal instability. EF < 30%, advanced age, and emergency surgery were associated with increased 30-day mortality.
CABG using BITA can be performed routinely with good clinical results and low mortality. Compared with SITA grafting, however, bleeding complications were more frequent.
Does Bilateral ITA Grafting Increase Perioperative Complications?
Due to the high number of total hip arthroplasties (THA) revised for instability, the use of large femoral heads to reduce instability is justifiable. It is critical to determine whether or not large femoral heads used in conjunction with thin polyethylene liners lead to increased wear rates, which can lead to osteolysis. Therefore, using validated wear-analysis software, we evaluated linear wear rates in a consecutive cohort of patients who underwent primary THA with thin polyethylene liners. All patients were selected from a consecutive, prospectively collected database of 241 THAs performed at a single institution by two fellowship-trained joint-reconstruction surgeons between July 2007 and June 2011. These patients were 1:1 matched to a cohort of patients who had conventional-thickness polyethylene liners. No significant differences were observed between the linear wear rates of thin and conventional-thickness liners. The Kaplan-Meier survivorship for both cohorts was 100%, and no cases of polyethylene fracture were observed in either cohort.
Our results suggest that, at a mean follow-up of 4 years, the use of thin liners in THA is promising. Longer follow-up is required to assess whether these outcomes are maintained at later time points.
Is the use of thin, highly cross-linked polyethylene liners safe in total hip arthroplasty?
To verify whether the guidelines for the treatment of heart failure have been adopted at a university hospital. The guidelines recommend the following: use of angiotensin-converting enzyme inhibitors for all patients with systolic ventricular dysfunction, use of digitalis and diuretics for symptomatic patients, use of beta-blockers for patients in functional classes II or III, use of spironolactone for patients in functional classes III or IV. We analyzed the prescriptions of 199 patients. All these patients had ejection fraction (EF) ≤0.50, their ages ranged from 25 to 86 years, and 142 were males. Cardiomyopathy was the most frequent diagnosis: 67 (33.6%) patients had dilated cardiomyopathy, 65 (32.6%) had ischemic cardiomyopathy. Angiotensin-converting enzyme inhibitors were prescribed for 93% of the patients. 71.8% also had a prescription for digitalis, 86.9% for diuretics, 27.6% for spironolactone, 12% for beta-blockers, 37.2% for acetylsalicylic acid, 6.5% for calcium channel antagonists, and 12.5% for anticoagulants. In regard to vasodilators, 71% of the patients were using captopril (85.2mg/day), 20% enalapril (21.4mg/day), 3% hydralazine and nitrates. In 71.8% of the cases, the dosages prescribed were in accordance with those recommended in the large studies.
Most patients were prescribed the same doses as those recommended in the large studies. This suggests that Brazilian patients tolerate the recommended doses well, and that failure to use these doses may reflect the physician's reluctance to prescribe them rather than patient intolerance.
Do cardiologists at a university hospital adopt the guidelines for the treatment of heart failure?
Ocular prevalence is defined as an unequal weighting of the eyes in the directional perception of stereo objects. Opinions differ as to the cause and relevance of ocular prevalence. Hans-Joachim Haase suggested that ocular prevalence is due to fixation disparity, brought about by incomplete compensation of heterophoria. He further suggested that prismatic spectacles determined by his "measuring and correcting methodology" (MKH) could restore bicentral fixation and thus establish a perceptual balance between both eyes. We examined 10 non-strabismic subjects with a visual acuity of ≥1.0 in both eyes. All 10 turned out to have a "fixation disparity type II", characterised according to Haase by a "disparate retinal correspondence". All subjects underwent the automatic Freiburg Ocular Prevalence Test, without and with MKH prisms. In addition, we examined ocular prevalence under forced vergence and compared ocular prevalence with stereoacuity. Spontaneous ocular prevalence ranged between 1 and 69%. Averaged over all 10 subjects, ocular prevalence without and with the MKH prisms was not significantly different. Statistical evaluation of single subjects revealed a significant difference in only one of the 10 (Bonferroni-corrected p = 0.001). In the subgroup of 5 subjects who underwent forced vergence, ocular prevalence remained unaltered between 0 and 18 prism diopters base-out. The stereoscopic threshold of all 10 subjects ranged between 1.5 and 14.5 arcsec. There was no correlation between ocular prevalence and stereoscopic threshold (r = -0.2, p = 0.5).
Our results indicate that ocular prevalence is largely independent of phoria correction and vergence stress. The excellent stereoacuity of all subjects suggests that ocular prevalence is abandoned for the sake of optimal resolution when very small differences in depth have to be judged.
Do prisms according to Hans-Joachim Haase influence ocular prevalence?
Ten healthy male subjects performed two bouts of 100 drop-to-vertical jumps (DVJs) from a 70-cm high platform at an interval of three weeks. CK activity, CK concentration, and neutrophils were measured prior to, and on four consecutive days after, the interventions. Besides significant main effects, there was a significant bout-by-time interaction for the specific CK activity (CK activity in blood [U/L] divided by the enzyme concentration [ng/mL]). Higher values following the first bout (133.1±99.4 U/µg) than the second bout (94.7±63.0 U/µg) indicate that the ratio of inactive to active CK molecules increased after the repeated bout. Neutrophil levels were similar following both bouts and differed only at 8 hours (7.0±2.5 bout 1 vs. 5.1±1.6 bout 2).
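As a unit check on the specific CK activity defined above (not part of the original abstract): dividing a blood activity in U/L by an enzyme concentration in ng/mL yields U/µg, because 1 ng/mL equals 1 µg/L:

    \[
      \text{specific CK activity}
        = \frac{\text{CK activity}\ [\mathrm{U/L}]}{\text{CK concentration}\ [\mathrm{ng/mL}]}
        = \frac{\mathrm{U/L}}{\mu\mathrm{g/L}}
        = \mathrm{U}/\mu\mathrm{g}.
    \]

For example, a hypothetical activity of 266 U/L measured at a concentration of 2 ng/mL (= 2 µg/L) corresponds to 133 U/µg, of the order reported after the first bout.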
The findings of the present study support the hypothesis that the blunted response of CK activity after a repeated bout of eccentric exercise is not solely the result of tissue protection, but can be at least partially attributed to enzyme inactivation.
The repeated bout effect: is the blunted creatine kinase response an effect of an altered enzyme inactivation kinetic?
Entry dyspareunia is a sexual health concern that affects about 21% of women in the general population. Characterized by pain provoked during vaginal penetration, introital dyspareunia has been shown by controlled studies to have a negative impact on the psychological well-being, sexual function, sexual satisfaction, and quality of life of afflicted women. Many cognitive and affective variables may influence the experience of pain and associated psychosexual problems. However, the role of the partner's cognitive responses has been studied very little. AIM: The aim of the present study was to examine the associations between partners' catastrophizing and their perceptions of women's self-efficacy at managing pain, on the one hand, and women's pain intensity, sexual function, and sexual satisfaction, on the other. One hundred seventy-nine heterosexual couples (mean age for women = 31, SD = 10.0; mean age for men = 33, SD = 10.6) in which the woman suffered from entry dyspareunia participated in the study. Both partners completed quantitative measures. Women completed the Pain Catastrophizing Scale and the Painful Intercourse Self-Efficacy Scale. Men completed the significant-other versions of these measures. Dependent measures were women's responses to (i) the Pain Numeric Visual Analog Scale; (ii) the Female Sexual Function Index; and (iii) the Global Measure of Sexual Satisfaction scale. Controlling for women's own pain catastrophizing and self-efficacy, results indicate that higher levels of partner-perceived self-efficacy and lower levels of partner catastrophizing are associated with decreased pain intensity in women with entry dyspareunia, although only partner catastrophizing contributed unique variance. Partner-perceived self-efficacy and catastrophizing were not significantly associated with sexual function or satisfaction in women.
The findings suggest that partners' cognitive responses may influence the experience of entry dyspareunia for women, pointing toward the importance of considering the partner when treating this sexual health problem.
Do romantic partners' responses to entry dyspareunia affect women's experience of pain?
e-Health refers to the organisation and delivery of health services and information using the internet and related technologies. We investigated the perceptions of primary care staff towards e-health initiatives in the NHS Connecting for Health programme and whether front-line staff are ready to implement such changes. Twenty participants from different professional groups were purposively selected for interview, based on their current computer usage. The same practice staff were subsequently observed in order to gain an insight into how they use computers. Practice staff (doctors, nurses, practice managers and receptionists) who will be expected to use new information technology and primary care trust (PCT) staff who are involved in its implementation were selected to participate in this study. A north London PCT with 62 general practices. Four practices were selected for the study. Analysis of the interviews and the observational data yielded six recurrent themes that have a bearing on readiness to use information and communication systems to support clinical care: perceptions of technology and NHS Connecting for Health; issues relating to resources; patient choice; matters relating to confidentiality and security; political pressures; and how information technology is currently used within primary care.
At the time of the study the systems that form part of NHS Connecting for Health, apart from the Quality Management and Analysis System (QMAS), were not implemented across the PCT. All the practices in the study acknowledged the benefits new technology would bring to the workplace, but there were also some common concerns, which suggest that staff working in primary care practices are not ready for e-health. Successful implementation of the NHS Connecting for Health programme rests on identifying, acknowledging and overcoming these concerns. A different approach might be required for those practices that have made very little progress in using email or moving towards an electronic patient record. This study suggests that a mistrust of technology and fears as to the heavy initial workload involved in becoming fully computerised have dissuaded some practices from embracing e-health. If NHS Connecting for Health is to be a success, implementation teams might need to focus initially on practices that have been reluctant to use technology to support both clinical care and the day-to-day work of the practice.
Is primary care ready to embrace e-health?
To evaluate the treatment patterns of women aged 55 years or older with newly diagnosed breast cancer and to examine the association between age and ethnicity/race on treatment selection. A cross-sectional survey was performed between January 1 and June 30, 2001 of 401 Hispanic, black and non-Hispanic white women in Los Angeles County, aged 55 years or older with newly diagnosed breast cancer. Regression analysis examined the association between: (a) age and treatment selection and (b) ethnicity/race and treatment selection, adjusting for the effect of possible confounders. In this study of urban breast cancer patients (64.1% response rate), blacks were less likely to receive hormone therapy (OR=0.36) or chemotherapy (OR=0.50), while older patients were less likely to receive lymph node dissection after lumpectomy (OR=0.48) and chemotherapy (OR=0.22).
Although there are racial and age disparities in breast cancer treatment, other factors such as education, income status, insurance plan, functional status, and comorbidity also play an important role.
Do age and ethnicity predict breast cancer treatment received?
The aim of the study was to investigate the adequacy of help delivered by the healthcare system for 12 symptoms/problems in a national, randomly selected sample of advanced cancer patients in Denmark. Advanced cancer patients (n = 1630) from 54 hospital departments across Denmark received the 3-Levels-of-Needs Questionnaire (3LNQ). The 3LNQ measures 'problem burden', the degree to which a symptom or problem is perceived as a problem, and 'felt need', whether the patient receives adequate help. Prevalences were calculated for 'problems' (at least 'a little' of a problem), 'moderate/severe problems' (at least 'quite a bit' of a problem) and 'felt need' (inadequate help or no help despite wanting it). In total, 977 (60%) patients participated. The most frequent 'problems' were fatigue (73%; 'moderate/severe' 36%) and limitations doing physical activities (65%; 'moderate/severe' 36%). For the 12 symptoms/problems assessed the prevalence of 'felt need' was 11-35%. Of the patients who had received help, 34-74% viewed the help as inadequate. Of those who had not received help, 48-78% wished for help.
Advanced cancer patients are not receiving the help they need. Large proportions of patients were burdened by symptoms/problems. Of those who had received help, many viewed it as inadequate. Better symptom/problem identification and management is warranted for advanced cancer patients.
Do advanced cancer patients in Denmark receive the help they need?
Clinical studies, with a proper scientific design, on the impact of disclosing a prognosis on a patient's psychological or physical conditions are rare. We investigated the effect of patient awareness of terminal status on survival and quality of life (QoL) in a palliative care setting. This is a prospective cohort study of patients with terminal cancer. Patients with cancer at a palliative care unit were enrolled consecutively. The patients' awareness of their terminal status was determined using a semistructured interview. Sociodemographic and clinical characteristics, including Eastern Cooperative Oncology Group performance status, depressed mood, and QoL, were investigated. To determine the independent effects of awareness of illness on survival and QoL, multivariate Cox proportional-hazards regression and multivariate linear regression were used, respectively. For the 262 cases analyzed, the median survival time (interquartile range) was 28.5 (15.8-55.3) days, and 76 (29.0%) patients were unaware of their prognosis. Patients who were aware survived for a shorter period than did those who were unaware (HR, 1.44; 95% CI, 1.07-1.93, p = 0.015) after adjusting for clinical variables including physical status and depression. Also, patients who were aware reported lower subjective QoL compared with patients who were unaware in a multivariate linear regression analysis (B, -0.10; 95% CI, -0.17 to -0.03, p = 0.008).
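For readers who want to reproduce this kind of analysis, a minimal sketch of a multivariate Cox proportional-hazards fit using Python's lifelines package; the toy table and column names below are invented for illustration and are not the study's data:

    import pandas as pd
    from lifelines import CoxPHFitter

    # Toy data standing in for the patient cohort; 'aware' flags awareness
    # of terminal status, 'ecog' is a performance-status covariate.
    df = pd.DataFrame({
        "survival_days": [28, 45, 12, 90, 33, 60, 15, 120, 25, 75],
        "died":          [1,  1,  1,  1,  1,  0,  1,  0,   1,  1],
        "aware":         [1,  0,  1,  0,  1,  1,  1,  0,   0,  1],
        "ecog":          [3,  2,  4,  2,  3,  2,  4,  1,   3,  2],
    })

    cph = CoxPHFitter()
    cph.fit(df, duration_col="survival_days", event_col="died")
    print(cph.summary)  # the exp(coef) column is the adjusted hazard ratio

The exponentiated coefficient for 'aware' plays the role of the adjusted HR of 1.44 reported above.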
Awareness of prognosis may negatively impact survival and QoL in terminally ill cancer patients. Therefore, the patient's preference for and individual susceptibility to receiving such information should be assessed carefully before disclosure.
Does awareness of terminal status influence survival and quality of life in terminally ill cancer patients?
The purpose of this study was to examine the importance of urban-rural context as a determinant of call rates to smoking cessation lines. This study used individual-level New Zealand Quitline call data from 2005 to 2009, and 2006 New Zealand Census data on smoking, to calculate Quitline call rates for smokers. Negative binomial regression examined the relationship between call rates and a sevenfold urban-rural classification, controlling for age, sex, ethnicity and deprivation. We found a significant urban-rural gradient in the rate of smokers calling Quitline. Rates were highest among smokers in main-urban areas [0.09 (95% confidence interval (CI) = 0.089, 0.091)], decreasing with successive urban-rural classifications to the lowest rate in rural/remote areas [0.036 (95% CI = 0.03, 0.04)]. This association was not confounded by age, sex, ethnicity or deprivation.
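A minimal sketch of a negative binomial rate model of the kind described above, fitted with Python's statsmodels; the stratum counts are invented, and the log of the smoker count enters as an offset so that coefficients act on rates rather than raw counts (the dispersion parameter is left at its fixed default here, which a real analysis would estimate):

    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    # Toy strata: Quitline calls and smokers at risk per urban-rural class.
    df = pd.DataFrame({
        "calls":   [900, 870, 400, 380, 150, 140, 60, 55, 25, 30, 12, 10, 8, 9],
        "smokers": [10000, 9800, 5200, 5100, 2300, 2250, 1100, 1050,
                    600, 640, 380, 360, 220, 230],
        "urban_rural": [c for c in "ABCDEFG" for _ in range(2)],
    })
    fit = smf.glm("calls ~ C(urban_rural)", data=df,
                  family=sm.families.NegativeBinomial(),
                  offset=np.log(df["smokers"])).fit()
    print(np.exp(fit.params))  # rate ratios relative to the most urban class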
Smokers in rural areas are less likely to use the New Zealand Quitline, even after controlling for confounding factors. This suggests that the national quitline is less effective in reaching rural smokers and more attention to the promotion of smoking cessation in rural communities is needed.
Examining the significance of urban-rural context in tobacco quitline use: does rurality matter?
To examine hospitalisation rates for selected heart-disease-related diagnoses by age, gender, ethnicity and deprivation. Four years' data on publicly-funded hospital discharges for: (i) heart failure and (ii) cardiac interventions were cross-classified by age group, gender, ethnicity (Maori/non-Maori) and deprivation (NZDep96). Population hospitalisation rates were calculated and displayed in multi-dimensional trellis graphs. The graphs show patterns of hospitalisation for chosen variables simultaneously. The expected increase in heart failure with age is found, as is an increase for the cardiac group up to ages 65-74 years. Clear gender differences were found. A further increase of heart failure with higher deprivation is evident throughout. For cardiac interventions, the relationship with deprivation is complex. Differences by ethnicity are disturbing. Hospitalisation rates for heart failure for Maori are typically more than double the non-Maori rates. In contrast, for the cardiac group Maori intervention rates are much lower.
Graphical analysis that displays age, gender, ethnicity and deprivation simultaneously provides great insight into hospitalisation rates. Ethnic differences are particularly concerning and raise important questions about how well Maori needs are being met and how equitable access to cardiac interventions is for Maori.
Are Maori under-served for cardiac interventions?
In cases of incurable stage IV gastric cancer with distant metastases, surgical treatment has usually consisted merely of palliation. The effect of palliative resection in these highly advanced cases remains controversial, as palliative resection may be prohibited by the potential disadvantages of surgical stress. Over the past 23 years, 382 stage IV incurable gastric cancer patients with distant metastases were classified into a resection group (group R), whose subjects underwent palliative resection of the primary tumor, and a non-resection group (group N), who were treated without resection of the primary tumor. In order to exclude patients with a very poor prognosis due to irresectability, we restricted the subjects to patients who survived beyond given thresholds (more than 30 days, 60 days, and several longer cut-offs) and estimated the mean survival. Cumulative survival rates were calculated using the Kaplan-Meier method, and the mean survivals of groups R and N were compared. A significantly longer mean survival was observed in group R than in group N (381 vs. 181 days, P<0.0001). Restricting the subjects to patients who survived more than 30 and 60 days, there was also a significant difference between the mean survival of group R and that of group N. However, restricting the subjects to patients who survived more than 300 days, no significant difference was seen between the two groups. The rate of hospital death was higher in group N than in group R (15.9% vs. 3.4%).
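For illustration, Kaplan-Meier curves of the kind used above can be computed with Python's lifelines package; the arrays below are toy values, and the log-rank comparison shown is a standard companion test rather than necessarily the authors' exact procedure:

    import numpy as np
    from lifelines import KaplanMeierFitter
    from lifelines.statistics import logrank_test

    # Toy survival times (days) and death indicators for the two groups.
    days_r = np.array([381, 420, 300, 512, 95, 260])   # group R (resection)
    died_r = np.array([1,   1,   0,   1,   1,  1])
    days_n = np.array([181, 90, 140, 60, 210, 120])    # group N (no resection)
    died_n = np.array([1,   1,  1,   1,  0,   1])

    kmf = KaplanMeierFitter()
    kmf.fit(days_r, event_observed=died_r, label="group R")
    ax = kmf.plot_survival_function()
    kmf.fit(days_n, event_observed=died_n, label="group N")
    kmf.plot_survival_function(ax=ax)

    print(logrank_test(days_r, days_n,
                       event_observed_A=died_r,
                       event_observed_B=died_n).p_value)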
Palliative resection of the primary tumor in stage IV gastric cancer is meaningful in view of hospital stay, long-term survival, and satisfaction with the treatment. We should resect the primary tumor in cases in which it is resectable.
Does the surgical stress associated with palliative resection for patients with incurable gastric cancer with distant metastasis shorten their survival?
To find out if it is feasible to extend the indication for local resection of submucosal gastric cancer without increasing the risk of lymph node metastases. Retrospective study. University hospital, Japan. 104 patients with gastric cancer confined to the submucosal layer who underwent conventional gastrectomy with lymphadenectomy. The risk of nodal metastases was analysed retrospectively depending on the depth of submucosal invasion, size of the tumour, and other clinicopathological findings. The degree of submucosal invasion, size of the tumour, and incidence of lymph node metastasis. 15/104 patients (14%) had lymph node metastases. No patient in whom submucosal invasion was less than 500 microm or tumour was less than 15 mm in diameter developed lymph node metastases. Fewer patients had lymphatic permeation (37/89) and venous involvement (21/89) in the group without lymph node metastases.
These data seem to support the hypothesis that early, minimally invasive, gastric cancer measuring <15 mm in diameter could be treated by endoscopic mucosal or local resection, and gastrectomy with lymphadenectomy might be unnecessary.
Is lymphadenectomy needed for all submucosal gastric cancers?
Although there is no documented evidence that tattoo pigments can cause neurological complications, the implications of performing neuraxial anesthesia through tattooed skin are unknown. In this study, we aimed to assess whether spinal puncture performed through tattooed skin of rabbits causes changes in the spinal cord and meninges. In addition, we sought to evaluate the presence of ink fragments entrapped in spinal needles. Thirty-six young male adult rabbits, each weighing between 3400 and 3900 g and having a spine length between 38.5 and 39 cm, were divided by lot into 3 groups as follows: GI, spinal puncture through tattooed skin; GII, spinal puncture through tattooed skin and saline injection; and GIII, spinal puncture through skin free of tattoo and saline injection. After intravenous anesthesia with ketamine and xylazine, the subarachnoid space was punctured at S1-S2 under ultrasound guidance with a 22-gauge 2½ Quincke needle. Animals in GII and GIII received 5 μL/cm of spinal length (0.2 mL) of saline intrathecally. In GI, the needle tip was placed into the yellow ligament, and no solution was injected into the intrathecal space; after tattooed skin puncture, 1 mL of saline was injected through the needle over a histological slide to prepare a smear that was dyed by the Giemsa method to enable tissue identification if present. All animals remained in captivity for 21 days under medical observation and were killed by decapitation. The lumbosacral spinal cord portion was removed for histological analysis using hematoxylin-eosin stain. None of the animals had impaired motor function or decreased nociception during the period of clinical observation. None of the animals from the control group (GIII) showed signs of injury to the meninges. In GII, however, 4 animals presented with signs of meningeal injury. The main histological changes observed were focal areas of perivascular lymphoplasmacyte infiltration in the pia mater and arachnoid. There was no sign of injury to neural tissue in any animal of either group. Tissue coring containing ink pigments was noted in all GI smears from the spinal needles used to puncture the tattooed skin.
On the basis of the present results, intrathecal injection of saline through a needle inserted through tattooed skin is capable of producing histological changes over the meninges of rabbits. Ink fragments were entrapped inside the spinal needles, despite the presence of a stylet.
Does Spinal Block Through Tattooed Skin Cause Histological Changes in Nervous Tissue and Meninges?
Adipose tissue is considered an endocrine organ, producing bioactive peptides, called adipokines. Adipokines produced by periadventitial fat have been implicated in the pathogenesis of vascular disease, including atherosclerosis. Adiponectin has established antiatherogenic actions, while the role of T-cadherin as an adiponectin receptor is not fully elucidated. The apelinergic system, consisting of apelin and its APJ receptor, is a mediator of various cardiovascular functions and may also be involved in the atherosclerotic process. We investigated the protein expression of adiponectin, T-cadherin, apelin and APJ in human aortas, coronary vessels, and the respective periadventitial adipose tissue and correlated their expression with the presence of atherosclerosis and clinical parameters. Immunohistochemistry for adiponectin, T-cadherin, apelin, and APJ was performed on human aortic and coronary artery samples including the periadventitial adipose tissue. Aortic and coronary atherosclerotic lesions were assessed using the American Heart Association (AHA) classification. Adiponectin immunostaining, of varied intensity, was detected only in adipocytes, while T-cadherin was localized to vascular smooth muscle cells (VSMCs) and endothelial cells. Apelin immunostaining was detected in adipocytes, VSMCs, endothelial cells, and foam cells in atherosclerotic lesions, while APJ was found in VSMCs and endothelia. Periadventitial adiponectin and VSMC T-cadherin expression were negatively correlated with atherosclerosis in both sites, as was VSMC apelin expression. Several other, depot-specific associations were observed.
Our results suggest a possible role for T-cadherin as a mediator of the antiatherogenic actions of adiponectin, and they support a putative antiatherogenic profile for apelin and its APJ receptor in human arteries. Further research is needed to confirm these notions.
Adiponectin/T-cadherin and apelin/APJ expression in human arteries and periadventitial fat: implication of local adipokine signaling in atherosclerosis?
Cholesterol is an essential component of the cell membrane of neuronal cells and mediates multiple functions affecting neuronal transmission in the central nervous system. Abnormalities in serum lipid fractions have been reported in depression, but the clinical and biological significance of such findings is yet to be elucidated. To study the abnormalities of lipid fractions in subjects with unipolar depression. Thirty patients with unipolar depression and a group of normal controls were recruited for this study. Serum total cholesterol, low-density lipoprotein, high-density lipoprotein, very low-density lipoprotein, and triglycerides were studied. Data analysis was performed using analysis of covariance (ANCOVA), controlling for age, sex, body mass index, lifestyle, and dietary habits. The data showed significant elevation of serum total cholesterol in depressed patients compared with normal controls. Identical results were obtained after controlling for the effects of confounders.
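A minimal sketch of an ANCOVA of the kind described above, in Python's statsmodels; the data frame is invented, and the real model additionally adjusted for lifestyle and dietary habits:

    import pandas as pd
    import statsmodels.formula.api as smf
    from statsmodels.stats.anova import anova_lm

    # Toy data: serum total cholesterol (mg/dL) by group with two covariates.
    df = pd.DataFrame({
        "cholesterol": [220, 210, 250, 190, 240, 185, 230, 200, 215, 195],
        "group": ["depressed"] * 5 + ["control"] * 5,
        "age":   [40, 35, 50, 28, 45, 30, 52, 33, 47, 38],
        "bmi":   [24.0, 26.5, 23.1, 22.0, 27.2, 21.5, 25.0, 23.8, 24.6, 22.9],
    })
    fit = smf.ols("cholesterol ~ C(group) + age + bmi", data=df).fit()
    print(anova_lm(fit, typ=2))  # group effect adjusted for age and BMI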
Results of this study point to particular subgroup of subjects who have elevated cholesterol among the depressed cohort. This may have significant implications in risk assessment for cardiovascular disorders and planning preventive strategies.
Elevated total cholesterol in severely depressed patients: role in cardiovascular risk?
To outline the patterns and temporal trends in leukemia, regarding incidence and mortality, in Canada since 1970. A descriptive analysis of trends in incidence and mortality by age, sex, time period and leukemia subtype, using change-point modelling and log-linear regression. Data from all provincial and territorial cancer registries. Incidence and mortality rates per 100,000 population. The overall age-adjusted incidence rates for all males and females increased from 12.3 and 7.3 per 100,000 in 1970-1972 to 14.6 and 9.0 in 1979-1981, then decreased to 13.2 and 8.3 in 1991-1993, respectively. The overall age-adjusted mortality rates decreased from 9.1 and 5.7 per 100,000 population in 1970-1972 to 8.3 and 4.8 in 1993-1995 for males and females, respectively. The incidence of acute leukemias decreased between 1970 and 1993. A sharp increase in the incidence of chronic leukemias from 1978 to 1980 was observed in the older age group. Mortality rates also showed a small increase from 1979 to 1989 in seniors.
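Log-linear regression of rates, as named above, is commonly fitted as a Poisson model with a log population offset, so the exponentiated year coefficient is the annual multiplicative change in the rate; a sketch with invented numbers:

    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    # Toy data: leukemia case counts and populations by period midpoint.
    df = pd.DataFrame({
        "cases":      [310, 355, 390, 372, 348],
        "population": [2.5e6, 2.6e6, 2.8e6, 2.9e6, 3.0e6],
        "year":       [1971, 1977, 1983, 1989, 1994],
    })
    fit = smf.glm("cases ~ year", data=df, family=sm.families.Poisson(),
                  offset=np.log(df["population"])).fit()
    print(np.exp(fit.params["year"]))  # e.g. 0.99 means a 1% annual rate decline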
The increase in the incidence of chronic leukemias among older subjects was probably due to improvements in diagnosis and changes in registration practices, whereas the decrease in the incidence of acute leukemias was probably a real change attributable to environmental factors. Further investigation is needed to clarify whether and to what extent diagnostic practices contributed to the increased detection of chronic leukemias among elderly Canadians.
Have diagnostic practices contributed to trends in leukemia incidence and mortality among Canadians?
Scientific papers are retracted for many reasons including fraud (data fabrication or falsification) or error (plagiarism, scientific mistake, ethical problems). Growing attention to fraud in the lay press suggests that the incidence of fraud is increasing. The reasons for retracting 742 English language research papers retracted from the PubMed database between 2000 and 2010 were evaluated. Reasons for retraction were initially dichotomised as fraud or error and then analysed to determine specific reasons for retraction. Error was more common than fraud (73.5% of papers were retracted for error (or an undisclosed reason) vs 26.6% retracted for fraud). Eight reasons for retraction were identified; the most common reason was scientific mistake in 234 papers (31.5%), but 134 papers (18.1%) were retracted for ambiguous reasons. Fabrication (including data plagiarism) was more common than text plagiarism. Total papers retracted per year have increased sharply over the decade (r=0.96; p<0.001), as have retractions specifically for fraud (r=0.89; p<0.001). Journals now reach farther back in time to retract, both for fraud (r=0.87; p<0.001) and for scientific mistakes (r=0.95; p<0.001). Journals often fail to alert the naïve reader; 31.8% of retracted papers were not noted as retracted in any way.
Levels of misconduct appear to be higher than in the past. This may reflect either a real increase in the incidence of fraud or a greater effort on the part of journals to police the literature. However, research bias is rarely cited as a reason for retraction.
Retractions in the scientific literature: is the incidence of research fraud increasing?
Randomized controlled trials of permanent atrial fibrillation ablation surgery have shown improved outcomes compared with control patients undergoing concomitant cardiac surgery. Little has been reported regarding patients with paroxysmal atrial fibrillation. We hypothesized that treating paroxysmal atrial fibrillation during cardiac surgery would not adversely affect the perioperative risk and would improve the midterm outcomes. From April 2004 to June 30, 2012, 4947 patients (excluding those with transcatheter aortic valve implants, left ventricular assist devices, trauma, transplantation, and isolated atrial fibrillation surgery) underwent cardiac surgery, and 1150 (23%) had preoperative atrial fibrillation. Of these, 552 (48%) had paroxysmal atrial fibrillation. Three groups were compared using propensity score matching: treated (n = 423, 77%), untreated (n = 129, 23%), and no atrial fibrillation (n = 3797). The treated patients had 30-day mortality similar to that of the untreated patients and those without atrial fibrillation. They had fewer perioperative complications (26% vs 46%, P = .001), greater freedom from atrial fibrillation at the last follow-up visit (81% vs 60%, P = .007), and lower mortality (hazard ratio 0.47, P = .007) compared with the untreated patients. Compared with those without atrial fibrillation, the treated patients had fewer perioperative complications (25% vs 48%, P < .001), lower freedom from atrial fibrillation at the last follow-up visit (84% vs 93%, P = .001), and similar mortality.
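Propensity score matching of the kind used above can be sketched as follows; this is a bare 1:1 nearest-neighbour match on a logistic score, without the caliper, matching-without-replacement and balance diagnostics a real analysis would add, and all data are randomly generated stand-ins:

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.neighbors import NearestNeighbors

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 5))           # toy baseline covariates
    treated = rng.integers(0, 2, size=200)  # toy flag: 1 = AF surgically treated

    # Propensity score: estimated P(treated | covariates).
    ps = LogisticRegression(max_iter=1000).fit(X, treated).predict_proba(X)[:, 1]

    # Match each treated patient to the control with the closest score.
    t_idx = np.where(treated == 1)[0]
    c_idx = np.where(treated == 0)[0]
    nn = NearestNeighbors(n_neighbors=1).fit(ps[c_idx].reshape(-1, 1))
    _, pos = nn.kneighbors(ps[t_idx].reshape(-1, 1))
    matched_controls = c_idx[pos.ravel()]   # outcomes compared across these pairs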
Concomitant surgical ablation of paroxysmal atrial fibrillation was not associated with increased perioperative risk. The treated patients had greater late freedom from atrial fibrillation and midterm survival compared with the untreated patients, and similar midterm survival compared with the patients without atrial fibrillation. These results suggest that paroxysmal atrial fibrillation warrants treatment consideration in select patients undergoing cardiac surgery.
Should paroxysmal atrial fibrillation be treated during cardiac surgery?
This study aimed to examine the independent association of cardiorespiratory fitness (CRF) and physical activity (PA) with overweight and total and abdominal obesity in an elderly population. A total of 112 males and 185 females, aged 65-103 years, were assessed for PA with accelerometers, and results from the six-minute walk test were used as a CRF marker. Waist circumference was dichotomized into normal or abdominal obesity, and BMI was categorized into normal, overweight, or obesity. Binary logistic regression models were performed. These showed that moderate-to-vigorous PA (MVPA) predicted the odds of abdominal obesity (OR = 1.4; P = 0.026), obesity (OR = 2.9; P<0.001), and both conditions coupled (OR = 4.0; P<0.001). Even after adjusting for CRF, MVPA remained a significant predictor. CRF was associated with the odds of abdominal obesity (OR = 0.4; P = 0.001).
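The odds ratios above are exponentiated logistic regression coefficients; a minimal sketch with an invented data frame (a real model would also enter the covariates the authors adjusted for):

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(1)
    df = pd.DataFrame({
        "abdominal_obesity": rng.integers(0, 2, 297),   # toy 0/1 outcome
        "mvpa_min": rng.normal(25, 10, 297),            # daily MVPA, minutes
        "walk_m":   rng.normal(420, 80, 297),           # six-minute walk, metres
    })
    fit = smf.logit("abdominal_obesity ~ mvpa_min + walk_m", data=df).fit()
    print(np.exp(fit.params))       # odds ratio per unit of each predictor
    print(np.exp(fit.conf_int()))   # 95% confidence intervals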
In conclusion, higher CRF is associated with lower risk of abdominal obesity in the elderly. Independently, MVPA predicted the odds of obesity, abdominal obesity, and the cluster of both conditions.
Are cardiorespiratory fitness and moderate-to-vigorous physical activity independently associated with overweight, obesity, and abdominal obesity in the elderly?
Ambulatory blood pressure (BP) monitoring (ABPM) is a cornerstone in resistant hypertension (RHT) management. However, it has a higher cost and lower patient acceptance than home BP monitoring (HBPM). Our objective was to evaluate the usefulness of HBPM in the management of patients with RHT. A total of 240 patients underwent 24-hour ABPM and 5-day HBPM (triplicate morning and evening measurements). Patients with uncontrolled office BP (≥140/90 mm Hg) were classified as true RHT (daytime or home BP ≥135/85 mm Hg) or white-coat RHT (daytime or home BP <135/85 mm Hg), and patients with controlled office BP were classified as masked RHT (daytime or home BP ≥135/85 mm Hg) or controlled RHT (daytime or home BP <135/85 mm Hg). Sensitivity, specificity, predictive values, and likelihood ratios for HBPM were calculated. Agreement between the procedures was evaluated using kappa coefficients and the Bland-Altman method. Mean office BP was 157±26/84±16 mm Hg, mean daytime BP was 134±18/77±13 mm Hg, and mean home BP was 143±20/76±14 mm Hg. The ABPM and HBPM diagnoses were 35% and 48%, respectively, for true RHT; 36% and 23%, respectively, for white-coat RHT; 7% and 17%, respectively, for masked RHT; and 22% and 13%, respectively, for controlled RHT. HBPM overestimated systolic BP by 8.8 (95% confidence interval (CI) = 6.8-10.7) mm Hg and diastolic BP by 0.2 (95% CI = -1.0 to 1.4) mm Hg. The specificity, sensitivity, and positive and negative predictive values of HBPM in detecting controlled ambulatory BP were 91%, 55%, 89%, and 59%, respectively.
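The Bland-Altman comparison mentioned above reduces to the mean paired difference (the bias, here the 8.8 mm Hg systolic overestimation) and its 95% limits of agreement; a sketch with toy readings:

    import numpy as np

    # Toy paired systolic readings (mm Hg): HBPM vs daytime ABPM.
    hbpm = np.array([148, 152, 139, 160, 143, 155])
    abpm = np.array([140, 141, 133, 150, 137, 144])

    diff = hbpm - abpm
    bias = diff.mean()                # systematic offset of HBPM over ABPM
    half = 1.96 * diff.std(ddof=1)    # half-width of the limits of agreement
    print(f"bias {bias:.1f} mm Hg, LoA {bias - half:.1f} to {bias + half:.1f}")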
HBPM presented good agreement with ABPM and can be used as a complementary method in the follow-up of resistant hypertensive patients, particularly in those with controlled ambulatory BPs.
Is home blood pressure monitoring useful in the management of patients with resistant hypertension?
It is well recognized that colorectal cancer does not frequently metastasize to bone. The aim of this retrospective study was to establish whether colorectal cancer ever bypasses other organs and metastasizes directly to bone and whether the presence of lung lesions is superior to liver as a better predictor of the likelihood and timing of bone metastasis. We performed a retrospective analysis on patients with a clinical diagnosis of colon cancer referred for staging using whole-body 18F-FDG PET and CT or PET/CT. We combined PET and CT reports from 252 individuals with information concerning patient history, other imaging modalities, and treatments to analyze disease progression. No patient had isolated osseous metastasis at the time of diagnosis, and none developed isolated bone metastasis without other organ involvement during our survey period. It took significantly longer for colorectal cancer patients to develop metastasis to the lungs (23.3 months) or to bone (21.2 months) than to the liver (9.8 months).
Metastasis only to bone without other organ involvement in colorectal cancer patients is extremely rare, perhaps more rare than we previously thought. Our findings suggest that resistant metastasis to the lungs predicts potential disease progression to bone in the colorectal cancer population better than liver metastasis does.
Does colon cancer ever metastasize to bone first?
To study if the pathologist's examination of surgical abortion tissue offers more information than immediate fresh tissue examination by the surgeon. Immediate examination of the fresh tissue aspirate after surgical abortion helps reduce the risk of failed abortion and other complications. Regulations in some states also require a pathologist to analyze abortion specimens at added cost to providers. We conducted this study to evaluate the incremental clinical benefit of pathology examination after surgical abortion at less than 6 weeks' gestation. As part of a prospective case series of women who had early surgical abortions at the Planned Parenthood League of Massachusetts during a 32-month period, we collected data on clinical outcomes and the results of postoperative tissue examinations. Using outcomes verified by in-person follow-up as the "gold standard," we calculated the validity of the tissue examinations by the surgeons and the outside pathologists. A total of 676 women had documented outcomes and complete tissue examination data. The sensitivity (ability of the examiner to detect an outcome other than complete abortion) was 57% (95% confidence interval [CI] 35, 76) for the surgeons' tissue inspections and 22% (95% CI 8, 44) for the pathologists' examinations. The predictive value of a positive (abnormal) tissue screen was 14% (95% CI 8, 24) and 7% (95% CI 3, 17) for the surgeons and pathologists, respectively.
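The validity figures quoted are simple ratios over the 2x2 table of screen result against verified outcome. The sketch below uses cell counts chosen so the surgeons' quoted sensitivity (57%) and positive predictive value (14%) are approximately reproduced; the actual cell counts are not given in the abstract:

    from statsmodels.stats.proportion import proportion_confint

    tp, fn, fp = 13, 10, 80           # assumed counts, for illustration only

    sensitivity = tp / (tp + fn)      # abnormal outcomes flagged by the screen
    ppv = tp / (tp + fp)              # abnormal screens that were truly abnormal
    print(sensitivity, proportion_confint(tp, tp + fn, method="wilson"))
    print(ppv, proportion_confint(tp, tp + fp, method="wilson"))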
Routine pathology examination of the tissue aspirate after early surgical abortion confers no incremental clinical benefit. Although the surgeons' tissue inspections predicted abnormal outcomes poorly, the pathologists did no better. Our results challenge the rationale for state regulations requiring pathologic analysis of all surgical abortion specimens.
Is pathology examination useful after early surgical abortion?
Abnormalities in cognitive control and disgust responding are well-documented in obsessive-compulsive disorder (OCD), and also interfere with flexible, outcome-driven utilitarian moral reasoning. The current study examined whether individuals with OCD differ from healthy and anxious individuals in their use of utilitarian moral reasoning, and whether abnormalities in inhibitory control, cognitive flexibility and disgust contribute to moral rigidity. Individuals with OCD (n = 23), non-OCD anxiety (n = 21) and healthy participants (n = 24) gave forced-choice responses to three types of moral dilemmas: benign, impersonal, personal. Scores on measures of cognitive flexibility, inhibitory control and trait disgust were also examined. Individuals with OCD gave fewer utilitarian responses to impersonal moral dilemmas compared to healthy, but not anxious, individuals. Poorer cognitive flexibility was associated with fewer utilitarian responses to impersonal dilemmas in the OCD group. Furthermore, greater trait disgust was associated with increased utilitarian responding to personal dilemmas in the OCD group, but decreased utilitarian responding to impersonal dilemmas in the anxious group. Although we did not find an association between inhibitory control and moral reasoning, smaller associations may be evident in a larger sample.
These data indicate that individuals with OCD use more rigid moral reasoning in response to impersonal moral dilemmas compared to healthy individuals, and that this may be associated with reduced cognitive flexibility. Furthermore, these data suggest that trait disgust may exert opposing effects on moral reasoning in individuals with OCD compared to those with other forms of anxiety.
Moral rigidity in obsessive-compulsive disorder: do abnormalities in inhibitory control, cognitive flexibility and disgust play a role?
The aim of this study was to evaluate the influence of three demographic criteria, gender, age and education level, on patients undergoing scintigraphy. The cross-sectional study included 220 patients for whom scintigraphy had been prescribed by their treating physician. Of these, 74 were men and 145 women, 10 children and 210 adults. By education level, 88 had not completed high school, 56 had completed high school, 27 had completed college, 22 held a university diploma, and 26 held a PhD. The chi-square test was used to analyze frequencies for variables measured on a nominal scale. The significance threshold was set at 0.05, and only values below this threshold were considered; we present and analyze only the data fulfilling this condition. Gender played an important role in first-time scintigraphy: 71.6% of male patients were undergoing scintigraphy for the first time, whereas repeat scintigraphy was more frequent among female patients (χ²(3) = 12.398, p = 0.006). Age also played an important role for all patients: first-time scintigraphy was significantly more frequent among patients aged 40-50 years, whereas patients aged over 60 were unlikely to be undergoing scintigraphy for the first time. We found no statistically significant differences with respect to education level.
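A chi-square statistic with 3 degrees of freedom, like the χ²(3) = 12.398 above, arises from a contingency table with (rows−1)×(columns−1) = 3; the sketch below uses an invented 2×4 gender-by-frequency table, not the study's counts:

    from scipy.stats import chi2_contingency

    table = [[53, 12, 6, 3],      # men: counts per scintigraphy-frequency class
             [70, 38, 22, 15]]    # women
    chi2, p, dof, expected = chi2_contingency(table)
    print(chi2, p, dof)           # dof = (2 - 1) * (4 - 1) = 3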
We conclude that access to this medical investigation does not depend on the socioeconomic status of the patient, although on some occasions the number and frequency of scintigraphies depend on gender and age. We can also conclude that the principle of egalitarianism is fulfilled, and thus the justice principle in the distribution of health care resources must be re-evaluated.
Is it enough scintigraphy for everyone?
To quantify mothers' social desirability bias with respect to their children's weight in a cross-regional Indian setting. The OBEY-AD was a cross-sectional study conducted in 7 Indian cities (Bengaluru, Mumbai, Chennai, Hyderabad, Kolkata, New Delhi and Surat), enrolling 1,680 children aged 3-11 y, of whom 50% were female. Children's BMI scores were computed, standardized according to WHO growth charts, and categorized as Normal, Overweight, Obese or Underweight. Mothers were asked to judge the weight status of their children through an iconographic test, indicating the shape that best mirrored the size of their child. Socio-demographic data, especially employment, income and education, were collected through a cross-sectional questionnaire administered to the mothers involved in the study. Overall, 369 children were obese or overweight (23.5%). Of these, 75% (278) were not recognized as such by their mothers; this figure ranged from 76% in Chennai and Surat down to 72% in Hyderabad, Kolkata, New Delhi and Mumbai. Overall agreement between perceived and desired weight status of children was very poor (p < 0.001). Surprisingly, 10% of overweight/obese children overall were considered even too lean by their mothers. Misperception of children's weight status was significantly related to urban differences and socio-economic status.
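Standardizing BMI against the WHO growth reference uses the LMS method; a sketch with placeholder parameters (the real L, M and S values are age- and sex-specific and must be looked up in the WHO tables, and the cut-offs shown are the WHO 5-19-year ones):

    def bmi_zscore(bmi: float, L: float, M: float, S: float) -> float:
        """LMS transform: z = ((bmi / M) ** L - 1) / (L * S)."""
        return ((bmi / M) ** L - 1) / (L * S)

    def who_category(z: float) -> str:
        if z > 2:
            return "Obese"        # > +2 SD
        if z > 1:
            return "Overweight"   # > +1 SD
        if z < -2:
            return "Underweight"  # < -2 SD
        return "Normal"

    # Placeholder LMS values, for illustration only.
    print(who_category(bmi_zscore(bmi=21.0, L=-1.6, M=16.0, S=0.11)))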
This study quantifies the extent of the so-called social desirability bias, namely mothers' unconscious tendency to adapt empirical evidence to more culturally legitimized ideal types of what their children's weight status is expected to be. Its association with westernized representations of leanness as an evaluation criterion for beauty has important policy implications.
Is my kid out of size?
Individuals with celiac disease (CD) are at increased risk of sepsis. The aim of this study was to examine whether CD influences survival in sepsis of bacterial origin. Nationwide longitudinal registry-based study. Through data on small intestinal biopsies from Sweden's 28 pathology departments, we identified 29,096 individuals with CD (villous atrophy, Marsh stage III). Each individual with CD was matched with five population-based controls. Among these, 5,470 had a record of sepsis according to the Swedish Patient Register (1,432 celiac individuals and 4,038 controls). Finally we retrieved data on mortality in sepsis patients through the Swedish Cause of Death Registry. CD was associated with a 19% increase in overall mortality after sepsis (95% confidence interval (CI) = 1.09-1.29), with the highest relative risk occurring in children (adjusted hazard ratio (aHR) = 1.62; 95%CI = 0.67-3.91). However, aHR for death from sepsis was lower (aHR = 1.10) and failed to reach statistical significance (95%CI = 0.72-1.69). CD did not influence survival within 28 days after sepsis (aHR = 0.98; 95%CI = 0.80-1.19).
Although individuals with CD seem to be at an increased risk of overall death after sepsis, that excess risk does not differ from the general excess mortality previously seen in celiac patients in Sweden. CD as such does not seem to influence short-term or sepsis-specific survival in individuals with sepsis and therefore is not an independent risk factor for poor prognosis in sepsis.
Does Celiac Disease Influence Survival in Sepsis?
The goal of this study was to evaluate helicopter transport to an urban level I trauma center from the scene of injury for patients with self-inflicted gunshot wounds to the head. This study is a retrospective review of the prehospital, hospital, and billing records. Despite the fact that 10 of 28 patients (36%) had an airway established by the medical flight crews, scene flights did not enhance survival. Twenty-seven of 28 patients (96%) died. The remaining patient's survival could not be attributed to the scene flight. We estimated that 27 of 28 patients would have arrived at the trauma center sooner if they had been transported by the first-responder emergency medical services ground unit. Flight service charges were approximately one-third of the hospital charges. As a group, patients with a self-inflicted gunshot wound to the head had the highest rate of organ donation in this trauma center (26%). Twenty-nine organs were harvested from the seven donors.
The use of helicopter scene flights from the scene of injury for patients with a self-inflicted gunshot wound to the head provides no medical advantage to the victims, but provides a high-yield source of desperately needed organs. The prompt establishment of an airway in the field may prolong patient survival long enough to allow evaluation for organ donation. Helicopter transport of these patients is justified only as a means of rapidly delivering the personnel capable of providing advanced airway skills to the scene. Patients requiring CPR in the field after isolated gunshot wounds to the head will not live long enough to become organ donor candidates; therefore, there is no benefit to helicopter transport for these patients.
Does the potential for organ donation justify scene flights for gunshot wounds to the head?