Dataset columns: instruction (string, 240 to 3.98k characters), input (string, 21 to 339 characters), output (string, 35 to 2.11k characters).
Answer the question based on the following context: Adrenocortical carcinoma (ACC) is a rare neoplasm with poor prognosis. Discerning ACCs from benign adenomas histologically may be difficult if invasion into surrounding tissues or metastases are missing. In order to establish molecular markers for malignancy, we analyzed seven normal adrenals, three massive macronodular ACTH-independent adrenocortical hyperplasias (MMAHs), 30 adrenocortical adenomas (ACAs) and ten ACCs. All tissues were studied for the presence of alterations in the p53 tumor suppressor gene using the PAb 1801 antibody, which detects mutant p53 protein and the pYNZ22 microsatellite marker to show loss of heterozygosity (LOH) at 17p, for expression of the proliferation-associated antigen Ki67 using the MIB1 antibody, for the rate of apoptotic tumor cells with the TdT-mediated dUTP biotin nick end labeling (TUNEL) method, and for LOH of 11q13 (menin gene locus) with the D11S956 microsatellite marker. 0/3 MMAH, 1/28 ACA and 3/10 ACC revealed immunopositive staining for p53. LOH for pYNZ22 was observed in 1/3 MMAH, 1/23 informative ACA and 6/6 informative ACC. The rate of apoptotic cells was significantly higher in ACC (P<0.0001 by ANOVA) than in ACA but there was some overlap between groups. The Ki67 index (% immunopositive cells) was 1.9+/-1.30% (mean+/-s.d.) in normal adrenals, 3.47+/-1.37% in MMAH, and 2.11+/-1.01% in ACA. ACC had the highest Ki67 index of 11.94+/-7.58% distinguishing all ACC from the ACA and MMAH studied with a cut-off level of 5%. LOH for 11q13 was detected in 2/3 MMAH, 5/26 ACA and 6/8 ACC.
Question: Discerning malignancy in adrenocortical tumors: are molecular markers useful?
We conclude that a Ki67 index above 5% is a sensitive and specific indicator of ACC and may be useful in the differentiation of adenomas from carcinomas.
Answer the question based on the following context: Detailed CT scans are often acquired during the radiotherapy planning process. This study was performed to determine the incidence of important benign and cancer-related CT findings on these scans. From December 1998 to December 2000, 162 radiotherapy patients who were to be treated curatively underwent treatment planning CT scans on a helical scanner in the radiology department at Washington Hospital, Fremont, CA. All CT scans were prospectively interpreted relative to diagnoses, and reports were dictated for the medical records. The diagnostic reports and records on all patients were reviewed to determine the incidence of previously unknown benign or cancer-related findings, the impacts of such findings on treatment, and the need for additional radiologic studies or procedures on the basis of the CT interpretations. Incidental benign findings were noted for 32 patients (20%). Potentially important benign findings were noted for three patients: two with aneurysms and one with a possible deep vein thrombosis. Potentially cancer-related findings were reported in 20 patients: a single liver lesion (four patients), multiple liver lesions (two patients), possible or probable lymphadenopathy (11 patients), abnormal soft tissue (one patient), a small-bowel obstruction (one patient), and a breast mass (one patient). After reviewing prior diagnostic studies and obtaining additional recommended studies, the physicians found that only three of the previously unknown findings required further investigation: two aneurysms, which did not require near-term treatment, and one metastatic neck node.
Question: Is diagnostic review of radiotherapy-planning CT scans important in the conformal therapy era?
Routine diagnostic interpretation of radiotherapy planning scans resulted in few important medical findings and changed patient care for less than 1% of the patients.
Answer the question based on the following context: To evaluate whether P1 and N1 evoked by ERP tasks could appropriately reflect primary visual processing in Parkinson's disease (PD). We recorded ERPs in 13 PD patients with a duration of illness of less than 5 years and in 18 age-matched normal control subjects. P1 and N1 from Oz were evoked by a visual oddball task and a delayed matching S1-S2 task. The effect of different events on P1 and N1 was studied. All patients were given an ECD-SPECT examination, and the SPECT images were overlaid on the 3D-MRI. The correlation of P1 or N1 with the regional cerebral blood flow (rCBF) was studied. P1 was not influenced by different events. There were no significant P1 differences between the PD and the normal group. N1 was significantly shorter and smaller in the patients than in the normal group. N1 amplitude after the waveform subtraction (target minus frequent) in the PD group did not differ significantly from that in the normal controls, nor from N1 before the subtraction. Nd, the subcomponent of N1 after the subtraction, was significantly earlier and smaller in the patients than in the normal controls. P1 correlated only weakly with rCBF in the occipital lobe, whereas N1 correlated with rCBF in a global region.
Question: Do P1 and N1 evoked by the ERP task reflect primary visual processing in Parkinson's disease?
The results provided some evidence that P1 might reflect primary visual processing, whereas N1 might be involved in both primary and cognitive visual processing. The altered N1 in the PD patients might be due to the deformed Nd.
Answer the question based on the following context: Recent investigations have indicated that adequate lithium treatment lowers the suicide mortality associated with affective illness. One important question is whether the mechanism by which lithium prophylaxis may be effective in prolonging survival can be explained exclusively in terms of successful protection against the recurrence of depressive episodes, or whether one should consider an independent anti-suicidal factor. We investigated a group of high-risk patients with recurrent affective disorders (n = 167) who had made one or more suicide attempts before the start of lithium prophylaxis, within a collaborative project by the International Group for the Study of Lithium Treated Patients (IGSLI). According to their recurrence-related response to long-term lithium prophylaxis, patients were classified into three groups: excellent (n = 45), moderate (n = 81) and poor responders (n = 41). Only depressive episodes resulting in hospitalisation were considered. A marked reduction in the number of suicide attempts was observed in the excellent lithium responders. However, we also found that over 80% of moderate responders and nearly 50% of poor responders did not exhibit any further suicidal behaviour during lithium treatment. Furthermore, we demonstrated a significant reduction in suicide attempts per year, compared with the corresponding pre-lithium period, in all three groups (0.10 vs. 0.33, 0.06 vs. 0.27, 0.02 vs. 0.26). There were four suicides in this high-risk group, corresponding to a suicide-related standardised mortality ratio (SMR) of 13.7. This contrasts sharply with an expected suicide SMR of approx. 100 in this population. Suicide risk was not related to the recurrence-preventing effect.
Question: Does lithium exert an independent antisuicidal effect?
The reduction in suicide attempts, in both responders and non-responders, indicates that lithium possesses a specific anti-suicidal effect besides its mood-stabilising property.
Answer the question based on the following context: Personality and cognition are often considered disparate constructs, both in normal individuals and in those with a psychosis. The goal of the present study was to analyze the relationship between dimensions of personality and cognitive performance in individuals with psychosis. Sixty-one consecutively admitted patients with an acute psychotic episode were recruited for this study. Personality was assessed through a semistructured interview with a close relative using the Personality Assessment Schedule. A wide neuropsychological battery was applied, including attentional, executive and memory tasks as well as global cognition. Assessments took place when symptomatology was in remission. Higher scores on a passive-dependent dimension were significantly associated with poorer memory performance. Similarly, higher levels on a schizoid dimension were significantly associated with poorer executive performance. The results remained significant after partialling out the effect of gender, psychopathological dimensions and drug status.
Question: Are personality traits associated with cognitive disturbance in psychosis?
It is hypothesized that personality traits and cognitive performance are interrelated domains in psychosis.
Answer the question based on the following context: Adjuvant radiotherapy for node positive breast cancer postmastectomy has been recommended by two previously published randomized controlled trials (RCTs). The local-regional recurrence rates in the control arms, however, were considered by some critics to be excessive (>25% at 10 years). Inadequate surgery, as evidenced by the low number of axillary nodes reported, may have resulted in the high local-regional recurrence rates, allowing for the benefits seen with radiotherapy. Fellowship-trained surgical oncologists might provide "better quality" surgery, resulting in lower recurrence rates and thus making adjuvant radiotherapy unnecessary. Our objective was to establish the local-regional control rate postmastectomy in node positive breast cancer patients operated on by surgical oncologists, and to determine whether treatment recommendations from previous RCTs are generalizable. Node positive stage IIb and IIIa breast cancer patients treated with mastectomy at the Medical College of Virginia Hospitals by surgical oncologists, without adjuvant radiotherapy, and entered into adjuvant chemotherapy trials between 1978 and 1993 were identified retrospectively. Pathology and follow-up records were reviewed. One hundred and thirty-seven patients were identified. A median of 18 axillary nodes was reported, with a median of 4 positive nodes. The locoregional recurrence rate at 10 years was 27% (95% confidence interval, 19-35%).
Question: Do surgical oncologists achieve lower rates of local-regional recurrence in node positive breast cancer treated with mastectomy alone?
Despite some evidence of "better quality" surgery, there was no clinically significant difference in the local-regional recurrence rate in this case series compared to controls in two previous RCTs. Recommendations for postmastectomy radiotherapy should be considered for node positive breast cancers, even if operated upon by surgical oncologists.
Answer the question based on the following context: The aim of this study was to determine if the necessity of using specific procedures to attain complete cytoreduction in ovarian cancer correlates with innate biologic aggressiveness and independently influences survival. Between 1990 and 2000, 213 patients with Stage IIIC epithelial ovarian cancer underwent complete cytoreduction before initiation of systemic platinum-based combination chemotherapy. Survival was stratified and analyzed (log rank and Cox regression) on the basis of whether extrapelvic bowel resection, diaphragm stripping, full-thickness diaphragm resection, modified posterior pelvic exenteration, peritoneal implant ablation and/or aspiration, and excision of grossly involved retroperitoneal lymph nodes were necessary to attain a visibly disease-free cytoreductive outcome. The median and estimated 5-year survival for the cohort were 75.8 months and 54%, respectively. Survival was influenced (log rank) by the requirement of diaphragm stripping (required, median 42 months vs not required, median 79 months; P = 0.03) and the extent of mesenteric and serosal implants that required removal (none, median not reached, vs 1-50 implants, median not reached, vs >50 implants, median 40 months; P = 0.002). Survival was independently influenced (Cox regression) only by the extent of peritoneal metastatic implants that required removal (P = 0.01). The other investigated procedures and type of chemotherapy used did not influence survival.
Question: Procedures required to accomplish complete cytoreduction of ovarian cancer: is there a correlation with "biological aggressiveness" and survival?
The need to remove a large number of peritoneal implants correlates with biological aggressiveness and diminished survival, but not significantly enough to preclude long-term survival or justify abbreviation of the operative effort. The need to use the other investigated procedures had minimal or no observed influence on survival.
Answer the question based on the following context: Serum leptin levels were measured in 30 healthy premenopausal obese women before and after 12 weeks of dietary intervention and after 5 months of follow-up. After the intervention, body mass index (BMI) decreased from 30.6 to 25.4 kg/m2 (p<0.01) and leptin levels decreased from 16.7 to 7.7 ng/ml (p<0.01). After 5 months of follow-up, 12 women had regained the weight they had lost and 18 women had maintained their weight loss. In the regainers, leptin levels increased again, but they remained low in the maintainers. Baseline leptin concentrations were lower in the regainers than in the maintainers (12.1 vs. 21.2 ng/ml, p = 0.04). During the intervention, leptin levels decreased three times more in the maintainers than in the regainers, although weight loss was similar in both groups.
Question: Do baseline serum leptin levels predict weight regain after dieting in obese women?
This study shows that obese women who regain weight after dieting have significantly lower baseline leptin levels than women who maintain weight loss. Our results suggest that differences in leptin resistance might exist in similarly obese women which could influence the success of dieting.
Answer the question based on the following context: Recent reforms in the federal Medicaid program have attempted to integrate beneficiaries into the mainstream by providing them with managed care options. However, the effects of mainstreaming have not been systematically evaluated. This was a cross-sectional survey of a sample of 478 adult, nonelderly asthmatics followed by a large Northern California medical group. We examined differences in self-reported access by insurance status. Compared to patients with other forms of insurance, patients covered by the state's Medicaid program (Medi-Cal) were more likely to report access problems for asthma-related care, including difficulties in reaching a health care provider by telephone, obtaining a clinic appointment, and obtaining asthma medication. Adjusting for relevant clinical and sociodemographic variables, Medi-Cal patients were more likely to report at least one access problem compared to non-Medi-Cal patients (adjusted odds ratio [AOR], 3.34; 95% confidence interval [CI], 1.43 to 7.80). Patients reporting at least one access problem were also more likely to have made at least one asthma-related emergency department visit within the past year (AOR, 4.84; 95% CI, 2.41 to 9.72). Reported barriers to care did not translate into reduced patient satisfaction.
Question: Does "mainstreaming" guarantee access to care for medicaid recipients with asthma?
Within this population of Medicaid patients, the provision of health insurance and care within the mainstream of an integrated health system was no guarantee of equal access as perceived by the patients themselves.
Answer the question based on the following context: To determine the role of carbon dioxide in the development of retinopathy of prematurity (ROP). This was a retrospective cohort study of 25 consecutive infants admitted to the neonatal unit with continuously recorded physiological data. The daily mean and standard deviation (SD) of transcutaneous carbon dioxide partial pressure (tcPCO(2)) were compared between infants who had stage 1 or 2 ROP and those who had stage 3 ROP. The time spent hypocarbic (<3 kPa) and/or hypercarbic (>10 kPa and >12 kPa) was also compared between these groups. Intermittent arterial carbon dioxide tension was also measured and compared with the simultaneous tcPCO(2) data. There were no significant differences in carbon dioxide variability or time spent hypocarbic and/or hypercarbic between the ROP groups on any day. Eighty-six percent of transcutaneous values were within 1.5 kPa of the simultaneous arterial value.
Question: Is the partial pressure of carbon dioxide in the blood related to the development of retinopathy of prematurity?
TcPCO(2) measurement can be a very useful management technique. However, in this cohort neither variable blood carbon dioxide tension nor duration of hypercarbia or hypocarbia in the first 2 weeks of life was associated with the development or severity of ROP.
Answer the question based on the following context: To study the lipid profile in a group of treated phenylketonuric patients (PKU; n = 61) compared with a group of patients with inborn errors of intermediary metabolism (IEM; n = 22), a group of hyperphenylalaninemic children (HPA; n = 37), and a control group without dietary restriction (n = 41). Phenylalanine was analyzed by ion exchange chromatography, and triglycerides, cholesterol and HDL were determined by standard procedures with the Cobas Integra analyzer. Serum total cholesterol concentrations were significantly lower in PKU patients compared with IEM patients (whose daily cholesterol intake was similar to that of PKU patients), HPA children and the control group. A negative correlation was observed between cholesterol and phenylalanine concentrations in the PKU patients.
Question: Is there a relationship between plasma phenylalanine and cholesterol in phenylketonuric patients under dietary treatment?
Our findings support the hypothesis of a relationship between high plasma phenylalanine levels and an inhibition of cholesterogenesis, although the low cholesterol intake of the special diets may also decrease serum cholesterol values.
Answer the question based on the following context: The p53 gene is an established tumor suppressor and an inducer of apoptosis. Here we attempt to determine whether the putative anticarcinogenic properties attributed to red wine and its polyphenolic constituents depend, at least in part, upon their ability to modulate p53 expression in cancer cells. Three human breast cancer cell lines (MCF-7, T47D, MDA-MB-486) and one human colon cancer cell line [Colo 320 HSR (+)] were treated for 24 h with each of four polyphenols [quercetin, (+)-catechin, trans-resveratrol, caffeic acid] at concentrations ranging from 10^-7 M to 10^-4 M, after which p53 concentrations were measured in cell lysates by a time-resolved fluorescence immunoassay. None of the polyphenols tested affected p53 expression in the breast cancer cell lines T-47D and MDA-MB-486. p53 content of MCF-7 breast cancer cells (wild-type) was increased by caffeic acid, decreased by resveratrol, and showed a twofold increase with catechin that reached borderline statistical significance; however, none of these effects were dose-responsive. Colo 320 HSR (+) cells (with a mutant p53 gene) had lower p53 content upon stimulation, reaching borderline statistical significance, but without being dose-responsive, in the presence of caffeic acid and resveratrol. Apart from toxicity at 10^-4 M, quercetin had no effect upon these four cell lines.
Question: Do wine polyphenols modulate p53 gene expression in human cancer cell lines?
The observed p53 concentration changes upon stimulation by polyphenols are relatively small, do not follow a uniform pattern in the four cell lines tested, and do not exhibit a dose-response effect. For these reasons, we speculate that the putative anticarcinogenic properties of wine polyphenols are unlikely to be mediated by modulation of p53 gene expression.
Answer the question based on the following context: The potential of dietary habits to confound the association between alcohol consumption and health needs further study. We examined whether eating habits differed according to alcohol consumption in a large cohort of French women. This was a cross-sectional study of the French cohort of the European Prospective Investigation into Cancer and Nutrition (E3N-EPIC). The cohort was established in 1990 and includes 100000 women born between 1925 and 1950. Dietary data were obtained between 1993 and 1995 by using self-administered food-frequency questionnaires. About 73000 questionnaires were analyzed, and women were placed into 7 categories of alcohol consumption. After adjustment for energy derived from alcohol, increasing alcohol consumption was associated with a higher total energy intake, a higher percentage of energy intake as protein and lipids, and higher intakes of cholesterol, fatty acids, retinol, iron, and vitamin E. Conversely, energy provided by carbohydrates decreased with increasing alcohol consumption, as did beta-carotene intake. Increasing alcohol consumption was associated with higher consumption of animal products, cheese, potatoes, oil, bread, and breakfast cereals and with lower consumption of vegetables and dairy products.
Question: Do eating habits differ according to alcohol consumption?
In this population of middle-aged, highly educated French women, marked differences in dietary patterns and nutrient intakes were found according to alcohol consumption. Part of the detrimental effect of alcohol on health may be due to the less healthy dietary habits of drinkers. This points to a confounding role of eating habits and nutrient intakes in the relation between alcohol and health.
Answer the question based on the following context: Calcium supplements are widely used to prevent osteoporosis. However, little is known about the metabolic effects of different dosages and of the timing of the dosages. The aim was to study the effects of the timing of the dose (study 1), the effects of the size of the dose (study 2), and the effects of small repetitive doses (study 3) of calcium on calcium and bone metabolism in women. The investigation was conducted in 3 parts, each with 10 participants. In study 1, calcium loads (0 and 25 mg/kg body wt) were taken at 0900 and 2100. In study 2, calcium loads of 0, 250, and 1000 mg were taken at 0900. In study 3, calcium loads of 0 and 200 mg were taken 4 times/d. Markers of calcium and bone metabolism were followed. There was no significant difference in the response of serum parathyroid hormone (PTH) to the calcium load taken at 0900 and that at 2100. There was a significant dose-response effect of the calcium load on serum ionized calcium (P = 0.00005) and serum PTH (P = 0.0003). Small calcium doses (200 mg) taken 4 times/d kept the PTH secretion at a lower level than during the control day (P = 0.016). None of the doses caused significant changes in the markers of bone formation and resorption measured.
Question: Does it make a difference how and when you take your calcium?
The calcium loads had no significant effect on the markers of bone formation and resorption measured, although even small calcium doses decreased serum PTH and increased serum ionized calcium concentrations rapidly. The effect was similar whether calcium was taken in the morning or in the evening.
Answer the question based on the following context: To describe risk factors associated with microalbuminuria (MA) in subjects with diabetes, investigate the predictive value of MA as a marker of risk for diabetic nephropathy (DN), and define risk factors associated with the development and progression of MA. We conducted a prospective longitudinal study of 23 diabetic subjects with persistent MA and 209 diabetic subjects without MA who attended diabetes clinics at the University of Michigan Medical Center in 1989 and 1990. Both groups were examined at baseline and after 7 years. At baseline, urinary albumin-to-creatinine ratios were studied in random, first morning, and 24-h urine samples. At follow-up, a 12-h overnight urine sample was collected and analyzed for albumin and creatinine. At baseline, MA was defined by at least two separate urine specimens with albumin-to-creatinine ratios between 30 and 299 microg albumin per milligram of creatinine. MA regressed in 56% of subjects with baseline MA without systematic application of corrective measures and developed in 16% of subjects without baseline MA. The positive predictive value of MA as a marker of risk for DN was 43%, and the negative predictive value was 77%. In the combined cohort, the incidence and progression of MA were significantly associated with poor glycemic control and duration of diabetes between 10 and 14 years.
Question: Does microalbuminuria predict diabetic nephropathy?
MA may not be as sensitive and specific a predictor of DN as previously suggested. Other markers of risk for DN are needed for optimal clinical management.
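The positive and negative predictive values quoted in the microalbuminuria abstract above follow directly from a 2x2 table of marker status against outcome. The sketch below is illustrative only: the cell counts are hypothetical, chosen simply to reproduce the reported 43% and 77%, since the abstract does not give the underlying tallies.

```python
def predictive_values(true_pos, false_pos, true_neg, false_neg):
    """PPV = TP / (TP + FP); NPV = TN / (TN + FN)."""
    ppv = true_pos / (true_pos + false_pos)
    npv = true_neg / (true_neg + false_neg)
    return ppv, npv

# Hypothetical counts (not the cohort's actual numbers):
ppv, npv = predictive_values(true_pos=10, false_pos=13, true_neg=160, false_neg=49)
print(f"PPV = {ppv:.0%}, NPV = {npv:.0%}")  # PPV = 43%, NPV = 77%
```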
Answer the question based on the following context: Results concerning an association between cholecystectomy and right-sided colon cancer are inconsistent. Little is known about the relation between cholecystectomy and small bowel cancer. Therefore, we evaluated cholecystectomy and risk of bowel cancer. Cholecystectomized patients, identified through the Swedish Inpatient Register from 1965 through 1997, were followed up for subsequent cancer. The standardized incidence ratio (SIR) was used to estimate relative risk. In total, 278,460 cholecystectomized patients, contributing 3,519,682 person-years, were followed up for a maximum of 33 years after surgery. Cholecystectomized patients had an increased risk of proximal intestinal adenocarcinoma, which gradually declined with increasing distance from the common bile duct. The risk was significantly increased for adenocarcinoma (SIR, 1.77; 95% confidence interval [CI], 1.37-2.24) and carcinoids of the small bowel (SIR, 1.71; 95% CI, 1.39-2.08), and for right-sided colon cancer (SIR, 1.16; 95% CI, 1.08-1.24). No association was found with more distal bowel cancer. The gradient was further pronounced when surgery of the common bile duct was included. The associations remained increased up to 33 years after cholecystectomy. No differences between sexes were found.
Question: Intestinal cancer after cholecystectomy: is bile involved in carcinogenesis?
Cholecystectomy increases the risk of intestinal cancer, a risk that declines with increasing distance from the common bile duct. Changes in the intestinal exposure to bile might be the underlying biological mechanism.
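A standardised incidence ratio such as those reported in the cholecystectomy abstract above is simply the observed case count divided by the expected count, and an exact Poisson confidence interval can be attached to it. The snippet below is a generic sketch of that calculation with made-up counts; it is not the registry analysis itself, and the exact-CI method shown is one common choice rather than necessarily the one used in the study.

```python
from scipy.stats import chi2

def sir_with_ci(observed, expected, alpha=0.05):
    """Standardised incidence ratio (observed/expected) with an exact Poisson CI.

    Assumes observed > 0; `expected` comes from reference population rates.
    """
    sir = observed / expected
    lower = chi2.ppf(alpha / 2, 2 * observed) / (2 * expected)
    upper = chi2.ppf(1 - alpha / 2, 2 * (observed + 1)) / (2 * expected)
    return sir, lower, upper

# Hypothetical counts for illustration only:
print(sir_with_ci(observed=100, expected=56.5))  # SIR = 1.77 plus its exact 95% CI
```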
Answer the question based on the following context: We observed IgM deposits in the macula densa of the distal convoluted tubule in some renal biopsies with mesangial IgM deposits and did a systematic study to investigate the frequency of this phenomenon. We compared the findings with those in IgA disease. A total of 30 renal biopsies with either isolated predominantly mesangial IgM, or mesangial IgA (+/-IgM) deposition, were retrieved from the files and reviewed independently by both authors. Eight showed strong macula densa IgM deposits and another three showed weak deposits in the macula densa on immunoperoxidase staining. A total of 14 biopsies also showed mesangial IgA deposition but IgA was not seen in the macula densa.
Question: Association of mesangial IgM with IgM deposits in the macula densa: an indication of non-specific macromolecule transport rather than immune reactant?
These results confirm the association of IgM deposits in the macula densa with mesangial IgM, and suggest that mesangial IgM deposits may be a reflection of non-specific macromolecule transport rather than an immune reactant.
Answer the question based on the following context: The purpose of this study was to analyze clinical data and magnetic resonance imaging (MRI) findings for patients with asymptomatic, incidentally identified syringomyelia associated with Chiari I malformations who were monitored for more than 10 years, and to clarify the natural history of these lesions. The clinical records of nine patients who had not been surgically treated and were regularly subjected to neurological and MRI examinations were analyzed. In MRI studies, the axial diameter of the syrinx at the widest level, the longitudinal extent of the syrinx, and the extent of tonsillar herniation into the spinal canal were analyzed. As a control, MRI findings for 11 patients with symptomatic syringomyelia associated with Chiari I malformations who had been surgically treated were also analyzed, and these MRI parameters were statistically compared between the asymptomatic and symptomatic groups. One patient underwent surgery, because of neurological changes, 7 years after the first visit. None of the remaining patients demonstrated any neurological change during the follow-up period (11.2+/-0.7 yr), and all of them have been faring well without surgery. No statistically significant differences in MRI findings between the asymptomatic and symptomatic groups were observed.
Question: Incidentally identified syringomyelia associated with Chiari I malformations: is early interventional surgery necessary?
The long-term clinical courses of patients with asymptomatic, incidentally identified syringomyelia associated with Chiari I malformations were observed to be benign. MRI parameters did not provide predictable values to recommend interventional surgery. Unless changes in neurological or MRI findings are detected, early interventional surgery is not necessary.
Answer the question based on the following context: On the basis of the contradiction between data on experimental head trauma showing oxidative stress-mediated cerebral tissue damage and failure of the majority of clinical trials using free radical scavenger drugs, we monitored the time-course changes of malondialdehyde (MDA, an index of cell lipid peroxidation), ascorbate, and dephosphorylated ATP catabolites in cerebrospinal fluid (CSF) of traumatic brain-injured patients. CSF samples were obtained from 20 consecutive patients suffering from severe brain injury. All patients were comatose, with a Glasgow Coma Scale on admission of 6 +/- 1. The first CSF sample for each patient was collected a mean of 2.95 hours after trauma (SD=1.98), after the insertion of a ventriculostomy catheter for the continuous monitoring of intracranial pressure. During the next 48 hours, CSF was withdrawn from each patient once every 6 hours. All samples were analyzed by an ion-pairing high-performance liquid chromatographic method for the simultaneous determination of MDA, ascorbic acid, hypoxanthine, xanthine, uric acid, inosine, and adenosine. In comparison with values recorded in 10 herniated-lumbar-disk, noncerebral control patients, data showed that all CSF samples of brain-injured patients had high values (0.226 micromol/L; SD=0.196) of MDA (undetectable in samples of control patients) and decreased ascorbate levels (96.25 micromol/L; SD=31.74) already at the first withdrawal, at the time of hospital admission. MDA was almost constant in the next two withdrawals and tended to decrease thereafter, although 48 hours after hospital admission, a mean level of 0.072 micromol/L CSF (SD=0.026) was still recorded. The ascorbate level was normalized 42 hours after hospital admission. Changes in the CSF values of ATP degradation products (oxypurines and nucleosides) suggested a dramatic alteration of neuronal energy metabolism after traumatic brain injury.
Question: Early onset of lipid peroxidation after human traumatic brain injury: a fatal limitation for the free radical scavenger pharmacological therapy?
On the whole, these data demonstrate the early onset of oxygen radical-mediated oxidative stress, proposing a valid explanation for the failure of clinical trials based on the administration of oxygen free radical scavenger drugs and suggesting a possible rationale for testing the efficacy of lipid peroxidation "chain breakers" in future clinical trials.
Answer the question based on the following context: To evaluate the safety of a policy of selective nonoperative management (SNOM) in patients with abdominal gunshot wounds. Selective nonoperative management is practiced extensively in stab wounds and blunt abdominal trauma, but routine laparotomy is still the standard of care in abdominal gunshot wounds. The authors reviewed the medical records of 1,856 patients with abdominal gunshot wounds (1,405 anterior, 451 posterior) admitted during an 8-year period in a busy academic level 1 trauma center and managed by SNOM. According to this policy, patients who did not have peritonitis, were hemodynamically stable, and had a reliable clinical examination were observed. Initially, 792 (42%) patients (34% of patients with anterior and 68% with posterior abdominal gunshot wounds) were selected for nonoperative management. During observation 80 (4%) patients developed symptoms and required a delayed laparotomy, which revealed organ injuries requiring repair in 57. Five (0.3%) patients suffered complications potentially related to the delay in laparotomy, which were managed successfully. Seven hundred twelve (38%) patients were successfully managed without an operation. The rate of unnecessary laparotomy was 14% among operated patients (or 9% among all patients). If patients were managed by routine laparotomy, the unnecessary laparotomy rate would have been 47% (39% for anterior and 74% for posterior abdominal gunshot wounds). Compared with patients with unnecessary laparotomy, patients managed without surgery had significantly shorter hospital stays and lower hospital charges. By maintaining a policy of SNOM instead of routine laparotomy, a total of 3,560 hospital days and $9,555,752 in hospital charges were saved over the period of the study.
Question: Selective nonoperative management in 1,856 patients with abdominal gunshot wounds: should routine laparotomy still be the standard of care?
Selective nonoperative management is a safe method for managing patients with abdominal gunshot wounds in a level 1 trauma center with an in-house trauma team. It reduces significantly the rate of unnecessary laparotomy and hospital charges.
Answer the question based on the following context: We determine whether paramedics, using written guidelines, can accurately triage patients in the field. This prospective, descriptive study was conducted at an urban county emergency medical services (EMS) system and county hospital. Paramedics triaged patients, for study purposes only, according to 4 categories: (1) needing to come to the emergency department by advanced life support (ALS) transport, (2) needing to come to the ED by any transport, (3) needing to see a physician within 24 hours, or (4) not needing any further physician evaluation. Medical records that provided patient treatment information to the point of ED disposition were subsequently reviewed (blinded to the paramedic rating) to determine which of the categories was appropriate. The protocol of the EMS system of the study site dictates that all patients should be transported except for those who refuse care and leave against medical advice. Only transported patients were included in the present study. Fifty-four paramedics triaged 1,180 patients. Mean patient age was 43.4+/-17 years; 62.0% were male. Paramedics rated 1,000 (84.7%) of the patients as needing to come to the ED and 180 (15.3%) as not needing to come to the ED. Ratings according to triage category were as follows: 804 (68.1%) category 1, 196 (16.6%) category 2, 148 (12.5%) category 3, and 32 (2.7%) category 4. Seven hundred thirty-six (62.4%) patients were discharged, 298 (25.3%) were admitted, 90 (7.6%) were transferred, 36 (3.1%) left against medical advice, and 20 (1.7%) died. The review panel determined that 113 (9.6%) patients were undertriaged; 55 (48.7%) of these patients were misclassified because the paramedics misused the guidelines. Ninety-nine patients (8.4% of the total sample) were incorrectly classified as not needing to come to the ED. This represented 55% of the patients (99/180) categorized as 3 or 4 by the paramedics. Fourteen patients (1.2% of total) were incorrectly classified as category 4 instead of 3. Of the 113 undertriaged patients, 22 (19.6%) were admitted, 86 (76.1%) were discharged, and 4 (3.5%) were transferred.
Question: Can paramedics using guidelines accurately triage patients?
Paramedics using written guidelines fall short of an acceptable level of triage accuracy to determine disposition of patients in the field.
Answer the question based on the following context: The aim of this study was to identify variations in the impact of oral health on quality of life (OHQOL) among UK residents in relation to self-reported number of teeth possessed and denture status. A further aim was to determine whether recourse to a removable prosthesis for those who claimed that they had experienced considerable tooth loss (having <20 teeth) was associated with quality of life. The vehicle for this was the Office for National Statistics Omnibus survey in Great Britain. A random probability sample of 2667 addresses was selected in a multistage sampling process. Participants were interviewed about their oral health status. The impact of oral health on quality of life was measured utilising the OHQoL-UK(W) measure. The response rate was 68%. Variations in OHQoL-UK(W) scores were apparent in relation to self-reported number of teeth possessed (P<0.001) and denture status (P<0.001). Moreover, disparities in OHQOL were apparent among those who had experienced considerable tooth loss and did not have recourse to a denture (P<0.001). In regression analysis, those who claimed that they had <20 natural teeth but had no recourse to a denture were less than half as likely to enjoy enhanced oral health-related quality of life compared to others in the population (OR = 0.46, 95% CI 0.30, 0.71), controlling for socio-demographic factors.
Question: Can dentures improve the quality of life of those who have experienced considerable tooth loss?
Experience of considerable tooth loss without recourse to a removable dental prosthesis is an important predictor of oral health related quality of life, as captured by OHQoL-UK(W), and associated with reduced quality of life.
Answer the question based on the following context: Professional setting might be a key determinant of physicians' attitudes toward practice guidelines, influencing the effect of their implementation. Because no previous surveys have specifically considered this aspect, we evaluated the perceived role and usefulness of guidelines, as well as barriers to and facilitators of their implementation, for hospital, primary care, and nonpracticing clinicians. A 43-item self-administered questionnaire was sent to all National Health Service physicians in the province of Modena, Italy (593 primary care physicians, 1049 hospital physicians, and 149 nonpracticing clinicians), and 1199 (66.9%) responded. Opinions and attitudes were assessed using 5-point ordinal scales and an attitude measurement scale. Results were evaluated overall and by professional setting, sex, age, year of graduation, and academic background. Practice guidelines were generally perceived to be less useful than other sources of medical information (eg, personal experience, conferences, colleagues, articles, the Internet, and textbooks [pharmaceutical representatives were the exception]). Most physicians thought that guidelines are developed for cost-containment reasons and expressed concerns about their limited applicability to individual patients and local settings. Most respondents did not favor the involvement of health professionals other than physicians in guideline development and use and preferred nonmonetary incentives for their implementation. Answers to individual items and attitude scores varied significantly across professional settings. Primary care physicians showed, in general, the least favorable attitudes toward practice guidelines, toward nonphysicians participating in guideline development and use, and toward incentives for guideline users.
Question: Practice guidelines: useful and "participative" method?
Physicians perceived practice guidelines as externally imposed and cost-containment tools rather than as decision-supporting tools. Regularly monitoring attitudes toward practice guidelines can be helpful to evaluate potential barriers to their adoption.
Answer the question based on the following context: To compare the uptake of SH U 508A in different types of liver lesions by using stimulated acoustic emission. Thirty-seven patients with characterized lesions (metastasis, n = 17; hepatocellular carcinoma, n = 4; hemangioma, n = 9; focal nodular hyperplasia, n = 7) received 2.5 g SH U 508A. After 5 minutes, stimulated acoustic emission was elicited by using a previously described method. Liver and/or lesional differences were assessed with videodensitometry (objective conspicuity score), and two observers assessed each lesion by using a six-point scale (subjective conspicuity score). Metastases and hepatocellular carcinoma had low stimulated acoustic emission; median objective conspicuity scores were 70% and 68% (all scores were ≥43%), respectively, and subjective conspicuity scores were 2 or higher for both observers. Hemangiomas had reduced stimulated acoustic emission, with more variability; the median objective conspicuity score was 41% (range, 9%-72%), and the median subjective conspicuity scores were 2 (range, 1-4) and 3.5 (range, 1-5) for observers 1 and 2, respectively. Focal nodular hyperplasia had stimulated acoustic emission comparable to that of the liver in all cases; the median objective conspicuity score was -4.7% (all scores were <6%), and the subjective conspicuity score was 1 or lower for both observers. This finding completely separated focal nodular hyperplasia and malignancies. Significant differences were seen between focal nodular hyperplasia and all other lesion types (P<.05).
Question: Do different types of liver lesions differ in their uptake of the microbubble contrast agent SH U 508A in the late liver phase?
Strong late-phase lesional uptake of SH U 508A is characteristic of focal nodular hyperplasia, is seen in some hemangiomas, and was not observed in malignancies.
Answer the question based on the following context: Experimental studies have revealed that stent configuration influences intimal hyperplasia. The purpose of this study was to evaluate clinical outcomes for 2 stent designs in a randomized trial with quantitative coronary angiography (QCA) and intravascular ultrasonography (IVUS). We randomly assigned 100 patients with 107 lesions and symptomatic coronary artery disease to deployment of a Multilink stent (Advanced Cardiovascular Systems, Guidant, Santa Clara, Calif) or a GFX stent (Applied Vascular Engineering, Santa Rosa, Calif) with IVUS guidance. QCA and IVUS studies were performed before and after intervention and at follow-up (4.2 +/- 1.0 months). There were no significant differences in baseline characteristics and QCA and IVUS parameters before and after intervention between the 2 groups. However, minimal lumen diameter at follow-up was significantly larger in the Multilink group (2.46 +/- 0.59 vs 2.08 +/- 0.79 mm, P<.05). Maximal in-stent intimal hyperplasia was significantly larger in the GFX group (2.9 +/- 1.7 vs 1.8 +/- 1.2 mm(2), P<.01). The restenosis rate differed between the 2 groups (Multilink 4% vs GFX 26%, P =.003). In multiple stepwise logistic regression analysis, the only predictor that significantly correlated with restenosis was stent type (P<.01). The odds ratio for the GFX stent-treated vessels was 18.65 (95% confidence interval 2.10-165.45).
Question: Does stent design affect probability of restenosis?
With deployment of the GFX stent, a thicker neointima develops within the stent. Stent configuration may affect clinical outcomes.
Answer the question based on the following context: Pathologic studies and surgical observations of thickened aortic walls have suggested an increase in aortic stiffness in patients with Williams syndrome. However, in vivo objective evaluation of aortic and arterial stiffness in Williams syndrome are lacking. Moreover, systemic hypertension, although prevalent in Williams syndrome, does not have a well-defined mechanism in this syndrome. Therefore, the purpose of this study was to quantitate aortic stiffness and arterial compliance in an objective manner, as well as to determine their roles in development of hypertension, in children with Williams syndrome. We studied 13 patients with Williams syndrome (aged 3-12 years) and 16 age-matched control subjects. Aortic stiffness was calculated from the beta index as follows: beta = (ln[P(s)/P(d)])/ ([D(s) - D(d)]/D(d)), where P(s) and P(d) are systolic and diastolic blood pressures and D(s) and D(d) are systolic and diastolic aortic dimensions, respectively. Arterial compliance (C) was calculated by the area method: C= (A(d) x CO x CL) / (A(t) x [P(es) - P(d)]), where A(t) is the total area and A(d) is the area under the diastolic portion of the arterial pulse tracing, CO is the cardiac output, CL is the cycle length, and P(es) is aortic end-systolic pressure. In patients with Williams syndrome, the beta index was 2-fold higher than in control patients (9.02 +/- 3.15 vs 4.43 +/- 0.96, P<.005). Moreover, there was a strong positive correlation between the beta index and the systolic blood pressure (r = 0.8 and P<.0001). Compliance was decreased by 42% (0.41 +/- 0.11 vs 0.71 +/- 0.10 mL/mm Hg, P<.05), suggesting decreased arterial compliance.
Question: Evaluation of arterial stiffness in children with Williams syndrome: Does it play a role in evolving hypertension?
Our study indicates that in vivo arterial stiffness is increased in patients with Williams syndrome. We speculate that increased arterial stiffness may be the predisposing cause of systemic hypertension in Williams syndrome.
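The stiffness and compliance formulas quoted in the Williams syndrome abstract above translate directly into code. A minimal sketch follows; the formulas are as stated in the abstract, but the numeric inputs are hypothetical values chosen only for demonstration (units must simply be consistent with how the pressures, dimensions and flows were measured).

```python
import math

def beta_index(p_sys, p_dia, d_sys, d_dia):
    """Aortic stiffness index: beta = ln(Ps/Pd) / ((Ds - Dd) / Dd)."""
    return math.log(p_sys / p_dia) / ((d_sys - d_dia) / d_dia)

def arterial_compliance(a_dia, a_total, cardiac_output, cycle_length, p_es, p_dia):
    """Area method: C = (Ad * CO * CL) / (At * (Pes - Pd))."""
    return (a_dia * cardiac_output * cycle_length) / (a_total * (p_es - p_dia))

# Hypothetical example values, not data from the study:
print(beta_index(p_sys=110, p_dia=65, d_sys=2.2, d_dia=2.0))            # ~5.3
print(arterial_compliance(a_dia=0.6, a_total=1.0, cardiac_output=4.0,
                          cycle_length=0.8, p_es=90, p_dia=65))          # ~0.077
```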
Answer the question based on the following context: To assess the relationship between word identification and intelligence in adults with mild mental retardation (IQ<80). A standardized evaluation was administered to 67 adults with mild mental retardation. The evaluation included a psychiatric interview, the WAIS-R, and a 2-h interview with a speech therapist and reading tests. Causes of mental retardation were diverse in the sample, and IQ scores ranged from 43 to 79 (mean score = 64). All subjects exhibited reading impairment, including 69% with severe impairment. No subject with an IQ score under 65 was able to perform adequately in the word identification tasks. Word identification was correlated with total and verbal IQ, but not with performance IQ.
Question: Word identification in adults with mild mental retardation: does IQ influence reading achievement?
Our results suggest that, in contrast to subjects with normal intelligence, IQ score is correlated with reading in subjects with mild mental retardation. Finally, remediation should be preferentially implemented for subjects with IQ score greater than 65.
Answer the question based on the following context: To investigate the discrepancies between outcomes for competence (can do) and actual performance (do do) in activities of daily living (ADLs). Baseline measurements of a population-based follow-up study. Leiden 85-Plus Study, the Netherlands. Five hundred and ninety-nine persons, age 85. The response rate was 86%. Face-to-face interviews. Measurements of competence and actual performance were based on the Groningen Activity Restriction Scale. Help received was assessed for several domains. Prevalence rates for disability were assessed according to the concepts of both competence and actual performance. Analysis was performed separately for basic activities of daily living (BADLs) and instrumental activities of daily living (IADLs). Seventy-seven percent of the oldest old were competent to perform all the BADLs and performed them regularly. Fifteen percent were not competent to perform certain BADLs independently but performed them regularly with help from others. The prevalence of disability defined as inability in one or more BADLs was 22% for women and 10% for men. The prevalence of disability defined as inactivity in one or more BADLs was 16% for women and 17% for men. Only 5% of the oldest old were competent to perform all IADLs and performed them regularly. In spite of being competent, 70% did not perform certain IADLs regularly. The prevalence of disability defined as inability in one or more IADLs was 64% for women and 55% for men. The prevalence of disability defined as inactivity in one or more IADLs was 92% for women and 98% for men.
Question: Disability in the oldest old: "can do" or "do do"?
The structural discrepancies between the outcomes of competence and actual performance have important consequences when estimating disability in old people. Promoting actual performance in IADLs may reduce disability.
Answer the question based on the following context: To assess the relative ability of principal components analysis (PCA)-derived dietary patterns to correctly identify cases and controls compared with other methods of characterising food intake. Participants in this study were 232 endometrial cancer cases and 639 controls from the Western New York Diet Study, 1986-1991, frequency-matched to cases on age and county of residence. Usual intake of 190 foods and beverages in the year preceding the interview was collected during a personal interview using a detailed food-frequency questionnaire. Principal components analysis identified two major dietary patterns, which we labelled 'healthy' and 'high fat'. Classification of disease status was assessed with separate discriminant analyses (DAs) for four different characterisation schemes: stepwise DA of 168 food items to identify the subset of foods that best discriminated between cases and controls; foods associated with each PCA-derived dietary pattern; fruits and vegetables (47 items); and stepwise DA of USDA-defined food groups (fresh fruit, canned/frozen fruit, raw vegetables, cooked vegetables, red meat, poultry, fish and seafood, processed meats, snacks and sweets, grain products, dairy, and fats). In general, classification of disease status was somewhat better among cases (54.7% to 67.7%) than controls (54.0% to 63.1%). Correct classification was highest for fruits and vegetables (67.7% and 62.9%, respectively) but comparable to that of the other schemes (49.5% to 66.8%).
Question: Is principal components analysis necessary to characterise dietary behaviour in studies of diet and disease?
Our results suggest that the use of principal components analysis to characterise dietary behaviour may not provide substantial advantages over more commonly used, less sophisticated methods of characterising diet.
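The two-step approach described in the dietary-pattern abstract above — derive pattern scores with PCA, then ask how well different food-characterisation schemes discriminate cases from controls — can be sketched with standard tools. The snippet below is only an illustration of that general workflow: it uses simulated food-frequency data and linear discriminant analysis from scikit-learn, not the authors' data or their exact stepwise procedure.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
# Simulated food-frequency matrix: 871 subjects x 190 items (arbitrary values),
# with random case/control labels; real data would come from the questionnaire.
X = rng.poisson(lam=3.0, size=(871, 190)).astype(float)
y = rng.integers(0, 2, size=871)

# Step 1: derive two "dietary pattern" scores with PCA.
pattern_scores = PCA(n_components=2).fit_transform(X)

# Step 2: compare how well two characterisation schemes classify disease status.
lda = LinearDiscriminantAnalysis()
acc_patterns = cross_val_score(lda, pattern_scores, y, cv=5).mean()  # PCA patterns
acc_all_foods = cross_val_score(lda, X, y, cv=5).mean()              # all food items
print(f"patterns: {acc_patterns:.2f}  all foods: {acc_all_foods:.2f}")
```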
Answer the question based on the following context: To assess and contrast awareness of the link between dietary fibre and folate and their major food sources (fruit, vegetables, bread and cereals). Mailed questionnaire investigating changes made to dietary intake of fibre, folate, fruit, vegetables, bread and cereals in the previous six months. The survey was conducted between June and November 1998 in the Australian Capital Territory. One thousand one hundred and twenty-six adults randomly selected from the electoral roll. More women than men in both older (50+ years) and younger (18-49 years) age groups reported increasing their consumption of folate, fibre, fruit and vegetables in the prior six months. In contrast, more men than women reported increased consumption of bread, cereals, rice and pasta in the previous six months. For food categories and fibre, less than 4% of respondents were unsure about changes in these food habits. However, 26% of men and women were 'not sure' about changes to folate intake. Similar proportions of men and women (about 33%) reported consuming more fruit, vegetables or cereal-based foods over the prior six months, yet only 6% of these men and 14% of these women reported consuming more folate. In contrast, 44% of men and 51% of women who reported consuming more plant foods also reported consuming more dietary fibre.
Question: Is the link between nutrients and foods understood?
The results suggested that subjects, particularly the younger age group, had a poor understanding of the relationship between folate intake and its major food sources. The understanding of the relationship between fibre intake and its food sources appeared substantial, but confusion about specific food sources was still evident. These outcomes question the effectiveness of nutrition education used to date, particularly for the current priority of increasing folate intake in younger women in the new, 'health claims' environment.
Answer the question based on the following context: This study aimed to determine whether pre-existing angiographic thrombus was associated with adverse in-hospital and six-month outcomes after percutaneous coronary interventions. There are conflicting data about whether pre-existing thrombus is an independent predictor of adverse in-hospital and short-term outcome after coronary interventions. The Angiographic Trials Pool, a data set derived from eight prospective randomized trials, was analyzed. The study population consisted of 7,917 patients who underwent coronary interventions between 1986 and 1995. Two trials were excluded because they did not collect information regarding thrombus. Patients from the other six trials were divided on the basis of the presence or absence of thrombus. In patients with (n = 2,752) and without (n = 5,165) thrombus, in-hospital mortality following angioplasty was low (0.8 vs. 0.6%, p = 0.207). Several adverse outcomes were higher in patients with thrombus: death/myocardial infarction (8.4 vs. 5.5%, p ≤ 0.001), in-hospital abrupt closure (5.9 vs. 3.9%, p ≤ 0.001) and an in-hospital composite of death, myocardial infarction and/or repeat revascularization (15.4 vs. 11.2%, p ≤ 0.001). Six-month mortality was low and comparable between the two groups (2.1 vs. 1.8%, p = 0.34), but the incidence of six-month death/myocardial infarction was higher in patients with thrombus (11.7 vs. 8.7%, p ≤ 0.0001).
Question: Does the presence of thrombus seen on a coronary angiogram affect the outcome after percutaneous coronary angioplasty?
Percutaneous coronary angioplasty can be performed with low mortality in patients with pre-existing thrombus, although these patients are at higher risk of in-hospital and six-month death/myocardial infarction. Continued efforts are required to optimize the outcome in these high risk patients.
Answer the question based on the following context: This study was designed to determine whether arterial remodeling and plaque vulnerability are influenced by systemic factors. Atherosclerotic luminal narrowing is caused by gradual plaque growth and arterial remodeling. In the acute phase, luminal narrowing may be accelerated by acute thrombus formation, usually precipitated by rupture of a vulnerable plaque. Femoral arteries were obtained from elderly individuals at autopsy. Pairs of atherosclerotic femoral arteries from 42 individuals were examined. The arteries were divided in 1-cm intervals. Plaque size, the mode of arterial remodeling and histopathologic characteristics of plaque vulnerability (lipid-rich core and plaque inflammation) were compared between right and left femoral arteries obtained from the same individual. A role for systemic factors was assumed if a phenomenon was equally present in both arteries. There was concordance in average plaque size (r(2) = 0.5, p<0.001), expansive remodeling (kappa = 0.42, p = 0.007) and occurrence of plaques containing a large lipid-rich core (kappa = 0.60, p = 0.001), but no concordance in plaque inflammation (kappa = 0.067, p = 0.61) between right and left arteries.
Question: Plaque burden, arterial remodeling and plaque vulnerability: determined by systemic factors?
These results suggest that not only the amount of atherosclerosis, but also arterial remodeling and lipid deposition in plaques, are influenced by systemic factors. The nonhomogeneous distribution of inflammation in atherosclerotic arteries supports the hypothesis that plaque inflammation is locally affected.
Answer the question based on the following context: The aim of this study was to examine the association between atherosclerosis risk factors, aortic atherosclerosis and aortic valve abnormalities in the general population. Clinical and experimental studies suggest that aortic valve sclerosis (AVS) is a manifestation of the atherosclerotic process. Three hundred eighty-one subjects, a sample of the Olmsted County (Minnesota) population, were examined by transthoracic and transesophageal echocardiography. The presence of AVS (thickened valve leaflets), elevated transaortic flow velocities and aortic regurgitation (AR) was determined. The associations between atherosclerosis risk factors, aortic atherosclerosis (imaged by transesophageal echocardiography) and aortic valve abnormalities were examined. Age, male gender, body mass index (odds ratio [OR]: 1.07 per kg/m(2); 95% confidence interval [CI]: 1.02 to 1.12), antihypertensive treatment (OR: 1.93; CI: 1.12 to 3.32) and plasma homocysteine levels (OR: 1.89 per twofold increase; CI: 0.99 to 3.61) were independently associated with an increased risk of AVS. Age, body mass index and pulse pressure (OR: 1.21 per 10 mm Hg; CI: 1.00 to 1.46) were associated with elevated (upper quintile) transaortic velocities, whereas only age was independently associated with AR. Sinotubular junction sclerosis (p = 0.001) and atherosclerosis of the ascending aorta (p = 0.03) were independently associated with AVS and elevated transaortic velocities, respectively.
Question: Aortic valve sclerosis and aortic atherosclerosis: different manifestations of the same disease?
Atherosclerosis risk factors and proximal aortic atherosclerosis are independently associated with aortic valve abnormalities in the general population. These observations suggest that AVS is an atherosclerosis-like process involving the aortic valve.
Answer the question based on the following context: Two thirds of nursing homes are investor owned. This study examined whether investor ownership affects quality. We analyzed 1998 data from state inspections of 13,693 nursing facilities. We used a multivariate model and controlled for case mix, facility characteristics, and location. Investor-owned facilities averaged 5.89 deficiencies per home, 46.5% higher than nonprofit facilities and 43.0% higher than public facilities. In multivariate analysis, investor ownership predicted 0.679 additional deficiencies per home; chain ownership predicted an additional 0.633 deficiencies. Nurse staffing was lower at investor-owned nursing homes.
Question: Does investor ownership of nursing homes compromise the quality of care?
Investor-owned nursing homes provide worse care and less nursing care than do not-for-profit or public homes.
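The deficiency figures in the nursing home entry above imply the comparison-group averages even though they are not stated directly. A minimal Python sketch, assuming the reported 46.5% and 43.0% differences are expressed relative to the nonprofit and public averages respectively, back-calculates those averages:

```python
# Back-calculate the comparison-group averages implied by the abstract.
# Assumption (not stated explicitly in the abstract): "46.5% higher" and
# "43.0% higher" are relative to the nonprofit and public averages.

investor_owned = 5.89               # deficiencies per home, as reported
nonprofit = investor_owned / 1.465  # investor-owned is 46.5% higher than nonprofit
public = investor_owned / 1.430     # investor-owned is 43.0% higher than public

print(f"Nonprofit average: {nonprofit:.2f} deficiencies per home")  # ~4.02
print(f"Public average:    {public:.2f} deficiencies per home")     # ~4.12
```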
Answer the question based on the following context: Recently, a number of studies have reported positive results from the nonoperative management of fistula-in-ano in infancy, although it has not been successful in all patients. The purpose of this study was to identify effective treatment for fistula-in-ano in infants. A retrospective review was done of 310 children who required operative management for fistula-in-ano or perianal abscess between January 1991 and July 2000. Eighteen patients displayed an onset of symptoms at less than 1 year of age and a duration of symptoms longer than 12 months. The authors analyzed these patients' medical records. All patients were boys. The mean duration of symptoms was 26.6 +/- 27.5 months. Fourteen patients had shown an onset of symptoms at less than 6 months of age. The longest duration was 10 years. These patients had been managed conservatively for more than 12 months because their parents did not want them to undergo surgery. The disease in these patients followed two patterns. In one (6 patients), the onset of symptoms was followed by a silent fistula-in-ano state; in the other (12 patients), it was followed by intermittent relapses of inflammation. All patients underwent fistulotomy, and none of them had a recurrent fistula during the follow-up period.
Question: Fistula-in-ano in infants: is nonoperative management effective?
Although the advantages of nonoperative management of fistula-in-ano in infants include the avoidance of general anesthesia and surgical intervention, the lesions were not cured by a period of conservative management. Surgical management is more effective with respect to the time to resolution.
Answer the question based on the following context: Anti-gliadin and anti-endomysium antibodies are useful markers in the screening and follow-up of coeliac disease. The recent finding that tissue transglutaminase is the main auto-antigen of anti-endomysium has led to the discovery of anti-tissue transglutaminase antibodies. AIM: To compare, in a prospective study, the diagnostic accuracy of anti-tissue transglutaminase, anti-gliadin and anti-endomysium antibodies in a large series of adult patients. The study involved 80 consecutive subjects undergoing upper gastrointestinal tract endoscopy for suspected coeliac disease (subsequently confirmed in 40 cases), 195 coeliac patients on a gluten-free diet, and 70 patients with different gastrointestinal disorders and normal duodenal histology. Anti-gliadin, anti-endomysium and anti-tissue transglutaminase antibody levels were measured using commercial kits. The diagnostic sensitivity and specificity of anti-gliadin, anti-endomysium and anti-tissue transglutaminase antibodies were, respectively, 95% and 89.1%, 100% and 97.3%, and 100% and 98.2%: the agreement between the markers was substantial or almost perfect. In terms of follow-up, the positivity of the markers varied according to the strict adherence to, and duration of, the gluten-free diet; the agreement between anti-endomysium and anti-tissue transglutaminase antibodies was almost perfect.
Question: Serological markers for coeliac disease: is it time to change?
Anti-endomysium and anti-tissue transglutaminase antibodies are both highly efficient for routine laboratory screening: the choice of one or the other will depend on the available facilities. However, neither can replace intestinal biopsy for general population screening because, in this case, their respective positive predictive values are only 15.7% and 21.8%. During follow-up, anti-gliadin antibodies retain their value as an early predictor of gluten ingestion.
Answer the question based on the following context: Coronary artery bypass graft (CABG) surgery and percutaneous transluminal coronary angioplasty (PTCA) are well-established treatments for symptomatic coronary artery disease. Previous studies have documented racial differences in rates of use of these cardiac revascularization procedures. Other studies suggest that these procedures are overused: that is, they are done for patients with clinically inappropriate indications. To test the hypothesis that the higher rate of cardiac revascularization among white patients is associated with a higher prevalence of overuse (revascularization for clinically inappropriate indications) among white patients than among African-American patients. Observational cohort study using Medicare claims and medical record review. 173 hospitals in five U.S. states. A stratified, weighted, random sample of 3960 Medicare beneficiaries who underwent coronary angiography during 1991 and 1992; 1692 of these patients underwent 1711 revascularization procedures within 90 days. The proportion of CABG and PTCA procedures rated appropriate, uncertain, and inappropriate according to RAND criteria, and the multivariate odds of undergoing inappropriate revascularization among African-American patients and white patients. After angiography, rates of PTCA (23% vs. 19%) and CABG surgery (29% vs. 17%) were significantly higher among white patients than among African-American patients. The respective rates of inappropriate PTCA and CABG surgery were 14% and 10%. Among the study states, rates of inappropriate use ranged from 4% to 24% for PTCA and 0% to 14% for CABG surgery. White patients were more likely than African-American patients to receive inappropriate PTCA (15% vs. 9%; difference, 6 percentage points [95% CI, -0.4 to 12.7 percentage points]), and difference by race was statistically significant among men (20% vs. 8%; difference, 12 percentage points [CI, 1.2 to 21.7 percentage points]). Rates of inappropriate CABG surgery did not differ by race (10% in both groups).
Question: Racial differences in cardiac revascularization rates: does "overuse" explain higher rates among white patients?
Among a large and diverse sample of Medicare beneficiaries in five U.S. states, overuse of PTCA was greater among white men than among other groups, but this difference did not fully account for racial disparities in revascularization. Overuse of cardiac revascularization varied significantly by geographic region.
Answer the question based on the following context: The present study focuses on the opinions of family members of nursing home residents with dementia about the quality of nursing home care. Furthermore, we examined whether family members' appreciation of the care increased as a result of the implementation of emotion-oriented care. Randomized clinical trial. An 18-item questionnaire was developed. The following subjects were addressed: communication activities between staff and family members; satisfaction regarding contacts with staff; the extent to which family members can participate in care; the contact that family members experience with the person with dementia, and opinions about the way in which nursing staff treat residents. Most family members already had a positive opinion of the nursing home care prior to the implementation of emotion-oriented care. The most positive assessment concerned the way in which nursing staff treated residents. The lowest scores concerned communication activities between ward staff and family members. Comparison of the first and final measurements showed that, in general, opinions on the quality of care did not change. A large number of incomplete questionnaires made it impossible to conduct a factor analysis of the classification of the questions into sections, so statements could be made only at the item level.
Question: The quality of nursing home care: do the opinions of family members change after implementation of emotion-oriented care?
For the most part family members had a positive opinion on the nursing home care. In general, implementation of emotion-oriented care did not lead to a more positive assessment. Despite the generally accepted notion that involving family members in care is important, family members were regularly treated as outsiders. This demonstrates that there is room for improvement in the communication by nursing home staff with family members (e.g. more frequent contacts and information about the illness).
Answer the question based on the following context: To investigate the relationship between the rate of migration of a low-lying placenta during the third trimester and the eventual route of delivery. All patients with a placenta lying within 3 cm of the internal cervical os or overlapping it on transvaginal ultrasound at ≥26 weeks' gestation were included in the study. The exact distance between the center of the internal cervical os and the leading edge of the placenta was measured by transvaginal sonography, repeated at approximately 4-week intervals until delivery. The mean rates of migration in patients who had (n = 7) and who did not have (n = 29) Cesarean section for placenta previa were +0.3 mm/week and +5.4 mm/week, respectively (P<0.0001). When the placental edge was initially >20 mm from the internal os, migration occurred in all cases and no Cesarean section for placenta previa was performed. For those between -20 mm and +20 mm, sufficient migration to avoid Cesarean section occurred in 88.5% of cases. Beyond a 20 mm overlap, significant placental migration did not occur and all patients required Cesarean section.
Question: Diagnosis of low-lying placenta: can migration in the third trimester predict outcome?
Placental migration may occur progressively throughout the third trimester. The initial position of the placental edge and the subsequent rate of migration can be used to predict the eventual route of delivery.
Answer the question based on the following context: To evaluate the clinical significance of the shape of the lower placental edge in women with transvaginal sonographic diagnosis of placenta previa. A prospective observational study at a tertiary teaching hospital. A total of 104 women with confirmed transvaginal sonographic diagnosis of placenta previa before 32 weeks' gestation. Initial transvaginal sonography was performed at between 28 and 32 weeks' gestation in 138 patients with either strong clinical suspicion or previous abdominal sonographic diagnosis of placenta previa in the early third trimester. The lower placental edge was found to be positioned over the internal cervical os in 33 women (complete previa) and within 3 cm from it in 71 women (low-lying placenta). Patients with low-lying placenta were followed up by serial transvaginal sonographic examinations until delivery; detailed information including the placental location (anterior or posterior), the distance of its edge from the internal cervical os and its thickness were recorded. The clinical outcomes of the 17 who had a thick-edge low-lying placenta were compared with those who had a thin-edge one (54 women). In patients with complete placenta previa, demographic data, the shape of the lower placental edge whenever transvaginal sonography visualized it, and the clinical outcomes were documented. The incidence of major complications in thick-edge or central placenta was compared to that in the thin-edge group. Women having a low-lying placenta with a thick edge had a significantly higher rate of antepartum hemorrhage (P = 0.0002), abdominal delivery (P = 0.02), abnormally adherent placenta (P = 0.012) and low birth weight (P = 0.006) than those in whom the placental edge was thin. Cesarean hysterectomy was required in six patients with complete placenta previa because of severe peripartum hemorrhage; all of them had either central or thick-edge placenta accreta.
Question: Third-trimester transvaginal ultrasonography in placenta previa: does the shape of the lower placental edge predict clinical outcome?
Women with placenta previa are at a relatively higher risk of developing complications if the lower placental edge is thick. Integration of the shape of the lower placental edge into transvaginal sonographic assessment of placenta previa may improve the prediction of mode of delivery and clinical outcome.
Answer the question based on the following context: To confirm the hypothesis that isolated cardiac echogenic foci at the second-trimester anomaly scan do not influence our current calculation of risk of trisomy 21 in individual pregnancies, which is based on maternal age and nuchal translucency thickness at 11-14 weeks. Observational study in a fetal medicine unit. In a general pregnant population undergoing first-trimester nuchal translucency screening, data from 239 singleton pregnancies with isolated cardiac echogenic foci at the second-trimester anomaly scan were compared with those of a control group of 7449 pregnancies with normal anomaly scans. Prevalence of trisomy 21 was determined in both groups. Following the anomaly scan, the individual risks of trisomy 21 were calculated by adjusting the previous risk based on maternal age and first-trimester nuchal translucency. We assumed that echogenic foci did not alter each individual risk calculation. The expected number of cases of Down syndrome in both groups was then calculated from the sum of probabilities of each individual affected fetus. The observed number of cases was compared with the expected number in both study and control populations. There was no statistically significant difference between the prevalence of trisomy 21 in the study group (no cases) and in the control population (three cases). From individual risk calculations, observing no cases of trisomy 21 in the study group was the most likely event if echogenic foci did not increase the risk of this chromosomal abnormality (P = 0.62).
Question: Isolated echogenic foci in the fetal heart: do they increase the risk of trisomy 21 in a population previously screened by nuchal translucency?
The finding of isolated echogenic foci at the time of the 20-week scan does not significantly change the risk of trisomy 21 if background risk and previous nuchal translucency measurements are taken into account in the individual risk calculation. We suggest that no further adjustments to risk should be made.
Answer the question based on the following context: This paper describes the processes involved in policy development and implementation with examples of how this can be influenced by the outcomes of research. The author draws on his experience in the development and implementation of Australia's National Mental Health Policy and on the literature describing public policy analysis. A five-step process of problem identification, policy development, political decision, policy implementation and evaluation is described. This process identifies how issues are considered, adopted and implemented by governments.
Question: Can research influence mental health policy?
An understanding of this process can inform mechanisms by which scientific research can impact on the issues considered and the decisions made in each step of policy analysis and development.
Answer the question based on the following context: Encouraging results of previous uncontrolled trials suggest that calcipotriol may potentiate the efficacy of psoralen plus ultraviolet (UV) A (PUVA) therapy in patients with vitiligo. We performed a placebo-controlled double-blind study to investigate whether the effectiveness of PUVA treatment could be enhanced by combination with topical calcipotriol in the treatment of vitiligo. Thirty-five patients with generalized vitiligo enrolled in the study. Symmetrical lesions of similar dimensions and with no spontaneous repigmentation on arms, legs or trunk were selected as reference lesions. In this randomized left-right comparison study, calcipotriol 0.05 mg g(-1) cream or placebo was applied to the reference lesions 1 h before PUVA treatment (oral 8-methoxypsoralen and conventional UVA units) twice weekly. Patients were examined at weekly intervals. The mean number of sessions and the cumulative UVA dosage for initial and complete repigmentation were calculated. Twenty-seven patients (nine women, 18 men; mean +/- SEM age 29.8 +/- 13.5 years) were evaluated. The mean +/- SEM cumulative UVA dose and number of UVA exposures for initial repigmentation were 52.52 +/- 6.10 J cm(-2) and 9.33 +/- 0.65 on the calcipotriol side, and 78.20 +/- 7.88 J cm(-2) and 12.00 +/- 0.81 on the placebo side, respectively (P<0.001). For complete repigmentation, respective values were 232.79 +/- 14.97 J cm(-2) and 27.40 +/- 1.47 on the calcipotriol side and 259.93 +/- 13.71 J cm(-2) and 30.07 +/- 1.34 on the placebo side (P = 0.001). Treatment with calcipotriol and PUVA resulted in significantly higher percentages of repigmentation for both initial (81%) and complete pigmentation (63%), compared with placebo and PUVA (7% and 15%, respectively).
Question: Is the efficacy of psoralen plus ultraviolet A therapy for vitiligo enhanced by concurrent topical calcipotriol?
Our results have shown that concurrent topical calcipotriol potentiates the efficacy of PUVA in the treatment of vitiligo, and that this combination achieves earlier pigmentation with a lower total UVA dosage.
Answer the question based on the following context: To investigate prognostic factors for long-term outcome of patients after inpatient withdrawal because of drug-induced chronic daily headache. Fifty-five patients (36 females) were re-examined by means of a standardized interview after inpatient withdrawal. The mean observation period was 9.28 +/- 2.85 years (mean +/- SD; median 8.58; range 5.00-13.50). Five years after withdrawal, one-third of the patients (34.6%) had an overall favourable outcome, one-third (32.7%) had no recurrent drug overuse and reported a clear-cut improvement of headache, and one-third (32.7%) developed recurrent drug overuse. Most relapses occurred within 2 years, and a small percentage within 5 years. No predictors for long-term outcome after inpatient withdrawal were found.
Question: Are there predictive factors for long-term outcome after withdrawal in drug-induced chronic daily headache?
All patients with drug-induced chronic daily headache should be considered as good candidates for inpatient withdrawal, and no patient should be excluded from that therapy.
Answer the question based on the following context: In spite of many reports focusing on prognostic factors after hepatectomy in patients with colorectal liver metastases, few studies have investigated pathological factors, eg, fibrous pseudocapsulation, growth pattern at the tumor margin, and proliferation activity of cancer cells, other than histological type and surgical margin. The aim of the present study was to investigate whether absence of pseudocapsulation, infiltrative growth pattern of metastases, and higher proliferation of cancer cells shown by Ki-67 immunohistochemical reactivity were associated with poorer survival after hepatectomy among patients with colorectal liver metastases. Between 1988 and 1998, 221 patients underwent hepatic resection of colorectal metastases with curative intent in our institution. Pathology analyses were focused on pseudocapsulation of liver metastases, growth pattern at the tumor edge, and Ki-67 labelling index (Ki-67 LI) of cancer cell nuclei. Univariate analyses of survival and of disease-free survival were performed for several clinicopathological factors, and multivariate analyses of survival and disease-free survival were also performed. The univariate survival analyses showed that pseudocapsulation, growth pattern, and Ki-67 LI were significant prognostic factors, besides synchronous versus metachronous occurrence of metastases, carcinoembryonic antigen level before hepatectomy, and number of metastases. A multivariate analysis showed that Ki-67 labeling index was the most reliable prognostic factor of survival. In addition, Ki-67 LI and microscopic growth pattern were multivariately predictive factors of disease-free survival.
Question: Is a proliferation index of cancer cells a reliable prognostic factor after hepatectomy in patients with colorectal liver metastases?
This large single-institution study showed that cancer cell proliferation and the pathologic characteristics of the tumor margin are major prognostic factors.
Answer the question based on the following context: To establish the relation between recurrent peer victimisation and onset of self reported symptoms of anxiety or depression in the early teen years. Cohort study over two years. Secondary schools in Victoria, Australia. 2680 students surveyed twice in year 8 (aged 13 years) and once in year 9. Self reported symptoms of anxiety or depression were assessed by using the computerised version of the revised clinical interview schedule. Incident cases were students scoring ≥12 in year 9 but not previously. Prior victimisation was defined as having been bullied at either or both survey times in year 8. Prevalence of victimisation at the second survey point in year 8 was 51% (95% confidence interval 49% to 54%), and prevalence of self reported symptoms of anxiety or depression was 18% (16% to 20%). The incidence of self reported symptoms of anxiety or depression in year 9 (7%) was significantly associated with victimisation reported either once (odds ratio 1.94, 1.1 to 3.3) or twice (2.30, 1.2 to 4.3) in year 8. After adjustment for availability of social relations and for sociodemographic factors, recurrent victimisation remained predictive of self reported symptoms of anxiety or depression for girls (2.60, 1.2 to 5.5) but not for boys (1.36, 0.6 to 3.0). Newly reported victimisation in year 9 was not significantly associated with prior self report of symptoms of anxiety or depression (1.48, 0.4 to 6.0).
Question: Does bullying cause emotional problems?
A history of victimisation and poor social relationships predicts the onset of emotional problems in adolescents. Previous recurrent emotional problems are not significantly related to future victimisation. These findings have implications for how seriously the occurrence of victimisation is treated and for the focus of interventions aimed at addressing mental health issues in adolescents.
Answer the question based on the following context: To describe the standards of care for stroke patients in England, Wales and Northern Ireland and to determine the power of national audit, coupled with an active dissemination strategy, to effect change. A national audit of organisational structure and retrospective case note audit, repeated within 18 months. Separate postal questionnaires were used to identify the types of change made between the first and second round and to compare the representativeness of the samples. 157 trusts (64% of eligible trusts in England, Wales, and Northern Ireland) participated in both rounds. 5589 consecutive patients admitted with stroke between 1 January 1998 and 31 March 1998 (up to 40 per trust) and 5375 patients admitted between 1 August 1999 and 31 October 1999 (up to 40 per trust). The audit tool was the Royal College of Physicians Intercollegiate Working Party stroke audit. The proportion of patients managed on stroke units rose between the two audits from 19% to 26% with the proportion managed on general wards falling from 60% to 55% and those managed on general rehabilitation wards falling from 14% to 11%. Standards of assessment, rehabilitation, and discharge planning improved equally on stroke units and general wards, but in many aspects remained poor (41% formal cognitive assessment, 46% weighed once during admission, 67% physiotherapy assessment within 72 hours, 24% plan documented for mood disturbance, 36% carers' needs assessed separately).
Question: National stroke audit: a tool for change?
Nationally conducted audit linked to a comprehensive dissemination programme was effective in stimulating improvements in the quality of care for patients with stroke. More patients are being managed on stroke units and multidisciplinary care is becoming more widespread. There remain, however, many areas where standards of care are low, indicating a need for investment of skills and resources to achieve acceptable levels.
Answer the question based on the following context: The aim of the present study was to investigate the predictive value for subsequent stroke of different patterns of brain CT infarction in patients with carotid atheroma. Prospective study of 138 patients, each with a carotid plaque causing greater than 50 percent stenosis on duplex scanning at presentation; the plaque was either associated with ipsilateral amaurosis fugax (AF) or a hemispheric transient ischaemic attack (HTIA), or was asymptomatic. This carotid artery defined the side of interest. All patients had a brain CT scan on presentation and subsequently were followed for a period of 1-5 years (mean 3.14 years). The baseline CT neurovascular findings on the side of interest were classified as pattern A (discrete subcortical and cortical infarctions), pattern B (haemodynamic infarctions, widespread white matter lesions, basal ganglia infarctions and lacunae) and normal CT. On follow-up, 5/27 (18.5 percent) of patients with pattern A, 4/38 (10.5 percent) with pattern B and 3/73 (4.1 percent) with normal CT appearance developed stroke in the hemisphere of interest (Cox regression: p=0.02).
Question: Brain CT infarction in patients with carotid atheroma. Does it predict a future event?
Pattern A confers an unfavourable prognosis in patients with carotid atheroma, whether they are asymptomatic or present with amaurosis fugax or hemispheric transient ischaemic attacks.
Answer the question based on the following context: Thoracoabdominal aortic replacement requires visceral vessel revascularization and is usually performed with Crawford's inclusion technique or a large Carrel patch. This segment of retained native aorta may be prone to recurrent aneurysmal disease. We reviewed our experience with patients in whom aneurysmal expansion of the visceral patch was detected. The records of 107 patients undergoing thoracoabdominal aortic replacement operations performed or followed up at the Johns Hopkins Hospital between 1992 and 2000 were reviewed. All patients had visceral patches created for type II, III, or IV aneurysms. Visceral patches were considered aneurysmal if the maximal diameter of the aortic prosthesis and patch was 4.0 cm or more. Patch aneurysmal expansion (mean, 5.4 cm) was detected in eight patients (7.5%). All three women had connective tissue disorders (mean age, 36 years), and all five men had atherosclerotic disease (mean age, 73 years). Five patients were symptom free with their aneurysms detected by surveillance computed tomography scans; two patients had back pain prompting computed tomography scans; and one patient presented with an emergency patch rupture. Aneurysmal patches were successfully revised in three patients. Two patients died in the operating room, and three patch aneurysms (<5 cm) are still being observed. The mean time to the detection of aneurysmal expansion was 6.5 years after the original operation. Therapy consisted of replacement of a segment of the thoracoabdominal aortic graft and refashioning a smaller patch, including only the visceral artery orifices with separate attachment of the left and possibly right renal artery.
Question: Aneurysmal expansion of the visceral patch after thoracoabdominal aortic replacement: an argument for limiting patch size?
Although Crawford's inclusion method of visceral patch construction is generally durable, patients undergoing thoracoabdominal aortic replacement require yearly surveillance for the detection of aneurysmal expansion of the visceral patch. We recommend limiting visceral patch size at the original operation by routinely excluding the orifice of the left renal artery. Patients at high risk for recurrent aneurysmal expansion, such as those with connective tissue disorders, will benefit from creating small visceral patches and possibly implanting both renal arteries separately during the original operation.
Answer the question based on the following context: To determine the chronic complication rate of anterior hypospadias repair and to explore whether the practice of placing the neomeatus at the tip of the penis should be applicable to all cases in our community where urination is in a sitting/squatting position. Over a 10-year period commencing 1st September 1987, 312 patients had hypospadias repair, of whom 72% had anterior hypospadias. The meatus was advanced to the tip of the penis in all repairs. The location of the meatus was also determined in 281 non-complaining men with a straight penis and normal sexual and reproductive functions. After being informed that anterior hypospadias was not associated with sexual and reproductive dysfunction, 51 patients were given a choice between repair and no repair. Urethrocutaneous fistula occurred in 5% of patients, urethral stricture in 3% and meatal retraction in 3%, with 92% of patients having no complications. Forty-six percent of non-complaining men had the meatus in locations other than the tip of the penis. Of 51 patients with the benefit of informed consent, 73% opted for no repair.
Question: Anterior hypospadias. Is repair necessary with urination in a sitting or squatting position?
Our results of anterior hypospadias repair compare favourably with those of other centers. Placement of the meatus at the tip of the penis for anterior hypospadias should not be applied to all patients in this community, where urination is in a sitting/squatting position. Before such repairs, informed consent should include making patients and their parents aware that these anomalies are not associated with sexual or reproductive disorders.
Answer the question based on the following context: Contentious moonlighting policies and the proliferation of nonphysician clinicians (NPCs) in academic emergency departments (EDs) send conflicting messages to emergency medicine (EM) residents regarding appropriate ED staffing patterns. The objective was to assess EM resident (EMR) views on the ED utilization of unsupervised residents and NPCs from their perspectives as both physicians and prospective patients. A survey was mailed to a random sample of senior EMRs (sampling fraction, 68%) from the Emergency Medicine Residents Association membership list. Respondents were instructed to assume the role of patient when presented with hypothetical clinical scenarios of increasing severity; outcomes included provider preferences and the impacts of medical urgency, time delays, costs, and supervision on those preferences. Survey items asked about willingness to see residents, nurse practitioners (CRNPs), and physician assistants (PAs), and perceived impact of NPCs on professional identity. A total of 251 EMRs responded. Senior EMRs are more willing to have their care handled by residents as opposed to mid-level providers. For a moderate illness or injury scenario, 54% agreed to be seen by a resident alone compared with only 17% and 24% willing to be seen by a CRNP and PA, respectively. Only a small fraction of the residents (22.7%) would allow another resident to treat them for a major injury or illness. Residents are more willing to be seen by mid-level providers if a savings in time can be realized but showed little interest in using NPCs to save money. Approximately one-third (34%) of the residents view mid-level providers as a professional threat, but logistic regression reveals this perception to be 2.25 (1.3, 4.0) times higher in male EMRs and 1.94 (1.1, 3.4) times higher in those with higher household incomes (≥$75,000).
Question: Doing unto others?
When assuming the patient role, senior EMRs have preferences for ED care that are consistent with restrictive EMR moonlighting and NPC staffing policies.
Answer the question based on the following context: Increased aspartate aminotransferase (AST), alanine aminotransferase (ALT), and bilirubin levels were noted incidentally after laparoscopic cholecystectomy. The percentage of patients in whom such elevations occur, and their clinical significance in the absence of bile duct injury, were investigated. Bile duct injury is the most feared complication of laparoscopic cholecystectomy. Some laboratory tests may be indicative of this complication, such as increases in liver enzymes (AST, ALT, and alkaline phosphatase [ALP]) and bilirubin. These parameters have not been investigated in patients who had laparoscopic cholecystectomy and in whom no damage to the bile duct was noted. Sixty-seven patients with normal results of preoperative liver function tests were entered into the study. Blood was collected 24 hours after laparoscopic cholecystectomy, and AST, ALT, ALP, and bilirubin levels were measured. A mean 1.8-fold increase in AST occurred in 73% of patients; 82% showed a 2.2-fold increase in ALT. A statistically nonsignificant increase in ALP was noted in 53% of patients (values remained within normal limits), and bilirubin levels were increased in 14% of patients (primarily unconjugated bilirubin).
Question: Are elevated liver enzymes and bilirubin levels significant after laparoscopic cholecystectomy in the absence of bile duct injury?
In many patients a significant increase in AST and ALT levels occurred after laparoscopic cholecystectomy, but they returned to normal values within 72 hours. The cause of this is unclear, and these elevations appear to have no clinical significance.
Answer the question based on the following context: In developing countries, including Turkey, tuberculosis is still a major problem. Rapid diagnosis and early medical intervention are the two most important considerations in preventing the spread of the disease. This study was carried out to determine the diagnostic value of the BCG test in childhood tuberculosis and to compare it with the tuberculin test in this regard. Fifty patients and 20 healthy children without any evidence of previous BCG vaccination, aged 80 days to 15 years, were simultaneously tested with purified protein derivative (PPD) and BCG vaccine. In pulmonary tuberculosis the BCG test was positive in 100% of cases and the PPD test in 44.5%. Similarly, the BCG test was positive in 100% of miliary tuberculosis and tuberculous meningitis cases, but the PPD test was negative in all of them. Of 22 patients with malnutrition, 18 (82%) had a positive BCG test and 4 had a positive PPD test. The BCG test showed uniformly high positivity in all grades of malnutrition.
Question: Is the BCG test of diagnostic value in tuberculosis?
The BCG test is more reliable and more sensitive than the tuberculin test in the diagnosis of tuberculosis. It is particularly valuable in developing countries, where the disease is still a major public health problem and where sophisticated methods such as rapid culture with BACTEC and demonstration of bacilli with DNA probes are not widely available.
Answer the question based on the following context: To determine the clinical significance of hyperechoic bowel seen sonographically in second-trimester fetuses. Fifty fetuses (0.6%) with echogenic bowel were identified sonographically from a population of 8680 consecutive second-trimester fetuses over 21 months. The fetal bowel was considered hyperechoic if its echogenicity was similar to that of surrounding bone. Follow-up was obtained through medical record review. Twenty-nine of 50 fetuses (58%) were normal; eight (16%) were aneuploid, including six Down syndrome, one trisomy 13, and one Turner syndrome. All eight fetuses with aneuploidy had sonographic anomalies in addition to the echogenic bowel. Eight of 50 fetuses (16%) were growth-retarded, and five others (10%) had normal karyotypes but are still undelivered. Among the eight growth-retarded fetuses, there were five intrauterine or neonatal deaths, one elective abortion, and two survivors. In addition, the six fetuses with Down syndrome and echogenic bowel represented 12.5% of all second-trimester Down syndrome fetuses karyotyped in our laboratory during the study period. Combining results from the present study (six Down syndrome fetuses) with three studies from the literature (21 additional Down syndrome fetuses), a total of 27 fetuses with echogenic bowel and Down syndrome were identified, 11 (40.7%) of whom had no other sonographic findings. We calculate that if 1,000,000 second-trimester fetuses were scanned, 5105 would have hyperechoic bowel as the only finding, of whom 71 would have Down syndrome and 5034 would not. The risk of Down syndrome in fetuses with isolated hyperechoic bowel is, therefore, 71 in 5105 or 1.4%.
Question: Is fetal hyperechoic bowel on second-trimester sonogram an indication for amniocentesis?
The finding of isolated hyperechoic bowel in the second trimester should prompt genetic counseling and consideration of karyotypic analysis.
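The projected figures in the hyperechoic bowel entry above reduce to two simple proportions. A short Python check, using only numbers quoted in that abstract, reproduces the 0.6% detection rate and the 1.4% risk estimate:

```python
# Verify the proportions quoted in the abstract above.

echogenic = 50         # second-trimester fetuses with hyperechoic bowel
scanned = 8680         # consecutive second-trimester fetuses scanned
print(f"Detection rate: {echogenic / scanned:.1%}")  # ~0.6%

down_syndrome = 71     # projected Down syndrome cases per 1,000,000 scans
isolated_bowel = 5105  # projected fetuses with isolated hyperechoic bowel as the only finding
print(f"Risk of Down syndrome given isolated hyperechoic bowel: "
      f"{down_syndrome / isolated_bowel:.1%}")       # ~1.4%
```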
Answer the question based on the following context: To compare the incidence of sudden infant death syndrome (SIDS) and apparent life-threatening event (ALTE) in infants with bronchopulmonary dysplasia (BPD) and birth weight-matched control infants in view of the changing pattern of chronic lung disease of prematurity. The study population consisted of 78 preterm infants of 26 to 33 weeks gestation who were diagnosed as having BPD and discharged. The 78 control infants were matched with the study infants for birth weight categories. Infants unable to maintain adequate oxygenation without supplemental oxygen when they were feeding well and thriving were discharged on home oxygen. All infants were at least 8 months of age at follow-up and information concerning the occurrence of any ALTE was obtained by direct parent interview. No infant died during the period of follow-up. Seven (8.9%) of the study group compared with eight (10.5%) of the control infants had an ALTE. Three infants (one study, two control infants) were hospitalized for further investigation. No infant discharged on the home oxygen program had an ALTE.
Question: Are infants with bronchopulmonary dysplasia at risk for sudden infant death syndrome?
The data from this study suggest that preterm infants with BPD are not at increased risk from SIDS compared with preterm infants without this condition. This may be related to close monitoring of the infants' oxygenation status and the provision of home oxygen when appropriate, which should eliminate episodes of unrecognized and untreated hypoxemia. Home monitoring for infants with BPD may not be warranted.
Answer the question based on the following context: The aim of this research was to describe the postoperative respiratory complications after tonsillectomy and/or adenoidectomy (T and/or A) in children with obstructive sleep apnea syndrome (OSAS), to define which children are at risk for these complications, and to determine whether continuous positive airway pressure (CPAP) is an effective strategy for dealing with these complications. The data for this study were gathered through a retrospective chart review of all children 15 years of age or younger with polysomnographically (PSG) proven OSAS who had a T and/or A at Hennepin County Medical Center between January 1985 and September 1992. Particular attention was paid to factors that contributed to the OSAS, postoperative respiratory complications, and intervention strategies for dealing with these complications. The charts of 37 children with OSAS documented by preoperative PSG who later had a T and/or A were reviewed retrospectively. Ten of these children had significant postoperative respiratory compromise secondary to OSAS that prolonged their hospital stay from 1 to 30 days and caused symptoms ranging from O2 desaturation<80% to respiratory failure. These children were younger and had significant associated medical problems that contributed to or resulted from their OSAS in addition to large tonsils and adenoids. The associated medical problems included craniofacial anomalies, hypotonia, morbid obesity, previous upper airway trauma, cor pulmonale, and failure to thrive. The children with postoperative respiratory complications also had more severe apnea on their preoperative PSG. One child had a uvulopalatopharyngoplasty (UPPP) in addition to the T&A. Taken together, the history, physical and neurological examination, and the PSG were able to identify successfully the children who subsequently developed respiratory compromise secondary to OSAS after a T and/or A. Nasal continuous positive airway pressure (CPAP) and bilevel CPAP was used successfully to manage the preoperative and/or postoperative upper airway obstruction in five of these children.
Question: Postoperative respiratory compromise in children with obstructive sleep apnea syndrome: can it be anticipated?
Based on these findings, overnight observation is recommended with an apnea monitor and oximeter for patients undergoing a T and/or A who have OSAS and meet any of the following high-risk clinical criteria: (1) <2 years of age, (2) craniofacial anomalies affecting the pharyngeal airway, particularly midfacial hypoplasia or micro/retrognathia, (3) failure to thrive, (4) hypotonia, (5) cor pulmonale, (6) morbid obesity, and (7) previous upper airway trauma; or high-risk PSG criteria: (1) respiratory distress index (RDI) >40 and (2) SaO2 nadir <70%; or undergoing a UPPP in addition to the T and/or A. Nasal CPAP/bilevel CPAP can be used to manage the preoperative and/or postoperative upper airway obstruction in patients with OSAS undergoing a T and/or A.
Answer the question based on the following context: Homocystinuria due to cystathionine beta-synthase deficiency and familial hypercholesterolemia are inherited disorders of metabolism that are associated with premature development of cardiovascular disease. This study addresses the possibility that different patterns of carotid wall damage and cerebral blood flow hemodynamics are present in these two metabolic diseases. Twelve patients with homocystinuria due to cystathionine beta-synthase deficiency (mean age, 24 years), 10 patients with homozygous familial hypercholesterolemia (mean age, 26 years), and 11 healthy control subjects (mean age, 26 years) underwent a vascular examination by noninvasive methods. B-mode ultrasound imaging was used to obtain measurements of intima-media thickness of common carotid, bifurcation, and internal carotid arteries as an index of atherosclerosis. Cerebral blood flow velocity was estimated from vascular examination of the middle cerebral artery by transcranial Doppler. Systolic, diastolic, and mean velocities were measured. Pulsatility index, a possible indicator of vascular resistance in the cerebral circulation, was also calculated. Mean maximum intima-media thickness was 1.4 mm in patients with familial hypercholesterolemia, 0.6 mm in patients with homocystinuria, and 0.6 mm in control subjects. The difference between hypercholesterolemic and homocystinuric patients or control subjects was statistically significant (P<.001). Diastolic blood flow velocities were significantly reduced in the middle cerebral arteries of hypercholesterolemic patients compared with homocystinuric patients or control subjects (P<.05), whereas systolic or mean velocities did not differ. The pulsatility index, a possible indicator of vascular resistance in the cerebral circulation, was significantly higher in hypercholesterolemic patients compared with homocystinuric patients or healthy control subjects (P<.01). A direct relation was demonstrated between pulsatility index of the middle cerebral artery and mean maximum intima-media thickness of carotid arteries on the same side (P<.001).
Question: Premature carotid atherosclerosis: does it occur in both familial hypercholesterolemia and homocystinuria?
Familial hypercholesterolemia is responsible for diffuse and focal thickening of carotid arteries and possibly also for hyperlipidemic endothelial dysfunction extending to small resistance arteries and leading to a disturbed cerebral blood flow. Patients with homocystinuria due to homozygosis for cystathionine beta-synthase deficiency seldom have plaques in their carotid arteries. They are similar to healthy control subjects with regard to both intima-media thickness and blood flow velocity in the middle cerebral artery. Therefore, it is unlikely that typical atherosclerotic lesions precede thrombotic events in homocystinuria. However, it is possible that arterial dilatations caused by medial damage lead to thrombosis in homocystinuric patients.
Answer the question based on the following context: The aim of our study was to investigate plasma and genetic risk factors for rupture of cerebral aneurysms. In London, a case-control study was made of 56 consecutive patients admitted to a regional neurosurgical service for treatment of ruptured cerebral aneurysm and of 93 control subjects. A further 40 consecutive patients admitted in Arhus with ruptured cerebral aneurysm also were studied. The British case-control study showed that smoking was associated with an increased risk of ruptured cerebral aneurysm (odds ratio, 9.1; 95% confidence interval [CI], 3.4 to 23.8; P<.001 for a history of >10 pack-years). After age and sex adjustment, factors associated with ruptured cerebral aneurysm included a cholesterol concentration in the highest tertile (≥6.3 mmol/L; odds ratio, 10.2; 95% CI, 3.9 to 26.7; P<.001), an apolipoprotein B concentration in the highest tertile (≥0.84 g/L; odds ratio, 6.4; 95% CI, 2.5 to 16.3; P<.001), and concentrations of HDL cholesterol in the lowest tertile (<1.1 mmol/L; odds ratio, 3.6; 95% CI, 1.4 to 8.2; P<.01). History of hypertension was of less importance (odds ratio, 4.0; 95% CI, 1.41 to 11.7; P<.01). Smoking history (P<.001) and increased concentrations of cholesterol (P<.0001) were the most important independent risk factors associated with ruptured cerebral aneurysm on multivariate analysis. The histories of hypertension and smoking, together with apolipoprotein B levels, in the Danish patients were similar to those in the British patients. In the entire patient group, the frequencies of two polymorphic variations in the type III collagen gene and polymorphisms at the apolipoprotein B, apolipoprotein C-III, and haptoglobin gene loci were not different from control subjects or the normal population; allele frequencies in British and Danish patients were similar.
Question: Are cerebral aneurysms atherosclerotic?
An atherosclerotic profile including increased total cholesterol concentration and a long smoking history may contribute to the rupture of cerebral aneurysms. This study provides no support for the hypothesis that inherited abnormalities of type III collagen are a common cause of cerebral aneurysms.
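The risk estimates in the cerebral aneurysm entry above are case-control odds ratios. As a generic illustration only, the sketch below shows how an odds ratio and its 95% confidence interval are obtained from a 2x2 table of exposure counts in cases and controls; the example counts are hypothetical and are not taken from the study:

```python
import math

def odds_ratio_ci(exposed_cases, unexposed_cases, exposed_controls, unexposed_controls):
    """Odds ratio with a 95% Wald confidence interval from a 2x2 table."""
    odds_ratio = (exposed_cases * unexposed_controls) / (unexposed_cases * exposed_controls)
    se_log_or = math.sqrt(1 / exposed_cases + 1 / unexposed_cases
                          + 1 / exposed_controls + 1 / unexposed_controls)
    lower = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
    upper = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
    return odds_ratio, lower, upper

# Hypothetical counts, for illustration only (not the study data):
# 40 of 56 cases and 30 of 93 controls with a >10 pack-year smoking history.
or_, lo, hi = odds_ratio_ci(40, 16, 30, 63)
print(f"OR = {or_:.1f} (95% CI {lo:.1f} to {hi:.1f})")
```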
Answer the question based on the following context: Fat suppression has shown promise in improving the quality of T2-weighted spin-echo MR images of the upper part of the abdomen. The purpose of this study was to determine whether fat-suppressed images should be routinely used in lieu of conventional images. Accordingly, we prospectively compared the two techniques in a series of patients with both normal and abnormal findings in the upper part of the abdomen. Conventional and fat-suppressed T2-weighted spin-echo images (3000/80,160 [TR/TE]) were obtained in 45 consecutive patients referred for MR imaging of the upper part of the abdomen. Thirty-three patients had abnormal findings, and 22 of those 33 patients had histologic or follow-up confirmation of the diagnosis (14 with metastasis, one with hepatoma, four with hemangiomas, and three with cysts). Signal intensities (hepatic lesions, liver, spleen) and noise were measured to calculate signal-to-noise ratios and contrast-to-noise ratios. Qualitative comparison (liver, hepatic lesions, porta hepatis, spleen, pancreas, bowel, kidneys, adrenal glands, noise), evaluation of the number of hepatic lesions, and characterization of hepatic lesions were done by independent observers. Compared with conventional images, fat-suppressed images had higher signal-to-noise ratios (lesions, liver, spleen) and contrast-to-noise ratios (lesion-liver and spleen-liver) (p<.005). In qualitative comparison, three of three radiologists preferred fat-suppressed over conventional images for depiction of hepatic lesions and all upper abdominal organs except the liver, for which no clear preference was shown for either technique. Detection rates for hepatic lesions were similar with both types of images (observer 1: 112 lesions on fat-suppressed vs 118 on conventional images, observer 2: 142 vs 135), as was the characterization of hepatic lesions (91% accuracy on fat-suppressed images and 84% accuracy on conventional images, for 22 proved lesions and two observers).
Question: T2-weighted MR imaging of the upper part of the abdomen: should fat suppression be used routinely?
Fat-suppressed T2-weighted spin-echo MR images were better than non-fat-suppressed images for evaluation of the upper part of the abdomen. These results suggest that fat suppression should be routinely used in T2-weighted MR imaging of the upper part of the abdomen.
Answer the question based on the following context: The role of imaging in patients with newly diagnosed prostatic carcinoma is controversial. Currently, 35% of patients with prostatic carcinoma undergo CT at the time of diagnosis, despite reports of the lack of efficacy of CT in staging the disease. We sought to evaluate the cost-effectiveness of CT in detecting unrelated comorbid disease (significant disease unrelated to prostatic carcinoma) that might affect decisions on treatment in this population of patients. We reviewed the medical records of 273 consecutive patients with newly diagnosed prostatic carcinoma who had CT of the abdomen and pelvis as part of their preoperative evaluation. Using costs based on Medicare reimbursements, we assessed the impact of the CT findings (related to comorbid disease) on overall costs and savings related to the workup and treatment of these patients. Sixty-six patients (24%) had findings suggestive of comorbid disease. The CT findings had near-term impact on only four patients (two in whom large abdominal aortic aneurysms were detected and two in whom second primary cancers were found), despite nearly $155,000 spent on the screening CT scans and more than $4400 spent on further evaluation of false-positive CT findings. The clinical impact varied from intervening semiurgent surgery to cancellation of prostatic surgery and institution of radiation therapy.
Question: CT screening for comorbid disease in patients with prostatic carcinoma: is it cost-effective?
CT is not cost-effective in screening for comorbid disease that would affect treatment in patients with newly diagnosed prostatic carcinoma.
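The cost figures in the prostatic carcinoma entry above can be condensed into a rough cost per clinically actionable finding. The sketch below makes the simplifying assumption that only the quoted screening and false-positive workup costs are counted:

```python
# Rough cost per clinically actionable CT finding, using figures from the abstract.
# Simplifying assumption: only the quoted screening cost ("nearly $155,000") and
# false-positive workup cost ("more than $4400") are counted.

screening_cost = 155_000   # approximate cost of screening CT in 273 patients
workup_cost = 4_400        # approximate cost of evaluating false-positive findings
actionable_findings = 4    # patients whose near-term management was affected

cost_per_finding = (screening_cost + workup_cost) / actionable_findings
print(f"Approximate cost per actionable finding: ${cost_per_finding:,.0f}")  # ~$39,850
```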
Answer the question based on the following context: The differentiation between cardiac and esophageal causes of retrosternal chest pain is notoriously difficult. Theoretically, cardiac and esophageal causes may coexist. It has also been reported that gastroesophageal reflux and esophageal motor abnormalities may elicit myocardial ischemia and chest pain, a phenomenon called linked angina pectoris. The aim of this study was to assess the incidence of esophageal abnormalities as a cause of retrosternal chest pain in patients with previously documented coronary artery disease. Thirty consecutive patients were studied, all of whom had undergone coronary arteriography. The patients were studied after they were admitted to the coronary care unit with an attack of typical chest pain. On electrocardiograms (ECGs) taken during pain, 15 patients (group I) had new signs of ischemia; the other 15 patients (group II) did not. In none of the patients were cardiac enzymes elevated. As soon as possible, but within 2 hours after admission, combined 24-hour recording of esophageal pressure and pH was performed. During chest pain, 12-lead ECG recording was carried out. In group I, all 15 patients experienced one or more pain episodes during admission, 25 of which were associated with ischemic electrocardiographic changes. The other two episodes were reflux-related. Only one of the 25 ischemia-associated pain episodes was also reflux-related, ie, it was preceded by a reflux episode. In group II, 19 chest pain episodes occurred in 11 patients. None of these was associated with electrocardiographic changes, but 8 were associated with reflux (42%) and 8 with abnormal esophageal motility (42%).
Question: Esophageal dysfunction as a cause of angina pectoris ("linked angina"): does it exist?
Linked angina is a rare phenomenon.
Answer the question based on the following context: The aim of this investigation was to ascertain how the length of anal canal preserved above the dentate line in stapled end-to-end ileoanal anastomosis influenced late outcome. Two groups of nine subjects each with stapled anastomosis, a high cuff group and a low cuff group, matched for sex, age, pouch configuration, and mean follow-up and representing the highest (median, 2.5 cm) and lowest (median, 0.7 cm) anal cuff lengths in our series, were selected. Physiologic and functional parameters were appraised preoperatively, at the time of ileostomy closure, and at 1, 3, 6, and 12 months after reestablishment of intestinal continuity. At one year, the drop in mean anal canal resting pressure was 13 percent in the high cuff group (not significant) and 31 percent in the low cuff group (P<0.05); mean maximum squeezing pressure did not differ significantly from preoperative values in either group. The mean volume of the ileal pouch was higher in the low cuff group at all insufflation pressures. The rectoanal inhibition reflex reappeared in four high cuff group patients and in none of the low cuff group patients. Mean distention pressure (cm H2O) and volume (ml) eliciting urge sensation were 80 and 360 in the low cuff group compared with 40 and 240 in the high cuff group (P ≤ 0.05). Daytime bowel movements and night incontinence were significantly better in the low cuff group. No statistical differences were observed for night stool frequency, daytime incontinence, pad use (day and night), discrimination between gas and feces, ability to defer evacuation, and difficulty in emptying the pouch.
Question: Does the level of stapled ileoanal anastomosis influence physiologic and functional outcome?
Patients with stapled anastomoses and a low rectal cuff length, despite presenting lower anal resting pressure and absence of rectoanal inhibition reflex, had a better functional outcome in terms of continence than those with a high cuff length.
Answer the question based on the following context: To determine what proportion of pre-hospital deaths from accidental injury--deaths at the scene of the accident and those that occur before the person has reached hospital--are preventable. Retrospective study of all deaths from accidental injury that occurred between 1 January 1987 and 31 December 1990 and were reported to the coroner. North Staffordshire. Injury severity score, probability of survival (probit analysis), and airway obstruction. There were 152 pre-hospital deaths from accidental injury (110 males and 42 females). In the same period there were 257 deaths in hospital from accidental injury (136 males and 121 females). The average age at death was 41.9 years for those who died before reaching hospital, and their average injury severity score was 29.3. In contrast, those who died in hospital were older and equally likely to be males or females. Important neurological injury occurred in 113 pre-hospital deaths, and evidence of airway obstruction in 59. Eighty six pre-hospital deaths were due to road traffic accidents, and 37 of these were occupants in cars. On the basis of the injury severity score and age, death was found to have been inevitable or highly likely in 92 cases. In the remaining 60 cases death had not been inevitable and airway obstruction was present in up to 51 patients with injuries that they might have survived.
Question: Are pre-hospital deaths from accidental injury preventable?
Death was potentially preventable in at least 39% of those who died from accidental injury before they reached hospital. Training in first aid should be available more widely, and particularly to motorists as many pre-hospital deaths that could be prevented are due to road accidents.
Answer the question based on the following context: To investigate whether an intervention designed to improve overall immunisation uptake affected social inequalities in uptake. Cross-sectional small area analyses measuring immunisation uptake in cohorts of children before and after intervention. Small areas classified into five groups, from most deprived to most affluent, with Townsend deprivation score of census enumeration districts. County of Northumberland. All children born in the county in four birth cohorts (1981-2, 1985-6, 1987-8, and 1990-1) and still resident at time of analysis. Overall uptake in each cohort of pertussis, diphtheria, and measles immunisation, difference in uptake between most deprived and most affluent areas, and odds ratio of uptake between deprived and affluent areas. Coverage for pertussis immunisation rose from 53.4% in first cohort to 91.1% in final cohort. Coverage in the most deprived areas was lower than in the most affluent areas by 4.7%, 8.7%, 10.2%, and 7.0% respectively in successive cohorts, corresponding to an increase in odds ratio of uptake between deprived and affluent areas from 1.2 to 1.6 to 1.9 to 2.3. Coverage for diphtheria immunisation rose from 70.0% to 93.8%; differences between deprived and affluent areas changed from 8.6% to 8.3% to 9.0% to 5.5%, corresponding to odds ratios of 1.5, 2.0, 2.5, and 2.6. Coverage for measles immunisation rose from 52.5% to 91.4%; differences between deprived and affluent areas changed from 9.1% to 5.7% to 8.2% to 3.6%, corresponding to odds ratios of 1.4, 1.4, 1.7, and 1.5.
Question: Do interventions that improve immunisation uptake also reduce social inequalities in uptake?
Despite substantial increase in immunisation uptake, inequalities between deprived and affluent areas persisted or became wider. Any reduction in inequality occurred only after uptake in affluent areas approached 95%. Interventions that improve overall uptake of preventive measures are unlikely to reduce social inequalities in uptake.
Answer the question based on the following context: When tangential radiation beams are used in patients with breast cancer after breast-conserving surgery, the amount of lung included in the radiation field varies because of patient anatomy and treatment technique. The question of how much lung tissue can be irradiated incidentally without acute or late complications requires quantitative study. Thirty-four women were enrolled in a prospective study of pulmonary function after breast-conserving surgery and radiotherapy for early stage breast cancer. The percentage of lung volume irradiated was estimated from computed tomography scans. Pulmonary function tests including spirometrics, lung volume, and diffusing capacity of carbon monoxide (DLCO) were performed before, during, and at regular intervals after radiotherapy. Both acute and long term changes in pulmonary function were analyzed in 29 eligible patients. Acutely, DLCO values dropped, but they returned to normal levels by 24 months. At 5 years, pulmonary function did not vary significantly according to the percentage of lung irradiated, the use of regional lymphatic irradiation, or the addition of chemotherapy. Symptomatic pneumonitis occurred only in two women with baseline deficits in DLCO (P = 0.016), who had more than 10% of the total lung volume irradiated. Patients with a smoking history had a clinically significant baseline deficit of 32% in DLCO values (P = 0.0011) but showed a 21% improvement (P = 0.11), which probably correlated with quitting smoking.
Question: Is radiation treatment volume a predictor for acute or late effect on pulmonary function?
Within the range evaluated in this study, the volume of lung irradiated did not predict a late decrease in pulmonary function, although pneumonitis was observed only when more than 10% of the lung was irradiated.
Answer the question based on the following context: The authors performed reexcision lumpectomy on patients with breast cancer with tumor at or close to the resection margin or if the margin status was unknown. Frozen section analysis (FSA) of reexcision lumpectomy margins was performed to allow additional excision of margins or mastectomy, saving the patient another operation or an additional radiation boost. The authors reviewed the accuracy of FSA of margins in 107 patients undergoing reexcision lumpectomy between 1987 and 1992. There were 359 frozen sections performed on 156 specimens. Sensitivity and specificity of FSA for each frozen section margin, specimen, and patient were evaluated, as was gross inspection of tumor involvement at the resection margins. The accuracy of each pathologist's use of FSA also was evaluated. FSA sensitivity per frozen section margin, specimen, and patient was 0.90, 0.89, and 0.85, respectively. The specificity of gross inspection was 0.97, 0.96, and 0.96 (sensitivity, 0.44), which was significantly less accurate than that of FSA (P = 0.0015) or permanent section (P = 0.019). There was no significant discordance between FSA and permanent section. Of 19 pathologists doing FSA, 6 evaluated 10 or more specimens. The error rate ranged from 4% to 10% among pathologists with 10 or more readings, whereas 12 of 13 pathologists with fewer readings had no errors. The final pathologist had a 100% error rate, significantly worse (range, P = 0.0085-0.02) than any experienced pathologist. Thirty-four (32%) patients underwent additional excision (24 patients) or mastectomy (10 patients) based on the results of FSA, which saved the patients from undergoing another operation. No one required an additional operation or a mastectomy because of a false FSA result.
Question: Is frozen section analysis of reexcision lumpectomy margins worthwhile?
FSA is safe and accurate in evaluating reexcision lumpectomy margins. Gross inspection is not reliable in margin evaluation. FSA saved an additional operation 32% of the time. Obtaining clear margins during one procedure eliminates the necessity of an additional radiation boost and probably will improve cosmesis.
Answer the question based on the following context: Both locus of control and alexithymia have been considered personality factors fostering health concerns and behaviors. This study investigates the relationship between the health locus of control and alexithymia. Seventy-eight psychiatric outpatients were administered the Wallston Health Locus of Control Scale (HLC), the Toronto Alexithymia Scale (TAS), and the Five Factor Inventory, which measures neuroticism, extraversion, openness, agreeableness, and conscientiousness. Depressive and anxious affect was also measured. Regression models were developed to assess the influence of the above variables upon alexithymia. Although there was a significant bivariate correlation between an external locus of control and increased alexithymia, regression models found that HLC did not significantly predict TAS. Neuroticism, however, provided the most significant contribution to predicting increased alexithymia.
Question: Is alexithymia distinct from health locus of control?
Neuroticism may link HLC and TAS due to the face validity of each construct. A sense of vulnerability is stated in each measure. This may foster somatic preoccupation. The data suggest HLC and TAS to be separate phenomena and further support the validity of alexithymia as a unique personality trait.
Answer the question based on the following context: Underpinned by increased confidence in cure of metastatic seminoma by chemotherapy during the past 12 years, three management strategies for Stage I seminoma have been evaluated by six collaborating centers within the Anglian Germ Cell Tumor Group. This paper evaluates the efficacy of surveillance, prophylactic radiotherapy and adjuvant chemotherapy, and discusses these differing management approaches. Patients were recruited into the study between 1982 and 1992. There was no randomization between treatment groups. Seventy-nine patients received prophylactic radiotherapy (median follow-up = 51 months), 67 patients had surveillance alone (median follow-up = 61 months) and 78 patients were treated with adjuvant single agent platinum (median follow-up = 44 months). Fifty-three of these patients received two courses of platinum (median follow-up = 51 months) and 25 patients received one course (median follow-up = 29 months, range 22-72 months). There were 18 (27%) recurrences on surveillance, five (6%) after radiotherapy, one (1%) after two courses of adjuvant single agent platinum and none after one course of carboplatin. There was one death from testis cancer after radiotherapy and none after adjuvant chemotherapy treatments. Two patients died with drug resistant disease after relapse on surveillance. There was one death from a myocardial infarction after prophylactic radiotherapy and one death from suicide in the surveillance group. A retrospective quality of life questionnaire reviewing the incidence of early and late toxicity revealed no major differences though they suggest that those treated with one course adjuvant carboplatin had somewhat less sickness and an earlier return to work.
Question: Pilot studies of 2 and 1 course carboplatin as adjuvant for stage I seminoma: should it be tested in a randomized trial against radiotherapy?
Single agent carboplatin appears well tolerated and is an effective adjuvant treatment for Stage I seminoma. A multicenter randomized trial of the different treatment modalities is required to further evaluate its use.
Answer the question based on the following context: Intrahepatic biliary strictures or parenchymal infarcts may occur after liver transplantation as a complication of ischemic damage to the graft. In some selected cases the lesions appear to be confined to a part of the liver. We report our experience with partial graft resection in this setting. From January 1984 to December 1991, 286 liver transplantations were performed in 257 recipients. Seven patients, three children and four adults, underwent partial hepatectomy 3 to 218 weeks after liver transplantation of a full-size graft. The clinical presentation included septic parenchymal infarcts (n = 4) and nonanastomotic biliary strictures (n = 3) complicating (n = 5) artery thrombosis or not (n = 2). There were four left hepatectomies, two left lobectomies, and one right hepatectomy. In four instances partial hepatectomy was performed after failed attempt at biliary reconstruction (n = 2) or arterial revascularization (n = 2). Partial graft resection was performed extrafascially without Pringle's maneuver and mobilization of the remnant liver to preserve its vascularization. No surgical complications occurred, and none of the patients experienced acute hepatic failure during the postoperative period. All patients were discharged home 10 to 96 days (median, 23 days) after liver resection. Two patients had recurrent ischemic cholangitis. One patient underwent successful regrafting for recurrent Budd-Chiari syndrome; one patient died of tumor recurrence. Six patients were alive with a follow-up ranging from 12 to 45 months.
Question: Partial hepatic resection for ischemic graft damage after liver transplantation: a graft-saving option?
These results suggest that partial graft resection is a safe and graft-saving option after liver transplantation in selected patients with localized ischemic damage of the graft.
Answer the question based on the following context: The purpose was to compare auscultatory and oscillometric techniques in the determination of maternal blood pressure in normotensive primigravid patients and primigravid patients with proteinuric preeclampsia (blood pressure>140/90 on two occasions and proteinuria>0.5 gm/L). A prospective comparison of systolic and diastolic blood pressure was made with an automated device using oscillometric principles and two observers using a double-headed stethoscope to determine auscultatory observations (phase I and phase IV of the vascular sounds) in normotensive primigravid patients (N = 40) and primigravid patients with proteinuric hypertension (N = 17). In patients with proteinuric preeclampsia the mean differences between auscultatory (phase I and phase IV) and oscillometric observations were 5.4 mm Hg (SEM 1.4 mm, p<0.05) and 14.8 mm Hg (SEM 2.9 mm, p<0.01) for systolic and diastolic observations, respectively. In normotensive patients the mean differences between auscultatory (phase I and phase IV) and oscillometric observations were 2.4 mm Hg (SEM 0.9 mm, p not significant) and 7.5 mm Hg (SEM 1.9 mm, p<0.01) for systolic and diastolic observations, respectively.
Question: Automated blood pressure measurement devices: a potential source of morbidity in preeclampsia?
Automated devices using oscillometric principles "underrecord" systolic and diastolic blood pressure compared with auscultatory observations (phase I and phase IV) in patients with proteinuric preeclampsia. In some cases the difference between observations exceeds 30 mm Hg.
Answer the question based on the following context: The purpose of this study was to assess the willingness of intravenous drug users to participate in a preventive human immunodeficiency virus (HIV) vaccine efficacy trial. Of the 347 intravenous drug users in methadone treatment who were approached for participation, 257 completed a battery of self-administered questionnaires assessing risk behaviors, interest in vaccine trials, and other vaccine-related information. Data from 16 known seropositives and 1 inconsistent responder were dropped from analyses (n = 240). Fifty-two percent of the subjects expressed a willingness to be one of the first individuals to participate in a preventive HIV vaccine efficacy trial. Subjects who had recently shared needles or works and subjects who trusted the government to ensure vaccine safety were both twice as likely to report interest in participation. Twenty-two percent of subjects reported that they would increase needle sharing if vaccinated. Thirty percent did not know what a vaccine was.
Question: HIV vaccine trials: will intravenous drug users enroll?
These findings suggest that some in-treatment intravenous drug users would volunteer for a preventive HIV vaccine efficacy trial. Education and counseling will be required to ensure that subjects fully understand the trial's purposes, methods, risks and benefits.
Answer the question based on the following context: To determine whether and where universal neonatal screening for hemoglobinopathies, chiefly sickle-cell disease, could be performed at socially acceptable costs. We made projections of the cost-effectiveness of nonuniversal and universal sickle-cell disease screening throughout the United States. We then compared the cost-effectiveness of universal sickle-cell disease screening with that of universal phenylketonuria screening. Finally, we asked if "high-cost" states, that is, those in which the cost of finding a case of sickle-cell disease exceeded one half the cost of finding a case of phenylketonuria, could enhance their cost-effectiveness by joining demographically complementary states in screening cooperatives. If all states conducted independent screening and if the value of finding a case of sickle-cell disease were no more than one half that of finding a case of phenylketonuria, seven of the 19 states that do not currently conduct universal screening for hemoglobinopathies would begin to do so, but six of the 34 that currently do so would stop. Of the six that would stop, three have already formed a screening cooperative, reducing their projected average costs for finding either sickle-cell disease or phenylketonuria or both; the other three could similarly improve cost-effectiveness through cooperative arrangements. Nineteen states realize economies of scale in six cooperative groups; more could do so.
Question: Is universal neonatal hemoglobinopathy screening cost-effective?
Universal neonatal hemoglobinopathy screening can be made available at socially acceptable costs to the citizens of demographically various states.
Answer the question based on the following context: To assess the long-term effect of an extensive rheumatology curriculum on graduates of family practice residencies. Cohort analytic study using a mailed survey and a multiple-choice test based on clinical vignettes that were administered 3 to 7 years after graduation from residency training. Practicing family physicians who had graduated from a community hospital family practice residency with an extensive rheumatology curriculum (trained) were compared with graduates from a similar program without specific rheumatology training (untrained). Total test scores, results of individual test questions, practice style, and attitudes toward rheumatology training and practice. We received 39 (85%) responses from 46 potential respondents in the trained group and 25 (89%) responses from 28 potential respondents in the untrained group. Physicians in the two groups had similar backgrounds and practice styles. The trained physicians scored higher on the multiple-choice test (mean +/- SD, 25 +/- 5 vs 22 +/- 6; P<.03). The clinical significance of these differences is a matter of individual interpretation. One hundred percent of the trained physicians believed that the quality of their rheumatology training was good to excellent compared with 25% of the untrained physicians. Seventy-six percent of the untrained physicians wished that they knew more about rheumatology. No variables other than rheumatology training accounted for the differences between the two groups.
Question: Does curriculum make a difference?
The difference in rheumatology knowledge, evident during and soon after residency between trained and untrained physicians, persists for 3 to 7 years.
Answer the question based on the following context: The clinical significance of exercise-induced chest pain remains controversial, as reflected by sharply discordant clinical results within the medical literature. Thus, we developed a prospective study to compare the functional significance of silent versus symptomatic ischemia and to evaluate whether patient selection biases influence this analysis. We evaluated 117 patients (mean age, 63 +/- 9 years) with ischemic ST-segment depression during treadmill testing. Each patient underwent Tl-201 myocardial perfusion single-photon emission computed tomography (SPECT) after exercise followed by 24-hour ambulatory ECG monitoring. Patients were divided into silent versus symptomatic cohorts and were compared for the degree of hemodynamic, exercise and ambulatory ECG, and thallium abnormalities during stress testing. Analyses were repeated as the patient population became increasingly restricted. Compared with the silent patients, patients with chest pain during exercise had a shorter exercise duration (P<.009), lower peak heart rate (P = .009) and double product (P = .005), lower heart rate threshold for ST depression (P<.05), more episodes of ambulatory ST-segment depression (P<.05), a higher frequency of ischemia abnormalities during Tl-201 SPECT (P = .02), and higher summed Tl reversibility scores (P = .002). As the population became increasingly restricted, the relative magnitude of differences in silent versus symptomatic cohorts diminished, whereas the absolute magnitude of ischemic abnormalities progressively increased in both cohorts. For example, within the restricted group having ischemia on both exercise and ambulatory ECG, 50% of the silent cohort had severe ischemia on Tl SPECT (five or more reversible defects) and more than one third demonstrated the ominous finding of transient left ventricular dilation after exercise.
Question: Is 'silent' myocardial ischemia really as severe as symptomatic ischemia?
The induction of chest pain is associated with substantially more functional abnormalities when it is analyzed in a relatively "broad-spectrum" coronary artery disease population; by contrast, chest pain tends to lose its apparent value as a clinical test parameter when its analysis is restricted to coronary artery disease populations with a greater a priori likelihood of manifesting inducible ischemia. These findings may help resolve some of the previous discordant literature reports.
Answer the question based on the following context: An impaired coronary flow reserve in syndrome X has been demonstrated by many studies. Recently, however, a normal coronary flow reserve in response to papaverine was reported, but the number of patients in these studies was small. The aim of this study was to investigate whether coronary flow reserve in response to intracoronary papaverine is really impaired in syndrome X. We investigated 53 syndrome X patients (typical angina, a positive exercise test, and completely normal coronary arteries on angiography) and 26 heart transplant patients with normal coronary arteries (control group). All antianginal medications were stopped 48 hours before the study. A 3.6F intracoronary Doppler catheter was positioned in the proximal left anterior descending coronary artery and was connected to a Millar velocimeter. The coronary blood flow velocity at rest and in response to a hyperemic dose of papaverine was measured. Coronary flow reserve was defined as the ratio of hyperemic coronary blood flow velocity in response to papaverine and resting coronary blood flow velocity. The coronary flow reserve (mean +/- SD) in the syndrome X group was 2.72 +/- 1.39. The coronary flow reserve in the control group was significantly higher at 5.22 +/- 1.26 (P<.01). In both groups there was no significant difference in the heart rate or the mean arterial pressure during the study.
Question: Is coronary flow reserve in response to papaverine really normal in syndrome X?
Our study shows that coronary flow reserve in response to intracoronary papaverine is impaired in syndrome X patients.
Answer the question based on the following context: The authors investigated whether cognitive behavioral treatment could facilitate discontinuation of alprazolam therapy and maintenance of drug abstinence among panic disorder patients treated with alprazolam doses sufficient to suppress spontaneous panic attacks. Twenty-one outpatients who met DSM-III-R criteria for panic disorder with mild to severe agoraphobia were made panic-free with alprazolam (mean dose = 2.2 mg/day) and were then randomly assigned to receive either supportive drug maintenance and slow, flexible drug taper or an identical medication treatment plus 12 weeks of concurrent, individual cognitive behavioral treatment. Taper in the combined treatment group was sequenced to conclude before cognitive behavioral treatment ended. Twenty subjects completed the study. There was no significant difference between groups in the rate of alprazolam discontinuation (80% and 90%, respectively, in the alprazolam-only group and the combined treatment group). However, during the 6-month follow-up period, half of the subjects who discontinued alprazolam without cognitive behavior therapy, but none of those who were given cognitive behavior therapy, relapsed and resumed alprazolam treatment.
Question: Does cognitive behavior therapy assist slow-taper alprazolam discontinuation in panic disorder?
Cognitive behavioral treatment administered in parallel with alprazolam maintenance and taper was effective in preventing relapse after drug discontinuation. The results warrant further research on the thoughtful integration of these two therapeutic modalities.
Answer the question based on the following context: To evaluate the difference among time sources in an emergency medical system. Prospective; comparison to a criterion standard. Five emergency departments and three emergency medical services systems in Indianapolis, Indiana. Coordinated Universal Time (UTC), generated by the atomic clock in Boulder, Colorado, and broadcast by the US Commerce Department's National Institute of Standards and Technology, was used as the time standard. The investigators, on a single day, made unannounced visits to the five EDs and the ambulances and fire stations in the three emergency medical services systems. The times displayed on all time sources at each location were recorded. The accuracy to the second of each time source compared to UTC was calculated. Three time sources were excluded (two defibrillator clocks and one ED wall clock that varied more than three hours from UTC). Of the 152 time sources, 72 had analog displays, 74 digital, three both, and three other. The average absolute difference from UTC was 1 minute 45 seconds (SEM, 9 seconds) with a range of 12 minutes 34 seconds slow to 7 minutes 7 seconds fast. Thus, two timepieces could have varied by as much as 19 minutes 41 seconds. Compared to UTC, 47 timepieces (31%) were slow, 100 (66%) were fast, and five (3%) were accurate to the second. Fifty-five percent of the time sources varied one minute or more from UTC.
Question: Does anybody really know what time it is?
Time sources in this health care system varied considerably. Time recording in medicine could be made more precise by synchronizing medical clocks to UTC, using computers to automatically "time stamp" data entries and using only digital time sources with second displays.
Answer the question based on the following context: Recent case reports have raised the possibility that use of injectable bovine collagen may be associated with polymyositis/dermatomyositis (PM/DM). Because the number of collagen users is high, PM/DM would be expected to occur in some for reasons unrelated to the collagen use. A central issue is whether the number of observed cases exceeds the number expected on the basis of background rates alone. The present study was undertaken to investigate this. The number of observed cases was determined by review of the medical records of collagen users who had reported illnesses consistent with PM/DM. Because of the uncertainty about diagnosis of PM/DM, population incidence rates, number of patients treated with collagen, and duration of follow-up after treatment, we examined a range of estimates of each of these factors that would affect the expected number of cases. From reports among collagen users, 7 probable or definite cases of PM/DM were confirmed. In contrast, 13 cases would be expected based on the best estimates of relevant factors. Under the most conservative estimates for factors that influence the number of expected cases, 12 cases would be expected, while worst-case assumptions would yield an expected 130 cases.
Question: Is there an association between injectable collagen and polymyositis/dermatomyositis?
The consistent finding of fewer-than-expected PM/DM cases among collagen users suggests that collagen use is not associated with the development of PM/DM.
Answer the question based on the following context: To determine whether the high proportion of patients reported to have prominence of normal right atrial structures by MRI may lead to inappropriate diagnosis of intracardiac tumors. One hundred forty-nine subjects were examined by spin-echo MRI: patients with cardiac (no. 40), pericardial (no. 30), or thoracic aortic disease (no. 40) and mediastinal tumor (no. 15), and normal volunteers (no. 24). Imaging was reviewed to determine the frequency of a prominent crista terminalis/Chiari network and the likelihood of misdiagnosis of cardiac tumor. Prominent intraatrial structures were seen in 59% of subjects, a single prominent nodule in 36%, an intraatrial strand in 13%, and both in 10%. In no case were these findings originally or on review thought to represent a pathological mass or was it felt likely that they could reasonably be misinterpreted as such.
Question: MRI of right atrial pseudomass: is it really a diagnostic problem?
Normal structures within the right atrium, such as the crista terminalis and Chiari network, may be seen more commonly with MRI than with other imaging modalities. An appreciation of the frequency with which these findings are seen should prevent inappropriate misdiagnosis of pathological masses when none is present.
Answer the question based on the following context: To show the advantages and disadvantages of a multi-dimensional small area classification in the analysis of child health data in order to measure social inequalities in health and to identify the types of area that have greater health needs. Health data on children from the district child health information system and a survey of primary school children's height were classified by the census enumeration district of residence using the Super profiles neighbourhood classification. County of Northumberland, United Kingdom. One cohort comprised 21,702 preschool children age 0-5 years resident in Northumberland, and another cohort 9930 school children aged 5-8.5 years. Variations between types of area in the proportions of babies with birthweight less than 2.8 kg; births to mothers aged less than 20 years; pertussis immunisation uptake; child health screening uptake; and mean height of school children. Areas with the poorest child health measures were those which were most socially disadvantaged. The most affluent areas tended to have the best measures of health, although rural areas also had good measures. Problems in analysis included examples of the "ecological fallacy", misleading area descriptions, and the identification of the specific factors associated with poor health measures. Advantages included a wider view of social circumstances than simply "deprivation" and the ability to identify characteristic types of areas with increased child health needs.
Question: Are multidimensional social classifications of areas useful in UK health service research?
There is a limited place for multidimensional small area classifications in the analysis of health data for both research and health needs assessment provided the inherent drawbacks of these data are understood in interpreting the results.
Answer the question based on the following context: European guidelines for quality assurance in colorectal cancer screening recommend snare resection for polyps > 5 mm. The aim of this study was to investigate polypectomy technique according to lesion size and shape, and to assess adherence of endoscopists enrolled in the national quality assurance program to the European guidelines. This cohort study included screening colonoscopies performed between 2007 and 2013 within a quality assurance program in Austria. Resection technique was analyzed according to lesion characteristics and endoscopy facility (private practices, hospitals, outpatient clinics) before publication of the EU guidelines (2007 - 2010) and adherence to the guidelines after publication (2011 - 2013). All surveillance colonoscopies and examinations with missing data were excluded. A total of 128 969 screening colonoscopies performed by 278 endoscopy units were included. The polyp detection rate was 39.6 % (n = 47 797) and 95.6 % of polyps were resected. Of polyps ≥ 5 mm, 46.0 % were resected using forceps and were therefore not treated in accordance with the guidelines. Forceps polypectomy of lesions 5 - 10 mm and > 10 mm decreased significantly in hospitals after implementation of the guidelines (both P < 0.0001). In private practices, there was no difference in forceps usage for polyps of 5 - 10 mm (P = 0.41) before and after the guidelines, and for polyps > 10 mm forceps usage even increased (P < 0.0001). Endoscopists' forceps removal rates for polyps ≥ 5 mm correlated significantly with respective adenoma detection rates (P = 0.0007, r_p = -0.187) and cecal intubation rates (P = 0.0001, r_p = -0.303). Among endoscopists in private practices, internists had slightly lower forceps removal rates for polyps ≥ 5 mm than surgeons, both before (47.2 % vs. 50.7 %; P = 0.014) and after publication of the guidelines (51.9 % vs. 53.5 %; P = 0.161).
Question: Forceps versus snare polypectomies in colorectal cancer screening: are we adhering to the guidelines?
This study confirmed the importance of the European guidelines. The inclusion of adequate resection technique as a quality indicator in colorectal cancer screening programs is recommended.
Answer the question based on the following context: Visible para-aortic lymph nodes of ≥2 mm in size are common metastatic patterns of colorectal cancer (CRC) seen on imaging. Their prognostic value, however, remains inconclusive. We aimed to assess the prognostic role of visible para-aortic lymph nodes (PALNs). Patients with confirmed pathologic diagnosis of CRC were enrolled. Correlations among clinicopathologic variables were analyzed using the χ2 test. The Cox proportional hazards model was applied for univariate and multivariate analyses. Survival was estimated using the Kaplan-Meier method and log-rank test. A prognostic model for visible PALNs in CRC patients was established. In total, 4527 newly diagnosed CRC patients were enrolled. Patients with visible PALNs had inferior overall survival compared to those without visible PALNs (5-year overall survival, 67% vs. 76%, P = 0.015). Lymphovascular invasion (LVI) (hazard ratio = 1.865, P = 0.015); nodal disease (pN+) status (hazard ratio = 2.099, P = 0.006); elevated preoperative serum carcinoembryonic antigen (CEA) levels (hazard ratio = 2.263, P<0.001); and visible PALNs ≥10 mm (hazard ratio = 1.638, P = 0.031) were independent prognostic factors for patients with visible PALNs. If each prognostic factor scored one point, the 5-year overall survival rates of the lower- (prognostic score 0-1), intermediate- (prognostic score 2), and high- (prognostic score 3-4) risk groups were 78%, 54%, and 25%, respectively (P<0.001).
Question: The Prognostic Role of Para-Aortic Lymph Nodes in Patients with Colorectal Cancer: Is It Regional or Distant Disease?
The prognostic model, which included LVI, pN+ status, preoperative serum CEA level, and the size of visible PALNs, could effectively distinguish the outcome of patients with visible PALNs.
Answer the question based on the following context: The aim of the study was to evaluate the relationship between bone microvascularization of the footprint and tendon integrity after rotator cuff repair of the shoulder. Forty-eight patients (mean age, 59 years; ±7.9) with a chronic rotator cuff tear underwent a tendon repair with a single-row technique and were studied prospectively. A core obtained from the footprint during the procedure allowed determination of the bone's microvascularization with an immunohistochemistry technique using anti-CD34 antibodies. Clinical evaluation was performed at a minimum of 12-month follow-up, and rotator cuff integrity was assessed with ultrasound according to Sugaya's classification. At a mean follow-up of 13 months, the Constant score improved from 40 to 75 points; American Shoulder and Elbow Surgeons score, from 59 to 89 points; and subjective shoulder value, from 38% to 83% (P<.001). Ultrasound identified 18 patients with Sugaya type I healing, 27 patients with type II, and 3 patients with type IV. No patients showed Sugaya type III or V repairs. The rate of microvascularization of the footprint was 15.6%, 13.9%, and 4.2% for type I, II, and IV tendon integrity, respectively (I vs. II, P = .22; II vs. IV, P = .02; I vs. IV, P = .0022). Patients with a history of corticosteroid injection had a lower rate of microvascularization than the others (10.3% vs. 16.2%; P = .03).
Question: Does microvascularization of the footprint play a role in rotator cuff healing of the shoulder?
Even if overall satisfactory clinical outcomes are achieved after a rotator cuff repair, bone microvascularization of the footprint plays a role in rotator cuff healing. A lower rate of microvessels decreases the tendon integrity and healing potential after repair.
Answer the question based on the following context: The aim of this study was to evaluate the effect of climate and altitude differences on the volume of paranasal sinuses and on the frequency of anatomic variations by comparing the paranasal sinus tomograms (PNSCT) of patients who were born and living in a cold, dry climate at high altitude with those of patients who were born and living on the coast at sea level in a temperate climate. We also aimed to determine differences relating to gender. A total of 55 PNSCTs of 55 patients from the city center of Antalya and 60 PNSCTs of 60 patients from the city center of Agrı were evaluated and compared prospectively. The study included a total of 115 patients with a mean age of 44.75 ± 9.64 years (range, 27-63 years). Group 1 (Antalya) comprised 26 females (47.3%) and 29 males (52.7%) with a mean age of 36.7 ± 12.4 years. Group 2 (Agrı) comprised 25 females (41.7%) and 35 males (58.3%) with a mean age of 35.1 ± 13.4 years. Maxillary sinus volumes were 18.27 cm³ (range, 5.04-37.62) and 15.06 cm³ (4.11-41.40); sphenoid sinus volumes were 7.81 cm³ (1.80-20.63) and 6.35 cm³ (0.54-16.50); frontal sinus volumes were 5.51 cm³ (0.50-29.25) and 3.76 cm³ (0.68-22.81), respectively. There was no statistically significant difference between the groups in terms of volumes (p>0.025). Both maxillary and frontal sinus volumes were greater in males compared to females (p<0.025). The mean value of the maxillary sinus volume was 15.7 ± 5.3 cm³ and was significantly larger in males than in females (p = 0.004). There was no statistically significant correlation between maxillary sinus volume and age or side. There was no statistically significant difference between the groups in terms of septum deviation and concha bullosa rates (p = 0.469 and p = 0.388).
Question: Do altitude and climate affect paranasal sinus volume?
There have been many studies of nasal cavity changes due to climatic conditions but this is the first study to measure the difference of paranasal sinus volumes. No difference was determined in the anatomic variations and volumes of the maxillary, frontal, sphenoid sinuses on PNSCT of patients from different climates and altitudes.
Answer the question based on the following context: The evaluation of surgical risk is crucial in elderly patients. At present, there is little evidence of the usefulness of comprehensive geriatric assessment (CGA) as a part of the overall assessment of surgical elderly patients. We verified whether CGA associated with established surgical risk assessment tools is able to improve the prediction of postoperative morbidity and mortality in 377 elderly patients undergoing elective surgery. Overall mortality and morbidity were 2.4% and 19.9%, respectively. Multivariate analysis showed that impaired cognitive function (odds ratio [OR], 1.33; 95% confidence interval [CI], 1.15 to 4.22; P<.02) and a higher Physiological and Operative Severity Score for the Enumeration of Mortality and Morbidity (OR, 1.11; 95% CI, 1.00 to 1.23; P<.04) are predictive of mortality. Higher comorbidity (OR, 2.12; 95% CI, 1.06 to 4.22; P<.03), a higher American Society of Anesthesiologists score (OR, 2.18; 95% CI, 1.31 to 3.63; P<.001), and a higher National Confidential Enquiry into Patient Outcome and Death score (OR, 2.03; 95% CI, 1.03 to 4.00; P<.04) are predictive of morbidity.
Question: Does comprehensive geriatric assessment improve the estimate of surgical risk in elderly patients?
In elective surgical elderly patients, the morbidity and mortality are low. The use of CGA improves the identification of elderly patients at higher risk of adverse events, independent of the surgical prognostic indices.
Answer the question based on the following context: To examine if implicit emotion regulation (occurring outside of awareness) is related to binge eating disorder (BED) symptomatology and explicit emotion regulation (occurring within awareness), and can be altered via intervention. Implicit emotion regulation was assessed via the Emotion Conflict Task (ECT) among a group of adults with BED. Study 1 correlated BED symptomatology and explicit emotion regulation with ECT performance at baseline (BL) and after receiving BED treatment (PT). Study 2 generated effect sizes comparing ECT performance at BL and PT with healthy (non-eating disordered) controls (HC). Study 1 yielded significant correlations (p<.05) of ECT performance with both BED symptomatology and explicit emotion regulation. Study 2 found that, compared with BL ECT performance, PT performance shifted closer to HC (d = -.27). Preliminary results suggest a) BED symptomatology and explicit emotion regulation are associated with ECT performance, and b) PT ECT performance normalized after BED treatment.
Question: Does implicit emotion regulation in binge eating disorder matter?
Implicit emotion regulation may be a BED treatment mechanism because psychotherapy, directly or indirectly, decreased sensitivity to implicit emotional conflict. Further understanding implicit emotion regulation may refine conceptualizations and effective BED treatments.
Answer the question based on the following context: Diaphragm plication surgery is conducted to relieve dyspnea, which results from the mediastinal shift, atelectasis, and ventilation/perfusion dyssynchrony in the lungs that occur because of an eventrated diaphragm. This study aims to determine whether diaphragm plication has any effect on respiration by analyzing the patients' changing values in the respiratory function test (RFT) after plication surgery. Sixteen patients who underwent diaphragm plication surgery in our clinic because of diaphragm eventration or paralysis were examined prospectively. Diaphragm eventration values were assessed using a calculation method based on posteroanterior chest radiographs taken at admission and at follow-up, and these data were recorded. The changes in eventration levels and in the restrictive respiratory failure parameters of the RFT - forced expiratory volume in 1 second (FEV1) and forced vital capacity (FVC) - measured in the pre- and postoperative control periods were compared using statistical analysis methods. The agreement between the RFT changes and symptoms was examined through a satisfaction survey - a questionnaire consisting of multiple-choice questions with answer options such as "better," "the same," and "worse" - administered at the 12th-month postoperative control to capture preoperative and postoperative symptom levels. Postoperatively, a decrease of between 19% and 23% in eventration amounts was observed at the 1st, 6th, and 12th postoperative months. In addition, the highest average increase was 0.2 L in FEV1 and 0.25 L in FVC.
Question: Is surgical plication necessary in diaphragm eventration?
The researchers of this study believe that more selective decisions need to be made when identifying patients for surgery in unilateral diaphragm eventration, especially in the adult patient group; the surgical option should be reserved for cases in which the eventrated diaphragm results in mediastinal shift and respiratory failure.
Answer the question based on the following context: Recent reports have reopened discussion of the prognostic value of elevated pre-treatment carcinoembryonic antigen (CEA) levels in colorectal cancer. Due to the discrepancies in the published results, we aimed to analyze the possible predictive value of CEA, both overall and in different tumoral stages in our environment. We retrospectively studied 303 consecutive patients with colorectal cancer resected with curative intent by analysing tumor-related mortality. The frequency of patients with increased CEA levels (>5mg/l) was registered. Univariate and multivariate analyses of survival curves were performed, comparing patients with increased CEA levels and those with CEA levels within normal limits, both in the overall series and in the different pTNM tumoral stages. Frequency of patients with CEA>5mg/l was 31%. The median clinical follow-up was 83 months. A poor survival rate was registered in the multivariate analysis of the whole series in patients with high CEA levels: hazard ratio (HR)=1.81; 95% confidence interval (95% CI)=(1.15-3.10); P=.012. This predictive value was only maintained in stage II in the survival analysis of the distinct tumoral stages (n=104): HR=3.02; 95% CI=(1.22-7.45); P=.017.
Question: Prognostic value of preoperative carcinoembryonic antigen: Is it useful in all stages of colorectal cancer?
Before treatment, 31% of our patients with colorectal cancer resected with curative intent had pathological CEA values. In the overall series, a high pretreatment CEA level showed an independent prognostic value for poor survival. When pTNM tumoral stages were analyzed separately, CEA level had predictive value only in pTNM II tumors.
Answer the question based on the following context: Bicortical screw fixation systems and miniplate with monocortical screw fixation systems have been reported mainly in bilateral sagittal split ramus osteotomy (BSSO). This study compared postoperative stability between these 2 fixation systems by an intraoral approach. This was a retrospective cohort study. The study sample was composed of patients treated by BSSO at the authors' institute from January 2006 through December 2012. All patients had facial symmetry and underwent setback surgery. The predictor variable was treatment group (intraoral screw fixation [SG] vs intraoral miniplate fixation [MG]), and the primary outcome variable was stability defined as the change in the position of point B. Other outcome variables were stability defined as the change in the position of the menton, blood loss, incidence of postoperative temporomandibular joint disorder, and nerve injury. Descriptive and bivariate statistics were computed, and the P value was set at .05. Seventy-five patients (35 men and 40 women; mean age, 25.8 yr) were divided into 2 groups (39 SG cases and 36 MG cases). Postoperative changes at point B and the menton in the 2 fixation groups were not statistically different. Lingual nerve injury occurred only in SG cases. Moreover, total blood loss was greater in SG cases.
Question: Does Intraoral Miniplate Fixation Have Good Postoperative Stability After Sagittal Splitting Ramus Osteotomy?
An intraoral miniplate with monocortical screw fixation system is recommended over intraoral bicortical screw fixation for bone segments in setback BSSO in patients without facial asymmetry.
Answer the question based on the following context: The most frequent reason for performing a distal pancreatectomy is the presence of cystic or neuroendocrine tumors, in which the distal pancreatic stump is often soft and non-fibrotic. This parenchymal consistency represents the main risk factor for post-operative pancreatic fistula. In order to identify the fistula and assess its severity, postoperative monitoring of amylase from intraperitoneal drains is important. From a retrospective multicentric database analysis, 33 patients who underwent distal pancreatectomy for pancreatic neoplastic disease were included. Postoperative pancreatic fistula occurred in four cases. One patient had a ductal adenocarcinoma, two presented with pancreatic endocrine neoplasms, and the last one had an intraductal papillary mucinous neoplasia. Two patients underwent open and the other two laparoscopic distal pancreatectomy. Postoperative pancreatic fistulas after distal pancreatectomy worsen the quality of life, prolong the post-operative stay, and delay further adjuvant therapy. In patients who underwent distal pancreatectomy, the literature reports some advantages deriving from the placement of abdominal drains only in selected cases and from their early removal. Patients presenting a high risk of pancreatic fistula had higher amylase levels in the drainage fluid on the first postoperative day.
Question: Can the measurement of amylase in drain after distal pancreatectomy predict post-operative pancreatic fistula?
POPF is the most frequent complication after pancreatectomy. In our analysis, DFA1>5000 can be considered a predictive factor for pancreatic fistula. For this reason, the systematic measurement of amylase in drain fluid on the first postoperative day can be considered good clinical practice.
Answer the question based on the following context: Though there is some evidence that body exposure increases body satisfaction, it is still unclear why exposure works and how attention should be guided during exposure. This pilot study manipulates the focus of attention during body exposure. Female participants high in body dissatisfaction were randomly assigned to an exposure intervention that exclusively focused on self-defined attractive (n = 11) or self-defined unattractive (n = 11) body parts. Both interventions consisted of five exposure sessions and homework. Outcome and process of change were studied. Both types of exposure were equally effective and led to significant improvements in body satisfaction, body checking, body concerns, body avoidance and mood at post-test. Improvements for body satisfaction and mood were maintained at follow-up, while body shape concerns and body checking still improved between post-test and follow-up. Body avoidance improvements were maintained for the positive exposure, while the negative exposure tended to further decrease long-term body avoidance at follow-up. The 'positive' exposure induced positive feelings during all exposure sessions, while the 'negative' exposure initially induced a worsening of feelings, but feelings started to improve after some sessions. The most unattractive body part was rated increasingly attractive in both conditions, though this increase was significantly larger in the negative compared to the positive exposure condition. The sample was small and non-clinical.
Question: Mirror exposure to increase body satisfaction: Should we guide the focus of attention towards positively or negatively evaluated body parts?
Both types of exposure might be effective and clinically useful. Negative exposure is emotionally hard but might be significantly more effective in increasing the perceived attractiveness of loathed body parts and in decreasing avoidance behavior.
Answer the question based on the following context: To test whether presenting attribute levels in words or graphics generates different results with respect to attribute level interpretation, relative importance and participation probabilities. Parents of 959 newborns completed a DCE questionnaire that contained two versions of the same nine choice tasks in which the attribute levels were presented in words or graphics. Five attributes related to the decision of parents to vaccinate their newborn against rotavirus were included. Mixed-logit models were conducted to estimate the relative importance of the attribute levels. Respondents who started with the choice tasks in words produced the most consistent answer patterns. All respondents significantly preferred words to graphics. Part-worth utilities and the relative importance of the attribute levels differed based on the words and graphics data, resulting in different probabilities to participate in vaccination.
Question: Words or graphics to present a Discrete Choice Experiment: Does it matter?
Words were preferred over graphics, resulted in higher choice consistency, and showed more valid attribute level estimates. Graphics did not improve respondents' understanding of the attribute levels.
Answer the question based on the following context: The Movement Disorder Society-sponsored revision of the Unified Parkinson's Disease Rating Scale (MDS-UPDRS) has been published in 2008 as the successor of the original UPDRS. The MDS-UPDRS organizing team developed guidelines for the development of official non-English translations consisting of four steps: translation/back-translation, cognitive pretesting, large field testing, and clinimetric analysis. The aim of this paper was to introduce the new MDS-UPDRS and its validation process into Hungarian. Two independent groups of neurologists translated the text of the MDS-UPDRS into Hungarian and subsequently back-translated into English. After the review of the back-translated English version by the MDS-UPDRS translation administration team, cognitive pretesting was conducted with ten patients. Based on the results of the initial cognitive pretesting, another round was conducted. For the large field testing phase, the Hungarian official working draft version of MDS-UPDRS was tested with 357 patients with Parkinson's disease (PD). Confirmatory factor analyses (CFA) determined whether the factor structure for the English-language MDS-UPDRS could be confirmed in data collected using the Hungarian Official Draft Version. To become an official translation, the Comparative Fit Index (CFI) had to be ≥ 0.90 compared to the English-language version. For all four parts of the Hungarian MDS-UPDRS, the CFI was ≥ 0.94.
Question: Validation of the Hungarian MDS-UPDRS: why do we need a new Parkinson scale?
The overall factor structure of the Hungarian version was consistent with that of the English version, based on the high CFIs for all four parts of the MDS-UPDRS in the CFA; therefore, this version was designated as the official Hungarian version of the MDS-UPDRS.
Answer the question based on the following context: Postoperative pancreatic fistula (POPF) is the most significant cause of morbidity and mortality after pancreaticoduodenectomy (PD). We evaluated the role of postoperative serum lipase concentration in ruling out POPF in the immediate post-operative period. We retrospectively analysed 98 consecutive PDs performed between January 2009 and December 2014, investigating the correlation between postoperative day 1 (POD1) serum lipase concentration and POPF development. Twenty-nine patients (29.5%) developed POPF [grade A, 17 (17.3%); grade B, 8 (8.1%); grade C, 4 (4%)]. A receiver operating characteristic (ROC) analysis was conducted to determine the threshold value of POD1 serum lipase associated with clinically significant POPF (AUC = 0.76, 95% CI 0.64-0.86, P = 0.01). This threshold was ≤ 44.5 U/L, and its sensitivity and specificity were 92% and 66%, respectively. The positive and negative predictive values (PPV, NPV) were 31% and 98%, respectively.
Question: Can early serum lipase measurement be routinely implemented to rule out clinically significant pancreatic fistula after pancreaticoduodenectomy?
Early routine measurement of serum lipase proved to be helpful in ruling out clinically relevant POPF (CR-POPF). In our cohort, a POD1 cut-off of ≤ 44.5 U/L allowed early and accurate identification of patients with a low probability of developing clinically significant POPF, who can then be selected for enhanced post-operative recovery with significant clinical and economic benefits.
Answer the question based on the following context: To assess whether people who inject drugs (PWID) and who are treated for overdose by ambulance services have a greater mortality risk compared with other PWID, and to compare mortality risk within potentially critical time-periods (1 week, 1 month, 3 months, 6 months, 1 year, 5 years) after an overdose attendance with the mortality risk within potentially non-critical time-periods (time before and/or after critical periods). A prospective cohort study. Oslo, Norway. A total of 172 PWID street-recruited in 1997 and followed-up until the end of 2004. Interview data linked to data from ambulance records, Norwegian Correctional Services, Opioid Substitution Treatment records and National Cause of Death Registry. Separate Cox regression models (one for each critical time-period) were estimated. Ambulance services treated 54% of the participants for an overdose during follow-up. The mortality rate was 2.8 per 100 person-years for those with an overdose and 3.3 for those without; the adjusted hazard ratio (HR) was 1.3 (95% CI = 0.6, 2.6, P = 0.482). Mortality risk was greater in all but the shortest critical time-period following ambulance attendance than in the non-critical periods. The mortality risk remained significantly elevated during critical periods, even when adjusted for total time spent in prison and substitution treatment. The HR ranged from 9.4 (95% CI = 3.5, 25.4) in the month after an overdose to 13.9 (95% CI = 6.4, 30.2) in the 5-year period.
Question: Are overdoses treated by ambulance services an opportunity for additional interventions?
Mortality risk among people who inject drugs is significantly greater in time-periods after an overdose attendance than outside these time-periods.
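For context on the survival analysis reported above, the sketch below shows how an adjusted hazard ratio might be estimated with a Cox proportional-hazards model. The use of the lifelines package, the simulated data, the single-model formulation, and all column names and effect sizes are illustrative assumptions; the study itself fitted separate models for each critical time-period.

```python
# Minimal Cox proportional-hazards sketch for an overdose-exposure covariate (simulated data).
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 200
overdose = rng.integers(0, 2, n)                # treated for overdose by ambulance (0/1)
prison_years = rng.exponential(0.5, n)          # adjustment covariate (hypothetical)
latent = rng.exponential(8.0, n) / np.exp(0.3 * overdose + 0.2 * prison_years)
died = (latent < 7.0).astype(int)               # event indicator, administratively censored at 7 years
followup = np.minimum(latent, 7.0)

df = pd.DataFrame({"followup_years": followup, "died": died,
                   "overdose_attended": overdose, "prison_years": prison_years})
cph = CoxPHFitter().fit(df, duration_col="followup_years", event_col="died")
cph.print_summary()  # exp(coef) gives the hazard ratio for each covariate, with 95% CIs
```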
Answer the question based on the following context: Major depressive disorder is a significant mental illness that is highly likely to recur, particularly after three or more previous episodes. Increased mindfulness and decreased rumination have both been associated with decreased depressive relapse. The aim of this study was to investigate whether rumination mediates the relationship between mindfulness and depressive relapse. This prospective study involved a secondary data analysis using mediation analysis to identify causal mechanisms. The study was embedded in a pragmatic randomized controlled trial of mindfulness-based cognitive therapy (MBCT) in which 203 participants (165 females, 38 males; mean age: 48 years), with a history of at least three previous episodes of depression, completed measures of mindfulness, rumination, and depressive relapse over a 2-year follow-up period. Specific components of mindfulness and rumination, namely nonjudging and brooding, respectively, were also explored. While higher mindfulness scores predicted reductions in rumination and depressive relapse, the relationship between mindfulness and relapse was not found to be mediated by rumination, although there appeared to be a trend.
Question: Does rumination mediate the relationship between mindfulness and depressive relapse?
Our results strengthen the argument that mindfulness may be important in preventing relapse but that rumination is not a significant mediator of its effects. The study was adequately powered to detect medium mediation effects, but it is possible that smaller effects were present but not detected.
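The mediation logic tested above (mindfulness -> rumination -> relapse) can be illustrated with a bootstrapped product-of-coefficients estimate of the indirect effect. The sketch below uses simulated data; the variable names, effect sizes, and the simple OLS/logistic formulation are assumptions for illustration, not the study's actual model.

```python
# Bootstrapped indirect effect (a*b) for a single-mediator model on simulated data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 203
mindfulness = rng.normal(0.0, 1.0, n)
rumination = -0.4 * mindfulness + rng.normal(0.0, 1.0, n)                            # hypothetical a-path
relapse = (0.3 * rumination - 0.2 * mindfulness + rng.normal(0.0, 1.0, n) > 0).astype(int)

def indirect_effect(idx):
    m, r, y = mindfulness[idx], rumination[idx], relapse[idx]
    a = sm.OLS(r, sm.add_constant(m)).fit().params[1]                                # mindfulness -> rumination
    b = sm.Logit(y, sm.add_constant(np.column_stack([m, r]))).fit(disp=0).params[2]  # rumination -> relapse
    return a * b

boot = np.array([indirect_effect(rng.integers(0, n, n)) for _ in range(500)])
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"bootstrapped indirect effect, 95% CI: ({lo:.3f}, {hi:.3f})")                 # mediation if the CI excludes 0
```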
Answer the question based on the following context: To examine the effect of autism spectrum (AS) tendencies and psychosocial job characteristics on health-related quality of life (HRQOL) among factory workers. A questionnaire survey was administered to 376 Japanese factory employees from the same company (response rate: 83.6%) in 2010. Psychosocial job characteristics, including job demand, job control, and social support, were evaluated using the Job Content Questionnaire (JCQ). AS tendencies were assessed using the Autism-Spectrum Quotient (AQ), and HRQOL was assessed using the Medical Outcomes Study Short-Form General Health Survey (SF-8). Associations were investigated using multiple logistic regression analysis adjusted for confounders. In the multivariate analysis, AQ was positively (odds ratio [OR]: 3.94; 95% confidence interval [CI]: 1.70-9.73) and social support in the workplace was inversely (OR: 0.25; 95% CI: 0.10-0.57) associated with poor mental HRQOL. No significant interaction was observed between AQ and the JCQ subitems. Only social support was inversely associated with poor physical HRQOL (OR and 95% CI for medium social support: 0.45 and 0.21-0.94), and a significant interaction between AQ and job control was observed (p=0.02), suggesting that high job control was associated with poor physical HRQOL among workers with high AQ, whereas low job control tended to be associated with poor physical HRQOL among the others.
Question: Is high job control a risk factor for poor quality of life in workers with high autism spectrum tendencies?
Our results suggest that AS tendencies have a negative effect on workers' HRQOL and that social support is a primary factor in maintaining HRQOL. Moreover, a structured work environment may help maintain physical HRQOL in workers with high AS tendencies, since higher job control may itself be stressful for them.
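To illustrate how the odds ratios and confidence intervals quoted above are obtained from a multiple logistic regression, here is a hedged sketch on simulated data; the predictor names, coding, and effect sizes are hypothetical, and the model omits the confounder adjustment used in the study.

```python
# Logistic regression sketch: odds ratios and 95% CIs for binary predictors of poor mental HRQOL.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 376
aq_high = rng.integers(0, 2, n)        # hypothetical indicator of high Autism-Spectrum Quotient
low_support = rng.integers(0, 2, n)    # hypothetical indicator of low workplace social support
p = 1.0 / (1.0 + np.exp(-(-1.0 + 1.2 * aq_high + 1.0 * low_support)))
poor_mental_hrqol = rng.binomial(1, p)

X = sm.add_constant(pd.DataFrame({"aq_high": aq_high, "low_support": low_support}))
fit = sm.Logit(poor_mental_hrqol, X).fit(disp=0)
odds_ratios = np.exp(fit.params)       # exponentiated coefficients are the odds ratios
ci = np.exp(fit.conf_int())            # 95% CI on the odds-ratio scale
print(pd.concat([odds_ratios.rename("OR"), ci.rename(columns={0: "2.5%", 1: "97.5%"})], axis=1))
```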
Answer the question based on the following context: To identify a method suitable for providing an objective assessment of the cost effectiveness of a dose-reducing measure used for diagnostic dental X-ray exposures. Three cost-utility analysis (CUA) methods were evaluated by comparing their assessments of two dose-reduction measures: a rectangular collimator and the combination of two devices that reduce the radiation dose received during orthodontic lateral cephalography. The following CUA methods were used: (1) the alpha value (AV), a monetary valuation of dose reduction used in the nuclear industry; (2) the value of a statistical life, for valuation of the reduction in stochastic adverse effects; and (3) the time-for-time method, based on the postulate that risk reduction is effective when the number of years of life gained exceeds the number of years an average worker must work to earn the cost of the risk-reducing measure. The CUA methods were used to determine the minimum number of uses required for the dose-reducing device to be cost effective. The methods were assessed for coherence (are comparable results achieved for comparable countries?) and adaptability (can the method be adjusted for the age and gender of specific patient groups?). The performance of the time-for-time method was superior to that of the other methods. Both types of dose-reduction devices tested were assessed as cost effective after a realistic number of uses with all three methods, except when low AVs were used.
Question: Reducing an already low dental diagnostic X-ray dose: does it make sense?
A CUA of X-ray dose-reduction measures can be performed to determine whether investment in reducing an already low dose is cost effective. The time-for-time method proved to be a coherent and versatile method for performing such a CUA.
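Because the time-for-time method is described above only verbally, the sketch below makes its decision rule explicit: a dose-reduction device is judged cost effective once the expected years of life gained exceed the work-years needed to earn its cost. All parameter names and numbers are illustrative assumptions, not values from the paper.

```python
# Time-for-time cost-utility check for a dose-reducing device (sketch).
import math

def minimum_uses_for_cost_effectiveness(device_cost, annual_wage,
                                        dose_saved_per_use_sv, life_years_lost_per_sv):
    """Smallest number of uses at which life-years gained match the work-years needed to pay for the device."""
    work_years_to_pay = device_cost / annual_wage
    life_years_gained_per_use = dose_saved_per_use_sv * life_years_lost_per_sv
    return math.ceil(work_years_to_pay / life_years_gained_per_use)

# Hypothetical inputs: a 300-euro rectangular collimator, a 40,000-euro average annual wage,
# 0.02 mSv saved per exposure, and an assumed detriment of roughly 1 life-year lost per sievert.
print(minimum_uses_for_cost_effectiveness(300, 40_000, 0.02e-3, 1.0))  # -> 375 uses
```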
Answer the question based on the following context: It has been noted that after tubularized incised plate (TIP) urethroplasty, the final meatal position is glanular but not at the optimum position. An inner preputial inlay graft combined with tubularized incised plate (G-TIP) has been proposed for redo urethroplasty. We extended this indication, making it the standard technique for primary hypospadias repair. We conducted this prospective study to obtain a wide, slit-like neomeatus at the optimum position in the glans proper and to judge whether complication rates differ from those reported for TIP repair in the literature. This prospective study included 230 consecutive patients who underwent this technique. The study was conducted from November 2011 to August 2014, and all hypospadias cases were repaired in a single stage regardless of the width and depth of the urethral plate or the size and shape of the glans. The meatus was glanular in 13 patients, coronal in 75, distal penile in 112, midpenile in 25, and proximal in five. The urethral plate was incised deeply and the incision extended distally beyond the end of the plate by 3 mm into the glans proper. The mucosal graft was harvested from the inner prepuce, inlaid and quilted into the incised urethral plate. The neourethra was created over a urethral catheter in two layers. The vascular dartos flap was mobilized dorsally and moved ventrally to cover the neourethral suture line as a barrier. The follow-up period ranged from 5 to 36 months. Excellent cosmetic and functional results were achieved in 221 of 230 patients (96.09%). Neither meatal stenosis nor urethral diverticulum was encountered. An excellent glanular position of a wide slit-like neomeatus was achieved using this technique. Nine patients (3.91%) developed urethrocutaneous fistula. Parents reported an excellent urinary stream.
Question: Is combined inner preputial inlay graft with tubularized incised plate in hypospadias repair worth doing?
A combined inner preputial graft with TIP urethroplasty secures the optimal glanular position of a wide slit-like neomeatus because the incision is extended beyond the end of the plate, thereby optimizing functional and cosmetic outcomes without meatal stenosis.
Answer the question based on the following context: Renal damage is more frequent with new-generation lithotripters. However, animal studies suggest that voltage ramping minimizes the risk of complications following extracorporeal shock wave lithotripsy (SWL). In the clinical setting, the optimal voltage strategy remains unclear. To evaluate whether stepwise voltage ramping can protect the kidney from damage during SWL. A total of 418 patients with solitary or multiple unilateral kidney stones were randomized to receive SWL using a Modulith SLX-F2 lithotripter with either stepwise voltage ramping (n=213) or a fixed maximal voltage (n=205). SWL. The primary outcome was sonographic evidence of renal hematomas. Secondary outcomes included levels of urinary markers of renal damage, stone disintegration, stone-free rate, and rates of secondary interventions within 3 mo of SWL. Descriptive statistics were used to compare clinical outcomes between the two groups. A logistic regression model was generated to assess predictors of hematomas. Significantly fewer hematomas occurred in the ramping group (12/213, 5.6%) than in the fixed group (27/205, 13%; p=0.008). There was some evidence that the fixed group had higher urinary β2-microglobulin levels after SWL compared to the ramping group (p=0.06). Urinary microalbumin levels, stone disintegration, stone-free rate, and rates of secondary interventions did not significantly differ between the groups. The logistic regression model showed a significantly higher risk of renal hematomas in older patients (odds ratio [OR] 1.03, 95% confidence interval [CI] 1.00-1.05; p=0.04). Stepwise voltage ramping was associated with a lower risk of hematomas (OR 0.39, 95% CI 0.19-0.80; p=0.01). The study was limited by the use of ultrasound to detect hematomas.
Question: Does Stepwise Voltage Ramping Protect the Kidney from Injury During Extracorporeal Shockwave Lithotripsy?
In this prospective randomized study, stepwise voltage ramping during SWL was associated with a lower risk of renal damage compared to a fixed maximal voltage without compromising treatment effectiveness.
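The between-group comparison of hematoma rates above can be checked approximately from the reported counts (12/213 vs 27/205). The sketch below uses Fisher's exact test as an assumption, since the abstract does not state which test produced p=0.008; the resulting unadjusted odds ratio of about 0.39 happens to coincide with the adjusted estimate reported.

```python
# Unadjusted comparison of hematoma rates between the ramping and fixed-voltage groups.
from scipy.stats import fisher_exact

table = [[12, 213 - 12],   # ramping group: hematoma, no hematoma
         [27, 205 - 27]]   # fixed-voltage group: hematoma, no hematoma
odds_ratio, p_value = fisher_exact(table)
print(f"unadjusted odds ratio = {odds_ratio:.2f}, two-sided p = {p_value:.3f}")
```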