text | category | url | title | __index_level_0__ |
---|---|---|---|---|
Shuiyousphaeridium is an extinct genus of acritarch discovered in 1993. Dated to 1.8 Ga, it represents one of the earliest fossil eukaryote taxa. | Biology | https://en.wikipedia.org/wiki?curid=60769668 | Shuiyousphaeridium | 158,719 |
Histology Histology, also known as microscopic anatomy or microanatomy, is the branch of biology which studies the microscopic anatomy of biological tissues. It is the microscopic counterpart to gross anatomy, which looks at larger structures visible without a microscope. Although one may divide microscopic anatomy into "organology", the study of organs, "histology", the study of tissues, and "cytology", the study of cells, modern usage places these topics under the field of histology. In medicine, histopathology is the branch of histology that includes the microscopic identification and study of diseased tissue. In the field of paleontology, the term paleohistology refers to the histology of fossil organisms. There are four basic types of animal tissues: muscle tissue, nervous tissue, connective tissue, and epithelial tissue. All animal tissues are considered to be subtypes of these four principal tissue types (for example, blood is classified as connective tissue, since the blood cells are suspended in an extracellular matrix, the plasma). For plants, the study of their tissues falls under the field of plant anatomy, which recognizes four main tissue types. Histopathology is the branch of histology that includes the microscopic identification and study of diseased tissue | Biology | https://en.wikipedia.org/wiki?curid=13570 | Histology | 160,712 |
Histology It is an important part of anatomical pathology and surgical pathology, as accurate diagnosis of cancer and other diseases often requires histopathological examination of tissue samples. Trained physicians, frequently licensed pathologists, perform histopathological examination and provide diagnostic information based on their observations. The field of histology that includes the preparation of tissues for microscopic examination is known as histotechnology. Job titles for the trained personnel who prepare histological specimens for examination are numerous and include histotechnicians, histotechnologists, histology technicians and technologists, medical laboratory technicians, and biomedical scientists. Most histological samples need preparation before microscopic observation; these methods depend on the specimen and method of observation. Chemical fixatives are used to preserve and maintain the structure of tissues and cells; fixation also hardens tissues which aids in cutting the thin sections of tissue needed for observation under the microscope. Fixatives generally preserve tissues (and cells) by irreversibly cross-linking proteins. The most widely used fixative for light microscopy is 10% neutral buffered formalin, or NBF (4% formaldehyde in phosphate buffered saline). For electron microscopy, the most commonly used fixative is glutaraldehyde, usually as a 2.5% solution in phosphate buffered saline. Other fixatives used for electron microscopy are osmium tetroxide or uranyl acetate | Biology | https://en.wikipedia.org/wiki?curid=13570 | Histology | 160,713 |
Histology The main action of these aldehyde fixatives is to cross-link amino groups in proteins through the formation of methylene bridges (-CH2-), in the case of formaldehyde, or by C5H10 cross-links in the case of glutaraldehyde. This process, while preserving the structural integrity of the cells and tissue, can damage the biological functionality of proteins, particularly enzymes. Formalin fixation leads to degradation of mRNA, miRNA, and DNA as well as denaturation and modification of proteins in tissues. However, extraction and analysis of nucleic acids and proteins from formalin-fixed, paraffin-embedded tissues is possible using appropriate protocols. "Selection" is the choice of relevant tissue in cases where it is not necessary to put the entire original tissue mass through further processing. The remainder may be kept in fixative in case it needs to be examined at a later time. "Trimming" is the cutting of tissue samples in order to expose the relevant surfaces for later sectioning. It also creates tissue samples of appropriate size to fit into cassettes. Tissues are embedded in a harder medium both as a support and to allow the cutting of thin tissue slices. In general, water must first be removed from tissues (dehydration) and replaced with a medium that either solidifies directly, or with an intermediary fluid (clearing) that is miscible with the embedding media. For light microscopy, paraffin wax is the most frequently used embedding material | Biology | https://en.wikipedia.org/wiki?curid=13570 | Histology | 160,714 |
Histology Paraffin is immiscible with water, the main constituent of biological tissue, so it must first be removed in a series of dehydration steps. Samples are transferred through a series of progressively more concentrated ethanol baths, up to 100% ethanol, to remove remaining traces of water. Dehydration is followed by a "clearing agent" (typically xylene, although other environmentally safe substitutes are in use), which removes the alcohol and is miscible with the wax; finally, melted paraffin wax is added to replace the xylene and infiltrate the tissue. In most histology or histopathology laboratories, the dehydration, clearing, and wax infiltration are carried out in "tissue processors" which automate this process. Once infiltrated in paraffin, tissues are oriented in molds which are filled with wax; once positioned, the wax is cooled, solidifying the block and tissue. Paraffin wax does not always provide a sufficiently hard matrix for cutting very thin sections (which are especially important for electron microscopy). Paraffin wax may also be too soft in relation to the tissue, the heat of the melted wax may alter the tissue in undesirable ways, or the dehydrating or clearing chemicals may harm the tissue. Alternatives to paraffin wax include epoxy, acrylic, agar, gelatin, celloidin, and other types of waxes. In electron microscopy, epoxy resins are the most commonly employed embedding media, but acrylic resins are also used, particularly where immunohistochemistry is required | Biology | https://en.wikipedia.org/wiki?curid=13570 | Histology | 160,715 |
Histology For tissues to be cut in a frozen state, tissues are placed in a water-based embedding medium. Pre-frozen tissues are placed into molds with the liquid embedding material, usually a water-based glycol, OCT, TBS, Cryogel, or resin, which is then frozen to form hardened blocks. For light microscopy, a knife mounted in a microtome is used to cut tissue sections (typically 5–15 micrometers thick), which are mounted on a glass microscope slide. For transmission electron microscopy (TEM), a diamond or glass knife mounted in an ultramicrotome is used to cut tissue sections 50–150 nanometers thick. Biological tissue has little inherent contrast in either the light or electron microscope. Staining is employed both to give contrast to the tissue and to highlight particular features of interest. When the stain is used to target a specific chemical component of the tissue (and not the general structure), the term histochemistry is used. Hematoxylin and eosin (H&E stain) is one of the most commonly used stains in histology to show the general structure of the tissue. Hematoxylin stains cell nuclei blue; eosin, an acidic dye, stains the cytoplasm and other tissues in different shades of pink. In contrast to H&E, which is used as a general stain, there are many techniques that more selectively stain cells, cellular components, and specific substances | Biology | https://en.wikipedia.org/wiki?curid=13570 | Histology | 160,716 |
Histology A commonly performed histochemical technique that targets a specific chemical is the Perls' Prussian blue reaction, used to demonstrate iron deposits in diseases like hemochromatosis. The Nissl method for Nissl substance and Golgi's method (and related silver stains), which are useful in identifying neurons, are other examples of more specific stains. In historadiography, a slide (sometimes stained histochemically) is X-rayed. More commonly, autoradiography is used to visualize the locations to which a radioactive substance has been transported within the body, such as cells in S phase (undergoing DNA replication), which incorporate tritiated thymidine, or sites to which radiolabeled nucleic acid probes bind in in situ hybridization. For autoradiography on a microscopic level, the slide is typically dipped into liquid nuclear track emulsion, which dries to form the exposure film. Individual silver grains in the film are visualized with dark field microscopy. Recently, antibodies have been used to specifically visualize proteins, carbohydrates, and lipids. This process is called immunohistochemistry, or when the stain is a fluorescent molecule, immunofluorescence. This technique has greatly increased the ability to identify categories of cells under a microscope | Biology | https://en.wikipedia.org/wiki?curid=13570 | Histology | 160,717 |
Histology Other advanced techniques, such as nonradioactive "in situ" hybridization, can be combined with immunochemistry to identify specific DNA or RNA molecules with fluorescent probes or tags that can be used for immunofluorescence and enzyme-linked fluorescence amplification (especially alkaline phosphatase and tyramide signal amplification). Fluorescence microscopy and confocal microscopy are used to detect fluorescent signals with good intracellular detail. For electron microscopy heavy metals are typically used to stain tissue sections. Uranyl acetate and lead citrate are commonly used to impart contrast to tissue in the electron microscope. Similar to the frozen section procedure employed in medicine, cryosectioning is a method to rapidly freeze, cut, and mount sections of tissue for histology. The tissue is usually sectioned on a cryostat or freezing microtome. The frozen sections are mounted on a glass slide and may be stained to enhance the contrast between different tissues. Unfixed frozen sections can be used for studies requiring enzyme localization in tissues and cells. Tissue fixation is required for certain procedures such as antibody-linked immunofluorescence staining. Frozen sections are often prepared during surgical removal of tumors to allow rapid identification of tumor margins, as in Mohs surgery, or determination of tumor malignancy, when a tumor is discovered incidentally during surgery | Biology | https://en.wikipedia.org/wiki?curid=13570 | Histology | 160,718 |
Histology Ultramicrotomy is a method of preparing extremely thin sections for transmission electron microscope (TEM) analysis. Tissues are commonly embedded in epoxy or other plastic resin. Very thin sections (less than 0.1 micrometer in thickness) are cut using diamond or glass knives on an ultramicrotome. Artifacts are structures or features in tissue that interfere with normal histological examination. Artifacts interfere with histology by changing the tissue's appearance and hiding structures. Tissue processing artifacts can include pigments formed by fixatives, shrinkage, washing out of cellular components, color changes in different tissue types, and alterations of the structures in the tissue. An example is mercury pigment left behind after using Zenker's fixative to fix a section. Formalin fixation can also leave a brown to black pigment under acidic conditions. In the 17th century, the Italian Marcello Malpighi used microscopes to study tiny biological entities; some regard him as the founder of the fields of histology and microscopic pathology. Malpighi analyzed several parts of the organs of bats, frogs and other animals under the microscope. While studying the structure of the lung, Malpighi noticed its membranous alveoli and the hair-like connections between veins and arteries, which he named capillaries. His discovery established how the oxygen breathed in enters the blood stream and serves the body. In the 19th century, histology was an academic discipline in its own right | Biology | https://en.wikipedia.org/wiki?curid=13570 | Histology | 160,719 |
Histology The French anatomist Xavier Bichat introduced the concept of tissue in anatomy in 1801, and the term "histology", coined to denote the "study of tissues", first appeared in a book by Karl Meyer in 1819. Bichat described twenty-one human tissues, which can be subsumed under the four categories currently accepted by histologists. The usage of illustrations in histology, deemed useless by Bichat, was promoted by Jean Cruveilhier. In the early 1830s, Purkynĕ invented a high-precision microtome. During the 19th century many fixation techniques were developed by Adolph Hannover (solutions of chromates and chromic acid), Franz Schulze and Max Schultze (osmic acid), Alexander Butlerov (formaldehyde) and Benedikt Stilling (freezing). Mounting techniques were developed by Rudolf Heidenhain (gum Arabic), Salomon Stricker (mixture of wax and oil), Andrew Pritchard (gum and isinglass) and Edwin Klebs (Canada balsam). The 1906 Nobel Prize in Physiology or Medicine was awarded to histologists Camillo Golgi and Santiago Ramón y Cajal. They had conflicting interpretations of the neural structure of the brain based on differing interpretations of the same images. Ramón y Cajal won the prize for his correct theory, and Golgi for the silver-staining technique he invented to make it possible. | Biology | https://en.wikipedia.org/wiki?curid=13570 | Histology | 160,720 |
Seto Marine Biological Laboratory The Seto Marine Biological Laboratory (瀬戸臨海実験所, also known as SMBL) is a marine biology field station of Kyoto University. It is located in the small town of Shirahama in Wakayama Prefecture about 230 km from Kyoto. The laboratory was established in 1922, and research there has historically focused on marine invertebrates. A small island, Hatakejima, belongs to the laboratory and serves as an experimental field site in southeastern Tanabe Bay. The Shirahama Aquarium, open to the public, is located next to the research laboratories and operated by SMBL. In 2003, Kyoto University combined SMBL with the Maizuru Fisheries Research Station, University Forests, and the Subtropical Plant Institute into an administrative unit named the Field Science Education and Research Center (FSERC). The scientific journal "Publications of the Seto Marine Biological Laboratory" has been published by SMBL since 1949. | Biology | https://en.wikipedia.org/wiki?curid=31633276 | Seto Marine Biological Laboratory | 166,775 |
Endogenous retrovirus Endogenous retroviruses (ERVs) are endogenous viral elements in the genome that closely resemble and can be derived from retroviruses. They are abundant in the genomes of jawed vertebrates, and they comprise up to 5–8% of the human genome (lower estimates of ~1%). ERVs are a subclass of a type of gene called a transposon, which can be packaged and moved within the genome to serve a vital role in gene expression and in regulation. They are distinguished as retrotransposons, which are Class I elements. Researchers have suggested that retroviruses evolved from a type of transposable gene called a retrotransposon, which includes ERVs; these genes can mutate and instead of moving to another location in the genome they can become exogenous or pathogenic. This means that not all ERVs may have originated as an insertion by a retrovirus but that some may have been the source for the genetic information in the retroviruses they resemble. When integration of viral DNA occurs in the germ-line, it can give rise to an ERV, which can later become fixed in the gene pool of the host population. The replication cycle of a retrovirus entails the insertion ("integration") of a DNA copy of the viral genome into the nuclear genome of the host cell. Most retroviruses infect somatic cells, but occasional infection of germline cells (cells that produce eggs and sperm) can also occur. Rarely, retroviral integration may occur in a germline cell that goes on to develop into a viable organism | Biology | https://en.wikipedia.org/wiki?curid=2311903 | Endogenous retrovirus | 168,674 |
Endogenous retrovirus This organism will carry the inserted retroviral genome as an integral part of its own genome—an "endogenous" retrovirus (ERV) that may be inherited by its offspring as a novel allele. Many ERVs have persisted in the genome of their hosts for millions of years. However, most of these have acquired inactivating mutations during host DNA replication and are no longer capable of producing the virus. ERVs can also be partially excised from the genome by a process known as recombinational deletion, in which recombination between the identical sequences that flank newly integrated retroviruses results in deletion of the internal, protein-coding regions of the viral genome. The general retrovirus genome consists of three genes vital for the invasion, replication, escape, and spreading of its viral genome. These three genes are gag (encodes for structural proteins for the viral core), pol (encodes for reverse transcriptase, integrase, and protease), and env (encodes for coat proteins for the virus's exterior). These viral proteins are encoded as polyproteins. In order to carry out their life cycle, the retrovirus relies heavily on the host cell's machinery. Protease degrades peptide bonds of the viral polyproteins, making the separate proteins functional. Reverse transcriptase functions to synthesize viral DNA from the viral RNA in the host cell's cytoplasm before it enters the nucleus. Integrase guides the integration of viral DNA into the host genome | Biology | https://en.wikipedia.org/wiki?curid=2311903 | Endogenous retrovirus | 168,675 |
Endogenous retrovirus Over time, the genomes of ERVs not only acquire point mutations but also shuffle and recombine with other ERVs. ERVs in which the "env" coat gene has decayed become more likely to spread. Endogenous retroviruses can play an active role in shaping genomes. Most studies in this area have focused on the genomes of humans and higher primates, but other vertebrates, such as mice and sheep, have also been studied in depth. The long terminal repeat (LTR) sequences that flank ERV genomes frequently act as alternate promoters and enhancers, often contributing to the transcriptome by producing tissue-specific variants. In addition, the retroviral proteins themselves have been co-opted to serve novel host functions, particularly in reproduction and development. Recombination between homologous retroviral sequences has also contributed to gene shuffling and the generation of genetic variation. Furthermore, in the instance of potentially antagonistic effects of retroviral sequences, repressor genes have co-evolved to combat them. Solo LTRs and LTRs associated with complete retroviral sequences have been shown to act as transcriptional elements on host genes. Their range of action is mainly by insertion into the 5' UTRs of protein coding genes; however, they have been known to act upon genes up to 70–100 kb away. The majority of these elements are inserted in the sense direction to their corresponding genes, but there has been evidence of LTRs acting in the antisense direction and as a bidirectional promoter for neighboring genes | Biology | https://en.wikipedia.org/wiki?curid=2311903 | Endogenous retrovirus | 168,676 |
Endogenous retrovirus In a few cases, the LTR functions as the major promoter for the gene. For example, in humans AMY1C has a complete ERV sequence in its promoter region; the associated LTR confers salivary specific expression of the digestive enzyme amylase. Also, the primary promoter for bile acid-CoA:amino acid N-acyltransferase (BAAT), which codes for an enzyme that is integral in bile metabolism, is of LTR origin. The insertion of a solo ERV-9 LTR may have produced a functional open reading frame (ORF), causing the rebirth of the human immunity related GTPase gene (IRGM). ERV insertions have also been shown to generate alternative splice sites either by direct integration into the gene, as with the human leptin hormone receptor, or driven by the expression of an upstream LTR, as with the phospholipase A-2 like protein. Most of the time, however, the LTR functions as one of many alternate promoters, often conferring tissue-specific expression related to reproduction and development. In fact, 64% of known LTR-promoted transcription variants are expressed in reproductive tissues. For example, the gene CYP19 codes for aromatase P450, an important enzyme for estrogen synthesis, that is normally expressed in the brain and reproductive organs of most mammals. However, in primates, an LTR-promoted transcriptional variant confers expression to the placenta and is responsible for controlling estrogen levels during pregnancy | Biology | https://en.wikipedia.org/wiki?curid=2311903 | Endogenous retrovirus | 168,677 |
Endogenous retrovirus Furthermore, the neuronal apoptosis inhibitory protein (NAIP), normally widespread, has an LTR of the HERV-P family acting as a promoter that confers expression to the testis and prostate. Other proteins, such as nitric oxide synthase 3 (NOS3), interleukin-2 receptor B (IL2RB), and another mediator of estrogen synthesis, HSD17B1, are also alternatively regulated by LTRs that confer placental expression, but their specific functions are not yet known. The high degree of reproductive expression is thought to be an after effect of the method by which they were endogenized; however, this also may be due to a lack of DNA methylation in germ-line tissues. The best-characterized instance of placental protein expression comes not from an alternatively promoted host gene but from a complete co-option of a retroviral protein. Retroviral fusogenic env proteins, which play a role in the entry of the virion into the host cell, have had an important impact on the development of the mammalian placenta. In mammals, intact env proteins called syncytins are responsible for the formation and function of syncytiotrophoblasts. These multinucleated cells are mainly responsible for maintaining nutrient exchange and separating the fetus from the mother's immune system. It has been suggested that the selection and fixation of these proteins for this function have played a critical role in the evolution of viviparity | Biology | https://en.wikipedia.org/wiki?curid=2311903 | Endogenous retrovirus | 168,678 |
Endogenous retrovirus In addition, the insertion of ERVs and their respective LTRs has the potential to induce chromosomal rearrangement due to recombination between viral sequences at inter-chromosomal loci. These rearrangements have been shown to induce gene duplications and deletions that largely contribute to genome plasticity and dramatically change the dynamic of gene function. Furthermore, retroelements in general are largely prevalent in rapidly evolving, mammal-specific gene families whose function is largely related to the response to stress and external stimuli. In particular, both human class I and class II MHC genes have a high density of HERV elements as compared to other multi-locus gene families. It has been shown that HERVs have contributed to the formation of extensively duplicated duplicon blocks that make up the HLA class 1 family of genes. More specifically, HERVs primarily occupy regions within and between the break points between these blocks, suggesting that considerable duplication and deletion events, typically associated with unequal crossover, facilitated their formation. These blocks, inherited as immunohaplotypes, act as a protective polymorphism against a wide range of antigens, which may have imbued humans with an advantage over other primates. Finally, the insertion of ERVs or ERV elements into genic regions of host DNA, or overexpression of their transcriptional variants, has a much higher potential to produce deleterious effects than positive ones | Biology | https://en.wikipedia.org/wiki?curid=2311903 | Endogenous retrovirus | 168,679 |
Endogenous retrovirus Their appearance in the genome has created a host-parasite co-evolutionary dynamic that proliferated the duplication and expansion of repressor genes. The most clear-cut example of this involves the rapid duplication and proliferation of tandem zinc-finger genes in mammal genomes. Zinc-finger genes, particularly those that include a KRAB domain, exist in high copy number in vertebrate genomes, and their range of functions is limited to transcriptional roles. It has been shown in mammals, however, that the diversification of these genes was due to multiple duplication and fixation events in response to new retroviral sequences or their endogenous copies to repress their transcription. The characteristic of placentas being evolutionarily very distinct organs between different species has been suggested to result from the co-option of ERV enhancers. Regulatory mutations, instead of mutations in genes that encode for hormones and growth factors, support the known evolution of placental morphology, especially since the majority of hormone and growth factor genes are expressed in response to pregnancy, not during placental development. Researchers studied the regulatory landscape of placental development between the rat and mouse, two closely related species. This was done by mapping all regulatory elements of the rat trophoblast stem cells (TSCs) and comparing them to their orthologs in mouse TSCs. TSCs were observed because they reflect the initial cells that develop in the fetal placenta | Biology | https://en.wikipedia.org/wiki?curid=2311903 | Endogenous retrovirus | 168,680 |
Endogenous retrovirus Regardless of their tangible similarities, enhancer and repressed regions were mostly species-specific. However, most promoter sequences were conserved between mouse and rat. In conclusion to their study, researchers proposed that ERVs influenced species-specific placental evolution through mediation of placental growth, immunosuppression, and cell fusion. Another example of ERVs exploiting cellular mechanisms involves p53, a tumor suppressor gene (TSG). DNA damage and cellular stress induce the p53 pathway, which results in cell apoptosis. Chromatin immunoprecipitation with sequencing showed that thirty percent of all p53-binding sites were located within copies of a few primate-specific ERV families. A study suggested that this benefits retroviruses because p53's mechanism provides a rapid induction of transcription, which leads to the exit of viral RNA from the host cell. The majority of ERVs that occur in vertebrate genomes are ancient, inactivated by mutation, and have reached genetic fixation in their host species. For these reasons, they are extremely unlikely to have negative effects on their hosts except under unusual circumstances. Nevertheless, it is clear from studies in birds and non-human mammal species, including mice, cats and koalas, that younger (i.e., more recently integrated) ERVs can be associated with disease. The number of active ERVs in the genome of mammals is negatively related to their body size, suggesting a contribution to Peto's paradox through cancer pathogenesis | Biology | https://en.wikipedia.org/wiki?curid=2311903 | Endogenous retrovirus | 168,681 |
Endogenous retrovirus This has led researchers to propose a role for ERVs in several forms of human cancer and autoimmune disease, although conclusive evidence is lacking. In humans, ERVs have been proposed to be involved in multiple sclerosis (MS). A specific association between MS and the ERVWE1, or "syncytin", gene, which is derived from an ERV insertion, has been reported, along with the presence of an "MS-associated retrovirus" (MSRV), in patients with the disease. Human ERVs (HERVs) have also been implicated in ALS and addiction. In 2004 it was reported that antibodies to HERVs were found in greater frequency in the sera of people with schizophrenia. Additionally, the cerebrospinal fluid of people with recent-onset schizophrenia contained levels of a retroviral marker, reverse transcriptase, four times higher than in control subjects. Researchers continue to look at a possible link between HERVs and schizophrenia, with the additional possibility of a triggering infection inducing schizophrenia. ERVs have been found to be associated with disease not only through disease-causing relations, but also through immunity. The frequency of ERVs in long terminal repeats (LTRs) likely correlates with viral adaptations to take advantage of immunity signaling pathways that promote viral transcription and replication. A study done in 2016 investigated the benefit of ancient viral DNA integrated into a host through gene regulation networks induced by interferons, a branch of innate immunity | Biology | https://en.wikipedia.org/wiki?curid=2311903 | Endogenous retrovirus | 168,682 |
Endogenous retrovirus These cytokines are first to respond to viral infection and are also important in immunosurveillance for malignant cells. ERVs are predicted to act as cis-regulatory elements, but many of the adaptive consequences of this for certain physiological functions are still unknown. There is data that supports the general role of ERVs in the regulation of the human interferon response, specifically to interferon-gamma (IFNG). For example, interferon-stimulated genes were found to be greatly enriched with ERVs bound by signal transducer and activator of transcription 1 (STAT1) and/or interferon regulatory factor 1 (IRF1) in CD14+ macrophages. HERVs also play various roles shaping the human innate immune response, with some sequences activating the system and others suppressing it. They may also protect from exogenous retroviral infections: the virus-like transcripts can activate pattern recognition receptors, and the proteins can interfere with active retroviruses. A gag protein from HERV-K(HML2) has been shown to mix with HIV Gag, impairing HIV capsid formation as a result. Another idea proposed was that ERVs from the same family played a role in recruiting multiple genes into the same network of regulation. It was found that MER41 elements provided additional redundant regulatory enhancement to the genes located near STAT1 binding sites | Biology | https://en.wikipedia.org/wiki?curid=2311903 | Endogenous retrovirus | 168,683 |
Endogenous retrovirus For humans, porcine endogenous retroviruses (PERVs) pose a concern when using porcine tissues and organs in xenotransplantation, the transplanting of living cells, tissues, and organs from an organism of one species to an organism of a different species. Although pigs are generally the most suitable donors to treat human organ diseases due to practical, financial, safety, and ethical reasons, PERVs previously could not be removed from pigs, due to their viral nature of integrating into the host genome and being passed on to offspring, until 2017, when Dr. George Church's lab removed all 62 retroviruses from the pig genome. The consequences of cross-species transmission remain unexplored and have very dangerous potential. Researchers indicated that infection of human tissues by PERVs is very possible, especially in immunosuppressed individuals. An immunosuppressed condition could potentially permit a more rapid and tenacious replication of viral DNA, and would later on have less difficulty adapting to human-to-human transmission. Although known infectious pathogens present in the donor organ/tissue can be eliminated by breeding pathogen-free herds, unknown retroviruses can be present in the donor. These retroviruses are often latent and asymptomatic in the donor, but can become active in the recipient. Some examples of endogenous viruses that can infect and multiply in human cells are from baboons (BaEV), cats (RD114), and mice. There are three different classes of PERVs: PERV-A, PERV-B, and PERV-C | Biology | https://en.wikipedia.org/wiki?curid=2311903 | Endogenous retrovirus | 168,684 |
Endogenous retrovirus PERV-A and PERV-B are polytropic and can infect human cells in vitro, while PERV-C is ecotropic and does not replicate in human cells. The major differences between the classes are in the receptor binding domain of the "env" protein and the long terminal repeats (LTRs) that influence the replication of each class. PERV-A and PERV-B display LTRs that have repeats in the U3 region. However, PERV-A and PERV-C show repeatless LTRs. Researchers found that PERVs in culture actively adapted to the repeat structure of their LTR in order to match the best replication performance a host cell could perform. At the end of their study, researchers concluded that the repeatless PERV LTR evolved from the repeat-harboring LTR. This was likely to have occurred through insertional mutation and was shown through use of data on the LTR and "env"/Env. It is thought that the generation of repeatless LTRs could be reflective of an adaptation process of the virus, changing from an exogenous to an endogenous lifestyle. A clinical trial study performed in 1999 sampled 160 patients who were treated with different living pig tissues and observed no evidence of a persistent PERV infection in 97% of the patients for whom a sufficient amount of DNA was available for PCR amplification of PERV sequences. This study noted, however, that retrospective studies are limited in their ability to find the true incidence of infection or associated clinical symptoms | Biology | https://en.wikipedia.org/wiki?curid=2311903 | Endogenous retrovirus | 168,685 |
Endogenous retrovirus It suggested using closely monitored prospective trials, which would provide a more complete and detailed evaluation of the possible cross-species PERV transmission and a comparison of the PERV. Human endogenous retroviruses (HERV) comprise a significant part of the human genome, with approximately 98,000 ERV elements and fragments making up 5–8%. According to a study published in 2005, no HERVs capable of replication had been identified; all appeared to be defective, containing major deletions or nonsense mutations. This is because most HERVs are merely traces of original viruses, having first integrated millions of years ago. An analysis of HERV integrations is ongoing as part of the 100,000 Genomes Project. Human endogenous retroviruses were discovered by accident using a couple of different experiments. Human genomic libraries were screened under low-stringency conditions using probes from animal retroviruses, allowing the isolation and characterization of multiple, though defective, proviruses that represented various families. Another experiment depended on oligonucleotides with homology to viral primer binding sites. HERVs are classified based on their homologies to animal retroviruses. Families belonging to Class I are similar in sequence to mammalian "Gammaretroviruses" (Type C) and "Epsilonretroviruses" (Type E). Families belonging to Class II show homology to mammalian "Betaretroviruses" (Type B) and "Deltaretroviruses" (Type D). Families belonging to Class III are similar to foamy viruses | Biology | https://en.wikipedia.org/wiki?curid=2311903 | Endogenous retrovirus | 168,686 |
Endogenous retrovirus For all classes, if homologies appear well conserved in the "gag", "pol", and "env" genes, they are grouped into a superfamily. There are more Class I families known to exist. The families themselves are named in a less uniform manner, with a mixture of naming based on an exogenous retrovirus, the priming tRNA (HERV-W, K), some neighboring gene (HERV-ADP), clone number (HERV-S71), or some amino acid motif (HERV-FRD). A proposed nomenclature aims to clean up the sometimes paraphyletic standards. There are two proposals for how HERVs became fixed in the human genome. The first assumes that sometime during human evolution, exogenous progenitors of HERV inserted themselves into germ line cells and then replicated along with the host's genes, using and exploiting the host's cellular mechanisms. Because of their distinct genomic structure, HERVs were subjected to many rounds of amplification and transposition, which led to a widespread distribution of retroviral DNA. The second hypothesis claims the continuous evolution of retro-elements from more simply structured ancestors. Nevertheless, one family of viruses has been active since the divergence of humans and chimpanzees. This family, termed HERV-K (HML2), makes up less than 1% of HERV elements but is one of the most studied. There are indications it has even been active in the past few hundred thousand years, e.g., some human individuals carry more copies of HML2 than others | Biology | https://en.wikipedia.org/wiki?curid=2311903 | Endogenous retrovirus | 168,687 |
Endogenous retrovirus Traditionally, age estimates of HERVs are performed by comparing the 5' and 3' LTR of a HERV; however, this method is only relevant for full-length HERVs. A recent method, called cross-sectional dating, uses variations within a single LTR to estimate the ages of HERV insertions. This method is more precise in estimating HERV ages and can be used for any HERV insertions. Cross-sectional dating has been used to suggest that two members of HERV-K(HML2), HERV-K106 and HERV-K116, were active in the last 800,000 years and that HERV-K106 may have infected modern humans 150,000 years ago. However, the absence of known infectious members of the HERV-K(HML2) family, and the lack of elements with a full coding potential within the published human genome sequence, suggests to some that the family is less likely to be active at present. In 2006 and 2007, researchers working independently in France and the US recreated functional versions of HERV-K(HML2). MER41.AIM2 is an HERV that regulates the transcription of AIM2 (Absent in Melanoma 2) which encodes for a sensor of foreign cytosolic DNA. This acts as a binding site for AIM2, meaning that it is necessary for the transcription of AIM2. Researchers had shown this by deleting MER41.AIM2 in HeLa cells using CRISPR/Cas9, leading to an undetectable transcript level of AIM2 in modified HeLa cells. The control cells, which still contained the MER41.AIM2 ERV, were observed with normal amounts of AIM2 transcript. In terms of immunity, researchers concluded that MER41 | Biology | https://en.wikipedia.org/wiki?curid=2311903 | Endogenous retrovirus | 168,688 |
Endogenous retrovirus AIM2 is necessary for an inflammatory response to infection. Immunological studies have shown some evidence for T cell immune responses against HERVs in HIV-infected individuals. The hypothesis that HIV induces HERV expression in HIV-infected cells led to the proposal that a vaccine targeting HERV antigens could specifically eliminate HIV-infected cells. The potential advantage of this novel approach is that, by using HERV antigens as surrogate markers of HIV-infected cells, it could circumvent the difficulty inherent in directly targeting notoriously diverse and fast-mutating HIV antigens. There are a few classes of human endogenous retroviruses that still have intact open reading frames. For example, the expression of HERV-K, a biologically active family of HERV, produces proteins found in placenta. Furthermore, the expression of the envelope genes of HERV-W (ERVW-1) and HERV-FRD (ERVFRD-1) produces syncytins, which are important for the generation of the syncytiotrophoblast cell layer during placentogenesis by inducing cell-cell fusion. The HUGO Gene Nomenclature Committee (HGNC) approves gene symbols for transcribed human ERVs. Example: A porcine ERV (PERV) Chinese-born minipig isolate, PERV-A-BM, was sequenced completely, along with different breeds and cell lines, in order to understand its genetic variation and evolution | Biology | https://en.wikipedia.org/wiki?curid=2311903 | Endogenous retrovirus | 168,689 |
Endogenous retrovirus The observed number of nucleotide substitutions among the different genome sequences helped researchers estimate the age at which PERV-A-BM was integrated into its host genome, which was found to be evolutionarily earlier than the European-born pig isolates. This technique is used to find histone marks indicative of promoters and enhancers, which are binding sites for DNA proteins, and repressed regions and trimethylation. DNA methylation has been shown to be vital to maintain silencing of ERVs in mouse somatic cells, while histone marks are vital for the same purpose in embryonic stem cells (ESCs) and early embryogenesis. Because most HERVs have no function, are selectively neutral, and are very abundant in primate genomes, they easily serve as phylogenetic markers for linkage analysis. They can be exploited by comparing the integration site polymorphisms or the evolving, proviral, nucleotide sequences of orthologs. To estimate when integration occurred, researchers used distances from each phylogenetic tree to find the rate of molecular evolution at each particular locus. It is also useful that ERVs are rich in many species' genomes (i.e., plants, insects, mollusks, fish, rodents, domestic pets, and livestock) because their application can be used to answer a variety of phylogenetic questions. This is accomplished by comparing the different HERVs from different evolutionary periods. For example, this study was done for different hominoids, which ranged from humans to apes and to monkeys | Biology | https://en.wikipedia.org/wiki?curid=2311903 | Endogenous retrovirus | 168,690 |
Endogenous retrovirus This is difficult to do with PERV because of the large diversity present. Researchers could analyze individual epigenomes and transcriptomes to study the reactivation of dormant transposable elements through epigenetic release and their potential associations with human disease, as well as to explore the specifics of gene regulatory networks. Little is known about an effective way of overcoming hyperacute rejection (HAR), which follows the activation of complement initiated by xenoreactive antibodies recognizing galactosyl-alpha1-3galactosyl (alpha-Gal) antigens on the donor epithelium. Because retroviruses are able to recombine with each other and with other endogenous DNA sequences, it would be beneficial for gene therapy to explore the potential risks HERVs can cause, if any. Also, this ability of HERVs to recombine can be manipulated for site-directed integration by including HERV sequences in retroviral vectors. Researchers believe that RNA and proteins encoded by HERV genes should continue to be explored for putative function in cell physiology and in pathological conditions. Examining this would help to more deeply define the biological significance of the synthesized proteins. | Biology | https://en.wikipedia.org/wiki?curid=2311903 | Endogenous retrovirus | 168,691 |
Cancer stem cell Cancer stem cells (CSCs) are cancer cells (found within tumors or hematological cancers) that possess characteristics associated with normal stem cells, specifically the ability to give rise to all cell types found in a particular cancer sample. CSCs are therefore tumorigenic (tumor-forming), perhaps in contrast to other non-tumorigenic cancer cells. CSCs may generate tumors through the stem cell processes of self-renewal and differentiation into multiple cell types. Such cells are hypothesized to persist in tumors as a distinct population and cause relapse and metastasis by giving rise to new tumors. Therefore, development of specific therapies targeted at CSCs holds hope for improvement of survival and quality of life of cancer patients, especially for patients with metastatic disease. Existing cancer treatments have mostly been developed based on animal models, where therapies able to promote tumor shrinkage were deemed effective. However, animals do not provide a complete model of human disease. In particular, in mice, whose life spans do not exceed two years, tumor relapse is difficult to study. The efficacy of cancer treatments is, in the initial stages of testing, often measured by the ablation fraction of tumor mass (fractional kill). As CSCs form a small proportion of the tumor, this may not necessarily select for drugs that act specifically on the stem cells | Biology | https://en.wikipedia.org/wiki?curid=2448941 | Cancer stem cell | 169,077 |
Cancer stem cell The theory suggests that conventional chemotherapies kill differentiated or differentiating cells, which form the bulk of the tumor but do not generate new cells. A population of CSCs, which gave rise to it, could remain untouched and cause relapse. Cancer stem cells were first identified by John Dick in acute myeloid leukemia in the late 1990s. Since the early 2000s they have been an intense cancer research focus. The term itself was coined in a highly cited paper in 2001 by biologists Tannishtha Reya, Sean J. Morrison, Michael F. Clarke and Irving Weissman. In different tumor subtypes, cells within the tumor population exhibit functional heterogeneity and tumors are formed from cells with various proliferative and differentiation capacities. This functional heterogeneity among cancer cells has led to the creation of multiple propagation models to account for heterogeneity and differences in tumor-regenerative capacity: the cancer stem cell (CSC) model and the stochastic model. However, certain perspectives maintain that this demarcation is artificial, since both processes act in complementary manners as far as actual tumor populations are concerned. The cancer stem cell model, also known as the hierarchical model, proposes that tumors are hierarchically organized, with CSCs lying at the apex (Fig. 3). | Biology | https://en.wikipedia.org/wiki?curid=2448941 | Cancer stem cell | 169,078 |
Cancer stem cell Within the cancer cell population of a tumor there are cancer stem cells (CSCs) that are tumorigenic and biologically distinct from other subpopulations. They have two defining features: their long-term ability to self-renew and their capacity to differentiate into progeny that is non-tumorigenic but still contributes to the growth of the tumor. This model suggests that only certain subpopulations of cancer stem cells have the ability to drive the progression of cancer, meaning that there are specific (intrinsic) characteristics that can be identified and then targeted to destroy a tumor long-term without the need to battle the whole tumor. In order for a cell to become cancerous, it must undergo a significant number of alterations to its DNA sequence. This cell model suggests these mutations could occur to any cell in the body, resulting in cancer. Essentially, this theory proposes that all cells have the ability to be tumorigenic, making all tumor cells equipotent with the ability to self-renew or differentiate; this leads to tumor heterogeneity, with some cells remaining tumorigenic while others differentiate into non-CSCs. The cell's potential can be influenced by unpredicted genetic or epigenetic factors, resulting in phenotypically diverse cells in both the tumorigenic and non-tumorigenic cells that compose the tumor | Biology | https://en.wikipedia.org/wiki?curid=2448941 | Cancer stem cell | 169,079 |
Cancer stem cell According to the "stochastic model" (or "clonal evolution model"), every cancer cell in a tumor could gain the ability to self-renew and differentiate into the numerous and heterogeneous lineages of cancer cells that comprise a tumor. These mutations could progressively accumulate and enhance the resistance and fitness of cells, allowing them to outcompete other tumor cells; this is better known as the somatic evolution model. The clonal evolution model, which occurs in both the CSC model and stochastic model, postulates that mutant tumor cells with a growth advantage outproliferate others. Cells in the dominant population have a similar potential for initiating tumor growth (Fig. 4). These two models are not mutually exclusive, as CSCs themselves undergo clonal evolution. Thus, secondary, more dominant CSCs may emerge if a mutation confers more aggressive properties (Fig. 5). A study in 2014 argued that the gap between these two controversial models can be bridged by providing an alternative explanation of tumor heterogeneity. It demonstrated a model that includes aspects of both the stochastic and CSC models. The authors examined cancer stem cell plasticity, in which cancer stem cells can transition between non-cancer stem cell (non-CSC) and CSC states in situ, supporting a more stochastic model | Biology | https://en.wikipedia.org/wiki?curid=2448941 | Cancer stem cell | 169,080 |
Cancer stem cell But the existence of both biologically distinct non-CSC and CSC populations supports a more CSC model, proposing that both models may play a vital role in tumor heterogeneity. This model suggests that immunological properties may be important for understanding tumorigenesis and heterogeneity. As such, CSCs can be very rare in some tumors, but some researchers found that a large proportion of tumor cells can initiate tumors if transplanted into severely immunocompromised mice, and thus questioned the relevance of rare CSCs. However, both stem cells and CSCs possess unique immunological properties which render them highly resistant towards immunosurveillance. Thus, only CSCs may be able to seed tumors in patients with functional immunosurveillance, and immune privilege may be a key criterion for identifying CSCs. Furthermore, the model suggests that CSCs may initially be dependent on stem cell niches, and CSCs may function there as a reservoir in which mutations can accumulate over decades unrestricted by the immune system. Clinically overt tumors may grow if: A) CSCs lose their dependence on niche factors (less differentiated tumors), B) their offspring of highly proliferative, yet initially immunogenic normal tumor cells evolve means to escape immunosurveillance or C) the immune system may lose its tumorsuppressive capacity, e.g. due to ageing. The existence of CSCs is under debate, because many studies found no cells with their specific characteristics | Biology | https://en.wikipedia.org/wiki?curid=2448941 | Cancer stem cell | 169,081 |
Cancer stem cell Cancer cells must be capable of continuous proliferation and self-renewal to retain the many mutations required for carcinogenesis and to sustain the growth of a tumor, since differentiated cells (constrained by the Hayflick Limit) cannot divide indefinitely. For therapeutic consideration, if most tumor cells are endowed with stem cell properties, targeting tumor size directly is a valid strategy. If CSCs are a small minority, targeting them may be more effective. Another debate is over the origin of CSCs - whether from dysregulation of normal stem cells or from a more specialized population that acquired the ability to self-renew (which is related to the issue of stem cell plasticity). Confounding this debate is the discovery that many cancer cells demonstrate phenotypic plasticity under therapeutic challenge, altering their transcriptomes to a more stem-like state to escape destruction. The first conclusive evidence for CSCs came in 1997. Bonnet and Dick isolated a subpopulation of leukemia cells that expressed the surface marker CD34, but not CD38. The authors established that the CD34+/CD38- subpopulation is capable of initiating tumors in NOD/SCID mice that were histologically similar to the donor. The first evidence of a solid tumor cancer stem-like cell followed in 2002 with the discovery of a clonogenic, sphere-forming cell isolated and characterized from adult human brain gliomas. Human cortical glial tumors contain neural stem-like cells expressing astroglial and neuronal markers "in vitro" | Biology | https://en.wikipedia.org/wiki?curid=2448941 | Cancer stem cell | 169,082 |
Cancer stem cell Cancer stem cells isolated from adult human gliomas were shown to induce tumours that resembled the parent tumour when grafted into intracranial nude mouse models. In cancer research experiments, tumor cells are sometimes injected into an experimental animal to establish a tumor. Disease progression is then followed in time and novel drugs can be tested for their efficacy. Tumor formation requires thousands or tens of thousands of cells to be introduced. Classically, this was explained by poor methodology (i.e., the tumor cells lose their viability during transfer) or the critical importance of the microenvironment, the particular biochemical surroundings of the injected cells. Supporters of the CSC paradigm argue that only a small fraction of the injected cells, the CSCs, have the potential to generate a tumor. In human acute myeloid leukemia the frequency of these cells is less than 1 in 10,000. Further evidence comes from histology. Many tumors are heterogeneous and contain multiple cell types native to the host organ. Tumour heterogeneity is commonly retained by tumor metastases. This suggests that the cell that produced them had the capacity to generate multiple cell types, a classical hallmark of stem cells. The existence of leukemia stem cells prompted research into other cancers. CSCs have recently been identified in several solid tumors, including: Once the pathways to cancer are hypothesized, it is possible to develop predictive mathematical models, e.g., based on the cell compartment method | Biology | https://en.wikipedia.org/wiki?curid=2448941 | Cancer stem cell | 169,083 |
Cancer stem cell For instance, the growths of abnormal cells can be denoted with specific mutation probabilities. Such a model predicted that repeated insult to mature cells increases the formation of abnormal progeny and the risk of cancer. The clinical efficacy of such models remains unestablished. The origin of CSCs is an active research area. The answer may depend on the tumor type and phenotype. So far the hypothesis that tumors originate from a single "cell of origin" has not been demonstrated using the cancer stem cell model. This is because cancer stem cells are not present in end-stage tumors. Origin hypotheses include mutants in developing stem or progenitor cells, mutants in adult stem cells or adult progenitor cells and mutant, differentiated cells that acquire stem-like attributes. These theories often focus on a tumor's "cell of origin". The "mutation in stem cell niche populations during development" hypothesis claims that these developing stem populations are mutated and then reproduce so that the mutation is shared by many descendants. These daughter cells are much closer to becoming tumors and their numbers increase the chance of a cancerous mutation. Another theory associates adult stem cells (ASC) with tumor formation. This is most often associated with tissues with a high rate of cell turnover (such as the skin or gut). In these tissues, ASCs are candidates because of their frequent cell divisions (compared to most ASCs) in conjunction with the long lifespan of ASCs | Biology | https://en.wikipedia.org/wiki?curid=2448941 | Cancer stem cell | 169,084 |
Cancer stem cell This combination creates the ideal set of circumstances for mutations to accumulate: mutation accumulation is the primary factor that drives cancer initiation. Evidence shows that the association represents an actual phenomenon, although specific cancers have been linked to a specific cause. De-differentiation of mutated cells may create stem cell-like characteristics, suggesting that any cell might become a cancer stem cell. In other words, a fully differentiated cell undergoes mutations or extracellular signals that drive it back to a stem-like state. This concept has been demonstrated most recently in Prostate cancer models, whereby cells undergoing androgen deprivation therapy appear to transiently alter their transcriptome to that of a neural crest stem-like cell, with the invasive and multipotent properties of this class of stem-like cells. The concept of tumor hierarchy claims that a tumor is a heterogeneous population of mutant cells, all of which share some mutations, but vary in specific phenotype. A tumor hosts several types of stem cells, one optimal to the specific environment and other less successful lines. These secondary lines may be more successful in other environments, allowing the tumor to adapt, including adaptation to therapeutic intervention. If correct, this concept impacts cancer stem cell-specific treatment regimes. Such a hierarchy would complicate attempts to pinpoint the origin | Biology | https://en.wikipedia.org/wiki?curid=2448941 | Cancer stem cell | 169,085 |
Cancer stem cell CSCs, now reported in most human tumors, are commonly identified and enriched using strategies for identifying normal stem cells that are similar across studies. These procedures include fluorescence-activated cell sorting (FACS), with antibodies directed at cell-surface markers, and functional approaches including the side population assay or Aldefluor assay. The CSC-enriched result is then implanted, at various doses, in immune-deficient mice to assess its tumor development capacity. This "in vivo" assay is called a limiting dilution assay. The tumor cell subsets that can initiate tumor development at low cell numbers are further tested for self-renewal capacity in serial tumor studies. CSCs can also be identified by efflux of incorporated Hoechst dyes via multidrug resistance (MDR) and ATP-binding cassette (ABC) transporters. Another approach is sphere-forming assays. Many normal stem cells, such as hematopoietic stem cells or stem cells from other tissues, under special culture conditions, form three-dimensional spheres that can differentiate. As with normal stem cells, the CSCs isolated from brain or prostate tumors also have the ability to form anchorage-independent spheres. CSCs have been identified in various solid tumors. Commonly, markers specific for normal stem cells are used for isolating CSCs from solid and hematological tumors | Biology | https://en.wikipedia.org/wiki?curid=2448941 | Cancer stem cell | 169,086 |
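As a brief illustration of how the limiting dilution assay described above is usually interpreted, stem cell frequency is commonly estimated under a single-hit Poisson model in which a dose of n cells fails to initiate a tumor with probability exp(−f·n), where f is the CSC frequency. The sketch below is a minimal, hypothetical maximum-likelihood fit of f: the dose levels, tumor counts and function names are assumptions for illustration only and are not data from any study cited here.

```python
import math

# Hypothetical limiting dilution data: cells injected per mouse -> (tumors formed, mice injected).
# These numbers are made up for illustration only.
data = {100: (0, 6), 1_000: (1, 6), 10_000: (3, 6), 100_000: (6, 6)}

def log_likelihood(f, data):
    """Single-hit Poisson model: P(no tumor | dose n) = exp(-f * n)."""
    ll = 0.0
    for dose, (tumors, mice) in data.items():
        p_take = 1.0 - math.exp(-f * dose)
        ll += tumors * math.log(p_take) + (mice - tumors) * (-f * dose)
    return ll

# Crude grid search over candidate CSC frequencies (1 per 10^2 ... 1 per 10^7 cells).
candidates = [10 ** -e for e in [x / 10 for x in range(20, 71)]]
best_f = max(candidates, key=lambda f: log_likelihood(f, data))
print(f"Estimated CSC frequency: ~1 in {round(1 / best_f):,} cells")
```

With these invented counts the estimate comes out at roughly one CSC per ten thousand cells or so, the same order of magnitude as the acute myeloid leukemia figure quoted earlier, but the numbers themselves carry no biological meaning.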
Cancer stem cell Markers most frequently used for CSC isolation include: CD133 (also known as PROM1), CD44, ALDH1A1, CD34, CD24 and EpCAM (epithelial cell adhesion molecule, also known as epithelial specific antigen, ESA). CD133 (prominin 1) is a five-transmembrane domain glycoprotein expressed on CD34+ stem and progenitor cells, in endothelial precursors and fetal neural stem cells. It has been detected using its glycosylated epitope known as AC133. EpCAM (epithelial cell adhesion molecule, ESA, TROP1) is a homophilic, Ca2+-independent cell adhesion molecule expressed on the basolateral surface of most epithelial cells. CD90 (THY1) is a glycosylphosphatidylinositol glycoprotein anchored in the plasma membrane and involved in signal transduction. It may also mediate adhesion between thymocytes and thymic stroma. CD44 (PGP1) is an adhesion molecule that has pleiotropic roles in cell signaling, migration and homing. It has multiple isoforms, including CD44H, which exhibits high affinity for hyaluronate, and CD44V, which has metastatic properties. CD24 (HSA) is a glycosylated glycosylphosphatidylinositol-anchored adhesion molecule, which has a co-stimulatory role in B and T cells. CD200 (OX-2) is a type 1 membrane glycoprotein, which delivers an inhibitory signal to immune cells including T cells, natural killer cells and macrophages. ALDH is a ubiquitous family of aldehyde dehydrogenase enzymes, which catalyzes the oxidation of aromatic aldehydes to carboxylic acids | Biology | https://en.wikipedia.org/wiki?curid=2448941 | Cancer stem cell | 169,087 |
Cancer stem cell For instance, it has a role in conversion of retinol to retinoic acid, which is essential for survival. The first solid malignancy from which CSCs were isolated and identified was breast cancer and they are the most intensely studied. Breast CSCs have been enriched in CD44+CD24−, SP and ALDH+ subpopulations. Breast CSCs are apparently phenotypically diverse. CSC marker expression in breast cancer cells is apparently heterogeneous and breast CSC populations vary across tumors. Both CD44+CD24− and CD44+CD24+ cell populations are tumor initiating cells; however, CSC are most highly enriched using the marker profile CD44+CD49fhiCD133/2hi. CSCs have been reported in many brain tumors. Stem-like tumor cells have been identified using cell surface markers including CD133, SSEA-1 (stage-specific embryonic antigen-1), EGFR and CD44. The use of CD133 for identification of brain tumor stem-like cells may be problematic because tumorigenic cells are found in both CD133+ and CD133− cells in some gliomas and some CD133+ brain tumor cells may not possess tumor-initiating capacity. CSCs were reported in human colon cancer | Biology | https://en.wikipedia.org/wiki?curid=2448941 | Cancer stem cell | 169,088 |
Cancer stem cell For their identification, cell surface markers such as CD133, CD44 and ABCB5, as well as functional analyses including clonal analysis and the Aldefluor assay, were used. Using CD133 as a positive marker for colon CSCs generated conflicting results. The AC133 epitope, but not the CD133 protein, is specifically expressed in colon CSCs and its expression is lost upon differentiation. In addition, CD44+ colon cancer cells and additional sub-fractionation of the CD44+EpCAM+ cell population with CD166 enhance the success of tumor engraftments. Multiple CSCs have been reported in prostate, lung and many other organs, including liver, pancreas, kidney or ovary. In prostate cancer, the tumor-initiating cells have been identified in the CD44+ cell subset as CD44+α2β1+, TRA-1-60+CD151+CD166+ or ALDH+ cell populations. Putative markers for lung CSCs have been reported, including CD133, ALDH, CD44 and oncofetal protein 5T4. Metastasis is the major cause of tumor lethality. However, not every tumor cell can metastasize. This potential depends on factors that determine growth, angiogenesis, invasion and other basic processes | Biology | https://en.wikipedia.org/wiki?curid=2448941 | Cancer stem cell | 169,089 |
Cancer stem cell In epithelial tumors, the epithelial-mesenchymal transition (EMT) is considered to be a crucial event. EMT and the reverse transition from a mesenchymal to an epithelial phenotype (MET) are involved in embryonic development, which involves disruption of epithelial cell homeostasis and the acquisition of a migratory mesenchymal phenotype. EMT appears to be controlled by canonical pathways such as WNT and transforming growth factor β. EMT's important feature is the loss of membrane E-cadherin in adherens junctions, where β-catenin may play a significant role. Translocation of β-catenin from adherens junctions to the nucleus may lead to a loss of E-cadherin and subsequently to EMT. Nuclear β-catenin apparently can directly, transcriptionally activate EMT-associated target genes, such as the E-cadherin gene repressor SLUG (also known as SNAI2). Properties of the tumor microenvironment, such as hypoxia, can contribute to CSC survival and metastatic potential through stabilization of hypoxia-inducible factors through interactions with ROS (reactive oxygen species). Tumor cells undergoing an EMT may be precursors for metastatic cancer cells, or even metastatic CSCs. In the invasive edge of pancreatic carcinoma, a subset of CD133+CXCR4+ cells (CXCR4 being the receptor for the chemokine CXCL12, also known as the SDF1 ligand) was defined | Biology | https://en.wikipedia.org/wiki?curid=2448941 | Cancer stem cell | 169,090 |
Cancer stem cell These cells exhibited significantly stronger migratory activity than their counterpart CD133+CXCR4− cells, but both showed similar tumor development capacity. Moreover, inhibition of the CXCR4 receptor reduced metastatic potential without altering tumorigenic capacity. In breast cancer, CD44+CD24− cells are detectable in metastatic pleural effusions. By contrast, an increased number of CD24+ cells have been identified in distant metastases in breast cancer patients. It is possible that CD44+CD24− cells initially metastasize and, in the new site, change their phenotype and undergo limited differentiation. The two-phase expression pattern hypothesis proposes two forms of cancer stem cells - stationary (SCS) and mobile (MCS). SCS are embedded in tissue and persist in differentiated areas throughout tumor progression. MCS are located at the tumor-host interface. These cells are apparently derived from SCS through the acquisition of transient EMT. CSCs have implications for cancer therapy, including for disease identification, selective drug targets, prevention of metastasis and intervention strategies. CSCs are inherently more resistant to chemotherapeutic agents. There are five main factors that contribute to this resistance. After chemotherapy treatment, surviving CSCs are able to repopulate the tumor and cause a relapse. Additional treatment targeted at removing CSCs in addition to cancerous somatic cells must be used to prevent this | Biology | https://en.wikipedia.org/wiki?curid=2448941 | Cancer stem cell | 169,091 |
Cancer stem cell Selectively targeting CSCs may allow treatment of aggressive, non-resectable tumors, as well as prevent metastasis and relapse. The hypothesis suggests that upon CSC elimination, cancer could regress due to differentiation and/or cell death. The fraction of tumor cells that are CSCs and therefore need to be eliminated is unclear. Studies looked for specific markers and for proteomic and genomic tumor signatures that distinguish CSCs from others. In 2009, scientists identified the compound salinomycin, which selectively reduces the proportion of breast CSCs in mice by more than 100-fold relative to Paclitaxel, a commonly used chemotherapeutic agent. Some types of cancer cells can survive treatment with salinomycin through autophagy, whereby cells use acidic organelles such as lysosomes to degrade and recycle certain types of proteins. The use of autophagy inhibitors can kill cancer stem cells that survive by autophagy. The cell surface receptor interleukin-3 receptor-alpha (CD123) is overexpressed on CD34+CD38- leukemic stem cells (LSCs) in acute myelogenous leukemia (AML) but not on normal CD34+CD38- bone marrow cells. Treating AML-engrafted NOD/SCID mice with a CD123-specific monoclonal antibody impaired LSCs homing to the bone marrow and reduced overall AML cell repopulation including the proportion of LSCs in secondary mouse recipients. A 2015 study packaged nanoparticles with miR-34a and ammonium bicarbonate and delivered them to prostate CSCs in a mouse model | Biology | https://en.wikipedia.org/wiki?curid=2448941 | Cancer stem cell | 169,092 |
Cancer stem cell Then they irradiated the area with near-infrared laser light. This caused the nanoparticles to swell to three or more times their size, bursting the endosomes and dispersing the RNA in the cell. miR-34a can lower the levels of CD44. A 2018 study identified inhibitors of the ALDH1A family of enzymes and showed that they could selectively deplete putative cancer stem cells in several ovarian cancer cell lines. The design of new drugs for targeting CSCs requires understanding the cellular mechanisms that regulate cell proliferation. The first advances in this area were made with hematopoietic stem cells (HSCs) and their transformed counterparts in leukemia, the disease for which the origin of CSCs is best understood. Stem cells of many organs share the same cellular pathways as leukemia-derived HSCs. A normal stem cell may be transformed into a CSC through dysregulation of the proliferation and differentiation pathways controlling it or by inducing oncoprotein activity. The Polycomb group transcriptional repressor Bmi-1 was discovered as a common oncogene activated in lymphoma and later shown to regulate HSCs. The role of Bmi-1 has been illustrated in neural stem cells. The pathway appears to be active in CSCs of pediatric brain tumors. The Notch pathway plays a role in controlling stem cell proliferation for several cell types including hematopoietic, neural and mammary SCs. Components of this pathway have been proposed to act as oncogenes in mammary and other tumors | Biology | https://en.wikipedia.org/wiki?curid=2448941 | Cancer stem cell | 169,093 |
Cancer stem cell A branch of the Notch signaling pathway that involves the transcription factor Hes3 regulates a number of cultured cells with CSC characteristics obtained from glioblastoma patients. These developmental pathways are SC regulators. Both Sonic hedgehog (SHH) and Wnt pathways are commonly hyperactivated in tumors and are necessary to sustain tumor growth. However, the Gli transcription factors that are regulated by SHH take their name from gliomas, where they are highly expressed. A degree of crosstalk exists between the two pathways and they are commonly activated together. By contrast, in colon cancer hedgehog signalling appears to antagonise Wnt. Sonic hedgehog blockers are available, such as cyclopamine. A water-soluble cyclopamine may be more effective in cancer treatment. DMAPT, a water-soluble derivative of parthenolide, induces oxidative stress and inhibits NF-κB signaling for AML (leukemia) and possibly myeloma and prostate cancer. Telomerase is a study subject in CSC physiology. GRN163L (Imetelstat) was recently started in trials to target myeloma stem cells. Wnt signaling can become independent of regular stimuli, through mutations in downstream oncogenes and tumor suppressor genes that become permanently activated even though the normal receptor has not received a signal. β-catenin binds to transcription factors such as the protein TCF4 and in combination the molecules activate the necessary genes. LF3 strongly inhibits this binding "in vitro," in cell lines and reduced tumor growth in mouse models | Biology | https://en.wikipedia.org/wiki?curid=2448941 | Cancer stem cell | 169,094 |
Cancer stem cell It prevented cancer cell replication and reduced the cells' ability to migrate, all without affecting healthy cells. No cancer stem cells remained after treatment. The discovery was the product of "rational drug design", involving AlphaScreens and ELISA technologies. | Biology | https://en.wikipedia.org/wiki?curid=2448941 | Cancer stem cell | 169,095 |
Marine Life Information Network The Marine Life Information Network (MarLIN) is an information system for marine biodiversity for Great Britain and Ireland. MarLIN was established in 1998 by the Marine Biological Association together with the environmental protection agencies and academic institutions in Britain and Ireland. The MarLIN data access programme has now become the DASSH Marine Data Archive Centre. DASSH is built on the existing extensive data and dissemination skills of MarLIN, the library and information services of the National Marine Biological Library (NMBL) and the MBA's historical role in marine science. | Biology | https://en.wikipedia.org/wiki?curid=22536663 | Marine Life Information Network | 170,886 |
Biological data visualization Biological data visualization is a branch of bioinformatics concerned with the application of computer graphics, scientific visualization, and information visualization to different areas of the life sciences. This includes visualization of sequences, genomes, alignments, phylogenies, macromolecular structures, systems biology, microscopy, and magnetic resonance imaging data. Software tools used for visualizing biological data range from simple, standalone programs to complex, integrated systems. Today we are experiencing a rapid growth in volume and diversity of biological data, presenting an increasing challenge for biologists. A key step in understanding and learning from these data is visualization. Thus, there has been a corresponding increase in the number and diversity of systems for visualizing biological data. An emerging trend is the blurring of boundaries between the visualization of 3D structures at atomic resolution, visualization of larger complexes by cryo-electron microscopy, and visualization of the location of proteins and complexes within whole cells and tissues. A second emerging trend is an increase in the availability and importance of time-resolved data from systems biology, electron microscopy and cell and tissue imaging. In contrast, visualization of trajectories has long been a prominent part of molecular dynamics | Biology | https://en.wikipedia.org/wiki?curid=23265863 | Biological data visualization | 171,312 |
Biological data visualization Finally, as datasets are increasing in size, complexity, and interconnectedness, biological visualization systems are improving in usability, data integration and standardization. Many software systems are available for visualizing biological data. The links below lead to lists of such systems, grouped by application area. | Biology | https://en.wikipedia.org/wiki?curid=23265863 | Biological data visualization | 171,313 |
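As a concrete illustration of the kind of sequence-level visualization such systems provide, the short sketch below plots GC content along a DNA sequence with matplotlib. It is a minimal, generic example: the sequence, window size and plotting choices are assumptions for illustration and do not correspond to any particular tool referred to by the article.

```python
import random
import matplotlib.pyplot as plt

random.seed(0)
# Toy DNA sequence; in practice this would be read from a FASTA file.
seq = "".join(random.choice("ACGT") for _ in range(5_000))

def gc_content(window: str) -> float:
    """Fraction of G or C bases in a window of sequence."""
    return (window.count("G") + window.count("C")) / len(window)

win = 200  # sliding-window size in base pairs
positions = range(0, len(seq) - win, 50)
gc = [gc_content(seq[p:p + win]) for p in positions]

plt.plot(list(positions), gc)
plt.xlabel("Position (bp)")
plt.ylabel(f"GC fraction ({win} bp window)")
plt.title("Sliding-window GC content")
plt.show()
```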
Alternative stable state In ecology, the theory of alternative stable states (sometimes termed alternate stable states or alternative stable equilibria) predicts that ecosystems can exist under multiple "states" (sets of unique biotic and abiotic conditions). These alternative states are non-transitory and therefore considered stable over ecologically-relevant timescales. Ecosystems may transition from one stable state to another, in what is known as a state shift (sometimes termed a phase shift or regime shift), when perturbed. Due to ecological feedbacks, ecosystems display resistance to state shifts and therefore tend to remain in one state unless perturbations are large enough. Multiple states may persist under equal environmental conditions, a phenomenon known as hysteresis. Alternative stable state theory suggests that discrete states are separated by ecological thresholds, in contrast to ecosystems which change smoothly and continuously along an environmental gradient. Alternative stable state theory was first proposed by Richard Lewontin (1969), but other early key authors include Holling (1973), Sutherland (1974), May (1977), and Scheffer et al. (2001). In the broadest sense, alternative stable state theory proposes that a change in ecosystem conditions can result in an abrupt shift in the state of the ecosystem, such as a change in population (Barange, M. et al. 2008) or community composition. Ecosystems can persist in states that are considered stable (i.e., can exist for relatively long periods of time) | Biology | https://en.wikipedia.org/wiki?curid=23737622 | Alternative stable state | 171,536 |
Alternative stable state Intermediate states are considered unstable and are, therefore, transitory. Because ecosystems are resistant to state shifts, significant perturbations are usually required to overcome ecological thresholds and cause shifts from one stable state to another. The resistance to state shifts is known as "resilience" (Holling 1973). State shifts are often illustrated heuristically by the ball-in-cup model (Holling, C.S. et al. 1995, "Biodiversity in the functioning of ecosystems: an ecological synthesis", in "Biodiversity Loss: Ecological and Economic Issues" (Perrings, C.A. et al., eds), pp. 44–83, Cambridge University Press). A ball, representing the ecosystem, exists on a surface where any point along the surface represents a possible state. In the simplest model, the landscape consists of two valleys separated by a hill. When the ball is in a valley, or a "domain of attraction", it exists in a stable state and must be perturbed to move from this state. In the absence of perturbations, the ball will always roll downhill and therefore will tend to stay in the valley (or stable state). State shifts can be viewed from two different viewpoints, the "community perspective" and the "ecosystem perspective". The ball can only move between stable states in two ways: (1) moving the ball or (2) altering the landscape. The community perspective is analogous to moving the ball, while the ecosystem perspective is analogous to altering the landscape. These two viewpoints consider the same phenomenon with different mechanisms | Biology | https://en.wikipedia.org/wiki?curid=23737622 | Alternative stable state | 171,537 |
Alternative stable state The community perspective considers ecosystem variables (which change relatively quickly and are subject to feedbacks from the system), whereas the ecosystem perspective considers ecosystem parameters (which change relatively slowly and operate independently of the system). The community context considers a relatively constant environment in which multiple stable states are accessible to populations or communities. This definition is an extension of stability analysis of populations (e.g., Lewontin 1969; Sutherland 1973) and communities (e.g., Drake 1991; Law and Morton 1993). The ecosystem context focuses on the effect of exogenic "drivers" on communities or ecosystems (e.g., May 1977; Scheffer et al. 2001; Dent et al. 2002). Both definitions are explored within this article. Ecosystems can shift from one state to another via a significant perturbation directly to state variables. State variables are quantities that change quickly (in ecologically-relevant time scales) in response to feedbacks from the system (i.e., they are dependent on system feedbacks), such as population densities. This perspective requires that different states can exist simultaneously under equal environmental conditions, since the ball moves only in response to a state variable change. For example, consider a very simple system with three microbial species. It may be possible for the system to exist under different community structure regimes depending on initial conditions (e.g | Biology | https://en.wikipedia.org/wiki?curid=23737622 | Alternative stable state | 171,538 |
Alternative stable state , population densities or spatial arrangement of individuals) (Kerr et al. 2002). Perhaps under certain initial densities or spatial configurations, one species dominates over all others, while under different initial conditions all species can mutually coexist. Because the different species interact, changes in populations affect one another synergistically to determine community structure. Under both states the environmental conditions are identical. Because the states have resilience, following small perturbations (e.g., changes to population size) the community returns to the same configuration while large perturbations may induce a shift to another configuration. The community perspective requires the existence of alternative stable states (i.e., more than one valley) before the perturbation, since the landscape is not changing. Because communities have some level of resistance to change, they will stay in their domain of attraction (or stable state) until the perturbation is large enough to force the system into another state. In the ball-and-cup model, this would be the energy required to push the ball up and over a hill, where it would fall downhill into a different valley. It is also possible to cause state shifts in another context, by indirectly affecting state variables. This is known as the ecosystem perspective. This perspective requires a change in environmental parameters that affect the behavior of state variables | Biology | https://en.wikipedia.org/wiki?curid=23737622 | Alternative stable state | 171,539 |
Alternative stable state For example, birth rate, death rate, migration, and density-dependent predation indirectly alter the ecosystem state by changing population density (a state variable). Ecosystem parameters are quantities that are unresponsive (or respond very slowly) to feedbacks from the system (i.e., they are independent of system feedbacks). The stable state landscape is changed by environmental drivers, which may result in a change in the quantity of stable states and the relationship between states. By the ecosystem perspective, the landscape of the ecological states is changed, which forces a change in the ecosystem state. Changing the landscape can modify the number, location, and resilience of stable states, as well as the unstable intermediate states. By this view, the topography in the ball-and-cup model is not static, as it is in the community perspective. This is a fundamental difference between the two perspectives. Although the mechanisms of community and ecosystem perspectives are different, the empirical evidence required for documentation of alternative stable states is the same. In addition, state shifts are often a combination of internal processes and external forces (Scheffer et al. 2001). For example, consider a stream-fed lake in which the pristine state is dominated by benthic vegetation. When upstream construction releases soils into the stream, the system becomes turbid | Biology | https://en.wikipedia.org/wiki?curid=23737622 | Alternative stable state | 171,540 |
Alternative stable state As a result, the benthic vegetation cannot receive light and declines, increasing nutrient availability and allowing phytoplankton to dominate. In this state shift scenario the state variables changing are the populations of benthic vegetation and phytoplankton, and the ecosystem parameters are turbidity and nutrient levels. So whether the mechanism is framed in terms of variables or parameters is a matter of formulation (Beisner et al. 2003). Hysteresis is an important concept in alternative stable state theory. In this ecological context, hysteresis refers to the existence of different stable states under the same variables or parameters. Hysteresis can be explained by "path-dependency", in which the equilibrium point for the trajectory "A → B" is different from that for "B → A". In other words, it matters which way the ball is moving across the landscape. Some ecologists (e.g., Scheffer et al. 2001) argue that hysteresis is a prerequisite for the existence of alternative stable states. Others (e.g., Beisner et al. 2003) claim that this is not so; although shifts often involve hysteresis, a system can show alternative stable states yet have equal paths for "A → B" and "B → A". Hysteresis can occur via changes to variables or parameters. When variables are changed the ball is pushed from one domain of attraction to another, yet the same push from the other direction cannot return the ball to the original domain of attraction | Biology | https://en.wikipedia.org/wiki?curid=23737622 | Alternative stable state | 171,541 |
Alternative stable state When parameters are changed a modification to the landscape results in a state shift, but reversing the modification does not result in a reciprocal shift. A real-world example of hysteresis is helpful to illustrate the concept. Coral reef systems can dramatically shift from pristine coral-dominated systems to degraded algae-dominated systems when populations grazing on algae decline. The 1983 crash of sea urchin populations in Caribbean reef systems released algae from top-down (herbivory) control, allowing them to overgrow corals and resulting in a shift to a degraded state. When urchins rebounded, the high (pre-crash) coral cover levels did not return, indicating hysteresis (Mumby et al. 2007). In some cases, state shifts under hysteresis may be irreversible. For example, tropical cloud forests require high moisture levels, provided by clouds that are intercepted by the canopy (via condensation). When deforested, moisture delivery ceases. Therefore, reforestation is often unsuccessful because conditions are too dry to allow the trees to grow. Even in cases where there is no obvious barrier to recovery, alternative states can be remarkably persistent: an experimental grassland heavily fertilized for 10 years lost much of its biodiversity, and was still in this state 20 years later. By their very nature, basins of attraction display resilience. Ecosystems are resistant to state shifts – they will only undergo shifts under substantial perturbations – but some states are more resilient than others | Biology | https://en.wikipedia.org/wiki?curid=23737622 | Alternative stable state | 171,542 |
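The parameter-driven hysteresis described above can be illustrated numerically with a classic grazing-type model of the form dN/dt = rN(1 − N/K) − cN²/(N² + h²), often associated with May (1977), whom the article cites among the theory's early authors. The sketch below simply scans for stable equilibria at different grazing pressures c to show that, over an intermediate range of c, a low and a high vegetation state coexist; all parameter values are arbitrary choices for illustration and are not taken from the article.

```python
import numpy as np

r, K, h = 1.0, 10.0, 1.0  # arbitrary growth rate, carrying capacity, half-saturation

def dNdt(N, c):
    """Logistic growth minus a saturating grazing term (May 1977-style model)."""
    return r * N * (1 - N / K) - c * N**2 / (N**2 + h**2)

def stable_equilibria(c, grid=np.linspace(1e-3, 12, 4000)):
    """Return grid points where dN/dt changes sign from + to - (stable equilibria)."""
    f = dNdt(grid, c)
    sign_change = (f[:-1] > 0) & (f[1:] < 0)
    return grid[1:][sign_change]

for c in [1.0, 2.0, 3.0]:
    eq = [round(float(v), 2) for v in stable_equilibria(c)]
    print(f"grazing pressure c={c}: stable states near N = {eq}")
# At intermediate grazing pressure, two stable states (low and high vegetation) coexist,
# so the state actually reached depends on history -- the signature of hysteresis.
```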
Alternative stable state In the ball-and-cup model, a valley with steep sides has greater resilience than a shallow valley, since it would take more force to push the ball up the hill and out of the valley. Resilience can change in stable states when environmental parameters are shifted. Often, humans influence stable states by reducing the resilience of basins of attraction. There are at least three ways in which anthropogenic forces reduce resilience (Folke et al. 2004): (1) Decreasing diversity and functional groups, often by top-down effects (e.g., overfishing); (2) altering the physico-chemical environment (e.g., climate change, pollution, fertilization); or (3) modifying disturbance regimes to which organisms are adapted (e.g., bottom trawling, coral mining, etc.). When the resilience is decreased, ecosystems can be pushed into alternative, and often less-desirable, stable states with only minor perturbations. When hysteresis effects are present, the return to a more-desirable state is sometimes impossible or impractical (given management constraints). Shifts to less-desirable states often entail a loss of ecosystem service and function, and have been documented in an array of terrestrial, marine, and freshwater environments (reviewed in Folke et al. 2004). Most work on alternative stable states has been theoretical, using mathematical models and simulations to test ecological hypotheses. Other work has been conducted using empirical evidence from surveying, historical records, or comparisons across spatial scales | Biology | https://en.wikipedia.org/wiki?curid=23737622 | Alternative stable state | 171,543 |
Alternative stable state There has been a lack of direct, manipulative experimental tests for alternative stable states. This is especially true for studies outside of controlled laboratory conditions, where state shifts have been documented for cultures of microorganisms. Verifying the existence of alternative stable states carries profound implications for ecosystem management. If stable states exist, gradual changes in environmental factors may have little effect on a system until a threshold is reached, at which point a catastrophic state shift may occur. Understanding the nature of these thresholds will help inform the design of monitoring programs, ecosystem restoration, and other management decisions. Managers are particularly interested in the potential of hysteresis, since it may be difficult to recover from a state shift (Beisner et al. 2003). The mechanisms of feedback loops that maintain stable states are important to understand if we hope to effectively manage an ecosystem with alternative stable states. Empirical evidence for the existence of alternative stable states is vital to advancing the idea beyond theory. Schröder et al. (2005) reviewed the current ecological literature for alternative stable states and found 35 direct experiments, of which only 21 were deemed valid. Of these, 62% (14) showed evidence for and 38% (8) showed no evidence for alternative stable states. However, the Schröder et al. (2005) analysis required evidence of hysteresis, which is not necessarily a prerequisite for alternative stable states | Biology | https://en.wikipedia.org/wiki?curid=23737622 | Alternative stable state | 171,544 |
Alternative stable state Other authors (e.g., Scheffer et al. 2001; Folke et al. 2004) have had less-stringent requirements for the documentation of alternative stable states. State shifts via the community perspective have been induced experimentally by the addition or removal of predators, such as in Paine's (1966) work on keystone predators (i.e., predators with disproportionate influence on community structure) in the intertidal zone (although this claim is refuted by Schröder et al. 2005). Also, Beisner et al. (2003) suggest that commercially exploited fish populations can be forced between alternative stable states by fishing pressure due to Allee effects that operate at very low population sizes. Once a fish population falls below a certain threshold, it will inevitably go extinct when low population densities make replacement of adults impossible due to, for example, the inability to find mates or density-dependent mortality. Since populations cannot return from extinction, this is an example of an irreversible state shift. Although alternative stable state theory is still in its infancy, empirical evidence has been collected from a variety of biomes. | Biology | https://en.wikipedia.org/wiki?curid=23737622 | Alternative stable state | 171,545 |
Reflectin Reflectins are a family of intrinsically disordered proteins evolved by a certain number of cephalopods including "Euprymna scolopes" and "Doryteuthis opalescens" to produce iridescent camouflage and signaling. The recently identified protein family is enriched in aromatic and sulfur-containing amino acids, and is utilized by certain cephalopods to refract incident light in their environment. It is possible that reflectins are beta barrel type proteins. It is present in the iridophores and leucophores of cephalopods. There is evidence that the reflectin gene appeared in cephalopods due to a horizontal gene transfer with the marine bioluminescent bacterium, "Aliivibrio fischeri". | Biology | https://en.wikipedia.org/wiki?curid=16774174 | Reflectin | 171,813 |
Procymidone is a pesticide. It is often used for killing unwanted ferns and nettles, and as a dicarboximide fungicide for killing fungi, for example as a seed dressing, pre-harvest spray or post-harvest dip of lupins, grapes, stone fruit and strawberries. It is a known endocrine disruptor (androgen receptor antagonist) which interferes with the sexual differentiation of male rats. It is considered to be a poison. | Biology | https://en.wikipedia.org/wiki?curid=17705397 | Procymidone | 172,236 |
GLIMMER In bioinformatics, GLIMMER (Gene Locator and Interpolated Markov ModelER) is used to find genes in prokaryotic DNA. "It is effective at finding genes in bacteria, archaea, viruses, typically finding 98-99% of all relatively long protein coding genes". GLIMMER was the first system that used the interpolated Markov model to identify coding regions. The software is open source and is maintained by Steven Salzberg, Art Delcher, and their colleagues at the Center for Computational Biology at Johns Hopkins University. The original algorithms and software were designed by Art Delcher, Simon Kasif and Steven Salzberg and applied to bacterial genome annotation in collaboration with Owen White. The first version, GLIMMER 1.0, was released in 1998 and was published in the paper "Microbial gene identification using interpolated Markov models". Markov models were used to identify microbial genes in GLIMMER 1.0. GLIMMER considers local sequence composition dependencies, which makes it more flexible and more powerful than a fixed-order Markov model. A comparison between the interpolated Markov model used by GLIMMER and a fifth-order Markov model was made in the paper "Microbial gene identification using interpolated Markov models": "GLIMMER's algorithm found 1680 genes out of 1717 annotated genes in Haemophilus influenzae where the fifth-order Markov model found 1574 genes. GLIMMER found 209 additional genes which were not included in the 1717 annotated genes where the fifth-order Markov model found 104 genes." The second version of GLIMMER is 2.0. | Biology | https://en.wikipedia.org/wiki?curid=1854663 | GLIMMER | 173,019 |
GLIMMER 2.0 was released in 1999 and was published in the paper "Improved microbial gene identification with GLIMMER". This paper describes significant technical improvements, such as using an interpolated context model instead of an interpolated Markov model and resolving overlapping genes, which improve the accuracy of GLIMMER. Interpolated context models are used instead of the interpolated Markov model, which gives the flexibility to select any base. In an interpolated Markov model, the probability distribution of a base is determined from the immediately preceding bases. If an immediately preceding base is irrelevant to the amino acid translation, the interpolated Markov model still considers that base when determining the probability of a given base, whereas the interpolated context model used in GLIMMER 2.0 can ignore irrelevant bases. False positive predictions were allowed to increase in GLIMMER 2.0 in order to reduce the number of false negative predictions. Overlapping genes are also resolved in GLIMMER 2.0. Various comparisons between versions 1.0 and 2.0 were made in the paper "Improved microbial gene identification with GLIMMER", which show improvement in the later version: "The sensitivity of GLIMMER 1.0 ranges from 98.4 to 99.7% with an average of 99.1%, whereas GLIMMER 2.0 has a sensitivity range from 98.6 to 99.8% with an average of 99.3%. GLIMMER 2.0 is very effective in finding genes of high density. The parasite Trypanosoma brucei, responsible for causing African sleeping sickness, is identified using GLIMMER 2.0." The third version of GLIMMER is 3.0. | Biology | https://en.wikipedia.org/wiki?curid=1854663 | GLIMMER | 173,020 |
GLIMMER 3.0 was released in 2007 and was published in the paper "Identifying bacterial genes and endosymbiont DNA with Glimmer". This paper describes several major changes made to the system, including improved methods to identify coding regions and the start codon. Scoring of ORFs in GLIMMER 3.0 is done in reverse order, i.e., starting from the stop codon and moving back towards the start codon. Reverse scanning helps to identify the coding portion of the gene, which is contained in the context window of the IMM, more accurately. GLIMMER 3.0 also improves the generated training set data by comparing the long ORFs with the universal amino acid distribution of widely disparate bacterial genomes. "GLIMMER 3.0 has an average long-ORF output of 57% for various organisms whereas GLIMMER 2.0 has an average long-ORF output of 39%." GLIMMER 3.0 reduces the rate of false positive predictions, which had been increased in GLIMMER 2.0 to reduce the number of false negative predictions. "GLIMMER 3.0 has a start-site prediction accuracy of 99.5% for 3'5' matches whereas GLIMMER 2.0 has 99.1% for 3'5' matches. GLIMMER 3.0 uses a new algorithm for scanning coding regions, a new start site detection module, and an architecture which integrates all gene predictions across an entire genome." Minimum description length: the GLIMMER project helped introduce and popularize the use of variable-length models in computational biology and bioinformatics, which have subsequently been applied to numerous problems such as protein classification. | Biology | https://en.wikipedia.org/wiki?curid=1854663 | GLIMMER | 173,021 |
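To make the reverse ORF scanning idea above concrete, the sketch below enumerates candidate ORFs on one strand by locating each stop codon and walking backwards to possible start codons, so that every candidate shares its 3' end (the stop) and differs only in its 5' start. This is a simplified, assumed illustration of the scanning order only; it ignores the reverse strand, alternative genetic codes and GLIMMER's actual IMM scoring.

```python
STOPS = {"TAA", "TAG", "TGA"}
STARTS = {"ATG", "GTG", "TTG"}  # common prokaryotic start codons

def candidate_orfs(seq, min_len=90):
    """Yield (start, stop_end) pairs, scanning backwards from each stop codon."""
    for frame in range(3):
        last_stop_end = frame  # a gene cannot extend past the previous in-frame stop
        for i in range(frame, len(seq) - 2, 3):
            if seq[i:i + 3] in STOPS:
                # Walk back from the stop towards possible starts in the same frame.
                for j in range(i - 3, last_stop_end - 1, -3):
                    if seq[j:j + 3] in STARTS and (i + 3 - j) >= min_len:
                        yield (j, i + 3)
                last_stop_end = i + 3

demo = "ATGAAACCCGGGTTTAAACCCATGTTTGGGTAA" * 4  # toy sequence, illustration only
print(list(candidate_orfs(demo, min_len=12))[:5])
```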
GLIMMER Variable-length modeling was originally pioneered by information theorists and subsequently ingeniously applied and popularized in data compression (e.g. Ziv-Lempel compression). Prediction and compression are intimately linked by minimum description length principles. The basic idea is to create a dictionary of frequent words (motifs in biological sequences). The intuition is that the frequently occurring motifs are likely to be the most predictive and informative. In GLIMMER, the interpolated model is a mixture model of the probabilities of these relatively common motifs. Similarly to the development of HMMs in computational biology, the authors of GLIMMER were conceptually influenced by the previous application of another variant of interpolated Markov models to speech recognition by researchers such as Fred Jelinek (IBM) and Eric Ristad (Princeton). The learning algorithm in GLIMMER is different from these earlier approaches. GLIMMER can be downloaded from the Glimmer home page (requires a C++ compiler). Alternatively, an online version is hosted by NCBI. The steps explained above describe the basic functionality of GLIMMER. There are various improvements made to GLIMMER and some of them are described in the following sub-sections. The GLIMMER system consists of two programs. The first program, called build-imm, takes an input set of sequences and outputs the interpolated Markov model as follows. The probability of each base, i.e., A, C, G, T, is computed for all k-mers for 0 ≤ k ≤ 8. Then, for each k-mer, a weight is computed. | Biology | https://en.wikipedia.org/wiki?curid=1854663 | GLIMMER | 173,022 |
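A minimal sketch of the counting step performed by build-imm is shown below: it tabulates, for every context of length 0 ≤ k ≤ 8, how often each next base follows that context in the training sequences. The function and variable names are assumptions for illustration; the real build-imm is a C++ program with additional logic for computing the interpolation weights.

```python
from collections import defaultdict

MAX_ORDER = 8  # GLIMMER interpolates over context lengths 0..8

def count_contexts(training_seqs, max_order=MAX_ORDER):
    """counts[k][context][base] = number of times `base` follows `context` (length k)."""
    counts = [defaultdict(lambda: defaultdict(int)) for _ in range(max_order + 1)]
    for seq in training_seqs:
        for x, base in enumerate(seq):
            for k in range(0, min(max_order, x) + 1):
                context = seq[x - k:x]
                counts[k][context][base] += 1
    return counts

# Toy training data (real training sets are long ORFs extracted from the genome).
counts = count_contexts(["ATGCGATCGATCGATAGCTAGCTAGGATCC", "ATGAAAGGGTTTCCC"])
print(dict(counts[2]["AT"]))  # e.g. how often each base follows the 2-mer "AT"
```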
GLIMMER The probability of a new sequence is computed as follows: $P(S\mid M) = \prod_{x=1}^{n} \mathrm{IMM}_{8}(S_{x})$, where $n$ is the length of the sequence and $S_{x}$ is the oligomer at position $x$. $\mathrm{IMM}_{k}(S_{x})$, the $k$th-order interpolated Markov model score, is computed as $\mathrm{IMM}_{k}(S_{x}) = \lambda_{k}(S_{x-1})\,P_{k}(S_{x}) + [1-\lambda_{k}(S_{x-1})]\,\mathrm{IMM}_{k-1}(S_{x})$, "where $\lambda_{k}(S_{x-1})$ is the weight of the k-mer at position x-1 in the sequence S and $P_{k}(S_{x})$ is the estimate obtained from the training data of the probability of the base located at position x in the $k$th-order model." The probability of base $s_{x}$ given the $i$ previous bases is estimated from the training-data counts $f$ as $P(s_{x}\mid s_{x-i},\ldots,s_{x-1}) = f(s_{x-i},\ldots,s_{x-1},s_{x}) \big/ \sum_{b\in\{A,C,G,T\}} f(s_{x-i},\ldots,s_{x-1},b)$. "The value of $\lambda_{i}(S_{x-1})$ associated with $P_{i}(S_{x})$ can be regarded as a measure of confidence in the accuracy of this value as an estimate of the true probability. GLIMMER uses two criteria to determine $\lambda_{i}(S_{x-1})$. The first of these is simple frequency of occurrence: if the number of occurrences of the context string $S_{x-1}$ in the training data exceeds a specific threshold value, then $\lambda_{i}(S_{x-1})$ is set to 1.0. The current default value for the threshold is 400, which gives 95% confidence. When there are insufficient sample occurrences of a context string, build-imm employs additional criteria to determine the value of $\lambda_{i}(S_{x-1})$. For a given context string $S_{x-1}$ of length $i$, build-imm compares the observed frequencies of the following base, $f(S_{x-1},A)$, $f(S_{x-1},C)$, $f(S_{x-1},G)$ and $f(S_{x-1},T)$, with the previously calculated interpolated Markov model probabilities using the next shorter context, $\mathrm{IMM}_{i-1}(A)$, $\mathrm{IMM}_{i-1}(C)$, $\mathrm{IMM}_{i-1}(G)$ and $\mathrm{IMM}_{i-1}(T)$." | Biology | https://en.wikipedia.org/wiki?curid=1854663 | GLIMMER | 173,023 |
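The sketch below illustrates the interpolation formula above in simplified form: the score for each base is a weighted mix of the longest-context estimate and the next-shorter-context score, with the weight set to 1.0 when the context was seen often enough. The toy counts, the reduced order (3 instead of 8), the pseudocounts and the fallback weight rule are all assumptions for illustration; in particular the weight here is not GLIMMER's chi-square-based rule. The frequency threshold of 400 is the default quoted above.

```python
# Minimal interpolated Markov model scoring, assuming context counts like those
# produced by the counting sketch above. Reduced to order 3 for brevity.
from collections import defaultdict

BASES = "ACGT"
THRESHOLD = 400  # default frequency threshold quoted in the text

def make_counts(seqs, max_order=3):
    counts = [defaultdict(lambda: defaultdict(int)) for _ in range(max_order + 1)]
    for seq in seqs:
        for x, base in enumerate(seq):
            for k in range(0, min(max_order, x) + 1):
                counts[k][seq[x - k:x]][base] += 1
    return counts

def p_k(counts, k, context, base):
    """Relative-frequency estimate P_k(base | context), with a small pseudocount."""
    row = counts[k][context]
    total = sum(row[b] + 1 for b in BASES)
    return (row[base] + 1) / total

def imm_score(counts, k, context, base):
    """IMM_k = lambda * P_k + (1 - lambda) * IMM_{k-1}; lambda = 1 if the context is frequent."""
    if k == 0:
        return p_k(counts, 0, "", base)
    ctx = context[-k:]
    n_obs = sum(counts[k][ctx][b] for b in BASES)
    lam = 1.0 if n_obs >= THRESHOLD else n_obs / THRESHOLD  # simplified weight, not GLIMMER's chi-square rule
    return lam * p_k(counts, k, ctx, base) + (1 - lam) * imm_score(counts, k - 1, context, base)

counts = make_counts(["ATGCGATCGATCGATAGCTAGCTAGGATCC" * 20])
print(round(imm_score(counts, 3, "GAT", "C"), 3))
```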
GLIMMER Using a $\chi^{2}$ test, build-imm determines how likely it is that the four observed frequencies are consistent with the IMM values from the next shorter context. The second program, called glimmer, then uses this IMM to identify putative genes in an entire genome. It identifies all open reading frames which score higher than a threshold and checks for overlapping genes. Resolving overlapping genes is explained in the next sub-section. The equations and explanation of the terms used above are taken from the paper "Microbial gene identification using interpolated Markov models". In GLIMMER 1.0, when two genes A and B overlap, the overlap region is scored. If A is longer than B, and if A scores higher on the overlap region, and if moving B's start site will not resolve the overlap, then B is rejected. GLIMMER 2.0 provided a better solution for resolving overlaps. In GLIMMER 2.0, when two potential genes A and B overlap, the overlap region is scored. Supposing gene A scores higher, four different orientations are considered. In the first case, moving the start sites does not remove the overlap: if A is significantly longer than B, then B is rejected, or else both A and B are called genes, with a doubtful overlap. In the second case, moving B can resolve the overlap, and A and B can be called non-overlapping genes, but if B is significantly shorter than A, then B is rejected. In the third case, moving A can resolve the overlap; A is only moved if the overlap is a small fraction of A, or else B is rejected. In the fourth case, both A and B can be moved. | Biology | https://en.wikipedia.org/wiki?curid=1854663 | GLIMMER | 173,024 |
GLIMMER We first move the start of B until the overlap region scores higher for B. Then we move the start of A until it scores higher. Then B again, and so on, until either the overlap is eliminated or no further moves can be made. The above example has been taken from the paper "Identifying bacterial genes and endosymbiont DNA with Glimmer". The ribosome binding site (RBS) signal can be used to find the true start site position. GLIMMER results are passed as input to the RBSfinder program to predict ribosome binding sites. GLIMMER 3.0 integrates the RBSfinder program into the gene prediction function itself. The ELPH software (which was determined to be highly effective at identifying RBSs in the paper) is used for identifying RBSs and is available from its website. A Gibbs sampling algorithm is used to identify the shared motif in any set of sequences. These shared motif sequences and their length are given as input to ELPH. ELPH then computes the position weight matrix (PWM), which is used by GLIMMER 3 to score any potential RBS found by RBSfinder. The above process is done when we have a substantial number of training genes. If there is an inadequate number of training genes, GLIMMER 3 can bootstrap itself to generate a set of gene predictions which can be used as input to ELPH. ELPH then computes a PWM, and this PWM can again be used on the same set of genes to get more accurate results for start sites. This process can be repeated for many iterations to obtain more consistent PWM and gene prediction results | Biology | https://en.wikipedia.org/wiki?curid=1854663 | GLIMMER | 173,025 |
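A position weight matrix like the one ELPH produces can be used to score candidate RBS sites, as shown in the simplified sketch below. The matrix values, motif width and log-odds scoring scheme are illustrative assumptions, not the actual output of ELPH or the exact scoring used inside GLIMMER 3.

```python
import math

BASES = "ACGT"
BACKGROUND = 0.25  # assume uniform background base composition

# Toy 6-column position frequency matrix for a Shine-Dalgarno-like motif (AGGAGG).
# Each row: probability of A, C, G, T at that motif position (made-up numbers).
PFM = [
    [0.70, 0.10, 0.10, 0.10],
    [0.10, 0.10, 0.70, 0.10],
    [0.10, 0.10, 0.70, 0.10],
    [0.70, 0.10, 0.10, 0.10],
    [0.10, 0.10, 0.70, 0.10],
    [0.10, 0.10, 0.70, 0.10],
]

def pwm_score(window: str) -> float:
    """Log-odds score of a window against the motif model vs. uniform background."""
    return sum(math.log2(PFM[i][BASES.index(b)] / BACKGROUND) for i, b in enumerate(window))

def best_rbs(upstream: str, width: int = 6):
    """Return (offset, score) of the best-scoring window in the upstream region."""
    scores = [(i, pwm_score(upstream[i:i + width])) for i in range(len(upstream) - width + 1)]
    return max(scores, key=lambda t: t[1])

print(best_rbs("TTCTAGGAGGTTACA"))  # toy upstream region of a candidate start codon
```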
GLIMMER Glimmer supports genome annotation efforts on a wide range of bacterial, archaeal, and viral species. In a large-scale reannotation effort at the DNA Data Bank of Japan (DDBJ, which mirrors GenBank), Kosuge "et al." (2006) examined the gene finding methods used for 183 genomes. They reported that of these projects, Glimmer was the gene finder for 49%, followed by GeneMark with 12%, with other algorithms used in 3% or fewer of the projects. (They also reported that 33% of genomes used "other" programs, which in many cases meant that they could not identify the method. Excluding those cases, Glimmer was used for 73% of the genomes for which the methods could be unambiguously identified.) Glimmer was used by the DDBJ to re-annotate all bacterial genomes in the International Nucleotide Sequence Databases. It is also being used by this group to annotate viruses. Glimmer is part of the bacterial annotation pipeline at the National Center for Biotechnology Information (NCBI), which also maintains a web server for Glimmer, as do sites in Germany and Canada. According to Google Scholar, as of early 2011 the original Glimmer article (Salzberg et al., 1998) had been cited 581 times, and the Glimmer 2.0 article (Delcher et al., 1999) had been cited 950 times. | Biology | https://en.wikipedia.org/wiki?curid=1854663 | GLIMMER | 173,026 |
Remote control animal Remote control animals are animals that are controlled remotely by humans. Some applications require electrodes to be implanted in the animal's nervous system connected to a receiver which is usually carried on the animal's back. The animals are controlled by the use of radio signals. The electrodes do not move the animal directly, as if controlling a robot; rather, they signal a direction or action desired by the human operator and then stimulate the animal's reward centres if the animal complies. These are sometimes called bio-robots or robo-animals. They can be considered to be cyborgs as they combine electronic devices with an organic life form. Because of the surgery required, and the moral and ethical issues involved, there has been criticism aimed at the use of remote control animals, especially regarding animal welfare and animal rights. A similar, non-invasive application has been reported which stimulates the brain with ultrasound to control the animal. Some applications (used primarily for dogs) use vibrations or sound to control the movements of the animals. Several species of animals have been successfully controlled remotely. These include moths, beetles, cockroaches, rats, dogfish sharks, mice and pigeons. Remote control animals can be directed and used as working animals for search and rescue operations or various other uses. Several studies have examined the remote control of rats using micro-electrodes implanted into their brains, which rely on stimulating the reward centre of the rat | Biology | https://en.wikipedia.org/wiki?curid=2515883 | Remote control animal | 173,231 |
Remote control animal Three electrodes are implanted; two in the ventral posterolateral nucleus of the thalamus which conveys facial sensory information from the left and right whiskers, and a third in the medial forebrain bundle which is involved in the reward process of the rat. This third electrode is used to give a rewarding electrical stimulus to the brain when the rat makes the correct move to the left or right. During training, the operator stimulates the left or right electrode of the rat making it "feel" a touch to the corresponding set of whiskers, as though it had come in contact with an obstacle. If the rat then makes the correct response, the operator rewards the rat by stimulating the third electrode. In 2002, a team of scientists at the State University of New York remotely controlled rats from a laptop up to 500 m away. The rats could be instructed to turn left or right, climb trees and ladders, navigate piles of rubble, and jump from different heights. They could even be commanded into brightly lit areas, which rats usually avoid. It has been suggested that the rats could be used to carry cameras to people trapped in disaster zones. In 2013, researchers reported the development of a radio-telemetry system to remotely control free-roaming rats with a range of 200 m. The backpack worn by the rat includes the mainboard and an FM transmitter-receiver, which can generate biphasic microcurrent pulses | Biology | https://en.wikipedia.org/wiki?curid=2515883 | Remote control animal | 173,232 |
Remote control animal All components in the system are commercially available and are fabricated from surface mount devices to reduce the size (25 x 15 x 2 mm) and weight (10 g with battery). Concerns have been raised about the ethics of such studies. Even one of the pioneers in this area of study, Sanjiv Talwar, said "There's going to have to be a wide debate to see whether this is acceptable or not" and "There are some ethical issues here which I can't deny." Elsewhere he was quoted as saying "The idea sounds a little creepy." Some oppose the idea of placing living creatures under direct human command. "It's appalling, and yet another example of how the human species instrumentalises other species," says Gill Langley of the Dr Hadwen Trust based in Hertfordshire (UK), which funds alternatives to animal-based research. Gary Francione, an expert in animal welfare law at Rutgers University School of Law, says "The animal is no longer functioning as an animal," as the rat is operating under someone's control. And the issue goes beyond whether or not the stimulations are compelling or rewarding the rat to act. "There's got to be a level of discomfort in implanting these electrodes," he says, which may be difficult to justify. Talwar stated that the animal's "native intelligence" can stop it from performing some directives but with enough stimulation, this hesitation can sometimes be overcome, but occasionally cannot | Biology | https://en.wikipedia.org/wiki?curid=2515883 | Remote control animal | 173,233 |
Remote control animal Researchers at Harvard University have created a brain-to-brain interface (BBI) between a human and a Sprague-Dawley rat. Simply by thinking the appropriate thought, the BBI allows the human to control the rat's tail. The human wears an EEG-based brain-to-computer interface (BCI), while the anesthetised rat is equipped with a focused ultrasound (FUS) computer-to-brain interface (CBI). FUS is a technology that allows the researchers to excite a specific region of neurons in the rat's brain using an ultrasound signal (350 kHz ultrasound frequency, tone burst duration of 0.5 ms, pulse repetition frequency of 1 kHz, given for 300 ms duration). The main advantage of FUS is that, unlike most brain-stimulation techniques, it is non-invasive. Whenever the human looks at a specific pattern (strobe light flicker) on a computer screen, the BCI communicates a command to the rat's CBI, which causes ultrasound to be beamed into the region of the rat's motor cortex responsible for tail movement. The researchers report that the human BCI has an accuracy of 94%, and that it generally takes around 1.5 s from the human looking at the screen to movement of the rat's tail. Another system that non-invasively controls rats uses ultrasonic, epidermal and LED photic stimulators on the back. The system receives commands to deliver specified electrical stimulations to the hearing, pain and visual senses of the rat respectively. The three stimuli work in groups for the rat navigation | Biology | https://en.wikipedia.org/wiki?curid=2515883 | Remote control animal | 173,234 |
Remote control animal Other researchers have dispensed with human remote control of rats and instead use a General Regression Neural Network algorithm to analyse and model the control operations performed by human handlers. Dogs are often used in disaster relief, at crime scenes and on the battlefield, but it's not always easy for them to hear the commands of their handlers. A command module which contains a microprocessor, wireless radio, GPS receiver and an attitude and heading reference system (essentially a gyroscope) can be fitted to dogs. The command module delivers vibration or sound commands (sent by the handler over the radio) to the dog to guide it in a certain direction or to perform certain actions. The overall success rate of the control system is 86.6%. Researchers responsible for developing remote control of a pigeon using brain implants conducted a similar successful experiment on mice in 2005. In 1967, Franz Huber pioneered electrical stimulation to the brain of insects and showed that mushroom body stimulation elicits complex behaviours, including the inhibition of locomotion. The US-based company Backyard Brains released the "RoboRoach", a remote controlled cockroach kit that they refer to as "The world's first commercially available cyborg". The project started as a University of Michigan biomedical engineering student senior design project in 2010 and was launched as an available beta product on 25 February 2011 | Biology | https://en.wikipedia.org/wiki?curid=2515883 | Remote control animal | 173,235 |
Remote control animal The RoboRoach was officially released into production via a TED talk at the TED Global conference and via the crowdsourcing website Kickstarter in 2013. The kit allows students to use microstimulation to momentarily control the movements of a walking cockroach (left and right) using a Bluetooth-enabled smartphone as the controller. The RoboRoach was the first kit available to the general public for the remote control of an animal and was funded by the United States' National Institute of Mental Health as a device to serve as a teaching aid to promote an interest in neuroscience. This funding was due to the similarities between the RoboRoach microstimulation and the microstimulation used in the treatments of Parkinson's disease (deep brain stimulation) and deafness (cochlear implants) in humans. Several animal welfare organizations including the RSPCA and PETA have expressed concerns about the ethics and welfare of animals in this project. Another group at North Carolina State University has developed a remote control cockroach. Researchers at NCSU have programmed a path for cockroaches to follow while tracking their location with an Xbox Kinect. The system automatically adjusted the cockroach's movements to ensure it stayed on the prescribed path. In 2009, remote control of the flight movements of the "Cotinus texana" and the much larger "Mecynorrhina torquata" beetles was achieved during experiments funded by the Defence Advanced Research Projects Agency (DARPA) | Biology | https://en.wikipedia.org/wiki?curid=2515883 | Remote control animal | 173,236 |
Remote control animal The weight of the electronics and battery meant that only Mecynorrhina was strong enough to fly freely under radio control. A specific series of pulses sent to the optic lobes of the insect encouraged it to take flight. The average length of flights was just 45 seconds, although one lasted for more than 30 minutes. A single pulse caused the beetle to land again. Stimulation of basalar flight muscles allowed the controller to direct the insect left or right, although this was successful on only 75% of stimulations. After each maneuver, the beetles quickly righted themselves and continued flying parallel to the ground. In 2015, researchers were able to fine-tune beetle steering in flight by changing the pulse train applied to the wing-folding muscle. Recently, scientists from Nanyang Technological University, Singapore, have demonstrated graded turning and backward walking in a small darkling beetle (Zophobas morio), which is 2 cm to 2.5 cm long and weighs only 1 g including the electronic backpack and battery. It has been suggested the beetles could be used for search and rescue missions; however, it has been noted that currently available batteries, solar cells and piezoelectrics that harvest energy from movement cannot provide enough power to run the electrodes and radio transmitters for very long. Work using "Drosophila" has dispensed with stimulating electrodes and developed a 3-part remote control system that evokes action potentials in pre-specified "Drosophila" neurons using a laser beam | Biology | https://en.wikipedia.org/wiki?curid=2515883 | Remote control animal | 173,237 |
Remote control animal The central component of the remote control system is a ligand-gated ion channel gated by ATP. When ATP is applied, uptake of external calcium is induced and action potentials are generated. The remaining two parts of the remote control system are chemically caged ATP, which is injected into the central nervous system through the fly's simple eye, and laser light capable of uncaging the injected ATP. The giant fibre system in insects consists of a pair of large interneurons in the brain which can excite the insect's flight and jump muscles. A 200 ms pulse of laser light elicited jumping, wing flapping, or other flight movements in 60%–80% of the flies. Although this frequency is lower than that observed with direct electrical stimulation of the giant fibre system, it is higher than that elicited by natural stimuli, such as a light-off stimulus. Spiny dogfish sharks have been remotely controlled by implanting electrodes deep in the shark's brain and connecting them to a remote control device outside the tank. When an electric current is passed through the wire, it stimulates the shark's sense of smell and the animal turns, just as it would move toward blood in the ocean. Stronger electrical signals, mimicking stronger smells, cause the shark to turn more sharply. One study is funded by a $600,000 grant from the Defense Advanced Research Projects Agency (DARPA). It has been suggested that such sharks could search hostile waters with sensors that detect explosives, or cameras that record intelligence photographs | Biology | https://en.wikipedia.org/wiki?curid=2515883 | Remote control animal | 173,238 |
Remote control animal Outside the military, similar sensors could detect oil spills or gather data on the behaviour of sharks in their natural habitat. Scientists working with remote-controlled sharks admit they are not sure exactly which neurons they are stimulating, and therefore they cannot always control the shark's direction reliably. The sharks only respond after some training, and some sharks do not respond at all. The research has prompted protests from bloggers who allude to remote-controlled humans or horror films featuring maniacal cyborg sharks on a feeding frenzy. An alternative technique was to use small gadgets attached to the shark's nose that released squid juice on demand. South Korean researchers have remotely controlled the movements of a turtle using a completely non-invasive steering system. Red-eared terrapins ("Trachemys scripta elegans") were made to follow a specific path by manipulating the turtles' natural obstacle-avoidance behaviour. If these turtles detect something blocking their path in one direction, they move to avoid it. The researchers attached a black half-cylinder to the turtle. This "visor" was positioned around the turtle's rear end but could be pivoted to the left or right, using a microcontroller and a servo motor, to partially block the turtle's vision on one side. This made the turtle believe there was an obstacle it needed to avoid on that side and thereby encouraged the turtle to move in the other direction | Biology | https://en.wikipedia.org/wiki?curid=2515883 | Remote control animal | 173,239 |
Remote control animal Some animals have had parts of their bodies remotely controlled, rather than their entire bodies. Researchers in China stimulated the mesencephalon of geckos ("G. gecko") via stainless-steel microelectrodes and observed the geckos' responses during stimulation. Locomotion responses such as spinal bending and limb movements could be elicited at different depths of the mesencephalon. Stimulation of the periaqueductal gray area elicited ipsilateral spinal bending, while stimulation of the ventral tegmental area elicited contralateral spinal bending. In 2007, researchers at east China's Shandong University of Science and Technology implanted microelectrodes in the brain of a pigeon so they could remotely control it to fly right or left, or up or down. Remote-controlled animals are considered to have several potential uses, replacing the need for humans in some dangerous situations. Their range of application is further widened if they are equipped with additional electronic devices. Small creatures fitted with cameras and other sensors have been proposed as being useful when searching for survivors after a building has collapsed, with cockroaches or rats being small and manoeuvrable enough to go under rubble. There have been a number of suggested military uses of remote-controlled animals, particularly in the area of surveillance. Remote-controlled dogfish sharks have been likened to the studies into the use of military dolphins | Biology | https://en.wikipedia.org/wiki?curid=2515883 | Remote control animal | 173,240 |
Remote control animal It has also been proposed that remote-controlled rats could be used for the clearing of land mines. Other suggested fields of application include pest control, the mapping of underground areas, and the study of animal behaviour. Development of robots capable of performing the same actions as controlled animals is often technologically difficult and cost-prohibitive. Flight, in particular, is very difficult to replicate in a robot with an acceptable payload and flight duration. Harnessing insects and using their natural flying ability gives significant improvements in performance. The availability of "inexpensive, organic substitutes" therefore allows for the development of small, controllable robots that are otherwise currently unavailable. Some animals are remotely controlled not by being directed to move left or right, but by being prevented from moving forward or having their behaviour modified in other ways. Shock collars deliver electrical shocks of varying intensity and duration to the neck or other areas of a dog's body via a radio-controlled electronic device incorporated into a dog collar. Some collar models also include a tone or vibration setting, as an alternative to or in conjunction with the shock. Shock collars are now readily available and have been used in a range of applications, including behavioural modification, obedience training, and pet containment, as well as in military, police and service training | Biology | https://en.wikipedia.org/wiki?curid=2515883 | Remote control animal | 173,241 |
Remote control animal While similar systems are available for other animals, the most common are the collars designed for domestic dogs. The use of shock collars is controversial, and scientific evidence for their safety and efficacy is mixed. A few countries have enacted bans or controls on their use. Some animal welfare organizations warn against their use or actively support a ban on their use or sale; others want restrictions placed on their sale. Some professional dog trainers and their organizations oppose their use, while others support it. Public opinion on their use, and on calls for bans, is likewise mixed. In 2007, it was reported that scientists at the Commonwealth Scientific and Industrial Research Organisation had developed a prototype "invisible fence" using the Global Positioning System (GPS) in a project nicknamed Bovines Without Borders. The system uses battery-powered collars that emit a sound to warn cattle when they are approaching a virtual boundary. If a cow wanders too near, the collar emits a warning noise; if it continues, the cow receives a 250-milliwatt electric shock. The boundaries are drawn by GPS and exist only as a line on a computer; there are no wires or fixed transmitters at all. The cattle took less than an hour to learn to back off when they heard the warning noise. The scientists indicated that commercial units were up to 10 years away. Another type of invisible fence uses a buried wire that sends radio signals to activate shock collars worn by animals that are "fenced" in | Biology | https://en.wikipedia.org/wiki?curid=2515883 | Remote control animal | 173,242 |
Remote control animal The system works with three signals. The first is visual: white plastic flags spaced at intervals around the perimeter of the fenced-in area. The second is audible: the collar emits a sound when the animal wearing it approaches the buried cable. The third is an electric shock, indicating that the animal has reached the fence. Other invisible fences are wireless. Rather than using a buried wire, they emit a radio signal from a central unit and activate when the animal travels beyond a certain radius from the unit. | Biology | https://en.wikipedia.org/wiki?curid=2515883 | Remote control animal | 173,243 |
Tucson Bird Count The Tucson Bird Count (TBC) is a community-based program that monitors bird populations in and around the Tucson, Arizona, United States metropolitan area. With nearly 1,000 sites monitored annually, the TBC is among the largest urban biological monitoring programs in the world. Each spring, TBC participants collect data on bird abundance and distribution at hundreds of point-count locations arrayed across the Tucson basin. The TBC is an example of citizen science, drawing on the combined efforts of hundreds of volunteers. So that data are of suitable quality for scientific analysis and decision-making, all TBC volunteers are skilled birdwatchers; many are also professional field guides or biologists. TBC methods are similar to those employed by the North American Breeding Bird Survey, although the TBC uses more closely spaced sites (one site per 1-km square) over a smaller total area (approximately 1,000 km²). For full details of TBC methods, see the TBC web page at tucsonbirds.org or Turner (2003). The TBC's spatially systematic monitoring is complemented by a TBC park monitoring program that surveys parks, watercourses, or other areas of particular interest multiple times throughout the year. Uses of data include monitoring the status of the Tucson-area bird community over time, finding the areas and land-use practices that are succeeding at sustaining native birds, and investigating the ecology of birds in human-dominated landscapes | Biology | https://en.wikipedia.org/wiki?curid=2580529 | Tucson Bird Count | 173,401 |
Tucson Bird Count results have led to scientific publications, informed Tucson-area planning, and contributed to a variety of projects, from locating populations of imperiled species to estimating risk to humans from West Nile virus. The TBC and several associated research projects are examples of reconciliation ecology, in that they investigate how native species can be sustained in and around the places people live, work, and play. Researchers have also used TBC data to explore the extent to which urban humans are separated from nature (Turner et al. 2004). Recently, the city of Ottawa, Canada, has initiated an urban bird survey that is largely modeled on the Tucson Bird Count. The Ottawa Breeding Bird Count will conduct its inaugural season in 2007. By collaborating among cities, urban surveys like the TBC and the Ottawa Breeding Bird Count will help researchers discover ways to create habitat for biodiversity in the places where people live and work. The TBC began in spring 2001 and is ongoing. As of summer 2005, the TBC had recorded 192,000 individual birds belonging to 212 distinct species. About 115 of these species are known or suspected to breed in the Tucson area; the remainder are migrants (either to higher latitudes, or to higher elevations in nearby mountain ranges such as the Catalinas) or vagrants. Because the data are entered directly by participants into the web site, results are publicly available on-line with little delay after observations are made | Biology | https://en.wikipedia.org/wiki?curid=2580529 | Tucson Bird Count | 173,402 |
Tucson Bird Count Among the project's more basic, yet striking results are distribution maps for species in the Tucson area. Many species show strong patterns with respect to development intensity, presence of various habitats, or other factors. For example, the non-native rock pigeon, common in urban areas worldwide, is generally restricted to Tucson's urban core. In contrast, Gambel's quail, a characteristic species of Sonoran Desert upland habitats, shows the inverse pattern, common near Tucson's periphery yet absent from most of the more heavily developed central portion of the city. | Biology | https://en.wikipedia.org/wiki?curid=2580529 | Tucson Bird Count | 173,403 |
Survival analysis is a branch of statistics for analyzing the expected duration of time until one or more events happen, such as death in biological organisms and failure in mechanical systems. This topic is called reliability theory or reliability analysis in engineering, duration analysis or duration modelling in economics, and event history analysis in sociology. Survival analysis attempts to answer questions such as: what is the proportion of a population which will survive past a certain time? Of those that survive, at what rate will they die or fail? Can multiple causes of death or failure be taken into account? How do particular circumstances or characteristics increase or decrease the probability of survival? To answer such questions, it is necessary to define "lifetime". In the case of biological survival, death is unambiguous, but for mechanical reliability, failure may not be well-defined, for there may well be mechanical systems in which failure is partial, a matter of degree, or not otherwise localized in time. Even in biological problems, some events (for example, heart attack or other organ failure) may have the same ambiguity. The theory outlined below assumes well-defined events at specific times; other cases may be better treated by models which explicitly account for ambiguous events | Biology | https://en.wikipedia.org/wiki?curid=419259 | Survival analysis | 174,664 |
Survival analysis More generally, survival analysis involves the modelling of time-to-event data; in this context, death or failure is considered an "event" in the survival analysis literature – traditionally only a single event occurs for each subject, after which the organism or mechanism is dead or broken. "Recurring event" or "repeated event" models relax that assumption. The study of recurring events is relevant in systems reliability, and in many areas of social sciences and medical research. Survival analysis is used in several ways, including estimating and plotting survival curves, comparing survival between groups, and modelling the effect of covariates on survival time. Standard terms used in survival analyses include the survival time, the event, censoring, the survival function, and the hazard function, all of which are described below. This example uses the acute myelogenous leukemia survival data set "aml" from the "survival" package in R. The data set is from Miller (1997), and the question is whether the standard course of chemotherapy should be extended ('maintained') for additional cycles. The aml data set sorted by survival time is shown in the box. The last observation (11), at 161 weeks, is censored. Censoring indicates that the patient did not have an event (no recurrence of aml cancer). Another subject, observation 3, was censored at 13 weeks (indicated by status=0). This subject was in the study for only 13 weeks, and the aml cancer did not recur during those 13 weeks. It is possible that this patient was enrolled near the end of the study, so that they could be observed for only 13 weeks. It is also possible that the patient was enrolled early in the study, but was lost to follow-up or withdrew from the study | Biology | https://en.wikipedia.org/wiki?curid=419259 | Survival analysis | 174,665 |
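A minimal R sketch of loading and inspecting the aml data and its censoring pattern follows; it assumes only that the "survival" package, which ships the aml data set, is installed.

```r
# Minimal sketch: inspect the aml data set and its censoring pattern.
# Assumes the "survival" package (which ships the aml data) is installed.
library(survival)

# Sort by survival time; status = 1 marks an observed recurrence,
# status = 0 marks a censored observation.
aml_sorted <- aml[order(aml$time), ]
print(aml_sorted)

# The censored subjects, e.g. those censored at 13 and 161 weeks.
subset(aml_sorted, status == 0)
```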
Survival analysis The table shows that other subjects were censored at 16, 28, and 45 weeks (observations 17, 6, and 9 with status=0). The remaining subjects all experienced events (recurrence of aml cancer) while in the study. The question of interest is whether recurrence occurs later in maintained patients than in non-maintained patients. The survival function S(t) is the probability that a subject survives longer than time t. S(t) is theoretically a smooth curve, but it is usually estimated using the Kaplan-Meier (KM) curve. The graph shows the KM plot for the aml data. A life table summarizes survival data in terms of the number of events and the proportion surviving at each event time point. The life table for the aml data, created using R, is shown; its columns record the event times, the number of subjects at risk, the number of events, and the estimated proportion surviving. The log-rank test compares the survival times of two or more groups. This example uses a log-rank test for a difference in survival in the maintained versus non-maintained treatment groups in the aml data. The graph shows KM plots for the aml data broken out by treatment group, which is indicated by the variable "x" in the data. The null hypothesis for a log-rank test is that the groups have the same survival. The expected number of subjects surviving at each time point in each group is adjusted for the number of subjects at risk in the groups at each event time | Biology | https://en.wikipedia.org/wiki?curid=419259 | Survival analysis | 174,666 |
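Before turning to the formal test, the KM estimate, the life table, and the group-wise KM curves described above can be produced with the survival package; the sketch below assumes the same aml data set.

```r
# Sketch: Kaplan-Meier estimate and life table for the aml data.
library(survival)

# Overall KM curve, ignoring treatment group.
km_fit <- survfit(Surv(time, status) ~ 1, data = aml)
summary(km_fit)          # life table: time, number at risk, events, survival
plot(km_fit, xlab = "Time (weeks)", ylab = "Proportion surviving")

# KM curves broken out by treatment group (variable x).
km_by_group <- survfit(Surv(time, status) ~ x, data = aml)
plot(km_by_group, lty = 1:2, xlab = "Time (weeks)", ylab = "Proportion surviving")
legend("topright", legend = names(km_by_group$strata), lty = 1:2)
```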
Survival analysis The log-rank test determines if the observed number of events in each group is significantly different from the expected number. The formal test is based on a chi-squared statistic. When the log-rank statistic is large, it is evidence for a difference in the survival times between the groups. The log-rank statistic approximately has a chi-squared distribution with one degree of freedom, and the p-value is calculated using the chi-squared distribution. For the example data, the log-rank test for a difference in survival gives a p-value of 0.0653, indicating that the treatment groups do not differ significantly in survival, assuming an alpha level of 0.05. The sample size of 23 subjects is modest, so there is little power to detect differences between the treatment groups. The chi-squared test is based on an asymptotic approximation, so the p-value should be regarded with caution for small sample sizes. Kaplan-Meier curves and log-rank tests are most useful when the predictor variable is categorical (e.g., drug vs. placebo), or takes a small number of values (e.g., drug doses 0, 20, 50, and 100 mg/day) that can be treated as categorical. The log-rank test and KM curves do not work easily with quantitative predictors such as gene expression, white blood count, or age. For quantitative predictor variables, an alternative method is Cox proportional hazards regression analysis. Cox PH models also work with categorical predictor variables, which are encoded as {0,1} indicator or dummy variables | Biology | https://en.wikipedia.org/wiki?curid=419259 | Survival analysis | 174,667 |
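The log-rank comparison described above can be run directly with survdiff(); a sketch, again assuming the aml data from the survival package:

```r
# Sketch: log-rank test for maintained vs non-maintained groups.
library(survival)

# The reported chi-squared statistic has one degree of freedom;
# the p-value (about 0.065 for these data) is compared with alpha = 0.05.
survdiff(Surv(time, status) ~ x, data = aml)
```

The same comparison is reproduced by the score test of a Cox model, as discussed next.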
Survival analysis The log-rank test is a special case of a Cox PH analysis, and can be performed using Cox PH software. This example uses the melanoma data set from Dalgaard, Chapter 12. Data are in the R package ISwR. The Cox proportional hazards regression using R gives the results shown in the box. The Cox regression results are interpreted as follows. The summary output gives upper and lower 95% confidence intervals for the hazard ratio: lower 95% bound = 1.15; upper 95% bound = 3.26. Finally, the output gives p-values for three alternative tests for overall significance of the model: the likelihood ratio test, the Wald test, and the score (log-rank) test. These three tests are asymptotically equivalent. For large enough N, they will give similar results. For small N, they may differ somewhat. The last row, "Score (logrank) test", is the result for the log-rank test, with p=0.011, the same result as the log-rank test itself, because the log-rank test is a special case of a Cox PH regression. The likelihood ratio test has better behavior for small sample sizes, so it is generally preferred. The Cox model extends the log-rank test by allowing the inclusion of additional covariates. This example uses the melanoma data set, where the predictor variables include a continuous covariate, the thickness of the tumor (variable name = "thick"). In the histograms, the thickness values do not look normally distributed. Regression models, including the Cox model, generally give more reliable results with normally distributed variables. For this example, a log transform is used | Biology | https://en.wikipedia.org/wiki?curid=419259 | Survival analysis | 174,668 |
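A sketch of the univariate Cox fit for sex, as described above, is shown below; it assumes the ISwR data set is named melanom with columns days, status (1 = death from melanoma), sex, and thick (tumor thickness), as in Dalgaard. The model with log tumor thickness follows in the next sketch.

```r
# Sketch: Cox proportional hazards regression for the melanoma data.
# Assumes the ISwR "melanom" data set with columns days, status
# (1 = death from melanoma), sex and thick, as in Dalgaard.
library(survival)
library(ISwR)
data(melanom)

cox_sex <- coxph(Surv(days, status == 1) ~ sex, data = melanom)
summary(cox_sex)  # coefficient, hazard ratio with 95% CI, and the three overall tests
```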
Survival analysis The log of the thickness of the tumor looks to be more normally distributed, so the Cox models will use log thickness. The Cox PH analysis gives the results in the box. The p-values for all three overall tests (likelihood, Wald, and score) are significant, indicating that the model is significant. The p-value for log(thick) is 6.9e-07, with a hazard ratio HR = exp(coef) = 2.18, indicating a strong relationship between the thickness of the tumor and increased risk of death. By contrast, the p-value for sex is now p=0.088. The hazard ratio HR = exp(coef) = 1.58, with a 95% confidence interval of 0.934 to 2.68. Because the confidence interval for HR includes 1, these results indicate that sex makes a smaller contribution to the difference in the HR after controlling for the thickness of the tumor, and only trends toward significance. Examination of graphs of log(thickness) by sex and a t-test of log(thickness) by sex both indicate that there is a significant difference between men and women in the thickness of the tumor when they first see the clinician. The Cox model assumes that the hazards are proportional. The proportional hazards assumption may be tested using the R function cox.zph(). A p-value less than 0.05 indicates that the hazards are not proportional. For the melanoma data, p=0.222, indicating that the hazards are, at least approximately, proportional. Additional tests and graphs for examining a Cox model are described in the textbooks cited | Biology | https://en.wikipedia.org/wiki?curid=419259 | Survival analysis | 174,669 |
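A sketch of the two-covariate model and the proportional hazards check just described, under the same assumptions about the melanom data set:

```r
# Sketch: Cox model with sex and log tumor thickness, plus a check of
# the proportional hazards assumption (same assumed melanom data set).
library(survival)
library(ISwR)
data(melanom)

cox_thick <- coxph(Surv(days, status == 1) ~ sex + log(thick), data = melanom)
summary(cox_thick)

# Proportional hazards check; p-values below 0.05 would indicate
# non-proportional hazards for that term.
cox.zph(cox_thick)
```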
Survival analysis Cox models can be extended to deal with variations on the simple analysis. The Cox PH regression model is a linear model. It is similar to linear regression and logistic regression. Specifically, these methods assume that a single line, curve, plane, or surface is sufficient to separate groups (alive, dead) or to estimate a quantitative response (survival time). In some cases alternative partitions give more accurate classification or quantitative estimates. One set of alternative methods are tree-structured survival models, including survival random forests. Tree-structured survival models may give more accurate predictions than Cox models. Examining both types of models for a given data set is a reasonable strategy. This example of a survival tree analysis uses the R package "rpart". The example is based on 146 stage C prostate cancer patients in the data set stagec in rpart. Rpart and the stagec example are described in the PDF document "An Introduction to Recursive Partitioning Using the RPART Routines" (Terry M. Therneau, Elizabeth J. Atkinson, Mayo Foundation, September 3, 1997), which also describes the variables in stagec. The survival tree produced by the analysis is shown in the figure. Each branch in the tree indicates a split on the value of a variable. For example, the root of the tree splits subjects with grade < 2.5 versus subjects with grade 2.5 or greater. The terminal nodes indicate the number of subjects in the node, the number of subjects who have events, and the relative event rate compared to the root | Biology | https://en.wikipedia.org/wiki?curid=419259 | Survival analysis | 174,670 |
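A sketch of how such a survival tree can be fitted and drawn with rpart is given below; the predictor names (age, eet, g2, grade, gleason, ploidy) and the outcome variables (pgtime, pgstat) are taken from the rpart vignette and should be treated as assumptions.

```r
# Sketch: survival tree for the stage C prostate cancer data (stagec).
# Variable names are those used in the rpart vignette and are assumed here.
library(rpart)
library(survival)

tree_fit <- rpart(Surv(pgtime, pgstat) ~ age + eet + g2 + grade + gleason + ploidy,
                  data = stagec)

# Draw the tree; splits such as grade < 2.5 partition the subjects, and
# use.n = TRUE labels each node with events / number of subjects.
plot(tree_fit, uniform = TRUE, margin = 0.1)
text(tree_fit, use.n = TRUE)
```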
Survival analysis In the node on the far left, the values 1/33 indicate that one of the 33 subjects in the node had an event, and that the relative event rate is 0.122. In the bottom node on the far right, the values 11/15 indicate that 11 of the 15 subjects in the node had an event, and the relative event rate is 2.7. An alternative to building a single survival tree is to build many survival trees, where each tree is constructed using a sample of the data, and average the trees to predict survival. This is the method underlying survival random forest models. Survival random forest analysis is available in the R package "randomForestSRC". The randomForestSRC package includes an example survival random forest analysis using the data set pbc. This data is from the Mayo Clinic Primary Biliary Cirrhosis (PBC) trial of the liver conducted between 1974 and 1984. In the example, the random forest survival model gives more accurate predictions of survival than the Cox PH model. The prediction errors are estimated by bootstrap re-sampling. The object of primary interest is the survival function, conventionally denoted S, which is defined as $S(t) = \Pr(T > t)$, where t is some time, T is a random variable denoting the time of death, and Pr stands for probability. That is, the survival function is the probability that the time of death is later than some specified time t | Biology | https://en.wikipedia.org/wiki?curid=419259 | Survival analysis | 174,671 |
Survival analysis The survival function is also called the "survivor function" or "survivorship function" in problems of biological survival, and the "reliability function" in mechanical survival problems. In the latter case, the reliability function is denoted R(t). Usually one assumes S(0) = 1, although it could be less than 1 if there is the possibility of immediate death or failure. The survival function must be non-increasing: S(u) ≤ S(t) if u ≥ t. This property follows directly because T > u implies T > t. This reflects the notion that survival to a later age is possible only if all younger ages are attained. Given this property, the lifetime distribution function and event density (F and f below) are well-defined. The survival function is usually assumed to approach zero as age increases without bound (i.e., S(t) → 0 as t → ∞), although the limit could be greater than zero if eternal life is possible. For instance, we could apply survival analysis to a mixture of stable and unstable carbon isotopes; unstable isotopes would decay sooner or later, but the stable isotopes would last indefinitely. Related quantities are defined in terms of the survival function | Biology | https://en.wikipedia.org/wiki?curid=419259 | Survival analysis | 174,672 |
Survival analysis The lifetime distribution function, conventionally denoted F, is defined as the complement of the survival function, $F(t) = \Pr(T \le t) = 1 - S(t)$. If F is differentiable then the derivative, which is the density function of the lifetime distribution, is conventionally denoted f, $f(t) = F'(t) = \frac{d}{dt} F(t)$. The function f is sometimes called the event density; it is the rate of death or failure events per unit time. The survival function can be expressed in terms of the probability distribution and probability density functions, $S(t) = \Pr(T > t) = \int_t^{\infty} f(u)\,du = 1 - F(t)$. Similarly, a survival event density function can be defined as $s(t) = S'(t) = \frac{d}{dt}\int_t^{\infty} f(u)\,du = -f(t)$. In other fields, such as statistical physics, the survival event density function is known as the first passage time density. The hazard function, conventionally denoted $\lambda$, is defined as the event rate at time t conditional on survival until time t or later (that is, T ≥ t). Suppose that an item has survived for a time t and we desire the probability that it will not survive for an additional time dt: $\lambda(t)\,dt = \Pr(t \le T < t + dt \mid T \ge t) = \frac{f(t)\,dt}{S(t)} = -\frac{S'(t)\,dt}{S(t)}$. Force of mortality is a synonym of "hazard function" which is used particularly in demography and actuarial science, where it is denoted by $\mu$. The term "hazard rate" is another synonym. The force of mortality of the survival function is defined as $\mu(t) = -\frac{S'(t)}{S(t)} = \frac{f(t)}{S(t)}$. The force of mortality is also called the force of failure. It is the probability density function of the distribution of mortality. In actuarial science, the hazard rate is the rate of death for lives aged x | Biology | https://en.wikipedia.org/wiki?curid=419259 | Survival analysis | 174,673 |
Survival analysis For a life aged x, the force of mortality t years later is the force of mortality for a (x + t)-year old. The hazard rate is also called the failure rate. Hazard rate and failure rate are names used in reliability theory. Any function h is a hazard function if and only if it satisfies the following properties: $h(t) \ge 0$ for all $t \ge 0$, and $\int_0^{\infty} h(t)\,dt = \infty$. In fact, the hazard rate is usually more informative about the underlying mechanism of failure than the other representatives of a lifetime distribution. The hazard function must be non-negative, λ(t) ≥ 0, and its integral over $[0, \infty)$ must be infinite, but is not otherwise constrained; it may be increasing or decreasing, non-monotonic, or discontinuous. An example is the bathtub curve hazard function, which is large for small values of t, decreasing to some minimum, and thereafter increasing again; this can model the property of some mechanical systems to either fail soon after operation, or much later, as the system ages. The hazard function can alternatively be represented in terms of the cumulative hazard function, conventionally denoted $\Lambda$: $\Lambda(t) = -\log S(t)$, so transposing signs and exponentiating, $S(t) = \exp(-\Lambda(t))$, or differentiating (with the chain rule), $\frac{d}{dt}\Lambda(t) = -\frac{S'(t)}{S(t)} = \lambda(t)$. The name "cumulative hazard function" is derived from the fact that $\Lambda(t) = \int_0^t \lambda(u)\,du$, which is the "accumulation" of the hazard over time. From the definition of $\Lambda(t)$, we see that it increases without bound as t tends to infinity (assuming that S(t) tends to zero) | Biology | https://en.wikipedia.org/wiki?curid=419259 | Survival analysis | 174,674 |
Survival analysis This implies that $\lambda(t)$ must not decrease too quickly, since, by definition, the cumulative hazard has to diverge. For example, $\exp(-t)$ is not the hazard function of any survival distribution, because its integral converges to 1. The survival function S(t), the cumulative hazard function Λ(t), the density f(t), and the hazard function λ(t) are related through $S(t) = \exp(-\Lambda(t))$, $f(t) = \lambda(t)\,\exp(-\Lambda(t))$, and $\lambda(t) = f(t)/S(t)$. Future lifetime at a given time $t_0$ is the time remaining until death, given survival to age $t_0$. Thus, it is $T - t_0$ in the present notation. The expected future lifetime is the expected value of future lifetime. The probability of death at or before age $t_0 + t$, given survival until age $t_0$, is just $\Pr(T \le t_0 + t \mid T > t_0) = \frac{F(t_0 + t) - F(t_0)}{S(t_0)}$. Therefore, the probability density of future lifetime is $\frac{f(t_0 + t)}{S(t_0)}$ and the expected future lifetime is $\frac{1}{S(t_0)} \int_0^{\infty} t\, f(t_0 + t)\,dt = \frac{1}{S(t_0)} \int_{t_0}^{\infty} S(t)\,dt$, where the second expression is obtained using integration by parts. For $t_0 = 0$, that is, at birth, this reduces to the expected lifetime. In reliability problems, the expected lifetime is called the "mean time to failure", and the expected future lifetime is called the "mean residual lifetime". As the probability of an individual surviving until age t or later is S(t), by definition, the expected number of survivors at age t out of an initial population of n newborns is n × S(t), assuming the same survival function for all individuals. Thus the expected proportion of survivors is S(t) | Biology | https://en.wikipedia.org/wiki?curid=419259 | Survival analysis | 174,675 |
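As a worked illustration of these relationships (a generic example, not tied to any of the data sets above), consider a constant hazard, i.e. the exponential survival model:

```latex
% Worked example (illustrative assumption): constant hazard \lambda.
\[
\lambda(t) = \lambda
\;\Rightarrow\;
\Lambda(t) = \int_0^t \lambda \, du = \lambda t,
\qquad
S(t) = e^{-\Lambda(t)} = e^{-\lambda t},
\qquad
f(t) = \lambda(t)\, S(t) = \lambda e^{-\lambda t}.
\]
\[
\text{Expected lifetime: } \int_0^\infty S(t)\,dt = \tfrac{1}{\lambda};
\qquad
\text{expected future lifetime at } t_0\text{: } \frac{1}{S(t_0)}\int_{t_0}^\infty S(t)\,dt = \tfrac{1}{\lambda}.
\]
```

The constant expected future lifetime reflects the memoryless property of the exponential distribution; most real hazards are not constant.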
Survival analysis If the survival of different individuals is independent, the number of survivors at age "t" has a binomial distribution with parameters "n" and "S"("t"), and the variance of the proportion of survivors is "S"("t") × (1-"S"("t"))/"n". The age at which a specified proportion of survivors remain can be found by solving the equation "S"("t") = "q" for "t", where "q" is the quantile in question. Typically one is interested in the median lifetime, for which "q" = 1/2, or other quantiles such as "q" = 0.90 or "q" = 0.99. One can also make more complex inferences from the survival distribution. In mechanical reliability problems, one can bring cost (or, more generally, utility) into consideration, and thus solve problems concerning repair or replacement. This leads to the study of renewal theory and reliability theory of ageing and longevity. Censoring is a form of missing data problem in which time to event is not observed for reasons such as termination of study before all recruited subjects have shown the event of interest or the subject has left the study prior to experiencing an event. Censoring is common in survival analysis. If only the lower limit "l" for the true event time "T" is known such that "T" > "l", this is called "right censoring". Right censoring will occur, for example, for those subjects whose birth date is known but who are still alive when they are lost to follow-up or when the study ends. We generally encounter right-censored data | Biology | https://en.wikipedia.org/wiki?curid=419259 | Survival analysis | 174,676 |
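Right-censored observations of the kind described here are represented in R by Surv(), which the earlier sketches already used; a sketch, again assuming the aml data from the survival package, where a censored time prints with a trailing "+":

```r
# Sketch: representation of right-censored survival times.
library(survival)

# status = 0 marks right-censored observations; they print with a "+".
with(aml, Surv(time, status))
```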