doc_id (int32, 18–2.25M) | text (string, lengths 245–2.96k) | source (string, lengths 38–44) | __index_level_0__ (int64, 18–2.25M) |
---|---|---|---|
408,582 | The 21st century is marked by significant technological advancements. New technology in environmental science has transformed how researchers gather information about various topics in the field. Research in engines, fuel efficiency, and decreasing emissions from vehicles since the Industrial Revolution has reduced the amount of carbon and other pollutants released into the atmosphere. Furthermore, investment in researching and developing clean energy (e.g. wind, solar, hydroelectric, and geothermal power) has significantly increased in recent years, indicating the beginnings of the divestment from fossil fuel use. Geographic information systems (GIS) are used to observe sources of air or water pollution through satellites and digital imagery analysis. This technology allows for advanced farming techniques like precision agriculture as well as monitoring water usage in order to set market prices. In the field of water quality, developed strains of natural and manmade bacteria contribute to bioremediation, the treatment of wastewaters for future use. This method is more eco-friendly and cheaper than manual cleanup or treatment of wastewaters. Most notably, the expansion of computer technology has allowed for large data collection, advanced analysis, historical archives, public awareness of environmental issues, and international scientific communication. The ability to crowdsource on the Internet, for example, represents the process of collectivizing knowledge from researchers around the world to create increased opportunity for scientific progress. With crowdsourcing, data is released to the public for personal analyses which can later be shared as new information is found. Another technological development, blockchain technology, monitors and regulates global fisheries. By tracking the path of fish through global markets, environmental scientists can observe whether certain species are being overharvested to the point of extinction. Additionally, remote sensing allows for the detection of features of the environment without physical intervention. The resulting digital imagery is used to create increasingly accurate models of environmental processes, climate change, and much more. Advancements to remote sensing technology are particularly useful in locating the nonpoint sources of pollution and analyzing ecosystem health through image analysis across the electromagnetic spectrum. Lastly, thermal imaging technology is used in wildlife management to catch and discourage poachers and other illegal wildlife traffickers from killing endangered animals, proving useful for conservation efforts. Artificial intelligence has also been used to predict the movement of animal populations and protect the habitats of wildlife. | https://en.wikipedia.org/wiki?curid=64919 | 408,381 |
2,126,292 | Although the applications of pharmacometabolomics to personalized medicine are largely only being realized now, the study of an individual's metabolism has been used to treat disease since the Middle Ages. Early physicians employed a primitive form of metabolomic analysis by smelling, tasting and looking at urine to diagnose disease. Obviously, the measurement techniques needed to look at specific metabolites were unavailable at that time, but such technologies have evolved dramatically over the last decade to develop precise, high-throughput devices, as well as the accompanying data analysis software to analyze output. Currently, sample purification processes, such as liquid or gas chromatography, are coupled with either mass spectrometry (MS)-based or nuclear magnetic resonance (NMR)-based analytical methods to characterize the metabolite profiles of individual patients. Continually advancing informatics tools allow for the identification, quantification and classification of metabolites to determine which pathways may influence certain pharmaceutical interventions. One of the earliest studies discussing the principle and applications of pharmacometabolomics was conducted in an animal model to look at the metabolism of paracetamol and liver damage. NMR spectroscopy was used to analyze the urinary metabolic profiles of rats pre- and post-treatment with paracetamol. The analysis revealed a certain metabolic profile associated with increased liver damage following paracetamol treatment. At this point, it was eagerly anticipated that such pharmacometabolomics approaches could be applied to personalized human medicine. Since this publication in 2006, the Pharmacometabolomics Research Network, led by Duke University researchers and including partnerships between centers of excellence in metabolomics, pharmacogenomics and informatics (over sixteen academic centers funded by NIGMS), has been able to illustrate for the first time the power of the pharmacometabolomics approach in informing about treatment outcomes in large clinical studies and with use of drugs that include antidepressants, statins, antihypertensives, antiplatelet therapies and antipsychotics. Totally new concepts emerged from these studies on the use of pharmacometabolomics as a tool that can bring a paradigm shift in the field of pharmacology. They illustrated how pharmacometabolomics can enable a Quantitative and Systems Pharmacology approach. | https://en.wikipedia.org/wiki?curid=34939027 | 2,125,071 |
2,056,045 | SPI can be applied to other, perhaps more common, benthic disturbance investigations as well. To illustrate, consider a benthic ecological impact study for a hypothetical shellfish mariculture facility. There are an enormous variety of study approaches. Existing information and the available resources inevitably constrain every design. With little information on bottom type, a simple, one-off, spatial impact study like that shown in Figure 5 with eight sites along an isobath, taking three replicate grabs from each, is fairly common and moderately powerful. Prior data gathering, including bathymetric, diver, towed-camera, ROV, or side-scan sonar observations, would probably alter site placement and greatly enhance overall information and value. Collecting such data over even a small site such as this one requires considerable resources and will probably cause a gap of several days to allow data processing between the first field days and the grab sampling events (it is this delay that precludes, or reduces, the value of studying transient events in hydrodynamically energetic areas). Collecting a large number of point data from an SPI device is easily done where the resulting snapshots of the benthic character are automatically placed on a map of the study area in real time. This approach allows rapid categorisation according to one or more variables of interest. In waters <30 m deep it is not unreasonable to expect to collect the 170 SP images indicated in Figure 6 and produce a rough benthic classification map in a single field day. The categories may be based on sediment texture, overburden, specific detritus, biota, etc. Sampling effort can then be allocated to focus on the variability of communities among the gross habitat differences by using grabs as habitat replicates with varying lag. This type of approach produces a broader understanding of the system and permits more informed decisions by increasing the generality of the grab sample data. The SPI evidence can effectively increase the extent from one dimension to at least two. Correlation between physical and biological data collected from the grabs also allows more data to be extracted from the SP imagery by identifying specific features (infaunal species, tubes, mounds, etc.). Furthermore, a detailed analysis of ARPD depths can then be presented as geochemical environment contours. | https://en.wikipedia.org/wiki?curid=14917968 | 2,054,862 |
1,589,303 | As a young professor at UC Davis, Deamer continued to work with electron microscopy, revealing for the first time particles related to functional ATPase enzymes within the membranes of sarcoplasmic reticulum. After spending sabbaticals in England at the University of Bristol in 1971 and with Alec Bangham in 1975, Deamer became interested in liposomes. Conversations with Bangham inspired his research on the role of membranes in the origin of life, and in 1985 Deamer demonstrated that the Murchison carbonaceous meteorite contained lipid-like compounds that could assemble into membranous vesicles. Deamer described the significance of self-assembly processes in his 2011 book "First Life". In collaborative work with Mark Akeson, a post-doctoral student at the time, the two established methods for monitoring proton permeation through ion channels such as gramicidin. In 1989, while returning from a scientific meeting in Oregon, Deamer conceived that it might be possible to sequence single molecules of DNA by using an imposed voltage to pull them individually through a nanoscopic channel. The DNA sequence could be distinguished by the specific modulating effect of the four bases on the ionic current through the channel. In 1993, he and Dan Branton initiated a research collaboration with John Kasianowitz at NIST to explore this possibility with the hemolysin channel, and in 1996 published the first paper demonstrating that nanopore sequencing may be feasible. George Church at Harvard had independently proposed a similar idea, and Church, Branton and Deamer decided to initiate a patent application which was awarded in 1998. Mark Akeson joined the research effort in 1997, and in 1999 published a paper showing that the hemolysin channel, now referred to as a nanopore, could distinguish between purine and pyrimidine bases in single RNA molecules. In 2007, Oxford Nanopore Technologies (ONT) licensed the patents describing the technology and in 2014 released the MinION nanopore sequencing device to selected researchers. The first publications appeared in 2015, one of which used the MinION to sequence E. coli DNA with 99.4% accuracy relative to the established 5.4 million base pair genome. Despite earlier skepticism, nanopore sequencing is now accepted as a viable third generation sequencing method. | https://en.wikipedia.org/wiki?curid=48840722 | 1,588,409 |
766,744 | Starting in the early 1930s, single-beam sounders were used to make bathymetry maps. Today, multibeam echosounders (MBES) are typically used, which use hundreds of very narrow adjacent beams (typically 256) arranged in a fan-like swath of typically 90 to 170 degrees across. The tightly packed array of narrow individual beams provides very high angular resolution and accuracy. In general, a wide swath, which is depth dependent, allows a boat to map more seafloor in less time than a single-beam echosounder by making fewer passes. The beams update many times per second (typically 0.1–50 Hz depending on water depth), allowing faster boat speed while maintaining 100% coverage of the seafloor. Attitude sensors allow for the correction of the boat's roll and pitch on the ocean surface, and a gyrocompass provides accurate heading information to correct for vessel yaw. (Most modern MBES systems use an integrated motion-sensor and position system that measures yaw as well as the other dynamics and position.) A boat-mounted Global Positioning System (GPS) (or other Global Navigation Satellite System (GNSS)) positions the soundings with respect to the surface of the earth. Sound speed profiles (speed of sound in water as a function of depth) of the water column correct for refraction or "ray-bending" of the sound waves owing to non-uniform water column characteristics such as temperature, conductivity, and pressure. A computer system processes all the data, correcting for all of the above factors as well as for the angle of each individual beam. The resulting sounding measurements are then processed either manually, semi-automatically or automatically (in limited circumstances) to produce a map of the area. A number of different outputs are generated, including a sub-set of the original measurements that satisfy some conditions (e.g., most representative likely soundings, shallowest in a region, etc.) or integrated Digital Terrain Models (DTM) (e.g., a regular or irregular grid of points connected into a surface). Historically, selection of measurements was more common in hydrographic applications while DTM construction was used for engineering surveys, geology, flow modeling, etc. Since ca. 2003–2005, DTMs have become more accepted in hydrographic practice. (A minimal sketch of gridding corrected soundings into a DTM appears after this table.) | https://en.wikipedia.org/wiki?curid=965387 | 766,333 |
246,713 | The start of Samuel H. Smith's term as President in 1985 marked a large period of growth for WSU. In 1989, WSU gained branch campuses in Spokane, the Tri-Cities, and Vancouver, with established extension offices and research centers in all regions of the state, including facilities in Prosser and Wenatchee. Smith proved to be a consummate fundraiser, with about $760 million raised during his term, thanks in some part to Microsoft co-founder and alumnus Paul Allen. In the 1990s Smith began to clamp down and take action regarding student alcohol abuse and disciplinary issues after some high-profile incidents on campus in an effort to improve the university's image. The efforts seemed to have paid off when WSU lost its rank and was completely excluded from "The Princeton Review"'s party school list in August 2000. Improving the quality of education was the defining goal of the university under V. Lane Rawlins, who raised admission requirements and sought to improve the academic profile of the school with improved curricula and research facilities. After Rawlins retired in 2006, Elson Floyd succeeded him as president. Under Floyd's leadership, increasing the diversity of the student body and continuing to raise the stature and reach of the university were a focus. In his eight years as president, WSU enrollment figures went up by 17 percent, including a 12.5 percent increase in the number of students of color, the amount of research grants awarded to WSU tripled to $600 million a year, and he led expansions at all of WSU's branch campuses, most notably successfully campaigning for the creation of the public medical school that now bears his name at WSU Spokane, the Elson S. Floyd College of Medicine. The second public medical school in Washington, and one of only three in the state, it is seen as key to the university's organizational mission as a state land-grant university and its ambitions as a research university. Created in 2015, five years after the passage of the Affordable Care Act, the medical school's goal is to alleviate a physician shortage in rural and eastern Washington using a community-based approach. The med school is said to be a key component in the university's new research-focused $1.5 billion "Drive to 25" campaign under President Kirk Schulz, which seeks to make WSU among the nation's top 25 public research universities by 2030. The Elson S. Floyd College of Medicine achieved full accreditation in June 2021. | https://en.wikipedia.org/wiki?curid=228600 | 246,585 |
77,553 | Richard Feynman was among the most well-known physicists associated with Caltech, having published the "Feynman Lectures on Physics", an undergraduate physics text, and popular science texts such as "Six Easy Pieces" for the general audience. The promotion of physics made him a public figure of science, although his Nobel-winning work in quantum electrodynamics was already very established in the scientific community. Murray Gell-Mann, a Nobel-winning physicist, introduced a classification of hadrons and went on to postulate the existence of quarks, which is currently accepted as part of the Standard Model. Long-time Caltech President Robert Andrews Millikan was the first to calculate the charge of the electron with his well-known oil-drop experiment, while Richard Chace Tolman is remembered for his contributions to cosmology and statistical mechanics. 2004 Nobel Prize in Physics winner H. David Politzer is a current professor at Caltech, as is astrophysicist and author Kip Thorne and eminent mathematician Barry Simon. Linus Pauling pioneered quantum chemistry and molecular biology, and went on to discover the nature of the chemical bond in 1939. Seismologist Charles Richter, also an alumnus, developed the magnitude scale that bears his name, the Richter magnitude scale for measuring the power of earthquakes. One of the founders of the geochemistry department, Clair Patterson was the first to accurately determine the age of the Earth via lead:uranium ratio in meteorites. In engineering, Theodore von Kármán made many key advances in aerodynamics, notably his work on supersonic and hypersonic airflow characterization. A repeating pattern of swirling vortices is named after him, the von Kármán vortex street. Participants in von Kármán's GALCIT project included Frank Malina, who helped develop the WAC Corporal, which was the first U.S. rocket to reach the edge of space, Jack Parsons, a pioneer in the development of liquid and solid rocket fuels who designed the first castable composite-based rocket motor, and Qian Xuesen, who was dubbed the "Father of Chinese Rocketry". More recently, Michael Brown, a professor of planetary astronomy, discovered many trans-Neptunian objects, most notably the dwarf planet Eris, which prompted the International Astronomical Union to redefine the term "planet". | https://en.wikipedia.org/wiki?curid=5786 | 77,524 |
761,012 | Another revolutionary development of the 20th century was quantum theory, which emerged from the seminal contributions of Max Planck (1858–1947) (on black-body radiation) and Einstein's work on the photoelectric effect. In 1912, the mathematician Henri Poincaré published "Sur la théorie des quanta". He introduced the first non-naïve definition of quantization in this paper. The development of early quantum physics was guided by a heuristic framework devised by Arnold Sommerfeld (1868–1951) and Niels Bohr (1885–1962), but this was soon replaced by the quantum mechanics developed by Max Born (1882–1970), Werner Heisenberg (1901–1976), Paul Dirac (1902–1984), Erwin Schrödinger (1887–1961), Satyendra Nath Bose (1894–1974), and Wolfgang Pauli (1900–1958). This revolutionary theoretical framework is based on a probabilistic interpretation of states, and on evolution and measurements in terms of self-adjoint operators on an infinite-dimensional vector space called a Hilbert space (introduced by the mathematicians David Hilbert (1862–1943), Erhard Schmidt (1876–1959) and Frigyes Riesz (1880–1956) in their search for a generalization of Euclidean space and their study of integral equations). The framework was rigorously defined in its axiomatic modern version by John von Neumann in his celebrated book "Mathematical Foundations of Quantum Mechanics", where he built up a relevant part of modern functional analysis on Hilbert spaces, in particular the spectral theory (introduced by David Hilbert, who investigated quadratic forms with infinitely many variables; many years later it was revealed that his spectral theory is associated with the spectrum of the hydrogen atom, an application that surprised him). Paul Dirac used algebraic constructions to produce a relativistic model for the electron, predicting its magnetic moment and the existence of its antiparticle, the positron. | https://en.wikipedia.org/wiki?curid=173416 | 760,606 |
675,146 | With the raft of safety improvements resulting from Peterson's fatal crash being implemented during the late 70s and early 80s, Formula One overall became much safer, despite the deaths of Patrick Depailler in 1980 and Gilles Villeneuve & Riccardo Paletti in 1982. The huge amounts of downforce created by ground effect became increasingly dangerous as the years went on, and aside from the fatal accidents mentioned above, a number of drivers crashed heavily enough for their careers to be brought to an end, and the technology was banned outright at the start of the 1983 season. These safety changes, coupled with the much stronger carbon fibre replacing aluminium as the material of choice for chassis construction, meant there was not a single driver fatality at a race meeting for the rest of the decade. However, one factor threatening to undo all this progress was the almost exponential power increase being extracted from turbocharged engines. Renault proved in 1980 that turbocharging was the way to go to success, with their very dominant performances in qualifying in almost every race, especially on fast and high-altitude circuits, where the thinner air did not affect the turbocharged engines. With power output doubling in less than 10 years and figures in excess of talked about by engine manufacturers, from 1986 onwards the FIA's primary goal was to rein in the turbo engines before finally banning them altogether at the end of the 1988 season. Brabham team owner Bernie Ecclestone and ex-March team owner Max Mosley set new organizational standards for Formula One, something they had been working on since 1972. All the races are now organized by Formula One Management rather than by circuit organizers doing their own thing, with measures such as setting specific times for when races, practice sessions and qualifying sessions are to start; teams must also commit to every race in a season, in order to assure sponsors that their advertising will be seen by television cameras, an enterprise also set up by Ecclestone and Mosley. This effectively transformed the sport into the multibillion-dollar business it is today. | https://en.wikipedia.org/wiki?curid=20796228 | 674,793 |
1,268,003 | Gilman attended local elementary school in White Plains. Hoping for better education, in 1955 his parents sent him to The Taft School in Watertown, Connecticut, where he completed grades 10 to 12. The school was known for its sports activity, and he described it as "a strict, monastic, and frankly unpleasant environment in the 1950s: academic boot camp." He recalled that he was "the worst 120-pound lineman on the intramural tackle football team." He studied science at Yale University. His first research project was to test the adaptor hypothesis of Francis Crick. He worked in the laboratory of Melvin Simpson, where he met his future wife Kathryn Hedlund. (They were married in 1963.) He graduated in 1962, receiving a BA in biology with a major in biochemistry. During summer break in 1962, he briefly worked at Burroughs Wellcome & Company in New York, under Allan Conney. With Conney he published his first two research papers in 1963. He then entered a combined MD-PhD program at Case Western Reserve University School of Medicine in Cleveland, Ohio, where he wanted to study under Nobel laureate pharmacologist Earl Sutherland, who was a close friend of his father. It was Sutherland who had introduced the combined MD-PhD course, and invited Gilman to join the course. But to Gilman, a seven-year program was like "an eternity in purgatory", and he preferred not to have a degree in pharmacology, so he refused. Sutherland later persuaded him by explaining that pharmacology was "just biochemistry with a purpose." However, Sutherland was departing for Vanderbilt University, so Gilman studied with Sutherland's collaborator, Theodore Rall. Gilman graduated from Case Western in 1969, then did his post-doctoral studies at the National Institutes of Health with Nobel laureate Marshall Nirenberg from 1969 to 1971. Nirenberg assigned him to work on the study of nerve endings (axons from cultured neuroblastoma cells), which he considered "a truly boring project." Instead, against the advice of Nirenberg, he worked on a new method for studying protein binding. After six weeks of work, he showed his results to Nirenberg, who immediately communicated the work and got it published in 1970. The work was a simple and vital biochemical assay for studying cyclic AMP. | https://en.wikipedia.org/wiki?curid=1414381 | 1,267,313 |
758,594 | Webster's "Speller" was the pedagogical blueprint for American textbooks; it was so arranged that it could be easily taught to students, and it progressed by age. Webster believed students learned most readily when complex problems were broken into their component parts. Each pupil could master one part before moving to the next. Ellis argues that Webster anticipated some of the insights associated in the 20th century with Jean Piaget's theory of cognitive development. Webster said that children pass through distinctive learning phases in which they master increasingly complex or abstract tasks. He stressed that teachers should not try to teach a three-year-old how to read; they should wait until the child is ready at age five. He planned the "Speller" accordingly, starting with the alphabet, then covering the different sounds of vowels and consonants, then syllables; simple words came next, followed by more complex words, then sentences. Webster's "Speller" was entirely secular. It ended with two pages of important dates in American history, beginning with Columbus' "discovery" in 1492 and ending with the Battle of Yorktown in 1781, by which the United States achieved independence. There was no mention of God, the Bible, or sacred events. As Ellis explains, "Webster began to construct a secular catechism to the nation-state. Here was the first appearance of 'civics' in American schoolbooks. In this sense, Webster's speller was the secular successor to "The New England Primer" with its explicitly biblical injunctions." Bynack (1984) examines Webster in relation to his commitment to the idea of a unified American national culture that would prevent the decline of republican virtues and national solidarity. Webster acquired his perspective on language from such German theorists as Johann David Michaelis and Johann Gottfried Herder. He believed with them that a nation's linguistic forms and the thoughts correlated with them shaped individuals' behavior. He intended the etymological clarification and reform of American English to improve citizens' manners and thereby preserve republican purity and social stability. Webster animated his "Speller" and "Grammar" by following these principles. | https://en.wikipedia.org/wiki?curid=9083795 | 758,188 |
1,136,477 | From the early 15th century to the early 17th century the Age of Discovery had, through Spanish and Portuguese seafarers, opened up southern Africa, the Americas (New World), Asia and Oceania to European eyes: Bartholomew Dias had sailed around the Cape of southern Africa in search of a trade route to India; Christopher Columbus, on four journeys across the Atlantic, had prepared the way for European colonisation of the New World; Ferdinand Magellan had commanded the first expedition to sail across the Atlantic and Pacific oceans to reach the Maluku Islands, a voyage continued by Juan Sebastián Elcano, who completed the first circumnavigation of the Earth. During the 17th century the naval hegemony started to shift from the Portuguese and Spanish to the Dutch and then the British and French. The new era of scientific exploration began in the late 17th century as scientists, and in particular natural historians, established scientific societies that published their researches in specialist journals. The British Royal Society was founded in 1660 and encouraged the scientific rigour of empiricism with its principles of careful observation and deduction. Activities of early members of the Royal Society served as models for later maritime exploration. Hans Sloane (1650–1753) was elected a member in 1685 and travelled to Jamaica from 1687 to 1689 as physician to the Duke of Albemarle (1653–1688), who had been appointed Governor of Jamaica. In Jamaica Sloane collected numerous specimens which were carefully described and illustrated in a published account of his stay. Sloane bequeathed his vast collection of natural history 'curiosities' and library of over 50,000 bound volumes to the nation, prompting the establishment in 1753 of the British Museum. His travels also made him an extremely wealthy man, as he patented a recipe that combined milk with the fruit of "Theobroma cacao" (cocoa) he saw growing in Jamaica, to produce milk chocolate. Books by distinguished social figures like the intellectual commentator Jean Jacques Rousseau and the Director of the Paris Museum of Natural History, the Comte de Buffon, and by scientist-travellers like Joseph Banks and Charles Darwin, along with the romantic and often fanciful travelogues of intrepid explorers, increased the desire of European governments and the general public for accurate information about the newly discovered distant lands. | https://en.wikipedia.org/wiki?curid=31642145 | 1,135,884 |
2,056,055 | Keegan et al. (2001) described the relationships among workers and authorities evaluating long-established, though often expensive and slow, methodologies with more recent technological developments as sometimes discordant. Gray et al. (1999b) lamented that there is a strong institutional tendency for sediment ecologists to rely on sampling methods developed in the early 1900s! A fine balance needs to be struck. Some degree of paradigm inertia is necessary to maintain intellectual continuity, but it can be taken too far. Physics, as a science, confronted this issue long ago and has widely embraced new technologies after establishing a scientific culture of always linking new techniques to established findings in a period of calibration and evaluation. The pace of this process in biology, as a whole, has quickened over the past few decades and ecology has only recently come to this horizon. This article introduces one such technology, sediment profile imagery (SPI) that is slowly gaining acceptance and currently undergoing its evaluation and calibration period even though it has existed since the 1970s. Like many of the technologies mentioned above, each new capability requires a careful consideration of its appropriateness in any particular application. This is especially true when they cross important, though often subtle, boundaries of data collection limitations. For example, much of our benthic knowledge has been developed from point-sample methods like cores or grabs, whereas continuous data collection, like some video transect analysis methods (e.g. Tkachenko 2005), may require different spatial interpretations that more explicitly integrate patchiness. While remote sampling techniques often improve our point-sampling resolution, benthologists need to consider the real-world heterogeneity at small spatial scales and compare them to the noise inherent to most high-volume data collection methods (e.g. Rabouille et al. 2003 for microelectrode investigations of pore water). New developments in the field of SPI will provide tools for investigating dynamic sediment processes, but also challenge our ability to accurately interpolate point-data collected at spatial densities approaching continuous data sets. | https://en.wikipedia.org/wiki?curid=14917968 | 2,054,871 |
2,088,142 | Vera Stein Ehrlich was born on 4 October 1897 in Zagreb to Adolf and Ida Ehrlich. She grew up in Zagreb with her younger sister Ina Jun-Broda (1899-1983), who later became a notable writer and translator. She studied pedagogy and psychology in Berlin and Vienna. At the age of 19, she began publishing works on psychological and pedagogic problems. In 1933, 1934, and 1936, she published books on the issues of education of children and youth. As part of the fifteen studies conducted in those years, she was writing about the methods and meaning of drawing and books for children. Her paper on the impact of illness on the character of the child was a result of cooperation with her husband Dr. Ben Stein. At that time, Ehrlich was particularly concerned with individual psychology, probably influenced by the achievements of Austrian psychologist Alfred Adler, for whom she wrote an obituary in the "Jew", the gazette of the Jewish Community of Zagreb. In that same period, she became interested in the problem of the position of women in the Yugoslav society. She was among the first to start discussing these issues scientifically in order to encourage the change of policy, legal norms and the attitude of society towards women, and to help ensure they would stop being treated as second-class citizens. She also studied the activities of women in the economy. With her writing, she encouraged women's activism in society and the struggle for the right to vote. As early as 1935, she emphasized the successes of the feminist movement in her works. She was interested in the process of liberating women from the tutelage of men, who were still legally superior. Disempowerment and the lack of protection of women in the then conservative patriarchal society motivated her to conduct research and start teaching. She studied the life of rural families and intergenerational relationships. Psychological analysis of life situations introduced her to the field of social anthropology. At the time, she had already prepared a survey on issues that enabled her to gain insight into the social situation in rural areas, which also provided her with an analysis of the status of women and the relationships between family members. The survey was forwarded to many associations in the Kingdom of Yugoslavia. By the beginning of World War II, she had received responses from 300 villages. | https://en.wikipedia.org/wiki?curid=55924732 | 2,086,940 |
1,679,124 | The IMAGE satellite instrument complement includes three Far Ultraviolet (FUV) instruments. In the wavelength region 120-190 nm, a downward-viewing auroral imager is only minimally contaminated by sunlight scattered from clouds and ground, and the radiance of the aurora observed in a nadir viewing geometry can be detected in the presence of the high-latitude dayglow. The Wideband Imaging Camera (WIC) provides broadband ultraviolet images of the aurora for maximum spatial and temporal resolution by imaging the LBH N2 bands of the aurora. The Spectrographic Imager (SI), a monochromatic imager, images different types of aurora, filtered by wavelength. By measuring the Doppler-shifted Lyman-α, the proton-induced component of the aurora can be imaged separately. Finally, the GEO instrument observes the distribution of the geocoronal emission, which is a measure of the neutral background density source for charge exchange in the magnetosphere. The FUV instrument complement looks radially outward from the rotating IMAGE satellite and, therefore, it spends only a short time observing the aurora and the Earth during each spin (120-second period). Detailed descriptions of the WIC, SI, GEO, and their individual performance validations can be found in the January 2000 issue of the Space Science Reviews. One primary requirement of the FUV instrument is to maximize photon collection efficiency and to use the short time available for exposures efficiently. The FUV auroral imagers WIC and SI both have wide fields of view and take data continuously as the auroral region proceeds through the field of view. To minimize data volume, multiple images are taken and electronically co-added by suitably shifting each image to compensate for the spacecraft rotation. In order to minimize resolution loss, the images have to be distortion-corrected in real time for both WIC and SI prior to co-adding. The distortion correction uses high-speed lookup tables that are pre-generated by least-squares fitting to polynomial functions by the on-orbit processor. (A minimal sketch of this lookup-table correction and co-adding step appears after this table.) The instruments were calibrated individually while on stationary platforms, mostly in vacuum chambers as described in the companion papers. Extensive ground-based testing was performed with visible and near UV simulators mounted on a rotating platform to estimate their on-orbit performance. | https://en.wikipedia.org/wiki?curid=1090498 | 1,678,181 |
2,008,638 | He was born in Kumamoto Prefecture, Japan and graduated from Tohoku Imperial University in 1925. After entering the Ministry of Communications as an engineer, he proposed the idea of a "Long Distance Non-Loaded Cable Carrier Communication System" in 1932. In 1933, he was sent to Germany by the Government for one year, and exchanged opinions with engineers such as those of the Siemens factories. The Long Distance Non-Loaded Cable Carrier Communication System was realized between Harbin, in Manchuria, and Japan. In 1940, he assumed a post in the general affairs department of the Taisei Yokusankai (大政翼賛会, "Imperial Rule Assistance Association"), which was Japan's para-fascist organization created by Prime Minister Fumimaro Konoe on October 12, 1940 to promote the goals of his Shintaisei movement. In 1941, he was appointed General Director of the Engineering Department of the Ministry of Communications. In 1943, he established a school for airplane technology in Shimizu, Shizuoka Prefecture, and in 1949 a school for wireless science in Nakano, Tokyo; these schools later merged into a school called Tokai Science School. During World War II, he changed his views, left the Taisei Yokusankai, and strongly opposed the policies of the Hideki Tojo cabinet. Matsumae was sent to the front in the Philippines as a second-class private in the Imperial Army. Right before the end of the war, he returned to Japan. He was appointed Minister of Communications in 1945. Between January 1950 and June 1951, he was purged from public service. Later, he assumed the post of President of Tokai University, which was built on the schools established earlier. In 1952, he was elected a member of the Lower House of the Japanese Parliament and served for 17 years as a member of the Socialist Party. In 1966, he established a cultural exchange organization, the "Nihon Taigai Bunka Kyokai", at the request of Soviet Russia, and assumed the post of its president. He boasted that he could talk to the Chief Secretary personally by telephone. He was also a judo practitioner, and as the head of the Students' Association of Judo he engaged in fierce struggles with the Kodokan, the traditional association of judo. He established the Matsumae International Foundation in 1979. He has established a large number of educational and cultural exchange programs with universities throughout the world. For his efforts he received numerous honorary degrees from various countries. He died in 1991 at the age of 89. | https://en.wikipedia.org/wiki?curid=34826953 | 2,007,486 |
2,022,924 | Interatomic potentials were developed for amino acids as early as in the late 1960s, for example, serving the CHARMM program. The fraction of covered chemical space was small considering the size of the periodic table, and compatible interatomic potentials for inorganic compounds remained largely unavailable. Different energy functions, lack of interpretation and validation of parameters restricted modeling to isolated compounds with unpredictable errors. Specifically, assumptions of formal charges, fixed atoms, and other approximations often led to collapsed structures and random energy differences when allowing atom mobility. A concept for consistent simulations of inorganic-organic interfaces was introduced in 2003. A major obstacle was the poor definition of atomic charges in molecular models, especially for inorganic compounds. IFF utilizes a method to assign atomic charges that translates chemical bonding accurately into molecular models, including metals, oxides, minerals, and organic molecules. The models reproduce multipole moments internal to a chemical compound on the basis of experimental data for electron deformation densities, dipole moments, as well as consideration of atomization energies, ionization energies, coordination numbers, and trends relative to other chemically similar compounds in the periodic table (the Extended Born Model). The method ensures a combination of experimental data and theory to represent chemical bonding and yield up to ten times more reliable and reproducible atomic charges in comparison to the use of quantum methods. This approach is essential to carry out consistent all-atom simulations of compounds across the periodic table that vary widely in internal polarity. IFF also allows the inclusion of specific features of the electronic structure such as π electrons in graphitic materials and in aromatic compounds. Another characteristic is the systematic reproduction of structures and energies to validate the classical Hamiltonian. The quality of structural predictions is assessed by validation of lattice parameters and densities from X-ray data, which is common in molecular simulations. In addition, IFF uses surface and cleavage energies from experimental measurements to ensure a reliable potential energy surface. Thereafter, hydration energies, adsorption energies, thermal, and mechanical properties can often be computed in quantitative agreement with measurements without further modifications. The parameters also have a physical-chemical interpretation and chemical analogy can be effectively used to derive parameters for chemically similar, yet unknown compounds in good accuracy. Alternative approaches based on random force field fitting to lattice parameters and mechanical properties (the 2nd derivative of the energy) lack interpretability and can cause over 500% errors in surface and interfacial energies, limiting the utility of models. | https://en.wikipedia.org/wiki?curid=64170778 | 2,021,760 |
937,856 | Extensive testing of the M22 occurred in 1943 and 1944, and was conducted by both the Ordnance Department and the British Armoured Fighting Vehicle (AFV) Gunnery School at Lulworth Ranges. These tests uncovered a number of faults and problems with the Locust. The AFV School noted that the process of loading the M22 into a C-54 transport aircraft took considerable time and involved the use of complex equipment. Overall the process took six untrained men 24 minutes, although it was believed this could be shortened with sufficient training. Unloading was also a long process, taking approximately ten minutes; it was noted that the time it took to unload the M22 from a C-54 on the battlefield meant that both the tank and aircraft would make excellent targets for enemy fire. Operational use of the tank would therefore be restricted to the availability of airfields large enough to accommodate a fully laden C-54, which might not be in the right geographical location or might even have to be captured in advance of a planned airborne operation. A heavy transport aircraft, the Fairchild C-82 Packet, was developed to specifically carry the M22 inside its fuselage and unload it through a set of clam-shell doors, but it did not enter service until after the war had ended. The US Army Armored Board released a critical report on the Locust in September 1943, stating that it was inadequate in the areas of reliability and durability, and indicating that it would not be able to be successfully used during airborne operations. By 1944 it was also realized that the design of the tank was actually obsolete. The armor of the M22 in several areas was found to be so thin that it was incapable of even resisting the armor-piercing ammunition of a .50 caliber machine-gun. Complaints were also made about the 37mm main armament, which was not powerful enough to penetrate the armor of most tanks used by the Axis powers. Similarly a report made on March 13, 1944, by elements of the 6th Airborne Armoured Reconnaissance Regiment complained that when a high-explosive shell was fired from the gun, the resulting shell-burst was so weak that observers had difficulty in seeing where it impacted. There were also mechanical problems with the design, which caused it to be unreliable; the engine was also found to be underpowered, possibly due to problems with the torque characteristics of the engine or an inefficient transmission system. | https://en.wikipedia.org/wiki?curid=335693 | 937,356 |
1,244,413 | There are scholars who cite that the subject of automatic indexing attracted attention as early as the 1950s, particularly with the demand for faster and more comprehensive access to scientific and engineering literature. This attention to indexing began with text processing between 1957 and 1959 by H.P. Luhn through a series of published papers. Luhn proposed that a computer could handle keyword matching, sorting, and content analysis. This was the beginning of Automatic Indexing and the formula to pull keywords from text based on frequency analysis. It was later determined that frequency alone was not sufficient for good descriptors; however, this began the path to where we are now with Automatic Indexing. (A minimal sketch of such frequency-based keyword scoring appears after this table.) This was highlighted by the information explosion, which was predicted in the 1960s and came about through the emergence of information technology and the World Wide Web. The prediction was prepared by Mooers, who outlined the expected role that computing would have in text processing and information retrieval. This prediction said that machines would be used for storage of documents in large collections and that we would use these machines to run searches. Mooers also predicted the online aspect and retrieval environment for indexing databases. This led Mooers to predict an Induction Inference Machine which would revolutionize indexing. This phenomenon required the development of an indexing system that can cope with the challenge of storing and organizing vast amounts of data and can facilitate information access. New electronic hardware further advanced automated indexing since it overcame the barrier imposed by old paper archives, allowing the encoding of information at the molecular level. With this new electronic hardware there were tools developed for assisting users. These were used to manage files and were organized into different categories such as PDM Suites like Outlook or Lotus Notes and Mind Mapping Tools such as MindManager and Freemind. These allow users to focus on storage and building a cognitive model. The automatic indexing is also partly driven by the emergence of the field called computational linguistics, which steered research that eventually produced techniques such as the application of computer analysis to the structure and meaning of languages. Automatic indexing is further spurred by research and development in the area of artificial intelligence and self-organizing systems, also referred to as thinking machines. | https://en.wikipedia.org/wiki?curid=28204669 | 1,243,740 |
1,020,482 | Theodosius Dobzhansky, an immigrant from the Soviet Union to the United States, who had been a postdoctoral worker in Morgan's fruit fly lab, was one of the first to apply genetics to natural populations. He worked mostly with "Drosophila pseudoobscura". He says pointedly: "Russia has a variety of climates from the Arctic to sub-tropical... Exclusively laboratory workers who neither possess nor wish to have any knowledge of living beings in nature were and are in a minority." Not surprisingly, there were other Russian geneticists with similar ideas, though for some time their work was known to only a few in the West. His 1937 work "Genetics and the Origin of Species" was a key step in bridging the gap between population geneticists and field naturalists. It presented the conclusions reached by Fisher, Haldane, and especially Wright in their highly mathematical papers in a form that was easily accessible to others. Further, Dobzhansky asserted the physicality, and hence the biological reality, of the mechanisms of inheritance: that evolution was based on material genes, arranged in a string on physical hereditary structures, the chromosomes, and linked more or less strongly to each other according to their actual physical distances on the chromosomes. As with Haldane and Fisher, Dobzhansky's "evolutionary genetics" was a genuine science, now unifying cell biology, genetics, and both micro and macroevolution. His work emphasized that real-world populations had far more genetic variability than the early population geneticists had assumed in their models and that genetically distinct sub-populations were important. Dobzhansky argued that natural selection worked to maintain genetic diversity as well as by driving change. He was influenced by his exposure in the 1920s to the work of Sergei Chetverikov, who had looked at the role of recessive genes in maintaining a reservoir of genetic variability in a population, before his work was shut down by the rise of Lysenkoism in the Soviet Union. By 1937, Dobzhansky was able to argue that mutations were the main source of evolutionary changes and variability, along with chromosome rearrangements, effects of genes on their neighbours during development, and polyploidy. Next, genetic drift (he used the term in 1941), selection, migration, and geographical isolation could change gene frequencies. Thirdly, mechanisms like ecological or sexual isolation and hybrid sterility could fix the results of the earlier processes. | https://en.wikipedia.org/wiki?curid=97536 | 1,019,955 |
1,517,315 | Electrochemical Random-Access Memory (ECRAM) is a type of non-volatile memory (NVM) with multiple levels per cell (MLC) designed for deep learning analog acceleration. An ECRAM cell is a three-terminal device composed of a conductive channel, an insulating electrolyte, an ionic reservoir, and metal contacts. The resistance of the channel is modulated by ionic exchange at the interface between the channel and the electrolyte upon application of an electric field. The charge-transfer process allows both for state retention in the absence of applied power, and for programming of multiple distinct levels, both differentiating ECRAM operation from that of a field-effect transistor (FET). The write operation is deterministic and can result in symmetrical potentiation and depression, making ECRAM arrays attractive for acting as artificial synaptic weights in physical implementations of artificial neural networks (ANN). The technological challenges include open circuit potential (OCP) and semiconductor foundry compatibility associated with energy materials. Universities, government laboratories, and corporate research teams have contributed to the development of ECRAM for analog computing. Notably, Sandia National Laboratories designed a lithium-based cell inspired by solid-state battery materials, Stanford University built an organic proton-based cell, and International Business Machines (IBM) demonstrated in-memory selector-free parallel programming for a logistic regression task in an array of metal-oxide ECRAM designed for insertion in the back end of line (BEOL). In 2022, researchers at the Massachusetts Institute of Technology built an inorganic, CMOS-compatible protonic technology that achieved near-ideal modulation characteristics using nanosecond-scale pulses. | https://en.wikipedia.org/wiki?curid=64305316 | 1,516,463 |
408,579 | Much of the interest in environmental science throughout the 1970s and the 1980s was characterized by major disasters and social movements. In 1978, hundreds of people were relocated from Love Canal, New York after carcinogenic pollutants were found to be buried underground near residential areas. The next year, in 1979, the nuclear power plant on Three Mile Island in Pennsylvania suffered a partial meltdown and raised concerns about the dangers of radioactive waste and the safety of nuclear energy. In response to landfills and toxic waste often disposed of near their homes, the official Environmental Justice Movement was started by a Black community in North Carolina in 1982. Two years later, toxic methyl isocyanate gas was released from a pesticide plant disaster in Bhopal, India, harming hundreds of thousands of people living near the disaster site, the effects of which are still felt today. In a groundbreaking discovery in 1985, a British team of researchers studying Antarctica found evidence of a hole in the ozone layer, inspiring global agreements banning the use of chlorofluorocarbons (CFCs), which were previously used in nearly all aerosols and refrigerants. Notably, in 1986, the meltdown at the Chernobyl nuclear power plant in Ukraine released radioactive material into the surrounding environment, leading to international studies on the ramifications of environmental disasters. Over the next couple of years, the Brundtland Commission (formally the World Commission on Environment and Development) published a report titled "Our Common Future", the Montreal Protocol on ozone-depleting substances was adopted, and the Intergovernmental Panel on Climate Change (IPCC) was formed, as international communication focused on finding solutions for climate change and degradation. In the late 1980s, the Exxon corporation was fined after its tanker "Exxon Valdez" spilled large quantities of crude oil off the coast of Alaska, and the resulting cleanup involved the work of environmental scientists. After hundreds of oil wells were set alight during the 1991 war between Iraq and Kuwait, the burning wells polluted the surrounding atmosphere to just below the air quality threshold environmental scientists believed was life-threatening. | https://en.wikipedia.org/wiki?curid=64919 | 408,378 |
1,501,022 | In 2000, the United Nations Environment Programme (UNEP) advanced the Digital Earth to enhance decision-makers' access to information for then Secretary-General Kofi Annan and the United Nations Security Council. UNEP promoted use of web-based geospatial technologies with the ability to access the world's environmental information, in association with economic and social policy issues. A reorganization of UNEP's data and information resources was initiated in 2001, based on the GSDI/DE architecture for a network of distributed and interoperable databases creating a framework of linked servers. The design concept was based upon using a growing network of internet mapping software and database content with advanced capabilities to link GIS tools and applications. UNEP.net, launched in February 2001, provided UN staff with an unparalleled facility for accessing authoritative environmental data resources and a visible example to others in the UN community. However, a universal user interface for UNEP.net, suitable for members of Security Council, that is non-scientists, did not exist. UNEP began actively testing prototypes for a UNEP geo-browser beginning in mid-2001 with a showcase for the African community displayed at the 5th African GIS Conference in Nairobi, Kenya November 2001. Keyhole Technology, Inc. (later purchased in 2004 by Google and to become Google Earth) was contracted to develop and demonstrate the first full globe 3-D interactive Digital Earth using web-stream data from a distributed database located on servers around the planet. A concerted effort within the UN community, via the Geographic Information Working Group (UNGIWG), followed immediately, including purchase of early Keyhole systems by 2002. UNEP provided further public demonstrations for this early Digital Earth system at the World Summit on Sustainable Development in September, 2002 at Johannesburg, South Africa. In seeking an engineering approach to system-wide development of the Digital Earth model, recommendations were made at the 3rd UNGIWG Meeting, June 2002, Washington, D.C. for creating a document on the Functional User Requirements for geo-browsers. This proposal was communicated to the ISDE Secretariat in Beijing and the organizing committee for the 3rd International Symposium on Digital Earth and agreement was reached by the Chinese Academy of Sciences-sponsored Secretariat to host the first of the two Digital Earth geo-browser meetings. | https://en.wikipedia.org/wiki?curid=7297180 | 1,500,176 |
738,875 | Another strand of Angrist's research in the economics of education concerns the impact of various inputs and rules on learning. For instance, in further work with Lavy, Angrist exploited Maimonides' Rule, which limits class size to 40 students, in order to study the impact of class size on scholastic achievement in Israeli schools, finding that class size reduction substantially increased test scores for 4th and 5th graders, albeit not for 3rd graders. In further research at Israeli schools, they find that teacher training can cost-effectively improve students' test scores (at least in secular schools), that computer-aided instruction doesn't, and that cash incentives raised high school achievement among girls (by inducing them to increase time invested into exam preparation) but were ineffective for boys. Similarly, in a study by Angrist, Philip Oreopoulos and Daniel Lang comparing the impact of academic support services, financial incentives and a combination of both on Canadian college first-year students, the combined treatment raised the grades of women throughout their first and second years but had no impact on men. In research on school vouchers for private schools in Colombia with Eric Bettinger, Erik Bloom, Elizabeth King and Michael Kremer, Angrist found voucher recipients were 10 pp more likely to finish lower secondary school, 5-7 pp more likely to complete high school, and likely to score 0.2 standard deviations higher on tests, suggesting that the vouchers' benefits likely exceeded their $24 cost. Another subject of Angrist's research is peer effects in education, which he has e.g. explored with Kevin Lang in the context of METCO's school integrations or with Atila Abdulkadiroglu and Parag Pathak in Boston's and New York City's over-subscribed exam schools, though the effects that they find are brief and modest in both cases. With regard to the effect of teacher testing, which Angrist has studied with Jonathan Guryan in the U.S., he finds that state-mandated teacher testing raises teachers' wages without raising their quality, though it decreases teacher diversity by reducing the fraction of new teachers who are Hispanic. In work with Lavy and Analia Schlosser, Angrist has also explored Becker's hypothesis on a trade-off between child quality and quantity by exploiting variation in twin births and parental preferences for compositions of siblings of mixed sexes, with the evidence rejecting the hypothesis. | https://en.wikipedia.org/wiki?curid=14441269 | 738,484 |
429,497 | This second, newer system started in Boston and is essentially a form of career guidance for children. A member of the community would call a meeting of all the neighborhood boys who were to leave elementary school at the end of the year and discuss with them whether they had any reasonable plans for the future. It was clear that the boys knew little of what they wanted to do or what would be expected of them in the real world, and the leader was able to give them, especially in one-on-one conversations, valuable advice. They knew too little of the characteristic features of the vocations to which they wanted to devote themselves, and they had given hardly any attention to the question of whether they had the necessary qualifications for the special work. From this experience an office "opened in 1908, in which all Boston children at the time when they left school were to receive individual suggestions with reference to the most reasonable and best adjusted selection of a calling. There is hardly any doubt that the remarkable success of this modest beginning was dependent upon the admirable personality of the late organizer, who recognized the individual features with unusual tact and acumen. But he himself had no doubt that such a merely impressionistic method could not satisfy the demands." Münsterberg identified three main reasons why this worked: first, because they analyzed the objective relations of the hundreds of different accessible vocations, as well as the children's economic, hygienic, technical, and social elements that should be examined so that every child could receive valuable information as to the demands of the vocation and what opportunities could be found within that vocation. Second, that the schools would have to be interested in the question of vocational choice so that observations of an individual child could be made about their abilities and interests. And finally, what he believed to be the most important point, "the methods had to be elaborated in such a way that the personal traits and dispositions might be discovered with much greater exactitude and with much richer detail than was possible through what a mere call on the vocational counselor could unveil." Münsterberg believes that these early vocational counselors point towards the spirit of the modern tendency toward applied psychology, and that the goal can only be reached through exact, scientific, experimental research, "and that the mere naïve methods—for instance, the filling-out of questionnaires which may be quite useful in the first approach—cannot be sufficient for a real, persistent furtherance of economic life and of the masses who seek their vocations." | https://en.wikipedia.org/wiki?curid=917279 | 429,287
1,762,264 | During World War I, Hoagland tried to compensate for the loss of potassium-based fertilizer imports from the German Empire to the United States with plant extracts from brown algae, inspired by the ability of giant kelp to absorb elements from seawater selectively and to accumulate potassium and iodide many times in excess of the concentrations found in seawater (Hoagland, 1915). Based on these findings, he investigated the ability of plants to absorb salts against a concentration gradient and discovered the dependence of nutrient absorption and translocation on metabolic energy using innovative model systems under controlled experimental conditions (Hoagland, Hibbard, and Davis, 1926). During his systematic research, mainly by the solution culture technique, and inspired by a principle of Julius von Sachs, he developed the basic formula for the Hoagland solution, whose composition was originally patterned after the displaced soil solution obtained from certain soils of high productivity (Hoagland, 1919). His research also led to new discoveries on the need and function of trace elements required by living cells, thus establishing the essentiality of molybdenum for the growth of tomato plants (Arnon and Hoagland, 1940; Hoagland, 1945). Hoagland was able to show that various plant diseases are caused by a lack of trace elements such as zinc (Hoagland, Chandler, and Hibbard, 1931, ff.), and that boron, manganese, zinc, and copper are indispensable for normal plant growth (Hoagland, 1937). He took special interest in soil–plant interrelationships, addressing, for example, the physiological balance of soil solutions and the pH dependence of plant growth, in order to gain a better understanding of the availability and absorption of nutrients in soils and solutions (Hoagland, 1916, 1917, 1920, 1922; Hoagland and Arnon, 1941). Hoagland and his associates, including his research assistant William Z. Hassid, thus contributed to the understanding of fundamental cellular physiological processes in green plants that are driven by sunlight as the ultimate form of energy (Hoagland and Davis, 1929; Hoagland and Steward, 1939, 1940; Hoagland, 1944, 1946). | https://en.wikipedia.org/wiki?curid=43993091 | 1,761,271
349,378 | Eero Saarinen (1910–1961) was the son of Eliel Saarinen, the most famous Finnish architect of the Art Nouveau period, who emigrated to the United States in 1923, when Eero was thirteen. He studied art and sculpture at the academy where his father taught, and then at the Académie de la Grande Chaumière in Paris before studying architecture at Yale University. His architectural designs were more like enormous pieces of sculpture than traditional modern buildings; he broke away from the elegant boxes inspired by Mies van der Rohe and used instead sweeping curves and parabolas, like the wings of birds. In 1948 he conceived the idea of a monument in St. Louis, Missouri, in the form of a parabolic arch 192 meters high, made of stainless steel. He then designed the General Motors Technical Center in Warren, Michigan (1949–55), a glass modernist box in the style of Mies van der Rohe, followed by the IBM Research Center in Yorktown Heights, New York (1957–61). His next works were a major departure in style; he produced a particularly striking sculptural design for the Ingalls Rink in New Haven, Connecticut (1956–59), an ice hockey rink with a parabolic roof suspended from cables, which served as a preliminary model for his next and most famous work, the TWA Terminal at JFK airport in New York (1956–1962). His declared intention was to design a building that was distinctive and memorable, and also one that would capture the particular excitement of passengers before a journey. The structure is separated into four white concrete parabolic vaults, which together resemble a bird on the ground perched for flight. Each of the four curving roof vaults has two sides attached to columns in a Y form just outside the structure. One of the angles of each shell is slightly raised, and the other is attached to the center of the structure. The roof is connected with the ground by curtain walls of glass. All of the details inside the building, including the benches, counters, escalators, and clocks, were designed in the same style. | https://en.wikipedia.org/wiki?curid=315927 | 349,195
1,131,992 | Despite the above developments, dedicated BSE detectors in the ESEM have played an important role, since the BSE remain a most useful detection mode yielding information not possible to obtain with SE. The conventional BSE detection means have been adapted to operate in the gaseous conditions of the ESEM. The BSE, having a high energy, are self-propelled to the corresponding detector without significant obstruction by the gas molecules. Already, annular or quadrant solid-state detectors have been employed for this purpose but their geometry is not easily adaptable to the requirements of ESEM for optimum operation. As a result, not much use has been reported of these detectors on genuine ESEM instruments at high pressure. The "Robinson" BSE detector is tuned for operation up to around 100 Pa at the usual working distance of conventional SEM for the suppression of specimen charging, whilst electron collection at the short working distance and high pressure conditions makes it inadequate for the ESEM. However, plastic scintillating materials, being easily adaptable, have been used for BSE detection and made to measure according to the strictest requirements of the system. Such work culminated in the use of a pair of wedge-shaped detectors saddling a conical PLA1 and abutting to its rim, so that the dead detection space is reduced to a minimum, as shown in the accompanying figure. The photon conduction is also optimized by the geometry of the light pipes, whilst the pair of symmetrical detectors allow the separation of topography (signal subtraction) and atomic number contrast (signal addition) of the specimen surface to be displayed with the best ever signal-to-noise ratio. This scheme has further allowed the use of color by superimposing various signals in a meaningful way. These simple but special detectors became possible in the conditions of ESEM, since bare plastic is not charged by the BSE. However, a very fine wire mesh with appropriate spacing has been proposed as a GDD when gas is present and to conduct negative charge away from the plastic detectors when the gas is pumped out, towards a universal ESEM. Furthermore, since the associated electronics involve a photomultiplier with a wide frequency response, true TV scanning rates are readily available. This is an essential attribute to maintain with an ESEM that enables the examination of processes in situ in real time. In comparison, no such imaging has been reported with the electron avalanche mode of the GDD yet. | https://en.wikipedia.org/wiki?curid=9778156 | 1,131,400
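As a toy illustration of the paired-detector scheme described above, the two symmetric BSE signals can be combined pixel by pixel: their difference emphasizes topography, their sum emphasizes atomic-number (compositional) contrast. The arrays and scaling below are illustrative only, not taken from any particular instrument.

```python
import numpy as np

def split_bse_signals(sig_a, sig_b):
    """Combine two symmetric backscattered-electron detector images.

    Subtraction suppresses compositional contrast and keeps the
    left/right shading that encodes surface topography; addition
    cancels the directional shading and keeps atomic-number contrast.
    """
    sig_a = np.asarray(sig_a, dtype=float)
    sig_b = np.asarray(sig_b, dtype=float)
    topography = sig_a - sig_b     # directional (shadowing) component
    composition = sig_a + sig_b    # mean backscatter-yield component
    return topography, composition

# Tiny synthetic example: detector A sees one facet brighter than detector B
a = np.array([[1.0, 1.2], [0.9, 1.1]])
b = np.array([[0.8, 1.2], [0.7, 1.1]])
topo, comp = split_bse_signals(a, b)
print(topo)
print(comp)
```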
1,569,568 | Autism spectrum disorder (ASD) refers to a variety of conditions typically identified by challenges with social skills, communication, speech, and repetitive sensory-motor behaviors. The 11th International Classification of Diseases (ICD-11), released in January 2021, characterizes ASD by the associated deficits in the ability to initiate and sustain two-way social communication and restricted or repetitive behavior unusual for the individual's age or situation. Although linked with early childhood, the symptoms can appear later as well. Symptoms can be detected before the age of two and experienced practitioners can give a reliable diagnosis by that age. However, official diagnosis may not occur until much older, even well into adulthood. There is a large degree of variation in how much support a person with ASD needs in day-to-day life. This can be classified by a further diagnosis of ASD level 1, level 2, or level 3. Of these, ASD level 3 describes people requiring very substantial support and who experience more severe symptoms. ASD-related deficits in nonverbal and verbal social skills can result in impediments in personal, family, social, educational, and occupational situations. This disorder tends to have a strong correlation with genetics along with other factors. More research is identifying ways in which epigenetics is linked to autism. Epigenetics generally refers to the ways in which chromatin structure is altered to affect gene expression, through mechanisms such as modification of cytosine residues and post-translational modifications of histones. Of the 215 genes implicated to some extent in ASD, 42 have been found to be involved in epigenetic modification of gene expression. Some examples of ASD signs are specific or repeated behaviors, enhanced sensitivity to materials, being upset by changes in routine, appearing to show reduced interest in others, avoiding eye contact, and limitations in social situations as well as in verbal communication. When social interaction becomes more important, some whose condition might have been overlooked suffer social and other exclusion and are more likely to have coexisting mental and physical conditions. Long-term problems include difficulties in daily living such as managing schedules, hypersensitivities (e.g., to foods, noises, fabric textures, light), initiating and sustaining relationships, and maintaining jobs. | https://en.wikipedia.org/wiki?curid=35773166 | 1,568,680
1,869,558 | The LICA experiment was designed to measure 0.5–5 MeV/nucleon solar and magnetospheric ions (He through Ni) arriving from the zenith in twelve energy bands. The mass of an ion was determined with simultaneous measurements of its time of flight (ToF) across a path length of approximately and its residual kinetic energy in one of four silicon (Si) solid state detectors. Ions passing through the 0.75 micrometre nickel entrance foils emitted secondary electrons which a chevron microchannel plate assembly amplified to form a signal to begin timing. A double entrance foil prevented single pinholes from allowing sunlight to enter the telescope and provided immunity to solar and geocoronal ultraviolet. Another foil and microchannel plate assembly in front of the solid state detectors gave the signal to stop timing. Wedge-and-strip anodes on the front sides of the timing anodes determined where the ion passed through the foils and, therefore, its flight path length. The velocity determined from the path length, the ToF, and the residual energy measured by the solid state detectors were combined to yield the mass of the ion with a resolution of about 1%, adequate to provide complete isotope separation. Corrections for the energy loss in the entrance foils gave the ion's incident energy. The geometric factor of the sensor was 0.8 cm²·sr and the field of view was 17° × 21°. On-board processing determined whether ions triggering LICA were protons, He nuclei, or more massive ions. Protons were counted in a rate and not further analyzed. Heavier nuclei were treated as low (He) or high (more massive than He) priority for transmission to the ground. The instrument data processing unit ensured that a sample of both priority events was telemetered, but that low priority events did not crowd out the rarer heavy species. Processed flux rates versus energy of H (hydrogen), He, O, Si group, and Fe groups were picked out every 15 seconds for transmission. Appropriate magnetic field models enabled specification of the atomic charge state by means of rigidity cut-off calculations. In addition, the proton cut-off versus energy during an orbit helped charge identification of the other species. On-board calibrations of the sensor were done by command about once per week. Data was stored in on-board memory of 26.5 MB, which was then dumped twice daily over ground stations. | https://en.wikipedia.org/wiki?curid=10669005 | 1,868,482
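The time-of-flight mass determination described above follows from non-relativistic kinematics: the velocity comes from the flight path and the ToF, and the mass from the residual kinetic energy, m = 2E/v². The sketch below assumes an illustrative 10 cm flight path (the actual LICA value is not given here) and ignores the energy-loss correction for the entrance foils.

```python
# Ion mass from time of flight and residual energy (non-relativistic),
# as in a ToF-vs-energy telescope: v = L / t, m = 2 E / v**2.
MEV_TO_J = 1.602176634e-13
AMU_KG = 1.66053907e-27

def ion_mass_amu(tof_s, energy_mev, path_m=0.10):
    """Mass in atomic mass units; path_m is an assumed flight path."""
    v = path_m / tof_s                  # m/s
    e_j = energy_mev * MEV_TO_J         # residual kinetic energy in joules
    return 2.0 * e_j / v**2 / AMU_KG

# Example: a ~1 MeV/nucleon helium-4 ion (4 MeV) crossing 10 cm in ~7.2 ns
print(round(ion_mass_amu(7.2e-9, 4.0), 2))   # ~4 amu
```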
1,370,696 | The Burnets moved to Terang in 1909, when Frank was posted to be the bank manager there, having declined a post in London. Burnet was interested in the wildlife around the nearby Lake Terang; he joined the Scouts in 1910 and enjoyed all outdoor activities. While living in Terang, he began to collect beetles and study biology. He read biology articles in the "Chambers's Encyclopaedia", which introduced him to the work of Charles Darwin. During his early teens, the family took annual holidays to Port Fairy, where Burnet spent his time observing and recording the behaviour of the wildlife. He was educated at Terang State School and attended Sunday school at the local church, where the priest encouraged him to pursue scholastic studies and awarded him a book on ants as a reward for his academic performance. He advised Frank to invest in Mac's education and he won a full scholarship to board and study at Geelong College, one of Victoria's most exclusive private schools. Starting there in 1913, Burnet was the only boarder with a full scholarship. He did not enjoy his time there among the scions of the ruling upper class; while most of his peers were brash and sports-oriented, Burnet was bookish and not athletically inclined, and found his fellow students to be arrogant and boorish. During this period he kept his beetle-collecting and disapproval of his peers a secret and mixed with his schoolmates out of necessity. Nevertheless, his academic prowess gained him privileges, and he graduated in 1916, placing first in his school overall, and in history, English, chemistry and physics. The typical university path for a person of his social background was to pursue studies in theology, law or medicine. By this time, he was becoming disillusioned with religion and chose medicine. Due to World War I, military service was a possibility and he felt that a medical background would increase his chances of being given a non-combat post. | https://en.wikipedia.org/wiki?curid=417493 | 1,369,940 |
1,657,745 | Alpher was the son of a Russian Jewish immigrant, Samuel Alpher (born Alfirevich), from Vitebsk, Russian Empire. His mother, Rose Maleson, died of stomach cancer in 1938, and his father later remarried. Alpher graduated at age 15 from Theodore Roosevelt High School in Washington, D.C., and held the ranks of Major and Commander of his school's Cadet program. He worked in the high school theater as stage manager for two years, supplementing his family's Depression-era income. He also learned Gregg shorthand, and in 1937 began working for the Director of the American Geophysical Union as a stenographer. In 1940 he was hired by the Department of Terrestrial Magnetism of the Carnegie Foundation, where he worked with Dr. Scott Forbush under contract for the U.S. Navy to develop ship degaussing techniques during World War II. He contributed to the development of the Mark 32 and Mark 45 detonators, torpedoes, Naval gun control, Magnetic Airborne Detection (of submarines), and other top-secret ordnance work (including the Manhattan Project), and he was recognized at the end of the War with the Naval Ordnance Development Award (December 10, 1945—with Symbol), and another Naval Ordnance Development award in 1946. Alpher's wartime work has been somewhat obscured by security classification. From 1944 through 1955, he was employed at the Johns Hopkins University Applied Physics Laboratory. During the daytime he was involved in the development of ballistic missiles, guidance systems, supersonics, and related subjects. In 1948 he earned his Ph.D. in Physics with a theory of nucleosynthesis called neutron capture, and from 1948 onward collaborated with Dr. Robert C. Herman (Ph.D. in Physics, 1940, Princeton University, under E. Condon), also at APL, on predictions of the Cosmic Microwave Background Radiation (now widely referred to by the acronym CMB). Alpher was somewhat ambivalent about the nature of his ordnance work, having dedicated much of his early career to this in order to obtain his doctorate. | https://en.wikipedia.org/wiki?curid=1273628 | 1,656,812
422,091 | Blackett became friends with Kingsley Martin, later editor of the "New Statesman", while an undergraduate and became committed to the left. Politically he identified himself as a socialist, and often campaigned on behalf of the Labour Party. In the late 1940s, Blackett became known for his radical political opinions, which included his belief that Britain ought not develop atomic weapons. He was considered too far to the left for the Labour Government 1945–1951 to employ, and he returned to academic life. His internationalism found expression in his strong support for India. There in 1947 he met Jawaharlal Nehru, who sought his advice on the research and development needs of the Indian armed forces and for the next 20 years he was a frequent visitor and advisor on military and civil science. These visits deepened his concern for the underprivileged and the poor. He was convinced that the problem could be solved by applying science and technology and he used his scientific prestige to try to persuade scientists that one of their first duties was to use their skill to ensure a decent life for all mankind. Before underdevelopment became a popular issue he proposed in a presidential address to the British Association that Britain should devote 1% of its national income to the economic improvement of the third world and he was later one of the prime movers in the foundation of the Overseas Development Institute. He was the senior member of a group of scientists which met regularly to discuss scientific and technological policy during the 13 years when the Labour Party was out of office, and this group became influential when Harold Wilson became leader of the Party. Blackett's ideas led directly to the creation of the Ministry of Technology as soon as the Wilson government was formed and he insisted that the first priority was revival of the computer industry. He did not enter open politics, but worked for a year as a civil servant. He remained deputy chairman of the Minister's Advisory Council throughout the administration's life, and was also personal scientific adviser to the Minister. | https://en.wikipedia.org/wiki?curid=326834 | 421,885 |
739,238 | Electron microscopy has accelerated research in materials science by quantifying properties and features from nanometer-resolution imaging with STEM, which is crucial in observing and confirming phenomena such as thin film deposition, crystal growth, surface structure formation, and dislocation movement. Until recently, most papers have inferred the properties and behaviors of material systems based on these images without being able to establish rigorous rules for what exactly is observed. The techniques that have emerged as a result of interest in quantitative scanning transmission electron microscopy (QSTEM) close this gap by allowing researchers to identify and quantify structural features that are only visible using high-resolution imaging in a STEM. Widely available image processing techniques are applied to high-angle annular dark field (HAADF) images of atomic columns to precisely locate their positions and the material's lattice constant(s). This approach has been successfully used to quantify structural properties, such as strain and bond angle, at interfaces and defect complexes. QSTEM now allows researchers to compare the experimental data to theoretical simulations both qualitatively and quantitatively. Recently published studies have shown that QSTEM can measure structural properties, such as interatomic distances, lattice distortions from point defects, and locations of defects within an atomic column, with high accuracy. QSTEM can also be applied to selected area diffraction patterns and convergent beam diffraction patterns to quantify the degree and types of symmetry present in a specimen. Since any materials research requires structure-property relationship studies, this technique is applicable to countless fields. A notable study is the mapping of atomic column intensities and interatomic bond angles in a Mott-insulator system. This was the first study to show that the transition from the insulating to conducting state was due to a slight global decrease in distortion, a conclusion reached by mapping the interatomic bond angles as a function of the dopant concentration. This effect is not visible to the human eye in a standard atomic-scale image enabled by HAADF imaging, so this important finding was only made possible by the application of QSTEM. | https://en.wikipedia.org/wiki?curid=1823144 | 738,846
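The column-position measurements mentioned above are typically obtained by refining each atomic-column peak in a HAADF image to sub-pixel precision and then taking distances between the refined positions. The sketch below uses a simple intensity-weighted centroid on synthetic data; real QSTEM workflows usually fit 2D Gaussians instead, and all numbers here are illustrative assumptions.

```python
import numpy as np

def refine_column(image, peak_rc, half_width=4):
    """Sub-pixel column position via an intensity-weighted centroid
    in a small window around an integer peak estimate (row, col)."""
    r0, c0 = peak_rc
    win = image[r0 - half_width:r0 + half_width + 1,
                c0 - half_width:c0 + half_width + 1].astype(float)
    rows, cols = np.mgrid[r0 - half_width:r0 + half_width + 1,
                          c0 - half_width:c0 + half_width + 1]
    w = win - win.min()                     # crude background removal
    return np.array([np.sum(rows * w), np.sum(cols * w)]) / np.sum(w)

# Synthetic HAADF-like frame: two Gaussian "columns" ~20.3 px apart
yy, xx = np.mgrid[0:64, 0:64]
def column(y0, x0):
    return np.exp(-((yy - y0) ** 2 + (xx - x0) ** 2) / (2 * 2.0 ** 2))
frame = column(32.0, 20.2) + column(32.0, 40.5) + 0.01

p1 = refine_column(frame, (32, 20))
p2 = refine_column(frame, (32, 40))
print("column spacing (px):", round(np.linalg.norm(p2 - p1), 2))
```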
596,820 | General Dynamics first realized the capability gap being experienced by U.S. forces in Afghanistan around 2010. In many cases, troops were on low ground and being engaged by PKM machine gun fire from the high ground, forcing them to return fire from where they were instead of being able to seek a better position. The M2 .50-caliber machine gun is too heavy for use by dismounted patrols, and rounds from an M240 begin to drift off-target at 800 meters, especially when shooting upwards. At closer ranges, an M240 is accurate but does not have enough penetrating power against hard structures. The Precision Sniper Rifle competition going on at that time also showed the U.S. military was interested in infantry weapons with a 1,500-meter range. To achieve desired range capabilities, the .338-caliber was chosen, specifically the .338 Norma Magnum over the .338 Lapua Magnum for several reasons including greater barrel life and a less tapered case for better use in a push-through design metallic disintegrating link. At , the 7.62 NATO's velocity drops to about ; at that range, the .338NM travels at and out to , the round is capable of defeating Level III armor. A machine gun was then designed around the concept with Short Recoil Impulse Averaging technology, uses available subsystem components to keep cost down, and has a broad view 6-power scope to enable point target engagement out to 1,000-1,200 meters. The development of prototypes was entirely company-funded and took 12 months. The LWMMG was first unveiled on 15 May 2012 at the Joint Armaments Conference in Seattle, Washington. | https://en.wikipedia.org/wiki?curid=35841962 | 596,515 |
498,528 | A great disappointment, his repulse for the mastership of Balliol, also in 1854, appears to have roused him into the completion of his book on "The Epistles of St Paul". This work, described by one of his friends as "a miracle of boldness", is full of originality and suggestiveness, but its publication awakened against him a storm of theological opposition from the Orthodox Evangelicals, which followed him more or less through life. Instead of yielding to this, he joined with Henry Bristow Wilson and Rowland Williams, who had been similarly attacked, in the production of the volume known as "Essays and Reviews". This appeared in 1860 and gave rise to a strong outbreak of criticism. Jowett's loyalty to those who were prosecuted on this account was no less characteristic than his persistent silence while the augmentation of his salary as Greek professor was withheld. This persecution was continued until 1865, when E. A. Freeman and Charles Elton discovered by historical research that a breach of the conditions of the professorship had occurred, and Christ Church, Oxford, raised the endowment from £40 a year to £500. Jowett was one of the recipients of Nightingale's three volume work "Suggestions for Thought" for proof-reading and criticism. In the third volume of "Essays and Reviews" he contributed "On the Interpretation of Scripture" in which he attempted to reconcile her assertion that religion was law and could be unified with science. Her radical thoughts on women's place in the home, and his departure from liberal Anglican theology helped to block for a decade his career advancement to the Mastership of Balliol. By 1860, he was already Regius Professor of Greek and a Fellow of Balliol, but an increase in his stipend was withheld. While the work gained fulsome praise from philosopher-politician John Stuart Mill, it profoundly shook the more traditional establishment's fervent belief that the working-classes would continue to worship in parish churches. Recognition that this was no longer so, was just one of the theological departures. In October 1862 he was invited to Oak Hill Park to offer Florence the sacrament. Accepting the prospect with relish, he nonetheless consulted with Archbishop Tait for permission. Many of his letters to her and Mrs Bracebridge have survived; their religion was tinged with a mutual respect for their shared common interests and intellectual gifts. Also included is an unflattering description of a middle-aged man. | https://en.wikipedia.org/wiki?curid=231263 | 498,271 |
1,376,731 | Zeeman's style of leadership was informal, but inspirational, and he rapidly took Warwick to international recognition for the quality of its mathematical research. The first six appointments he made were all in topology, enabling the department to immediately become internationally competitive, followed by six in algebra, and finally six in analysis and six in applied mathematics. He was able to trade four academic appointments for funding that enabled PhD students to give undergraduate supervisions in groups of two for the first two years, in a manner similar to the tutorial system at Oxford and Cambridge. He remained at Warwick until 1988, but from 1966 to 1967 he was a visiting professor at the University of California at Berkeley, after which his research turned to dynamical systems, inspired by many of the world leaders in this field, including Stephen Smale and René Thom, who both spent time at Warwick. In 1963, Zeeman showed that causality in special relativity, expressed by the preservation of partial ordering, is preserved exactly and only by the Lorentz transformations. Zeeman subsequently spent a sabbatical with Thom at the Institut des Hautes Études Scientifiques in Paris, where he became interested in catastrophe theory. On his return to Warwick, he taught an undergraduate course in Catastrophe Theory that became immensely popular with students; his lectures generally were "standing room only". In 1973 he gave an MSc course at Warwick giving a complete detailed proof of Thom's classification of elementary catastrophes, mainly following an unpublished manuscript, "Right-equivalence", written by John Mather at Warwick in 1969. David Trotman wrote up his notes of the course as an MSc thesis. These were then distributed in thousands of copies throughout the world and published both in the proceedings of a 1975 Seattle conference on catastrophe theory and its applications, and in a 1977 collection of papers on catastrophe theory by Zeeman. In 1974 Zeeman gave an invited address at the International Congress of Mathematicians in Vancouver, about applications of catastrophe theory. | https://en.wikipedia.org/wiki?curid=871415 | 1,375,969
1,006,665 | Definitive prototypes of the Ka-25 incorporated mission equipment and corrosion protection for the structure. The rotor system introduced aluminium alloy blades pressurised with nitrogen for crack detection, lubricated hinges, hydraulically powered controls, alcohol de-icing and automatic blade folding. Power was supplied by two free-turbine engines mounted atop the cabin, with electrically de-iced inlets, plain lateral exhausts with no infrared countermeasures, driving the main gearbox directly, and a cooling fan for the gearbox and hydraulic oil coolers aft of the main gearbox. Construction was of stressed skin duralumin throughout with flush-riveting, as well as some bonding and honeycomb sandwich panels. The 1.5 m × 1.25 m × 3.94 m cabin had a sliding door to port, a flight deck forward of the cabin, and underfloor fuel tanks filled using a pressure refueling nozzle on the port side. A short boom at the rear of the cabin had a central fin and twin toed-in fins at the ends of the tailplane mainly for use during auto-rotation. The undercarriage consisted of two noncastoring mainwheels with sprag brakes attached to the fuselage by parallel 'V' struts with a single angled shock absorber to dissipate landing loads, and two castoring nosewheels on straight shock absorbing legs attached directly to the fuselage either side of the cockpit, which folded rearwards to reduce interference with the radar; all wheels were fitted with emergency rapid inflation flotation collars. Flying controls all act on the co-axial rotors with pitch, roll and collective similar to a conventional single rotor helicopter. Yaw control was through differential collective pitch, which acts via its secondary effect on rotor torque; an automatic mixer box ensured that total lift on the rotors remained constant during yaw maneuvers, to improve handling during deck landings. Optional extras included fold-up seats for 12 passengers, rescue hoist, external auxiliary fuel tanks or containers for cameras, flares, smoke floats or beacons. | https://en.wikipedia.org/wiki?curid=16932 | 1,006,146
1,581,711 | Denton also recognised the crucial biological fact that no relatively inactive cation had evolved in metazoan organisms to play a role analogous to that which an increase of bicarbonate plays on the anion side of the pattern during vomiting and excess Cl− loss. In this latter case no large urinary Na excretion was obligatory to keep acid-base balance compatible with life. The initial finding with pancreatic fistula was published in Nature (London, 1948). In 1949, he and Dr Victor Wynn, who had joined him in the study of a second case, were supported by Professor R. D. Wright and Sir MacFarlane Burnet to set up the Ionic Research Unit of the NHMRC in the Physiology Department at Melbourne University. The Ionic Research Unit originated flame photometry in clinical medical practice internationally – a Wynn-initiated step. Rapid measurement of sodium and potassium in blood and urine helped originate intensive care, and they published a monograph in Acta Medica Scandinavica. Denton and Wynn’s approach, particularly flame photometry with rapid assessment of biochemical disorder, eventually resulted in the saving of tens of thousands of lives in Australia as well as in other countries by chemically accurate intervention. In 1952 Wynn went to London, where Professors Pickering and Robb at St Mary’s Hospital established a metabolic intensive care unit which Wynn directed. Some years later he was visited by Professor Francis Moore of Harvard, who incorporated the Melbourne classification of distortion of body fluid status – viz. subtraction acidemia (e.g. pancreatic fistula), addition acidemia (e.g. diabetic coma), subtraction alkalemia (e.g. vomiting gastric juice), and addition alkalemia (excess alkali ingestion) – into his textbook ‘Metabolic Aspects of Surgery’. | https://en.wikipedia.org/wiki?curid=6869432 | 1,580,821
133,328 | In the early 20th century, the study of heredity became a major investigation after the rediscovery in 1900 of the laws of inheritance developed by Mendel. The 20th century also saw the integration of physics and chemistry, with chemical properties explained as the result of the electronic structure of the atom. Linus Pauling's book on "The Nature of the Chemical Bond" used the principles of quantum mechanics to deduce bond angles in ever-more complicated molecules. Pauling's work culminated in the physical modelling of DNA, "the secret of life" (in the words of Francis Crick, 1953). In the same year, the Miller–Urey experiment demonstrated, in a simulation of primordial processes, that basic constituents of proteins, simple amino acids, could themselves be built up from simpler molecules, kickstarting decades of research into the chemical origins of life. By 1953, James D. Watson and Francis Crick, building on the work of Maurice Wilkins and Rosalind Franklin, clarified the basic structure of DNA, the genetic material for expressing life in all its forms, and suggested in their famous paper "Molecular Structure of Nucleic Acids" that the structure of DNA was a double helix. In the late 20th century, the possibilities of genetic engineering became practical for the first time, and a massive international effort began in 1990 to map out an entire human genome (the Human Genome Project). The discipline of ecology typically traces its origin to the synthesis of Darwinian evolution and Humboldtian biogeography, in the late 19th and early 20th centuries. Equally important in the rise of ecology, however, were microbiology and soil science—particularly the cycle of life concept, prominent in the work of Louis Pasteur and Ferdinand Cohn. The word "ecology" was coined by Ernst Haeckel, whose particularly holistic view of nature in general (and Darwin's theory in particular) was important in the spread of ecological thinking. In the 1930s, Arthur Tansley and others began developing the field of ecosystem ecology, which combined experimental soil science with physiological concepts of energy and the techniques of field biology. | https://en.wikipedia.org/wiki?curid=14400 | 133,275
1,209,438 | The 1100/80 introduced a high-speed cache memory, the SIU or Storage Interface Unit. The SIU contained either 8K or, optionally, 16K 36-bit words of buffer memory, and was logically and physically positioned between the CAU(s)/IOU(s) and the (larger, slower) Main Memory units. The first version of the 1100/80 system could be expanded to a maximum of two CAUs and two IOUs. A later version was expandable to four CAUs and four IOUs. The SIU control panel of the updated 1100/80 was able to logically and physically partition larger Multi-Processor configurations into completely independent systems, each with its separate Operating System. The CAU was capable of executing both 36-bit 1100 series instructions and 30-bit 490 series instructions. The CAU contained the same basic register stack, in the first 128 words of addressable memory, as previous generations of 1100 Series machines, but since these registers were implemented with the same ECL chips as the rest of the system, the registers did not require parity to be generated/checked with each write/read. The IOU, or Input/Output Unit, was modular in design and could be configured with different Channel Modules to support varying I/O requirements. The Word Channel Module included four 1100 Series (parallel) Word Channels. Block Multiplexer and Byte Channel Modules allowed direct connection of high-speed disk/tape systems and low-speed printers, etc., respectively. The Control/Maintenance Panel was now on the SIU, and provided a minimum of indicators and buttons, since the system incorporated a mini-computer based on the BC/7 (business computer) as a maintenance processor. This was used to load microcode, and for diagnostic purposes. The CAU, IOU, and SIU units were implemented using emitter-coupled logic (ECL) on high density multi-layer PC boards. The ECL circuitry utilized DC voltages of +0 and -2 volts, with the CAU requiring four 50 amp -2 volt power supplies. Input power was 400 Hz, to reduce the size of the DC power supplies. The 400 Hz power was supplied by a motor/alternator, because even though solid state 400 Hz inverters were available, they were not considered reliable enough to meet the system uptime requirements. | https://en.wikipedia.org/wiki?curid=201462 | 1,208,791
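The SIU sits between the processors and slower main memory as a buffer; the toy sketch below illustrates the general idea with a direct-mapped lookup keyed on word address. It is a generic illustration only, since the actual SIU organization (associativity, line size, coherence across CAUs) is not described here.

```python
class DirectMappedCache:
    """Toy word-addressed, direct-mapped cache in front of a slower store.

    Generic illustration of a CPU/memory buffer; not a model of the
    actual SIU's internal organization.
    """

    def __init__(self, backing_store, num_lines=8192):
        self.store = backing_store            # e.g. a dict of word address -> value
        self.num_lines = num_lines
        self.tags = [None] * num_lines
        self.data = [None] * num_lines
        self.hits = self.misses = 0

    def read(self, addr):
        line = addr % self.num_lines
        tag = addr // self.num_lines
        if self.tags[line] == tag:            # hit: served from the fast buffer
            self.hits += 1
        else:                                 # miss: fetch from main memory
            self.misses += 1
            self.tags[line] = tag
            self.data[line] = self.store.get(addr, 0)
        return self.data[line]

memory = {addr: addr * 3 for addr in range(0, 100_000, 7)}
cache = DirectMappedCache(memory)
for addr in [0, 7, 0, 7, 8192]:               # repeated reads hit the cache
    cache.read(addr)
print(cache.hits, cache.misses)                # 2 hits, 3 misses
```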
1,795,777 | Cissé was born in Niamey, Niger. As a child he assumed that he would work in his father's law firm. He became interested in science through Hollywood films. In Niamey there were few opportunities to practise science as his school did not have a laboratory. He was keen to move to America to study, and completed high school two years early. He moved to the United States at the age of 17 to attend college. Cissé studied physics at the North Carolina Central University and graduated in 2004. During his undergraduate degree he was encouraged by Carl Wieman to apply for a fellowship, and spent a summer at Princeton University working in condensed matter physics. During this summer project he looked at jammed disordered packings, investigating with Paul Chaikin how M&M's arrange in a small volume. Cissé used various techniques to study jammed packings, including magnetic resonance imaging, but in the end used a much simpler approach: painting M&M's and analysing how many times they knocked into each other. The result was published in Science. He moved to Urbana for his graduate studies, and earned his PhD under the supervision of single-molecule biophysicist Taekjip Ha at the University of Illinois at Urbana–Champaign in 2009. After completing his doctorate, Cissé joined the École Normale Supérieure in Paris. He worked as a Pierre Gilles de Gennes Fellow in the joint labs of a physicist, Maxime Dahan, and a biologist, Xavier Darzacq. He also held a long-term fellowship with the European Molecular Biology Organization. In Paris, Cissé developed the single-cell microscopy technique time-correlated photoactivated localization microscopy (tcPALM), allowing for time-resolved measurements in vitro. Cissé used tcPALM to demonstrate that RNA polymerase II forms clusters that deconstruct after their work is done. Until Cissé made this discovery it was assumed that RNA polymerases were stable. | https://en.wikipedia.org/wiki?curid=60545408 | 1,794,768
1,706,376 | Some 10 years later, during which time computing power and readily accessible data storage had continued to advance, Microsoft announced the planned development of its "Planetary Computer", an "approach to computing that is planetary in scale and allows us to query every aspect of environmental and nature-based solutions available in real time." Meanwhile, from around 2010 onwards, Google had already developed a somewhat similar facility entitled "Google Earth Engine" that uses cloud computing for numerical analysis of large quantities of satellite imagery; as at 2021, the project website states that "Google Earth Engine combines a multi-petabyte catalog of satellite imagery and geospatial datasets with planetary-scale analysis capabilities. Scientists, researchers, and developers use Earth Engine to detect changes, map trends, and quantify differences on the Earth's surface." Such initiatives can perhaps be viewed as the "high end" for ingestion of massive, global scale input datasets and associated computation; at the other end of the scale, the development of cross-platform (open) standards for the exchange of digitized geographic information by the Open Geospatial Consortium since the early 2000s has enabled researchers equipped with minimal software to request, display, overlay and otherwise interact with subsets of remote global data streams via (for example) Web Map Service (WMS), Web Feature Service (WFS) and Web Coverage Service (WCS) without a requirement to hold any of the data locally, capable of producing a type of "macroscope" functionality at modest cost (free in the case of open source solutions such as GeoServer, MapServer and more) for displaying information of the user's choice against a range of possible base maps. Other presently available solutions of a similar nature - where the client "virtual globe" software is installed either on the user's device or runs in a web browser, and can then access either remote, or locally held data layers for display over pre-prepared base maps - include NASA WorldWind and ESRI's ArcGIS Earth. | https://en.wikipedia.org/wiki?curid=64208794 | 1,705,418 |
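As an illustration of the OGC web-service pattern described above, a WMS client only needs to issue an HTTP GetMap request naming the layer, bounding box, and output format; the server renders the map and none of the data is held locally. The sketch below builds such a request with the standard WMS 1.3.0 parameters; the endpoint URL and layer name are placeholders, not a real service.

```python
from urllib.parse import urlencode

def wms_getmap_url(endpoint, layer, bbox, width=800, height=600,
                   crs="EPSG:4326", fmt="image/png"):
    """Build a WMS 1.3.0 GetMap request URL (bbox as min/max lat-lon)."""
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "STYLES": "",
        "CRS": crs,                         # WMS 1.3.0 uses CRS (1.1.1 used SRS)
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": fmt,
    }
    return endpoint + "?" + urlencode(params)

# Placeholder endpoint and layer name, for illustration only
print(wms_getmap_url("https://example.org/geoserver/wms",
                     "env:land_cover",
                     bbox=(-44.0, 112.0, -10.0, 154.0)))  # Australia; lat/lon axis order in 1.3.0
```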
215,690 | Claude Bernard proposed in 1865 that the body strives to maintain a steady state in the internal environment (milieu intérieur), introducing the concept of homeostasis. In 1885, J.R. Tarchanoff showed that voluntary control of heart rate could be fairly direct (cortical-autonomic) and did not depend on "cheating" by altering breathing rate. In 1901, J. H. Bair studied voluntary control of the retrahens aurem muscle that wiggles the ear, discovering that subjects learned this skill by inhibiting interfering muscles and demonstrating that skeletal muscles are self-regulated. Alexander Graham Bell attempted to teach the deaf to speak through the use of two devices: the phonautograph, created by Édouard-Léon Scott, and a manometric flame. The former translated sound vibrations into tracings on smoked glass to show their acoustic waveforms, while the latter allowed sound to be displayed as patterns of light. After World War II, mathematician Norbert Wiener developed cybernetic theory, which proposed that systems are controlled by monitoring their results. The participants at the landmark 1969 conference at the Surfrider Inn in Santa Monica coined the term biofeedback from Wiener's feedback. The conference resulted in the founding of the Bio-Feedback Research Society, which permitted normally isolated researchers to contact and collaborate with each other, as well as popularizing the term "biofeedback." The work of B.F. Skinner led researchers to apply operant conditioning to biofeedback and to decide which responses could be voluntarily controlled and which could not. In the first experimental demonstration of biofeedback, Shearn used these procedures with heart rate. The effects of the perception of autonomic nervous system activity were initially explored by George Mandler's group in 1958. In 1965, Maia Lisina combined classical and operant conditioning to train subjects to change blood vessel diameter, eliciting and displaying reflexive blood flow changes to teach subjects how to voluntarily control the temperature of their skin. In 1974, H.D. Kimmel trained subjects to sweat using the galvanic skin response. | https://en.wikipedia.org/wiki?curid=292906 | 215,582
440,651 | Every engineering discipline is engaged in sustainable design, employing numerous initiatives, especially life cycle analysis (LCA), pollution prevention, Design for the Environment (DfE), Design for Disassembly (DfD), and Design for Recycling (DfR). These are replacing or at least changing pollution control paradigms. For example, concept of a "cap and trade" has been tested and works well for some pollutants. This is a system where companies are allowed to place a "bubble" over a whole manufacturing complex or trade pollution credits with other companies in their industry instead of a "stack-by-stack" and "pipe-by-pipe" approach, i.e. the so-called "command and control" approach. Such policy and regulatory innovations call for some improved technology based approaches as well as better quality-based approaches, such as leveling out the pollutant loadings and using less expensive technologies to remove the first large bulk of pollutants, followed by higher operation and maintenance (O&M) technologies for the more difficult to treat stacks and pipes. But, the net effect can be a greater reduction of pollutant emissions and effluents than treating each stack or pipe as an independent entity. This is a foundation for most sustainable design approaches, i.e. conducting a life-cycle analysis, prioritizing the most important problems, and matching the technologies and operations to address them. The problems will vary by size (e.g. pollutant loading), difficulty in treating, and feasibility. The most intractable problems are often those that are small but very expensive and difficult to treat, i.e. less feasible. Of course, as with all paradigm shifts, expectations must be managed from both a technical and an operational perspective. Historically, sustainability considerations have been approached by engineers as constraints on their designs. For example, hazardous substances generated by a manufacturing process were dealt with as a waste stream that must be contained and treated. The hazardous waste production had to be constrained by selecting certain manufacturing types, increasing waste handling facilities, and if these did not entirely do the job, limiting rates of production. Green engineering recognizes that these processes are often inefficient economically and environmentally, calling for a comprehensive, systematic life cycle approach. Green engineering attempts to achieve four goals: | https://en.wikipedia.org/wiki?curid=18624923 | 440,436 |
804,601 | The 1965 landmark report "Restoring the Quality of Our Environment" by U.S. President Lyndon B. Johnson's Science Advisory Committee warned of the harmful effects of carbon dioxide emissions from fossil fuel and mentioned "deliberately bringing about countervailing climatic changes," including "raising the albedo, or reflectivity, of the Earth." As early as 1974, Russian climatologist Mikhail Budyko suggested that if global warming ever became a serious threat, it could be countered with airplane flights in the stratosphere, burning sulfur to make aerosols that would reflect sunlight away. Along with carbon dioxide removal, solar geoengineering was discussed jointly as "geoengineering" in a 1992 climate change report from the US National Academies. The topic was essentially taboo in the climate science and policy communities until Nobel Laureate Paul Crutzen published an influential scholarly paper in 2006. Major reports by the Royal Society (2009) and the US National Academies (2015, 2021) followed. Total research funding worldwide remains modest, less than 10 million US dollars annually. Almost all research into solar geoengineering has to date consisted of computer modeling or laboratory tests, and there are calls for more research funding as the science is poorly understood. Only a few outdoor tests and experiments have proceeded. Major academic institutions, including Harvard University, have begun research into solar geoengineering. The Degrees Initiative is a registered charity in the UK which was established in 2010 to build capacity in developing countries to evaluate solar geoengineering. The 2021 US National Academy of Sciences, Engineering, and Medicine report recommended an initial investment into solar geoengineering research of $100–$200 million over five years. In May 2022, the Climate Overshoot Commission was launched to recommend a comprehensive strategy to reduce climate risk which includes sunlight reflection methods in its policy portfolio, and will issue a final report prior to the 2023 UN Climate Change Conference. | https://en.wikipedia.org/wiki?curid=20694764 | 804,172 |
1,528,168 | A CTS critique of OTS revolves around many OTS scholars' links to institutions of power. CTS questions these links to hegemonic actors and structures from the global North that can be seen to be furthering the agendas of certain states, because OTS is a discipline that is primarily concerned with looking at acts of terrorism by non-state actors. This is a very state-centric perspective which has a limited set of assumptions and narratives about the nature and cause of terrorism. Moreover, this becomes accepted as the general consensus at the macro, meso and micro levels of government and institutions, and is reflected in policy and the way the mainstream view terrorism. Traditional terrorism studies is also largely concerned with "problem solving theory", which looks at the world "with the prevailing social and power relationships and the institutions into which they are organised, as the given framework for action", and then works to "make these relationships and institutions work smoothly by dealing effectively with particular sources of trouble". Therefore, they look at dealing with the issue of terrorism within the current dominant structures of power. An example of this is the scholars and associated research that is affiliated with the RAND Corporation. Now an independent think tank, RAND was established by the US air force in 1945, and was contracted to the Douglas Aircraft Company. It has maintained close links with US administrations, and former board members include Donald Rumsfeld and Condoleezza Rice, both leading members of the George W. Bush administration. The major problem with this association with government is that it privileges research on threats by non-state actors, and marginalises research around state sponsorship of terrorism. According to Burnett and Whyte, the Corporation acts "effectively as an influential prestigious voice in the American military-industrial lobby and in world politics; particularly with regard to its interventions on the war on terror". Scholars, or "embedded experts" associated with RAND have key editorial positions in the two most prominent English-language terrorism journals, "Terrorism and Political Violence" and "Conflict and Terrorism". RAND scholars helped to found the St Andrew Centre for Studies in Terrorism and Political Violence (CSTPV), the leading centre for the study of terrorism in the UK. As well, experts associated with the RAND-St Andrews nexus have significant professional ties with businesses and military personnel associated with counter-terrorism activity, many of which have made "windfall profits" in the Iraq conflict. | https://en.wikipedia.org/wiki?curid=19455353 | 1,527,304 |
1,399,270 | Exterior ballistics research at BRL focused on the outward design of Army missiles and the aerodynamic phenomena that influence their flight. In addition to known forces such as drag and lift, BRL researchers were tasked with analyzing potential factors that could influence a projectile's behavior such as the effects of the Magnus force and moment. Both theoretical and experimental studies helped BRL researchers create new techniques for designing aerodynamically stable missiles. One of the most important tasks that BRL performed was developing techniques for predicting the dynamic stability of proposed spin-stabilized missile designs. However, researchers also analyzed designs for fin-stabilized projectiles as well. Other areas of research included analysis on boundary layers, heating rates, and the chemical interactions between the travelling projectile and the surrounding air and electric fields. BRL's exterior ballistics division was not solely responsible for developing better projectiles and firing techniques. This section of the lab was also in charge of preparing the firing and bombing tables for soldiers in the field. During World War II, weapon accuracy became a critical focal point for BRL researchers, who directed much of their wartime effort toward refining the ballistic performance of the projectiles. In order to test the performance of different projectiles under various conditions, the lab relied heavily on the supersonic wind tunnels and aerodynamic ranges installed at Aberdeen Proving Ground. The wind tunnels were used extensively during the late 1950s for BRL's cross-wind program, which arose from the Army's need to obtain aerodynamic data in order to prepare firing tables for aircraft rounds fired at large initial yaw angles. During the Space Race, BRL assisted in the development of several spacecraft, including the Mercury, Gemini, and Apollo Projects. The lab also engaged in research regarding high altitude atmospheric physics research, fluid physics, and experimental aeroballistics as well as the development of intercontinental ballistic missiles. | https://en.wikipedia.org/wiki?curid=30865936 | 1,398,495 |
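Firing-table work of the kind described above ultimately reduces to integrating the equations of motion of a projectile under gravity and aerodynamic drag. The sketch below integrates a simple point-mass model with a constant drag coefficient; the projectile parameters and the no-wind, flat-Earth assumptions are illustrative only, not BRL data or drag laws.

```python
import math

def trajectory(v0, elevation_deg, mass=43.0, diameter=0.155,
               cd=0.30, rho=1.225, g=9.81, dt=0.001):
    """Point-mass trajectory with quadratic drag; returns (range_m, tof_s).

    All parameters are illustrative (roughly a 155 mm shell), not a
    drag law or firing table from BRL.
    """
    area = math.pi * (diameter / 2) ** 2
    vx = v0 * math.cos(math.radians(elevation_deg))
    vy = v0 * math.sin(math.radians(elevation_deg))
    x = y = t = 0.0
    while y >= 0.0:
        v = math.hypot(vx, vy)
        drag = 0.5 * rho * cd * area * v * v / mass   # deceleration magnitude
        vx += -drag * (vx / v) * dt
        vy += (-g - drag * (vy / v)) * dt
        x += vx * dt
        y += vy * dt
        t += dt
    return x, t

rng_m, tof = trajectory(v0=680.0, elevation_deg=30.0)
print(f"range = {rng_m / 1000:.1f} km, time of flight = {tof:.0f} s")
```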
1,649,505 | Born in the villa "La Gracieuse", Morges, Switzerland, to Victor Forel, a pious Swiss Calvinist, and Pauline Morin, a French Huguenot, he was brought up in a protective household. At the age of seven he began to take an interest in insects. He went to school at Morges and Lausanne before joining the medical school at Zurich. Forel had a diverse and mixed career as a thinker on many subjects. At Zurich he was inspired by the work of Bernhard von Gudden (1824-1886). In 1871 he went to Vienna and studied under Theodor Meynert (1833-1892) but was disappointed by Meynert. In 1873 he moved to Germany to assist Gudden at his Munich Kreis-Irrenanstalt. He improved upon various techniques in neuro-anatomy including modifications to Gudden's microtome design. In 1877 he described the nuclear and fibrillar organization of the tegmental region which is now known as "Campus Foreli." He then became a lecturer at the Ludwig-Maximilians-Universität in Munich while also continuing his researches on ants. His first major work was a 450-page treatise on the ants of Switzerland which was published in 1874 and commended by Charles Darwin. He was appointed professor of psychiatry in 1879 at the University of Zurich Medical School. He not only ran the Burghölzli asylum there, but continued to publish papers on insanity, prison reform, and social morality. The asylum was very poorly run with corrupt staff and poor standards before Forel took over and converted it into one of the best in Europe. Forel named his home "La Fourmilière", the Ant Colony. Around 1900 Forel was a eugenicist. Forel suffered a stroke that paralyzed his right side in 1912, but he taught himself to write with his left hand and was able to continue his studies. By 1914 he was a good friend of the eminent British entomologist Horace Donisthorpe, with whom he stayed in Switzerland; his ardent socialist views frequently caused political arguments between the two. After hearing of the religion from his son-in-law Dr. Arthur Brauns (married to his daughter Martha), in 1920 he became a member of the Baháʼí Faith, abandoning his earlier racist and socialist views, writing in his will and testament, | https://en.wikipedia.org/wiki?curid=583572 | 1,648,573
1,221,988 | Researchers reported finding "Armillaria gallica" in the Upper Peninsula of Michigan in the early 1990s, during an unrelated research project to study the possible biological effects of extremely low frequency radio stations, which were being investigated as a means to communicate with submerged submarines. In one particular forest stand, "Armillaria"-infected oak trees had been harvested, and their stumps were left to rot in the field. Later, when red pines were planted in the same location, the seedlings were killed by the fungus, identified as "A. gallica" (then known as "A. bulbosa"). Using molecular genetics, they determined that the underground mycelia of one individual fungal colony covered , weighing over , with an estimated age of 1,500 years. The analysis used restriction fragment length polymorphism (RFLP) and random amplification of polymorphic DNA (RAPD) to examine isolates collected from fruit bodies and rhizomorphs (underground aggregations of fungal cells that resemble plant roots) along transects in the forest. The 15-hectare area yielded isolates that had identical mating type alleles and mitochondrial DNA restriction fragment patterns; this degree of genetic similarity indicated that the samples were all derived from a single genetic individual, or clone, that had reached its size through vegetative growth. In their conclusion, the authors noted: "This is the first report estimating the minimum size, mass, and age of an unambiguously defined fungal individual. Although the number of observations for plants and animals is much greater, members of the fungal kingdom should now be recognized as among the oldest and largest organisms on earth." After the "Nature" paper was published, major media outlets from around the world visited the site where the specimens were found; as a result of this publicity, the individual acquired the common name "humongous fungus". There was afterward some scholarly debate as to whether the fungus qualified to be considered in the same category as other large organisms such as the blue whale or the giant redwood. | https://en.wikipedia.org/wiki?curid=22450244 | 1,221,329 |
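The clone (genet) identification described above boils down to checking whether isolates share an identical multilocus profile: the same mating-type alleles and the same mitochondrial DNA restriction-fragment pattern. The toy sketch below groups isolates by such profiles; the marker names and genotypes are invented for illustration and are not the published data.

```python
from collections import defaultdict

# Hypothetical isolate records: (mating-type alleles, mtDNA RFLP pattern)
isolates = {
    "stump_01":    (("A1", "B3"), "mt-pattern-II"),
    "rhizo_07":    (("A1", "B3"), "mt-pattern-II"),
    "fruitbody_3": (("A1", "B3"), "mt-pattern-II"),
    "edge_12":     (("A2", "B5"), "mt-pattern-IV"),   # a different genet
}

def group_into_genets(records):
    """Isolates with identical multilocus profiles are treated as one
    genetic individual (genet) spread by vegetative growth."""
    genets = defaultdict(list)
    for name, profile in records.items():
        genets[profile].append(name)
    return list(genets.values())

for members in group_into_genets(isolates):
    print(members)
```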
1,543,895 | A variety of proposals have been made to implement nanocircuitry in different forms. These include nanowires, single-electron transistors, quantum dot cellular automata, and nanoscale crossbar latches. However, likely nearer-term approaches will involve incorporation of nanomaterials to improve MOSFETs (metal-oxide-semiconductor field-effect transistors). These currently form the basis of most analog and digital circuit designs, the scaling of which drives Moore's Law. A review article covering MOSFET design and its future, published in 2004, compared different geometries of MOSFETs under scale reduction and noted that circular cross-section vertical channel FETs are optimal for scale reduction. This configuration can be implemented at high density using vertical semiconductor cylindrical channels with nanoscale diameters, and Infineon Technologies and Samsung have begun research and development in this direction, resulting in some basic patents using nanowires and carbon nanotubes in MOSFET designs. In an alternative approach, Nanosys uses solution-based deposition and alignment processes to pattern pre-fabricated arrays of nanowires on a substrate to serve as the lateral channel of an FET. While not capable of the same scalability as single nanowire FETs, the use of pre-fabricated multiple nanowires for the channel increases reliability and reduces production costs, since large-volume printing processes may be used to deposit the nanowires at a lower temperature than conventional fabrication procedures. In addition, due to the lower-temperature deposition, a wider variety of materials such as polymers may be used as the carrier substrate for the transistors, opening the door to flexible electronic applications such as electronic paper, bendable flat panel displays, and wide-area solar cells. | https://en.wikipedia.org/wiki?curid=10214429 | 1,543,022
500,645 | Cushing Hall was built in 1933 thanks to a donation from John F. Cushing, president of the Great Lakes Dredge and Dock Company. Cushing graduated in 1906 with an engineering degree, but had almost dropped out for financial reasons. President Andrew Morrissey forgave his tuition, and in recognition of his kindness, Cushing later donated the $300,000 needed for the new building. The hall was designed by Kervick and Fagan in collegiate Gothic style; its exterior is decorated with the names of great scientists and engineers, and its interior with engineering-themed mosaics and frescoes. In 1977, Fitzpatrick Hall was built directly south of Cushing. Built by Ellerbe Associates, it was financed thanks to Edward B. Fitzpatrick, a New York construction executive and 1954 civil engineering graduate. Fitzpatrick's 184,960 square feet were added to Cushing Hall’s 104,898 square feet, more than doubling the school's space. Today, with its high-tech laboratories, Fitzpatrick Hall is the primary research, teaching, and computer center for the college, while Cushing is primarily used for office space. Stinson-Remick Hall of Engineering, built in 2009, is a $70 million, 160,000-square-foot building that hosts some of the most advanced facilities of the college. These include a nanotechnology research center with an 8,500-square-foot semiconductor processing and device fabrication cleanroom, which features industry-grade tools for production of integrated circuits and medical devices with nanometer-sized features. The Hessert Laboratory for Aerospace Research and the Hessert Laboratory at White Field offer a combined 84,000 square feet of space for aerospace research. Together, they house 19 major high-speed wind tunnels to provide near-flight conditions for research related to innovations in flight and flight speed, jet engine fuel efficiency, and other projects for commercial use, national defense, and space exploration. Hessert houses the facilities of the Institute for Flow Physics and Control (FlowPAC), one of the world’s largest research programs focused on fluid mechanics. Other facilities are dedicated to aero-acoustics, aero-optics, multiphase flow, fluid-structure interaction, general flow control, hypersonics, gas-turbine propulsion, wind energy, and sensor and flow actuator development. | https://en.wikipedia.org/wiki?curid=49273941 | 500,388
391,200 | When the Second World War did break out, it was swiftly realised that the power of contemporary aircraft allowed armour plate to be fitted to protect the pilot and other vulnerable areas. This innovation proved highly effective against rifle-calibre machine gun rounds, which tended to ricochet off harmlessly. Similarly, the introduction of self-sealing fuel tanks provided reliable protection against these small projectiles. These new defenses, together with the general robustness of new aircraft designs and their sheer speed (which made simply shooting them accurately far more difficult in the first place), meant that it took a great many such bullets, and a fair amount of luck, to inflict critical damage; a single cannon shell with a high-explosive payload, by contrast, could instantly sever essential structural elements, penetrate armour, or open up a fuel tank beyond the capacity of self-sealing compounds to counter, even from fairly long range. (Instead of explosives, such shells could carry incendiaries, also highly effective at destroying planes, or a combination of explosives and incendiaries.) Thus by the end of the war, the fighter aircraft of almost all the belligerents mounted cannon of some sort, the only exception being the United States, which in most cases favoured the Browning AN/M2 "light-barrel" .50 calibre heavy machine gun. A fighter equipped with these intermediate weapons in sufficient numbers was adequately armed for most of the Americans' combat needs aloft, as they tended to confront enemy fighters and other small planes far more often than large bombers, and as, in the earlier phases of the war, the Japanese aircraft they faced were not only unusually lightly built but lacked both armour plate and self-sealing tanks in order to reduce weight. Nevertheless, the U.S. also adopted planes fitted with autocannon, such as the Lockheed P-38 Lightning, despite experiencing technical difficulties with developing and manufacturing these large-calibre automatic guns. | https://en.wikipedia.org/wiki?curid=275006 | 391,005
1,076,422 | Due to the continued growth of the student population of the BCPS, and especially the growing demand for higher secondary education at high schools like BPI, BCC, and the girls' schools, the technical school relocated in 1913 to Calvert Street and North Avenue. The former 1860s mansion of the Maryland School for the Blind, sitting on a slight hill, was purchased and converted, and two massive wings were added on the east and west sides, with Greek Revival-style columns on the front facade. For the first time in its 30-year history, "Tech" had a suitable building expansive enough to handle both its academic and technical education requirements. By 1930, the old original central wing of "The Mansion" had been razed and replaced by a simpler center wing between the two flanking 1913 structures, and an enormous auditorium/gymnasium wing was constructed further to the east, facing North Avenue. This massive assembly hall was the largest in the city at the time and served many secular, civic, and cultural occasions and events for decades, into the mid-1980s. While at this location, the school expanded its academic, technical, and athletic programs under the long supervision of Dr. Wilmer Dehuff, the fourth principal, who served from 1921 to 1958 and reluctantly oversaw the racial integration of the school in 1952, the first admission of African-American (then called "Negro" or "Colored") students in City of Baltimore public schools, two years before the issue was finally addressed for the rest of the nation by the Supreme Court of the United States in May 1954 in the famous case of "Brown v. Board of Education of Topeka, Kansas". Previously, black students had attended Frederick Douglass High School (formerly the "Colored High School", the second oldest in the nation, founded in 1883, the same year as Poly) and the Paul Laurence Dunbar High School. After his 37-year career at Poly, Dehuff served as president and dean of faculty at the University of Baltimore on Mount Royal Avenue. | https://en.wikipedia.org/wiki?curid=913572 | 1,075,867
772,477 | Some MOF materials may resemble enzymes when they combine isolated polynuclear sites, dynamic host–guest responses, and a hydrophobic cavity environment, which are characteristics of an enzyme. Some well-known examples of cooperative catalysis involving two metal ions in biological systems include the diiron sites in methane monooxygenase, dicopper in cytochrome c oxidase, and tricopper oxidases, which have analogy with the polynuclear clusters found in 0D coordination polymers, such as the binuclear Cu paddlewheel units found in MOP-1 and in [Cu(btc)] (btc = benzene-1,3,5-tricarboxylate) in HKUST-1, or the trinuclear units found in MIL-88 and IRMOP-51. Thus, 0D MOFs have accessible biomimetic catalytic centers. In enzymatic systems, protein units show "molecular recognition", high affinity for specific substrates. It seems that molecular recognition effects are limited in zeolites by the rigid zeolite structure. In contrast, dynamic features and guest-shape response make MOFs more similar to enzymes. Indeed, many hybrid frameworks contain organic parts that can rotate as a result of stimuli, such as light and heat. The porous channels in MOF structures can be used as photocatalysis sites. In photocatalysis, the use of mononuclear complexes is usually limited either because they undergo only single-electron processes or because they require high-energy irradiation. In this case, binuclear systems have a number of attractive features for the development of photocatalysts. For 0D MOF structures, polycationic nodes can act as semiconductor quantum dots that can be activated by photostimuli, with the linkers serving as photon antennae. Theoretical calculations show that MOFs are semiconductors or insulators with band gaps between 1.0 and 5.5 eV, which can be altered by changing the degree of conjugation in the ligands. Experimental results show that the band gap of IRMOF-type samples can be tuned by varying the functionality of the linker. An integrated MOF nanozyme was developed for anti-inflammatory therapy. | https://en.wikipedia.org/wiki?curid=9821563 | 772,062
702,157 | "Hydlide 3", released for the MSX in 1987 and for the Mega Drive as "Super Hydlide" in 1989, adopted the morality meter of its predecessor, expanded on its time option with the introduction of an in-game clock setting day-night cycles and a need to sleep and eat, and made other improvements such as cut scenes for the opening and ending, a combat system closer to "The Legend of Zelda", the choice between four distinct character classes, a wider variety of equipment and spells, and a weight system affecting the player's movement depending on the overall weight of the equipment carried. That same year, Kogado Studio's sci-fi RPG "Cosmic Soldier: Psychic War" featured a unique "tug of war" style real-time combat system, where battles are a clash of energy between the party and the enemy, with the player needing to push the energy towards the enemy to strike them, while being able to use a shield to block or a suction ability to absorb the opponent's power. It also featured a unique non-linear conversation system, where the player can recruit allies by talking to them, choose whether to kill or spare an enemy, and engage enemies in conversation, similar to "Megami Tensei". Also in 1987, the survival horror game "Shiryō Sensen: War of the Dead", an MSX2 title developed by Fun Factory and published by Victor Music Industries, was the first true survival horror RPG. Designed by Katsuya Iwamoto, the game revolved around a female SWAT member, Lila, rescuing survivors in an isolated monster-infested town and bringing them to safety in a church. It was open-ended like "Dragon Quest" and had real-time side-view battles. Unlike other RPGs at the time, however, the game had a dark and creepy atmosphere expressed through the story, graphics, and music, while the gameplay used shooter-based combat and gave limited ammunition for each weapon, forcing the player to search for ammo and often run away from monsters in order to conserve ammo. That same year saw the release of "Laplace no Ma", another hybrid of survival horror and RPG, though with more traditional RPG elements such as turn-based combat. It was mostly set in a mansion infested with undead creatures, and the player controlled a party of several characters with different professions, including a scientist who constructs tools and a journalist who takes pictures. | https://en.wikipedia.org/wiki?curid=32408675 | 701,792
2,064,301 | Ghodssi received his Bachelor's (1990), Master's (1992), and Doctoral (1996) degrees in Electrical Engineering from the University of Wisconsin-Madison. He then performed his post-doctorate work at the Massachusetts Institute of Technology from 1997-1999, joining the faculty at the University of Maryland in 2000. Between 2009 and 2017, Ghodssi directed the ISR, launching a number of interdisciplinary initiatives such as the Maryland Robotics Center (MRC) and the Brain and Behavior Initiative (BBI), of which he served as the founding co-Director for six years (2015-2021). These initiatives are aimed at enhancing the impact of ISR's research on society; they also looked to build a more interactive faculty, staff, and student community across the different disciplines within ISR. Efforts at the MRC include advancing the underlying component technologies and the applications of robotics through a focus on interdisciplinary educational and research programs. Work at the BBI aims to revolutionize the interface between neuroscience and engineering by generating novel approaches and tools to understand the complex behaviors produced by the human brain. Part of Ghodssi's community-building also includes reaching out to industry and alumni: these efforts have resulted in a large number of industry-sponsored monthly seminar series as well as annual fellowships for graduate students and post-doctoral associates. Combined, these initiatives work to promote an active industry-oriented mentoring ecosystem in the Systems Engineering Education program in ISR. Dr. Ghodssi served as the President-elect of the Transducer Research Foundation (TRF) from 2020 until 2022. Since June 2022, he has been serving as the new President of TRF. The TRF is a nonprofit organization in the United States whose mission is to stimulate research in science and engineering, with emphasis on technologies related to transducers, microsystems, and nanosystems, and to foster the exchange of ideas and information between academic, industrial, and government researchers. | https://en.wikipedia.org/wiki?curid=53721687 | 2,063,110 |
1,364,461 | Beginning in 2004, the USMLE program undertook a comprehensive review of the USMLE, referred to as the Comprehensive Review of USMLE (CRU). The review was overseen by the Committee to Evaluate the USMLE Program (CEUP), which was composed of students, residents, clinicians, and members of the licensing, graduate, and undergraduate education communities. The goal of the committee was to determine if the mission and purpose of USMLE were effectively and efficiently supported by the current design, structure, and format of the USMLE. This process was to be guided, in part, by an analysis of information gathered from stakeholders, and was to result in recommendations to USMLE governance. The CEUP worked from 2006 to early 2008. The CEUP's final report states that "none of the feedback (received from other stakeholders) seemed to indicate that USMLE is broken, but there was considerable interest in enhancing and improving the program." Additionally, the report states that "there appeared to be very strong reactions to Step 2 CS, and CEUP felt that survey and stakeholder meeting data on this component needed to be interpreted in a special way by attempting to separate (but still be attentive to) issues related to the mechanics and costs of Step 2 CS versus the value of what the exam is intended to measure. On the issue of mechanics and costs, CEUP recognized that USMLE must be very attentive to the burden put on examinees by this testing format and that the impact on examinees must be considered when proposing future directions. Concerning the skills measured by Step 2 CS, there seemed to be legitimate concerns about content. Many people wanted to see the exam begin to assess whether the examinee can detect and interpret abnormal findings and handle challenging communication issues. There was a frequently expressed sentiment that this exam was ripe for enhancement and that many of the more advanced communication skills and other competencies could be assessed through this vehicle." In response to the feedback gathered, the CEUP recommended that "the assessment of clinical skills remain a component of USMLE, but that USMLE consider ways to further enhance the testing methods currently used, in order to address additional skills important to medical practice. It is also recommended that the administrative challenges and costs to examinees associated with related testing formats be given substantial weight in the consideration of future changes." | https://en.wikipedia.org/wiki?curid=1660985 | 1,363,705
1,813,923 | The Universal House of Justice, current head of the religion, through its Bahá'í International Community, released a statement in 1995, "The Prosperity of Humankind" which says in part: For the vast majority of the world’s population, the idea that human nature has a spiritual dimension—indeed that its fundamental identity is spiritual—is a truth requiring no demonstration. It is a perception of reality that can be discovered in the earliest records of civilization and that has been cultivated for several millennia by every one of the great religious traditions of humanity’s past. Its enduring achievements in law, the fine arts, and the civilizing of human intercourse are what give substance and meaning to history. In one form or another its promptings are a daily influence in the lives of most people on earth and, as events around the world today dramatically show, the longings it awakens are both inextinguishable and incalculably potent. and further Future generations … will find almost incomprehensible the circumstance that, in an age paying tribute to an egalitarian philosophy and related democratic principles, development planning should view the masses of humanity as essentially recipients of benefits from aid and training. Despite acknowledgment of participation as a principle, the scope of the decision making left to most of the world’s population is at best secondary, limited to a range of choices formulated by agencies inaccessible to them and determined by goals that are often irreconcilable with their perceptions of reality. The scholar Graham Hassall summarizes that statement saying it "demonstrates the breath-taking scope of the Bahá'í program of governance reform, from local to global levels, and encompasses not only political and legal fundamentals, but the roles of science and technology in the global distribution of knowledge and power." and university professor Sabet Behrooz called "…a brilliant statement …(showing) the necessity of harmony between science and religion…(which) must be the guiding light and the organizing principle of our endeavors in integrative studies of the Bahá'í Faith." | https://en.wikipedia.org/wiki?curid=2353883 | 1,812,889 |
1,516,812 | Miguel Pérez Carreño (Valencia, 1904 – Caracas, 1966) was a physician, researcher, scientist, university professor and writer. He graduated with a bachelor's degree from the Central University of Venezuela with a thesis called "Calor animal" (Animal heat). In 1920 he re-entered the university to study medicine, and before graduating he worked as a clinical monitor. He earned a PhD in Medical Sciences in October 1926 with the presentation of the thesis "Autoseroterapia de los derrames" (Auto-serum therapy of effusions) and then devoted himself largely to teaching. Between 1933 and 1934 he completed his academic training at hospitals in New York City, Paris and Vienna. He considered diagnosis an art that had to be accomplished not only through the clinical history, but through long, sustained conversation with the patient about their health problems and living conditions. Beginning in 1936, he worked on the study, analysis and evaluation of definitive treatments for surgical diseases. In Venezuela he performed a series of interventions including presacral nerve resection in the treatment of pelvic neuralgia, resection of the rectum with a permanent artificial anus (anus contra natura) (1932), ovarian homografts (1936), a new technique of lymphatic blockade in infectious processes carried out with electrosurgery combined with sulfonamide therapy (1938), the radical cure of rectal prolapse with fascia lata (the aponeurosis of the thigh), ligation of the femoral artery for gangrene, and embolectomy for phlebitis. He also contributed to improving the treatment of Banti syndrome (abnormal growth of the spleen) and portal hypertension (usually caused by liver cirrhosis). Active in the Caracas Polyclinic, the José María Vargas Hospital and the University Hospital, Pérez Carreño was head of descriptive practical anatomy procedures, head of surgical medicine, chief of clinical surgery and dean of the Faculty of Medicine, among other duties. He spent part of his last years on cancer research. | https://en.wikipedia.org/wiki?curid=29302481 | 1,515,960
198,301 | Living Breakwaters is a $67 million strategy that aims to help a waterfront neighborhood called Tottenville, in the southwestern part of New York City, survive rising sea levels and storm surges caused by climate change. The strategy connects aquatic landscape architecture, science education, waste collection and coastal housing politics through the implementation of a 13,000 ft long breakwater with oysters planted along it. Each oyster planted is capable of filtering 50 gallons of water per day, removing pollutants and toxins to help create a cleaner harbor. The reproductive shimmy of the bivalves attracts many marine species, fostering significant ecosystem growth. Pete Malinowski, from the Billion Oyster Project, states that of the millions of oyster spawn that become babies, hardly any survive due to the lack of surfaces to settle on. Therefore, the implementation of the reef will enhance the likelihood of oyster survival. This concept of using oysters to fight climate change in coastal towns, “oyster-tecture”, was introduced in 2010 by landscape architect Kate Orff. Orff and her team created soft infrastructure made from fuzzy rope that allows for the seeding of oysters along points of the harbor. Living Breakwaters was awarded the Rebuild By Design grant in 2014 and was implemented by the New York State Governor’s Office of Storm Recovery. The grant additionally includes an on-shore building, the “water hub”, to host community activities near the breakwater. Simultaneously, NY Rising, the state’s civic rebuilding process, matched this federally funded initiative with the Tottenville Dune and Coastal Dune Plantings project. The breakwaters will strengthen the vegetated dune system, providing protection to the beachside community in Tottenville by tempering beach erosion. Additionally, northern and eastern expansion of Living Breakwaters is expected along Staten Island at Lemon Creek and Great Kills. As a result, Living Breakwaters provides educational opportunities for Tottenville youth, and overall economic growth for the town. | https://en.wikipedia.org/wiki?curid=1018652 | 198,199
666,805 | After the war he led the formation of the International Union of Crystallography and was elected its first president. He reorganised the Cavendish into units to reflect his conviction that "the ideal research unit is one of six to twelve scientists and a few assistants, helped by one or more first-class instrument mechanics and a workshop in which the general run of apparatus can be constructed." Senior members of staff now had offices, telephones and secretarial support. The scope of the department was enlarged with a new unit on radio astronomy. His own work focused on the structure of metals, using both X-rays and the electron microscope. In 1947 he persuaded the Medical Research Council (MRC) to support what he described as the "gallant attempt" to determine protein structure, as the Laboratory of Molecular Biology, initially consisting of Perutz, John Kendrew and two assistants. Bragg worked with them, and by 1960 they had resolved the structure of myoglobin to the atomic level. After this he was less involved; their analysis of haemoglobin was easier after they incorporated two mercury atoms as markers in each molecule. The first monumental triumph of the MRC was the decoding of the structure of DNA by James Watson and Francis Crick. Bragg announced the discovery at a Solvay conference on proteins in Belgium on 8 April 1953, but it went unreported by the press. He then gave a talk at Guy's Hospital Medical School in London on Thursday 14 May 1953, which resulted in an article by Ritchie Calder in the "News Chronicle" of London on Friday 15 May 1953, entitled "Why You Are You. Nearer Secret of Life". Bragg nominated Crick, Watson and Maurice Wilkins for the 1962 Nobel Prize in Physiology or Medicine; Wilkins' share recognised the contribution of X-ray crystallographers at King's College London. Among them was Rosalind Franklin, whose "photograph 51" showed that DNA was a double helix, not the triple helix that Linus Pauling had proposed. Franklin died before the prize (which only goes to living people) was awarded. | https://en.wikipedia.org/wiki?curid=303544 | 666,457
1,773,503 | The upper photo on the right is a 50-times magnification of the combined photomasks used to fabricate the Hughes H4040, the linear silicon photodiode array used in the Thematic Mapper to image the visible bands. Each of the 16 photodiodes is 100 microns square, and their separation is 100 microns. There are two rows because the array is scanned perpendicular to the lines of diodes, and together they produce a complete line with no gaps. The alignment marks and their layer names can be seen at each end. Each layer is a different color. The pink layer is a second layer of aluminium acting as an aperture. The openings had to be exactly 100 µm square. The exact dimensions were required in order to achieve a 30 meter resolution on the ground. A set of four of these was fabricated, the fabrication process was documented to NASA requirements, and the dimensions were verified, as part of my employment at Hughes Aircraft Company's Industrial Products Division in Carlsbad, California in 1978. The challenge was to customize each of these for one of the narrow visible bands that were required. To do that, the thickness of the silicon nitride antireflective layer had to meet a precise target; for example, for one band the target was 120 nm while the next band required 130 nm. In addition, all 16 photodiodes in the array had to have the same thickness. At the time, all that was available to manufacture this film was an atmospheric deposition system that basically burned silane (SiH4) in the presence of ammonia in a horizontal tube heated to about 850°C. But that process yielded a smoky film that varied significantly over the silicon wafer and the diode array. So, with the help of a workmate who had invented low-pressure (LPCVD) polysilicon deposition at Motorola a few years earlier, a low-pressure silicon nitride system was built using silane and ammonia to produce the precisely tunable and uniform thicknesses needed for each of the bands. Without that innovation there would have been no TM. The boron-diffused photodiodes had to have very low dark current and high photosensitivity in order to meet the imager specifications. The large crosses visible on the pattern were used when the different arrays were aligned together at final assembly in the Thematic Mapper. The final assembly of the TM focal plane, without filters, is shown in the second photo to the right. They flew in Landsat 4 and remained in operation for 20 years. | https://en.wikipedia.org/wiki?curid=11064666 | 1,772,506
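The passage explains that each band required its own precise nitride thickness but does not spell out the optical reasoning. A rough way to see why thickness tracks wavelength is the idealized single-layer quarter-wave antireflection condition; the refractive index below (about 2.0 for LPCVD silicon nitride) and the resulting wavelengths are assumptions for illustration, not values from the text.

```python
# Idealized quarter-wave antireflection estimate, showing why the nitride
# thickness must be tuned per spectral band. n = 2.0 is an assumed nominal
# index for LPCVD silicon nitride; the real multi-layer coating design and
# the band each thickness served are not specified in the passage above.
N_NITRIDE = 2.0

def center_wavelength_nm(thickness_nm: float, n: float = N_NITRIDE) -> float:
    """Wavelength at which a film of this thickness is one quarter-wave thick."""
    return 4.0 * n * thickness_nm

for t_nm in (120.0, 130.0):  # the two target thicknesses mentioned in the text
    print(f"{t_nm:.0f} nm of nitride is quarter-wave near {center_wavelength_nm(t_nm):.0f} nm")
```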
190,523 | "Haruna" was laid down at Kobe by Kawasaki on 16 March 1912, launched 14 December 1913, and formally commissioned 19 April 1915. After a short patrolling duty off Sasebo, "Haruna" suffered a breech explosion during gunnery drills on 12 September 1920; seven crewmen were killed and the No. 1 turret badly damaged. After a long period of time in reserve, "Haruna" underwent her first modernization from 1926 to 1928. The process upgraded her propulsion capabilities, enabled her to carry and launch floatplanes, increasing her armour capacity by over 4,000 tons, and was shortly thereafter reclassified as a Battleship. She was overhauled a second time from 1933 to 1935, which additionally strengthened her armour and reclassified her as a fast battleship. During the Second Sino-Japanese War, "Haruna" primarily served as a large-scale troop transport for Japanese troops to the Chinese mainland. On the eve of the commencement of World War II, "Haruna" sailed as part of Vice-Admiral Nobutake Kondō's Southern Force. On 8 December 1941, "Haruna" provided heavy support for the invasion of Malaya and Singapore. She participated in the major Japanese offensives in the southern and southwestern Pacific in early 1942, before sailing as part of the carrier-strike force during the Battle of Midway. "Haruna" bombarded American positions at Henderson Field at Guadalcanal, and provided escort to carriers during the Solomon Islands campaign. In 1943, she deployed as part of a larger force on multiple occasions to counter the threat of American carrier strikes, but did not actively participate in a single battle. In 1944, "Haruna" was an escort during the Battle of the Philippine Sea and fought American surface vessels off Samar during the Battle of Leyte Gulf. She was the only one of the four battleships in her class to survive 1944. "Haruna" remained at Kure throughout 1945, where she was sunk by aircraft of Task Force 38 on 28 July 1945, after taking nine bomb hits at her moorings. She was subsequently raised and broken up for scrap in 1946. | https://en.wikipedia.org/wiki?curid=3636947 | 190,424 |
81,748 | When referring to laptop displays or independent displays and projectors intended primarily for use with computers, WXGA is also used to describe a resolution of 1280 × 800 pixels, with an aspect ratio of 16:10. This was once particularly popular for laptop screens, usually with a diagonal screen size of between 12 and 15 inches, as it provided a useful compromise between 4:3 XGA and 16:9 WXGA, with improved resolution in "both" dimensions vs. the old standard (especially useful in portrait mode, or for displaying two standard pages of text side by side), a perceptibly "wider" appearance and the ability to display 720p HD video "native" with only very thin letterbox borders (usable for on-screen playback controls) and no stretching. Additionally, it required only 1000 KB (just under 1 MB) of memory per 8-bit channel; thus, a typical double-buffered 32-bit colour screen could fit within 8 MB, limiting everyday demands on the complexity (and cost, energy use) of integrated graphics chipsets and their shared use of typically sparse system memory (generally allocated to the video system in relatively large blocks), at least when only the internal display was in use (external monitors generally being supported in "extended desktop" mode). 16:10 (or 8:5) is itself a rather "classic" computer aspect ratio, harking back to early 320 × 200 modes (and their derivatives) as seen in the Commodore 64, IBM CGA card and others. However, as of mid-2013, this standard was becoming increasingly rare, crowded out by more standardised and thus more economical-to-produce 16:9 panels, as its previously beneficial features became less important with improvements to hardware, gradual loss of general backwards software compatibility, and changes in interface layout. As of August 2013, the market availability of panels with 1280 × 800 native resolution had been generally relegated to data projectors or niche products such as convertible tablet PCs and LCD-based eBook readers. | https://en.wikipedia.org/wiki?curid=28385304 | 81,715
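A quick sanity check of the memory arithmetic in the passage, using the 1280 × 800 pixel count discussed above (KB and MB here are the binary units the passage implies):

```python
# Framebuffer memory check for a 1280 x 800 panel, following the passage's
# reasoning: ~1000 KB per 8-bit channel, and a double-buffered 32-bit frame
# fitting within 8 MB.
width, height = 1280, 800
pixels = width * height                      # 1,024,000 pixels

per_channel_kb = pixels / 1024               # one 8-bit channel -> 1000.0 KB
frame_32bit_mb = pixels * 4 / (1024 ** 2)    # 32 bits per pixel -> ~3.9 MB
double_buffered_mb = 2 * frame_32bit_mb      # two frames -> ~7.8 MB, under 8 MB

print(f"{per_channel_kb:.1f} KB per 8-bit channel")
print(f"{frame_32bit_mb:.2f} MB per 32-bit frame")
print(f"{double_buffered_mb:.2f} MB double-buffered")
```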
996,076 | There are also advanced professional ballistic models like PRODAS available. These are based on six degrees of freedom (6 DoF) calculations. 6 DoF modeling accounts for x, y, and z position in space along with the projectile's pitch, yaw, and roll rates. 6 DoF modeling needs such elaborate data input, knowledge of the employed projectiles, and such expensive data collection and verification methods that it is impractical for non-professional ballisticians, but not impossible for the curious, computer-literate, and mathematically inclined. Semi-empirical aeroprediction models have been developed that reduced extensive test range data on a wide variety of projectile shapes, normalizing dimensional input geometries to calibers; accounting for nose length and radius, body length, and boattail size; and allowing the full set of 6-dof aerodynamic coefficients to be estimated. Early research on spin-stabilized aeroprediction software resulted in the SPINNER computer program. The FINNER aeroprediction code calculates 6-dof inputs for fin-stabilized projectiles. Solids modeling software that determines the projectile parameters of mass, center of gravity, and axial and transverse moments of inertia necessary for stability analysis is also readily available and simple to program. Finally, algorithms for 6-dof numerical integration suitable to a 4th order Runge-Kutta are readily available. All that is required for the amateur ballistician to investigate the finer analytical details of projectile trajectories, along with bullet nutation and precession behavior, is the determination to program them. Nevertheless, for the small arms enthusiast, aside from academic curiosity, being able to predict trajectories to 6-dof accuracy is probably not of practical significance compared to more simplified point mass trajectories based on published bullet ballistic coefficients. 6 DoF is generally used by the aerospace and defense industry and military organizations that study the ballistic behavior of a limited number of (intended) military issue projectiles. Calculated 6 DoF trends can be incorporated as correction tables in more conventional ballistic software applications. | https://en.wikipedia.org/wiki?curid=584911 | 995,558
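The passage contrasts full 6 DoF modeling with the simpler point-mass trajectories and notes that 4th-order Runge-Kutta integration is readily available. The sketch below shows only that simpler point-mass approach (it is not a 6 DoF model); the drag constant, muzzle velocity and step size are illustrative placeholders, not values from any published ballistic table.

```python
# Minimal point-mass trajectory integrated with classical 4th-order
# Runge-Kutta. All numeric constants are illustrative assumptions.
import math

G = 9.81          # gravitational acceleration, m/s^2
K_DRAG = 0.0006   # assumed lumped quadratic-drag constant (1/m); placeholder

def deriv(state):
    """state = (x, y, vx, vy); returns d(state)/dt for a drag ~ v^2 model."""
    x, y, vx, vy = state
    v = math.hypot(vx, vy)
    return (vx, vy, -K_DRAG * v * vx, -G - K_DRAG * v * vy)

def rk4_step(state, dt):
    """One classical 4th-order Runge-Kutta step."""
    k1 = deriv(state)
    k2 = deriv(tuple(s + 0.5 * dt * k for s, k in zip(state, k1)))
    k3 = deriv(tuple(s + 0.5 * dt * k for s, k in zip(state, k2)))
    k4 = deriv(tuple(s + dt * k for s, k in zip(state, k3)))
    return tuple(s + dt / 6.0 * (a + 2 * b + 2 * c + d)
                 for s, a, b, c, d in zip(state, k1, k2, k3, k4))

# Example: projectile fired horizontally at an assumed 800 m/s.
state, dt = (0.0, 0.0, 800.0, 0.0), 0.001
while state[0] < 500.0:                      # integrate out to 500 m of range
    state = rk4_step(state, dt)

drop = -state[1]
speed = math.hypot(state[2], state[3])
print(f"bullet drop at 500 m: {drop:.2f} m, remaining speed: {speed:.0f} m/s")
```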
1,516,797 | Gabriel A. Rincon-Mora (born in Caracas in 1972) is a Venezuelan-American electrical engineer, scientist, professor, inventor, and author who was elevated to the grade of Fellow by the Institute of Electrical and Electronics Engineers (IEEE) in 2011 and to the grade of Fellow by the Institution of Engineering and Technology (IET) in 2009 for his contributions to energy-harvesting and power-conditioning integrated circuits (ICs). "Hispanic Business Magazine" voted him one of "The 100 Most Influential Hispanics" in 2000, the Society of Hispanic Professional Engineers (SHPE) awarded him the National Hispanic in Technology Award in 2000, Florida International University (FIU) awarded him the Charles E. Perry Visionary Award in 2000, the Georgia Institute of Technology inducted him into its Council of Outstanding Young Engineering Alumni in 2000, and former lieutenant governor Cruz Bustamante of California presented him a Commendation Certificate in 2001. Rincón-Mora grew up in Maracay, and migrated to the United States when he was 11 years old. He graduated at Florida International University as Electrical Engineer in 1992, Georgia Tech with a Master of Science degree in electrical engineering with a minor in mathematics in 1994, and Georgia Tech with a PhD in electrical engineering in 1996 with a dissertation on "Current Efficient, Low Voltage, Low Dropout Regulators" (Advisor: Prof. Phil Allen). He worked for Texas Instruments from 1994 to 2003, was an adjunct professor for the School of Electrical and Computer Engineering at Georgia Tech (1999–2001), professor at Georgia Tech since 2001 and visiting professor at National Cheng Kung University (NCKU) in Taiwan since 2011. He has written several books, chapters of others, and over 160 other publications. His work has generated 38 patents. He has designed over 26 commercial power-chip designs and delivered over 95 presentations worldwide. his publications had been cited over 5200 times. His work and research is on the design and development of silicon-based microsystems that draw and condition power from tiny batteries, fuel cells, and generators that harness ambient energy from motion, light, temperature, and radiation to supply mobile, portable, and self-sustaining devices such as wireless microsensors for biomedical, consumer, industrial, and military applications. He has worked on voltage references, low-dropout regulators, switching dc–dc converters, and energy-harvesting microsystems. | https://en.wikipedia.org/wiki?curid=29302481 | 1,515,945 |
2,138,015 | The extension of the World Wide Web architecture into the product is important to understand, as all decisions for the manufacturing of spare parts, scheduling of flights, and other factory OEM and airline operator functions are driven primarily by what happens to the product in the field (chiefly its rate of wear and impending failures). Predicting the rate of wear, and hence the impact on operations and the forecasting of future spare parts production, is critical for optimizing operations for all involved. Managing a complex system such as a fleet of aircraft, vehicles or fixed-location products can be accomplished in this manner. For example, coupled with technologies such as RFID, the system could track parts from the factory to the aircraft on board, then continue to read the configuration of the subsystem's replaceable tagged parts, map their configuration to hours run and duty cycles, then process and communicate the projected wear rate through the World Wide Web back to the operator or factory. In this way mechanical wear rates and future failures can be predicted more accurately, and the forecasting of spare parts manufacturing and shipment can be significantly improved. This is called Prognostics Health Monitoring (PHM), which has become possible in recent years with the advent of electronic controllers; it is a recent evolutionary step in aircraft support and maintenance management that began as individual processes prior to World War II and solidified into a manual tracking system to support aircraft fleets in the Korean War. Support for the mechanic comes in the form of local wireless access to technical information stored, and remotely updated, on the on-board micro-webserver component relevant to that product, such as service bulletins, factory updates, fault-code-driven, intelligent 3D computer-game-like maintenance procedures, and social media applications for sharing product issues and maintenance procedure improvements in the field, including collaborative two-way voice, text and image communications. Note that this architecture can be utilized on any system that requires monitoring and trending, including mobile medical applications for monitoring the functionality of human systems when the subject is equipped with data sensors. | https://en.wikipedia.org/wiki?curid=16620919 | 2,136,785
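As an illustration of the wear projection described above, here is a deliberately simple sketch that extrapolates remaining life from reported operating hours and measured wear. The linear model, the part names and every number below are assumptions made for illustration; they are not part of any PHM standard or system mentioned in the text.

```python
# Toy remaining-useful-life projection of the kind a prognostics pipeline
# might run on usage data reported back from the field. The linear wear
# model and all values are illustrative placeholders.
from dataclasses import dataclass

@dataclass
class PartUsage:
    part_id: str
    hours_run: float      # operating hours reported by the on-board controller
    wear_fraction: float  # measured or inferred wear: 0.0 = new, 1.0 = at limit

def hours_to_wear_limit(u: PartUsage) -> float:
    """Linear extrapolation: assume wear keeps accumulating at the observed average rate."""
    if u.wear_fraction <= 0.0:
        return float("inf")
    rate_per_hour = u.wear_fraction / u.hours_run
    return (1.0 - u.wear_fraction) / rate_per_hour

fleet = [
    PartUsage("pump-A17", hours_run=1800.0, wear_fraction=0.62),
    PartUsage("pump-B02", hours_run=400.0, wear_fraction=0.08),
]
for part in sorted(fleet, key=hours_to_wear_limit):
    print(f"{part.part_id}: ~{hours_to_wear_limit(part):,.0f} operating hours to wear limit")
```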
144,032 | Schools are continuously updating their curricula to keep up with accelerating technological developments. This often includes computers in the classroom, the use of educational software to teach curricula, and course materials being made available to students online. Students are often taught literacy skills such as how to verify credible sources online, cite websites, and prevent plagiarism. Google and Wikipedia are frequently used by students "for everyday life research," and are just two common tools that facilitate modern education. Digital technology has impacted the way material is taught in the classroom. With the use of technology rising over the past decade, educators are altering traditional forms of teaching to include course material on concepts related to digital literacy. Educators have also turned to social media platforms to communicate and share ideas with one another. Social media and social networks have become a crucial part of the information landscape. Many students are using social media to share their areas of interest, which helps boost their level of engagement with educators. New standards have been put into place as digital technology has augmented classrooms, with many classrooms being designed to use smartboards and audience response systems in replacement of traditional chalkboards or whiteboards. “The development of Teacher’s Digital Competence (TDC) should start in initial teacher training, and continue throughout the following years of practice. All this to use Digital Technologies (DT) to improve teaching and professional development.” New models of learning are being developed with digital literacy in mind. Several countries have based their models on the emphasis of finding new digital didactics to implement as they find more opportunities and trends through surveys conducted with educators and college instructors. Additionally, these new models of learning in the classroom have aided in promoting global connectivity and have enabled students to become globally-minded citizens. According to the study Building Digital Literacy Bridges Connecting Cultures and Promoting Global Citizenship in Elementary Schools through School-Based Virtual Field Trips by Stacy Delacruz, Virtual Field Trips (VFT) a new form of multimedia presentation has gained popularity over the years in that they offer the "opportunity for students to visit other places, talk to experts and participate in interactive learning activities without leaving the classroom". They have also been used as a vessel for supporting cross-cultural collaboration amongst schools which includes: "improved language skills, greater classroom engagement, deeper understandings of issues from multiple perspectives, and an increased sensitivity to multicultural differences". It also allows students to be the creators of their own digital content, a core standard from The International Society for Technology in Education (ISTE). | https://en.wikipedia.org/wiki?curid=5169750 | 143,974 |
148,034 | In 1887, Heinrich Rudolf Hertz discovered but could not explain the photoelectric effect, which was later explained in 1905 by Albert Einstein (Nobel Prize in Physics 1921). Two years after Einstein's publication, in 1907, P.D. Innes experimented with a Röntgen tube, Helmholtz coils, a magnetic field hemisphere (an electron kinetic energy analyzer), and photographic plates, to record broad bands of emitted electrons as a function of velocity, in effect recording the first XPS spectrum. Other researchers, including Henry Moseley, Rawlinson and Robinson, independently performed various experiments to sort out the details in the broad bands. After WWII, Kai Siegbahn and his research group in Uppsala (Sweden) developed several significant improvements in the equipment, and in 1954 recorded the first high-energy-resolution XPS spectrum of cleaved sodium chloride (NaCl), revealing the potential of XPS. A few years later in 1967, Siegbahn published a comprehensive study of XPS, bringing instant recognition of the utility of XPS and also the first hard X-ray photoemission experiments, which he referred to as Electron Spectroscopy for Chemical Analysis (ESCA). In cooperation with Siegbahn, a small group of engineers (Mike Kelly, Charles Bryson, Lavier Faye, Robert Chaney) at Hewlett-Packard in the US, produced the first commercial monochromatic XPS instrument in 1969. Siegbahn received the Nobel Prize for Physics in 1981, to acknowledge his extensive efforts to develop XPS into a useful analytical tool. In parallel with Siegbahn's work, David Turner at Imperial College London (and later at Oxford University) developed ultraviolet photoelectron spectroscopy (UPS) for molecular species using helium lamps. | https://en.wikipedia.org/wiki?curid=70847 | 147,975 |
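The passage traces XPS back to the photoelectric effect without stating the working relation; the standard energy balance used to convert a measured kinetic energy into a binding energy is the following (a textbook relation, with the spectrometer work function written explicitly):

```latex
% Photoemission energy balance underlying XPS:
\[
  E_{\text{binding}} = h\nu - E_{\text{kinetic}} - \phi
\]
% where $h\nu$ is the X-ray photon energy, $E_{\text{kinetic}}$ the measured
% kinetic energy of the emitted electron, and $\phi$ the work function of the
% spectrometer.
```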
1,642,576 | Following his PhD, Huxley continued research on the structure and function of muscle. Since Cambridge did not have electron microscopy, which was beginning to be used for biological studies at the time, he went to the Massachusetts Institute of Technology as a postdoctoral fellow on a Commonwealth Fellowship in the late summer of 1952. He worked in F. O. Schmitt's laboratory, where he was joined by Jean Hanson in 1953. Their collaboration proved fruitful, as they discovered the so-called "sliding filament theory" of muscle contraction. Their publication in the 22 May 1954 issue of "Nature" became a landmark in muscle physiology. He returned to the MRC unit at Cambridge in the late spring of 1954. Using X-ray diffraction he studied the molecular interactions in muscle fibres. The LMB was by then equipped with an electron microscope, but it still had technical issues. Recognizing his potential, University College London appointed him to its faculty, and he moved there to join Bernard Katz's biophysics department in 1955. For his work he was bought a new electron microscope with funds from the Wellcome Trust. His innovative contribution was a modified version of the thin-sectioning microtome, with which he could cut histological sections only 100–150 Å thick. Combined with his LMB X-ray diffraction images, the new technique immediately helped him to establish the cross-bridge concept (the interaction sites of the muscle proteins myosin and actin). As the MRC unit was enlarged he was invited back in 1962, with a research fellowship at King's College for five years and then a more permanent one at Churchill College. He became the joint Head of the Structural Studies Division of the LMB in 1975, and its Deputy Director in 1979. In 1969, on the basis of his work over more than 15 years, he finally formulated the "swinging cross-bridge hypothesis", which describes the molecular basis of muscle contraction. The concept became fundamental to understanding other types of cell motility as well. In 1987 he joined the biology faculty at Brandeis University in Waltham, Massachusetts, where he also served as Director of the Rosenstiel Basic Medical Sciences Research Center, becoming emeritus in 1997 and remaining so until his death. | https://en.wikipedia.org/wiki?curid=3383286 | 1,641,649
1,253,837 | Beginning with the Meiji Restoration of 1868, which established a new, centralized regime, Japan set out to "gather wisdom from all over the world" and embarked on an ambitious program of military, social, political, and economic reforms that transformed it within a generation into a modern nation-state and major world power. "Propelled by both fear and discontent with the old regime, they generated an ambitious agenda, through a process of trial and error, aiming to build a new sort of national power". Multiple new policies were brought forth; one of them was the Charter Oath: "The oath called for an assembly of daimyo in which decisions would be made after open discussion; "the high and the low" (samurai and commoners) to administer together financial affairs; both military and "common people" to be allowed to fulfill their goals without strife; past evil practices to be abandoned and accepted world precepts followed; and, finally, knowledge to be sought worldwide to strengthen the foundation of imperial rule" (Hopper, p. 57). The Meiji oligarchy was aware of Western progress, and "learning missions" were sent abroad to absorb as much of it as possible. The Iwakura Mission, the most important of these, was led by Iwakura Tomomi, Kido Takayoshi and Ōkubo Toshimichi; it contained forty-eight members in total and spent two years (1871–73) touring the United States and Europe, studying every aspect of modern nations, such as government institutions, courts, prison systems, schools, the import-export business, factories, shipyards, glass plants, mines, and other enterprises. Upon returning, mission members called for domestic reforms that would help Japan catch up with the West. | https://en.wikipedia.org/wiki?curid=2998679 | 1,253,157
1,273,498 | In 1936 Beadle left the California Institute of Technology to become Assistant Professor of Genetics at Harvard University. A year later he was appointed Professor of Biology (Genetics) at Stanford University and there he remained for nine years, working for most of this period in collaboration with Tatum. This work of Beadle and Tatum led to an important generalization. This was that most mutants unable to grow on minimal medium, but able to grow on “complete” medium, each require addition of only one particular supplement for growth on minimal medium. If the synthesis of a particular nutrient (such as an amino acid or vitamin) was disrupted by mutation, that mutant strain could be grown by adding the necessary nutrient to the minimal medium. This finding suggested that most mutations affected only a single metabolic pathway. Further evidence obtained soon after the initial findings tended to show that generally only a single step in the pathway is blocked. Following their first report of three such auxotroph mutants in 1941, Beadle and Tatum used this method to create series of related mutants and determined the order in which amino acids and some other metabolites were synthesized in several metabolic pathways. The obvious inference from these experiments was that each gene mutation affects the activity of a single enzyme. This led directly to the one gene-one enzyme hypothesis, which, with certain qualifications and refinements, has remained essentially valid to the present day. As recalled by Horowitz, the work of Beadle and Tatum also demonstrated that genes have an essential role in biosynthesis. At the time of the experiments (1941), non-geneticists still generally believed that genes governed only trivial biological traits, such as eye color, and bristle arrangement in fruit flies, while basic biochemistry was determined in the cytoplasm by unknown processes. Also, many respected geneticists thought that gene action was far too complicated to be resolved by any simple experiment. Thus Beadle and Tatum brought about a fundamental revolution in our understanding of genetics. | https://en.wikipedia.org/wiki?curid=767277 | 1,272,806 |
676,587 | The Confederacy was beset by growing problems as its territory steadily shrank, its people grew impoverished, and hopes of victory changed from reliance on Confederate military prowess to dreams of foreign intervention, to finally a desperate hope that the Yankees would grow so weary of war they would sue for peace. The South lost its lucrative export market as the Union blockade shut down all commercial traffic, with only very expensive blockade runners getting in and out. In 1861 the South lost most of its border regions, with Maryland, Kentucky and Missouri gained for the enemy, and western Virginia broken off. The Southern transportation system depended on a river system that the Union gunboats soon dominated, as control of the Mississippi, Missouri, Cumberland, and Tennessee rivers fell to the Union in 1862–63. That meant all the river towns fell to the Union as well, and so did New Orleans in 1862. The rickety railroad system was not designed for long-distance traffic (it was meant to haul cotton to the nearest port), and it steadily deteriorated until by the end practically no trains were running. Civilian morale and recruiting held up reasonably well, as did the morale of the army, until the last year or so. The Confederacy had democratic elections (for all white men), but no political parties. One result was that governors became centers of opposition to Jefferson Davis and his increasingly unpopular central administration in Richmond. Financially the South was in bad shape as it lost its export market, and internal markets failed one after the other. By 1864 women in the national capital were rioting because of soaring food prices they could not afford. With so few imports available, it was necessary to make do, use ersatz (such as local beans for coffee), use up, and do without. The large slave population never rose up in armed revolt, but black men typically took the first opportunity to escape to Union lines, where over 150,000 enrolled in the Union army. When the end came the South had a shattered economy, 300,000 dead, hundreds of thousands wounded, and millions impoverished, but three million former slaves were now free. | https://en.wikipedia.org/wiki?curid=752072 | 676,234 |
1,740,901 | The revolutionary aspect of the aircraft is meant to be its time-on-station cost, planned to be only 20 percent of the cost per hour of current aerial surveillance aircraft like the Predator, MQ-9 Reaper, and MC-12W Liberty, achieved through efficient aerodynamics and propulsion, a lightweight airframe, reliable systems, autonomous operation, and fewer takeoffs and landings. Its ferry range is projected to be longer than even the Global Hawk's, which enables a time-on-station capability ranging from 113 hours at shorter operating radii down to 47 hours at longer ones. With its long mission range, the Orion can be positioned much further from the patrol area, reducing the costs that would otherwise be needed to transport an aircraft to a closer main operating base; unit price is expected to be less than the Reaper's. It has the capacity to carry sensors and weapons spread through the airframe, with payload stations in the nose, in the aft fuselage, and under the wings. The base sensor is the Raytheon MTS-B electro-optical/infrared turret, but options can include a ground moving target indication (GMTI) radar under the nose, a multi-camera wide-area surveillance sensor in the aft bay, and external fuel tanks and Hellfire missiles under the wings. Its wingspan is only slightly longer than the Global Hawk's; the long-span, one-piece, low-drag, lightweight composite wing runs from tip to tip, which reduces weight and cost but prevents it from being disassembled and airlifted to another location. Top speed is slow by design, to balance fuel efficiency and power consumption with weather tolerance. Propulsion comes from a pair of Austro Engine AE300 diesel engines rather than more expensive and less fuel-efficient gas turbines. Although it is designed to fly for five days carrying its standard payload weight, it could fly for a week with a lighter payload. | https://en.wikipedia.org/wiki?curid=49014171 | 1,739,920
557,994 | Neuromechanics is the coupling of neurobiology, biomechanics, sensation and perception, and robotics (Edwards 2010). Researchers are using advanced techniques and models to study the mechanical properties of neural tissues and their effects on the tissues' ability to withstand and generate force and movements as well as their vulnerability to traumatic loading (Laplaca & Prado 2010). This area of research focuses on translating the transformations of information among the neuromuscular and skeletal systems to develop functions and governing rules relating to operation and organization of these systems (Nishikawa et al. 2007). Neuromechanics can be simulated by connecting computational models of neural circuits to models of animal bodies situated in virtual physical worlds (Edwards 2010). Experimental analysis of biomechanics including the kinematics and dynamics of movements, the process and patterns of motor and sensory feedback during movement processes, and the circuit and synaptic organization of the brain responsible for motor control are all currently being researched to understand the complexity of animal movement. Dr. Michelle LaPlaca's lab at Georgia Institute of Technology is involved in the study of mechanical stretch of cell cultures, shear deformation of planar cell cultures, and shear deformation of 3D cell containing matrices. Understanding of these processes is followed by development of functioning models capable of characterizing these systems under closed loop conditions with specially defined parameters. The study of neuromechanics is aimed at improving treatments for physiological health problems which includes optimization of prostheses design, restoration of movement post injury, and design and control of mobile robots. By studying structures in 3D hydrogels, researchers can identify new models of nerve cell mechanoproperties. For example, LaPlaca et al. developed a new model showing that strain may play a role in cell culture (LaPlaca et al. 2005). | https://en.wikipedia.org/wiki?curid=2567511 | 557,705 |
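The statement above about connecting neural circuit models to simulated bodies in virtual physical worlds can be made concrete with a minimal closed-loop sketch. The rhythmic drive, the single-mass "limb", and every parameter below are arbitrary illustrative choices, not a model drawn from the cited work.

```python
# Minimal closed-loop neuromechanics sketch: a rhythmic neural drive
# activates a muscle-like force acting on a point mass, and the mass's
# position feeds back into the drive. All parameters are arbitrary.
import math

dt, t_end = 0.001, 2.0
pos, vel = 0.0, 0.0                 # limb state (m, m/s)
mass, stiffness, damping = 1.0, 20.0, 2.0
gain, freq = 15.0, 1.5              # drive gain (N) and rhythm frequency (Hz)

t = 0.0
while t < t_end:
    # "Neural" drive: rhythmic command modulated by proprioceptive feedback.
    activation = max(0.0, math.sin(2 * math.pi * freq * t) - 0.5 * pos)
    muscle_force = gain * activation
    # Body mechanics: muscle force against elastic restoring force and damping.
    acc = (muscle_force - stiffness * pos - damping * vel) / mass
    vel += acc * dt
    pos += vel * dt
    t += dt

print(f"limb position after {t_end:.1f} s: {pos:.3f} m")
```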
336,670 | That trend started to turn around in the late 1990s as corporations found new uses for their existing mainframes and as the price of data networking collapsed in most parts of the world, encouraging trends toward more centralized computing. The growth of e-business also dramatically increased the number of back-end transactions processed by mainframe software as well as the size and throughput of databases. Batch processing, such as billing, became even more important (and larger) with the growth of e-business, and mainframes are particularly adept at large-scale batch computing. Another factor currently increasing mainframe use is the development of the Linux operating system, which arrived on IBM mainframe systems in 1999 and is typically run in scores or up to c. 8,000 virtual machines on a single mainframe. Linux allows users to take advantage of open source software combined with mainframe hardware RAS. Rapid expansion and development in emerging markets, particularly People's Republic of China, is also spurring major mainframe investments to solve exceptionally difficult computing problems, e.g. providing unified, extremely high volume online transaction processing databases for 1 billion consumers across multiple industries (banking, insurance, credit reporting, government services, etc.) In late 2000, IBM introduced 64-bit z/Architecture, acquired numerous software companies such as Cognos and introduced those software products to the mainframe. IBM's quarterly and annual reports in the 2000s usually reported increasing mainframe revenues and capacity shipments. However, IBM's mainframe hardware business has not been immune to the recent overall downturn in the server hardware market or to model cycle effects. For example, in the 4th quarter of 2009, IBM's System z hardware revenues decreased by 27% year over year. But MIPS (millions of instructions per second) shipments increased 4% per year over the past two years. Alsop had himself photographed in 2000, symbolically eating his own words ("death of the mainframe"). | https://en.wikipedia.org/wiki?curid=20266 | 336,491 |
1,668,798 | When war broke out, Low joined the military and received officer training. After a few months he was promoted to captain and seconded to the Royal Flying Corps, the precursor of the RAF. His brief was to use his civilian research on Televista to remotely control the RFC drone weapons proposed by the Royal Aircraft Factory, so it could be used as a guided missile. With two other officers (Captain Poole and Lieutenant Bowen) under him, Low set to work to see if it were possible. This project was called "Aerial Target" or AT, a misnomer to fool the Germans into thinking it was about building a drone plane to test anti-aircraft capabilities. After they built a prototype, General Sir David Henderson (director-general of the Directorate of Military Aeronautics) ordered that an Experimental Works at Feltham should be created to build the first proper "Aerial Target" complete with explosive warhead. As head of the Experimental Works, Low was given about 30 picked men, including jewellers, carpenters and aircraftsmen, in order to get the pilotless plane built as quickly as possible. The AT planes were from manufacturers such as Airco, Sopwith Aviation Company and the Royal Aircraft Factory. The de Havilland-designed Airco ATs had their first trial on 21 March 1917 at Upavon Central Flying School near Salisbury Plain, attended by 30–40 allied generals. The AT was launched from the back of a lorry using compressed air (another first). Low and his team successfully demonstrated their ability to control the craft before engine failure led to its crash landing. A subsequent trial of the RAF ATs on 6 July 1917 was cut short as an AT had been lost at takeoff. At a later date an electrically driven gyrocompass (yet another first) was added to the plane. In 1918 Low's Feltham Works developed the airborne-controlled Royal Navy Distance Control Boats (DCB), a variant of the Coastal Motor Boat. | https://en.wikipedia.org/wiki?curid=5778070 | 1,667,858
1,532,282 | Although current treatments can be administered in a controlled hospital setting, many hospitals are ill-suited for a situation involving mass casualties among civilians. Inexpensive positive-pressure devices that can be used easily in a mass casualty situation, and drugs to prevent inflammation and pulmonary edema are needed. Several drugs that have been approved by the FDA for other indications hold promise for treating chemically induced pulmonary edema. These include β2-agonists, dopamine, insulin, allopurinol, and non-steroidal anti-inflammatory drugs (NSAIDs), such as ibuprofen. Ibuprofen is particularly appealing because it has an established safety record and can be easily administered as an initial intervention. Inhaled and systemic forms of β2-agonists used in the treatment of asthma and other commonly used medications, such as insulin, dopamine, and allopurinol have also been effective in reducing pulmonary edema in animal models but require further study. A recent study documented in the "AANA Journal" discussed the use of volatile anesthetic agents, such as sevoflurane, as bronchodilators that lowered peak airway pressures and improved oxygenation. Other promising drugs in earlier stages of development act at various steps in the complex molecular pathways underlying pulmonary edema. Some of these potential drugs target the inflammatory response or the specific site(s) of injury. Others modulate the activity of ion channels that control fluid transport across lung membranes or target surfactant, a substance that lines the air sacs in the lungs and prevents them from collapsing. Mechanistic information based on toxicology, biochemistry, and physiology may be instrumental in determining new targets for therapy. Mechanistic studies may also aid in the development of new diagnostic approaches. Some chemicals generate metabolic byproducts that could be used for diagnosis, but detection of these byproducts may not be possible until many hours after initial exposure. Additional research must be directed at developing sensitive and specific tests to identify individuals quickly after they have been exposed to varying levels of chemicals toxic to the respiratory tract. | https://en.wikipedia.org/wiki?curid=36597555 | 1,531,415
1,679,966 | Use of magnetic resonance neurography is increasing in neurology and neurosurgery as the implications of its value in diagnosing various causes of sciatica become more widespread. There are 1.5 million lumbar MRI scans performed in the US each year for sciatica, leading to surgery for a herniated disk in about 300,000 patients per year. Of these, about 100,000 surgeries fail. Therefore, there is successful treatment for sciatica in just 200,000 patients and failure of diagnosis or treatment in up to 1.3 million annually in the US alone. The success rate of the paradigm of lumbar MRI and disk resection for treatment of sciatica is therefore about 15% (Filler 2005). Neurography has been applied increasingly to evaluate the distal nerve roots, lumbo-sacral plexus and proximal sciatic nerve in the pelvis and thigh to find other causes of sciatica. It is increasingly important for brachial plexus imaging and for the diagnosis of thoracic outlet syndrome. Research and development in the clinical use of diagnostic neurography has taken place at Johns Hopkins, the Mayo Clinic, UCLA, UCSF, Harvard, the University of Washington in Seattle, University of London, and Oxford University (see references below) as well as through the Neurography Institute. Recent patent litigation concerning MR Neurography has led some unlicensed centers to discontinue offering the technique. Courses have been offered for radiologists at the annual meetings of the Radiological Society of North America (RSNA), and at the International Society for Magnetic Resonance in Medicine and for surgeons at the annual meetings of the American Association of Neurological Surgeons and the Congress of Neurological Surgeons. The use of imaging for diagnosis of nerve disorders represents a change from the way most physicians were trained to practice over the past several decades, as older routine tests fail to identify the diagnosis for nerve related disorders. The New England Journal of Medicine in July 2009 published a report on whole body neurography using a diffusion based neurography technique. In 2010, RadioGraphics - a publication of the Radiological Society of North America that serves to provide continuing medical education to radiologists - published an article series taking the position that Neurography has an important role in the evaluation of entrapment neuropathies. | https://en.wikipedia.org/wiki?curid=18296474 | 1,679,023
1,227,021 | In the context of the 2019–2020 coronavirus pandemic, Neal Baer writes that the "public, scientists, lawmakers, and others" "need to have thoughtful conversations about gene editing now". Ensuring the biosafety level of laboratories may also be an important component of pandemic prevention. This issue may have gotten additional attention in 2020 after news outlets reported that U.S. State Department cables indicate that, although there may be no conclusive proof at the moment, the COVID-19 virus responsible for the COVID-19 pandemic may possibly have accidentally come from a Wuhan (China) laboratory studying bat coronaviruses, whose work included modifying virus genomes to enter human cells and was determined to be unsafe by U.S. scientists in 2018, rather than from a natural source. As of 18 May 2020, an official UN investigation into the origins of the COVID-19 virus, supported by over 120 countries, was being considered. United States president Donald Trump claimed to have seen evidence that gave him a "high degree of confidence" that the novel coronavirus originated in the Chinese laboratory but did not offer any evidence, data or details, contradicted statements by the United States' intelligence community and garnered harsh criticism and doubts. As of 5 May, assessments and internal sources from the Five Eyes nations indicated that the coronavirus outbreak being the result of a laboratory accident was "highly unlikely", since the human infection was "highly likely" a result of natural human and animal interaction. Many others have also criticized statements by US government officials and theories of laboratory release. Virologist and immunologist Vincent R. Racaniello said that "accident theories – and the lab-made theories before them – reflect a lack of understanding of the genetic make-up of Sars-CoV-2." Virologist Peter Daszak stated that an estimated 1–7 million people in Southeast Asia who live or work in proximity to bats are infected each year with bat coronaviruses. In January 2021, the WHO's investigation into the origin of COVID-19 was launched. In early 2021, the hypothesis of a laboratory cause of the pandemic received renewed interest and expert consideration due to renewed media discussion. | https://en.wikipedia.org/wiki?curid=63478457 | 1,226,360
601,859 | On 1 February 2016, after many delays over whether the UCLASS would specialize in strike or ISR roles, it was reported that a significant portion of the UCLASS effort would be directed to produce a Super Hornet–sized carrier-based aerial tanker as the Carrier-Based Aerial-Refueling System (CBARS), with "a little ISR" and some capabilities to relay communications, with strike capabilities deferred to a future version of the aircraft. The Pentagon chose this in order to address the Navy's expected fighter shortfall by directing funds to buy additional Super Hornets and accelerate purchases and development of the F-35C, quickly getting naval stealth fighters into service, and extending their range to penetrate hostile airspace. It will likely be a less-stealthy wing–body–tail configuration that will limit its ability to operate in contested airspace, be more sensitive to cost considerations, and favor Boeing and General Atomics submissions. Having the CBARS as the first carrier-based UAV provides a less complex bridge to the future F/A-XX, should it be an autonomous strike platform. It also addresses the carriers' need for an organic refueling aircraft, proposed as a mission for the UCLASS since 2014, freeing up the 20–30 percent of Super Hornets performing the mission in a more capable and cost effective manner than modifying the F-35, V-22 Osprey, and E-2D Hawkeye, or bringing the retired S-3 Viking back into service. Although initially designated the RAQ-25, the name was changed to the MQ-25 Stingray. Stealth requirements will be "descoped" and it may still be capable of firing missiles or dropping bombs from drop tank pylons, but surveillance and destroying targets will not be its main missions. Reducing the low-observable requirement is expected to make things easier for existing UCLASS competitors, and to open the competition to new entrants. An RFP for the air vehicle was issued in late 2016; Boeing was awarded the CBARS contract in August 2018. | https://en.wikipedia.org/wiki?curid=39092723 | 601,550 |
702,182 | In 1994, "Final Fantasy VI" moved away from the medieval setting of its predecessors, instead being set in a steampunk environment. The game received considerable acclaim, and is seen as one of the greatest RPGs of all time, for improvements such as its broadened thematic scope, plotlines, characters, multiple-choice scenarios, and variation of play. "Final Fantasy VI" dealt with mature themes such as suicide, war crimes, child abandonment, teen pregnancy, and coping with the deaths of loved ones. Square's "Live A Live", released for the Super Famicom in Japan, featured eight different characters and stories, with the first seven unfolding in any order the player chooses, as well as four different endings. The game's ninja chapter in particular was an early example of stealth game elements in an RPG, requiring the player to infiltrate a castle, rewarding the player if the entire chapter can be completed without engaging in combat. Other chapters had similar innovations, such as Akira's chapter where the character uses telepathic powers to discover information. That same year saw the release of the 3DO console port of the 1991 PC RPG "Knights of Xentar", which had introduced a unique pausable real-time battle system, where characters automatically attack based on a list of different AI scripts chosen by the player. FromSoftware's first video game title, "King's Field", a first-person RPG, is noted for being one of the earliest known 3D console role-playing games. In addition, the game is known for its difficulty and unconventional structure, and would go on to influence FromSoftware's future RPG titles including "Shadow Tower" and "Demon's Souls", the latter described by its staff as a spiritual successor to "King's Field". "Robotrek" by Quintet and Ancient was a predecessor to "Pokémon" in the sense that the protagonist does not himself fight, but sends out his robots to do so. Like "Pokémon", "Robotrek" was designed to appeal to a younger audience, allowed team customization, and each robot was kept in a ball. However, unlike in the aforementioned game, the protagonist sometimes uses Big Bombs or Weather as a defense. | https://en.wikipedia.org/wiki?curid=32408675 | 701,815
400,751 | In reserve, "Belfast"s future was uncertain: post-war defence cuts made manpower-intensive cruisers excessively costly to operate, and it was not until March 1955 that the decision was taken to modernise "Belfast". Work began on 6 January 1956. Although described as only an extended refit, the cost of £5.5 million was substantial for this large middle-aged cruiser. Changes included: providing the new twin MK 5 40 mm and the twin 4-inch mount with individual MRS8 directors; the 4-inch guns' training and elevation speed was increased to 20 degrees a second; and protecting key parts of the ship against nuclear, biological or chemical attack. This last consideration meant significantly enlarging and enclosing her bridge, creating a two-tiered, five-sided superstructure which radically altered her appearance. The most significant change was better accommodations for a smaller crew more fitting of post-war needs, her tripod masts replaced with lattice masts, and timber decking replaced with steel everywhere except the quarterdeck. The overall effect was to create a cruiser significantly more habitable but different internally, and to a degree in external appearance, from wartime cruisers but still essentially a surface warfare, 'anti Sverdlov' cruiser, with anti-aircraft defence, updated only for point defence, with 262 radar, locking only out. "Belfast" recommissioned at Devonport on 12 May 1959. Her close-range armament was standardised to six twin Bofors guns, and her close-range fire direction similarly standardised to eight close-range blind fire directors fitted with Type 262 radar. Her 1959 radar fit included two Type 274 lock-and-follow radar directors for main armament direction against sea and land targets (other 1950s cruiser reconstructions of three Town cruisers and HMS "Newfoundland" and HMS "Ceylon" had only a single main 274 director, limiting their surface effectiveness), Type 277Q and 293Q for height-finding and surface warning, Type 960M for air warning, and 974 for surface warning. In order to save weight, her torpedo armament was removed. Modern passive sonar type 174, 176 was installed and noise-reducing rubber insulation fitted to the propeller shaft. | https://en.wikipedia.org/wiki?curid=209815 | 400,552
626,868 | In one case study, stimulation of the thalamus using deep brain stimulation (DBS) led to some behavioral improvements. The patient was a 38-year-old male who had remained in a minimally conscious state following a severe traumatic brain injury. He had shown no consistent command following or communication ability and had remained non-verbal over two years in inpatient rehabilitation. fMRI scans showed preservation of a large-scale, bi-hemispheric cerebral language network, which indicates that the possibility for further recovery may exist. Positron emission tomography showed that the patient's global cerebral metabolism levels were markedly reduced. He had DBS electrodes implanted bilaterally within his central thalamus. More specifically, the DBS electrodes targeted the anterior intralaminar nuclei of the thalamus and adjacent paralaminar regions of thalamic association nuclei. Both electrodes were positioned within the central lateral nucleus, the paralaminar regions of the median dorsalis, and the posterior-medial aspect of the centromedian/parafascicularis nucleus complex. This allowed maximum coverage of the thalamic bodies. DBS stimulation was conducted such that the patient was exposed to various patterns of stimulation to help identify optimal behavioral responses. Approximately 140 days after the stimulation began, qualitative changes in behavior emerged. There were longer periods of eye opening and increased responses to command stimuli as well as higher scores on the JFK coma recovery scale (CRS). Functional object use and intelligible verbalization were also observed. The observed improvements in arousal level, motor control, and consistency of behavior could be a result of direct activation of frontal cortical and basal ganglia systems that were innervated by neurons within the thalamic association nuclei. These neurons act as a key communication relay and form a pathway between the brainstem arousal systems and frontal lobe regions. This pathway is crucial for many executive functions such as working memory, effort regulation, selective attention, and focus. | https://en.wikipedia.org/wiki?curid=2213172 | 626,535
523,762 | Under the 'New Math' initiative, created after the successful launch of the Soviet satellite "Sputnik" in 1957, conceptual abstraction gained a central role in mathematics education. It was part of an international movement influenced by the Nicholas Bourbaki school in France, attempting to bring the mathematics taught in schools closer to what research mathematicians actually use. Students received lessons in set theory, which is what mathematicians actually use to construct the set of real numbers, normally taught to advanced undergraduates in real analysis (see Dedekind cuts and Cauchy sequences). Arithmetic with bases other than ten was also taught (see binary arithmetic and modular arithmetic). However, this educational initiative faced strong opposition, not just from teachers, who struggled to understand the new material, let alone teach it, but also from parents, who had problems helping their children with homework. It was criticized by experts, too. In a 1965 essay, physicist Richard Feynman argued, "first there must be freedom of thought; second, we do not want to teach just words; and third, subjects should not be introduced without explaining the purpose or reason, or without giving any way in which the material could be really used to discover something interesting. I don't think it is worthwhile teaching such material." In his 1973 book, "", mathematician and historian of mathematics Morris Kline observed that it was "practically impossible" to learn new mathematical creations without first understanding the old ones, and that "abstraction is not the first stage, but the last stage, in a mathematical development." Kline criticized the authors of the 'New Math' textbooks, not for their mathematical faculty, but rather their narrow approach to mathematics, and their limited understanding of pedagogy and educational psychology. Mathematician George F. Simmons wrote in the algebra section of his book "Precalculus Mathematics in a Nutshell" (1981) that the New Math produced students who had "heard of the commutative law, but did not know the multiplication table." | https://en.wikipedia.org/wiki?curid=24836125 | 523,490 |
275,429 | The United States Army Armament Research, Development and Engineering Center (ARDEC) began development of a 40 mm smart airburst fuze (proximity fuze) in 2011 to improve the ability of grenade launchers like the M203 and M320 to engage targets in defilade. Called "small arms grenade munitions" (SAGMs), they double the lethality of the standard M433 grenade round by adding a small "smart" fuze sensor that detonates in the air to hit targets in cover or behind obstacles. The airburst function is similar to the XM25 CDTE, which has an onboard laser system to determine the distance to the target, but SAGMs are considered complementary to the XM25 rather than competing against it, as the XM25 provides low-angle fire while 40 mm launchers fire a lobbing trajectory. Integrated sensors and logic devices scan and filter the environment and then autonomously airburst the fuze without needing to be told to by the firer, thereby not requiring the soldier to carry extra weapon accessories. SAGMs enable soldiers to accurately incapacitate personnel targets in defilade at ranges between 50 and 500 meters. The round is engineered with three firing modes: airburst; point detonation; and self-destruct. A successful demonstration occurred in November 2013. Although the SAGM sensor does not need a laser rangefinder or any pre-fire programming sequence, it does require some skill by the user to aim and fire the round correctly so that it can detect the wall or obstruction to detonate in the air. The SAGM was to undergo evaluation in July 2015 and, if successful, transition into an official Army Program of Record by the end of the year. Not only does the fuze burst over walls, but it can detonate when passing cover like trees, bursting just as it senses and passes the trunk. The sort of sensor SAGMs use to differentiate clutter from triggering obstacles is highly classified, but shows airburst reliability of 76 percent. | https://en.wikipedia.org/wiki?curid=1941603 | 275,280 |
780,732 | Cases of GAS are still present today, but were also evident before World War I. This was shown by a training camp located in Texas, where a harmful strain of pneumonia complicating measles was caused by a strain of Streptococcus. Existence of streptococci strains was additionally found in World War II. An epidemic of streptococcal infection in the United States Navy during this war indicated that this type of disease was able to exist and spread in formerly unexposed individuals by environments that serological types of group A streptococci preferred. In later years, a positive test result for the presence of group A streptococci was found in 32.1 percent of individuals after throat cultures were carried out in a 20-year-long (1953/1954–1973/1974) study performed in Nashville, TN. Also, from 1972 to 1974, recurring GAS illness was observed with a prevalence of 19 percent in school-aged children as well as a prevalence rate of 25 percent in families. The severity of streptococcal infections has decreased over the years, and so has rheumatic fever (a sequela of GAS), which is indicated by the change in numerous hospitals from containing wards allocated for the sole purpose of treating rheumatic fever to hardly seeing the disease at all. Environmental factors, such as less crowding and the increase of family living space, can account for the reduction in incidence and severity of group A streptococci. With more space for individuals to reside in, the bacteria have fewer opportunities to spread from person to person. This is especially important considering an estimated 500,000 deaths worldwide all occurring after acute rheumatic fever, invasive infection, or subsequent heart disease can be attributed to GAS. This number is quite large, often leaving the health care system encumbered, since 91 percent of patients infected with invasive GAS need to be hospitalized, with 8,950–11,500 episodes and 1,050–1,850 deaths taking place each year. A later study that occurred from 2005 to 2012 found that there were 10,649–13,434 cases per year, resulting in 1,136–1,607 deaths. | https://en.wikipedia.org/wiki?curid=58638 | 780,314
1,703,847 | The ILLIAC IV was one of the first attempts at a massively parallel computer. Key to the design as conceived by Daniel Slotnick, the director of the project, was fairly high parallelism with up to 256 processors, used to allow the machine to work on large data sets in what would later be known as array processing. The machine was to have 4 quadrants. Each quadrant had a Control Unit (CU) and 64 Processor Elements (PEs). Originally Texas Instruments made a commitment to build the Processing Elements (PEs) out of large-scale integrated (LSI) circuits. Several years into the project, TI backed out and said that they could not produce the LSI chips at the contracted price. This required a complete redesign using medium-scale integrated (MSI) circuits, leading to large delays and greatly increasing costs. This also led to scaling the system back from four quadrants to a single quadrant, owing to the fact that the MSI version was going to be many times larger than the LSI version would have been. This led to the CU having pull-out 'cards' that were on the order of two feet square. For the PEs, what should have been chips about 1 inch in diameter were now roughly 6 by 10 inches. Space, power and air conditioning (not to mention budget) did not allow for a four-quadrant machine. The machine was 10' high, 8' deep and 50' long. There could be 10-12 instructions being sent from the CU on the wires to the PEs at any time. The power supplies for the machine were so large that a single-tongue forklift had to be designed to remove and reinstall them. The power supply buss bars on the machine spanned distances greater than three feet, and were octopus-like in design. Made of thick copper, the busses were coated in epoxy that often cracked, resulting in shorts and an array of other issues. ILLIAC IV was designed by Burroughs Corporation and built in quadrants in Great Valley, PA during the years 1967 through 1972. It had a traditional one-address accumulator architecture, rather than the revolutionary stack architecture pioneered by Burroughs in the 5500/6500 machines. The ILLIAC IV was in fact designed to be a "back end processor" to a B6700. The cost overruns caused by not getting the LSI chips and other design errors by Burroughs (the control unit was built with positive logic and the PEs with negative logic, etc.) made the project untenable. | https://en.wikipedia.org/wiki?curid=1393934 | 1,702,891
1,910,777 | Many high school students will experience some form of head injury during their time in amateur sports, and the majority of these can be classified as concussions. Even by the beginning of high school, 53% of athletes will have already suffered a concussion. Less than 50% of them report it in order to stay in the game. Many concussions may be subtle and go undiagnosed. While the majority of these minor injuries will recover without consequence within 3 to 7 days, it is the repetitive injury that is associated with neurological sequelae. Multiple concussions appear to have a cumulative effect on memory performance. If an athlete returns to competition before being completely healed, they are more susceptible to suffering another concussion. A repeat concussion can have a much slower recovery rate and be accompanied by increased symptoms and long-term effects. This “second impact syndrome” has, in some cases, been fatal. A history of concussion in football players has been linked to sports-related sudden death. Complications from concussion can include brain swelling, blood clots, and brain damage. Ice hockey, soccer, wrestling, and basketball carry a high risk for concussion; however, football is the most dangerous. Concussion-causing situations that involve leading with the head, hitting head to head and striking a defenseless athlete have become subject to penalty in order to discourage players and coaches from this type of play. These rule changes have resulted in technique changes at the youngest levels of sports, and youth athletes are now being trained in methods avoiding illegal contact. Youth sport organizations have also made equipment changes to better protect players. A widespread myth is that helmets protect athletes from concussions; they are actually worn to prevent skull fractures. Facts like this have prompted training on proper equipment use and on not using helmets as an implement of contact. | https://en.wikipedia.org/wiki?curid=33849637 | 1,909,679
2,032,488 | According to Blough & Zepp, superoxide is one of the hardest reactive oxygen species to quantify because it is present in low concentrations: 2×10 M in the open ocean and up to 2×10M in coastal areas. The main sources of biological superoxide in the ocean come from the reduction of oxygen at the cell surface and metabolites released into the water. In marine systems, superoxide most often acts as a one-electron reductant, but it can also serve as an oxidant and may increase the normally slow oxidation rates of environmental compounds. Superoxide is very unstable, with between 50 and 80% of its concentration of anions spontaneously disproportionating to hydrogen peroxide. At its peak, this reaction occurs with a rate constant on the order of 2.2×10 – 4.5×10 L molsec in seawater. The dismutation of superoxide to hydrogen peroxide can also be catalyzed by the antioxidant enzyme superoxide dismutase with a rate constant on the order of 2×10 L molsec. As a result of these fast acting processes, the steady state concentration of superoxide is very small. Since superoxide is also moderately reactive towards trace metals and dissolved organic matter, any remaining superoxide is thought to be removed from the water column through reactions with these species. As a result, the presence of superoxide in surface waters has been known to result in an increase of reduced iron. This, in turn, serves to enhance the availability of iron to phytoplankton whose growth is often limited by this key nutrient. As a charged radical species, superoxide is unlikely to significantly affect an organism's cellular function since it is not able to easily diffuse through the cell membrane. Instead, its potential toxicity lies in its ability to react with extracellular surface proteins or carbohydrates to inactivate their functions. Although its lifetime is fairly short (about 50 microseconds), superoxide has the potential to reach cell surfaces since it has a diffusion distance of about 320 nm. | https://en.wikipedia.org/wiki?curid=38173326 | 2,031,318 |
1,743,599 | The real drivers of this movement were the generals of the Autonomous Kwantung Army. They applied pressure to have the oldest diplomat in Japan, Prince Saionji, nominate Inukai. In the wake of the electoral triumph, a wave of political assassinations plagued the nation, aimed largely at ministers still in the cabinet who displayed some independence. In March 1932 ex-minister Inouye and Baron Takuma Dan, a leader of the banking interest Mitsui and one of the most powerful financial figures in Japan, were shot and killed. These crimes were actions of the Brotherhood of Blood League, formed by a fanatical lieutenant and a Buddhist priest. This and other secret groups, particularly the Black Dragon Society of Mitsuru Toyama, were attractive to the sons of discarded citizens, small merchants and industrialists left to ruin by the great zaibatsus. Their embittered descendants became the backbone of the Japanese Army, many of its officers being of lower rank than the Zaibatsu families. This radical movement took its ideas from Fascism, combined with right-wing socialist elements. It hated the plutocracy but believed in rights for labor and a severely militarized state with strong controls over commercial monopolies and a hierarchical system, leading to the imperial conquest of vast new territories, providing wealth and successful careers for its members. The assassins of the Brotherhood of Blood were aided by sympathizers and in some cases received only light punishment. The terror reached its highest point with the assassination of the leader of the Seiyukai, Prime Minister Inukai. This caused industrial concentration and supported monopolies, which raised large sums from loans to obtain the resources for factories and military industries, and which rationalized their processes to lower the costs of Japanese production. | https://en.wikipedia.org/wiki?curid=2005857 | 1,742,615
623,382 | In 1985, inventor Harry Wainwright created the first fully animated sweatshirt. The shirt consisted of fiber optics, leads, and a microprocessor to control individual frames of animation. The result was a full-color cartoon displayed on the surface of the shirt. In 1995, Wainwright went on to invent the first machine enabling fiber optics to be machined into fabrics, the process needed for manufacturing enough for mass markets and, in 1997, hired a German machine designer, Herbert Selbach, from Selbach Machinery to produce the world's first CNC machine able to automatically implant fiber optics into any flexible material. Wainwright received the first of a dozen patents based on LED/optic displays and machinery in 1989, and the first CNC machines went into production in 1998, beginning with animated coats for Disney Parks. The first ECG bio-physical display jackets employing LED/optic displays were created in 2005 by Wainwright and David Bychkov, the CEO of Exmovere at the time, using GSR sensors in a watch connected via Bluetooth to the embedded machine-washable display in a denim jacket, and were demonstrated at the Smart Fabrics Conference held in Washington, D.C., on May 7, 2007. Additional smart fabric technologies were unveiled by Wainwright at two Flextech Flexible Display conferences held in Phoenix, AZ, showing infrared digital displays machine-embedded into fabrics for IFF (Identification of Friend or Foe), which were submitted to BAE Systems for evaluation in 2006 and won an "Honorable Mention" award from NASA in 2010 in its Tech Briefs "Design the Future" contest. MIT personnel purchased several fully animated coats for their researchers to wear at their demonstrations in 1999 to bring attention to their "Wearable Computer" research. Wainwright was commissioned to speak at the Textile and Colorists Conference in Melbourne, Australia, on June 5, 2012. He was requested to demonstrate his fabric creations that change color using any smartphone, indicate callers on mobile phones without a digital display, and contain Wi-Fi security features that protect purses and personal items from theft. | https://en.wikipedia.org/wiki?curid=14206086 | 623,050
2,134,432 | Conducted in the 1990s and sponsored by the U.S. Army Medical Research and Materiel Command, the Albuquerque Studies were a series of human volunteer studies that aimed to establish new limits on the acceptable level of exposure to impulse noise produced by heavy weapons. The studies took place at Kirtland Air Force Base in Albuquerque, New Mexico, where participants were exposed to four different pressure-time signatures at seven different intensity levels and in various successions and sequences. The data collected from these studies formed a large database used to evaluate the performance of the AHAAH model. The experiment consisted of exposures to free-field impulse waveforms produced by explosive charges at distances of 5, 3, and 1 meters while wearing hearing protection. The 5m exposure was performed with a bare charge suspended above the ground and the subjects wore an unmodified earmuff with the left ear towards the charge. The 5m exposure was repeated with a modified earmuff that included a series of small tubes inserted through the earmuff cushion to simulate a poorly fit earmuff. The 3m and 1m exposures used the modified earmuff and the charges were detonated at the base of a tube pointed vertically. The left ears of the subjects were positioned 1m or 3m from the lip of the tube and 1 inch (2.54 cm) or 3 inches (7.62 cm) above the top edge of the tube. The fourth exposure condition was a reverberant environment with the participants seated at the end of a 3-meter-long steel tube that opened into a concrete bunker. The explosive charges were detonated outside the end of the 3m tube. Various conditions were accounted for, such as the distance of the participant’s ear from the tube, the acoustics of the surrounding environment, the level of hearing protection, and the number of impulses, establishing a matrix of possible exposures. An audiogram was used before and after each exposure to measure the threshold and the resulting threshold shift. The pressure-time signatures were measured using bare gauges for all exposure conditions. According to the data obtained from the Albuquerque Studies, the AHAAH model correctly predicted the acoustic hazards in 95 percent of the cases, while the MIL-STD-1474D was correct in only 38 percent of the cases and the A-weighted energy method was correct in only 25 percent of the cases. For all three approaches, the errors mainly stemmed from the methods overpredicting the danger of the hazard. | https://en.wikipedia.org/wiki?curid=57869905 | 2,133,207
1,874,852 | Psychological studies of bipolar disorder have examined the development of a wide range of both the core symptoms of psychomotor activation and related clusterings of depression/anxiety, increased hedonic tone, irritability/aggression and sometimes psychosis. The existing evidence has been described as patchy in terms of quality but converging in a consistent manner. The findings suggest that the period leading up to mania is often characterized by depression and anxiety at first, with isolated sub-clinical symptoms of mania such as increased energy and racing thoughts. The latter increase and lead to increased activity levels, the more so if there is disruption in circadian rhythms or goal attainment events. There is some indication that once mania has begun to develop, social stressors, including criticism from significant others, can further contribute. There are also indications that individuals may hold certain beliefs about themselves, their internal states, and their social world (including striving to meet high standards despite it causing distress) that may make them vulnerable during changing mood states in the face of relevant life events. In addition, subtle frontal-temporal and subcortical difficulties in "some" individuals, related to planning, emotional regulation and attentional control, may play a role. Symptoms are often subthreshold and likely continuous with normal experience. Once (hypo)mania has developed, there is an overall increase in activation levels and impulsivity. Negative social reactions or advice may be taken less notice of, and a person may be more caught up in their own thoughts and interpretations, often along a theme of feeling criticised. There is some suggestion that the mood variation in bipolar disorder may not be cyclical as often assumed, nor completely random, but results from a complex interaction between internal and external variables unfolding over time; there is mixed evidence as to whether relevant life events are found more often in early than later episodes. Many with the condition report inexplicably varied cyclical patterns, however. | https://en.wikipedia.org/wiki?curid=30639411 | 1,873,775 |
203,332 | Rockefeller faculty have made contributions to breakthroughs in biomedical sciences. Michael W. Young was one of several scientists who located genes that regulate the sleep–wake cycle in 1984. In 1994, Jeffrey M. Friedman’s laboratory discovered leptin, a hormone that influences appetite and weight. Charles David Allis helped identify the first enzyme that modifies histones in 1996, providing early evidence that the DNA packaging material plays a crucial role in gene regulation. In 1998, Roderick MacKinnon’s laboratory elucidated the structure and mechanism of a potassium channel, explaining how electrical signals are conveyed across cell membranes. Titia de Lange was part of a team that found how telomeres protect chromosome ends, shedding light on the role of genome instability in cancer in 1999. Robert B. Darnell led research that defined the molecular basis of fragile X syndrome, the second leading cause of intellectual disability, in 2001. Vincent A. Fischetti was part of a group that developed a powerful agent that can target and wipe out anthrax bacteria in 2002. Charles M. Rice helped produce an infectious form of the hepatitis C virus in laboratory cultures of human cells in 2005, leading directly to three new classes of hepatitis C drugs. Elaine Fuchs helped define the stem cells that can initiate squamous cell carcinoma in 2011, and also characterized the signaling pathways that drive malignancy. In 2013, Leslie B. Vosshall’s laboratory identified a gene in mosquitoes that is responsible for their attraction to humans and their sensitivity to the insect repellent DEET. Ali Brivanlou's laboratory developed a method to grow embryos outside the uterus for up to 13 days in 2016, allowing scientists to study the earliest events of human development. | https://en.wikipedia.org/wiki?curid=605466 | 203,228
687,241 | Alongside his colleague Wei Pu, Shen planned to map the orbital paths of the Moon and the planets in an intensive five-year project involving daily observations, yet this was thwarted by political opponents at court. To aid his work in astronomy, Shen Kuo made improved designs of the armillary sphere, gnomon, sighting tube, and invented a new type of inflow water clock. Shen Kuo devised a geological hypothesis for land formation (geomorphology), based upon findings of inland marine fossils, knowledge of soil erosion, and the deposition of silt. He also proposed a hypothesis of gradual climate change, after observing ancient petrified bamboos that were preserved underground in a dry northern habitat that would not support bamboo growth in his time. He was the first literary figure in China to mention the use of the drydock to repair boats suspended out of water, and also wrote of the effectiveness of the relatively new invention of the canal pound lock. Although not the first to invent camera obscura, Shen noted the relation of the focal point of a concave mirror and that of the pinhole. Shen wrote extensively about movable type printing invented by Bi Sheng (990–1051), and because of his written works the legacy of Bi Sheng and the modern understanding of the earliest movable type has been handed down to later generations. Following an old tradition in China, Shen created a raised-relief map while inspecting borderlands. His description of an ancient crossbow mechanism he unearthed as an amateur archaeologist proved to be a Jacob's staff, a surveying tool which wasn't known in Europe until described by Levi ben Gerson in 1321. | https://en.wikipedia.org/wiki?curid=1102000 | 686,884 |
702,181 | In 1993, Square's "Secret of Mana", the second in the "Mana" series, further advanced the action RPG subgenre with its introduction of cooperative multiplayer into the genre. The game was created by a team previously responsible for the first three "Final Fantasy" titles: Nasir Gebelli, Koichi Ishii, and Hiromichi Tanaka. It was intended to be one of the first CD-ROM RPGs, as a launch title for the SNES CD add-on, but had to be altered to fit onto a standard game cartridge after the SNES CD project was dropped. The game received considerable acclaim, for its innovative pausable real-time battle system, the "Ring Command" menu system, its innovative cooperative multiplayer gameplay, where the second or third players could drop in and out of the game at any time rather than players having to join the game at the same time, and the customizable AI settings for computer-controlled allies. The game has influenced a number of later action RPGs. That same year also saw the release of "", which introduced the use of pre-programmable combat manoeuvers called 'macros', a means of setting up the player's party AI to deliver custom attack combos. "Madou Monogatari", a 1989 MSX and PC-98 computer RPG ported to the Game Gear handheld console in 1993, had several unique features, including magic-oriented turn-based combat that completely lacked physical attacks, and the replacement of numerical statistics with visual representations, where the protagonist's condition is represented by her facial expressions and sprite graphics while experience is measured in jewels that encircle the screen, with the only visible numerical statistic being the collected gold. That year also saw the release of "Romancing Saga 2", which further expanded the non-linear gameplay of its predecessor. While in the original "Romancing Saga", scenarios were changed according to dialogue choices during conversations, "Romancing Saga 2" further expanded on this by having unique storylines for each character that can change depending on the player's actions, including who is chosen, what is said in conversation, what events have occurred, and who is present in the party. "PCGamesN" credits "Romancing SaGa 2" for having laid the foundations for modern Japanese RPGs with its progressive, non-linear, open world design and subversive themes. | https://en.wikipedia.org/wiki?curid=32408675 | 701,814 |
1,270,628 | As winter approaches, the cells enter the last stage of their life cycle. The orange cells mature into red cysts, the form in which they will remain for the remainder and longest portion of their life cycle. Cells at this stage are most resistant to harsh environmental conditions. Inorganic and organic materials such as bacteria, fungi, and dust particles coat the mucilage layer of the cell wall. The inorganic impurities were found to be rich in silicon, iron, and aluminum. These elements can also be taken up into the cellular compartment and stored in vacuoles and may be an important source of mineral supply. The cell wall, as the boundary that protects the inner contents of the cell from the harsh conditions in its habitat, is very rigid and hard to destroy. It may also play a role in protecting the algal cells from desiccation during the freeze-thaw cycle alternations during seasonal changes. The spherical immotile red cysts range from 35 to 40 µm in diameter. The cell contains one central chloroplast that has a naked pyrenoid, ribosomes, starch grains, and numerous small grana stacks composed of 3-7 thylakoids within it. Negatively charged phosphatidylglycerol composes the majority of the thylakoid membranes. The thylakoid membrane lipid composition can also be changed to enhance lipid fluidity in response to lower temperatures. An undulated membrane encloses the chloroplast. Lipid bodies and carotenoid globules surround the plastid. A red secondary pigment, astaxanthin, and its esterified derivatives accumulate up to 20 times the amount of chlorophyll a in the cytoplasmic lipid bodies of mature red spores. Astaxanthin protects the chloroplast from excessive light by absorbing a portion of it before it reaches the photosynthetic apparatus, which subsequently prevents photoinhibition and UV damage. The absorbed radiation is converted to heat, aiding in the melting of nearby snow and ice crystals to access needed nutrients and liquid water. Astaxanthin can also act as a metabolic sink for the metabolically active spores that do not divide. | https://en.wikipedia.org/wiki?curid=4338212 | 1,269,937
631,604 | A small company in New Hampshire, R.H. Research, owned by Robert Howard, researched printing from 1982 to 1983 and decided the single-nozzle inkjet was a possible fit, and he then contacted an inventor at Exxon who named Al Hock as a good choice for this project. Al Hock invited Tom Peer and Dave Lutz to join him in New Hampshire to look into this new venture and they accepted the job offer. Dave Lutz contacted two jet people still at Exxon, Jim and Kathy McMahon, and they also accepted offers to be founders in this venture, later to be named Howtek, Inc. Within a few months the Alpha jets made by the new Howtek team were working fine. Howtek management chose to change the glass nozzles to Tefzel based on the inkjet test results. Tefzel allowed the inkjet to work at high temperature with the new Thermoplastic Hot-melt inks and run with no vibrations in the nozzle structure that could generate stray drops. Each squeeze produced one drop over a frequency range of 1–16,000 drops per second. The nozzles were manufacturable and the Pixelmaster was born. There were 32 inkjet single nozzles per printhead, printing 4 colors (8 jets per color) CMYK. The mechanism was a printhead rotating at 121 rpm and placing uniformly sized and shaped drops precisely in place as subtractive color text and image printing for the graphics industry. This technology of hot-melt inks printing layers of CMYK was a precursor to a 3D patent by Richard Helinski. A few years later (1993) the patent was licensed first by Sanders Prototype, Inc. (renamed Solidscape, Inc.), a manufacturer of the first desktop Rapid Prototype printer in the industry, the Modelmaker 6 Pro. This printer and newer products use these Howtek-style inkjets and thermoplastic inks. Models printed with the thermoplastic were perfect for investment casting with no ash during burnout. Thermoplastic ink drop printing is accurate and precise, giving high-quality surface-finish models popular with jewelers and detail-sensitive CAD designers. The Howtek inkjets designed to print a page in 4 minutes were now printing in some cases for 4 days straight. The first printer was sold in 1993 to Hitchner Corporation's Metal Casting Technology R&D group, where they printed golf club heads and parts for automobile engines. | https://en.wikipedia.org/wiki?curid=53292993 | 631,266