Dataset columns: text_id, page_url, page_title, section_title, context_page_description, context_section_description, media, hierachy, category
projected-04044867-008
https://en.wikipedia.org/wiki/Recursion%20%28computer%20science%29
Recursion (computer science)
Indirect recursion
In computer science, recursion is a method of solving a computational problem where the solution depends on solutions to smaller instances of the same problem. Recursion solves such problems by using functions that call themselves from within their own code. The approach can be applied to many types of problems, and recursion is one of the central ideas of computer science. Most computer programming languages support recursion by allowing a function to call itself from within its own code. Some functional languages do not define any looping constructs but rely solely on recursion to repeatedly call code. It is proved in computability theory that these recursive-only languages are Turing complete; this means that they are as powerful (they can be used to solve the same problems) as imperative languages based on control structures such as while and for. Repeatedly calling a function from within itself may cause the call stack to have a size equal to the sum of the input sizes of all involved calls. It follows that, for problems that can be solved easily by iteration, recursion is generally less efficient, and, for large problems, it is fundamental to use optimization techniques such as tail call optimization.
Most basic examples of recursion, and most of the examples presented here, demonstrate direct recursion, in which a function calls itself. Indirect recursion occurs when a function is called not by itself but by another function that it called (either directly or indirectly). For example, if f calls f, that is direct recursion, but if f calls g which calls f, then that is indirect recursion of f. Chains of three or more functions are possible; for example, function 1 calls function 2, function 2 calls function 3, and function 3 calls function 1 again. Indirect recursion is also called mutual recursion, which is a more symmetric term, though this is simply a difference of emphasis, not a different notion. That is, if f calls g and then g calls f, which in turn calls g again, then from the point of view of f alone, f is indirectly recursing, from the point of view of g alone, g is indirectly recursing, and from the point of view of both, f and g are mutually recursing on each other. Similarly, a set of three or more functions that call each other can be called a set of mutually recursive functions; a sketch is given below.
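A minimal sketch of mutual (indirect) recursion, using the standard even/odd example (the function names here are illustrative, not from the article):

#include <stdbool.h>

bool is_odd(unsigned int n);   /* forward declaration so the two functions can call each other */

bool is_even(unsigned int n) {
    if (n == 0) return true;   /* base case */
    return is_odd(n - 1);      /* calls the other function */
}

bool is_odd(unsigned int n) {
    if (n == 0) return false;  /* base case */
    return is_even(n - 1);     /* ...which calls back, giving indirect recursion */
}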
[]
[ "Types of recursion", "Indirect recursion" ]
[ "Theoretical computer science", "Recursion", "Computability theory", "Articles with example pseudocode", "Programming idioms", "Subroutines" ]
projected-04044867-009
https://en.wikipedia.org/wiki/Recursion%20%28computer%20science%29
Recursion (computer science)
Anonymous recursion
Recursion is usually done by explicitly calling a function by name. However, recursion can also be done by implicitly calling a function based on the current context, which is particularly useful for anonymous functions, and is known as anonymous recursion.
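C has no anonymous functions, so a faithful example is not possible in the article's main example language; the following sketch only emulates the idea by having the recursive step receive its own entry point as an argument instead of referring to itself by name (all names here are illustrative assumptions, not from the article):

#include <stdio.h>

struct self_ref;
typedef unsigned long (*step_fn)(struct self_ref, unsigned long);
struct self_ref { step_fn call; };

/* The body never mentions its own name; it recurses through the self argument. */
static unsigned long fact_step(struct self_ref self, unsigned long n) {
    return (n <= 1) ? 1UL : n * self.call(self, n - 1);
}

int main(void) {
    struct self_ref f = { fact_step };
    printf("%lu\n", f.call(f, 5));   /* prints 120 */
    return 0;
}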
[]
[ "Types of recursion", "Anonymous recursion" ]
[ "Theoretical computer science", "Recursion", "Computability theory", "Articles with example pseudocode", "Programming idioms", "Subroutines" ]
projected-04044867-010
https://en.wikipedia.org/wiki/Recursion%20%28computer%20science%29
Recursion (computer science)
Structural versus generative recursion
Some authors classify recursion as either "structural" or "generative". The distinction is related to where a recursive procedure gets the data that it works on, and how it processes that data: a structurally recursive function obtains its data by decomposing the input it was given, so the defining characteristic of a structurally recursive function is that the argument to each recursive call is the content of a field of the original input. Structural recursion includes nearly all tree traversals, including XML processing, binary tree creation and search, etc. By considering the algebraic structure of the natural numbers (that is, a natural number is either zero or the successor of a natural number), functions such as factorial may also be regarded as structural recursion. Generative recursion is the alternative: a generatively recursive function constructs new data from its input and recurses on that. This distinction is important in proving termination of a function. All structurally recursive functions on finite (inductively defined) data structures can easily be shown to terminate, via structural induction: intuitively, each recursive call receives a smaller piece of input data, until a base case is reached. Generatively recursive functions, in contrast, do not necessarily feed smaller input to their recursive calls, so proof of their termination is not necessarily as simple, and avoiding infinite loops requires greater care. These generatively recursive functions can often be interpreted as corecursive functions – each step generates the new data, such as successive approximation in Newton's method – and terminating this corecursion requires that the data eventually satisfy some condition, which is not necessarily guaranteed. In terms of loop variants, structural recursion is when there is an obvious loop variant, namely size or complexity, which starts off finite and decreases at each recursive step. By contrast, generative recursion is when there is not such an obvious loop variant, and termination depends on a function, such as "error of approximation", that does not necessarily decrease to zero, so termination is not guaranteed without further analysis. The sketch below contrasts the two styles.
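A brief sketch of the contrast, assuming the linked-list node type defined later in the article for the structural case; the generative case uses Newton's method, mentioned above, with an arbitrary tolerance (everything here is illustrative):

/* Structural recursion: each recursive call receives a field of the input
   (the tail of the list), so termination is guaranteed for finite lists. */
int list_sum(struct node *list) {
    if (list == NULL) return 0;                /* base case */
    return list->data + list_sum(list->next);  /* recurse on a piece of the input */
}

/* Generative recursion: each call generates a new approximation; termination
   depends on the "error of approximation" eventually falling below a tolerance. */
double sqrt_newton(double x, double guess) {
    double next = (guess + x / guess) / 2.0;   /* generate new data */
    if (next >= guess - 1e-9 && next <= guess + 1e-9)
        return next;                           /* approximation close enough */
    return sqrt_newton(x, next);
}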
[]
[ "Types of recursion", "Structural versus generative recursion" ]
[ "Theoretical computer science", "Recursion", "Computability theory", "Articles with example pseudocode", "Programming idioms", "Subroutines" ]
projected-04044867-011
https://en.wikipedia.org/wiki/Recursion%20%28computer%20science%29
Recursion (computer science)
Implementation issues
In actual implementation, rather than a pure recursive function (single check for base case, otherwise recursive step), a number of modifications may be made, for purposes of clarity or efficiency. These include: a wrapper function (at the top); short-circuiting the base case, also known as "arm's-length recursion" (at the bottom); and a hybrid algorithm (at the bottom) – switching to a different algorithm once the data is small enough. On the basis of elegance, wrapper functions are generally approved, while short-circuiting the base case is frowned upon, particularly in academia. Hybrid algorithms are often used for efficiency, to reduce the overhead of recursion in small cases, and arm's-length recursion is a special case of this.
[]
[ "Implementation issues" ]
[ "Theoretical computer science", "Recursion", "Computability theory", "Articles with example pseudocode", "Programming idioms", "Subroutines" ]
projected-04044867-012
https://en.wikipedia.org/wiki/Recursion%20%28computer%20science%29
Recursion (computer science)
Wrapper function
A wrapper function is a function that is directly called but does not recurse itself, instead calling a separate auxiliary function which actually does the recursion. Wrapper functions can be used to validate parameters (so the recursive function can skip these), perform initialization (allocate memory, initialize variables), particularly for auxiliary variables such as "level of recursion" or partial computations for memoization, and handle exceptions and errors. In languages that support nested functions, the auxiliary function can be nested inside the wrapper function and use a shared scope. In the absence of nested functions, auxiliary functions are instead a separate function, if possible private (as they are not called directly), and information is shared with the wrapper function by passing arguments by reference. A sketch of the pattern is given below.
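A minimal sketch of the pattern (names are illustrative): a public wrapper validates its argument and supplies an initial accumulator, while a private helper does the actual recursion.

#include <stdio.h>

/* Auxiliary function: assumes its arguments have already been validated. */
static unsigned long fact_aux(unsigned int n, unsigned long acc) {
    return (n <= 1) ? acc : fact_aux(n - 1, acc * n);
}

/* Wrapper: validates the parameter and initializes the accumulator. */
unsigned long factorial(int n) {
    if (n < 0) {                                  /* validation done once, here */
        fprintf(stderr, "factorial: negative argument\n");
        return 0;
    }
    return fact_aux((unsigned int)n, 1UL);        /* helper does the recursion */
}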
[]
[ "Implementation issues", "Wrapper function" ]
[ "Theoretical computer science", "Recursion", "Computability theory", "Articles with example pseudocode", "Programming idioms", "Subroutines" ]
projected-04044867-013
https://en.wikipedia.org/wiki/Recursion%20%28computer%20science%29
Recursion (computer science)
Short-circuiting the base case
Short-circuiting the base case, also known as arm's-length recursion, consists of checking the base case before making a recursive call – i.e., checking whether the next call will be the base case, instead of calling and then checking for the base case. Short-circuiting is particularly done for efficiency reasons, to avoid the overhead of a function call that immediately returns. Note that since the base case has already been checked for (immediately before the recursive step), it does not need to be checked for separately, but one does need to use a wrapper function for the case when the overall recursion starts with the base case itself. For example, in the factorial function, properly the base case is 0! = 1, while immediately returning 1 for 1! is a short circuit that may miss 0; this can be mitigated by a wrapper function (a sketch follows below). Short-circuiting is primarily a concern when many base cases are encountered, such as Null pointers in a tree, which can be linear in the number of function calls, hence significant savings for O(n) algorithms; this is illustrated below for a depth-first search. Short-circuiting on a tree corresponds to considering a leaf (non-empty node with no children) as the base case, rather than considering an empty node as the base case. If there is only a single base case, such as in computing the factorial, short-circuiting provides only O(1) savings. Conceptually, short-circuiting can be considered to either have the same base case and recursive step, checking the base case only before the recursion, or it can be considered to have a different base case (one step removed from the standard base case) and a more complex recursive step, namely "check valid then recurse", as in considering leaf nodes rather than Null nodes as base cases in a tree. Because short-circuiting has a more complicated flow, compared with the clear separation of base case and recursive step in standard recursion, it is often considered poor style, particularly in academia.
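A sketch of arm's-length recursion applied to factorial (illustrative, not the article's own listing): the helper treats 1! as its effective base case, one step removed from 0! = 1, so a wrapper must handle an initial argument of 0.

/* Assumes n >= 1; the standard base case 0 is never reached because
   the wrapper short-circuits it. */
static unsigned long fact_do(unsigned int n) {
    return (n == 1) ? 1UL : n * fact_do(n - 1);
}

/* Wrapper handles the case where the recursion would start at the base case. */
unsigned long factorial_shortcut(unsigned int n) {
    return (n == 0) ? 1UL : fact_do(n);
}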
[]
[ "Implementation issues", "Short-circuiting the base case" ]
[ "Theoretical computer science", "Recursion", "Computability theory", "Articles with example pseudocode", "Programming idioms", "Subroutines" ]
projected-04044867-014
https://en.wikipedia.org/wiki/Recursion%20%28computer%20science%29
Recursion (computer science)
Depth-first search
A basic example of short-circuiting is given in depth-first search (DFS) of a binary tree; see the binary trees section for the standard recursive discussion.

The standard recursive algorithm for a DFS is:
base case: if the current node is Null, return false
recursive step: otherwise, check the value of the current node, return true if it matches, otherwise recurse on the children

In short-circuiting, this is instead: check the value of the current node, return true if it matches; otherwise, for each child, if it is not Null, recurse on it.

In terms of the standard steps, this moves the base case check before the recursive step. Alternatively, these can be considered a different form of base case and recursive step, respectively. Note that this requires a wrapper function to handle the case when the tree itself is empty (root node is Null). In the case of a perfect binary tree of height h, there are 2^(h+1) − 1 nodes and 2^(h+1) Null pointers as children (2 for each of the 2^h leaves), so short-circuiting cuts the number of function calls in half in the worst case.

In C, the standard recursive algorithm may be implemented as:

bool tree_contains(struct node *tree_node, int i) {
    if (tree_node == NULL)
        return false;                          // base case
    else if (tree_node->data == i)
        return true;
    else
        return tree_contains(tree_node->left, i) ||
               tree_contains(tree_node->right, i);
}

The short-circuited algorithm may be implemented as:

// Wrapper function to handle empty tree
bool tree_contains(struct node *tree_node, int i) {
    if (tree_node == NULL)
        return false;                          // empty tree
    else
        return tree_contains_do(tree_node, i); // call auxiliary function
}

// Assumes tree_node != NULL
bool tree_contains_do(struct node *tree_node, int i) {
    if (tree_node->data == i)
        return true;                           // found
    else                                       // recurse
        return (tree_node->left  && tree_contains_do(tree_node->left, i)) ||
               (tree_node->right && tree_contains_do(tree_node->right, i));
}

Note the use of short-circuit evaluation of the Boolean && (AND) operators, so that the recursive call is made only if the node is valid (non-Null). Note that while the first term in the AND is a pointer to a node, the second term is a boolean, so the overall expression evaluates to a boolean. This is a common idiom in recursive short-circuiting. This is in addition to the short-circuit evaluation of the Boolean || (OR) operator, which checks the right child only if the left child fails. In fact, the entire control flow of these functions can be replaced with a single Boolean expression in a return statement, but legibility suffers at no benefit to efficiency.
[]
[ "Implementation issues", "Short-circuiting the base case", "Depth-first search" ]
[ "Theoretical computer science", "Recursion", "Computability theory", "Articles with example pseudocode", "Programming idioms", "Subroutines" ]
projected-04044867-015
https://en.wikipedia.org/wiki/Recursion%20%28computer%20science%29
Recursion (computer science)
Hybrid algorithm
Recursive algorithms are often inefficient for small data, due to the overhead of repeated function calls and returns. For this reason efficient implementations of recursive algorithms often start with the recursive algorithm, but then switch to a different algorithm when the input becomes small. An important example is merge sort, which is often implemented by switching to the non-recursive insertion sort when the data is sufficiently small, as in the tiled merge sort. Hybrid recursive algorithms can often be further refined, as in Timsort, derived from a hybrid merge sort/insertion sort. A sketch of the basic pattern follows.
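A sketch of the pattern, assuming an arbitrary cutoff of 32 elements (not a tuned or authoritative implementation): the recursive merge sort falls back to insertion sort once a subarray is small.

#include <string.h>

#define SMALL 32

static void insertion_sort(int *a, int lo, int hi) {
    for (int i = lo + 1; i <= hi; i++) {
        int key = a[i], j = i - 1;
        while (j >= lo && a[j] > key) { a[j + 1] = a[j]; j--; }
        a[j + 1] = key;
    }
}

static void merge(int *a, int *tmp, int lo, int mid, int hi) {
    int i = lo, j = mid + 1, k = lo;
    while (i <= mid && j <= hi) tmp[k++] = (a[i] <= a[j]) ? a[i++] : a[j++];
    while (i <= mid) tmp[k++] = a[i++];
    while (j <= hi)  tmp[k++] = a[j++];
    memcpy(a + lo, tmp + lo, (size_t)(hi - lo + 1) * sizeof(int));
}

void hybrid_merge_sort(int *a, int *tmp, int lo, int hi) {
    if (hi - lo + 1 <= SMALL) {          /* small case: switch algorithms */
        insertion_sort(a, lo, hi);
        return;
    }
    int mid = lo + (hi - lo) / 2;
    hybrid_merge_sort(a, tmp, lo, mid);
    hybrid_merge_sort(a, tmp, mid + 1, hi);
    merge(a, tmp, lo, mid, hi);
}

The caller supplies a scratch buffer at least as large as the array, e.g. hybrid_merge_sort(a, tmp, 0, n - 1).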
[]
[ "Hybrid algorithm" ]
[ "Theoretical computer science", "Recursion", "Computability theory", "Articles with example pseudocode", "Programming idioms", "Subroutines" ]
projected-04044867-016
https://en.wikipedia.org/wiki/Recursion%20%28computer%20science%29
Recursion (computer science)
Recursion versus iteration
Recursion and iteration are equally expressive: recursion can be replaced by iteration with an explicit call stack, while iteration can be replaced with tail recursion. Which approach is preferable depends on the problem under consideration and the language used. In imperative programming, iteration is preferred, particularly for simple recursion, as it avoids the overhead of function calls and call stack management, but recursion is generally used for multiple recursion. By contrast, in functional languages recursion is preferred, with tail recursion optimization leading to little overhead. Implementing an algorithm using iteration may not be easily achievable. Compare the templates to compute x_n defined by x_n = f(n, x_(n−1)) starting from x_base: for an imperative language the overhead is to define the function, and for a functional language the overhead is to define the accumulator variable x. For example, a factorial function may be implemented iteratively in C by assigning to a loop index variable and an accumulator variable, rather than by passing arguments and returning values by recursion:

unsigned int factorial(unsigned int n) {
    unsigned int product = 1;  // empty product is 1
    while (n) {
        product *= n;
        --n;
    }
    return product;
}
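For comparison, a sketch of the accumulator-passing (tail-recursive) form that the functional-style template would use; the accumulator plays the role of the loop variable product above (the name factorial_acc is illustrative):

unsigned int factorial_acc(unsigned int n, unsigned int product) {
    if (n == 0)
        return product;                       // base case returns the accumulator
    return factorial_acc(n - 1, product * n); // tail call carries the running product
}

// called as factorial_acc(n, 1)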
[]
[ "Recursion versus iteration" ]
[ "Theoretical computer science", "Recursion", "Computability theory", "Articles with example pseudocode", "Programming idioms", "Subroutines" ]
projected-04044867-017
https://en.wikipedia.org/wiki/Recursion%20%28computer%20science%29
Recursion (computer science)
Expressive power
Most programming languages in use today allow the direct specification of recursive functions and procedures. When such a function is called, the program's runtime environment keeps track of the various instances of the function (often using a call stack, although other methods may be used). Every recursive function can be transformed into an iterative function by replacing recursive calls with iterative control constructs and simulating the call stack with a stack explicitly managed by the program. Conversely, all iterative functions and procedures that can be evaluated by a computer (see Turing completeness) can be expressed in terms of recursive functions; iterative control constructs such as while loops and for loops are routinely rewritten in recursive form in functional languages. However, in practice this rewriting depends on tail call elimination, which is not a feature of all languages. In several notable mainstream languages, all function calls, including tail calls, may cause stack allocation that would not occur with the use of looping constructs; in these languages, a working iterative program rewritten in recursive form may overflow the call stack, and tail call elimination may be a feature that is not covered by a language's specification, so different implementations of the same language may differ in tail call elimination capabilities.
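A small sketch of the rewriting described above: a while loop and its direct recursive (tail-call) counterpart, both summing the integers 1 through n (the names are illustrative):

unsigned int sum_iterative(unsigned int n) {
    unsigned int total = 0;
    while (n > 0) { total += n; --n; }
    return total;
}

unsigned int sum_recursive(unsigned int n, unsigned int total) {
    if (n == 0) return total;                // loop condition becomes the base case
    return sum_recursive(n - 1, total + n);  // loop body becomes a tail call
}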
[]
[ "Recursion versus iteration", "Expressive power" ]
[ "Theoretical computer science", "Recursion", "Computability theory", "Articles with example pseudocode", "Programming idioms", "Subroutines" ]
projected-04044867-018
https://en.wikipedia.org/wiki/Recursion%20%28computer%20science%29
Recursion (computer science)
Performance issues
In languages that favor iterative looping constructs, there is usually significant time and space cost associated with recursive programs, due to the overhead required to manage the stack and the relative slowness of function calls; in functional languages, a function call (particularly a tail call) is typically a very fast operation, and the difference is usually less noticeable. As a concrete example, the difference in performance between recursive and iterative implementations of the "factorial" example above depends highly on the compiler used. In languages where looping constructs are preferred, the iterative version may be as much as several orders of magnitude faster than the recursive one. In functional languages, the overall time difference of the two implementations may be negligible; in fact, the cost of multiplying the larger numbers first rather than the smaller numbers (which the iterative version given here happens to do) may overwhelm any time saved by choosing iteration.
[]
[ "Recursion versus iteration", "Performance issues" ]
[ "Theoretical computer science", "Recursion", "Computability theory", "Articles with example pseudocode", "Programming idioms", "Subroutines" ]
projected-04044867-019
https://en.wikipedia.org/wiki/Recursion%20%28computer%20science%29
Recursion (computer science)
Stack space
In some programming languages, the maximum size of the call stack is much less than the space available in the heap, and recursive algorithms tend to require more stack space than iterative algorithms. Consequently, these languages sometimes place a limit on the depth of recursion to avoid stack overflows; Python is one such language. Note the caveat below regarding the special case of tail recursion.
[]
[ "Recursion versus iteration", "Stack space" ]
[ "Theoretical computer science", "Recursion", "Computability theory", "Articles with example pseudocode", "Programming idioms", "Subroutines" ]
projected-04044867-020
https://en.wikipedia.org/wiki/Recursion%20%28computer%20science%29
Recursion (computer science)
Vulnerability
Because recursive algorithms can be subject to stack overflows, they may be vulnerable to pathological or malicious input. Some malware specifically targets a program's call stack and takes advantage of the stack's inherently recursive nature. Even in the absence of malware, a stack overflow caused by unbounded recursion can be fatal to the program, and exception handling logic may not prevent the corresponding process from being terminated.
[]
[ "Recursion versus iteration", "Vulnerability" ]
[ "Theoretical computer science", "Recursion", "Computability theory", "Articles with example pseudocode", "Programming idioms", "Subroutines" ]
projected-04044867-021
https://en.wikipedia.org/wiki/Recursion%20%28computer%20science%29
Recursion (computer science)
Multiply recursive problems
Multiply recursive problems are inherently recursive, because of prior state they need to track. One example is tree traversal as in depth-first search; though both recursive and iterative methods are used, they contrast with list traversal and linear search in a list, which is a singly recursive and thus naturally iterative method. Other examples include divide-and-conquer algorithms such as Quicksort, and functions such as the Ackermann function. All of these algorithms can be implemented iteratively with the help of an explicit stack, but the programmer effort involved in managing the stack, and the complexity of the resulting program, arguably outweigh any advantages of the iterative solution.
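For instance, the Ackermann function mentioned above is multiply recursive – one of its recursive calls is nested inside another – and a direct definition is short (a sketch only; the function grows so quickly that only tiny arguments are practical):

unsigned long ackermann(unsigned long m, unsigned long n) {
    if (m == 0) return n + 1;                      // base case
    if (n == 0) return ackermann(m - 1, 1);        // single recursive call
    return ackermann(m - 1, ackermann(m, n - 1));  // nested (multiple) recursion
}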
[]
[ "Recursion versus iteration", "Multiply recursive problems" ]
[ "Theoretical computer science", "Recursion", "Computability theory", "Articles with example pseudocode", "Programming idioms", "Subroutines" ]
projected-04044867-022
https://en.wikipedia.org/wiki/Recursion%20%28computer%20science%29
Recursion (computer science)
Refactoring recursion
Recursive algorithms can be replaced with non-recursive counterparts. One method for replacing recursive algorithms is to simulate them using heap memory in place of stack memory (a sketch follows below). An alternative is to develop a replacement algorithm entirely based on non-recursive methods, which can be challenging. For example, recursive algorithms for matching wildcards were once typical. Non-recursive algorithms for the same purpose have been developed to avoid the drawbacks of recursion and have improved only gradually based on techniques such as collecting tests and profiling performance.
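A sketch of the stack-simulation approach (illustrative, not from the article): a pre-order tree print that manages its own heap-allocated stack of pending nodes in place of the call stack. It assumes the struct node binary-tree type defined in the binary trees section; pre-order is used because it is simpler to simulate than the article's in-order tree_print.

#include <stdio.h>
#include <stdlib.h>

void tree_print_iterative(struct node *root) {
    if (root == NULL) return;
    /* a grow-on-demand array of pending nodes stands in for the call stack */
    size_t cap = 64, top = 0;
    struct node **stack = malloc(cap * sizeof *stack);
    if (stack == NULL) return;
    stack[top++] = root;
    while (top > 0) {
        struct node *n = stack[--top];
        printf("%d ", n->data);
        if (top + 2 > cap) {               /* ensure room for two children */
            cap *= 2;
            struct node **grown = realloc(stack, cap * sizeof *stack);
            if (grown == NULL) { free(stack); return; }
            stack = grown;
        }
        if (n->right) stack[top++] = n->right;  /* push right first so the   */
        if (n->left)  stack[top++] = n->left;   /* left subtree prints first */
    }
    free(stack);
}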
[]
[ "Recursion versus iteration", "Refactoring recursion" ]
[ "Theoretical computer science", "Recursion", "Computability theory", "Articles with example pseudocode", "Programming idioms", "Subroutines" ]
projected-04044867-023
https://en.wikipedia.org/wiki/Recursion%20%28computer%20science%29
Recursion (computer science)
Tail-recursive functions
Tail-recursive functions are functions in which all recursive calls are tail calls and hence do not build up any deferred operations. For example, the gcd function (shown below) is tail-recursive. In contrast, the factorial function (also below) is not tail-recursive; because its recursive call is not in tail position, it builds up deferred multiplication operations that must be performed after the final recursive call completes. With a compiler or interpreter that treats tail-recursive calls as jumps rather than function calls, a tail-recursive function such as gcd will execute using constant space. Thus the program is essentially iterative, equivalent to using imperative language control structures like the "for" and "while" loops. The significance of tail recursion is that when making a tail-recursive call (or any tail call), the caller's return position need not be saved on the call stack; when the recursive call returns, it will branch directly to the previously saved return position. Therefore, in languages that recognize this property of tail calls, tail recursion saves both space and time.
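A sketch of the two cases in C (illustrative): in gcd the recursive call is the last action performed, whereas in fact a multiplication remains to be done after the call returns.

// Tail-recursive: nothing is left to do after the recursive call.
unsigned int gcd(unsigned int x, unsigned int y) {
    return (y == 0) ? x : gcd(y, x % y);
}

// Not tail-recursive: the multiplication by n is deferred until the call returns.
unsigned long fact(unsigned int n) {
    return (n == 0) ? 1UL : n * fact(n - 1);
}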
[]
[ "Tail-recursive functions" ]
[ "Theoretical computer science", "Recursion", "Computability theory", "Articles with example pseudocode", "Programming idioms", "Subroutines" ]
projected-04044867-024
https://en.wikipedia.org/wiki/Recursion%20%28computer%20science%29
Recursion (computer science)
Order of execution
Consider these two functions:
[]
[ "Order of execution" ]
[ "Theoretical computer science", "Recursion", "Computability theory", "Articles with example pseudocode", "Programming idioms", "Subroutines" ]
projected-04044867-025
https://en.wikipedia.org/wiki/Recursion%20%28computer%20science%29
Recursion (computer science)
Function 1
void recursiveFunction(int num) {
    printf("%d\n", num);              // executed before the recursive call
    if (num < 4)
        recursiveFunction(num + 1);
}
[ "Recursive1.svg" ]
[ "Order of execution", "Function 1" ]
[ "Theoretical computer science", "Recursion", "Computability theory", "Articles with example pseudocode", "Programming idioms", "Subroutines" ]
projected-04044867-026
https://en.wikipedia.org/wiki/Recursion%20%28computer%20science%29
Recursion (computer science)
Function 2
void recursiveFunction(int num) {
    if (num < 4)
        recursiveFunction(num + 1);
    printf("%d\n", num);              // executed after the recursive call returns
}

Function 2 is function 1 with the lines swapped. In the case of a function calling itself only once, instructions placed before the recursive call are executed once per recursion before any of the instructions placed after the recursive call. The latter are executed repeatedly after the maximum recursion has been reached. Also note that the order of the print statements is reversed, which is due to the way the function calls and statements are stored on the call stack.
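For example, assuming an initial call of recursiveFunction(1), function 1 prints 1, 2, 3, 4 (printing happens on the way down the recursion), while function 2 prints 4, 3, 2, 1 (printing happens as the calls return).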
[ "Recursive2.svg" ]
[ "Order of execution", "Function 2" ]
[ "Theoretical computer science", "Recursion", "Computability theory", "Articles with example pseudocode", "Programming idioms", "Subroutines" ]
projected-04044867-028
https://en.wikipedia.org/wiki/Recursion%20%28computer%20science%29
Recursion (computer science)
Factorial
A classic example of a recursive procedure is the function used to calculate the factorial of a natural number, defined by fact(n) = 1 if n = 0, and fact(n) = n · fact(n − 1) if n > 0. The function can also be written as a recurrence relation: b_n = n · b_(n−1), with b_0 = 1; evaluating this recurrence step by step demonstrates the computation performed by the recursive definition. The factorial function can also be described without using recursion by making use of the typical looping constructs found in imperative programming languages, and the imperative code is equivalent to a mathematical definition using an accumulator variable. That definition translates straightforwardly to functional programming languages; this is an example of iteration implemented recursively.
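As a worked evaluation of the recursive definition (the article's own trace is not reproduced in this extract):

fact(4)
= 4 · fact(3)
= 4 · (3 · fact(2))
= 4 · (3 · (2 · fact(1)))
= 4 · (3 · (2 · (1 · fact(0))))
= 4 · (3 · (2 · (1 · 1)))
= 24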
[]
[ "Recursive procedures", "Factorial" ]
[ "Theoretical computer science", "Recursion", "Computability theory", "Articles with example pseudocode", "Programming idioms", "Subroutines" ]
projected-04044867-029
https://en.wikipedia.org/wiki/Recursion%20%28computer%20science%29
Recursion (computer science)
Greatest common divisor
The Euclidean algorithm, which computes the greatest common divisor of two integers, can be written recursively. Function definition: gcd(x, y) = x if y = 0, and gcd(x, y) = gcd(y, remainder(x, y)) if y > 0, where remainder(x, y) is the remainder of the integer division of x by y. The recursive program is tail-recursive; it is equivalent to an iterative algorithm, and a language that eliminates tail calls will evaluate it using a constant amount of stack space. An iterative version of the same algorithm, suitable for a language that does not eliminate tail calls, maintains its state entirely in the variables x and y and uses a looping construct, so the program avoids making recursive calls and growing the call stack. The iterative algorithm requires a temporary variable, and even given knowledge of the Euclidean algorithm it is more difficult to understand the process by simple inspection, although the two algorithms are very similar in their steps.
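The article's own listings are not reproduced in this extract; the recursive form appears above under tail-recursive functions, and a sketch of the iterative version with its temporary variable might look like:

unsigned int gcd_iterative(unsigned int x, unsigned int y) {
    while (y != 0) {
        unsigned int t = y;   // temporary variable holds the old y
        y = x % y;            // remainder of x divided by y
        x = t;
    }
    return x;
}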
[]
[ "Recursive procedures", "Greatest common divisor" ]
[ "Theoretical computer science", "Recursion", "Computability theory", "Articles with example pseudocode", "Programming idioms", "Subroutines" ]
projected-04044867-030
https://en.wikipedia.org/wiki/Recursion%20%28computer%20science%29
Recursion (computer science)
Towers of Hanoi
The Towers of Hanoi is a mathematical puzzle whose solution illustrates recursion. There are three pegs which can hold stacks of disks of different diameters. A larger disk may never be stacked on top of a smaller. Starting with n disks on one peg, they must be moved to another peg one at a time. What is the smallest number of steps to move the stack? The number of moves satisfies the recurrence relation hanoi(1) = 1 and hanoi(n) = 2 · hanoi(n − 1) + 1 for n > 1. Although not all recursive functions have an explicit solution, the Tower of Hanoi sequence can be reduced to an explicit formula: hanoi(n) = 2^n − 1.
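A sketch of a recursive solver (the article's own implementations are not reproduced in this extract): it moves n disks from peg from to peg to, using via as the spare, printing each move.

#include <stdio.h>

void hanoi(unsigned int n, char from, char to, char via) {
    if (n == 0) return;                        /* nothing to move */
    hanoi(n - 1, from, via, to);               /* move n-1 smaller disks out of the way */
    printf("move disk %u: %c -> %c\n", n, from, to);
    hanoi(n - 1, via, to, from);               /* move them back on top of the big disk */
}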
[ "Tower of Hanoi.jpeg" ]
[ "Recursive procedures", "Towers of Hanoi" ]
[ "Theoretical computer science", "Recursion", "Computability theory", "Articles with example pseudocode", "Programming idioms", "Subroutines" ]
projected-04044867-031
https://en.wikipedia.org/wiki/Recursion%20%28computer%20science%29
Recursion (computer science)
Binary search
The binary search algorithm is a method of searching a sorted array for a single element by cutting the array in half with each recursive pass. The trick is to pick a midpoint near the center of the array, compare the data at that point with the data being searched for, and then respond to one of three possible conditions: the data is found at the midpoint, the data at the midpoint is greater than the data being searched for, or the data at the midpoint is less than the data being searched for. Recursion is used in this algorithm because with each pass a new array is created by cutting the old one in half. The binary search procedure is then called recursively, this time on the new (and smaller) array. Typically the array's size is adjusted by manipulating a beginning and ending index. The algorithm exhibits a logarithmic order of growth because it essentially divides the problem domain in half with each pass.

Example implementation of binary search in C:

/*
  Call binary_search with proper initial conditions.
  INPUT:
    data is an array of integers SORTED in ASCENDING order,
    toFind is the integer to search for,
    count is the total number of elements in the array
  OUTPUT:
    result of binary_search
*/
int search(int *data, int toFind, int count) {
    // Start = 0 (beginning index)
    // End = count - 1 (top index)
    return binary_search(data, toFind, 0, count - 1);
}

/*
  Binary search algorithm.
  INPUT:
    data is an array of integers SORTED in ASCENDING order,
    toFind is the integer to search for,
    start is the minimum array index,
    end is the maximum array index
  OUTPUT:
    position of the integer toFind within array data,
    -1 if not found
*/
int binary_search(int *data, int toFind, int start, int end) {
    // Get the midpoint.
    int mid = start + (end - start) / 2;   // integer division

    if (start > end)                       // stop condition (base case)
        return -1;
    else if (data[mid] == toFind)          // found, return index
        return mid;
    else if (data[mid] > toFind)           // data is greater than toFind, search lower half
        return binary_search(data, toFind, start, mid - 1);
    else                                   // data is less than toFind, search upper half
        return binary_search(data, toFind, mid + 1, end);
}
[]
[ "Recursive procedures", "Binary search" ]
[ "Theoretical computer science", "Recursion", "Computability theory", "Articles with example pseudocode", "Programming idioms", "Subroutines" ]
projected-04044867-032
https://en.wikipedia.org/wiki/Recursion%20%28computer%20science%29
Recursion (computer science)
Recursive data structures (structural recursion)
An important application of recursion in computer science is in defining dynamic data structures such as lists and trees. Recursive data structures can dynamically grow to a theoretically infinite size in response to runtime requirements; in contrast, the size of a static array must be set at compile time. "Recursive algorithms are particularly appropriate when the underlying problem or the data to be treated are defined in recursive terms." The examples in this section illustrate what is known as "structural recursion". This term refers to the fact that the recursive procedures are acting on data that is defined recursively. As long as a programmer derives the template for a function from a data definition, that function employs structural recursion. That is, the recursions in the function's body consume some immediate piece of a given compound value.
[]
[ "Recursive data structures (structural recursion)" ]
[ "Theoretical computer science", "Recursion", "Computability theory", "Articles with example pseudocode", "Programming idioms", "Subroutines" ]
projected-04044867-033
https://en.wikipedia.org/wiki/Recursion%20%28computer%20science%29
Recursion (computer science)
Linked lists
Below is a C definition of a linked list node structure. Notice especially how the node is defined in terms of itself. The "next" element of struct node is a pointer to another struct node, effectively creating a list type.

struct node {
    int data;           // some integer data
    struct node *next;  // pointer to another struct node
};

Because the struct node data structure is defined recursively, procedures that operate on it can be implemented naturally as recursive procedures. The list_print procedure defined below walks down the list until the list is empty (i.e., the list pointer has a value of NULL). For each node it prints the data element (an integer). In the C implementation, the list remains unchanged by the list_print procedure.

void list_print(struct node *list) {
    if (list != NULL) {             // recurse only on a non-empty list; the empty list is the base case
        printf("%d ", list->data);  // print integer data followed by a space
        list_print(list->next);     // recursive call on the next node
    }
}
[]
[ "Recursive data structures (structural recursion)", "Linked lists" ]
[ "Theoretical computer science", "Recursion", "Computability theory", "Articles with example pseudocode", "Programming idioms", "Subroutines" ]
projected-04044867-034
https://en.wikipedia.org/wiki/Recursion%20%28computer%20science%29
Recursion (computer science)
Binary trees
In , recursion is a method of solving a where the solution depends on solutions to smaller instances of the same problem. Recursion solves such by using that call themselves from within their own code. The approach can be applied to many types of problems, and recursion is one of the central ideas of computer science. Most computer s support recursion by allowing a function to call itself from within its own code. Some languages (for instance, ) do not define any looping constructs but rely solely on recursion to repeatedly call code. It is proved in that these recursive-only languages are ; this means that they are as powerful (they can be used to solve the same problems) as s based on control structures such as and . Repeatedly calling a function from within itself may cause the to have a size equal to the sum of the input sizes of all involved calls. It follows that, for problems that can be solved easily by iteration, recursion is generally less , and, for large problems, it is fundamental to use optimization techniques such as optimization.
Below is a simple definition for a binary tree node. Like the node for linked lists, it is defined in terms of itself, recursively. There are two self-referential pointers: left (pointing to the left sub-tree) and right (pointing to the right sub-tree).

    struct node {
        int data;            // some integer data
        struct node *left;   // pointer to the left subtree
        struct node *right;  // pointer to the right subtree
    };

Operations on the tree can be implemented using recursion. Note that because there are two self-referencing pointers (left and right), tree operations may require two recursive calls:

    // Test if tree_node contains i; return 1 if so, 0 if not.
    int tree_contains(struct node *tree_node, int i) {
        if (tree_node == NULL)
            return 0;  // base case
        else if (tree_node->data == i)
            return 1;
        else
            return tree_contains(tree_node->left, i) || tree_contains(tree_node->right, i);
    }

At most two recursive calls will be made for any given call to tree_contains as defined above.

    // Inorder traversal:
    void tree_print(struct node *tree_node) {
        if (tree_node != NULL) {              // base case
            tree_print(tree_node->left);      // go left
            printf("%d ", tree_node->data);   // print the integer followed by a space
            tree_print(tree_node->right);     // go right
        }
    }

The above example illustrates an in-order traversal of the binary tree. A binary search tree is a special case of the binary tree where the data elements of each node are in order.
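A brief usage sketch (added for illustration, not from the original article) that builds a three-node tree on the stack and exercises both procedures above:

    #include <stdio.h>

    // struct node, tree_contains and tree_print as defined above

    int main(void) {
        // A three-node tree: root 2, left child 1, right child 3 (illustrative only)
        struct node left  = {1, NULL, NULL};
        struct node right = {3, NULL, NULL};
        struct node root  = {2, &left, &right};

        tree_print(&root);                                       // prints "1 2 3 "
        printf("\ncontains 3? %d\n", tree_contains(&root, 3));   // prints 1
        printf("contains 5? %d\n", tree_contains(&root, 5));     // prints 0
        return 0;
    }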
[]
[ "Recursive data structures (structural recursion)", "Binary trees" ]
[ "Theoretical computer science", "Recursion", "Computability theory", "Articles with example pseudocode", "Programming idioms", "Subroutines" ]
projected-04044867-035
https://en.wikipedia.org/wiki/Recursion%20%28computer%20science%29
Recursion (computer science)
Filesystem traversal
In , recursion is a method of solving a where the solution depends on solutions to smaller instances of the same problem. Recursion solves such by using that call themselves from within their own code. The approach can be applied to many types of problems, and recursion is one of the central ideas of computer science. Most computer s support recursion by allowing a function to call itself from within its own code. Some languages (for instance, ) do not define any looping constructs but rely solely on recursion to repeatedly call code. It is proved in that these recursive-only languages are ; this means that they are as powerful (they can be used to solve the same problems) as s based on control structures such as and . Repeatedly calling a function from within itself may cause the to have a size equal to the sum of the input sizes of all involved calls. It follows that, for problems that can be solved easily by iteration, recursion is generally less , and, for large problems, it is fundamental to use optimization techniques such as optimization.
Since the number of files in a filesystem may vary, recursion is the only practical way to traverse and thus enumerate its contents. Traversing a filesystem is very similar to traversing a tree; therefore the concepts behind tree traversal are applicable to traversing a filesystem. More specifically, the code below would be an example of a depth-first traversal of a filesystem.

    import java.io.File;

    public class FileSystem {

        public static void main(String[] args) {
            traverse();
        }

        /**
         * Obtains the filesystem roots
         * Proceeds with the recursive filesystem traversal
         */
        private static void traverse() {
            File[] fs = File.listRoots();
            for (int i = 0; i < fs.length; i++) {
                System.out.println(fs[i]);
                if (fs[i].isDirectory() && fs[i].canRead()) {
                    rtraverse(fs[i]);
                }
            }
        }

        /**
         * Recursively traverse a given directory
         *
         * @param fd indicates the starting point of traversal
         */
        private static void rtraverse(File fd) {
            File[] fss = fd.listFiles();
            for (int i = 0; i < fss.length; i++) {
                System.out.println(fss[i]);
                if (fss[i].isDirectory() && fss[i].canRead()) {
                    rtraverse(fss[i]);
                }
            }
        }
    }

This code is both recursive and iterative: the files and directories are iterated, and each directory is opened recursively. The "rtraverse" method is an example of direct recursion, whilst the "traverse" method is a wrapper function. The "base case" scenario is that there will always be a fixed number of files and/or directories in a given filesystem.
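For comparison, here is a sketch of the same depth-first pattern in C using the POSIX dirent API; this is an added example under the assumption of a POSIX system and is not part of the original article:

    #include <stdio.h>
    #include <string.h>
    #include <dirent.h>

    // Recursively print every entry beneath `path`, depth-first.
    static void rtraverse(const char *path) {
        DIR *dir = opendir(path);
        if (dir == NULL)
            return;                       // not a readable directory: stop descending
        struct dirent *entry;
        while ((entry = readdir(dir)) != NULL) {
            if (strcmp(entry->d_name, ".") == 0 || strcmp(entry->d_name, "..") == 0)
                continue;                 // skip self and parent to avoid looping forever
            char child[4096];
            snprintf(child, sizeof child, "%s/%s", path, entry->d_name);
            printf("%s\n", child);
            rtraverse(child);             // recurse; opendir simply fails on plain files
        }
        closedir(dir);
    }

    int main(void) {
        rtraverse(".");                   // start at the current directory
        return 0;
    }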
[]
[ "Recursive data structures (structural recursion)", "Filesystem traversal" ]
[ "Theoretical computer science", "Recursion", "Computability theory", "Articles with example pseudocode", "Programming idioms", "Subroutines" ]
projected-04044867-036
https://en.wikipedia.org/wiki/Recursion%20%28computer%20science%29
Recursion (computer science)
Time-efficiency of recursive algorithms
In , recursion is a method of solving a where the solution depends on solutions to smaller instances of the same problem. Recursion solves such by using that call themselves from within their own code. The approach can be applied to many types of problems, and recursion is one of the central ideas of computer science. Most computer s support recursion by allowing a function to call itself from within its own code. Some languages (for instance, ) do not define any looping constructs but rely solely on recursion to repeatedly call code. It is proved in that these recursive-only languages are ; this means that they are as powerful (they can be used to solve the same problems) as s based on control structures such as and . Repeatedly calling a function from within itself may cause the to have a size equal to the sum of the input sizes of all involved calls. It follows that, for problems that can be solved easily by iteration, recursion is generally less , and, for large problems, it is fundamental to use optimization techniques such as optimization.
The time efficiency of recursive algorithms can be expressed in a recurrence relation of Big O notation. They can (usually) then be simplified into a single Big-O term.
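For example (an illustration added here, not present in the original text), binary search inspects one element and then recurses on half of the remaining input, giving the recurrence

    T(n) = T(n/2) + O(1),    T(1) = O(1)

which unrolls to roughly log2(n) constant-cost levels and therefore simplifies to the single term O(log n).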
[]
[ "Time-efficiency of recursive algorithms" ]
[ "Theoretical computer science", "Recursion", "Computability theory", "Articles with example pseudocode", "Programming idioms", "Subroutines" ]
projected-04044867-037
https://en.wikipedia.org/wiki/Recursion%20%28computer%20science%29
Recursion (computer science)
Shortcut rule (master theorem)
In , recursion is a method of solving a where the solution depends on solutions to smaller instances of the same problem. Recursion solves such by using that call themselves from within their own code. The approach can be applied to many types of problems, and recursion is one of the central ideas of computer science. Most computer s support recursion by allowing a function to call itself from within its own code. Some languages (for instance, ) do not define any looping constructs but rely solely on recursion to repeatedly call code. It is proved in that these recursive-only languages are ; this means that they are as powerful (they can be used to solve the same problems) as s based on control structures such as and . Repeatedly calling a function from within itself may cause the to have a size equal to the sum of the input sizes of all involved calls. It follows that, for problems that can be solved easily by iteration, recursion is generally less , and, for large problems, it is fundamental to use optimization techniques such as optimization.
If the time-complexity of the function is in the form

    T(n) = a · T(n/b) + f(n)

then the Big O of the time-complexity is thus:

If f(n) = O(n^c) for some constant c < log_b(a), then T(n) = Θ(n^(log_b a)).

If f(n) = Θ(n^(log_b a)), then T(n) = Θ(n^(log_b a) · log n).

If f(n) = Ω(n^c) for some constant c > log_b(a), and if a · f(n/b) ≤ k · f(n) for some constant k < 1 and all sufficiently large n, then T(n) = Θ(f(n)).

where a represents the number of recursive calls at each level of recursion, b represents by what factor smaller the input is for the next level of recursion (i.e. the number of pieces you divide the problem into), and f(n) represents the work that the function does independently of any recursion (e.g. partitioning, recombining) at each level of recursion.
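As a standard worked example (added here for illustration), merge sort makes a = 2 recursive calls on halves of the input (b = 2) and does Θ(n) work to merge, so f(n) = Θ(n) = Θ(n^(log_2 2)); the second case applies and gives

    T(n) = 2 · T(n/2) + Θ(n)   ⇒   T(n) = Θ(n · log n).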
[]
[ "Time-efficiency of recursive algorithms", "Shortcut rule (master theorem)" ]
[ "Theoretical computer science", "Recursion", "Computability theory", "Articles with example pseudocode", "Programming idioms", "Subroutines" ]
projected-04044867-038
https://en.wikipedia.org/wiki/Recursion%20%28computer%20science%29
Recursion (computer science)
See also
In , recursion is a method of solving a where the solution depends on solutions to smaller instances of the same problem. Recursion solves such by using that call themselves from within their own code. The approach can be applied to many types of problems, and recursion is one of the central ideas of computer science. Most computer s support recursion by allowing a function to call itself from within its own code. Some languages (for instance, ) do not define any looping constructs but rely solely on recursion to repeatedly call code. It is proved in that these recursive-only languages are ; this means that they are as powerful (they can be used to solve the same problems) as s based on control structures such as and . Repeatedly calling a function from within itself may cause the to have a size equal to the sum of the input sizes of all involved calls. It follows that, for problems that can be solved easily by iteration, recursion is generally less , and, for large problems, it is fundamental to use optimization techniques such as optimization.
[]
[ "See also" ]
[ "Theoretical computer science", "Recursion", "Computability theory", "Articles with example pseudocode", "Programming idioms", "Subroutines" ]
projected-04044867-040
https://en.wikipedia.org/wiki/Recursion%20%28computer%20science%29
Recursion (computer science)
References
In , recursion is a method of solving a where the solution depends on solutions to smaller instances of the same problem. Recursion solves such by using that call themselves from within their own code. The approach can be applied to many types of problems, and recursion is one of the central ideas of computer science. Most computer s support recursion by allowing a function to call itself from within its own code. Some languages (for instance, ) do not define any looping constructs but rely solely on recursion to repeatedly call code. It is proved in that these recursive-only languages are ; this means that they are as powerful (they can be used to solve the same problems) as s based on control structures such as and . Repeatedly calling a function from within itself may cause the to have a size equal to the sum of the input sizes of all involved calls. It follows that, for problems that can be solved easily by iteration, recursion is generally less , and, for large problems, it is fundamental to use optimization techniques such as optimization.
[]
[ "References" ]
[ "Theoretical computer science", "Recursion", "Computability theory", "Articles with example pseudocode", "Programming idioms", "Subroutines" ]
projected-04044882-000
https://en.wikipedia.org/wiki/Clare%20%28electoral%20district%29
Clare (electoral district)
Introduction
Clare is a provincial electoral district in Nova Scotia, which existed from 1949 to 2013 and again since 2021. Prior to 1949, Clare was part of district. It elects one member of the Nova Scotia House of Assembly. The electoral district includes most of the , an area occupying the southwestern half of . For four consecutive elections from 1988 to 1999, the district had the highest voter turnout in the province. The electoral district was abolished following the 2012 electoral boundary review and was largely replaced by the new electoral district of Clare-Digby. It was re-created out of Clare-Digby following the 2019 Electoral Boundary Review.
[]
[ "Introduction" ]
[ "Former provincial electoral districts of Nova Scotia" ]
projected-04044882-001
https://en.wikipedia.org/wiki/Clare%20%28electoral%20district%29
Clare (electoral district)
Geography
Clare is a provincial electoral district in Nova Scotia, which existed from 1949 to 2013 and again since 2021. Prior to 1949, Clare was part of district. It elects one member of the Nova Scotia House of Assembly. The electoral district includes most of the , an area occupying the southwestern half of . For four consecutive elections from 1988 to 1999, the district had the highest voter turnout in the province. The electoral district was abolished following the 2012 electoral boundary review and was largely replaced by the new electoral district of Clare-Digby. It was re-created out of Clare-Digby following the 2019 Electoral Boundary Review.
The land area of Clare is .
[]
[ "Geography" ]
[ "Former provincial electoral districts of Nova Scotia" ]
projected-04044882-002
https://en.wikipedia.org/wiki/Clare%20%28electoral%20district%29
Clare (electoral district)
Members of the Legislative Assembly
Clare is a provincial electoral district in Nova Scotia, which existed from 1949 to 2013 and again since 2021. Prior to 1949, Clare was part of district. It elects one member of the Nova Scotia House of Assembly. The electoral district includes most of the , an area occupying the southwestern half of . For four consecutive elections from 1988 to 1999, the district had the highest voter turnout in the province. The electoral district was abolished following the 2012 electoral boundary review and was largely replaced by the new electoral district of Clare-Digby. It was re-created out of Clare-Digby following the 2019 Electoral Boundary Review.
The electoral district was represented by the following Members of the Legislative Assembly:
[]
[ "Members of the Legislative Assembly" ]
[ "Former provincial electoral districts of Nova Scotia" ]
projected-04044883-000
https://en.wikipedia.org/wiki/Anna%20M.%20Harkness
Anna M. Harkness
Introduction
Anna Maria Richardson Harkness (October 25, 1837 – March 27, 1926) was an American philanthropist.
[]
[ "Introduction" ]
[ "1837 births", "1926 deaths", "American philanthropists", "Harkness family", "Burials at Lake View Cemetery, Cleveland", "People from Wayne County, Ohio" ]
projected-04044883-001
https://en.wikipedia.org/wiki/Anna%20M.%20Harkness
Anna M. Harkness
Early life
Anna Maria Richardson Harkness (October 25, 1837 – March 27, 1926) was an American philanthropist.
She was born on October 25, 1837, in , and was the daughter of James Richardson and Anna ( Ranck) Richardson. Not much is known about her early life.
[]
[ "Early life" ]
[ "1837 births", "1926 deaths", "American philanthropists", "Harkness family", "Burials at Lake View Cemetery, Cleveland", "People from Wayne County, Ohio" ]
projected-04044883-002
https://en.wikipedia.org/wiki/Anna%20M.%20Harkness
Anna M. Harkness
Married life
Anna Maria Richardson Harkness (October 25, 1837 – March 27, 1926) was an American philanthropist.
On February 13, 1854, then sixteen-year-old Anna was married to the 34-year-old , an early investor with and became the second-largest shareholder in before his death in March 1888. Stephen had previously been married to Laura Osborne, who died in August 1852, and with whom he had three children, only one of whom, , was living at the time of their marriage. Together, Anna and Stephen lived at his estate on in Cleveland (known as Millionaires' Row) were the parents of four more children, three of whom survived to adulthood: Jennie A. Harkness (1856–1864), who died young. (1860–1916), who married Mary Warden (1864–1916) in 1896; both died of influenza in 1916. Florence Harkness (1864–1895), who married the widower (1838–1913). (1874–1940), who married Mary Stillman (1874–1950), daughter of New York attorney Thomas Stillman, in 1904. Her husband died aboard his yacht on March 6, 1888, and was buried in Cleveland's . He left an estate valued at $150,000,000 (equivalent to $ today) and Anna inherited one-third of his fortune at $50,000,000 (equivalent to $ today), consisting primarily of stock in Standard Oil. In 1891, Anna moved to New York City, but continued to maintain a home in . She died on March 27, 1926, at her home, in New York City. After a private funeral, she was buried alongside her late husband in Lake View Cemetery. At her death, she had already given away $40,000,000, yet her wealth had increased to nearly $85,000,000 (equivalent to $ today).
[]
[ "Married life" ]
[ "1837 births", "1926 deaths", "American philanthropists", "Harkness family", "Burials at Lake View Cemetery, Cleveland", "People from Wayne County, Ohio" ]
projected-04044883-003
https://en.wikipedia.org/wiki/Anna%20M.%20Harkness
Anna M. Harkness
Philanthropy
Anna Maria Richardson Harkness (October 25, 1837 – March 27, 1926) was an American philanthropist.
Their first child, Jennie, died aged seven in 1864. After her death, the Harknesses erected and furnished a memorial pavilion at in Cleveland as a memorial to her. On July 29, 1895, within a year of Florence's marriage to , the former Treasurer of , their second daughter also died. Similarly, Anna and her son-in-law Louis donated the funds for the construction of the Florence Harkness Memorial Chapel at in Cleveland. After her eldest son died in 1916, Anna gave $3,000,000 (equivalent to $ today) to for the construction of in Charles' memory, including , the most visible symbol of Yale on the skyline. Anna's portrait by is displayed in the dining hall of , part of the Memorial Quadrangle. In 1920, she donated an additional $3,000,000 to Yale towards increases in faculty salaries. In October 1918, Anna established the , a foundation dedicated to the improvement of healthcare with an initial gift of $10,000,000 (equivalent to $ today). Along with her son, , the foundation made charitable gifts totaling more than $129 million, the equivalent of $2 billion in 2005 dollars, including funds for the establishment of the s, the construction of St. Salvator's Hall at the , the at , and many of the undergraduate dormitories at and Universities (known as "houses" and "residential colleges," respectively). The fund was also a major benefactor of the and s, the in , the and , the and the , where it established the Museum's collection of . The Harkness Pavilion at / is also named for the family.
[]
[ "Married life", "Philanthropy" ]
[ "1837 births", "1926 deaths", "American philanthropists", "Harkness family", "Burials at Lake View Cemetery, Cleveland", "People from Wayne County, Ohio" ]
projected-04044887-000
https://en.wikipedia.org/wiki/USDF
USDF
Introduction
USDF may refer to: (United we stand, divided we fall), from econophysics , the Military of Swaziland , active during World War II United Student Democratic Federation, Indian leftist student association
[]
[ "Introduction" ]
[]
projected-04044887-001
https://en.wikipedia.org/wiki/USDF
USDF
See also
USDF may refer to: (United we stand, divided we fall), from econophysics , the Military of Swaziland , active during World War II United Student Democratic Federation, Indian leftist student association
"", a motto
[]
[ "See also" ]
[]
projected-04044906-000
https://en.wikipedia.org/wiki/The%20Super%20Fight
The Super Fight
Introduction
The Super Fight was a fictional between and shot in 1969 and released in 1970. At the time, Ali and Marciano were the only undefeated heavyweight champions in history and fans often debated who would win had they met in their primes. Ali and Marciano were filmed sparring for 75 one-minute rounds producing several possible scenarios for a genuine fight, with the result claimed to have been determined using formulas entered into a . The final film was only shown once in select cinemas around the world, grossing ( adjusted for inflation) from 1,500 theaters across North America and Europe. It was released as a over three decades later.
[]
[ "Introduction" ]
[ "1970 films", "1970s sports films", "American boxing films", "Muhammad Ali", "Wide World of Sports (American TV series)", "Rocky Marciano", "Cultural depictions of boxers", "Cultural depictions of Muhammad Ali", "Cultural depictions of Joe Louis", "Cultural depictions of Max Schmeling", "Cultural depictions of Jack Dempsey", "1970s English-language films", "1970s American films" ]
projected-04044906-001
https://en.wikipedia.org/wiki/The%20Super%20Fight
The Super Fight
Background
The Super Fight was a fictional between and shot in 1969 and released in 1970. At the time, Ali and Marciano were the only undefeated heavyweight champions in history and fans often debated who would win had they met in their primes. Ali and Marciano were filmed sparring for 75 one-minute rounds producing several possible scenarios for a genuine fight, with the result claimed to have been determined using formulas entered into a . The final film was only shown once in select cinemas around the world, grossing ( adjusted for inflation) from 1,500 theaters across North America and Europe. It was released as a over three decades later.
In 1967, Murray Woroner had the idea of determining the all-time great heavyweight champion of the world by placing boxing champions of different eras in a series of fantasy fights. Woroner sent out a survey to 250 boxing experts and writers to help determine which boxers would be used in what would become a fantasy tournament. Hank Meyer, president and salesman with one other partner in SPS, was instrumental in setting this competition up, and contended at the time that it was his idea. Woroner picked the eight first-round fantasy matches. Punch-by-punch details of the boxers' records during their primes were entered into an NCR 315 computer. Their strengths, weaknesses, fighting styles and patterns, and other factors that the boxers could go through were also converted into formulas. The NCR-315, with 20K of memory, was supplied by SPS (Systems Programming Services), an independent service bureau. The algorithms were supplied by an NCR mathematician, and programming was done by an employee of SPS. The actual running of the software was done the night before each broadcast round of the 'computer championship' and took approximately 45 minutes; the output was a formatted report containing a series of codes describing each punch. This was then written to magnetic tape, and the tape was then manually transferred to a 1005 and printed. This took place in early 1968. The outcomes were then staged as radio broadcasts, with Woroner and radio announcer Guy LeBow calling the fights. The fantasy fights were broadcast worldwide. Even the boxers who were still alive at the time listened to the programs, and some of them participated as commentators. After the series of elimination rounds, the final fight was between Dempsey and Marciano. Marciano defeated Dempsey and was considered to be the all-time greatest heavyweight champion by the computer. Woroner awarded the real Marciano a gold and diamond belt worth $10,000.
[]
[ "Background" ]
[ "1970 films", "1970s sports films", "American boxing films", "Muhammad Ali", "Wide World of Sports (American TV series)", "Rocky Marciano", "Cultural depictions of boxers", "Cultural depictions of Muhammad Ali", "Cultural depictions of Joe Louis", "Cultural depictions of Max Schmeling", "Cultural depictions of Jack Dempsey", "1970s English-language films", "1970s American films" ]
projected-04044906-002
https://en.wikipedia.org/wiki/The%20Super%20Fight
The Super Fight
The film
The Super Fight was a fictional between and shot in 1969 and released in 1970. At the time, Ali and Marciano were the only undefeated heavyweight champions in history and fans often debated who would win had they met in their primes. Ali and Marciano were filmed sparring for 75 one-minute rounds producing several possible scenarios for a genuine fight, with the result claimed to have been determined using formulas entered into a . The final film was only shown once in select cinemas around the world, grossing ( adjusted for inflation) from 1,500 theaters across North America and Europe. It was released as a over three decades later.
After Ali lost a fantasy fight in one of the radio broadcasts, he filed a $1 million lawsuit against Woroner for defamation, stating his anger at his elimination in the second round to Jim Jeffries, a boxer Ali had previously called "history's clumsiest, most slow-footed heavyweight." The lawsuit was settled when Woroner offered to pay Ali $10,000 while also getting his agreement to participate in a filmed version of a fantasy fight in which he would fight Marciano. Ali agreed on the condition that he would also receive a cut of the film's profits. Marciano, whose last fight before retiring undefeated at 49–0 was 14 years prior, also agreed to participate under a similar deal. In preparation for the film, Rocky lost weight and wore a toupee in order to look as he did in his prime. Both he and Ali were reported to be enthusiastic about meeting each other and getting back in the ring. The same formulas as in the radio fantasy fights were used and entered into the NCR 315, with filming commencing in February 1969 in a studio. The two fighters sparred for between 70 and 75 rounds, exchanging mainly body blows with some head shots in between, which were later edited together according to the findings of the computer. Braddock, Louis, Schmeling, Sharkey and Walcott also recorded commentary to be used in the film. The final outcome would not be revealed until the release of the film on January 20, 1970, shown in 1,500 theaters in the United States, Canada, and throughout Europe. American and Canadian audiences were shown a version of Marciano knocking out Ali in the 13th round, as staged by the boxers, while European audiences were shown another ending in which Ali was depicted as the winner after opening cuts on Marciano, also simulated.
[ "The Superfight - Rocky Marciano vs Muhammad Ali Ticket.jpg" ]
[ "The film" ]
[ "1970 films", "1970s sports films", "American boxing films", "Muhammad Ali", "Wide World of Sports (American TV series)", "Rocky Marciano", "Cultural depictions of boxers", "Cultural depictions of Muhammad Ali", "Cultural depictions of Joe Louis", "Cultural depictions of Max Schmeling", "Cultural depictions of Jack Dempsey", "1970s English-language films", "1970s American films" ]
projected-04044906-003
https://en.wikipedia.org/wiki/The%20Super%20Fight
The Super Fight
Box office and reaction
The Super Fight was a fictional between and shot in 1969 and released in 1970. At the time, Ali and Marciano were the only undefeated heavyweight champions in history and fans often debated who would win had they met in their primes. Ali and Marciano were filmed sparring for 75 one-minute rounds producing several possible scenarios for a genuine fight, with the result claimed to have been determined using formulas entered into a . The final film was only shown once in select cinemas around the world, grossing ( adjusted for inflation) from 1,500 theaters across North America and Europe. It was released as a over three decades later.
In the United States, the film grossed more than from more than a thousand theaters. Across North America and Europe, the film grossed ( adjusted for inflation) from 1,500 theaters. Three weeks after filming was completed, Rocky Marciano died in a plane crash on the eve of what would have been his 46th birthday. No feedback was recorded from him personally regarding the film, with the exception of Marciano's brother Peter, who claimed that when Rocky was asked whether he would win the fantasy fight, he was confident that he would. Ali attended a screening of the film on the night of the release. He immediately relaunched legal proceedings against Woroner, again claiming defamation of character and alleging that the film's marketing had misled audiences worldwide into believing the fight was real, while also stating that any version of the film which depicted him losing was a result of him not taking the simulation seriously. He also claimed American audiences were left angered by Marciano being depicted as the winner and disputed whether the NCR 315 computer was used at all during or after filming. Ali later dropped the lawsuit upon discovering his depicted win in European theatres, while also having been made aware of the filmmakers' plans to destroy the remaining prints of the film to prevent potential legal action. In a 1976 interview, Ali briefly revisited the film, maintaining his ridicule of the style of filming and the depicted outcomes. He did, however, praise Marciano as a boxer, stating they had left filming on good terms.
[]
[ "Box office and reaction" ]
[ "1970 films", "1970s sports films", "American boxing films", "Muhammad Ali", "Wide World of Sports (American TV series)", "Rocky Marciano", "Cultural depictions of boxers", "Cultural depictions of Muhammad Ali", "Cultural depictions of Joe Louis", "Cultural depictions of Max Schmeling", "Cultural depictions of Jack Dempsey", "1970s English-language films", "1970s American films" ]
projected-04044906-004
https://en.wikipedia.org/wiki/The%20Super%20Fight
The Super Fight
Destruction of film prints and recovery
The Super Fight was a fictional between and shot in 1969 and released in 1970. At the time, Ali and Marciano were the only undefeated heavyweight champions in history and fans often debated who would win had they met in their primes. Ali and Marciano were filmed sparring for 75 one-minute rounds producing several possible scenarios for a genuine fight, with the result claimed to have been determined using formulas entered into a . The final film was only shown once in select cinemas around the world, grossing ( adjusted for inflation) from 1,500 theaters across North America and Europe. It was released as a over three decades later.
During the buildup to the film's release, concerns were raised regarding Ali's ban from boxing, which was active at the time of the film's conceptualization, recording and release; these were later fueled by allegations that marketing and promotional work for the film did not clearly state that the fight was fictional and that the outcome was decided by the NCR 315 computer as well as the opinions of boxing experts. Upon the film's release, with audiences believed to have been misled into thinking the fight was real and with Ali threatening a second lawsuit against Woroner, the producers announced that all film prints had been destroyed. Debates subsequently took place over the next three decades as to whether at least one print of the film had survived. It was cited that many theaters had continued to play the film long after January 20, 1970, and it was also noted that the film had one airing on ABC's Wide World of Sports in 1970, and another in a late-night slot in 1977, with many more broadcasts alleged throughout. Following the official discovery of a surviving print in 2005, the film was authorized for release and distribution. On December 27, 2005, The Superfight: Marciano vs. Ali was released on DVD and has been televised several times since. The DVD includes a documentary about the film, audio of the original radio fantasy fights, archival interviews with the fighters that were chosen, and other features.
[]
[ "Destruction of film prints and recovery" ]
[ "1970 films", "1970s sports films", "American boxing films", "Muhammad Ali", "Wide World of Sports (American TV series)", "Rocky Marciano", "Cultural depictions of boxers", "Cultural depictions of Muhammad Ali", "Cultural depictions of Joe Louis", "Cultural depictions of Max Schmeling", "Cultural depictions of Jack Dempsey", "1970s English-language films", "1970s American films" ]
projected-04044906-005
https://en.wikipedia.org/wiki/The%20Super%20Fight
The Super Fight
Legacy
The Super Fight was a fictional between and shot in 1969 and released in 1970. At the time, Ali and Marciano were the only undefeated heavyweight champions in history and fans often debated who would win had they met in their primes. Ali and Marciano were filmed sparring for 75 one-minute rounds producing several possible scenarios for a genuine fight, with the result claimed to have been determined using formulas entered into a . The final film was only shown once in select cinemas around the world, grossing ( adjusted for inflation) from 1,500 theaters across North America and Europe. It was released as a over three decades later.
The Super Fight was featured in and inspired the plot of the 2006 film .
[]
[ "Legacy" ]
[ "1970 films", "1970s sports films", "American boxing films", "Muhammad Ali", "Wide World of Sports (American TV series)", "Rocky Marciano", "Cultural depictions of boxers", "Cultural depictions of Muhammad Ali", "Cultural depictions of Joe Louis", "Cultural depictions of Max Schmeling", "Cultural depictions of Jack Dempsey", "1970s English-language films", "1970s American films" ]
projected-04044906-006
https://en.wikipedia.org/wiki/The%20Super%20Fight
The Super Fight
References
The Super Fight was a fictional between and shot in 1969 and released in 1970. At the time, Ali and Marciano were the only undefeated heavyweight champions in history and fans often debated who would win had they met in their primes. Ali and Marciano were filmed sparring for 75 one-minute rounds producing several possible scenarios for a genuine fight, with the result claimed to have been determined using formulas entered into a . The final film was only shown once in select cinemas around the world, grossing ( adjusted for inflation) from 1,500 theaters across North America and Europe. It was released as a over three decades later.
Boxing: All-Time Heavyweight Championship of the World, reproduced from
[]
[ "References" ]
[ "1970 films", "1970s sports films", "American boxing films", "Muhammad Ali", "Wide World of Sports (American TV series)", "Rocky Marciano", "Cultural depictions of boxers", "Cultural depictions of Muhammad Ali", "Cultural depictions of Joe Louis", "Cultural depictions of Max Schmeling", "Cultural depictions of Jack Dempsey", "1970s English-language films", "1970s American films" ]
projected-04044914-000
https://en.wikipedia.org/wiki/Dioscorea%20opposita
Dioscorea opposita
Introduction
Dioscorea opposita is an obsolete synonym of two species of yams: Chinese yam (Dioscorea polystachya), a widely cultivated yam native to China, and Dioscorea oppositifolia, a yam native to the Indian subcontinent
[]
[ "Introduction" ]
[ "Species Latin name disambiguation pages", "Dioscorea" ]
projected-04044917-000
https://en.wikipedia.org/wiki/James%20Marshall%20%28author%29
James Marshall (author)
Introduction
James Edward Marshall (October 10, 1942 – October 13, 1992) was an and writer of , probably best known for the series of (1972–1988). He illustrated books exclusively as James Marshall; when he created both text and illustrations he sometimes wrote as Edward Marshall. In 2007, the U.S. professional librarians posthumously awarded him the bi-ennial for "substantial and lasting contribution" to American children's literature.
[]
[ "Introduction" ]
[ "1942 births", "1992 deaths", "American children's book illustrators", "American children's writers", "Laura Ingalls Wilder Medal winners", "Writers from San Antonio", "Place of death missing", "American male writers", "20th-century American writers", "Southern Connecticut State University alumni", "Artists from Texas", "Deaths from brain tumor", "People from Chelsea, Manhattan" ]
projected-04044917-001
https://en.wikipedia.org/wiki/James%20Marshall%20%28author%29
James Marshall (author)
Life and death
James Edward Marshall (October 10, 1942 – October 13, 1992) was an and writer of , probably best known for the series of (1972–1988). He illustrated books exclusively as James Marshall; when he created both text and illustrations he sometimes wrote as Edward Marshall. In 2007, the U.S. professional librarians posthumously awarded him the bi-ennial for "substantial and lasting contribution" to American children's literature.
James Marshall was born in 1942, in , where he grew up on his family's 85-acre farm. His father worked on the railroad and had a band. His mother sang in the local church choir. The family later moved to . Marshall said: "Beaumont is deep south and swampy and I hated it. I knew I would die if I stayed there so I diligently studied the viola, and eventually won a scholarship to the New England Conservatory in Boston." He entered the but injured his hand, ending his music career. He returned to Texas, where he attended , and later transferred to where he received degrees in French and history. He lived between an apartment in the district of and a home in , . He died on October 13, 1992, three days after his 50th birthday. His obituary states that he died of a brain tumor; however, his sister has since clarified that he died of AIDS.
[]
[ "Life and death" ]
[ "1942 births", "1992 deaths", "American children's book illustrators", "American children's writers", "Laura Ingalls Wilder Medal winners", "Writers from San Antonio", "Place of death missing", "American male writers", "20th-century American writers", "Southern Connecticut State University alumni", "Artists from Texas", "Deaths from brain tumor", "People from Chelsea, Manhattan" ]
projected-04044917-002
https://en.wikipedia.org/wiki/James%20Marshall%20%28author%29
James Marshall (author)
Career
James Edward Marshall (October 10, 1942 – October 13, 1992) was an and writer of , probably best known for the series of (1972–1988). He illustrated books exclusively as James Marshall; when he created both text and illustrations he sometimes wrote as Edward Marshall. In 2007, the U.S. professional librarians posthumously awarded him the bi-ennial for "substantial and lasting contribution" to American children's literature.
It is stated that he discovered his vocation on a 1971 summer afternoon, lying in a hammock and drawing. His mother was watching , and the main characters, George and Martha, ultimately became characters in one of his children's books (as two ). Marshall continued creating books for children until his untimely death in 1992 from AIDS-related complications. In 1999, George and Martha became the stars of an , which aired on and Canadian . Marshall was a friend of the late , who called him the "last in the line" of children's writers for whom children's books were a cottage industry. Sendak said that Marshall was "uncommercial to a fault" and, as a consequence, was little recognized by the awards committees. (As illustrator of Goldilocks and the Three Bears, Marshall was a runner-up for the in 1989; the "Caldecott Honor Books" may display silver rather than gold seals. He won a University of Mississippi Silver Medallion in 1992. Over his career, he was three times recognized by the New York Times Book Review as one of the best illustrated children's book of the year.) Sendak said that in Marshall you got "the whole man", who "scolded, gossiped, bitterly reproached, but always loved and forgave" and "made me laugh until I cried." In introduction to the collected George and Martha, Sendak called him the "last of a long line of masters" including , , , and . Beside the lovable hippos George and Martha, James Marshall created dozens of other uniquely appealing characters and illustrated over 70 books. He is well known for his Fox series (which he wrote as "Edward Marshall"), as well as the Miss Nelson books (or Miss Viola Swamp, written by Harry Allard), (written by Allard), the Cut-ups, and many more. James Marshall had the uncanny ability to elicit wild delight from readers with relatively little text and simple drawings. With only two minute dots for eyes, his illustrated characters are able to express a wide range of emotion, and produce howls of laughter from both children and adults.
[]
[ "Career" ]
[ "1942 births", "1992 deaths", "American children's book illustrators", "American children's writers", "Laura Ingalls Wilder Medal winners", "Writers from San Antonio", "Place of death missing", "American male writers", "20th-century American writers", "Southern Connecticut State University alumni", "Artists from Texas", "Deaths from brain tumor", "People from Chelsea, Manhattan" ]
projected-04044926-000
https://en.wikipedia.org/wiki/Glace%20Bay-Dominion
Glace Bay-Dominion
Introduction
Glace Bay-Dominion is a provincial in , , that elects one member of the . The since 2021 is John White of the Progressive Conservative Party of Nova Scotia. It was created in 1933 when the district of Cape Breton was divided into five electoral districts, one of which was named Cape Breton East. In 2001, the district name was changed to Glace Bay. In 2003, the district lost a small area at its southern tip to . Following the 2019 redistribution, it gained the area from and was re-named Glace Bay-Dominion.
[]
[ "Introduction" ]
[ "Nova Scotia provincial electoral districts", "Politics of the Cape Breton Regional Municipality" ]
projected-04044926-001
https://en.wikipedia.org/wiki/Glace%20Bay-Dominion
Glace Bay-Dominion
Geography
Glace Bay-Dominion is a provincial in , , that elects one member of the . The since 2021 is John White of the Progressive Conservative Party of Nova Scotia. It was created in 1933 when the district of Cape Breton was divided into five electoral districts, one of which was named Cape Breton East. In 2001, the district name was changed to Glace Bay. In 2003, the district lost a small area at its southern tip to . Following the 2019 redistribution, it gained the area from and was re-named Glace Bay-Dominion.
The land area of Glace Bay-Dominion is .
[]
[ "Geography" ]
[ "Nova Scotia provincial electoral districts", "Politics of the Cape Breton Regional Municipality" ]
projected-04044926-002
https://en.wikipedia.org/wiki/Glace%20Bay-Dominion
Glace Bay-Dominion
Members of the Legislative Assembly
Glace Bay-Dominion is a provincial in , , that elects one member of the . The since 2021 is John White of the Progressive Conservative Party of Nova Scotia. It was created in 1933 when the district of Cape Breton was divided into five electoral districts, one of which was named Cape Breton East. In 2001, the district name was changed to Glace Bay. In 2003, the district lost a small area at its southern tip to . Following the 2019 redistribution, it gained the area from and was re-named Glace Bay-Dominion.
This riding has elected the following Members of the Legislative Assembly:
[]
[ "Members of the Legislative Assembly" ]
[ "Nova Scotia provincial electoral districts", "Politics of the Cape Breton Regional Municipality" ]
projected-04044926-019
https://en.wikipedia.org/wiki/Glace%20Bay-Dominion
Glace Bay-Dominion
1980 by-election
Glace Bay-Dominion is a provincial in , , that elects one member of the . The since 2021 is John White of the Progressive Conservative Party of Nova Scotia. It was created in 1933 when the district of Cape Breton was divided into five electoral districts, one of which was named Cape Breton East. In 2001, the district name was changed to Glace Bay. In 2003, the district lost a small area at its southern tip to . Following the 2019 redistribution, it gained the area from and was re-named Glace Bay-Dominion.
Donnie MacLeod: 4,505 votes
Reeves Matheson: 2,996 votes
Vincent Kachafanas: 2,904 votes
Ignatius V. Kennedy: 101 votes
[]
[ "Election results", "1980 by-election" ]
[ "Nova Scotia provincial electoral districts", "Politics of the Cape Breton Regional Municipality" ]
projected-04044926-026
https://en.wikipedia.org/wiki/Glace%20Bay-Dominion
Glace Bay-Dominion
2000 by-election
Glace Bay-Dominion is a provincial in , , that elects one member of the . The since 2021 is John White of the Progressive Conservative Party of Nova Scotia. It was created in 1933 when the district of Cape Breton was divided into five electoral districts, one of which was named Cape Breton East. In 2001, the district name was changed to Glace Bay. In 2003, the district lost a small area at its southern tip to . Following the 2019 redistribution, it gained the area from and was re-named Glace Bay-Dominion.
: 4,017 votes (43.33%)
Cecil Saccary: 3,609 votes (38.93%)
Brad Kerr: 1,644 votes (17.74%)
[]
[ "Election results", "2000 by-election" ]
[ "Nova Scotia provincial electoral districts", "Politics of the Cape Breton Regional Municipality" ]
projected-04044926-030
https://en.wikipedia.org/wiki/Glace%20Bay-Dominion
Glace Bay-Dominion
2010 by-election
Glace Bay-Dominion is a provincial in , , that elects one member of the . The since 2021 is John White of the Progressive Conservative Party of Nova Scotia. It was created in 1933 when the district of Cape Breton was divided into five electoral districts, one of which was named Cape Breton East. In 2001, the district name was changed to Glace Bay. In 2003, the district lost a small area at its southern tip to . Following the 2019 redistribution, it gained the area from and was re-named Glace Bay-Dominion.
Myrtle Campbell: 2,281 votes (31.52%)
Michelle Wheelhouse: 759 votes (10.48%)
Edna Lee (Independent): 195 votes (2.69%)
[]
[ "Election results", "2010 by-election" ]
[ "Nova Scotia provincial electoral districts", "Politics of the Cape Breton Regional Municipality" ]
projected-04044926-031
https://en.wikipedia.org/wiki/Glace%20Bay-Dominion
Glace Bay-Dominion
2013 general election
Glace Bay-Dominion is a provincial in , , that elects one member of the . The since 2021 is John White of the Progressive Conservative Party of Nova Scotia. It was created in 1933 when the district of Cape Breton was divided into five electoral districts, one of which was named Cape Breton East. In 2001, the district name was changed to Glace Bay. In 2003, the district lost a small area at its southern tip to . Following the 2019 redistribution, it gained the area from and was re-named Glace Bay-Dominion.
: 5,547 votes (80.36%)
Mary Beth MacDonald: 1,001 votes (14.50%)
Thomas Bethell: 355 votes (5.14%)
[]
[ "Election results", "2013 general election" ]
[ "Nova Scotia provincial electoral districts", "Politics of the Cape Breton Regional Municipality" ]
projected-04044935-000
https://en.wikipedia.org/wiki/Eric%20Yoffie
Eric Yoffie
Introduction
Eric H. Yoffie is a , and President Emeritus of the (URJ), the congregational arm of the Reform movement in North America, which represents an estimated 1.5 million Reform Jews in more than 900 synagogues across the United States and Canada. He was the unchallenged head of American Judaism's largest denomination from 1996 to 2012. Following his retirement in 2012, he has been a lecturer and writer; his writings appear regularly in , , and .
[]
[ "Introduction" ]
[ "Living people", "1940s births", "American Reform rabbis", "Rabbis from Massachusetts", "Brandeis University alumni", "Hebrew Union College alumni", "20th-century American rabbis", "21st-century American rabbis", "People from Worcester, Massachusetts", "People from Lynbrook, New York", "People from Durham, North Carolina", "People from Westfield, New Jersey" ]
projected-04044935-001
https://en.wikipedia.org/wiki/Eric%20Yoffie
Eric Yoffie
Family and career
Eric H. Yoffie is a , and President Emeritus of the (URJ), the congregational arm of the Reform movement in North America, which represents an estimated 1.5 million Reform Jews in more than 900 synagogues across the United States and Canada. He was the unchallenged head of American Judaism's largest denomination from 1996 to 2012. Following his retirement in 2012, he has been a lecturer and writer; his writings appear regularly in , , and .
Rabbi Yoffie was raised in , where his family belonged to historic , and he was involved in the Reform Movement's Youth organization, the North American Federation of Temple Youth (NFTY). He first held the position of president in the Northeast Region of NFTY before moving on to be the organization's Vice President in 1965–1966. After high school Yoffie spent his first year at , and graduated from . He received his from in New York in 1974. He served congregations in , and , before joining the URJ as director of the Midwest Council in 1980. In 1983 he was named Executive Director of the (ARZA). In 1992 he became vice president of the URJ and director of the Commission on Social Action. In addition, he served as executive editor of the magazine. On July 1, 1996, he succeeded Rabbi as president of the Union for Reform Judaism. In 1999 named Yoffie the number one Jewish leader in America. In 2009 named him # 8 on its list of "50 Influential Rabbis." He is married to Amy Jacobson Yoffie. The couple has two children, and reside in . On June 10, 2010, Rabbi Yoffie announced his intention to step down from the post of president of the URJ at the age of 65, in June 2012. He was succeeded by , who had served as the senior rabbi at Westchester Reform Temple in .
[]
[ "Family and career" ]
[ "Living people", "1940s births", "American Reform rabbis", "Rabbis from Massachusetts", "Brandeis University alumni", "Hebrew Union College alumni", "20th-century American rabbis", "21st-century American rabbis", "People from Worcester, Massachusetts", "People from Lynbrook, New York", "People from Durham, North Carolina", "People from Westfield, New Jersey" ]
projected-04044935-002
https://en.wikipedia.org/wiki/Eric%20Yoffie
Eric Yoffie
Views on Jewish life
Eric H. Yoffie is a , and President Emeritus of the (URJ), the congregational arm of the Reform movement in North America, which represents an estimated 1.5 million Reform Jews in more than 900 synagogues across the United States and Canada. He was the unchallenged head of American Judaism's largest denomination from 1996 to 2012. Following his retirement in 2012, he has been a lecturer and writer; his writings appear regularly in , , and .
Rabbi Yoffie has been a proponent of increased traditionalism within Reform Judaism, encouraging a greater focus on Jewish text study and prayer. Dr. , the dean of American Jewish historians, noted that Yoffie devoted time as President of the URJ to bringing "old ideas" to Reform Judaism, "urging its rank and file to focus on enriching their spiritual lives and expanding their knowledge of Judaism." During his tenure, he announced two major worship initiatives. The first, in 1999, was designed to help congregations become "houses in which we pray with joy." The second, eight years later, fostered Shabbat observance among individual Reform Jews while encouraging congregations to rethink their Shabbat morning worship. Rabbi Yoffie was also a proponent of lifelong Jewish study and helped synagogues to develop programs that increased Jewish literacy among adults. In 2005, he introduced the curriculum to teach sexual ethics to teens in Reform camps and congregations. In his recent writings, Yoffie has argued against understandings of Judaism that are primarily secular or cultural, referring to such Jews as "self-delusional," and suggesting that such understandings mistake a part for the whole and that a religiously-grounded Judaism is essential to assure the Jewish future.
[]
[ "Views on Jewish life" ]
[ "Living people", "1940s births", "American Reform rabbis", "Rabbis from Massachusetts", "Brandeis University alumni", "Hebrew Union College alumni", "20th-century American rabbis", "21st-century American rabbis", "People from Worcester, Massachusetts", "People from Lynbrook, New York", "People from Durham, North Carolina", "People from Westfield, New Jersey" ]
projected-04044935-003
https://en.wikipedia.org/wiki/Eric%20Yoffie
Eric Yoffie
Views on interfaith relations
Eric H. Yoffie is a , and President Emeritus of the (URJ), the congregational arm of the Reform movement in North America, which represents an estimated 1.5 million Reform Jews in more than 900 synagogues across the United States and Canada. He was the unchallenged head of American Judaism's largest denomination from 1996 to 2012. Following his retirement in 2012, he has been a lecturer and writer; his writings appear regularly in , , and .
Rabbi Yoffie has been a pioneer in and launched Movement-wide dialogue programs with both Christians and Muslims. In 2005, he was the first Jew to address the Churchwide Assembly of the . Later that year, he harshly criticized some positions of the , but in 2006 he accepted the invitation of the Rev. to address the students and faculty of ; as the first Rabbi to appear at a university-wide convocation, he talked frankly of areas of agreement and disagreement between Evangelical Christians and Jews. Yoffie first spoke on shared values of family and morality before defending and , which elicited boos from the students. On August 21, 2007, Rabbi Yoffie was the first leader of a major Jewish organization to speak at the convention of the . In his remarks he spoke of "a huge and profound ignorance of Islam" by Jews and Christians in North America. He stated that "the time has come to listen to our Muslim neighbors speak, from their heart and in their own words, about the spiritual power of Islam and their love for their religion." He also asked Muslims for more understanding of Judaism: “The dialogue will not be one way, of course. You will teach us about Islam and we will teach you about Judaism. We will help you to overcome stereotyping of Muslims, and you will help us to overcome stereotyping of Jews.” Rabbi Yoffie later was a supporter of the , and he has been a strong advocate for the rights of . In contrast to these above interfaith efforts, Yoffie strongly disagrees with , claiming that it lacks "humility, imagination, and curiosity."
[]
[ "Views on interfaith relations" ]
[ "Living people", "1940s births", "American Reform rabbis", "Rabbis from Massachusetts", "Brandeis University alumni", "Hebrew Union College alumni", "20th-century American rabbis", "21st-century American rabbis", "People from Worcester, Massachusetts", "People from Lynbrook, New York", "People from Durham, North Carolina", "People from Westfield, New Jersey" ]
projected-04044935-004
https://en.wikipedia.org/wiki/Eric%20Yoffie
Eric Yoffie
Views on social justice
Eric H. Yoffie is a , and President Emeritus of the (URJ), the congregational arm of the Reform movement in North America, which represents an estimated 1.5 million Reform Jews in more than 900 synagogues across the United States and Canada. He was the unchallenged head of American Judaism's largest denomination from 1996 to 2012. Following his retirement in 2012, he has been a lecturer and writer; his writings appear regularly in , , and .
As President of the URJ, Rabbi Yoffie spoke to a wide variety of social justice issues. He opposed the death penalty, supported LBGT rights, and was a prominent spokesperson for . He was the only religious leader to appear at the in Washington, D.C., declaring that "the indiscriminate distribution of guns is an offense against God and humanity." Rabbi Yoffie went on to state that "our gun-flooded society has turned weapons into idols, and the worship of idols must be recognized for what it is—blasphemy. And the only appropriate religious response to blasphemy is sustained moral outrage."
[]
[ "Views on social justice" ]
[ "Living people", "1940s births", "American Reform rabbis", "Rabbis from Massachusetts", "Brandeis University alumni", "Hebrew Union College alumni", "20th-century American rabbis", "21st-century American rabbis", "People from Worcester, Massachusetts", "People from Lynbrook, New York", "People from Durham, North Carolina", "People from Westfield, New Jersey" ]
projected-04044935-005
https://en.wikipedia.org/wiki/Eric%20Yoffie
Eric Yoffie
Views on relations with Israel
Eric H. Yoffie is a , and President Emeritus of the (URJ), the congregational arm of the Reform movement in North America, which represents an estimated 1.5 million Reform Jews in more than 900 synagogues across the United States and Canada. He was the unchallenged head of American Judaism's largest denomination from 1996 to 2012. Following his retirement in 2012, he has been a lecturer and writer; his writings appear regularly in , , and .
Rabbi Yoffie has devoted much of his public life to working on behalf of the Jewish state and to promoting close ties between Israel and . During his years as URJ President, he met frequently with Israel's elected officials to present the concerns of the Reform movement and North American Jewry. He has been a prominent advocate of religious freedom and religious pluralism in Israel, arguing that the cause of Judaism can only be advanced by education and persuasion and not by coercion. In an incident that drew international headlines, Rabbi Yoffie in June 2006 declined to meet with Israeli President after the President refused to address Rabbi Yoffie with the title "Rabbi". The does not recognize rabbinic ordinations from non- institutions. In 2014, Rabbi Yoffie challenged the presidential candidate, , by asking if he would address Reform rabbis by the title "rabbi." While Rivlin did not respond directly to this issue while a candidate, a source close to him responded that he "has always received Rabbi Yoffie respectfully and will continue to have a wonderful relationship with Diaspora Jews."
[]
[ "Views on relations with Israel" ]
[ "Living people", "1940s births", "American Reform rabbis", "Rabbis from Massachusetts", "Brandeis University alumni", "Hebrew Union College alumni", "20th-century American rabbis", "21st-century American rabbis", "People from Worcester, Massachusetts", "People from Lynbrook, New York", "People from Durham, North Carolina", "People from Westfield, New Jersey" ]
projected-04044935-006
https://en.wikipedia.org/wiki/Eric%20Yoffie
Eric Yoffie
Contemporary spirituality
Eric H. Yoffie is a , and President Emeritus of the (URJ), the congregational arm of the Reform movement in North America, which represents an estimated 1.5 million Reform Jews in more than 900 synagogues across the United States and Canada. He was the unchallenged head of American Judaism's largest denomination from 1996 to 2012. Following his retirement in 2012, he has been a lecturer and writer; his writings appear regularly in , , and .
In his recent writings, in the and elsewhere, Rabbi Yoffie has addressed broad questions of belief and spirituality in American life. In particular, he has applied a progressive religious point of view to issues of sin, atheism, and community, as well as contemporary matters such as immigration, health care, and economic justice. In "What it Means to be a Liberal Person of Faith" and in other widely read articles, he has suggested that progressive religion has a vital role to play during a time of "s," fear of terrorism, and economic uncertainty.
[]
[ "Contemporary spirituality" ]
[ "Living people", "1940s births", "American Reform rabbis", "Rabbis from Massachusetts", "Brandeis University alumni", "Hebrew Union College alumni", "20th-century American rabbis", "21st-century American rabbis", "People from Worcester, Massachusetts", "People from Lynbrook, New York", "People from Durham, North Carolina", "People from Westfield, New Jersey" ]
projected-04044946-000
https://en.wikipedia.org/wiki/National%20Emblem
National Emblem
Introduction
"National Emblem", also known as the National Emblem March, is a U.S. composed in 1902 and published in 1906 by . It is a standard of the U.S. march repertoire, appearing in eleven published editions. The U.S. military uses the trio section as ceremonial music for the entry of the ceremony's official party.
[]
[ "Introduction" ]
[ "1902 compositions", "1902 songs", "American marches", "American patriotic songs", "Concert band pieces" ]
projected-04044946-001
https://en.wikipedia.org/wiki/National%20Emblem
National Emblem
History
"National Emblem", also known as the National Emblem March, is a U.S. composed in 1902 and published in 1906 by . It is a standard of the U.S. march repertoire, appearing in eleven published editions. The U.S. military uses the trio section as ceremonial music for the entry of the ceremony's official party.
Bagley composed the score during a 1902 train tour with his family band, Wheeler's Band of Bellows Falls, Vermont. He became frustrated with the ending and tossed the composition in a bin. Members of the band retrieved it and secretly rehearsed the score in the . Bagley was surprised when the band informed him minutes before the next concert that they would perform it. It became the most famous of all of Bagley's marches. Despite this, the composition did not make Bagley wealthy; he sold the copyright for $25. In the first strain, Bagley incorporated the first twelve notes of "" played by euphonium, bassoon, alto clarinet, tenor saxophone, and trombone, disguised in duple rather than triple time. The rest of the notes are all Bagley's, including the four short repeated A-flat major chords that lead to a statement by the low brass that is now reminiscent of the national anthem. Unusually, Bagley's march does not incorporate either a or a stinger. However, the exact repetition of the trio's melody at a chromatic mediant (A-flat Major/m.3 of Trio, then C Major/m.10 of Trio) is suggestive of a breakstrain. The band of made the first recording of the march on May 19, 1908, followed by a recording on March 21, 1914 (both recordings by the ).
[ "\"National Emblem\" performed by the United States Naval Academy Band in 1977.oga" ]
[ "History" ]
[ "1902 compositions", "1902 songs", "American marches", "American patriotic songs", "Concert band pieces" ]
projected-04044946-002
https://en.wikipedia.org/wiki/National%20Emblem
National Emblem
Reception
"National Emblem", also known as the National Emblem March, is a U.S. composed in 1902 and published in 1906 by . It is a standard of the U.S. march repertoire, appearing in eleven published editions. The U.S. military uses the trio section as ceremonial music for the entry of the ceremony's official party.
was once asked to list the three most effective street marches ever written. Sousa listed two of his own compositions, but he selected "National Emblem" for the third. When Sousa formed and conducted the 350-member U.S. Navy Jacket Band at the , he chose five marches for World War I drives. Four were by Sousa ("Semper Fidelis", "Washington Post", "The Thunderer", and "Stars and Stripes Forever"); the fifth was Bagley's "National Emblem March".
[]
[ "Reception" ]
[ "1902 compositions", "1902 songs", "American marches", "American patriotic songs", "Concert band pieces" ]
projected-04044946-003
https://en.wikipedia.org/wiki/National%20Emblem
National Emblem
Legacy
"National Emblem", also known as the National Emblem March, is a U.S. composed in 1902 and published in 1906 by . It is a standard of the U.S. march repertoire, appearing in eleven published editions. The U.S. military uses the trio section as ceremonial music for the entry of the ceremony's official party.
"National Emblem March" was the favorite march composition of , who made an arrangement of it in 1981. Fennell called the piece "as perfect a march as a march can be". Besides Fennell's arrangement, there are also band arrangements by Albert Morris (1978), Andrew Balent (1982), Paul Lavender (1986), and Loris J. Schissel (2000).
[]
[ "Legacy" ]
[ "1902 compositions", "1902 songs", "American marches", "American patriotic songs", "Concert band pieces" ]
projected-04044946-004
https://en.wikipedia.org/wiki/National%20Emblem
National Emblem
In popular culture
"National Emblem", also known as the National Emblem March, is a U.S. composed in 1902 and published in 1906 by . It is a standard of the U.S. march repertoire, appearing in eleven published editions. The U.S. military uses the trio section as ceremonial music for the entry of the ceremony's official party.
In 1960 a group of studio musicians led by recorded a arrangement of the tune, which was subsequently released as a single under the title National City and credited to the Joiner (Arkansas) Junior High School Band. It became a minor hit, reaching #53 on the chart. The march has been featured in films such as , and . A theme from the march is quoted in song .
[]
[ "In popular culture" ]
[ "1902 compositions", "1902 songs", "American marches", "American patriotic songs", "Concert band pieces" ]
projected-04044947-000
https://en.wikipedia.org/wiki/Levsen%20Organ%20Company
Levsen Organ Company
Introduction
Levsen Organ Company is a manufacturer of s based out of , which is near the . Levsen began operations as a tuning and repair facility for electric pianos and organs in 1954. For the first 11 years, this remained the scope of the business. Company founder Rodney E. Levsen began working with a major pipe organ builder and completed an apprenticeship. After this, he began offering his services tuning and repairing pipe organs. In 1980 he began building organs under the Levsen name. , he has built 53 organs of a variety of sizes, and is currently working on an additional six organs. Levsen organs can be found throughout the United States. He also helps service and maintain over 150 existing instruments, mainly in the upper Midwest. In addition to organ work, his company has developed tools and computer software that are also used by other builders.
[]
[ "Introduction" ]
[ "Pipe organ building companies", "Musical instrument manufacturing companies of the United States" ]
projected-04044954-000
https://en.wikipedia.org/wiki/Krisztina%20Czak%C3%B3
Krisztina Czakó
Introduction
Krisztina Czakó (born December 17, 1978 in , ) is a former . She is the 1997 silver medalist and 1994 champion.
[]
[ "Introduction" ]
[ "1978 births", "Living people", "Hungarian female single skaters", "Figure skaters at the 1992 Winter Olympics", "Figure skaters at the 1994 Winter Olympics", "Olympic figure skaters of Hungary", "European Figure Skating Championships medalists", "World Junior Figure Skating Championships medalists", "Competitors at the 1999 Winter Universiade", "Figure skaters from Budapest" ]
projected-04044954-001
https://en.wikipedia.org/wiki/Krisztina%20Czak%C3%B3
Krisztina Czakó
Career
Krisztina Czakó (born December 17, 1978 in , ) is a former . She is the 1997 silver medalist and 1994 champion.
Czakó's mother Klara was a , while her father and coach was himself a figure skater and a former Hungarian men's national champion. György began teaching Krisztina how to skate before she was a year old, making her a pair of skates himself when none could be found that were small enough to fit her. Czakó was the youngest athlete to compete in the , at age 13 years and 2 months. She was so young that she was still able to compete in the World Junior Championships in 1994 and 1995 (finishing second and third, respectively), despite her Olympic experience. She made her second Olympic appearance in in , finishing 11th. She intended to compete in her third Olympics in but had to withdraw due to injury. Czakó won the silver medal at the , skating her long program to the music of . It was the first medal for Hungary in the European ladies' event since 1971. Czakó also achieved a career-best 7th-place finish at the . Czakó was a seven-time Hungarian national champion (1992-1998), and represented her country in two Olympics, six World Championships, and six European championships, along with numerous other competitions. She is now retired from competitive skating.
[ "Kristina Czako 2.jpg" ]
[ "Career" ]
[ "1978 births", "Living people", "Hungarian female single skaters", "Figure skaters at the 1992 Winter Olympics", "Figure skaters at the 1994 Winter Olympics", "Olympic figure skaters of Hungary", "European Figure Skating Championships medalists", "World Junior Figure Skating Championships medalists", "Competitors at the 1999 Winter Universiade", "Figure skaters from Budapest" ]
projected-04044954-002
https://en.wikipedia.org/wiki/Krisztina%20Czak%C3%B3
Krisztina Czakó
Results
Krisztina Czakó (born December 17, 1978 in , ) is a former . She is the 1997 silver medalist and 1994 champion.
GP:
[]
[ "Results" ]
[ "1978 births", "Living people", "Hungarian female single skaters", "Figure skaters at the 1992 Winter Olympics", "Figure skaters at the 1994 Winter Olympics", "Olympic figure skaters of Hungary", "European Figure Skating Championships medalists", "World Junior Figure Skating Championships medalists", "Competitors at the 1999 Winter Universiade", "Figure skaters from Budapest" ]
projected-04044955-000
https://en.wikipedia.org/wiki/Faded%20%28Kate%20DeAraugo%20song%29
Faded (Kate DeAraugo song)
Introduction
"Faded" is a song written by , , , and , produced by Gerrard and Bryon Jones for Australian singer 's first album (2005). It was released as the album's second single in Australia on 20 February 2006 as a . Two of the song's co-writers—Jessica and Lisa Origliasso of —recorded a demo of "Faded" prior to DeAraugo's release. They have been known to perform the song live. "Faded" was DeAraugo's second top-10 single following her number-one hit "" after winning of . DeAraugo went on to achieve two other top-10 singles with girl group the . In 2008, the song was covered by German act for their second studio album, (2007).
[]
[ "Introduction" ]
[ "2005 songs", "2006 singles", "Kate DeAraugo songs", "Song recordings produced by Matthew Gerrard", "Songs written by Jessica Origliasso", "Songs written by Lisa Origliasso", "Songs written by Matthew Gerrard", "Songs written by Robbie Nevil", "Sony BMG singles" ]
projected-04044955-001
https://en.wikipedia.org/wiki/Faded%20%28Kate%20DeAraugo%20song%29
Faded (Kate DeAraugo song)
Music video
"Faded" is a song written by , , , and , produced by Gerrard and Bryon Jones for Australian singer 's first album (2005). It was released as the album's second single in Australia on 20 February 2006 as a . Two of the song's co-writers—Jessica and Lisa Origliasso of —recorded a demo of "Faded" prior to DeAraugo's release. They have been known to perform the song live. "Faded" was DeAraugo's second top-10 single following her number-one hit "" after winning of . DeAraugo went on to achieve two other top-10 singles with girl group the . In 2008, the song was covered by German act for their second studio album, (2007).
The video begins with DeAraugo in her car with a photograph of her partner. The words "FADED" appear in the shadow underneath her car. As the chorus begins, we see her performing the song in a large warehouse with her band. She then texts her boyfriend, "Can you come over?"; we see him stumbling down an alley, presumably , checking out another woman. After he receives the text, he drives to the warehouse and opens the door to find photos of him with other women scattered all over the floor, with the words "Cheat", "Liar", "Coward", "Fake", "Two Timer", and "User" written over his face. He then runs out, but when he tries to leave, his car won't start, leaving him stranded there.
[]
[ "Music video" ]
[ "2005 songs", "2006 singles", "Kate DeAraugo songs", "Song recordings produced by Matthew Gerrard", "Songs written by Jessica Origliasso", "Songs written by Lisa Origliasso", "Songs written by Matthew Gerrard", "Songs written by Robbie Nevil", "Sony BMG singles" ]
projected-04044955-002
https://en.wikipedia.org/wiki/Faded%20%28Kate%20DeAraugo%20song%29
Faded (Kate DeAraugo song)
Track listing
"Faded" is a song written by , , , and , produced by Gerrard and Bryon Jones for Australian singer 's first album (2005). It was released as the album's second single in Australia on 20 February 2006 as a . Two of the song's co-writers—Jessica and Lisa Origliasso of —recorded a demo of "Faded" prior to DeAraugo's release. They have been known to perform the song live. "Faded" was DeAraugo's second top-10 single following her number-one hit "" after winning of . DeAraugo went on to achieve two other top-10 singles with girl group the . In 2008, the song was covered by German act for their second studio album, (2007).
"Faded" – 3:31 "Faded (Reactor mix)" – 3:40 "Faded (Chameleon mix)" – 4:59 "World Stands Still" – 3:56
[]
[ "Track listing" ]
[ "2005 songs", "2006 singles", "Kate DeAraugo songs", "Song recordings produced by Matthew Gerrard", "Songs written by Jessica Origliasso", "Songs written by Lisa Origliasso", "Songs written by Matthew Gerrard", "Songs written by Robbie Nevil", "Sony BMG singles" ]
projected-04044955-006
https://en.wikipedia.org/wiki/Faded%20%28Kate%20DeAraugo%20song%29
Faded (Kate DeAraugo song)
Cascada cover
"Faded" is a song written by , , , and , produced by Gerrard and Bryon Jones for Australian singer 's first album (2005). It was released as the album's second single in Australia on 20 February 2006 as a . Two of the song's co-writers—Jessica and Lisa Origliasso of —recorded a demo of "Faded" prior to DeAraugo's release. They have been known to perform the song live. "Faded" was DeAraugo's second top-10 single following her number-one hit "" after winning of . DeAraugo went on to achieve two other top-10 singles with girl group the . In 2008, the song was covered by German act for their second studio album, (2007).
German group covered the song on their American/Canadian release of their album . In the U.S. "Faded" was digitally released on 5 August 2008 and then released on a CD Maxi 26 August 2008. Although the song did not receive much attention on United States charts, it did rank as No. 55 on New York's Radio Station Z100's Top 100 Songs of 2008. The track was also released in certain European countries such as Finland and Germany as a digital download in 2010.
[]
[ "Cascada cover" ]
[ "2005 songs", "2006 singles", "Kate DeAraugo songs", "Song recordings produced by Matthew Gerrard", "Songs written by Jessica Origliasso", "Songs written by Lisa Origliasso", "Songs written by Matthew Gerrard", "Songs written by Robbie Nevil", "Sony BMG singles" ]
projected-04044955-007
https://en.wikipedia.org/wiki/Faded%20%28Kate%20DeAraugo%20song%29
Faded (Kate DeAraugo song)
Formats and track listing
"Faded" is a song written by , , , and , produced by Gerrard and Bryon Jones for Australian singer 's first album (2005). It was released as the album's second single in Australia on 20 February 2006 as a . Two of the song's co-writers—Jessica and Lisa Origliasso of —recorded a demo of "Faded" prior to DeAraugo's release. They have been known to perform the song live. "Faded" was DeAraugo's second top-10 single following her number-one hit "" after winning of . DeAraugo went on to achieve two other top-10 singles with girl group the . In 2008, the song was covered by German act for their second studio album, (2007).
United States "Faded" (Album Version) – 2:50 "Faded" (Dave Ramone Electro Club Edit) – 2:57 "Faded" (Wideboys Electro Radio Edit) – 2.36 "Faded" (Dave Ramone Pop Radio Mix) – 2:54 "Faded" (Album Extended Version) – 4:26 "Faded" (Dave Ramone Electro Club Extended) – 6:25 "Faded" (Wideboys Electro Club Mix) – 6:07 "Faded" (Dave Ramone Pop Extended Mix) – 5:51 "Faded" (Lior Magal Remix) – 5:27 "Faded" (Giuseppe D's Dark Fader Club Mix) – 7:20 Europe "Faded" (Radio Edit) – 2:48 "Faded" (Wideboys Radio Edit) – 2:36 "Faded" (Extended Mix) – 4:24 "Faded" (Dave Ramone Remix) – 5.48 France "Faded" (Wideboys Miami House Mix) – 6:04
[]
[ "Cascada cover", "Formats and track listing" ]
[ "2005 songs", "2006 singles", "Kate DeAraugo songs", "Song recordings produced by Matthew Gerrard", "Songs written by Jessica Origliasso", "Songs written by Lisa Origliasso", "Songs written by Matthew Gerrard", "Songs written by Robbie Nevil", "Sony BMG singles" ]
projected-04044962-000
https://en.wikipedia.org/wiki/Colchester%20North%20%28provincial%20electoral%20district%29
Colchester North (provincial electoral district)
Introduction
Colchester North is a provincial in , , that elects one member of the . It was created in 1978 when the former district of was redistributed. The is of the , who replaced who had held the seat from 2006 to 2021 as both a Conservative and then a . The riding includes the northern half of . Communities include , , and .
[]
[ "Introduction" ]
[ "Nova Scotia provincial electoral districts" ]
projected-04044962-001
https://en.wikipedia.org/wiki/Colchester%20North%20%28provincial%20electoral%20district%29
Colchester North (provincial electoral district)
Geography
Colchester North is a provincial in , , that elects one member of the . It was created in 1978 when the former district of was redistributed. The is of the , who replaced who had held the seat from 2006 to 2021 as both a Conservative and then a . The riding includes the northern half of . Communities include , , and .
The land area of Colchester North is .
[]
[ "Geography" ]
[ "Nova Scotia provincial electoral districts" ]
projected-04044962-002
https://en.wikipedia.org/wiki/Colchester%20North%20%28provincial%20electoral%20district%29
Colchester North (provincial electoral district)
Members of the Legislative Assembly
Colchester North is a provincial in , , that elects one member of the . It was created in 1978 when the former district of was redistributed. The is of the , who replaced who had held the seat from 2006 to 2021 as both a Conservative and then a . The riding includes the northern half of . Communities include , , and .
This riding has elected the following :
[]
[ "Members of the Legislative Assembly" ]
[ "Nova Scotia provincial electoral districts" ]
projected-04044962-014
https://en.wikipedia.org/wiki/Colchester%20North%20%28provincial%20electoral%20district%29
Colchester North (provincial electoral district)
2013 general election
Colchester North is a provincial in , , that elects one member of the . It was created in 1978 when the former district of was redistributed. The is of the , who replaced who had held the seat from 2006 to 2021 as both a Conservative and then a . The riding includes the northern half of . Communities include , , and .
The leading candidate took 5,003 votes (61.00%), John Kendrick MacDonald took 2,162 votes (26.36%), and Jim Wyatt took 1,037 votes (12.64%).
[]
[ "Election results", "2013 general election" ]
[ "Nova Scotia provincial electoral districts" ]
projected-04044969-000
https://en.wikipedia.org/wiki/Du%2C%20du%20liegst%20mir%20im%20Herzen
Du, du liegst mir im Herzen
Introduction
"Du, du liegst mir im Herzen" ("You, you are in my heart") is a German , believed to have originated in around 1820. n , inventor of the fingering system for the modern , composed a theme and variations for flute and on this tune.
[]
[ "Introduction" ]
[ "Volkslied", "1820s songs", "Year of song unknown", "Songwriter unknown" ]
projected-04044969-001
https://en.wikipedia.org/wiki/Du%2C%20du%20liegst%20mir%20im%20Herzen
Du, du liegst mir im Herzen
Notable performances
"Du, du liegst mir im Herzen" ("You, you are in my heart") is a German , believed to have originated in around 1820. n , inventor of the fingering system for the modern , composed a theme and variations for flute and on this tune.
The song is heard in the film during a key scene between and . In 's , , caricaturing Dietrich, sings it with a group of n soldiers. It also features in , , , the film and in 's , sung by . It was sung by the Kenneth Mars character, Franz Liebkind, in . In the 1991 film, , a gleeful whistles the melody of the chorus. included the song in a medley on his album (1961). German-American jazz keyboardist recorded two dramatically contrasting versions in 1975 and 1980, a solo piano performance on and his arrangement for a ensemble supplemented by the vocal quartet 2+2 on the eponymous album . , cofounder of Alcoholics Anonymous, recorded this song in 1947 to send as part of a communication on reel-to-reel tape to his friend, cofounder . This is the only known recording of Bill playing the violin, and it can be listened to at in Akron, Ohio.
[]
[ "Notable performances" ]
[ "Volkslied", "1820s songs", "Year of song unknown", "Songwriter unknown" ]
projected-04044969-002
https://en.wikipedia.org/wiki/Du%2C%20du%20liegst%20mir%20im%20Herzen
Du, du liegst mir im Herzen
Lyrics
"Du, du liegst mir im Herzen" ("You, you are in my heart") is a German , believed to have originated in around 1820. n , inventor of the fingering system for the modern , composed a theme and variations for flute and on this tune.
Du, du liegst mir im Herzen du, du liegst mir im Sinn. Du, du machst mir viel Schmerzen, weißt nicht wie gut ich dir bin. Ja, ja, ja, ja, weißt nicht wie gut ich dir bin. So, so wie ich dich liebe so, so liebe auch mich. Die, die zärtlichsten Triebe fühl' ich allein nur für dich. Ja, ja, ja, ja, fühl' ich allein nur für dich. Doch, doch darf ich dir trauen dir, dir mit leichtem Sinn? Du, du kannst auf mich bauen weißt ja wie gut ich dir bin! Ja, ja, ja, ja, weißt ja wie gut ich dir bin! Und, und wenn in der Ferne, mir, mir dein Bild erscheint, dann, dann wünscht ich so gerne daß uns die Liebe vereint. Ja, ja, ja, ja, daß uns die Liebe vereint. You, you are in my heart, you, you are in my mind. You, you cause me much pain, you don't know how good I am to you. Yes, yes, yes, yes, you don't know how good I am to you. So, so, as I love you, so, so love me too. The most tender desires I feel for you alone. Yes, yes, yes, yes, I feel for you alone. But, but may I trust you, you, you with a light heart? You, you can rely on me, you do know how good I am to you! Yes, yes, yes, yes, you do know how good I am to you! And, and when, far away, your image appears to me, then, then I wish so dearly that love would unite us. Yes, yes, yes, yes, that love would unite us.
[]
[ "Lyrics" ]
[ "Volkslied", "1820s songs", "Year of song unknown", "Songwriter unknown" ]
projected-04044975-000
https://en.wikipedia.org/wiki/List%20of%20United%20States%20Virgin%20Islands%20highways
List of United States Virgin Islands highways
Introduction
Below is a list of highways in the (USVI). US Virgin Islands code assigns responsibility for highways in the territory to the USVI Department of Public Works. In the USVI, highways whose numbers begin with 1-2 are located on the island of , those beginning with 3-4 are located on , and those beginning with 5-8 are located on . Unlike elsewhere in the U.S., traffic in the USVI drives on the left.
[]
[ "Introduction" ]
[ "Highways in the United States Virgin Islands", "Lists of roads in the United States" ]
projected-04044982-000
https://en.wikipedia.org/wiki/Cole%20Harbour-Eastern%20Passage
Cole Harbour-Eastern Passage
Introduction
Cole Harbour—Eastern Passage was a provincial in , , that elected one member of the . The district was created in 1992 from . In 2003, the district lost an area south of the Circumferential Highway and the eastern side of Morris Lake to Dartmouth South, and lost an area south of Portland Street to . In 2013, the district gained the area south of Russell Lake and east of Highway 111 from . The district was abolished at the mostly into and parts of and .
[]
[ "Introduction" ]
[ "Former provincial electoral districts of Nova Scotia", "Politics of Halifax, Nova Scotia" ]
projected-04044982-001
https://en.wikipedia.org/wiki/Cole%20Harbour-Eastern%20Passage
Cole Harbour-Eastern Passage
Members of the Legislative Assembly
Cole Harbour—Eastern Passage was a provincial in , , that elected one member of the . The district was created in 1992 from . In 2003, the district lost an area south of the Circumferential Highway and the eastern side of Morris Lake to Dartmouth South, and lost an area south of Portland Street to . In 2013, the district gained the area south of Russell Lake and east of Highway 111 from . The district was abolished at the mostly into and parts of and .
This riding has elected the following :
[]
[ "Members of the Legislative Assembly" ]
[ "Former provincial electoral districts of Nova Scotia", "Politics of Halifax, Nova Scotia" ]
projected-04044982-002
https://en.wikipedia.org/wiki/Cole%20Harbour-Eastern%20Passage
Cole Harbour-Eastern Passage
Election results
Cole Harbour—Eastern Passage was a provincial in , , that elected one member of the . The district was created in 1992 from . In 2003, the district lost an area south of the Circumferential Highway and the eastern side of Morris Lake to Dartmouth South, and lost an area south of Portland Street to . In 2013, the district gained the area south of Russell Lake and east of Highway 111 from . The district was abolished at the mostly into and parts of and .
Results are recorded for seven contests. First: 3,057 votes (40.62%, +25.02), then 2,914 (38.72%, -26.45), and Lloyd Jackson with 1,555 (20.66%, +4.76). Second: 4,402 votes (65.17%, +20.78), Lloyd Jackson with 1,074 (15.90%, -17.73), and Orest Ulan with 1,054 (15.60%, -1.70). Third: 2,459 votes (44.39%, -20.01) and Michael Eddy with 1,863 (33.63%, +14.71). Fourth: New Democratic Party 3,997 (58.44%, +19.17), Progressive Conservative Harry McInroy 1,641 (23.99%, -13.36), and Liberal Brian Churchill 1,121 (16.39%, -6.99). Fifth: New Democratic Party 3,721 (39.27%), Progressive Conservative Nadune Cooper Mont 3,539 (37.35%), and Liberal Colin MacEachern 2,216 (23.38%). Sixth: New Democratic Party 4,411 (45.73%), Progressive Conservative Randy Anstey 3,303 (34.24%), and Liberal Linda DeGrace 1,931 (20.02%). Seventh: Liberal 4,702 (48.13%), Progressive Conservative John Gold 3,409 (34.89%), New Democratic Party Ash Shaihk 1,501 (15.36%), and Natural Law Party Helen Creighton 158 (1.62%).
[]
[ "Election results" ]
[ "Former provincial electoral districts of Nova Scotia", "Politics of Halifax, Nova Scotia" ]
projected-04045016-000
https://en.wikipedia.org/wiki/Margaret%20Mayall
Margaret Mayall
Introduction
Margaret Walton Mayall (January 27, 1902 – December 6, 1995) was an American astronomer. She was the director of the (AAVSO) from 1949 to 1973. Mayall (born Margaret Lyle Walton) was born in Iron Hill, Maryland, on 27 January 1902. She attended the , where her interest in astronomy grew after taking math and chemistry courses. She then moved to , where she received her Bachelor's Degree in Mathematics in 1924. She earned an MA in Astronomy from , Harvard University, in 1928 and worked as a research assistant and astronomer at from 1924 to 1954, initially working with on classifying star spectra and estimating star brightness. She was a research staff member at the Heat Research Laboratory, Special Weapons Group, from 1943 to 1946. While working in Nantucket, she met Robert Newton Mayall, a member of the (AAVSO). They married in 1927. In 1958 she won the . She died of congestive heart failure in , on 6 December 1995.
[]
[ "Introduction" ]
[ "1902 births", "1995 deaths", "American women astronomers", "Recipients of the Annie J. Cannon Award in Astronomy", "20th-century American women scientists", "People from Cecil County, Maryland", "20th-century American scientists", "Radcliffe College alumni", "Swarthmore College alumni", "Harvard College Observatory people" ]
projected-04045025-000
https://en.wikipedia.org/wiki/Colchester-Musquodoboit%20Valley
Colchester-Musquodoboit Valley
Introduction
Colchester—Musquodoboit Valley is a provincial in , , that elects one member of the . The district was created in 1978 from , and was called Colchester South until 1993. In 1993, the name was changed to Colchester-Musquodoboit Valley and it gained the and Musquodoboit Valley areas from , and from . It includes the southern half of (not including the area) plus the region of the .
[]
[ "Introduction" ]
[ "Nova Scotia provincial electoral districts" ]
projected-04045025-001
https://en.wikipedia.org/wiki/Colchester-Musquodoboit%20Valley
Colchester-Musquodoboit Valley
Geography
Colchester—Musquodoboit Valley is a provincial in , , that elects one member of the . The district was created in 1978 from , and was called Colchester South until 1993. In 1993, the name was changed to Colchester-Musquodoboit Valley and it gained the and Musquodoboit Valley areas from , and from . It includes the southern half of (not including the area) plus the region of the .
The landmass of Colchester-Musquodoboit Valley is .
[]
[ "Geography" ]
[ "Nova Scotia provincial electoral districts" ]
projected-04045025-002
https://en.wikipedia.org/wiki/Colchester-Musquodoboit%20Valley
Colchester-Musquodoboit Valley
Members of the Legislative Assembly
Colchester—Musquodoboit Valley is a provincial in , , that elects one member of the . The district was created in 1978 from , and was called Colchester South until 1993. In 1993, the name was changed to Colchester-Musquodoboit Valley and it gained the and Musquodoboit Valley areas from , and from . It includes the southern half of (not including the area) plus the region of the .
This riding has elected the following :
[]
[ "Members of the Legislative Assembly" ]
[ "Nova Scotia provincial electoral districts" ]
projected-04045025-014
https://en.wikipedia.org/wiki/Colchester-Musquodoboit%20Valley
Colchester-Musquodoboit Valley
2013 general election
Colchester—Musquodoboit Valley is a provincial in , , that elects one member of the . The district was created in 1978 from , and was called Colchester South until 1993. In 1993, the name was changed to Colchester-Musquodoboit Valley and it gained the and Musquodoboit Valley areas from , and from . It includes the southern half of (not including the area) plus the region of the .
The leading candidate took 3,304 votes (42.27%, +13.28), the second-place candidate took 2,293 votes (29.33%, -18.76), and Tom Martin took 2,220 votes (28.40%, +7.79).
[]
[ "Election results", "2013 general election" ]
[ "Nova Scotia provincial electoral districts" ]
projected-04045025-017
https://en.wikipedia.org/wiki/Colchester-Musquodoboit%20Valley
Colchester-Musquodoboit Valley
References
Colchester—Musquodoboit Valley is a provincial in , , that elects one member of the . The district was created in 1978 from , and was called Colchester South until 1993. In 1993, the name was changed to Colchester-Musquodoboit Valley and it gained the and Musquodoboit Valley areas from , and from . It includes the southern half of (not including the area) plus the region of the .
Election Summary From 1867-2007; 1993 Poll by Poll Results; 1988 Poll by Poll Results; 1984 Poll by Poll Results; 1981 Poll by Poll Results; 1978 Poll by Poll Results
[]
[ "References" ]
[ "Nova Scotia provincial electoral districts" ]
projected-04045032-000
https://en.wikipedia.org/wiki/Morning%20sun
Morning sun
Introduction
Morning sun may refer to: , the Solar System's star
[]
[ "Introduction" ]
[]
projected-04045032-002
https://en.wikipedia.org/wiki/Morning%20sun
Morning sun
Music
Morning sun may refer to: , the Solar System's star
, by Barbara Mandrell, 1990; , by the Beautiful Girls, 2002; , 2010; , 2015; "Morning Sun", a song by the Spencer Davis Group from , 1968
[]
[ "Music" ]
[]
projected-04045032-003
https://en.wikipedia.org/wiki/Morning%20sun
Morning sun
Newspapers
Morning sun may refer to: , the Solar System's star
, Pittsburg, Kansas, US; Morning Sun, published by in Denison, Texas, US
[]
[ "Newspapers" ]
[]
projected-04045032-004
https://en.wikipedia.org/wiki/Morning%20sun
Morning sun
Other uses
Morning sun may refer to: , the Solar System's star
, a 2003 American documentary about China's Cultural Revolution; Morning Sun, a 1952 painting by ; Morning Sun, a 1963 off-Broadway musical with a book by
[]
[ "Other uses" ]
[]