{"doc_id": "1912.10514", "revision_depth": 1, "before_revision": "An effective method to generate a large number of parallel sentences for training improved neural machine translation (NMT) systems is the use of back-translations of the target-side monolingual data. Tagging, or using gates, has been used to enable translation models to distinguish between synthetic and natural data. This improves standard back-translation and also enables the use of iterative back-translation on language pairs that underperformed using standard back-translation. This work presents a simplified approach of differentiating between the two data using pretraining and finetuning . The approach - tag-less back-translation - trains the model on the synthetic data and finetunes it on the natural data. Preliminary experiments have shown the approach to continuously outperform the tagging approach on low resource English-Vietnamese neural machine translation . While the need for tagging (noising) the dataset has been removed, the approach outperformed the tagged back-translation approach by an average of 0.4 BLEU .", "after_revision": "An effective method to generate a large number of parallel sentences for training improved neural machine translation (NMT) systems is the use of back-translations of the target-side monolingual data. The method was not able to utilize the available huge amount of monolingual data because of the inability of models to differentiate between the authentic and synthetic parallel data. Tagging, or using gates, has been used to enable translation models to distinguish between synthetic and authentic data, improving standard back-translation and also enabling the use of iterative back-translation on language pairs that under-performed using standard back-translation. This work presents pre-training and fine-tuning as a simplified but more effective approach of differentiating between the two data . The approach - tag-less back-translation - trains the model on the synthetic data and fine-tunes it on the authentic data. Experiments have shown the approach to outperform the baseline and standard back-translation by 4.0 and 0.7 BLEU respectively on low resource English-Vietnamese NMT . While the need for tagging (noising) the dataset has been removed, the technique outperformed tagged back-translation by 0.4 BLEU . The approach reached the best scores in less training time than the standard and tagged back-translation approaches .", "edit_actions": [{"type": "A", "before": null, "after": "The method was not able to utilize the available huge amount of monolingual data because of the inability of models to differentiate between the authentic and synthetic parallel data.", "start_char_pos": 201, "end_char_pos": 201, "major_intent": "meaning-changed", "raw_intents": ["meaning-changed", "meaning-changed", "meaning-changed"]}, {"type": "R", "before": "natural data. 
This improves", "after": "authentic data, improving", "start_char_pos": 307, "end_char_pos": 334, "major_intent": "clarity", "raw_intents": ["coherence", "clarity", "clarity"]}, {"type": "R", "before": "enables", "after": "enabling", "start_char_pos": 370, "end_char_pos": 377, "major_intent": "fluency", "raw_intents": ["fluency", "fluency", "clarity"]}, {"type": "R", "before": "underperformed", "after": "under-performed", "start_char_pos": 439, "end_char_pos": 453, "major_intent": "fluency", "raw_intents": ["fluency", "fluency", "fluency"]}, {"type": "R", "before": "a simplified", "after": "pre-training and fine-tuning as a simplified but more effective", "start_char_pos": 506, "end_char_pos": 518, "major_intent": "coherence", "raw_intents": ["coherence", "clarity", "coherence"]}, {"type": "D", "before": "using pretraining and finetuning", "after": null, "start_char_pos": 568, "end_char_pos": 600, "major_intent": "clarity", "raw_intents": ["clarity", "coherence", "clarity"]}, {"type": "R", "before": "finetunes", "after": "fine-tunes", "start_char_pos": 689, "end_char_pos": 698, "major_intent": "fluency", "raw_intents": ["fluency", "fluency", "fluency"]}, {"type": "R", "before": "natural data. Preliminary experiments", "after": "authentic data. Experiments", "start_char_pos": 709, "end_char_pos": 746, "major_intent": "clarity", "raw_intents": ["clarity", "clarity", "coherence"]}, {"type": "R", "before": "continuously outperform the tagging approach", "after": "outperform the baseline and standard back-translation by 4.0 and 0.7 BLEU respectively", "start_char_pos": 774, "end_char_pos": 818, "major_intent": "meaning-changed", "raw_intents": ["meaning-changed", "meaning-changed", "meaning-changed"]}, {"type": "R", "before": "neural machine translation", "after": "NMT", "start_char_pos": 854, "end_char_pos": 880, "major_intent": "clarity", "raw_intents": ["clarity", "clarity", "clarity"]}, {"type": "R", "before": "approach outperformed the", "after": "technique outperformed", "start_char_pos": 954, "end_char_pos": 979, "major_intent": "clarity", "raw_intents": ["clarity", "clarity", "clarity"]}, {"type": "R", "before": "approach by an average of", "after": "by", "start_char_pos": 1004, "end_char_pos": 1029, "major_intent": "clarity", "raw_intents": ["clarity", "clarity", "clarity"]}, {"type": "A", "before": null, "after": ". The approach reached the best scores in less training time than the standard and tagged back-translation approaches", "start_char_pos": 1039, "end_char_pos": 1039, "major_intent": "meaning-changed", "raw_intents": ["meaning-changed", "meaning-changed", "meaning-changed"]}], "sents_char_pos": [0, 200, 320, 486, 602, 722, 882], "domain": "arxiv"} |
{"doc_id": "2002.06353", "revision_depth": 2, "before_revision": "With the recent success of pre-training technique for NLP and image-linguistic tasks, there are still few works on video-linguistic pre-training . Besides , most of the existing multimodal models are pre-trained for understanding task, which leads to a pretrain-finetune discrepency for generation tasks. In this paper , we propose UniViLM : a Unified Video and Language pre-training Model for both multimodal understanding and generation. Our model comprises of 4 components including two single-modal encoders, a cross encoder and a decoder with the Transformer backbone. We first pre-train our model to learn the universal representation for both video and language on a large instructional video dataset. Then we fine-tune the model on two multimodal tasks including understanding task (text-based video retrieval) and generation task (multimodal video captioning). Our extensive experiments show that our method can improve the performance of both understanding and generation tasks and achieves the state-of-the art results .", "after_revision": "With the recent success of the pre-training technique for NLP and image-linguistic tasks, some video-linguistic pre-training works are gradually developed to improve video-text related downstream tasks. However , most of the existing multimodal models are pre-trained for understanding tasks, leading to a pretrain-finetune discrepancy for generation tasks. This paper proposes UniVL : a Unified Video and Language pre-training model for both multimodal understanding and generation. It comprises four components, including two single-modal encoders, a cross encoder , and a decoder with the Transformer backbone. Five objectives, including video-text joint, conditioned masked language model (CMLM), conditioned masked frame model (CMFM), video-text alignment, and language reconstruction, are designed to train each of the components. We further develop two pre-training strategies, stage by stage pre-training (StagedP) and enhanced video representation (EnhancedV), to make the training process of the UniVL more effective. The pre-train is carried out on a sizeable instructional video dataset HowTo100M. Experimental results demonstrate that the UniVL can learn strong video-text representation and achieves state-of-the-art results on five downstream tasks .", "edit_actions": [{"type": "A", "before": null, "after": "the", "start_char_pos": 27, "end_char_pos": 27, "major_intent": "coherence", "raw_intents": ["fluency", "coherence", "coherence"]}, {"type": "R", "before": "there are still few works on", "after": "some", "start_char_pos": 87, "end_char_pos": 115, "major_intent": "clarity", "raw_intents": ["clarity", "clarity", "coherence"]}, {"type": "R", "before": ". Besides", "after": "works are gradually developed to improve video-text related downstream tasks. 
However", "start_char_pos": 146, "end_char_pos": 155, "major_intent": "meaning-changed", "raw_intents": ["meaning-changed", "coherence", "meaning-changed"]}, {"type": "R", "before": "task, which leads", "after": "tasks, leading", "start_char_pos": 231, "end_char_pos": 248, "major_intent": "fluency", "raw_intents": ["fluency", "style", "fluency"]}, {"type": "R", "before": "discrepency", "after": "discrepancy", "start_char_pos": 272, "end_char_pos": 283, "major_intent": "fluency", "raw_intents": ["fluency", "fluency", "fluency"]}, {"type": "R", "before": "In this paper , we propose UniViLM", "after": "This paper proposes UniVL", "start_char_pos": 306, "end_char_pos": 340, "major_intent": "clarity", "raw_intents": ["clarity", "clarity", "clarity"]}, {"type": "R", "before": "Model", "after": "model", "start_char_pos": 385, "end_char_pos": 390, "major_intent": "fluency", "raw_intents": ["fluency", "fluency", "fluency"]}, {"type": "R", "before": "Our model comprises of 4 components", "after": "It comprises four components,", "start_char_pos": 441, "end_char_pos": 476, "major_intent": "clarity", "raw_intents": ["style", "clarity", "clarity"]}, {"type": "A", "before": null, "after": ",", "start_char_pos": 530, "end_char_pos": 530, "major_intent": "fluency", "raw_intents": ["fluency", "fluency", "fluency"]}, {"type": "R", "before": "We first", "after": "Five objectives, including video-text joint, conditioned masked language model (CMLM), conditioned masked frame model (CMFM), video-text alignment, and language reconstruction, are designed to train each of the components. We further develop two pre-training strategies, stage by stage pre-training (StagedP) and enhanced video representation (EnhancedV), to make the training process of the UniVL more effective. The", "start_char_pos": 576, "end_char_pos": 584, "major_intent": "meaning-changed", "raw_intents": ["meaning-changed", "meaning-changed", "meaning-changed"]}, {"type": "R", "before": "our model to learn the universal representation for both video and language on a large instructional video dataset. Then we fine-tune the model on two multimodal tasks including understanding task (text-based video retrieval) and generation task (multimodal video captioning). Our extensive experiments show that our method can improve the performance of both understanding and generation tasks and achieves", "after": "is carried out on a sizeable instructional video dataset HowTo100M. Experimental results demonstrate that", "start_char_pos": 595, "end_char_pos": 1002, "major_intent": "clarity", "raw_intents": ["clarity", "meaning-changed", "clarity"]}, {"type": "R", "before": "state-of-the art results", "after": "UniVL can learn strong video-text representation and achieves state-of-the-art results on five downstream tasks", "start_char_pos": 1007, "end_char_pos": 1031, "major_intent": "meaning-changed", "raw_intents": ["meaning-changed", "meaning-changed", "style"]}], "sents_char_pos": [0, 147, 305, 440, 575, 710, 871], "domain": "arxiv"} |
{"doc_id": "2005.01795", "revision_depth": 1, "before_revision": "Following each patient visit, physicians must draft detailed clinical summaries called SOAP notes . Moreover, with electronic health records, these notes must be digitized. For all the benefits of this documentation the process remains onerous , contributing to increasing physician burnout. In a parallel development, patients increasingly record audio from their visits (with consent), often through dedicated apps. In this paper, we present the first study to evaluate complete pipelines for leveraging these transcripts to train machine learning model to generate these notes . We first describe a unique dataset of patient visit records, consisting of transcripts , paired SOAP notes, and annotations marking noteworthy utterances that support each summary sentence. We decompose the problem into extractive and abstractive subtasks, exploring a spectrum of approaches according to how much they demand from each component. Our best performing method first (i) extracts noteworthy utterances via multi-label classification assigns them to summary section(s); (ii) clusters noteworthy utterances on a per-section basis; and (iii) generates the summary sentences by conditioning on the corresponding cluster and the subsection of the SOAP sentence to be generated. Compared to an end-to-end approach that generates the full SOAP note from the full conversation, our approach improves by 7 ROUGE-1 points . Oracle experiments indicate that fixing our generative capabilities, improvements in extraction alone could provide (up to) a further 9 ROUGE point gain .", "after_revision": "Following each patient visit, physicians must draft a detailed clinical summary called a SOAP note . Moreover, with electronic health records, these notes must be digitized. Despite the benefits of this documentation , their creation remains an onerous process , contributing to increasing physician burnout. In this paper, we present the first study to evaluate complete pipelines to train summarization models to generate these notes from conversations between physicians and patients. We benefit from a dataset that, along with transcripts and paired SOAP notes, consists of annotations marking noteworthy utterances that support each summary sentence. We decompose the problem into extractive and abstractive subtasks, exploring a spectrum of approaches according to how much they demand from each component. We observe that the performance improves constantly as the extractive subtask is made more complex - an observation that we also replicate on the well-known AMI meeting summarization dataset. Our best performing method first (i) extracts noteworthy utterances via multi-label classification , assigning each to summary section(s); (ii) clusters noteworthy utterances on a per-section basis; and (iii) generates the summary sentences by conditioning on the corresponding cluster and the subsection of the SOAP sentence to be generated. 
Compared to an end-to-end approach that generates the full SOAP note from the full conversation, our approach improves by around 8 ROUGE-1 points .", "edit_actions": [{"type": "R", "before": "detailed clinical summaries called SOAP notes", "after": "a detailed clinical summary called a SOAP note", "start_char_pos": 52, "end_char_pos": 97, "major_intent": "clarity", "raw_intents": ["clarity", "style", "clarity"]}, {"type": "R", "before": "For all", "after": "Despite", "start_char_pos": 173, "end_char_pos": 180, "major_intent": "clarity", "raw_intents": ["clarity", "clarity", "style"]}, {"type": "R", "before": "the process remains onerous", "after": ", their creation remains an onerous process", "start_char_pos": 216, "end_char_pos": 243, "major_intent": "clarity", "raw_intents": ["style", "clarity", "clarity"]}, {"type": "D", "before": "a parallel development, patients increasingly record audio from their visits (with consent), often through dedicated apps. In", "after": null, "start_char_pos": 295, "end_char_pos": 420, "major_intent": "clarity", "raw_intents": ["clarity", "coherence", "clarity"]}, {"type": "D", "before": "for leveraging these transcripts to train machine learning model", "after": null, "start_char_pos": 491, "end_char_pos": 555, "major_intent": "clarity", "raw_intents": ["clarity", "clarity", "coherence"]}, {"type": "A", "before": null, "after": "train summarization models to", "start_char_pos": 559, "end_char_pos": 559, "major_intent": "meaning-changed", "raw_intents": ["meaning-changed", "meaning-changed", "coherence"]}, {"type": "R", "before": ". We first describe a unique dataset of patient visit records, consisting of transcripts ,", "after": "from conversations between physicians and patients. We benefit from a dataset that, along with transcripts and", "start_char_pos": 581, "end_char_pos": 671, "major_intent": "clarity", "raw_intents": ["clarity", "clarity", "coherence"]}, {"type": "R", "before": "and", "after": "consists of", "start_char_pos": 691, "end_char_pos": 694, "major_intent": "clarity", "raw_intents": ["clarity", "clarity", "style"]}, {"type": "A", "before": null, "after": "We observe that the performance improves constantly as the extractive subtask is made more complex - an observation that we also replicate on the well-known AMI meeting summarization dataset.", "start_char_pos": 930, "end_char_pos": 930, "major_intent": "meaning-changed", "raw_intents": ["meaning-changed", "meaning-changed", "meaning-changed"]}, {"type": "R", "before": "assigns them", "after": ", assigning each", "start_char_pos": 1030, "end_char_pos": 1042, "major_intent": "clarity", "raw_intents": ["clarity", "style", "clarity"]}, {"type": "R", "before": "7", "after": "around 8", "start_char_pos": 1392, "end_char_pos": 1393, "major_intent": "meaning-changed", "raw_intents": ["meaning-changed", "meaning-changed", "meaning-changed"]}, {"type": "D", "before": ". Oracle experiments indicate that fixing our generative capabilities, improvements in extraction alone could provide (up to) a further 9 ROUGE point gain", "after": null, "start_char_pos": 1409, "end_char_pos": 1563, "major_intent": "clarity", "raw_intents": ["coherence", "clarity", "clarity"]}], "sents_char_pos": [0, 99, 172, 291, 417, 582, 772, 929, 1065, 1125, 1269, 1410], "domain": "arxiv"} |
{"doc_id": "2005.05298", "revision_depth": 2, "before_revision": "This paper presents a new method SOLOIST , which uses transfer learning to efficiently build task-oriented dialog systems at scale. We parameterize a dialog system using a Transformer-based auto-regressive language model, which subsumes different dialog modules (e.g., state tracker, dialog policy, response generator) into a single neural model. We pre-train, on large heterogeneous dialog corpora, a large-scale Transformer model which can generate dialog responses grounded in user goals and real-world knowledge for task completion. The pre-trained model can be efficiently adapted to accomplish a new dialog task with a handful of task-specific dialogs via machine teaching . Our experiments demonstrate that (i) SOLOIST creates new state-of-the-art results on two well-known benchmarks, CamRest and MultiWOZ, (ii) in the few-shot learning setting, the dialog systems developed by SOLOIST significantly outperform those developed by existing methods, and (iii) the use of machine teaching substantially reduces the labeling cost . We will release our code and pre-trained models for reproducible research.", "after_revision": "We present a new method SOLOIST that uses transfer learning and machine teaching to build task bots at scale. We parameterize classical modular task-oriented dialog systems using a Transformer-based auto-regressive language model, which subsumes different dialog modules into a single neural model. We pre-train, on heterogeneous dialog corpora, a task-grounded response generation model, which can generate dialog responses grounded in user goals and real-world knowledge for task completion. The pre-trained model can be efficiently adapted to accomplish new tasks with a handful of task-specific dialogs via machine teaching , where training samples are generated by human teachers interacting with the system. Experiments show that (i) SOLOIST creates new state-of-the-art on well-studied task-oriented dialog benchmarks, including CamRest676 and MultiWOZ; (ii) in the few-shot fine-tuning settings, SOLOIST significantly outperforms existing methods, and (iii) the use of machine teaching substantially reduces the labeling cost of fine-tuning. 
The pre-trained models and codes are available at URL", "edit_actions": [{"type": "R", "before": "This paper presents", "after": "We present", "start_char_pos": 0, "end_char_pos": 19, "major_intent": "clarity", "raw_intents": ["coherence", "clarity", "clarity"]}, {"type": "R", "before": ", which", "after": "that", "start_char_pos": 41, "end_char_pos": 48, "major_intent": "clarity", "raw_intents": ["fluency", "clarity", "clarity"]}, {"type": "R", "before": "to efficiently build task-oriented dialog systems", "after": "and machine teaching to build task bots", "start_char_pos": 72, "end_char_pos": 121, "major_intent": "clarity", "raw_intents": ["clarity", "clarity", "clarity"]}, {"type": "R", "before": "a dialog system", "after": "classical modular task-oriented dialog systems", "start_char_pos": 148, "end_char_pos": 163, "major_intent": "clarity", "raw_intents": ["clarity", "clarity", "meaning-changed"]}, {"type": "D", "before": "(e.g., state tracker, dialog policy, response generator)", "after": null, "start_char_pos": 262, "end_char_pos": 318, "major_intent": "coherence", "raw_intents": ["coherence", "coherence", "clarity"]}, {"type": "D", "before": "large", "after": null, "start_char_pos": 364, "end_char_pos": 369, "major_intent": "clarity", "raw_intents": ["coherence", "clarity", "clarity"]}, {"type": "R", "before": "large-scale Transformer model", "after": "task-grounded response generation model,", "start_char_pos": 402, "end_char_pos": 431, "major_intent": "clarity", "raw_intents": ["clarity", "clarity", "clarity"]}, {"type": "R", "before": "a new dialog task", "after": "new tasks", "start_char_pos": 600, "end_char_pos": 617, "major_intent": "clarity", "raw_intents": ["clarity", "clarity", "clarity"]}, {"type": "R", "before": ". Our experiments demonstrate", "after": ", where training samples are generated by human teachers interacting with the system. Experiments show", "start_char_pos": 679, "end_char_pos": 708, "major_intent": "meaning-changed", "raw_intents": ["meaning-changed", "meaning-changed", "coherence"]}, {"type": "R", "before": "results on two well-known benchmarks, CamRest and MultiWOZ,", "after": "on well-studied task-oriented dialog benchmarks, including CamRest676 and MultiWOZ;", "start_char_pos": 755, "end_char_pos": 814, "major_intent": "clarity", "raw_intents": ["clarity", "clarity", "clarity"]}, {"type": "R", "before": "learning setting, the dialog systems developed by SOLOIST significantly outperform those developed by", "after": "fine-tuning settings, SOLOIST significantly outperforms", "start_char_pos": 836, "end_char_pos": 937, "major_intent": "clarity", "raw_intents": ["clarity", "clarity", "clarity"]}, {"type": "R", "before": ". We will release our code and", "after": "of fine-tuning. The", "start_char_pos": 1034, "end_char_pos": 1064, "major_intent": "coherence", "raw_intents": ["clarity", "coherence", "coherence"]}, {"type": "R", "before": "for reproducible research.", "after": "and codes are available at URL", "start_char_pos": 1084, "end_char_pos": 1110, "major_intent": "meaning-changed", "raw_intents": ["meaning-changed", "meaning-changed", "meaning-changed"]}], "sents_char_pos": [0, 131, 346, 536, 680, 1035], "domain": "arxiv"} |
{"doc_id": "2006.03644", "revision_depth": 2, "before_revision": "Stance detection on social media is an emerging opinion mining paradigm for various social and political applications where sentiment analysis might be sub-optimal. This paper surveys the work on stance detection and situates its usage within current opinion mining techniques in social media. An exhaustive review of stance detection techniques on social media is presented , including the task definition, the different types of targets in stance detection, the features set used, and the various machine learning approaches applied. The survey reports the state-of-the-art results on the existing benchmark datasets on stance detection, and discusses the most effective approaches. In addition, this study explores the emerging trends and the different applications of stance detection on social media. The study concludes by providing discussion of the gaps in the current existing research and highlighting the possible future directions for stance detection on social media.", "after_revision": "Stance detection on social media is an emerging opinion mining paradigm for various social and political applications in which sentiment analysis may be sub-optimal. There has been a growing research interest for developing effective methods for stance detection methods varying among multiple communities including natural language processing, web science, and social computing. This paper surveys the work on stance detection within those communities and situates its usage within current opinion mining techniques in social media. It presents an exhaustive review of stance detection techniques on social media , including the task definition, different types of targets in stance detection, features set used, and various machine learning approaches applied. The survey reports state-of-the-art results on the existing benchmark datasets on stance detection, and discusses the most effective approaches. In addition, this study explores the emerging trends and different applications of stance detection on social media. 
The study concludes by discussing the gaps in the current existing research and highlights the possible future directions for stance detection on social media.", "edit_actions": [{"type": "R", "before": "where sentiment analysis might", "after": "in which sentiment analysis may", "start_char_pos": 118, "end_char_pos": 148, "major_intent": "clarity", "raw_intents": ["clarity", "clarity", "clarity"]}, {"type": "A", "before": null, "after": "There has been a growing research interest for developing effective methods for stance detection methods varying among multiple communities including natural language processing, web science, and social computing.", "start_char_pos": 165, "end_char_pos": 165, "major_intent": "meaning-changed", "raw_intents": ["meaning-changed", "meaning-changed", "meaning-changed"]}, {"type": "A", "before": null, "after": "within those communities", "start_char_pos": 214, "end_char_pos": 214, "major_intent": "meaning-changed", "raw_intents": ["meaning-changed", "meaning-changed", "clarity"]}, {"type": "R", "before": "An", "after": "It presents an", "start_char_pos": 296, "end_char_pos": 298, "major_intent": "coherence", "raw_intents": ["style", "coherence", "coherence"]}, {"type": "D", "before": "is presented", "after": null, "start_char_pos": 364, "end_char_pos": 376, "major_intent": "coherence", "raw_intents": ["coherence", "coherence", "clarity"]}, {"type": "D", "before": "the", "after": null, "start_char_pos": 410, "end_char_pos": 413, "major_intent": "fluency", "raw_intents": ["fluency", "coherence", "fluency"]}, {"type": "D", "before": "the", "after": null, "start_char_pos": 462, "end_char_pos": 465, "major_intent": "clarity", "raw_intents": ["clarity", "clarity", "coherence"]}, {"type": "D", "before": "the", "after": null, "start_char_pos": 489, "end_char_pos": 492, "major_intent": "fluency", "raw_intents": ["fluency", "clarity", "fluency"]}, {"type": "D", "before": "the", "after": null, "start_char_pos": 557, "end_char_pos": 560, "major_intent": "fluency", "raw_intents": ["clarity", "fluency", "fluency"]}, {"type": "D", "before": "the", "after": null, "start_char_pos": 744, "end_char_pos": 747, "major_intent": "clarity", "raw_intents": ["clarity", "clarity", "fluency"]}, {"type": "R", "before": "providing discussion of", "after": "discussing", "start_char_pos": 831, "end_char_pos": 854, "major_intent": "clarity", "raw_intents": ["clarity", "clarity", "style"]}, {"type": "R", "before": "highlighting", "after": "highlights", "start_char_pos": 901, "end_char_pos": 913, "major_intent": "fluency", "raw_intents": ["fluency", "fluency", "fluency"]}], "sents_char_pos": [0, 164, 295, 537, 686, 807], "domain": "arxiv"} |
{"doc_id": "2010.12873", "revision_depth": 2, "before_revision": "Recently, neural-symbolic models have achieved noteworthy success in leveraging knowledge graphs (KGs) for commonsense reasoning tasks , like question answering (QA) . However, fact sparsity, inherent in human-annotated KGs, can hinder such models from retrieving task-relevant knowledge . To address these issues, we propose Hybrid Graph Network (HGN) , a neural-symbolic model that reasons over both extracted (human-labeled) and generated facts within the same learned graph structure. Given a KG subgraphof extracted facts , HGN is jointly trained to generate complementary facts, encode relational information in the resulting \"hybrid\" subgraph, and filter out task-irrelevant facts . We demonstrate HGN's ability to produce contextually pertinent subgraphs by showing considerable performance gains across four commonsense reasoning benchmarks and a user study of fact validness and helpfulness.", "after_revision": "Recently, knowledge graph (KG) augmented models have achieved noteworthy success on various commonsense reasoning tasks . However, KG edge (fact) sparsity and noisy edge extraction/generation often hinder models from obtaining useful knowledge to reason over . To address these issues, we propose a new KG-augmented model: Hybrid Graph Network (HGN) . Unlike prior methods, HGN learns to jointly contextualize extracted and generated knowledge by reasoning over both within a unified graph structure. Given the task input context and an extracted KG subgraph , HGN is trained to generate embeddings for the subgraph's missing edges to form a \"hybrid\" graph, then reason over the hybrid graph while filtering out context-irrelevant edges . We demonstrate HGN's effectiveness through considerable performance gains across four commonsense reasoning benchmarks , plus a user study on edge validness and helpfulness.", "edit_actions": [{"type": "R", "before": "neural-symbolic", "after": "knowledge graph (KG) augmented", "start_char_pos": 10, "end_char_pos": 25, "major_intent": "meaning-changed", "raw_intents": ["style", "meaning-changed", "meaning-changed"]}, {"type": "R", "before": "in leveraging knowledge graphs (KGs) for", "after": "on various", "start_char_pos": 66, "end_char_pos": 106, "major_intent": "clarity", "raw_intents": ["coherence", "clarity", "clarity"]}, {"type": "D", "before": ", like question answering (QA)", "after": null, "start_char_pos": 135, "end_char_pos": 165, "major_intent": "clarity", "raw_intents": ["clarity", "clarity", "coherence"]}, {"type": "R", "before": "fact sparsity, inherent in human-annotated KGs, can hinder such models from retrieving task-relevant knowledge", "after": "KG edge (fact) sparsity and noisy edge extraction/generation often hinder models from obtaining useful knowledge to reason over", "start_char_pos": 177, "end_char_pos": 287, "major_intent": "style", "raw_intents": ["style", "style", "clarity"]}, {"type": "A", "before": null, "after": "a new KG-augmented model:", "start_char_pos": 326, "end_char_pos": 326, "major_intent": "meaning-changed", "raw_intents": ["meaning-changed", "meaning-changed", "meaning-changed"]}, {"type": "R", "before": ", a neural-symbolic model that reasons over both extracted (human-labeled) and generated facts within the same learned", "after": ". 
Unlike prior methods, HGN learns to jointly contextualize extracted and generated knowledge by reasoning over both within a unified", "start_char_pos": 354, "end_char_pos": 472, "major_intent": "meaning-changed", "raw_intents": ["meaning-changed", "meaning-changed", "meaning-changed"]}, {"type": "R", "before": "a KG subgraphof extracted facts", "after": "the task input context and an extracted KG subgraph", "start_char_pos": 496, "end_char_pos": 527, "major_intent": "clarity", "raw_intents": ["style", "clarity", "clarity"]}, {"type": "D", "before": "jointly", "after": null, "start_char_pos": 537, "end_char_pos": 544, "major_intent": "clarity", "raw_intents": ["clarity", "clarity", "clarity"]}, {"type": "R", "before": "complementary facts, encode relational information in the resulting", "after": "embeddings for the subgraph's missing edges to form a", "start_char_pos": 565, "end_char_pos": 632, "major_intent": "meaning-changed", "raw_intents": ["clarity", "meaning-changed", "meaning-changed"]}, {"type": "R", "before": "subgraph, and filter out task-irrelevant facts", "after": "graph, then reason over the hybrid graph while filtering out context-irrelevant edges", "start_char_pos": 642, "end_char_pos": 688, "major_intent": "clarity", "raw_intents": ["meaning-changed", "clarity", "clarity"]}, {"type": "R", "before": "ability to produce contextually pertinent subgraphs by showing", "after": "effectiveness through", "start_char_pos": 712, "end_char_pos": 774, "major_intent": "clarity", "raw_intents": ["clarity", "clarity", "coherence"]}, {"type": "R", "before": "and", "after": ", plus", "start_char_pos": 851, "end_char_pos": 854, "major_intent": "coherence", "raw_intents": ["coherence", "clarity", "coherence"]}, {"type": "R", "before": "of fact", "after": "on edge", "start_char_pos": 868, "end_char_pos": 875, "major_intent": "style", "raw_intents": ["style", "fluency", "clarity"]}], "sents_char_pos": [0, 289, 489, 690], "domain": "arxiv"} |
{"doc_id": "2011.00416", "revision_depth": 1, "before_revision": "Driven by the increasingly larger deep learning models, neural language generation (NLG) has enjoyed unprecedentedly improvement and is now able to generate a diversity of human-like texts on demand, granting itself the capability of serving as a human writing assistant. Text attribute transfer is one of the most important NLG tasks, which aims to control certain attributes that people may expect the texts to possess , such as sentiment, tense, emotion, political position, etc . It has a long history in Natural Language Processing but recently gains much more attention thanks to the promising performance brought by deep learning models. In this article , we present a systematic survey on these works for neural text attribute transfer. We collect all related academic works since the first appearance in 2017. We then select, summarize, discuss, and analyze around 65 representative works in a comprehensive way. Overall, we have covered the task formulation, existing datasets and metrics for model development and evaluation , and all methods developed over the last several years. We reveal that existing methods are indeed based on a combination of several loss functions with each of which serving a certain goal. Such a unique perspective we provide could shed light on the design of new methods. We conclude our survey with a discussion on open issues that need to be resolved for better future development .", "after_revision": "Text style transfer (TST) is an important task in natural language generation (NLG) , which aims to control certain attributes in the generated text , such as politeness, emotion, humor, and many others . It has a long history in the field of natural language processing (NLP), but recently it has gained significant attention thanks to the promising performance brought by deep learning models. In this paper , we present a systematic survey of the research done on neural text style transfer. We have collected, summarized, and discussed nearly 70 representative articles since the first neural text style transfer work in 2017. Overall, we have covered the task formulation, existing datasets and subtasks, evaluation metrics, and methods on parallel and non-parallel data. We also provide discussions a variety of important topics regarding TST, which can shed light on new development in this field. Our curated paper list is at URL", "edit_actions": [{"type": "R", "before": "Driven by the increasingly larger deep learning models, neural", "after": "Text style transfer (TST) is an important task in natural", "start_char_pos": 0, "end_char_pos": 62, "major_intent": "meaning-changed", "raw_intents": ["meaning-changed", "meaning-changed", "style"]}, {"type": "R", "before": "has enjoyed unprecedentedly improvement and is now able to generate a diversity of human-like texts on demand, granting itself the capability of serving as a human writing assistant. 
Text attribute transfer is one of the most important NLG tasks,", "after": ",", "start_char_pos": 89, "end_char_pos": 335, "major_intent": "coherence", "raw_intents": ["coherence", "coherence", "coherence"]}, {"type": "R", "before": "that people may expect the texts to possess", "after": "in the generated text", "start_char_pos": 377, "end_char_pos": 420, "major_intent": "clarity", "raw_intents": ["clarity", "style", "clarity"]}, {"type": "R", "before": "sentiment, tense, emotion, political position, etc", "after": "politeness, emotion, humor, and many others", "start_char_pos": 431, "end_char_pos": 481, "major_intent": "meaning-changed", "raw_intents": ["clarity", "meaning-changed", "meaning-changed"]}, {"type": "R", "before": "Natural Language Processing but recently gains much more", "after": "the field of natural language processing (NLP), but recently it has gained significant", "start_char_pos": 509, "end_char_pos": 565, "major_intent": "clarity", "raw_intents": ["coherence", "clarity", "clarity"]}, {"type": "R", "before": "article", "after": "paper", "start_char_pos": 653, "end_char_pos": 660, "major_intent": "clarity", "raw_intents": ["clarity", "style", "clarity"]}, {"type": "R", "before": "on these works for neural text attribute", "after": "of the research done on neural text style", "start_char_pos": 694, "end_char_pos": 734, "major_intent": "clarity", "raw_intents": ["style", "clarity", "clarity"]}, {"type": "R", "before": "collect all related academic works since the first appearance in", "after": "have collected, summarized, and discussed nearly 70 representative articles since the first neural text style transfer work in", "start_char_pos": 748, "end_char_pos": 812, "major_intent": "meaning-changed", "raw_intents": ["meaning-changed", "meaning-changed", "meaning-changed"]}, {"type": "D", "before": "We then select, summarize, discuss, and analyze around 65 representative works in a comprehensive way.", "after": null, "start_char_pos": 819, "end_char_pos": 921, "major_intent": "coherence", "raw_intents": ["coherence", "coherence", "coherence"]}, {"type": "R", "before": "metrics for model development and evaluation , and all methods developed over the last several years. We reveal that existing methods are indeed based on a combination of several loss functions with each of which serving a certain goal. Such a unique perspective we provide could", "after": "subtasks, evaluation metrics, and methods on parallel and non-parallel data. We also provide discussions a variety of important topics regarding TST, which can", "start_char_pos": 991, "end_char_pos": 1270, "major_intent": "meaning-changed", "raw_intents": ["meaning-changed", "meaning-changed", "meaning-changed"]}, {"type": "R", "before": "the design of new methods. We conclude our survey with a discussion on open issues that need to be resolved for better future development .", "after": "new development in this field. Our curated paper list is at URL", "start_char_pos": 1285, "end_char_pos": 1424, "major_intent": "meaning-changed", "raw_intents": ["coherence", "meaning-changed", "meaning-changed"]}], "sents_char_pos": [0, 271, 483, 644, 744, 818, 921, 1092, 1227, 1311], "domain": "arxiv"} |
{"doc_id": "2011.04393", "revision_depth": 1, "before_revision": "Recently, contextualized word embeddings outperform static word embeddings on many NLP tasks. However, we still don't know much about the mechanism inside these internal representationsproduced by BERT . Do they have any common patterns? What are the relations between word sense and context ? We find that nearly all the contextualized word vectors of BERT and RoBERTa have some common patterns . For BERT, the 557^{th} element is always the smallest. For RoBERTa, the 588^{th} element is always the largest and the 77^{th} element is the smallest. We call them as \"tails\" of models. We find that these \"tails\" are the major cause of anisotrpy of the vector space. After \"cutting the tails\", the same word's different vectors are more similar to each other. The internal representations also perform better on word-in-context (WiC) task. These suggest that \" cutting the tails\" can decrease the influence of context and better represent word sense .", "after_revision": "Recently, contextualized word embeddings outperform static word embeddings on many NLP tasks. However, we still do not know much about the mechanism inside these representations . Do they have any common patterns? If so, where do these patterns come from ? We find that almost all the contextualized word vectors of BERT and RoBERTa have a common pattern . For BERT, the 557^{th} element is always the smallest. For RoBERTa, the 588^{th} element is always the largest , and the 77^{th} element is the smallest. We call them \"tails\" of models. We introduce a new neuron-level method to analyze where these \"tails\" come from. We find that these \"tails\" are closely related to the positional information. We also investigate what will happen if we \"cutting the tails\" (zero-out). Our results show that \"tails\" are the major cause of anisotropy of vector space. After \"cutting the tails\", a word's different vectors are more similar to each other. The internal representations have a better ability to distinguish a word's different senses with the word-in-context (WiC) dataset. The performance on the word sense disambiguation task is better for BERT and unchanged for RoBERTa. We can also better induce phrase grammar from the vector space. These suggest that \" tails\" are less related to the sense and syntax information in vectors. 
These findings provide insights into the inner workings of contextualized word vectors .", "edit_actions": [{"type": "R", "before": "don't", "after": "do not", "start_char_pos": 112, "end_char_pos": 117, "major_intent": "fluency", "raw_intents": ["coherence", "fluency", "fluency"]}, {"type": "R", "before": "internal representationsproduced by BERT", "after": "representations", "start_char_pos": 161, "end_char_pos": 201, "major_intent": "clarity", "raw_intents": ["clarity", "clarity", "clarity"]}, {"type": "R", "before": "What are the relations between word sense and context", "after": "If so, where do these patterns come from", "start_char_pos": 238, "end_char_pos": 291, "major_intent": "clarity", "raw_intents": ["clarity", "style", "clarity"]}, {"type": "R", "before": "nearly", "after": "almost", "start_char_pos": 307, "end_char_pos": 313, "major_intent": "style", "raw_intents": ["style", "clarity", "style"]}, {"type": "R", "before": "some common patterns", "after": "a common pattern", "start_char_pos": 375, "end_char_pos": 395, "major_intent": "clarity", "raw_intents": ["clarity", "style", "fluency"]}, {"type": "A", "before": null, "after": ",", "start_char_pos": 509, "end_char_pos": 509, "major_intent": "fluency", "raw_intents": ["fluency", "fluency", "fluency"]}, {"type": "D", "before": "as", "after": null, "start_char_pos": 564, "end_char_pos": 566, "major_intent": "fluency", "raw_intents": ["fluency", "coherence", "fluency"]}, {"type": "A", "before": null, "after": "introduce a new neuron-level method to analyze where these \"tails\" come from. We", "start_char_pos": 589, "end_char_pos": 589, "major_intent": "meaning-changed", "raw_intents": ["coherence", "meaning-changed", "meaning-changed"]}, {"type": "R", "before": "the", "after": "closely related to the positional information. We also investigate what will happen if we \"cutting the tails\" (zero-out). Our results show that \"tails\" are the", "start_char_pos": 618, "end_char_pos": 621, "major_intent": "meaning-changed", "raw_intents": ["meaning-changed", "meaning-changed", "meaning-changed"]}, {"type": "R", "before": "anisotrpy of the", "after": "anisotropy of", "start_char_pos": 637, "end_char_pos": 653, "major_intent": "fluency", "raw_intents": ["fluency", "fluency", "fluency"]}, {"type": "R", "before": "the same", "after": "a", "start_char_pos": 695, "end_char_pos": 703, "major_intent": "clarity", "raw_intents": ["clarity", "clarity", "clarity"]}, {"type": "R", "before": "also perform better on", "after": "have a better ability to distinguish a word's different senses with the", "start_char_pos": 790, "end_char_pos": 812, "major_intent": "clarity", "raw_intents": ["style", "clarity", "clarity"]}, {"type": "R", "before": "task.", "after": "dataset. The performance on the word sense disambiguation task is better for BERT and unchanged for RoBERTa. We can also better induce phrase grammar from the vector space.", "start_char_pos": 835, "end_char_pos": 840, "major_intent": "meaning-changed", "raw_intents": ["meaning-changed", "meaning-changed", "meaning-changed"]}, {"type": "D", "before": "cutting the", "after": null, "start_char_pos": 862, "end_char_pos": 873, "major_intent": "style", "raw_intents": ["style", "style", "style"]}, {"type": "R", "before": "can decrease the influence of context and better represent word sense", "after": "are less related to the sense and syntax information in vectors. 
These findings provide insights into the inner workings of contextualized word vectors", "start_char_pos": 881, "end_char_pos": 950, "major_intent": "meaning-changed", "raw_intents": ["style", "meaning-changed", "meaning-changed"]}], "sents_char_pos": [0, 93, 237, 293, 397, 452, 550, 585, 667, 760, 840], "domain": "arxiv"} |
{"doc_id": "2101.01321", "revision_depth": 1, "before_revision": "Transformer based models, like BERT and RoBERTa, have achieved state-of-the-art results in many Natural Language Processing tasks. However, their memory footprint, inference latency, and power consumption are prohibitive for many edgeprocessors, and it has been a challenge to deploy these models for edge applications and devices that have resource constraints . While quantization can be a viable solution to this, previous work on quantizing Transformer based models uses floating-point arithmetic during inference, thus limiting model deployment on many edge processors. In this work, we propose a novel integer-only quantization scheme for Transformer based models that quantizes the entire inference process. In particular, we demonstrate how to approximate nonlinear operationsin Transformer architectures , e.g., GELU, Softmax, and Layer Normalization, with lightweight integer computations. We use those approximations in our method, I-BERT , with an end-to-end integer-only inference, and without any floating point calculation. We test our approach on GLUE downstream tasks using RoBERTa-Base and RoBERTa-Large. For both cases, with an 8-bit integer-only quantization scheme, I-BERT achieves similar accuracy as compared to the full-precision baseline .", "after_revision": "Transformer based models, like BERT and RoBERTa, have achieved state-of-the-art results in many Natural Language Processing tasks. However, their memory footprint, inference latency, and power consumption are prohibitive for efficient inference at the edge, and even at the data center . While quantization can be a viable solution for this, previous work on quantizing Transformer based models use floating-point arithmetic during inference, which cannot efficiently utilize integer-only logical units such as the recent Turing Tensor Cores, or traditional integer-only ARM processors. In this work, we propose I-BERT, a novel quantization scheme for Transformer based models that quantizes the entire inference with integer-only arithmetic. Based on lightweight integer-only approximation methods for nonlinear operations , e.g., GELU, Softmax, and Layer Normalization, I-BERT performs an end-to-end integer-only BERT inference without any floating point calculation. We evaluate our approach on GLUE downstream tasks using RoBERTa-Base /Large. We show that for both cases, I-BERT achieves similar (and slightly higher) accuracy as compared to the full-precision baseline . Furthermore, our preliminary implementation of I-BERT shows a speedup of 2.4 - 4.0x for INT8 inference on a T4 GPU system as compared to FP32 inference. 
The framework has been developed in PyTorch and has been open-sourced .", "edit_actions": [{"type": "R", "before": "many edgeprocessors, and it has been a challenge to deploy these models for edge applications and devices that have resource constraints", "after": "efficient inference at the edge, and even at the data center", "start_char_pos": 225, "end_char_pos": 361, "major_intent": "clarity", "raw_intents": ["clarity", "clarity", "clarity"]}, {"type": "R", "before": "to", "after": "for", "start_char_pos": 408, "end_char_pos": 410, "major_intent": "fluency", "raw_intents": ["fluency", "fluency", "fluency"]}, {"type": "R", "before": "uses", "after": "use", "start_char_pos": 470, "end_char_pos": 474, "major_intent": "fluency", "raw_intents": ["fluency", "fluency", "fluency"]}, {"type": "R", "before": "thus limiting model deployment on many edge", "after": "which cannot efficiently utilize integer-only logical units such as the recent Turing Tensor Cores, or traditional integer-only ARM", "start_char_pos": 519, "end_char_pos": 562, "major_intent": "meaning-changed", "raw_intents": ["meaning-changed", "meaning-changed", "meaning-changed"]}, {"type": "R", "before": "a novel integer-only", "after": "I-BERT, a novel", "start_char_pos": 600, "end_char_pos": 620, "major_intent": "clarity", "raw_intents": ["clarity", "clarity", "meaning-changed"]}, {"type": "R", "before": "process. In particular, we demonstrate how to approximate nonlinear operationsin Transformer architectures", "after": "with integer-only arithmetic. Based on lightweight integer-only approximation methods for nonlinear operations", "start_char_pos": 706, "end_char_pos": 812, "major_intent": "meaning-changed", "raw_intents": ["coherence", "meaning-changed", "meaning-changed"]}, {"type": "D", "before": "with lightweight integer computations. We use those approximations in our method,", "after": null, "start_char_pos": 861, "end_char_pos": 942, "major_intent": "clarity", "raw_intents": ["clarity", "coherence", "clarity"]}, {"type": "R", "before": ", with", "after": "performs", "start_char_pos": 950, "end_char_pos": 956, "major_intent": "clarity", "raw_intents": ["clarity", "clarity", "clarity"]}, {"type": "R", "before": "inference, and", "after": "BERT inference", "start_char_pos": 984, "end_char_pos": 998, "major_intent": "clarity", "raw_intents": ["clarity", "clarity", "meaning-changed"]}, {"type": "R", "before": "test", "after": "evaluate", "start_char_pos": 1042, "end_char_pos": 1046, "major_intent": "clarity", "raw_intents": ["style", "clarity", "clarity"]}, {"type": "R", "before": "and RoBERTa-Large. For", "after": "/Large. We show that for", "start_char_pos": 1104, "end_char_pos": 1126, "major_intent": "clarity", "raw_intents": ["clarity", "style", "clarity"]}, {"type": "D", "before": "with an 8-bit integer-only quantization scheme,", "after": null, "start_char_pos": 1139, "end_char_pos": 1186, "major_intent": "clarity", "raw_intents": ["clarity", "clarity", "coherence"]}, {"type": "A", "before": null, "after": "(and slightly higher)", "start_char_pos": 1211, "end_char_pos": 1211, "major_intent": "meaning-changed", "raw_intents": ["meaning-changed", "clarity", "meaning-changed"]}, {"type": "A", "before": null, "after": ". Furthermore, our preliminary implementation of I-BERT shows a speedup of 2.4 - 4.0x for INT8 inference on a T4 GPU system as compared to FP32 inference. 
The framework has been developed in PyTorch and has been open-sourced", "start_char_pos": 1264, "end_char_pos": 1264, "major_intent": "meaning-changed", "raw_intents": ["meaning-changed", "meaning-changed", "meaning-changed"]}], "sents_char_pos": [0, 130, 363, 574, 714, 899, 1038, 1122], "domain": "arxiv"} |
{"doc_id": "2103.06922", "revision_depth": 2, "before_revision": "Recent studies indicate that NLU models are prone to rely on shortcut features for prediction, without achieving true language understanding. As a result, these models fail to generalize to real-world out-of-distribution data. In this work, we show that the words in the NLU training set can be modeled as a long-tailed distribution. There are two findings: 1) NLU models have strong preference for features located at the head of the long-tailed distribution, and 2) Shortcut features are picked up during very early few iterations of the model training. These two observations are further employed to formulate a measurement which can quantify the shortcut degree of each training sample. Based on this shortcut measurement, we propose a shortcut mitigation framework LGTR , to suppress the model from making overconfident predictions for samples with large shortcut degree. Experimental results on three NLU benchmarks demonstrate that our long-tailed distribution explanation accurately reflects the shortcut learning behavior of NLU models. Experimental analysis further indicates that LGTR can improve the generalization accuracy on OOD data, while preserving the accuracy on in-distribution data.", "after_revision": "Recent studies indicate that NLU models are prone to rely on shortcut features for prediction, without achieving true language understanding. As a result, these models fail to generalize to real-world out-of-distribution data. In this work, we show that the words in the NLU training set can be modeled as a long-tailed distribution. There are two findings: 1) NLU models have strong preference for features located at the head of the long-tailed distribution, and 2) Shortcut features are picked up during very early few iterations of the model training. These two observations are further employed to formulate a measurement which can quantify the shortcut degree of each training sample. Based on this shortcut measurement, we propose a shortcut mitigation framework LTGR , to suppress the model from making overconfident predictions for samples with large shortcut degree. Experimental results on three NLU benchmarks demonstrate that our long-tailed distribution explanation accurately reflects the shortcut learning behavior of NLU models. Experimental analysis further indicates that LTGR can improve the generalization accuracy on OOD data, while preserving the accuracy on in-distribution data.", "edit_actions": [{"type": "R", "before": "LGTR", "after": "LTGR", "start_char_pos": 770, "end_char_pos": 774, "major_intent": "clarity", "raw_intents": ["clarity", "fluency", "clarity"]}, {"type": "R", "before": "LGTR", "after": "LTGR", "start_char_pos": 1091, "end_char_pos": 1095, "major_intent": "fluency", "raw_intents": ["fluency", "fluency", "clarity"]}], "sents_char_pos": [0, 141, 226, 333, 555, 690, 876, 1045], "domain": "arxiv"} |
{"doc_id": "109525", "revision_depth": 2, "before_revision": "A car crash in the British region of Leicestershire has resulted in the deaths of six people , three men and three women. The incident occurred at around 01:00 UTC today ( 0200 local time) and it involved a lorry and a Ford Mondeo on the A607 road. Location of Leicestershire within England Simon Adkin , from the local police , said that \" The black Mondeo was travelling towards Leicester and the lorry was travelling in the opposite direction.\" He then said that the police \"are appealing to anyone who witnessed the incident and saw either of these vehicles beforehand to get in touch.\" The driver of the Ford, who was killed in the incident, was a 23-year-old male. In addition to the 23-year-old male, an 18-year-old man, a 19-year old woman and an 18-year-old woman were also killed in the incident. The age of the other two people is not yet known.", "after_revision": "A car crash in the British region of Leicestershire has resulted in the deaths of six people : three men and three women. The incident , involving a lorry and a Ford Mondeo, occurred at around 01:00 UTC today ( 02:00 local time) on the A607 road. Location of Leicestershire within England Simon Adkin of the local police said that, \" the black Mondeo was travelling towards Leicester and the lorry was travelling in the opposite direction.\" He announced that the police \"are appealing to anyone who witnessed the incident and saw either of these vehicles beforehand to get in touch.\" The driver of the Ford, who was killed in the incident, was a 23-year old male. An 18-year old man, 19-year old woman and 18-year old woman were also killed in the incident. The ages of the other two people are not yet known.", "edit_actions": [{"type": "R", "before": ",", "after": ":", "start_char_pos": 93, "end_char_pos": 94, "major_intent": "fluency", "raw_intents": ["fluency", "others", "fluency"]}, {"type": "A", "before": null, "after": ", involving a lorry and a Ford Mondeo,", "start_char_pos": 135, "end_char_pos": 135, "major_intent": "meaning-changed", "raw_intents": ["meaning-changed", "meaning-changed", "meaning-changed"]}, {"type": "R", "before": "0200", "after": "02:00", "start_char_pos": 173, "end_char_pos": 177, "major_intent": "fluency", "raw_intents": ["clarity", "fluency", "fluency"]}, {"type": "D", "before": "and it involved a lorry and a Ford Mondeo", "after": null, "start_char_pos": 190, "end_char_pos": 231, "major_intent": "coherence", "raw_intents": ["coherence", "clarity", "coherence"]}, {"type": "R", "before": ", from", "after": "of", "start_char_pos": 304, "end_char_pos": 310, "major_intent": "fluency", "raw_intents": ["coherence", "fluency", "fluency"]}, {"type": "R", "before": ", said that", "after": "said that,", "start_char_pos": 328, "end_char_pos": 339, "major_intent": "fluency", "raw_intents": ["fluency", "fluency", "fluency"]}, {"type": "R", "before": "The", "after": "the", "start_char_pos": 342, "end_char_pos": 345, "major_intent": "fluency", "raw_intents": ["fluency", "clarity", "fluency"]}, {"type": "R", "before": "then said", "after": "announced", "start_char_pos": 452, "end_char_pos": 461, "major_intent": "clarity", "raw_intents": ["clarity", "clarity", "clarity"]}, {"type": "R", "before": "23-year-old male. In addition to the 23-year-old male, an 18-year-old man, a", "after": "23-year old male. 
An 18-year old man,", "start_char_pos": 654, "end_char_pos": 730, "major_intent": "fluency", "raw_intents": ["fluency", "coherence", "clarity"]}, {"type": "R", "before": "an 18-year-old", "after": "18-year old", "start_char_pos": 753, "end_char_pos": 767, "major_intent": "fluency", "raw_intents": ["fluency", "fluency", "fluency"]}, {"type": "R", "before": "age", "after": "ages", "start_char_pos": 812, "end_char_pos": 815, "major_intent": "fluency", "raw_intents": ["fluency", "fluency", "style"]}, {"type": "R", "before": "is", "after": "are", "start_char_pos": 840, "end_char_pos": 842, "major_intent": "fluency", "raw_intents": ["fluency", "fluency", "fluency"]}], "sents_char_pos": [0, 121, 249, 448, 591, 671, 807], "domain": "news"} |
|
{"doc_id": "12341", "revision_depth": 1, "before_revision": "300px|U2 in concert earlier this year in Anaheim 240,000 U2 fans are preparing for Ireland's most successful band ever to return for a hometown gig in Dublin . The supergroup is playing Friday, Saturday and Monday in front of 80,000 people a night at Croke Park. Garda have put in place a massive traffic management plan to cope with the influx of people. Fans have been queing since Thursday morning when a group of fans from Holland , Italy and Canada started queing outside the stadium. The concert forms part of U2's current \" Vertigo \" world tour. The support acts include emerging Irish bands Snow Patrol and The Thrills along with already very popular Paddy Casey. Ash and The Bravery will cap it off as warm up acts on Monday. The gates for the gig will open at 4pm with the actual band not expected until around 8.30pm. Irish media has entered a frenzied anticipation of the concert all this week, with constant coverage on radio stations. The national broadcaster, RT , is set to dedicated almost 3hr 30min to the band with an exclusive interview by Dave Fanning forming the centerpiece of the stations coverage of the \"U2 weekend\". Coldplay frontman Chris Martin , who played in front of a sell out crowd in Dublin earlier this week told fans that U2 were \"still the best band in the world\". Croke Park is the fourth largest stadium in Europe, with an official capacity of 82,000 - bigger than both Cardiff's Millennium Stadium and Paris' Stade de France . Croke Park is used to host GAA matches in sports such as hurling and Gaelic football and regularly attracts audiences of 60,000 or more in the summer months.", "after_revision": "300px|U2 in concert earlier this year in Anaheim 240,000 fans are preparing for Ireland's most successful band ever to return for a hometown gig in . The supergroup is playing Friday, Saturday and Monday in front of 80,000 people a night at . have put in place a massive traffic management plan to cope with the influx of people. Fans have been queing since Thursday morning when a group of fans from , Italy and Canada started queuing outside the stadium. The concert forms part of U2's current \" \" world tour. The support acts include emerging Irish bands and along with already very popular Paddy Casey. and will cap it off as warm up acts on Monday. The gates for the gig will open at 4pm with the actual band not expected until around 8.30pm. Irish media has entered a frenzied anticipation of the concert all this week, with constant coverage on radio stations. The national broadcaster, , is set to dedicated almost 3hr 30min to the band with an exclusive interview by Dave Fanning forming the centerpiece of the stations coverage of the \"U2 weekend\". frontman , who played in front of a sell out crowd in Dublin earlier this week told fans that U2 were \"still the best band in the world\". Croke Park is the fourth largest stadium in Europe, with an official capacity of 82,000 - bigger than both w|Cardiff|Cardiff's and . 
Croke Park is used to host GAA matches in sports such as and and regularly attracts audiences of 60,000 or more in the summer months.", "edit_actions": [{"type": "D", "before": "U2", "after": null, "start_char_pos": 57, "end_char_pos": 59, "major_intent": "style", "raw_intents": ["style", "style", "style"]}, {"type": "D", "before": "Dublin", "after": null, "start_char_pos": 151, "end_char_pos": 157, "major_intent": "others", "raw_intents": ["others", "others", "others"]}, {"type": "R", "before": "Croke Park. Garda", "after": ".", "start_char_pos": 251, "end_char_pos": 268, "major_intent": "others", "raw_intents": ["others", "others", "others"]}, {"type": "D", "before": "Holland", "after": null, "start_char_pos": 427, "end_char_pos": 434, "major_intent": "meaning-changed", "raw_intents": ["meaning-changed", "meaning-changed", "meaning-changed"]}, {"type": "R", "before": "queing", "after": "queuing", "start_char_pos": 462, "end_char_pos": 468, "major_intent": "fluency", "raw_intents": ["fluency", "fluency", "fluency"]}, {"type": "D", "before": "Vertigo", "after": null, "start_char_pos": 531, "end_char_pos": 538, "major_intent": "others", "raw_intents": ["others", "others", "others"]}, {"type": "R", "before": "Snow Patrol and The Thrills", "after": "and", "start_char_pos": 599, "end_char_pos": 626, "major_intent": "style", "raw_intents": ["style", "style", "clarity"]}, {"type": "R", "before": "Ash and The Bravery", "after": "and", "start_char_pos": 672, "end_char_pos": 691, "major_intent": "others", "raw_intents": ["others", "others", "others"]}, {"type": "D", "before": "RT", "after": null, "start_char_pos": 975, "end_char_pos": 977, "major_intent": "others", "raw_intents": ["others", "others", "others"]}, {"type": "R", "before": "Coldplay frontman Chris Martin", "after": "frontman", "start_char_pos": 1143, "end_char_pos": 1173, "major_intent": "others", "raw_intents": ["others", "others", "others"]}, {"type": "R", "before": "Cardiff's Millennium Stadium and Paris' Stade de France", "after": "w|Cardiff|Cardiff's", "start_char_pos": 1410, "end_char_pos": 1465, "major_intent": "others", "raw_intents": ["others", "others", "others"]}, {"type": "A", "before": null, "after": "and", "start_char_pos": 1466, "end_char_pos": 1466, "major_intent": "clarity", "raw_intents": ["clarity", "clarity", "clarity"]}, {"type": "R", "before": "hurling and Gaelic football", "after": "and", "start_char_pos": 1526, "end_char_pos": 1553, "major_intent": "clarity", "raw_intents": ["clarity", "clarity", "clarity"]}], "sents_char_pos": [0, 159, 262, 355, 489, 552, 671, 734, 828, 948, 1142, 1302]} |
|
{"doc_id": "51762", "revision_depth": 1, "before_revision": "Sports presenter for 3 News has given an unsatisfactory letter of apology after he abused a taxi driver in Taupo while he was drunk. The taxi driver, Mere Wall, was verbally abused by Brown on September 13 where he, allegedly, called Wall a \"Tuwharetoa whore and bitch,\" and a \"Tainui whore and bitch.\" Brown was then punched by the partner of a woman he had insulted. Brown sent an unsigned letter of apology to Wall, which Wall said she received last week , she had asked for one after the incident occurred . Wall said she was unhappy with the letter and has written to TV3 to complain. Wall said that the letter was two paragraphs long and said that due to a concussion Brown cannot remember the incident but if he did do something wrong then he is sorry. The letter contained no return address but she said: \"I have written to TV3 [about the letter].\" Wall hopes to get a better apology. However Wall has admitted that she did make some comments about Brown after the incident, Wall said: \"But it has gone on and on.\" Brown was stood down after the incident by TV3 and Roger Beaumont, a spokesman for TV3, said that he is still not back at work and it is unsure when, if, he returns . \"The investigation is still continuing. TV3 and police were investigating the incident but it is not known how long the investigation would take.\" TV3 was also conducting their own investigation of which included hiring a private investigator. They said that they are not ruling out any disciplinary action if the allegations are proved to be correct.", "after_revision": "Clint Brown, a sports presenter for 3 News , has given a reportedly \"unsatisfactory\" letter of apology after he abused a taxi driver in Taupo while drunk. The taxi driver, Mere Wall, was verbally abused by Brown on September 13 where he, allegedly, called Wall a \"Tuwharetoa whore and bitch,\" and a \"Tainui whore and bitch.\" Brown was then punched by the partner of a woman he had insulted. Wall had asked for an apology after the incident. Brown sent an unsigned letter of apology to Wall, which Wall said she received last week . Wall said she was unhappy with the letter , and has written to TV3 to complain. Wall said that the letter was two paragraphs long , and stated that Brown cannot remember the incident due to concussion, but if he did do something wrong then he is sorry. The letter contained no return address . Wall hopes to get a better apology, saying \"I have written to TV3 [about the letter].\" Wall admits to making some comments about Brown after the incident, Wall said: \"But it has gone on and on.\" Brown was stood down after the incident by TV3 . Roger Beaumont, a spokesman for TV3, said that he is still not back at work and it is unsure when, or if, he will return . \"The investigation is still continuing. TV3 and police were investigating the incident but it is not known how long the investigation would take.\" TV3 was also conducting their own investigation , which included hiring a private investigator. 
They are not ruling out any disciplinary action if the allegations are proved to be correct.", "edit_actions": [{"type": "R", "before": "Sports", "after": "Clint Brown, a sports", "start_char_pos": 0, "end_char_pos": 6, "major_intent": "meaning-changed", "raw_intents": ["meaning-changed", "meaning-changed", "meaning-changed"]}, {"type": "R", "before": "has given an unsatisfactory", "after": ", has given a reportedly \"unsatisfactory\"", "start_char_pos": 28, "end_char_pos": 55, "major_intent": "clarity", "raw_intents": ["clarity", "clarity", "clarity"]}, {"type": "D", "before": "he was", "after": null, "start_char_pos": 119, "end_char_pos": 125, "major_intent": "fluency", "raw_intents": ["fluency", "clarity", "fluency"]}, {"type": "A", "before": null, "after": "Wall had asked for an apology after the incident.", "start_char_pos": 369, "end_char_pos": 369, "major_intent": "meaning-changed", "raw_intents": ["meaning-changed", "clarity", "meaning-changed"]}, {"type": "D", "before": ", she had asked for one after the incident occurred", "after": null, "start_char_pos": 459, "end_char_pos": 510, "major_intent": "coherence", "raw_intents": ["coherence", "clarity", "coherence"]}, {"type": "A", "before": null, "after": ",", "start_char_pos": 555, "end_char_pos": 555, "major_intent": "coherence", "raw_intents": ["coherence", "coherence", "fluency"]}, {"type": "R", "before": "and said that due to a concussion", "after": ", and stated that", "start_char_pos": 642, "end_char_pos": 675, "major_intent": "clarity", "raw_intents": ["clarity", "clarity", "clarity"]}, {"type": "A", "before": null, "after": "due to concussion,", "start_char_pos": 711, "end_char_pos": 711, "major_intent": "meaning-changed", "raw_intents": ["meaning-changed", "meaning-changed", "meaning-changed"]}, {"type": "R", "before": "but she said:", "after": ". Wall hopes to get a better apology, saying", "start_char_pos": 802, "end_char_pos": 815, "major_intent": "coherence", "raw_intents": ["coherence", "meaning-changed", "coherence"]}, {"type": "D", "before": "Wall hopes to get a better apology.", "after": null, "start_char_pos": 860, "end_char_pos": 895, "major_intent": "coherence", "raw_intents": ["coherence", "clarity", "coherence"]}, {"type": "R", "before": "However Wall has admitted that she did make", "after": "Wall admits to making", "start_char_pos": 896, "end_char_pos": 939, "major_intent": "clarity", "raw_intents": ["clarity", "clarity", "clarity"]}, {"type": "R", "before": "and", "after": ".", "start_char_pos": 1073, "end_char_pos": 1076, "major_intent": "fluency", "raw_intents": ["fluency", "style", "fluency"]}, {"type": "A", "before": null, "after": "or", "start_char_pos": 1176, "end_char_pos": 1176, "major_intent": "clarity", "raw_intents": ["clarity", "clarity", "fluency"]}, {"type": "R", "before": "returns", "after": "will return", "start_char_pos": 1184, "end_char_pos": 1191, "major_intent": "fluency", "raw_intents": ["clarity", "fluency", "fluency"]}, {"type": "R", "before": "of", "after": ",", "start_char_pos": 1389, "end_char_pos": 1391, "major_intent": "fluency", "raw_intents": ["fluency", "fluency", "coherence"]}, {"type": "D", "before": "said that they", "after": null, "start_char_pos": 1443, "end_char_pos": 1457, "major_intent": "clarity", "raw_intents": ["clarity", "clarity", "clarity"]}], "sents_char_pos": [0, 132, 302, 368, 512, 591, 762, 859, 895, 1233, 1340, 1437], "domain": "news"} |
|
{"doc_id": "55544", "revision_depth": 1, "before_revision": "Today at a special caucus meeting, member of parliaments (MP) for National agreed that the new leader of the opposition, New Zealand's National Party, is John Key and the deputy leader is Bill English , they are taking over from past leader, Dr Don Brash and past deputy leader, Gerry Brownlee. Bill English, 45-years-old, use to be leader of the National Party before being outed in 2003 and being replaced by Dr Brash for a bad performance at the 2002 New Zealand elections, Mr English only managed to get 22\\% of the votes cast. Mr Key, also 45-years-old, said at the announcement of the new leader and deputy leader that he was honoured that he was chosen to be the new leader of the National Party and also hounoured that Mr English would be his deputy. \"Can I tell you I think we will make a formidable team . Key said the public placed a high value on unity. If National could demonstrate it could manage itself, then it could start to manage the country.\" Dr Brash will however stay on the National Party. He is at number 3 on the list of National MP , behind Key and English. His new caucus responsibilities will be : Spokesman for the Security and Intelligence Service and Spokesman for Relationships with Non-Government Parties. His select committee responsibility is Security and Intelligence.", "after_revision": "Today at a special caucus meeting, member of parliaments (MP) for National agreed that the new leader of the opposition, New Zealand's National Party, is John Key and the deputy leader is Bill English . They are taking over from past leader, Dr Don Brash and past deputy leader, Gerry Brownlee. Bill English, 45-years-old, is the former leader of the National Party . He was outed in 2003 , for a bad performance at the 2002 New Zealand elections, and replaced by Dr Brash. Mr English only managed to get 22\\% of the votes cast. John Key, also 45-years-old, said at the announcement of the new leader and deputy leader that he was honoured that he was chosen to be the new leader of the National Party and also honoured that Mr English would be his . If National could demonstrate it could manage itself, then it could start to manage the country. deputy. \"Can I tell you I think we will make a formidable team ,\" he said . Key said the public placed a high value on unity. Dr Brash will however stay on the National Party. He is at number 3 on the list of National MPs , behind Key and English. His new caucus responsibilities will be Spokesman for the Security and Intelligence Service and Spokesman for Relationships with Non-Government Parties. His select committee responsibility is Security and Intelligence.", "edit_actions": [{"type": "R", "before": ", they", "after": ". They", "start_char_pos": 201, "end_char_pos": 207, "major_intent": "coherence", "raw_intents": ["coherence", "coherence", "fluency"]}, {"type": "R", "before": "use to be", "after": "is the former", "start_char_pos": 323, "end_char_pos": 332, "major_intent": "clarity", "raw_intents": ["clarity", "clarity", "clarity"]}, {"type": "R", "before": "before being", "after": ". 
He was", "start_char_pos": 362, "end_char_pos": 374, "major_intent": "clarity", "raw_intents": ["clarity", "clarity", "style"]}, {"type": "R", "before": "and being replaced by Dr Brash", "after": ",", "start_char_pos": 389, "end_char_pos": 419, "major_intent": "coherence", "raw_intents": ["coherence", "coherence", "clarity"]}, {"type": "A", "before": null, "after": "and replaced by Dr Brash.", "start_char_pos": 477, "end_char_pos": 477, "major_intent": "meaning-changed", "raw_intents": ["clarity", "meaning-changed", "meaning-changed"]}, {"type": "R", "before": "Mr", "after": "John", "start_char_pos": 533, "end_char_pos": 535, "major_intent": "clarity", "raw_intents": ["clarity", "clarity", "coherence"]}, {"type": "R", "before": "hounoured", "after": "honoured", "start_char_pos": 713, "end_char_pos": 722, "major_intent": "fluency", "raw_intents": ["fluency", "fluency", "fluency"]}, {"type": "A", "before": null, "after": ". If National could demonstrate it could manage itself, then it could start to manage the country.", "start_char_pos": 752, "end_char_pos": 752, "major_intent": "meaning-changed", "raw_intents": ["coherence", "meaning-changed", "meaning-changed"]}, {"type": "A", "before": null, "after": ",\" he said", "start_char_pos": 816, "end_char_pos": 816, "major_intent": "fluency", "raw_intents": ["fluency", "fluency", "coherence"]}, {"type": "D", "before": "If National could demonstrate it could manage itself, then it could start to manage the country.\"", "after": null, "start_char_pos": 869, "end_char_pos": 966, "major_intent": "coherence", "raw_intents": ["coherence", "coherence", "coherence"]}, {"type": "R", "before": "MP", "after": "MPs", "start_char_pos": 1059, "end_char_pos": 1061, "major_intent": "fluency", "raw_intents": ["fluency", "fluency", "fluency"]}, {"type": "D", "before": ":", "after": null, "start_char_pos": 1128, "end_char_pos": 1129, "major_intent": "fluency", "raw_intents": ["fluency", "clarity", "fluency"]}], "sents_char_pos": [0, 294, 532, 760, 868, 966, 1016, 1087, 1242], "domain": "news"} |
|
{"doc_id": "64904", "revision_depth": 1, "before_revision": "A cannabis leaf The Australian state of New South Wales (NSW) launched its latest anti-cannabis propaganda campaign in Sydney today. The propaganda campaign, which specifically targets 14 to 19 year-olds aims to reduce the number of young people experimenting with the soft drug. The advertisements are the result of a tightening of cannabis laws in NSW last year. The propaganda , which NSW Health has cost at AUD$600,000 , that could of been spent on feeding starving children, will use a variety of print ads placed at bus stops and in youth magazines in addition to propaganda messages on websites such as MySpace and MSN. Propaganda ads use a tag line saying \"Pot. It mightn't kill you, but it could turn politicians into dickheads \". The propaganda images show a person staring from a black and white photo with quotes like \"You've got great eyes, dialated eyes are hot \" and \"I'd lend you money, but the government would just spend it on these ads \" , as part of an emotional appeal to manipulate the reader into an unfair bias on the topic, instead of comparing facts and science to the message they are promoting . The director of drug and alcohol programs with NSW Health, David McGrath said many young people know that cannabis was safe and its use was normal. \"There is still a significant cohort of people who think, particularly with the young person's age group, that cannabis is normal,\" Mr McGrath said. Mr McGrath said despite the number of young cannabis users had halved from forty percent to twenty percent, but claimed more needed to be done. \"We want to prevent people who are thinking about taking cannabis up from taking it up and also encouraging those people that are using cannabis to cease, through use of unrelenting lies and social shunning \" he said. Mr McGrath said young people had to be warned about the possible consequences of smoking cannabis, such as the alleged link with comedians and how it may improve social interaction and physical health. NSW Health Minister, Reba Meagher said a recent study indicated that while cannabis use among young people is in decline, almost one third of teenagers have tried cannabis. She also claimed that cannabis remained common within the community. \"Cannabis is a drug that is readily available in our community and it does not come with serious risk,\" she said.", "after_revision": "A cannabis leaf The Australian state of New South Wales (NSW) launched its latest anti-cannabis campaign in Sydney today. The campaign, which specifically targets 14 to 19 year-olds aims to reduce the number of young people experimenting with the drug. The advertisements follow a tightening of cannabis laws in NSW last year. The campaign , which NSW Health has cost at AUD$600,000 will use a variety of print ads placed at bus stops and in youth magazines in addition to advertisements on websites such as MySpace and MSN. Print ads use a tag line saying \"Pot. It mightn't kill you, but it could turn you into a dickhead \". The advertisements show a person staring from a black and white photo with quotes like \"You've got great eyes, when they're not bloodshot \" and \"I'd lend you money, but you still owe me from last time \" . The director of drug and alcohol programs with NSW Health, David McGrath said many young people believed that cannabis was safe and its use was normal. 
\"There is still a significant cohort of people who think, particularly with the young person's age group, that cannabis is normal,\" Mr McGrath said. Mr McGrath said despite the number of young cannabis users had halved from forty percent to twenty percent, but claimed more needed to be done. \"We want to prevent people who are thinking about taking cannabis up from taking it up and also encouraging those people that are using cannabis to cease, \" he said. Mr McGrath said young people had to be warned about the possible consequences of smoking cannabis, such as the alleged link with mental illnesss and how it may affect social interaction and physical health. NSW Health Minister, Reba Meagher said a recent study indicated that while cannabis use among young people is in decline, almost one third of teenagers have tried cannabis. She also claimed that cannabis remained common within the community. \"Cannabis is a drug that is readily available in our community and it does come with serious risk,\" she said.", "edit_actions": [{"type": "D", "before": "propaganda", "after": null, "start_char_pos": 96, "end_char_pos": 106, "major_intent": "clarity", "raw_intents": ["clarity", "clarity", "coherence"]}, {"type": "D", "before": "propaganda", "after": null, "start_char_pos": 137, "end_char_pos": 147, "major_intent": "clarity", "raw_intents": ["clarity", "clarity", "clarity"]}, {"type": "D", "before": "soft", "after": null, "start_char_pos": 269, "end_char_pos": 273, "major_intent": "clarity", "raw_intents": ["clarity", "clarity", "clarity"]}, {"type": "R", "before": "are the result of", "after": "follow", "start_char_pos": 299, "end_char_pos": 316, "major_intent": "clarity", "raw_intents": ["clarity", "clarity", "clarity"]}, {"type": "R", "before": "propaganda", "after": "campaign", "start_char_pos": 369, "end_char_pos": 379, "major_intent": "clarity", "raw_intents": ["clarity", "clarity", "clarity"]}, {"type": "D", "before": ", that could of been spent on feeding starving children,", "after": null, "start_char_pos": 423, "end_char_pos": 479, "major_intent": "coherence", "raw_intents": ["clarity", "coherence", "coherence"]}, {"type": "R", "before": "propaganda messages", "after": "advertisements", "start_char_pos": 570, "end_char_pos": 589, "major_intent": "clarity", "raw_intents": ["clarity", "clarity", "clarity"]}, {"type": "R", "before": "Propaganda", "after": "Print", "start_char_pos": 627, "end_char_pos": 637, "major_intent": "clarity", "raw_intents": ["clarity", "style", "clarity"]}, {"type": "R", "before": "politicians into dickheads", "after": "you into a dickhead", "start_char_pos": 710, "end_char_pos": 736, "major_intent": "clarity", "raw_intents": ["clarity", "clarity", "meaning-changed"]}, {"type": "R", "before": "propaganda images", "after": "advertisements", "start_char_pos": 744, "end_char_pos": 761, "major_intent": "clarity", "raw_intents": ["clarity", "clarity", "clarity"]}, {"type": "R", "before": "dialated eyes are hot", "after": "when they're not bloodshot", "start_char_pos": 854, "end_char_pos": 875, "major_intent": "meaning-changed", "raw_intents": ["clarity", "meaning-changed", "meaning-changed"]}, {"type": "R", "before": "the government would just spend it on these ads", "after": "you still owe me from last time", "start_char_pos": 907, "end_char_pos": 954, "major_intent": "meaning-changed", "raw_intents": ["meaning-changed", "meaning-changed", "meaning-changed"]}, {"type": "D", "before": ", as part of an emotional appeal to manipulate the reader into an unfair bias on 
the topic, instead of comparing facts and science to the message they are promoting", "after": null, "start_char_pos": 957, "end_char_pos": 1121, "major_intent": "coherence", "raw_intents": ["clarity", "coherence", "coherence"]}, {"type": "R", "before": "know", "after": "believed", "start_char_pos": 1220, "end_char_pos": 1224, "major_intent": "clarity", "raw_intents": ["clarity", "clarity", "clarity"]}, {"type": "D", "before": "through use of unrelenting lies and social shunning", "after": null, "start_char_pos": 1720, "end_char_pos": 1771, "major_intent": "coherence", "raw_intents": ["coherence", "coherence", "clarity"]}, {"type": "R", "before": "comedians", "after": "mental illnesss", "start_char_pos": 1912, "end_char_pos": 1921, "major_intent": "meaning-changed", "raw_intents": ["others", "meaning-changed", "meaning-changed"]}, {"type": "R", "before": "improve", "after": "affect", "start_char_pos": 1937, "end_char_pos": 1944, "major_intent": "clarity", "raw_intents": ["clarity", "style", "clarity"]}, {"type": "D", "before": "not", "after": null, "start_char_pos": 2302, "end_char_pos": 2305, "major_intent": "meaning-changed", "raw_intents": ["others", "meaning-changed", "meaning-changed"]}], "sents_char_pos": [0, 132, 279, 364, 669, 739, 1123, 1271, 1420, 1564, 1782, 1984, 2157, 2226], "domain": "news"} |
|
{"doc_id": "74857", "revision_depth": 1, "before_revision": "The political involvement of most media and other factors are causing contradictions among official sources that are making press work extremely hard. Moreover, nobody is permitted to reach the accident location. The site of the accident was closed to visitors following the intoxication of three TV journalists who got too close. Location of the cloud Map highlighting Lviv, the site of the derailment, in the Lviv Oblast. TV channel Novyj Kanal translated a telephonic interview with officers from the Ministry of Internal Affairs during which the existence of any cloud whatsoever is firmly denied. Immediately after the interview a press conference by Kiev mayor , Mr. Leonid Chernovezkij, announces that starting next Monday a daily report about the movements of the same cloud will be available to the population. The service will be provided by the Kiev administration. In an interview to TV channel UT-1 an officer of the ecological service admitted that they have no real data about the area closer to the accident, because not even government officers are permitted to reach it. The officer said that most data about the hypothetical movements of the cloud (if the cloud exists) are made based on computer simulations they receive from Russian vendors, because no such simulation model is currently available in the Ukraine. The closest available on spot measures are related to areas located several kilometers away from the epicentre. Removal operation In the same press release the Mayor of Kiev also announced that no transport of whatever dangerous material will be allowed in the town area. The removal by railway of the phosphor containers involved in the accident was originally planned for today. Its unclear how Kievs transit refusal may affect the operationand whether the mayor has the authority to block national traffic on the railways. According to the Press Manager of the Ministery for Emergency Situations, Mr. Igor Krol', four containers will be lifted and put back on the rails today. Five containers have already been lifted. Yet the weather may affect the operations, as violent storms and very strong winds are expected in the area. This is going to introduce a serious risk factor for the phosphor that still remains on the ground. The contaminating products are in fact insulated from the air by means of pillows made of air and foam; a strong wind may cause new emergencies by even just partially removing such insulation structures. Yushchenko issued an official call to speed up the closure of the Chernobyl atomic power plants on July , 20. Immediately after he left the country with his family for an unofficial visit to Poland, that will be followed by an official visit to Germany. He is expected to travel to Germany on the evening of July 21. The Ukraine Procuror General, Mr. Aleksandr Medvedenko, declares to the press that he has visited the accident site together with the President, and that all necessary measures are being taken, both for the liquidation of the accident and the defense of the civilian population. He notes that \"it takes courage to work there, for the personnel of the Ministry for Emergency Situations\". He announces that a complex cycle of medical care is being planned for about 1500 children of the affected area. In the same interview the Procuror General announces that the Government of Kazakhstan will accept being returned the phosphor left (the goods were originally from Kazakhstan). 
Yet in the same hours the director of Kazphosphat (the vendor ) declares in an interview to the newspaper Segodnja that he gathers that \"the phosphor will be stocked in the Ukraine, because it makes no sense to transfer it back. Moreover, you can hardly imagine that Russia would accept the passage of such a dangerous damaged load on their railways\".", "after_revision": "The political involvement of most media and other factors are causing contradictions among official sources that are making press work extremely hard. Moreover, no one is permitted to enter the accident location. The site of the accident was closed to visitors following the injury of three TV journalists who got too close. Location of the cloud Map highlighting Lviv, the site of the derailment, in the Lviv Oblast. TV channel Novyj Kanal translated a telephonic interview with officers from the Ministry of Internal Affairs , during which the existence of any cloud whatsoever was firmly denied. Immediately after the interview , a press conference by the mayor of Kiev , Mr. Leonid Chernovezkij, announced that starting next Monday a daily report about the movements of the cloud will be available to the population. The service will be provided by the Kiev administration. In an interview with TV channel UT-1 , an officer of the ecological service admitted that they have no real data about the area closer to the accident, because not even government officers are permitted to reach it. The officer said that most data about the hypothetical movements of the cloud (if the cloud exists) are made based on computer simulations they receive from Russian vendors, because no such simulation model is currently available in the Ukraine. The closest available on spot measures are taken in areas located several kilometers away from the epicentre. Removal operation In the same press release , the Mayor of Kiev also announced that no transport of any dangerous material will be allowed in the town area. The removal by railway of the phosphor containers involved in the accident was originally planned for today. It is unclear how Kievs transit refusal may affect this operation, or whether the mayor has the authority to block national traffic on the railways. According to the Press Manager of the Ministery for Emergency Situations, Mr. Igor Krol', four containers will be lifted and put back on the rails today. Five containers have already been righted. However, weather may affect the operations, as violent storms and very strong winds are expected in the area. This is going to introduce a serious risk factor for the phosphor that still remains on the ground. The contaminating products are currently insulated from the air by means of pillows made of air and foam; a strong wind may cause new emergencies by even just partially removing such insulation structures. Yushchenko issued an official call to speed up the closure of the Chernobyl atomic power plants on July 20. Immediately afterwards, he left the country with his family for an unofficial visit to Poland, which will be followed by an official visit to Germany. He is expected to travel to Germany on the evening of July 21. The Ukraine Procuror General, Mr. Aleksandr Medvedenko, declared to the press that he has visited the accident site together with the President, and that all necessary measures are being taken, both for the cleanup of the accident and the defense of the civilian population. 
He notes that \"it takes courage to work there, for the personnel of the Ministry for Emergency Situations\". He announces that a complex cycle of medical care is being planned for about 1500 children of the affected area. In the same interview , the Procuror General announced that the Government of Kazakhstan will accept the phosphor still remaining (the goods were originally from Kazakhstan). However, in the same time period, the director of Kazphosphat (the vendor of the materials) declared in an interview to the newspaper Segodnja that he gathers that \"the phosphor will be stocked in the Ukraine, because it makes no sense to transfer it back. Moreover, you can hardly imagine that Russia would accept the passage of such a dangerous damaged load on their railways\".", "edit_actions": [{"type": "R", "before": "nobody", "after": "no one", "start_char_pos": 161, "end_char_pos": 167, "major_intent": "fluency", "raw_intents": ["fluency", "style", "fluency"]}, {"type": "R", "before": "reach", "after": "enter", "start_char_pos": 184, "end_char_pos": 189, "major_intent": "clarity", "raw_intents": ["clarity", "style", "clarity"]}, {"type": "R", "before": "intoxication", "after": "injury", "start_char_pos": 275, "end_char_pos": 287, "major_intent": "clarity", "raw_intents": ["clarity", "meaning-changed", "clarity"]}, {"type": "A", "before": null, "after": ",", "start_char_pos": 533, "end_char_pos": 533, "major_intent": "fluency", "raw_intents": ["fluency", "fluency", "fluency"]}, {"type": "R", "before": "is", "after": "was", "start_char_pos": 585, "end_char_pos": 587, "major_intent": "fluency", "raw_intents": ["style", "fluency", "fluency"]}, {"type": "A", "before": null, "after": ",", "start_char_pos": 635, "end_char_pos": 635, "major_intent": "fluency", "raw_intents": ["coherence", "fluency", "fluency"]}, {"type": "R", "before": "Kiev mayor", "after": "the mayor of Kiev", "start_char_pos": 658, "end_char_pos": 668, "major_intent": "clarity", "raw_intents": ["clarity", "fluency", "clarity"]}, {"type": "R", "before": "announces", "after": "announced", "start_char_pos": 696, "end_char_pos": 705, "major_intent": "fluency", "raw_intents": ["fluency", "fluency", "fluency"]}, {"type": "D", "before": "same", "after": null, "start_char_pos": 774, "end_char_pos": 778, "major_intent": "clarity", "raw_intents": ["coherence", "clarity", "clarity"]}, {"type": "R", "before": "to", "after": "with", "start_char_pos": 895, "end_char_pos": 897, "major_intent": "coherence", "raw_intents": ["fluency", "coherence", "coherence"]}, {"type": "A", "before": null, "after": ",", "start_char_pos": 914, "end_char_pos": 914, "major_intent": "fluency", "raw_intents": ["fluency", "fluency", "fluency"]}, {"type": "R", "before": "related to", "after": "taken in", "start_char_pos": 1381, "end_char_pos": 1391, "major_intent": "clarity", "raw_intents": ["clarity", "clarity", "clarity"]}, {"type": "A", "before": null, "after": ",", "start_char_pos": 1494, "end_char_pos": 1494, "major_intent": "fluency", "raw_intents": ["fluency", "coherence", "fluency"]}, {"type": "R", "before": "whatever", "after": "any", "start_char_pos": 1549, "end_char_pos": 1557, "major_intent": "clarity", "raw_intents": ["clarity", "style", "clarity"]}, {"type": "R", "before": "Its", "after": "It is", "start_char_pos": 1720, "end_char_pos": 1723, "major_intent": "style", "raw_intents": ["fluency", "style", "style"]}, {"type": "R", "before": "the operationand", "after": "this operation, or", "start_char_pos": 1769, "end_char_pos": 1785, "major_intent": 
"fluency", "raw_intents": ["fluency", "fluency", "fluency"]}, {"type": "R", "before": "lifted. Yet the", "after": "righted. However,", "start_char_pos": 2053, "end_char_pos": 2068, "major_intent": "clarity", "raw_intents": ["clarity", "coherence", "clarity"]}, {"type": "R", "before": "in fact", "after": "currently", "start_char_pos": 2301, "end_char_pos": 2308, "major_intent": "clarity", "raw_intents": ["clarity", "clarity", "coherence"]}, {"type": "D", "before": ",", "after": null, "start_char_pos": 2578, "end_char_pos": 2579, "major_intent": "fluency", "raw_intents": ["fluency", "fluency", "fluency"]}, {"type": "R", "before": "after", "after": "afterwards,", "start_char_pos": 2596, "end_char_pos": 2601, "major_intent": "fluency", "raw_intents": ["clarity", "fluency", "fluency"]}, {"type": "R", "before": "that", "after": "which", "start_char_pos": 2673, "end_char_pos": 2677, "major_intent": "fluency", "raw_intents": ["clarity", "fluency", "fluency"]}, {"type": "R", "before": "declares", "after": "declared", "start_char_pos": 2847, "end_char_pos": 2855, "major_intent": "fluency", "raw_intents": ["style", "fluency", "fluency"]}, {"type": "R", "before": "liquidation", "after": "cleanup", "start_char_pos": 2998, "end_char_pos": 3009, "major_intent": "clarity", "raw_intents": ["clarity", "clarity", "clarity"]}, {"type": "A", "before": null, "after": ",", "start_char_pos": 3313, "end_char_pos": 3313, "major_intent": "fluency", "raw_intents": ["fluency", "fluency", "coherence"]}, {"type": "R", "before": "announces", "after": "announced", "start_char_pos": 3335, "end_char_pos": 3344, "major_intent": "fluency", "raw_intents": ["fluency", "fluency", "fluency"]}, {"type": "R", "before": "being returned the phosphor left", "after": "the phosphor still remaining", "start_char_pos": 3391, "end_char_pos": 3423, "major_intent": "clarity", "raw_intents": ["clarity", "clarity", "style"]}, {"type": "R", "before": "Yet", "after": "However,", "start_char_pos": 3469, "end_char_pos": 3472, "major_intent": "coherence", "raw_intents": ["coherence", "coherence", "coherence"]}, {"type": "R", "before": "hours", "after": "time period,", "start_char_pos": 3485, "end_char_pos": 3490, "major_intent": "clarity", "raw_intents": ["style", "clarity", "clarity"]}, {"type": "R", "before": ") declares", "after": "of the materials) declared", "start_char_pos": 3531, "end_char_pos": 3541, "major_intent": "clarity", "raw_intents": ["clarity", "clarity", "clarity"]}], "sents_char_pos": [0, 150, 212, 330, 423, 602, 821, 878, 1091, 1337, 1449, 1610, 1719, 1864, 2018, 2060, 2169, 2269, 2373, 2473, 2727, 2790, 3069, 3177, 3290, 3468, 3697], "domain": "news"} |
|
{"doc_id": "85743", "revision_depth": 1, "before_revision": "Silvio Berlusconi shaking the hand of US President URLe W. Bush. Wiretaps disclosed by Italian daily La Repubblica have upset Italian politics. The conversations suggest that RAI (Italian state television) and private rival Mediaset ( owned by Silvio Berlusconi ) made arrangements to favour Berlusconi himself when he was Prime Minister of Italy. In these conversations RAI and Mediaset managers discussed how to present Berlusconi's defeat at the last elections in news programmes and talk shows and the decision to delay announcement of the bad results. This is not the first time Berlusconi is accused of controlling the Italy's media. Mr Berlusconi did not comment the allegations but one of his closer men , Renato Schifani, said that \"This is a media operation [... ] precise objective of destroying Mr Berlusconi's companies\". Moreover Mediaset said it would sue La Repubblica for these allegations. On the other side Walter Veltroni, the leader of the opposing party, said \"What has emerged is extremely serious\" and asked an internal investigation at RAI.", "after_revision": "Silvio Berlusconi shaking the hand of US President URLe W. Bush. Wiretaps disclosed by Italian daily La Repubblica have upset Italian politics. The conversations suggest that state-run RAI and private rival Mediaset , which is owned by Silvio Berlusconi , made arrangements to favour Berlusconi himself when he was Prime Minister of Italy. In these conversations , RAI and Mediaset managers discussed how to present Berlusconi's defeat at the last elections in news programmes and talk shows and the decision to delay announcement of the bad results. This is not the first time Berlusconi has been accused of controlling the Italian media. Berlusconi did not comment the allegations but one of his close associates , Renato Schifani, said that \"This is a media operation [... with a ] precise objective of destroying Mr Berlusconi's companies\". Mediaset said it would sue La Repubblica for these allegations. 
Walter Veltroni, the leader-elect of the opposition Democratic Party, said, \"What has emerged is extremely serious\" and requested an internal investigation within the RAI.", "edit_actions": [{"type": "R", "before": "RAI (Italian state television)", "after": "state-run RAI", "start_char_pos": 175, "end_char_pos": 205, "major_intent": "clarity", "raw_intents": ["style", "clarity", "clarity"]}, {"type": "R", "before": "(", "after": ", which is", "start_char_pos": 233, "end_char_pos": 234, "major_intent": "coherence", "raw_intents": ["coherence", "coherence", "clarity"]}, {"type": "R", "before": ")", "after": ",", "start_char_pos": 262, "end_char_pos": 263, "major_intent": "fluency", "raw_intents": ["fluency", "fluency", "coherence"]}, {"type": "A", "before": null, "after": ",", "start_char_pos": 371, "end_char_pos": 371, "major_intent": "fluency", "raw_intents": ["fluency", "fluency", "fluency"]}, {"type": "R", "before": "is", "after": "has been", "start_char_pos": 596, "end_char_pos": 598, "major_intent": "fluency", "raw_intents": ["fluency", "fluency", "style"]}, {"type": "R", "before": "Italy's", "after": "Italian", "start_char_pos": 626, "end_char_pos": 633, "major_intent": "fluency", "raw_intents": ["fluency", "fluency", "clarity"]}, {"type": "D", "before": "Mr", "after": null, "start_char_pos": 641, "end_char_pos": 643, "major_intent": "style", "raw_intents": ["clarity", "style", "style"]}, {"type": "R", "before": "closer men", "after": "close associates", "start_char_pos": 702, "end_char_pos": 712, "major_intent": "clarity", "raw_intents": ["clarity", "clarity", "clarity"]}, {"type": "A", "before": null, "after": "with a", "start_char_pos": 774, "end_char_pos": 774, "major_intent": "clarity", "raw_intents": ["clarity", "fluency", "coherence"]}, {"type": "D", "before": "Moreover", "after": null, "start_char_pos": 837, "end_char_pos": 845, "major_intent": "coherence", "raw_intents": ["coherence", "coherence", "coherence"]}, {"type": "D", "before": "On the other side", "after": null, "start_char_pos": 910, "end_char_pos": 927, "major_intent": "coherence", "raw_intents": ["clarity", "coherence", "coherence"]}, {"type": "R", "before": "leader of the opposing party, said", "after": "leader-elect of the opposition Democratic Party, said,", "start_char_pos": 949, "end_char_pos": 983, "major_intent": "meaning-changed", "raw_intents": ["clarity", "meaning-changed", "meaning-changed"]}, {"type": "R", "before": "asked", "after": "requested", "start_char_pos": 1028, "end_char_pos": 1033, "major_intent": "clarity", "raw_intents": ["fluency", "clarity", "clarity"]}, {"type": "R", "before": "at", "after": "within the", "start_char_pos": 1060, "end_char_pos": 1062, "major_intent": "clarity", "raw_intents": ["clarity", "style", "clarity"]}], "sents_char_pos": [0, 64, 143, 347, 557, 640, 836, 909], "domain": "news"} |
|
{"doc_id": "87307", "revision_depth": 1, "before_revision": "After almost two weeks of negotiations, this years United Nations Convention on Climate Change (UNCCC) is soon going to end, Friday being scheduled as the last day of the talks, with an extention until Saturday being possible if agreement can not be reached beforehand. While the head of the UNFCCC Sekretariat , Yvo de Boer, stressed from the onset that actual targets for emission reductions were not to be expected to be agreed on, the inclusion of a target frame for industrialized nations is being debated after all. However, his stated goal of deciding on a roadmap for negotiations, which he said needed to adress the matter of Greenhouse Gas (GHG) emissions reductions after 2012, and would have to be concluded by 2009, is not yet achieved, there being disagreement on a matter of points. But a bigger point of contestion is that the EU and the vast majority of the 190 nations participating in the conference want to set a target of a 25-40\\% cut in GHG emissions. The U.S., along with Canada and Japan, are refusing to agree to any targets at this point.", "after_revision": "After almost two weeks of negotiations, this years United Nations Convention on Climate Change (UNCCC) is soon going to end, Friday being scheduled as the last day of the talks, with an extension until Saturday being possible if agreement can not be reached beforehand. While the head of the UNFCCC Secretariat , Yvo de Boer, stressed from the onset that actual targets for emission reductions were not to be expected to be agreed on, the inclusion of a target frame for industrialized nations is being debated after all. However, his stated goal of deciding on a roadmap for negotiations, which he said needed to address the matter of Greenhouse Gas (GHG) emissions reductions after 2012, and would have to be concluded by 2009, is not yet achieved, there being disagreement on a matter of points. But a bigger point of contention is that the EU and the vast majority of the 190 nations participating in the conference want to set a target of a 25-40\\% cut in GHG emissions. The U.S., along with Canada and Japan, are refusing to agree to any targets at this point.", "edit_actions": [{"type": "R", "before": "extention", "after": "extension", "start_char_pos": 186, "end_char_pos": 195, "major_intent": "fluency", "raw_intents": ["fluency", "fluency", "fluency"]}, {"type": "R", "before": "Sekretariat", "after": "Secretariat", "start_char_pos": 299, "end_char_pos": 310, "major_intent": "fluency", "raw_intents": ["fluency", "fluency", "fluency"]}, {"type": "R", "before": "adress", "after": "address", "start_char_pos": 614, "end_char_pos": 620, "major_intent": "fluency", "raw_intents": ["fluency", "fluency", "fluency"]}, {"type": "R", "before": "contestion", "after": "contention", "start_char_pos": 820, "end_char_pos": 830, "major_intent": "fluency", "raw_intents": ["fluency", "fluency", "fluency"]}], "sents_char_pos": [0, 269, 521, 797, 974], "domain": "news"} |
|
{"doc_id": "98257", "revision_depth": 1, "before_revision": " The 2008 Taipei Game URLanized by the Taipei Computer Association (TCA), ended on Monday, and was different from shows of past years ; from the gaming population, industry, and exhibition arrangements. Gaming population progressively grown up Game Adventure Island. To encourage the participation of the public, not only participants (exhibitors), but also URLanizers held lots of on-site events including the \"Taiwan Local Game Championship\", \"Gaming Adventure Area\", and several stage events during the show. With many gaming experts and visitors participating in the several events, the gaming population has progressively grown up . Especially in the \"Popular Game & Company Voting\" category, the public apparently looked at the gaming industry on the bright side as the OBMs (game production companies) produced many \"excellence\" products and marketed them around the world. Excellences from participants Microsoft vs. Sony, another platform battle Sony did a grand return by cooperating with government and academical units in Taiwan. Since Sony Computer Entertainment Taiwan Limited (SCE Taiwan) was established in December 2007, SCE Taiwan participated in the 2007 Taipei IT Month to rearrange their styles from marketing and cross-industry cooperations. They finally decided to participate in the 2008 Taipei Game Show and to battle with Microsoft Taiwan by challenging different premiums and plans. On the other side of Microsoft Taiwan, to gain popularity and notability at this show , they (Microsoft ) preliminary moved their new game launch from the 3C store to the TWTC again, since 2007. And also , they invited their spokespeople \"Mayday\" to get more visitors to visit the show. They won all of the categories of the popular presentation class. After a battle between the two main world-class companies, Microsoft & Sony not only showed different cultures from exhibition arrangements and product presentations but successfully marketed their indicative products in consuming and high-definition markets. OBMs in Taiwan Some local OBMs like Softstar Entertainment, International Games Systems (IGS), GameFlier Station, UserJoy Technology, and others showed their versatility by participating not only in the trade (B2B) area for international buyers but also the consuming (B2C) area for local visitors. Not only a show Since the \"Charity Bidding Pavilion\" was created in the past year and recommended by local NGOs and URLanizations, the TCA planned to cooperate with the Taipei Orphans Welfare Association, and the Taiwan Gaming Industry Association (TGIA) to help orphan kids this year again. As of TCA's information , since 2007, company members from the TGIA adopted several orphans and accompanied them to grow up in healthy conditions before the show. The companies URLanizers were successful in turning the image of the Taipei Game Show from \"a simple consuming show\" to \"a mixed show with welfare and charity\" this year. Other Highlights ", "after_revision": "__NOTOC__ The 2008 Taipei Game URLanized by the Taipei Computer Association (TCA), ended on Monday, and was different from shows of past years . This could be seen in the gaming population, industry, and exhibition arrangements. Gaming population progressively maturing Game Adventure Island. 
To encourage the participation of the public, not only participants (exhibitors), but also URLanizers held lots of on-site events including the \"Taiwan Local Game Championship\", \"Gaming Adventure Area\", and several stage events during the show. With many gaming experts and visitors participating in the several events, the gaming population has progressively matured . Especially in the \"Popular Game & Company Voting\" category, the public apparently viewed the gaming industry on the bright side as the OBMs (game production companies) produced many \"excellence\" products and marketed them around the world. Excellence from participants Microsoft vs. Sony, another platform battle Sony did a grand return by cooperating with government and academical units in Taiwan. Since Sony Computer Entertainment Taiwan Limited (SCE Taiwan) was established in December 2007, SCE Taiwan has participated in the 2007 Taipei IT Month to rearrange their styles from marketing and cross-industry cooperations. They finally decided to participate in the 2008 Taipei Game Show and to battle with Microsoft Taiwan by challenging different premiums and plans. Sony rival, Microsoft Taiwan, sought to gain popularity and notability at this show . Microsoft preliminary moved their new game launch from the 3C store to the TWTC again, since 2007. Also , they invited their spokespeople \"Mayday\" to get more visitors to visit the show. They won all of the categories of the popular presentation class. After a battle between the two main world-class companies, Microsoft & Sony not only showed different cultures from exhibition arrangements and product presentations but successfully marketed their indicative products in consuming and high-definition markets. OBMs in Taiwan Some local OBMs like Softstar Entertainment, International Games Systems (IGS), GameFlier Station, UserJoy Technology, and others showed their versatility by participating , not only in the (B2B) trade area for international buyers but also the consumer (B2C) area for local visitors. Not just a show Since the \"Charity Bidding Pavilion\" was created in the past year and recommended by local NGOs and URLanizations, the TCA planned to cooperate with the Taipei Orphans Welfare Association, and the Taiwan Gaming Industry Association (TGIA) to help orphan kids this year again. As of TCA's information in 2007, corporate participants of the TGIA have \"adopted\" several orphans and accompanied them to grow up in healthy conditions before the show. The companies URLanizers were successful in turning the public image of the Taipei Game Show from \"a simple consumer show\" to \"a mixed show with welfare and charity\" this year. Other highlights", "edit_actions": [{"type": "A", "before": null, "after": "__NOTOC__", "start_char_pos": 0, "end_char_pos": 0, "major_intent": "others", "raw_intents": ["others", "others", "meaning-changed"]}, {"type": "R", "before": "; from", "after": ". 
This could be seen in", "start_char_pos": 134, "end_char_pos": 140, "major_intent": "coherence", "raw_intents": ["coherence", "coherence", "coherence"]}, {"type": "R", "before": "grown up", "after": "maturing", "start_char_pos": 235, "end_char_pos": 243, "major_intent": "clarity", "raw_intents": ["clarity", "style", "clarity"]}, {"type": "R", "before": "grown up", "after": "matured", "start_char_pos": 627, "end_char_pos": 635, "major_intent": "clarity", "raw_intents": ["clarity", "clarity", "clarity"]}, {"type": "R", "before": "looked at", "after": "viewed", "start_char_pos": 720, "end_char_pos": 729, "major_intent": "clarity", "raw_intents": ["clarity", "clarity", "clarity"]}, {"type": "R", "before": "Excellences", "after": "Excellence", "start_char_pos": 881, "end_char_pos": 892, "major_intent": "fluency", "raw_intents": ["fluency", "fluency", "clarity"]}, {"type": "A", "before": null, "after": "has", "start_char_pos": 1149, "end_char_pos": 1149, "major_intent": "fluency", "raw_intents": ["fluency", "fluency", "clarity"]}, {"type": "R", "before": "On the other side of", "after": "Sony rival,", "start_char_pos": 1411, "end_char_pos": 1431, "major_intent": "meaning-changed", "raw_intents": ["meaning-changed", "meaning-changed", "meaning-changed"]}, {"type": "A", "before": null, "after": "sought", "start_char_pos": 1450, "end_char_pos": 1450, "major_intent": "clarity", "raw_intents": ["clarity", "style", "clarity"]}, {"type": "R", "before": ", they (Microsoft )", "after": ". Microsoft", "start_char_pos": 1498, "end_char_pos": 1517, "major_intent": "coherence", "raw_intents": ["coherence", "coherence", "coherence"]}, {"type": "R", "before": "And also", "after": "Also", "start_char_pos": 1607, "end_char_pos": 1615, "major_intent": "fluency", "raw_intents": ["fluency", "clarity", "fluency"]}, {"type": "A", "before": null, "after": ",", "start_char_pos": 2212, "end_char_pos": 2212, "major_intent": "fluency", "raw_intents": ["fluency", "fluency", "fluency"]}, {"type": "D", "before": "trade", "after": null, "start_char_pos": 2229, "end_char_pos": 2234, "major_intent": "clarity", "raw_intents": ["clarity", "clarity", "coherence"]}, {"type": "A", "before": null, "after": "trade", "start_char_pos": 2241, "end_char_pos": 2241, "major_intent": "clarity", "raw_intents": ["clarity", "clarity", "meaning-changed"]}, {"type": "R", "before": "consuming", "after": "consumer", "start_char_pos": 2285, "end_char_pos": 2294, "major_intent": "fluency", "raw_intents": ["fluency", "fluency", "fluency"]}, {"type": "R", "before": "only", "after": "just", "start_char_pos": 2330, "end_char_pos": 2334, "major_intent": "clarity", "raw_intents": ["fluency", "clarity", "clarity"]}, {"type": "R", "before": ", since", "after": "in", "start_char_pos": 2642, "end_char_pos": 2649, "major_intent": "clarity", "raw_intents": ["coherence", "clarity", "clarity"]}, {"type": "R", "before": "company members from the TGIA adopted", "after": "corporate participants of the TGIA have \"adopted\"", "start_char_pos": 2656, "end_char_pos": 2693, "major_intent": "clarity", "raw_intents": ["style", "clarity", "clarity"]}, {"type": "A", "before": null, "after": "public", "start_char_pos": 2837, "end_char_pos": 2837, "major_intent": "meaning-changed", "raw_intents": ["meaning-changed", "meaning-changed", "clarity"]}, {"type": "R", "before": "consuming", "after": "consumer", "start_char_pos": 2883, "end_char_pos": 2892, "major_intent": "fluency", "raw_intents": ["fluency", "fluency", "clarity"]}, {"type": "D", "before": "Other Highlights", 
"after": null, "start_char_pos": 2953, "end_char_pos": 2969, "major_intent": "coherence", "raw_intents": ["coherence", "coherence", "coherence"]}, {"type": "A", "before": null, "after": "Other highlights", "start_char_pos": 2970, "end_char_pos": 2970, "major_intent": "meaning-changed", "raw_intents": ["meaning-changed", "meaning-changed", "meaning-changed"]}], "sents_char_pos": [0, 135, 202, 266, 511, 880, 924, 1041, 1264, 1410, 1606, 1698, 1764, 2024, 2325, 2617, 2780, 2952]} |
|
{"doc_id": "1560029", "revision_depth": 2, "before_revision": "The Kingdom of Tungning (), also known as Tywan by the British at the time, pp. 347348. was a dynastic maritime state that ruled part of southwestern Formosa (Taiwan) and Penghu islands between 1661 and 1683. At its peak the kingdom's maritime power dominated varying extents of coastal regions of southeastern China, and its vast trade network stretching from Japan to Southeast Asia. The kingdom was founded by Koxinga (Zheng Chenggong) after seizing control of Taiwan from the Dutch rule. Zheng hoped to restore the Ming dynastic rule on the Chinese mainland , when the Ming remnants' rump state in the southern China was progressively conquered by the Manchu-led Qing dynasty. Zheng dynasts used the newly-owned island as part of the loyalist movement aiming to reclaim the mainland China from the Qing, mainly as a base of military operationsbut also deepening the process of Sinicization on Taiwan, in an effort to consolidate the last stronghold of Han Chinese resistance against the invading Manchus. \"Historical and Legal Aspects of the International Status of Taiwan (Formosa)\" by Ng Yuzin Chiautong, published on August 28, 1971, WUFI Until its annexation by the Qing dynasty in 1683, the kingdom was ruled by Koxinga's heirs, the House of Koxinga , and the period of rule is sometimes referred to as the Koxinga dynasty . Names In reference to its reigning dynasty, the Kingdom of Tungning is sometimes known as the Zheng dynasty (), Zheng clan Kingdom () or Yanping Kingdom (), named after Koxinga's hereditary title of \"Prince of Yanping \" ()that bestowed by the Yongli emperor of the South Ming . Taiwan was referred to by Koxinga as Tungtu (). In Britain , it was known as Tywan (Taiwan), named after the King's residence at the city of \"Tywan\" in present-day Tainan. The period of rule is sometimes referred to as the Koxinga dynasty. ", "after_revision": "The Kingdom of Tungning (), or the Kingdom of Formosa, was a dynastic state that ruled part of southwestern Formosa (Taiwan) between 1661 and 1683. It was founded by Koxinga (Zheng Chenggong) as part of the loyalist movement to restore the Ming dynasty on the Chinese mainland after its rump state in southern China was overthrown by the Manchu-led Qing dynasty. Koxinga hoped to recapture the mainland China from the Qing, using the island as a base of operations. Until its annexation by the Qing dynasty in 1683, the Kingdom was ruled by Koxinga's heirs, the House of Koxinga . Names In reference to its reigning dynasty, the Kingdom of Tungning is sometimes known as the Zheng dynasty (), Zheng Family Kingdom () or Kingdom of Yanping () . Taiwan was referred to by Koxinga as Tungtu (). In the West , it was known as the Kingdom of Taiwan, and the period of rule is sometimes referred to as the Koxinga dynasty. \"Historical and Legal Aspects of the International Status of Taiwan (Formosa)\" WUFI", "edit_actions": [{"type": "R", "before": "also known as Tywan by the British at the time, pp. 
347348.", "after": "or the Kingdom of Formosa,", "start_char_pos": 28, "end_char_pos": 87, "major_intent": "clarity", "raw_intents": ["clarity", "meaning-changed", "clarity"]}, {"type": "D", "before": "maritime", "after": null, "start_char_pos": 103, "end_char_pos": 111, "major_intent": "style", "raw_intents": ["coherence", "style", "style"]}, {"type": "D", "before": "and Penghu islands", "after": null, "start_char_pos": 167, "end_char_pos": 185, "major_intent": "clarity", "raw_intents": ["clarity", "clarity", "coherence"]}, {"type": "D", "before": "At its peak the kingdom's maritime power dominated varying extents of coastal regions of southeastern China, and its vast trade network stretching from Japan to Southeast Asia.", "after": null, "start_char_pos": 209, "end_char_pos": 385, "major_intent": "coherence", "raw_intents": ["coherence", "coherence", "clarity"]}, {"type": "R", "before": "The kingdom", "after": "It", "start_char_pos": 386, "end_char_pos": 397, "major_intent": "clarity", "raw_intents": ["coherence", "clarity", "clarity"]}, {"type": "R", "before": "after seizing control of Taiwan from the Dutch rule. Zheng hoped", "after": "as part of the loyalist movement", "start_char_pos": 439, "end_char_pos": 503, "major_intent": "clarity", "raw_intents": ["clarity", "clarity", "clarity"]}, {"type": "R", "before": "dynastic rule", "after": "dynasty", "start_char_pos": 524, "end_char_pos": 537, "major_intent": "clarity", "raw_intents": ["clarity", "clarity", "clarity"]}, {"type": "R", "before": ", when the Ming remnants'", "after": "after its", "start_char_pos": 562, "end_char_pos": 587, "major_intent": "clarity", "raw_intents": ["clarity", "clarity", "style"]}, {"type": "D", "before": "the", "after": null, "start_char_pos": 602, "end_char_pos": 605, "major_intent": "clarity", "raw_intents": ["clarity", "fluency", "clarity"]}, {"type": "R", "before": "progressively conquered", "after": "overthrown", "start_char_pos": 625, "end_char_pos": 648, "major_intent": "clarity", "raw_intents": ["clarity", "clarity", "clarity"]}, {"type": "R", "before": "Zheng dynasts used the newly-owned island as part of the loyalist movement aiming to reclaim", "after": "Koxinga hoped to recapture", "start_char_pos": 681, "end_char_pos": 773, "major_intent": "clarity", "raw_intents": ["clarity", "meaning-changed", "clarity"]}, {"type": "R", "before": "mainly", "after": "using the island", "start_char_pos": 808, "end_char_pos": 814, "major_intent": "meaning-changed", "raw_intents": ["meaning-changed", "meaning-changed", "clarity"]}, {"type": "R", "before": "military operationsbut also deepening the process of Sinicization on Taiwan, in an effort to consolidate the last stronghold of Han Chinese resistance against the invading Manchus. 
\"Historical and Legal Aspects of the International Status of Taiwan (Formosa)\" by Ng Yuzin Chiautong, published on August 28, 1971, WUFI", "after": "operations.", "start_char_pos": 828, "end_char_pos": 1145, "major_intent": "coherence", "raw_intents": ["coherence", "clarity", "coherence"]}, {"type": "R", "before": "kingdom", "after": "Kingdom", "start_char_pos": 1200, "end_char_pos": 1207, "major_intent": "fluency", "raw_intents": ["fluency", "fluency", "fluency"]}, {"type": "D", "before": ", and the period of rule is sometimes referred to as the Koxinga dynasty", "after": null, "start_char_pos": 1259, "end_char_pos": 1331, "major_intent": "clarity", "raw_intents": ["clarity", "clarity", "coherence"]}, {"type": "R", "before": "clan", "after": "Family", "start_char_pos": 1452, "end_char_pos": 1456, "major_intent": "clarity", "raw_intents": ["clarity", "clarity", "clarity"]}, {"type": "R", "before": "Yanping Kingdom (), named after Koxinga's hereditary title of \"Prince of Yanping \" ()that bestowed by the Yongli emperor of the South Ming", "after": "Kingdom of Yanping ()", "start_char_pos": 1471, "end_char_pos": 1609, "major_intent": "clarity", "raw_intents": ["coherence", "clarity", "clarity"]}, {"type": "R", "before": "Britain", "after": "the West", "start_char_pos": 1663, "end_char_pos": 1670, "major_intent": "clarity", "raw_intents": ["meaning-changed", "clarity", "clarity"]}, {"type": "R", "before": "Tywan (Taiwan), named after the King's residence at the city of \"Tywan\" in present-day Tainan. The", "after": "the Kingdom of Taiwan, and the", "start_char_pos": 1689, "end_char_pos": 1787, "major_intent": "clarity", "raw_intents": ["clarity", "clarity", "clarity"]}, {"type": "A", "before": null, "after": "\"Historical and Legal Aspects of the International Status of Taiwan (Formosa)\" WUFI", "start_char_pos": 1852, "end_char_pos": 1852, "major_intent": "coherence", "raw_intents": ["coherence", "meaning-changed", "coherence"]}], "sents_char_pos": [0, 208, 385, 491, 680, 1008, 1087, 1467, 1611, 1659, 1783], "domain": "wiki"} |
|
{"doc_id": "18060507", "revision_depth": 1, "before_revision": "In the field Corporate diagnosis is a process that involves the three steps of publicly entering a human system, collecting valid data about experiences, and feeding back to the system toward promoting corporate performance (Zarei et al. ( 2014). The effective diagnosis URLanizational culture, and structural and operational strengths and weaknesses are fundamental to any URLanizational development intervention. As BeckhardOrganizational Development: strategies and models - Beckhard 1969 said in the preface to his seminal work ... in our rapidly changing environment, URLanization forms must be developed; more effective goal-setting and planning processes must be learned, and practiced teams of independent people must spend real time improving their methods of working, decision-making and communicating. Competing or conflicting groups must move towards a collaborative way of work. In order for these changes to occur and be maintained, a planned, managed change effort is necessary - a program URLanizational development. Since the beginnings URLanizational development as a profession, diagnosis has moved from the purely behavioral towards a strategic and holistic business diagnostic approach, and from looking at human interventions in isolation to exploring the interactions of people in the context in which they operate. URLanizations are more collaborative in nature, the traditional silo approach to diagnostics is becoming increasingly rare. Organizational development and in particular the diagnostic phase of activities is spreading from the occupational psychologists towards mainstream business. This is important for OD practitioners as the role is increasingly holistic URLanizational diagnosis models Until now, the following models are introduced URLanizational diagnosis: Force Field Analysis (1951) Leavitts model (1965) Likert system analysis (1967) Weisbords six-box model; (1976) defined by focusing on : One major output, exploring the extent in which consumers of the output are satisfied with it, and tracing the reasons for any dissatisfaction. Congruence model URLanization analysis (1977) Mckinsey 7s framework (1981-1982) Tichys technical political cultural (TPC) framework (1983) High-performance programming (1984) Diagnosing individual and group behavior (1987) BurkeLitwin model URLanizational performance and change (1992) All models are based on open system (Open System Theory ( OST): From the General System Theory is defined by Von Bertalaffy (a system complex of interacting elements), Katz and Kahn (1978) apply the concept Open System Theory (OST) looks at the relationship between URLanizations and the environment in which they involved. This focus reflects URLanization's ability to adapt to changes in environment conditions (with or without the need for information processing (Boulding, 1956; Katz and Kahn, 1978) . URLanizational intelligence model (2008) Semantic Network Analysis (2014) (by Zarei, Chaghouee and Ghapanchi) The Consulting Process URLanizational Diagnostic phase is often integrated within an overall OD process, commonly called 'a consulting process'. An example of such a process is: References Cameron & Quinn; Diagnosing and Changing Organizational Culture, 1999 Harrison, Michael I. 
Diagnosing Organizations: Methods, Models, and Processes, 2005 Levinson, Harry; Organizational Diagnosis, 1972 Organizational Diagnosis - A practical approach to company problem solving and growth, 1988 Weisbord, Marvin R; Organizational Diagnosis - A workbook of theory and Practice , 1978 Zarei, B., Chaghouee, Y. and Ghapanchi, F. (2014) - Organizational Diagnosis in Project-Based Companies: Challenges and Directions, Sage open,4(2), PP.17 , DOI: 10.1177/2158244014537498.;", "after_revision": "In the field of corporate diagnosis is a process that involves the three steps of publicly entering a human system, collecting valid data about experiences, and feeding back to the system toward promoting corporate performance (Zarei et al. , 2014). The effective diagnosis URLanizational culture, and structural and operational strengths and weaknesses are fundamental to any URLanizational development intervention. As BeckhardOrganizational Development: strategies and models - Beckhard 1969 said in the preface to his seminal work : ... in our rapidly changing environment, URLanization forms must be developed; more effective goal-setting and planning processes must be learned, and practiced teams of independent people must spend real time improving their methods of working, decision-making and communicating. Competing or conflicting groups must move towards a collaborative way of work. In order for these changes to occur and be maintained, a planned, managed change effort is necessary - a program URLanizational development. Since the beginnings URLanizational development as a profession, diagnosis has moved from the purely behavioral towards a strategic and holistic business diagnostic approach, and from looking at human interventions in isolation to exploring the interactions of people in the context in which they operate. URLanizations are more collaborative in nature, the traditional silo approach to diagnostics is becoming increasingly rare. Organizational development and in particular the diagnostic phase of activities is spreading from the occupational psychologists towards mainstream business. This is important for OD practitioners as the role is increasingly holistic . URLanizational diagnosis models The following models have been introduced URLanizational diagnosis: Force Field Analysis (1951) Leavitts model (1965) Likert system analysis (1967) Weisbords six-box model; (1976) defined by focusing on one major output, exploring the extent to which consumers of the output are satisfied with it, and tracing the reasons for any dissatisfaction. Congruence model URLanization analysis (1977) Mckinsey 7s framework (1981-1982) Tichys technical political cultural (TPC) framework (1983) High-performance programming (1984) Diagnosing individual and group behavior (1987) BurkeLitwin model URLanizational performance and change (1992) All models are based on open system (Open System Theory , OST): From the General System Theory defined by Von Bertalaffy (a system complex of interacting elements), Katz and Kahn (1978) apply the concept of Open System Theory (OST) , looking at the relationship between URLanizations and the environment in which they are involved. This focus reflects on URLanization's ability to adapt to changes in environment conditions (with or without the need for information processing ). 
(Boulding, 1956; Katz and Kahn, 1978) URLanizational intelligence model (2008) Semantic Network Analysis (2014) (by Zarei, Chaghouee and Ghapanchi) The consulting process URLanizational diagnostic phase is often integrated within an overall OD process, commonly called 'a consulting process'. An example of such a process is: References Cameron & Quinn; Diagnosing and Changing Organizational Culture, 1999 Harrison, Michael I. ; Diagnosing Organizations: Methods, Models, and Processes, 2005 Levinson, Harry; Organizational Diagnosis, 1972 Organizational Diagnosis - A practical approach to company problem solving and growth, 1988 Weisbord, Marvin R; Organizational Diagnosis - A workbook of theory and practice , 1978 Zarei, B., Chaghouee, Y. and Ghapanchi, F. (2014) - Organizational Diagnosis in Project-Based Companies: Challenges and Directions, Sage open,4(2), pp. 17 , DOI: 10.1177/2158244014537498.;", "edit_actions": [{"type": "R", "before": "Corporate", "after": "of corporate", "start_char_pos": 13, "end_char_pos": 22, "major_intent": "fluency", "raw_intents": ["fluency", "fluency", "clarity"]}, {"type": "R", "before": "(", "after": ",", "start_char_pos": 238, "end_char_pos": 239, "major_intent": "fluency", "raw_intents": ["fluency", "fluency", "fluency"]}, {"type": "A", "before": null, "after": ":", "start_char_pos": 532, "end_char_pos": 532, "major_intent": "fluency", "raw_intents": ["fluency", "others", "fluency"]}, {"type": "A", "before": null, "after": ".", "start_char_pos": 1698, "end_char_pos": 1698, "major_intent": "fluency", "raw_intents": ["fluency", "fluency", "fluency"]}, {"type": "R", "before": "Until now, the following models are", "after": "The following models have been", "start_char_pos": 1731, "end_char_pos": 1766, "major_intent": "clarity", "raw_intents": ["clarity", "others", "clarity"]}, {"type": "R", "before": ": One", "after": "one", "start_char_pos": 1939, "end_char_pos": 1944, "major_intent": "fluency", "raw_intents": ["fluency", "fluency", "fluency"]}, {"type": "R", "before": "in", "after": "to", "start_char_pos": 1980, "end_char_pos": 1982, "major_intent": "fluency", "raw_intents": ["coherence", "fluency", "fluency"]}, {"type": "R", "before": "(", "after": ",", "start_char_pos": 2427, "end_char_pos": 2428, "major_intent": "fluency", "raw_intents": ["coherence", "fluency", "fluency"]}, {"type": "D", "before": "is", "after": null, "start_char_pos": 2466, "end_char_pos": 2468, "major_intent": "clarity", "raw_intents": ["clarity", "clarity", "coherence"]}, {"type": "A", "before": null, "after": "of", "start_char_pos": 2578, "end_char_pos": 2578, "major_intent": "fluency", "raw_intents": ["fluency", "fluency", "fluency"]}, {"type": "R", "before": "looks", "after": ", looking", "start_char_pos": 2604, "end_char_pos": 2609, "major_intent": "fluency", "raw_intents": ["fluency", "fluency", "style"]}, {"type": "A", "before": null, "after": "are", "start_char_pos": 2686, "end_char_pos": 2686, "major_intent": "fluency", "raw_intents": ["fluency", "fluency", "fluency"]}, {"type": "A", "before": null, "after": "on", "start_char_pos": 2717, "end_char_pos": 2717, "major_intent": "fluency", "raw_intents": ["fluency", "others", "fluency"]}, {"type": "A", "before": null, "after": ").", "start_char_pos": 2840, "end_char_pos": 2840, "major_intent": "fluency", "raw_intents": ["fluency", "others", "fluency"]}, {"type": "D", "before": ".", "after": null, "start_char_pos": 2879, "end_char_pos": 2880, "major_intent": "fluency", "raw_intents": ["fluency", "fluency", "fluency"]}, {"type": 
"R", "before": "Consulting Process URLanizational Diagnostic", "after": "consulting process URLanizational diagnostic", "start_char_pos": 2995, "end_char_pos": 3039, "major_intent": "others", "raw_intents": ["style", "others", "others"]}, {"type": "A", "before": null, "after": ";", "start_char_pos": 3271, "end_char_pos": 3271, "major_intent": "fluency", "raw_intents": ["fluency", "fluency", "fluency"]}, {"type": "R", "before": "Practice", "after": "practice", "start_char_pos": 3547, "end_char_pos": 3555, "major_intent": "fluency", "raw_intents": ["others", "fluency", "fluency"]}, {"type": "R", "before": "PP.17", "after": "pp. 17", "start_char_pos": 3711, "end_char_pos": 3716, "major_intent": "fluency", "raw_intents": ["fluency", "fluency", "fluency"]}], "sents_char_pos": [0, 246, 414, 611, 813, 892, 1033, 1339, 1463, 1621, 1908, 2084, 2434, 2696, 2857, 3135, 3196, 3351, 3494], "domain": "wiki"} |
|
{"doc_id": "22062208", "revision_depth": 1, "before_revision": "Discovery-driven planning is a planning technique first introduced in a Harvard Business Review article by Rita Gunther McGrath and Ian C. MacMillan in 1995 McGrath , R. G. & MacMillan, I. C. 1995. Discovery Driven Planning . Harvard Business Review, 73(4): 4454 and subsequently referenced in a number of books and articles.McGrath, R. G. & MacMillan, I. C. 2009. Discovery Driven Growth: A Breakthrough Process to Reduce Risk and Seize Opportunity . Boston: Harvard Business Publishing ; Christensen, C. M. 1997. The innovator's dilemma: When new technologies cause great firms to fail. Boston , Mass. : Harvard Business School Press Its main thesis is that when one is operating in arenas with significant amounts of uncertainty, that a different approach than is normally used in conventional planning applies . In conventional planning, the correctness of a plan is generally judged by how close outcomes come to projections. In discovery-driven planning, it is assumed that plan parameters may change as new information is revealed. With conventional planning, it is considered appropriate to fund the entire project as the expectation is that one can predict a positive outcome. In discovery driven planning, funds are released based on the accomplishment of key milestones or checkpoints, at which point additional funding can be made available predicated on reasonable expectations for future success.Block, Z. & MacMillan, I. C. 1985. Milestones for successful venture planning. Harvard Business Review, 63(5): 8490 Conventional project management tools, such as stage-gate models or the use of financial tools to assess innovation have been found to be flawed in that they are not well suited for the uncertainty of innovation-oriented projects Rajesh, S. & Zafar, I. 2008. Stage-Gate Controls, Learning Failure, and Adverse Effect on Novel New Products . Journal of Marketing, 72(1): 118. Christensen, C., Kaufman, S., & Shih, W. 2008. Innovation killers: how financial tools destroy your capacity to do new things. Harvard Business Review, 86(1): 98105, 137 Discovery-driven planning has been widely used in entrepreneurship curricula and has recently been cited by Steven G. Blank as a foundational idea in the lean startup methodology Blank, S. (2013). \"Why the Lean Start-Up Changes Everything. \" Harvard Business Review 91(5): 63-72. A discovery driven plan incorporates five disciplines or plan elements: Definition of success for the plan or initiative, including a 'reverse' income statement Benchmarking against market and competitive parameters Specification of operational requirements Documentation of assumptions Specification of key checkpoints Using discovery-driven planning, it is often possible to iterate the ideas in a plan, encouraging experimentation at lowest possible cost. The methodology is consistent with the application of real options reasoning to business planning, in which ventures are considered 'real' options. A real option is a small investment made today which buys the right, but not the obligation to make further investments.McGrath, R. G. 1997. A real options logic for initiating technology positioning investments. Academy of Management Review, 22(4): 974996van Putten, A. B. & MacMillan, I. C. 2004. Making Real Options Really Work . Harvard Business Review, 82(12): 134Dixit , A. K. & Pindyck, R. S. 1994. Investment Under Uncertainty. 
Princeton, New Jersey : Princeton University Press ", "after_revision": "Discovery-driven planning is a planning technique first introduced in a Harvard Business Review article by Rita Gunther McGrath and Ian C. MacMillan in 1995McGrath , R. G. & MacMillan, I. C. 1995. Discovery driven planning . Harvard Business Review, 73(4): 4454. and subsequently referenced in a number of books and articles.McGrath, R. G. & MacMillan, I. C. 2009. Discovery driven growth: a breakthrough process to reduce risk and seize opportunity . Boston: Harvard Business Publishing . Christensen, C. M. 1997. The innovator's dilemma: When new technologies cause great firms to fail. Boston : Harvard Business School Press . Its main thesis is that when one is operating in arenas with significant amounts of uncertainty, that a different approach applies than is normally used in conventional planning . In conventional planning, the correctness of a plan is generally judged by how close outcomes come to projections. In discovery-driven planning, it is assumed that plan parameters may change as new information is revealed. With conventional planning, it is considered appropriate to fund the entire project , as the expectation is that one can predict a positive outcome. In discovery-driven planning, funds are released based on the accomplishment of key milestones or checkpoints, at which point additional funding can be made available predicated on reasonable expectations for future success.Block, Z. & MacMillan, I. C. 1985. Milestones for successful venture planning. Harvard Business Review, 63(5): 8490. Conventional project management tools, such as stage-gate models or the use of financial tools to assess innovation , have been found to be flawed in that they are not well suited for the uncertainty of innovation-oriented projects Rajesh, S. & Zafar, I. 2008. Stage-gate controls, learning failure, and adverse effect on novel new products . Journal of Marketing, 72(1): 118. Christensen, C., Kaufman, S., & Shih, W. 2008. Innovation killers: how financial tools destroy your capacity to do new things. Harvard Business Review, 86(1): 98105, 137. Discovery-driven planning has been widely used in entrepreneurship curricula and has recently been cited by Steven G. Blank as a foundational idea in the lean startup methodology Blank, S. 2013. Why the lean start-up changes everything. Harvard Business Review , 91(5): 6372. A discovery-driven plan incorporates five disciplines or plan elements: Definition of success for the plan or initiative, including a 'reverse' income statement Benchmarking against market and competitive parameters Specification of operational requirements Documentation of assumptions Specification of key checkpoints Using discovery-driven planning, it is often possible to iterate the ideas in a plan, encouraging experimentation at lowest possible cost. The methodology is consistent with the application of real options reasoning to business planning, in which ventures are considered \"real\" options. A real option is a small investment made today which buys the right, but not the obligation to make further investments.McGrath, R. G. 1997. A real options logic for initiating technology positioning investments. Academy of Management Review, 22(4): 974996. van Putten, A. B. & MacMillan, I. C. 2004. Making real options really work . Harvard Business Review, 82(12): 134.Dixit , A. K. & Pindyck, R. S. 1994. Investment under uncertainty. 
Princeton : Princeton University Press .", "edit_actions": [{"type": "R", "before": "1995 McGrath", "after": "1995McGrath", "start_char_pos": 152, "end_char_pos": 164, "major_intent": "others", "raw_intents": ["others", "fluency", "others"]}, {"type": "R", "before": "Driven Planning", "after": "driven planning", "start_char_pos": 208, "end_char_pos": 223, "major_intent": "fluency", "raw_intents": ["others", "fluency", "fluency"]}, {"type": "R", "before": "4454", "after": "4454.", "start_char_pos": 258, "end_char_pos": 262, "major_intent": "fluency", "raw_intents": ["fluency", "fluency", "fluency"]}, {"type": "R", "before": "Driven Growth: A Breakthrough Process to Reduce Risk and Seize Opportunity", "after": "driven growth: a breakthrough process to reduce risk and seize opportunity", "start_char_pos": 375, "end_char_pos": 449, "major_intent": "others", "raw_intents": ["fluency", "others", "others"]}, {"type": "R", "before": ";", "after": ".", "start_char_pos": 488, "end_char_pos": 489, "major_intent": "fluency", "raw_intents": ["fluency", "fluency", "fluency"]}, {"type": "D", "before": ", Mass.", "after": null, "start_char_pos": 596, "end_char_pos": 603, "major_intent": "style", "raw_intents": ["style", "others", "clarity"]}, {"type": "A", "before": null, "after": ".", "start_char_pos": 636, "end_char_pos": 636, "major_intent": "fluency", "raw_intents": ["fluency", "fluency", "fluency"]}, {"type": "A", "before": null, "after": "applies", "start_char_pos": 760, "end_char_pos": 760, "major_intent": "meaning-changed", "raw_intents": ["meaning-changed", "coherence", "meaning-changed"]}, {"type": "D", "before": "applies", "after": null, "start_char_pos": 808, "end_char_pos": 815, "major_intent": "clarity", "raw_intents": ["clarity", "clarity", "style"]}, {"type": "A", "before": null, "after": ",", "start_char_pos": 1125, "end_char_pos": 1125, "major_intent": "fluency", "raw_intents": ["fluency", "fluency", "fluency"]}, {"type": "R", "before": "discovery driven", "after": "discovery-driven", "start_char_pos": 1192, "end_char_pos": 1208, "major_intent": "fluency", "raw_intents": ["fluency", "fluency", "fluency"]}, {"type": "R", "before": "8490", "after": "8490.", "start_char_pos": 1524, "end_char_pos": 1528, "major_intent": "fluency", "raw_intents": ["fluency", "fluency", "fluency"]}, {"type": "A", "before": null, "after": ",", "start_char_pos": 1645, "end_char_pos": 1645, "major_intent": "fluency", "raw_intents": ["fluency", "fluency", "fluency"]}, {"type": "R", "before": "Stage-Gate Controls, Learning Failure, and Adverse Effect on Novel New Products", "after": "Stage-gate controls, learning failure, and adverse effect on novel new products", "start_char_pos": 1789, "end_char_pos": 1868, "major_intent": "others", "raw_intents": ["fluency", "others", "others"]}, {"type": "R", "before": "137", "after": "137.", "start_char_pos": 2071, "end_char_pos": 2074, "major_intent": "fluency", "raw_intents": ["fluency", "fluency", "fluency"]}, {"type": "R", "before": "(2013). \"Why the Lean Start-Up Changes Everything. \"", "after": "2013. 
Why the lean start-up changes everything.", "start_char_pos": 2264, "end_char_pos": 2316, "major_intent": "others", "raw_intents": ["fluency", "others", "others"]}, {"type": "A", "before": null, "after": ",", "start_char_pos": 2341, "end_char_pos": 2341, "major_intent": "fluency", "raw_intents": ["fluency", "fluency", "fluency"]}, {"type": "R", "before": "63-72.", "after": "6372.", "start_char_pos": 2349, "end_char_pos": 2355, "major_intent": "fluency", "raw_intents": ["fluency", "fluency", "fluency"]}, {"type": "R", "before": "discovery driven", "after": "discovery-driven", "start_char_pos": 2358, "end_char_pos": 2374, "major_intent": "fluency", "raw_intents": ["fluency", "fluency", "fluency"]}, {"type": "R", "before": "'real'", "after": "\"real\"", "start_char_pos": 2947, "end_char_pos": 2953, "major_intent": "fluency", "raw_intents": ["fluency", "fluency", "fluency"]}, {"type": "R", "before": "974996van", "after": "974996. van", "start_char_pos": 3213, "end_char_pos": 3222, "major_intent": "fluency", "raw_intents": ["others", "fluency", "fluency"]}, {"type": "R", "before": "Real Options Really Work", "after": "real options really work", "start_char_pos": 3269, "end_char_pos": 3293, "major_intent": "others", "raw_intents": ["fluency", "others", "others"]}, {"type": "R", "before": "134Dixit", "after": "134.Dixit", "start_char_pos": 3329, "end_char_pos": 3337, "major_intent": "fluency", "raw_intents": ["fluency", "fluency", "others"]}, {"type": "R", "before": "Under Uncertainty. Princeton, New Jersey", "after": "under uncertainty. Princeton", "start_char_pos": 3380, "end_char_pos": 3420, "major_intent": "clarity", "raw_intents": ["clarity", "others", "clarity"]}, {"type": "A", "before": null, "after": ".", "start_char_pos": 3450, "end_char_pos": 3450, "major_intent": "fluency", "raw_intents": ["fluency", "fluency", "fluency"]}], "sents_char_pos": [0, 197, 325, 364, 451, 489, 514, 588, 817, 932, 1040, 1188, 1413, 1447, 1491, 1788, 1951, 2031, 2271, 2314, 2355, 2814, 2962, 3083, 3103, 3175, 3261, 3368, 3398], "domain": "wiki"} |
|
{"doc_id": "29305513", "revision_depth": 1, "before_revision": "Legacy of nationalization of history Nation mythologies, histories and states An ancient Finnish Hero Illustration from Kalevala One of the most important consequences of printed texts of nationalized history was that it provided a basis for national revivals in the process of creating modern nations. Projects of national awakenings captured nationalized history and turned it into a weapon of popularization of national myths in the period of establishing history as a social scientific discipline. A shortcut to production of national mythologies that proves ancient origins of modern nations, providing them with a respectable past, was URLery of historical documents, literature and historical works that were lost for some time, and then suddenly rediscovered to the approval of an astonished grateful public. Authors of such rediscovered treasures that were in a quest for success and glory did not suspect that they were in fact builders of as yet nonexistent modern nations. Even when it was obvious that certain texts were basically invented national myths, many social groups, and even intellectuals, wanted to believe that they were authentic national epics, like the Kalevala in Finland. The nationalization of history, which had its origins more in the epics and tendentious oratory than in philosophy, sometimes grew the idea of an esprit des peuples or national spirit , and , later still , the idea that each nation had a 'mission'. Such ideas did not evolve into groups of associated individuals, but into universal spirits that is said to be able to destroy individuals and nations. Nationalization of history was an important element of national revival and creating new nation states in the 19th and the beginning of the 20th century. New nation states and their institutions had the most important role in social process of the professionalization and institutionalization of history that was additionally supporting the process of nationalization of history. The final consequence was that national history regarded the nation-state as the primary unit of historical analysis. Society and nature 250px|State entities on the former territory of Yugoslavia, 2008. Nationalization of history affects all aspects of life, from relationships with other nationalities to architecture. This is a result of the fact that nationalization of history corresponds with nationalization of nature , and the fact that reservations and hostilities toward other nations accompanied nationalism from the beginning. At the end of the 20th century there were extreme nationalistic interpretations of Balkan and Caucasus history, which became powerful weapons in ethno-territorial conflicts and accelerated disintegration of multinational states like Yugoslavia and the Soviet Union.Pakier, Strth; p. 39, \"In the Balkan and Caucasian parts of Europe, history in extreme nationalistic interpretations developed into powerful weapon in ethno-territorial conflicts and accelerated disintegration of multi-national states like Soviet Union and Yugoslavia.\" After disintegration of multinational states like Yugoslavia and Soviet Union, besides the process of renationalization of history, there is sometimes also retroactive nationalization of victims or tragedies of the people that in past lived in those states. 
According to new national historical narratives, the reason for some people were the victims of certain tragedies was because they were of a certain nationality, for example the Ukrainians of the Soviet Union.Zhurzhenko, The geopolitics of memory, \" An important consequence of the delegitimization of the Soviet historical narrative and the (re-)construction of national histories after 1991 is the retroactive nationalization of victims.... According to the new national historical narrative, they were killed by the Soviet regime because they were Ukrainians \" Nationalist discourse in Croatia presents the aftermath of Bleiburg repatriations as an event where only Croatians suffered and died just because they were Croatians , eluding the fact that many of the victims were Serbs, Montenegrins or Slovenians while many Croatians had died while fighting as collaborators against Yugoslav partisans. Denationalization of history Nationalization of history has been increasingly called into question, and one of its consequences is the emerging of processes of denationalization of history, which is the result of an intention to change the perspective of creating works about history by promoting pluralism and international standards in social sciences. In Central and Eastern Europe there are tensions between nationalization of history and the process of European integrations. That is one of reasons URLanized activities aimed toward denationalization of history. In cases when history was reinterpreted and filtered by the media and official orthodoxy there is a situation in which nationalization of history leads to its denial.", "after_revision": "Legacy Nation mythologies, histories and states An ancient Finnish Hero Illustration from Kalevala One of the most important consequences of printed texts of nationalized history was that it provided a basis for national revivals in the process of creating modern nations. Projects of national awakenings captured nationalized history and turned it into a weapon of popularization of national myths in the period of establishing history as a social scientific discipline. A shortcut to production of national mythologies that proves ancient origins of modern nations, providing them with a respectable past, was URLery of historical documents, literature and historical works that were lost for some time, and then suddenly rediscovered to the approval of an astonished grateful public. Authors of such rediscovered treasures that were in a quest for success and glory did not suspect that they were in fact builders of as yet nonexistent modern nations. Even when it was obvious that certain texts were basically invented national myths, many social groups, and even intellectuals, wanted to believe that they were authentic national epics, like the Kalevala in Finland. The nationalization of history, which had its origins more in the epics and tendentious oratory than in philosophy, sometimes grew the idea of an esprit des peuples or national spirit and later still the idea that each nation had a 'mission'. Such ideas evolved not into groups of associated individuals, but into universal spirits that is said to be able to destroy individuals and nations. The ntionalization of history was an important element of national revival and creating new nation states in the 19th and the beginning of the 20th century. 
New nation states and their institutions had the most important role in social process of the professionalization and institutionalization of history that was additionally supporting the process of nationalization of history. The final consequence was that national history regarded the nation-state as the primary unit of historical analysis. Society and nature 250px|State entities on the former territory of Yugoslavia, 2008. The nationalization of history affects all aspects of life, from relationships with other nationalities to architecture. That is a result of the fact that nationalization of history corresponds with nationalization of nature and the fact that reservations and hostilities toward other nations accompanied nationalism from the beginning. At the end of the 20th century were extreme nationalistic interpretations of Balkan and Caucasus history, which became powerful weapons in ethno-territorial conflicts and accelerated disintegration of multinational states like Yugoslavia and the Soviet Union.Pakier, Strth; p. 39, \"In the Balkan and Caucasian parts of Europe, history in extreme nationalistic interpretations developed into powerful weapon in ethno-territorial conflicts and accelerated disintegration of multi-national states like Soviet Union and Yugoslavia.\" After disintegration of multinational states like Yugoslavia and Soviet Union, besides the process of renationalization of history, there is sometimes also retroactive nationalization of victims or tragedies of the people that in past lived in those states. According to new national historical narratives, the reason for some people were the victims of certain tragedies was because they were of a certain nationality, for example the Ukrainians of the Soviet Union.Zhurzhenko, The geopolitics of memory, \" An important consequence of the delegitimization of the Soviet historical narrative and the (re-)construction of national histories after 1991 is the retroactive nationalization of victims.... According to the new national historical narrative, they were killed by the Soviet regime because they were Ukrainians . \" Nationalist discourse in Croatia presents the aftermath of Bleiburg repatriations as an event in which only Croatians suffered and died just because they were Croatians but eludes the fact that many of the victims were Serbs, Montenegrins or Slovenians , and many Croatians had died fighting as collaborators against Yugoslav partisans. Denationalization of history The nationalization of history has been increasingly called into question, and one of its consequences is the emerging of processes of denationalization of history, which is the result of an intention to change the perspective of creating works about history by promoting pluralism and international standards in social sciences. In Central and Eastern Europe there are tensions between nationalization of history and the process of European integrations. That is one of reasons URLanized activities aimed toward denationalization of history. 
If history was reinterpreted and filtered by the media and official orthodoxy , there is a situation in which the nationalization of history leads to its denial.", "edit_actions": [{"type": "D", "before": "of nationalization of history", "after": null, "start_char_pos": 7, "end_char_pos": 36, "major_intent": "clarity", "raw_intents": ["clarity", "clarity", "coherence"]}, {"type": "R", "before": ", and , later still ,", "after": "and later still", "start_char_pos": 1386, "end_char_pos": 1407, "major_intent": "coherence", "raw_intents": ["fluency", "coherence", "coherence"]}, {"type": "R", "before": "did not evolve", "after": "evolved not", "start_char_pos": 1462, "end_char_pos": 1476, "major_intent": "style", "raw_intents": ["clarity", "style", "style"]}, {"type": "R", "before": "Nationalization", "after": "The ntionalization", "start_char_pos": 1603, "end_char_pos": 1618, "major_intent": "clarity", "raw_intents": ["clarity", "fluency", "others"]}, {"type": "R", "before": "Nationalization", "after": "The nationalization", "start_char_pos": 2186, "end_char_pos": 2201, "major_intent": "fluency", "raw_intents": ["fluency", "fluency", "clarity"]}, {"type": "R", "before": "This", "after": "That", "start_char_pos": 2303, "end_char_pos": 2307, "major_intent": "clarity", "raw_intents": ["clarity", "clarity", "fluency"]}, {"type": "D", "before": ",", "after": null, "start_char_pos": 2407, "end_char_pos": 2408, "major_intent": "fluency", "raw_intents": ["others", "fluency", "fluency"]}, {"type": "D", "before": "there", "after": null, "start_char_pos": 2552, "end_char_pos": 2557, "major_intent": "clarity", "raw_intents": ["clarity", "style", "others"]}, {"type": "A", "before": null, "after": ".", "start_char_pos": 3876, "end_char_pos": 3876, "major_intent": "fluency", "raw_intents": ["fluency", "fluency", "fluency"]}, {"type": "R", "before": "where", "after": "in which", "start_char_pos": 3973, "end_char_pos": 3978, "major_intent": "clarity", "raw_intents": ["fluency", "clarity", "clarity"]}, {"type": "R", "before": ", eluding", "after": "but eludes", "start_char_pos": 4045, "end_char_pos": 4054, "major_intent": "clarity", "raw_intents": ["clarity", "coherence", "clarity"]}, {"type": "R", "before": "while", "after": ", and", "start_char_pos": 4128, "end_char_pos": 4133, "major_intent": "coherence", "raw_intents": ["fluency", "coherence", "coherence"]}, {"type": "D", "before": "while", "after": null, "start_char_pos": 4158, "end_char_pos": 4163, "major_intent": "clarity", "raw_intents": ["coherence", "clarity", "clarity"]}, {"type": "R", "before": "Nationalization", "after": "The nationalization", "start_char_pos": 4247, "end_char_pos": 4262, "major_intent": "clarity", "raw_intents": ["clarity", "clarity", "clarity"]}, {"type": "R", "before": "In cases when", "after": "If", "start_char_pos": 4786, "end_char_pos": 4799, "major_intent": "coherence", "raw_intents": ["coherence", "clarity", "coherence"]}, {"type": "A", "before": null, "after": ",", "start_char_pos": 4875, "end_char_pos": 4875, "major_intent": "fluency", "raw_intents": ["fluency", "fluency", "fluency"]}, {"type": "A", "before": null, "after": "the", "start_char_pos": 4906, "end_char_pos": 4906, "major_intent": "clarity", "raw_intents": ["fluency", "clarity", "clarity"]}], "sents_char_pos": [0, 302, 501, 816, 984, 1201, 1450, 1602, 1756, 1982, 2100, 2185, 2302, 2520, 2786, 2800, 3055, 3313, 3523, 3756, 4217, 4572, 4698, 4785], "domain": "wiki"} |
|
{"doc_id": "51488038", "revision_depth": 1, "before_revision": "The vulture and the little girl, also known as \"The Struggling Girl\", is a famous photograph by Kevin Carter which first appeared in The New York Times on 26 March 1993. It is a photograph of a frail famine-stricken boy, initially believed to be a girl, who had collapsed in the foreground with a hooded vulture eyeing him from nearby. The child was reported to be attempting to reach a United Nations feeding center about a half mile away in Ayod, Sudan (now South Sudan), in March 1993, and to have survived the incident. The picture won the Pulitzer Prize for Feature Photography award in 1994. Carter took his own life four months after winning the prize. Background The Hunger Triangle, a name URLanisations used in the 1990s for the area defined by the southern Sudan communities Kongor, Ayod, and Waat, was dependent on UNESCO and other URLanisations to fight famine. Forty percent of the area's children under 5 years old were malnourished as of January 1993, and an estimated 10 to 13 adults died of starvation daily in Ayod alone. To raise awareness of the situation, Operation Lifeline Sudan invited photojournalists and others, previously excluded from entering the country, to report on conditions. In March 1993, the government began granting visas to journalists for a 24-hour stay with severe restrictions on their travel within the country, including government supervision at all times. Joo Silva and Kevin Carter in Sudan Invitation by UN Operation Lifeline Sudan In March 1993, Robert Hadley, a former photographer and at this time the information officer for the UN Operation Lifeline Sudan, offered Joo Silva and Kevin Carter to come to Sudan and report about the famine in southern Sudan. It was an offer to go into southern Sudan with the rebels. Silva saw this as a chance to work more as a war-photographer in the future. He started the arrangements and secured assignments for the expenses of the travel. Silva told Carter about the offer and Carter was also interested in going. According to fellow war photographer Greg Marinovich, Carter saw the trip as an opportunity to fix some problems \"he felt trapped in\". To take photos in Sudan was an opportunity for a better career as freelancer, and Kevin was apparently \"on a high, motivated and enthusiastic about the trip\". To pay for the travel Carter secured some money from the Associated Press and others, but needed to borrow money from Marinovich, for commitments back at home too. Not known to Carter and Silva was all the time that the UN Operation Lifeline Sudan did have \"great difficulties in securing funding for Sudan pancake\", explains Marinovich. Marinovich wrote further: \"The UN hoped to publish the famine Without publicity to show the need, it was difficult for URLanisations to sustain funding\". About the political differences and fighting \"Joo and Kevin knew none of this they just wanted to get in and shoot pictures\". Waiting in Nairobi Silva and Carter had prepared carefully for the trip. They stopped in Nairobi on their way to Sudan. The new fighting in Sudan forced them to wait in Nairobi for an unspecified period of time. In between Carter was flying with the UN for one day to Juba in south Sudan to take photos of a barge, with food aid for the region . But soon the situation changed again. The UN received permission from a rebel group to fly food aid to Ayod. Also Rob Hadley was flying on a UN light plane in and invited Silva and Carter to fly with him to Ayod. 
In Ayod The next day their light aircraft touched down in the tiny hamlet of Ayod with the cargo aircraft landing shortly afterwards. The residents of the hamlet have been looked after by the UN aid station for some time. Greg Marinovich and Joo Silva described that in the book The Bang Bang Club, Chapter 10 \"Flies and Hungry People\". The child was already cared for before Kevin Carter and Joo Silva landed there and Carter took his famous picture. Marinovich wrote that the villagers were already waiting next to the runway to get the food as quickly as possible: \"Mothers who had joined the throng waiting for food left their children on the sandy ground nearby.\" Silva and Carter separated to take pictures of both children and adults, both the living and dead, all victims of the catastrophic famine that had arisen through the war. Carter went several times to Silva to tell him about the shocking situation he had just photographed. Witnessing the famine affected him emotionally. Silva was searching for rebel soldiers who could take him to someone in authority and when he found some soldiers Carter joined him. The soldiers did not speak English, but one was interested in Carter's watch. Carter gave him his cheap wristwatch as a gift. The soldiers became their bodyguards and followed them for their protection. To stay a week with the rebels they needed the permission of a rebel commander. Their plane was due to depart in an hour and without the permission to stay they would be forced to fly out. Again they separated and Silva went to the clinic complex to ask for the rebel commander and he was told the commander was in Kongor, South Sudan. This was good news for Silva, as \"their little UN plane was heading there next\". He left the clinic and went back to the runway, taking pictures of children and adults on his way. He came across a child lying on his face in the hot sun he took a picture. In 2011, the child's father revealed the child was actually a boy, Kong Nyong, and had been taken care of by the UN food aid station. Nyong had died four years prior, c. 2007, of \"fevers\", according to his family.", "after_revision": "The vulture and the little girl, also known as \"The Struggling Girl\", is a photograph by Kevin Carter which first appeared in The New York Times on 26 March 1993. It is a photograph of a frail famine-stricken boy, initially believed to be a girl, who had collapsed in the foreground with a hooded vulture eyeing him from nearby. The child was reported to be attempting to reach a United Nations feeding center about a half mile away in Ayod, Sudan (now South Sudan), in March 1993, and to have survived the incident. The picture won the Pulitzer Prize for Feature Photography award in 1994. Carter took his own life four months after winning the prize. Background The Hunger Triangle, a name URLanisations used in the 1990s for the area defined by the southern Sudan communities Kongor, Ayod, and Waat, was dependent on UNESCO and other URLanisations to fight famine. Forty percent of the area's children under five years old were malnourished as of January 1993, and an estimated 10 to 13 adults died of starvation daily in Ayod alone. To raise awareness of the situation, Operation Lifeline Sudan invited photojournalists and others, previously excluded from entering the country, to report on conditions. 
In March 1993, the government began granting visas to journalists for a 24-hour stay with severe restrictions on their travel within the country, including government supervision at all times. Silva and Carter in Sudan Invitation by UN Operation Lifeline Sudan In March 1993, Robert Hadley, a former photographer and at this time the information officer for the UN Operation Lifeline Sudan, invited Joo Silva and Kevin Carter to come to Sudan and report on the famine in the south of the country, travelling into southern Sudan with the rebels. Silva saw this as a chance to work more as a war photographer in the future. He started the arrangements and secured assignments for the expenses of the travel. Silva told Carter about the offer and Carter was also interested in going. According to fellow war photographer Greg Marinovich, Carter saw the trip as an opportunity to fix some problems \"he felt trapped in\". To take photos in Sudan was an opportunity for a better career as freelancer, and Carter was apparently \"on a high, motivated and enthusiastic about the trip\". To pay for the travel , Carter secured some money from the Associated Press and others, but needed to borrow money from Marinovich, for commitments back at home too. Not known to Carter and Silva was all the time that the UN Operation Lifeline Sudan did have \"great difficulties in securing funding for Sudan pancake\", explains Marinovich. Marinovich wrote further: \"The UN hoped to publish the famine Without publicity to show the need, it was difficult for URLanisations to sustain funding\". About the political differences and fighting \"Joo and Kevin knew none of this they just wanted to get in and shoot pictures\". Waiting in Nairobi Silva and Carter stopped in Nairobi on their way to Sudan. The new fighting in Sudan forced them to wait there for an unspecified period of time. Carter flew with the UN for one day to Juba in south Sudan to take photos of a barge, with food aid for the region , but soon the situation changed again. The UN received permission from a rebel group to fly food aid to Ayod. Rob Hadley was flying on a UN light plane in and invited Silva and Carter to fly with him to Ayod. In Ayod The next day , their light aircraft touched down in the tiny hamlet of Ayod with the cargo aircraft landing shortly afterwards. The residents of the hamlet had been looked after by the UN aid station for some time. Greg Marinovich and Joo Silva described that in the book The Bang Bang Club, Chapter 10 \"Flies and Hungry People\". The child was already cared for before Kevin Carter and Joo Silva landed there and Carter took the picture. Marinovich wrote that the villagers were already waiting next to the runway to get the food as quickly as possible: \"Mothers who had joined the throng waiting for food left their children on the sandy ground nearby.\" Silva and Carter separated to take pictures of both children and adults, both the living and dead, all victims of the catastrophic famine that had arisen through the war. Carter went several times to Silva to tell him about the shocking situation he had just photographed. Witnessing the famine affected him emotionally. Silva was searching for rebel soldiers who could take him to someone in authority and when he found some soldiers Carter joined him. The soldiers did not speak English, but one was interested in Carter's watch. Carter gave him his cheap wristwatch as a gift. The soldiers became their bodyguards and followed them for their protection. 
To stay a week with the rebels they needed the permission of a rebel commander. Their plane was due to depart in an hour and without the permission to stay they would be forced to fly out. Again they separated and Silva went to the clinic complex to ask for the rebel commander and he was told the commander was in Kongor, South Sudan. This was good news for Silva, as \"their little UN plane was heading there next\". He left the clinic and went back to the runway, taking pictures of children and adults on his way. He came across a child lying on his face in the hot sun , and took a picture. In 2011, the child's father revealed the child was actually a boy, Kong Nyong, and had been taken care of by the UN food aid station. Nyong had died in about 2007, of \"fevers\", according to his family.", "edit_actions": [{"type": "D", "before": "famous", "after": null, "start_char_pos": 75, "end_char_pos": 81, "major_intent": "clarity", "raw_intents": ["clarity", "clarity", "clarity"]}, {"type": "R", "before": "5", "after": "five", "start_char_pos": 918, "end_char_pos": 919, "major_intent": "clarity", "raw_intents": ["clarity", "fluency", "clarity"]}, {"type": "R", "before": "Joo Silva and Kevin", "after": "Silva and", "start_char_pos": 1405, "end_char_pos": 1424, "major_intent": "clarity", "raw_intents": ["coherence", "clarity", "clarity"]}, {"type": "R", "before": "offered", "after": "invited", "start_char_pos": 1613, "end_char_pos": 1620, "major_intent": "clarity", "raw_intents": ["clarity", "clarity", "clarity"]}, {"type": "R", "before": "about", "after": "on", "start_char_pos": 1676, "end_char_pos": 1681, "major_intent": "fluency", "raw_intents": ["fluency", "clarity", "fluency"]}, {"type": "R", "before": "southern Sudan. It was an offer to go", "after": "the south of the country, travelling", "start_char_pos": 1696, "end_char_pos": 1733, "major_intent": "clarity", "raw_intents": ["clarity", "coherence", "clarity"]}, {"type": "R", "before": "war-photographer", "after": "war photographer", "start_char_pos": 1816, "end_char_pos": 1832, "major_intent": "fluency", "raw_intents": ["fluency", "fluency", "fluency"]}, {"type": "R", "before": "Kevin", "after": "Carter", "start_char_pos": 2224, "end_char_pos": 2229, "major_intent": "meaning-changed", "raw_intents": ["clarity", "meaning-changed", "meaning-changed"]}, {"type": "A", "before": null, "after": ",", "start_char_pos": 2323, "end_char_pos": 2323, "major_intent": "fluency", "raw_intents": ["fluency", "fluency", "fluency"]}, {"type": "D", "before": "had prepared carefully for the trip. They", "after": null, "start_char_pos": 2956, "end_char_pos": 2997, "major_intent": "coherence", "raw_intents": ["coherence", "clarity", "coherence"]}, {"type": "R", "before": "in Nairobi", "after": "there", "start_char_pos": 3086, "end_char_pos": 3096, "major_intent": "clarity", "raw_intents": ["clarity", "clarity", "clarity"]}, {"type": "R", "before": "In between Carter was flying", "after": "Carter flew", "start_char_pos": 3132, "end_char_pos": 3160, "major_intent": "clarity", "raw_intents": ["clarity", "clarity", "coherence"]}, {"type": "R", "before": ". 
But", "after": ", but", "start_char_pos": 3264, "end_char_pos": 3269, "major_intent": "fluency", "raw_intents": ["fluency", "coherence", "fluency"]}, {"type": "D", "before": "Also", "after": null, "start_char_pos": 3375, "end_char_pos": 3379, "major_intent": "coherence", "raw_intents": ["coherence", "coherence", "clarity"]}, {"type": "A", "before": null, "after": ",", "start_char_pos": 3500, "end_char_pos": 3500, "major_intent": "fluency", "raw_intents": ["fluency", "fluency", "fluency"]}, {"type": "R", "before": "have", "after": "had", "start_char_pos": 3642, "end_char_pos": 3646, "major_intent": "clarity", "raw_intents": ["clarity", "clarity", "fluency"]}, {"type": "R", "before": "his famous", "after": "the", "start_char_pos": 3912, "end_char_pos": 3922, "major_intent": "clarity", "raw_intents": ["clarity", "clarity", "clarity"]}, {"type": "R", "before": "he", "after": ", and", "start_char_pos": 5378, "end_char_pos": 5380, "major_intent": "coherence", "raw_intents": ["fluency", "coherence", "coherence"]}, {"type": "R", "before": "four years prior, c.", "after": "in about", "start_char_pos": 5546, "end_char_pos": 5566, "major_intent": "clarity", "raw_intents": ["clarity", "clarity", "coherence"]}], "sents_char_pos": [0, 169, 335, 523, 597, 659, 874, 1040, 1211, 1404, 1711, 1770, 1847, 1931, 2006, 2141, 2300, 2465, 2639, 2793, 2919, 2992, 3039, 3131, 3265, 3303, 3374, 3478, 3613, 3701, 3816, 3931, 4148, 4319, 4421, 4469, 4602, 4680, 4728, 4805, 4885, 4994, 5141, 5222, 5321, 5396, 5530], "domain": "wiki"} |
|
{"doc_id": "5467330", "revision_depth": 1, "before_revision": "Herbert Richard Baumeister (April 7, 1947 July 3, 1996) was an American suspected serial killer. A resident of the Indianapolis suburb of Westfield, Indiana, he was under investigation for murdering over a dozen men in the early 1990s, most of whom were last seen at gay bars. Police found the remains of eleven persons, eight identified, on Baumeister's property. After an arrest warrant was issued, Baumeister fled to Canada and killed himself before he could be brought to trial. He never confessed to the crimes and his suicide note made no mention of the murder allegations. He was later linked to a series of murders of at least nine men along Interstate 70, which occurred in the early to mid-1980s. Early life Baumeister was born in Indianapolis, Indiana, the oldest of four children born to Herbert and Elizabeth Baumeister. His childhood was reportedly normal . By the onset of adolescence, he began exhibiting anti-social behavior . Acquaintances later recalled the young Baumeister playing with dead animals and urinating on a teacher's desk , which are textbook signs of a burgeoning serial killer . In his teens, he was diagnosed with schizophrenia , but did not receive further psychiatric treatment. In 1965, Baumeister attended Indiana University for a semester before dropping out, but returned in 1967. In 1972, he attended a semester at Butler University. As an adult, he drifted through a series of jobs, marked by a strong work ethic , but also by increasingly bizarre behavior. Baumeister married Juliana \"Julie\" Saiter in November 1971, a union that produced three children. Julie later said they had been sexually intimate only six times in over 25 years of marriage. In the 1970s, Baumeister was committed to a psychiatric hospital by his father . His wife said he was \"hurting and needed help.\" Baumeister founded the successful Sav-A-Lot thrift store chain (two stores total) in Indianapolis in 1988. Investigation By the early 1990s, investigators with the Marion County Sheriff's Department and the Indianapolis Police Department began investigating the disappearances of gay men of similar age, height, and weight in the Indianapolis area. In 1992, they were contacted by a man named Tony Harris claiming that a gay bar patron calling himself \"Brian Smart\" had killed a friend of his, and had attempted to kill him with a pool hose during an erotic asphyxiation session. Harris eventually saw this man again in August 1995, following him and noting a license plate number. From this data, police identified \"Brian Smart\" as Herb Baumeister. Investigators approached Baumeister, told him he was a suspect in the disappearances, and asked to search his house. Both Baumeister and his wife, Julie, refused to allow a search of their house . By June 1996, however, Julie had become sufficiently frightened by her husband's mood swings and erratic behavior that, after filing for divorce, she consented to a search. The search of the estate, Fox Hollow Farm, was conducted while Baumeister was on vacation. It turned up the remains of eleven men, eight of whom were identified. Baumeister would posthumously be suspected of killing nine other men, the bodies of whom were found in rural areas along the corridor of Interstate 70 between Indianapolis and Columbus, Ohio during the early to mid 1980s. One eyewitness identified Baumeister as the man seen leaving a bar in 1983 with Michael Riley, who was later found dead. 
Like the other victims, Riley was strangled to death and deposited nude or semi-nude in a river.", "after_revision": "Herbert Richard Baumeister (April 7, 1947 July 3, 1996) was an American businessman and suspected serial killer. A resident of the Indianapolis suburb of Westfield, Indiana, Baumeister was under investigation for murdering over a dozen men in the early 1990s, most of whom were last seen at gay bars. Police found the remains of eleven persons, eight identified, on Baumeister's property. After an arrest warrant was issued, Baumeister fled to Canada and committed suicide before he could be brought to trial. He never confessed to the crimes and his suicide note made no mention of the murder allegations. Baumeister was later linked to a series of murders of at least nine men along Interstate 70, which occurred in the early to mid-1980s. Early life Herbert Baumeister was born in Indianapolis, Indiana, the oldest of four children born to Herbert and Elizabeth Baumeister. His childhood was reportedly normal but he began exhibiting anti-social behavior by the onset of adolescence, playing with dead animals and urinating on a teacher's desk . In his teens, Baumeister was diagnosed with schizophrenia but did not receive further psychiatric treatment. In 1965, Baumeister attended Indiana University for a semester before dropping out, but returned in 1967. In 1972, he attended a semester at Butler University. As an adult, Baumeister drifted through a series of jobs, marked by a strong work ethic but also by increasingly bizarre behavior. Baumeister married Juliana \"Julie\" Saiter in November 1971, a union that produced three children. Julie later said they had been sexually intimate only six times in over 25 years of marriage. In the 1970s, Baumeister was committed to a psychiatric hospital by his father ; his wife said he was \"hurting and needed help.\" Baumeister founded the successful Sav-A-Lot thrift store chain (two stores total) in Indianapolis in 1988. Investigation By the early 1990s, investigators with the Marion County Sheriff's Department and the Indianapolis Police Department began investigating the disappearances of gay men of similar age, height, and weight in the Indianapolis area. In 1992, they were contacted by a man named Tony Harris claiming that a gay bar patron calling himself \"Brian Smart\" had killed a friend of his, and had attempted to kill him with a pool hose during an erotic asphyxiation session. Harris eventually saw this man again in August 1995, following his car and noting his license plate number. From this data, police identified \"Brian Smart\" as Herb Baumeister. Investigators approached Baumeister, told him he was a suspect in the disappearances, and asked to search his house. Both Baumeister and his wife, Julie, refused to allow a search of their property . By June 1996, however, Julie had become sufficiently frightened by her husband's erratic behavior that, after filing for divorce, she consented to a search. The search of the estate, Fox Hollow Farm, was conducted while Baumeister was on vacation. It turned up the remains of eleven men, eight of whom were identified. Baumeister would posthumously be suspected of killing nine other men, the bodies of whom were found in rural areas along the corridor of Interstate 70 between Indianapolis and Columbus, Ohio , during the early to mid 1980s. One eyewitness identified Baumeister as the man seen leaving a bar in 1983 with Michael Riley, who was later found dead. 
Like the other victims, Riley was strangled to death and deposited nude or semi-nude in a river.", "edit_actions": [{"type": "A", "before": null, "after": "businessman and", "start_char_pos": 72, "end_char_pos": 72, "major_intent": "meaning-changed", "raw_intents": ["meaning-changed", "meaning-changed", "meaning-changed"]}, {"type": "R", "before": "he", "after": "Baumeister", "start_char_pos": 159, "end_char_pos": 161, "major_intent": "clarity", "raw_intents": ["clarity", "fluency", "clarity"]}, {"type": "R", "before": "killed himself", "after": "committed suicide", "start_char_pos": 432, "end_char_pos": 446, "major_intent": "clarity", "raw_intents": ["clarity", "clarity", "style"]}, {"type": "R", "before": "He", "after": "Baumeister", "start_char_pos": 581, "end_char_pos": 583, "major_intent": "clarity", "raw_intents": ["clarity", "meaning-changed", "clarity"]}, {"type": "A", "before": null, "after": "Herbert", "start_char_pos": 719, "end_char_pos": 719, "major_intent": "meaning-changed", "raw_intents": ["meaning-changed", "meaning-changed", "meaning-changed"]}, {"type": "R", "before": ". By the onset of adolescence,", "after": "but", "start_char_pos": 872, "end_char_pos": 902, "major_intent": "clarity", "raw_intents": ["style", "clarity", "clarity"]}, {"type": "R", "before": ". Acquaintances later recalled the young Baumeister", "after": "by the onset of adolescence,", "start_char_pos": 944, "end_char_pos": 995, "major_intent": "style", "raw_intents": ["clarity", "style", "style"]}, {"type": "D", "before": ", which are textbook signs of a burgeoning serial killer", "after": null, "start_char_pos": 1056, "end_char_pos": 1112, "major_intent": "coherence", "raw_intents": ["coherence", "coherence", "clarity"]}, {"type": "R", "before": "he", "after": "Baumeister", "start_char_pos": 1129, "end_char_pos": 1131, "major_intent": "clarity", "raw_intents": ["meaning-changed", "clarity", "clarity"]}, {"type": "D", "before": ",", "after": null, "start_char_pos": 1165, "end_char_pos": 1166, "major_intent": "fluency", "raw_intents": ["fluency", "fluency", "fluency"]}, {"type": "R", "before": "he", "after": "Baumeister", "start_char_pos": 1391, "end_char_pos": 1393, "major_intent": "clarity", "raw_intents": ["clarity", "clarity", "meaning-changed"]}, {"type": "D", "before": ",", "after": null, "start_char_pos": 1458, "end_char_pos": 1459, "major_intent": "fluency", "raw_intents": ["fluency", "fluency", "fluency"]}, {"type": "R", "before": ". 
His", "after": "; his", "start_char_pos": 1774, "end_char_pos": 1779, "major_intent": "fluency", "raw_intents": ["fluency", "fluency", "coherence"]}, {"type": "R", "before": "him and noting a", "after": "his car and noting his", "start_char_pos": 2467, "end_char_pos": 2483, "major_intent": "clarity", "raw_intents": ["clarity", "others", "clarity"]}, {"type": "R", "before": "house", "after": "property", "start_char_pos": 2763, "end_char_pos": 2768, "major_intent": "clarity", "raw_intents": ["clarity", "others", "style"]}, {"type": "D", "before": "mood swings and", "after": null, "start_char_pos": 2852, "end_char_pos": 2867, "major_intent": "clarity", "raw_intents": ["clarity", "clarity", "clarity"]}, {"type": "A", "before": null, "after": ",", "start_char_pos": 3297, "end_char_pos": 3297, "major_intent": "fluency", "raw_intents": ["fluency", "fluency", "fluency"]}], "sents_char_pos": [0, 97, 277, 365, 483, 580, 707, 835, 873, 945, 1114, 1217, 1323, 1377, 1502, 1600, 1694, 1775, 1930, 2172, 2403, 2505, 2573, 2690, 2770, 2943, 3034, 3105, 3328, 3449], "domain": "wiki"} |
|
{"doc_id": "5929171", "revision_depth": 1, "before_revision": "Singapore in the Straits Settlements refers to a period in the history of Singapore from 1826 to 1942, during which Singapore was part of the Straits Settlements together with Penang and Malacca. Previously a residency, or subdivision, of the Presidency of Bengal (officially the Presidency of Fort William), in 1867 the Straits Settlements became a separate crown colony, mainly due to ethnic and linguistic differences, and became directly overseen by the Colonial Office in Whitehall in London. The period saw Singapore establish itself as an important trading port and developed into a major city with a rapid increase in population. British rule was suspended in February 1942, when the Imperial Japanese Army of the Japanese Empire had invaded Singapore during the Pacific Front of World War II and renamed it Syonan-to . After the war, Singapore was returned to British rule. It subsequently separated from the Straits Settlements itself, and became its own standalone crown colony with its own governor. Beginning of British rule in Singapore An 1888 German map of Singapore. In 1819, British official Stamford Raffles landed in Singapore to establish a trading port. The island's status as a British outpost was initially in doubt, as the Dutch government soon issued bitter protests to the British government, arguing that their sphere of influence had been violated. The British government and the East India Company were initially worried about the potential liability of this new outpost, but that was soon overshadowed by Singapore's rapid growth as an important trading post. By 1822, it was made clear to the Dutch that the British had no intention of giving up the island. In these early decades, the island was riddled with opium houses and prostitution, and came to be widely monikered as \"Sin-galore\" . IT Figures S4 - Toggle, 28 December 2015", "after_revision": "Singapore in the Straits Settlements refers to a period in the history of Singapore from 1826 to 1942, during which Singapore was part of the Straits Settlements together with Penang and Malacca. From 1830 to 1867, the Straits Settlements was a residency, or subdivision, of the Presidency of Bengal , in British India. In 1867 , the Straits Settlements became a separate Crown colony, directly overseen by the Colonial Office in Whitehall in London. The period saw Singapore establish itself as an important trading port and developed into a major city with a rapid increase in population. British rule was suspended in February 1942, when the Imperial Japanese Army invaded Singapore during World War II . Beginning of British rule in Singapore An 1888 German map of Singapore. In 1819, British official Stamford Raffles landed in Singapore to establish a trading port. The island's status as a British outpost was initially in doubt, as the Dutch government soon issued bitter protests to the British government, arguing that their sphere of influence had been violated. The British government and the East India Company were initially worried about the potential liability of this new outpost, but that was soon overshadowed by Singapore's rapid growth as an important trading post. By 1822, it was made clear to the Dutch that the British had no intention of giving up the island. 
In these early decades, the island was riddled with opium houses and prostitution, and came to be widely monikered as \"Sin-galore\" IT Figures S4 - Toggle, 28 December 2015", "edit_actions": [{"type": "R", "before": "Previously", "after": "From 1830 to 1867, the Straits Settlements was", "start_char_pos": 196, "end_char_pos": 206, "major_intent": "meaning-changed", "raw_intents": ["meaning-changed", "coherence", "meaning-changed"]}, {"type": "R", "before": "(officially the Presidency of Fort William), in", "after": ", in British India.", "start_char_pos": 264, "end_char_pos": 311, "major_intent": "clarity", "raw_intents": ["clarity", "meaning-changed", "clarity"]}, {"type": "A", "before": null, "after": "In", "start_char_pos": 312, "end_char_pos": 312, "major_intent": "coherence", "raw_intents": ["coherence", "fluency", "coherence"]}, {"type": "A", "before": null, "after": ",", "start_char_pos": 318, "end_char_pos": 318, "major_intent": "fluency", "raw_intents": ["fluency", "fluency", "fluency"]}, {"type": "R", "before": "crown colony, mainly due to ethnic and linguistic differences, and became", "after": "Crown colony,", "start_char_pos": 361, "end_char_pos": 434, "major_intent": "clarity", "raw_intents": ["clarity", "coherence", "clarity"]}, {"type": "D", "before": "of the Japanese Empire had", "after": null, "start_char_pos": 717, "end_char_pos": 743, "major_intent": "coherence", "raw_intents": ["coherence", "coherence", "clarity"]}, {"type": "D", "before": "the Pacific Front of", "after": null, "start_char_pos": 769, "end_char_pos": 789, "major_intent": "coherence", "raw_intents": ["coherence", "clarity", "coherence"]}, {"type": "D", "before": "and renamed it Syonan-to", "after": null, "start_char_pos": 803, "end_char_pos": 827, "major_intent": "coherence", "raw_intents": ["coherence", "coherence", "clarity"]}, {"type": "D", "before": "After the war, Singapore was returned to British rule. It subsequently separated from the Straits Settlements itself, and became its own standalone crown colony with its own governor.", "after": null, "start_char_pos": 830, "end_char_pos": 1013, "major_intent": "coherence", "raw_intents": ["coherence", "coherence", "coherence"]}, {"type": "D", "before": ".", "after": null, "start_char_pos": 1823, "end_char_pos": 1824, "major_intent": "fluency", "raw_intents": ["fluency", "fluency", "others"]}], "sents_char_pos": [0, 195, 499, 639, 884, 1013, 1085, 1177, 1379, 1592, 1691], "domain": "wiki"} |
|
{"doc_id": "63712897", "revision_depth": 2, "before_revision": "Joseph Mercado (, born July 2, 1972) is a Filipino statistician, professor, businessperson and university administrator who was a former dean and vice president of the Polytechnic University of the Philippines . Mercado has co-authored scientific publications involving Pharmaceutical Sciences , economic analysis, and sociological analysis and received the Distinguished Research Award from the Academic International Consortium of Indonesia for his work in 2016. At present, Mercado is a member of the Philippine Statistical Association, Philippine Association for Teacher Education (PAFTE), the Rotary Club of New Manila Heights and a Fellow of Royal Institution based in Singapore . Early life and education Mercado finished a Bachelor of Applied Statistics degree in 1993, a Master of Applied Science degree in 2000 and a Doctor of Educational Management degree in 2003 at the Polytechnic University of the Philippines. In 2011, Mercado accepted an offer to complete a PhD on Criminology at the Philippine College of Criminology. After being designated as the Vice President of Research, Extension, Planning and Development of the Polytechnic University of the Philippines, he attended an Executive Development Program held by the Commission on Higher Education (CHED) with other university administrators to further his knowledge on higher education management. Career Polytechnic University of the Philippines (20052018) In 2004, Mercado first sat as the Acting Director of PUP Ragay. Consequently, in 2006, he was appointed as the director of PUP Sto. Tomas Batangas and then designated as the Dean of the College of Science. At the same time, he was also the Assistant Director of Center for Data and Statistical Analysis. Three years later, he became the Executive Director for Branches and Campuses. Mercado then sat as the Vice President for Branches and Campuses for another three years. In late 2015, Mercado was then appointed by Emanuel de Guzman, the university's President, to be the Vice President for Research, Extension, Planning and Development. To pursue personal endeavours, Mercado resigned from office in mid-2018. He still serves as an expert to the university, specializing in Applied Statistics .", "after_revision": "Joseph Mercado (, born July 2, 1972) is a Filipino statistician, professor, businessperson and university administrator who served as dean and vice president of the Polytechnic University of the Philippines from 2006 to 2018. Mercado has co-authored scientific publications involving pharmaceutical sciences , economic analysis, and sociological analysis and received the Distinguished Research Award from the Academic International Consortium of Indonesia for his work in 2016. At present, Mercado is a member of the Philippine Statistical Association, the National Research Council of the Philippines, and the Royal Institution based in Singapore as a Fellow . Early life and education Mercado finished a Bachelor of Applied Statistics degree in 1993, a Master of Applied Science degree in 2000 and a Doctor of Educational Management degree in 2003 , all at the Polytechnic University of the Philippines. In 2011, Mercado completed Criminology PhD at the Philippine College of Criminology. After becoming a university vice president, he attended an Executive Development Program held by the Commission on Higher Education (CHED) with other university administrators to further his knowledge on higher education management. 
Career Polytechnic University of the Philippines (20052018) In 2004, Mercado first sat as the acting director of PUP Ragay. Consequently, in 2006, he was appointed as the director of PUP Sto. Tomas Batangas branch and then designated as the dean of the College of Science. At the same time, he was also the Assistant Director of Center for Data and Statistical Analysis. Three years later, he became the Executive Director for Branches and Campuses. After a short 5 month stint as Executive Director, Mercado then served as the Vice President for Branches and Campuses for three years. In late 2015, Mercado was then appointed by Emanuel de Guzman, the university's President, to be the Vice President for Research, Extension, Planning and Development. To pursue personal endeavours, Mercado resigned from office in mid-2018. However, he still serves as an expert advisor to the university, specializing in applied statistics .", "edit_actions": [{"type": "R", "before": "was a former", "after": "served as", "start_char_pos": 124, "end_char_pos": 136, "major_intent": "clarity", "raw_intents": ["clarity", "clarity", "clarity"]}, {"type": "R", "before": ".", "after": "from 2006 to 2018.", "start_char_pos": 210, "end_char_pos": 211, "major_intent": "meaning-changed", "raw_intents": ["meaning-changed", "meaning-changed", "meaning-changed"]}, {"type": "R", "before": "Pharmaceutical Sciences", "after": "pharmaceutical sciences", "start_char_pos": 270, "end_char_pos": 293, "major_intent": "fluency", "raw_intents": ["fluency", "fluency", "clarity"]}, {"type": "R", "before": "Philippine Association for Teacher Education (PAFTE), the Rotary Club of New Manila Heights and a Fellow of", "after": "the National Research Council of the Philippines, and the", "start_char_pos": 540, "end_char_pos": 647, "major_intent": "clarity", "raw_intents": ["clarity", "others", "clarity"]}, {"type": "A", "before": null, "after": "as a Fellow", "start_char_pos": 685, "end_char_pos": 685, "major_intent": "meaning-changed", "raw_intents": ["meaning-changed", "meaning-changed", "meaning-changed"]}, {"type": "A", "before": null, "after": ", all", "start_char_pos": 876, "end_char_pos": 876, "major_intent": "coherence", "raw_intents": ["coherence", "others", "coherence"]}, {"type": "R", "before": "accepted an offer to complete a PhD on Criminology", "after": "completed Criminology PhD", "start_char_pos": 944, "end_char_pos": 994, "major_intent": "style", "raw_intents": ["style", "others", "clarity"]}, {"type": "R", "before": "being designated as the Vice President of Research, Extension, Planning and Development of the Polytechnic University of the Philippines,", "after": "becoming a university vice president,", "start_char_pos": 1043, "end_char_pos": 1180, "major_intent": "clarity", "raw_intents": ["clarity", "clarity", "clarity"]}, {"type": "R", "before": "Acting Director", "after": "acting director", "start_char_pos": 1464, "end_char_pos": 1479, "major_intent": "fluency", "raw_intents": ["fluency", "others", "fluency"]}, {"type": "A", "before": null, "after": "branch", "start_char_pos": 1577, "end_char_pos": 1577, "major_intent": "coherence", "raw_intents": ["coherence", "coherence", "meaning-changed"]}, {"type": "R", "before": "Dean", "after": "dean", "start_char_pos": 1605, "end_char_pos": 1609, "major_intent": "fluency", "raw_intents": ["others", "fluency", "fluency"]}, {"type": "R", "before": "Mercado then sat", "after": "After a short 5 month stint as Executive Director, Mercado then served", "start_char_pos": 1814, "end_char_pos": 
1830, "major_intent": "meaning-changed", "raw_intents": ["meaning-changed", "meaning-changed", "meaning-changed"]}, {"type": "D", "before": "another", "after": null, "start_char_pos": 1883, "end_char_pos": 1890, "major_intent": "clarity", "raw_intents": ["clarity", "others", "coherence"]}, {"type": "R", "before": "He", "after": "However, he", "start_char_pos": 2144, "end_char_pos": 2146, "major_intent": "coherence", "raw_intents": ["coherence", "coherence", "coherence"]}, {"type": "A", "before": null, "after": "advisor", "start_char_pos": 2173, "end_char_pos": 2173, "major_intent": "clarity", "raw_intents": ["clarity", "others", "coherence"]}, {"type": "R", "before": "Applied Statistics", "after": "applied statistics", "start_char_pos": 2209, "end_char_pos": 2227, "major_intent": "others", "raw_intents": ["fluency", "others", "others"]}], "sents_char_pos": [0, 211, 464, 926, 1036, 1369, 1493, 1561, 1636, 1734, 1813, 1903, 2070, 2143], "domain": "wiki"} |
|
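The records above all share one annotation schema: a "before_revision" string, an "after_revision" string, and a list of "edit_actions" whose "start_char_pos"/"end_char_pos" offsets and "before"/"after" fields describe each change, plus "major_intent"/"raw_intents" labels and sentence boundaries in "sents_char_pos". String values in this file contain unescaped line breaks, so a single record can span several physical lines and may need those breaks escaped before standard JSON parsing. As a minimal illustrative sketch (not part of the dataset), and assuming the character offsets index into "before_revision", the Python below replays the actions from right to left to approximately reconstruct the revised text; the toy record in the usage example is hypothetical.

def apply_edit_actions(record):
    """Replay edit_actions over before_revision (assumed to be the offset base)."""
    text = record["before_revision"]
    # Apply right-to-left so earlier offsets are not shifted by later edits.
    for act in sorted(record["edit_actions"],
                      key=lambda a: a["start_char_pos"], reverse=True):
        start, end = act["start_char_pos"], act["end_char_pos"]
        replacement = act["after"] or ""  # "D" (delete) actions carry after == null
        text = text[:start] + replacement + text[end:]
    return text

# Usage sketch with a hypothetical, shortened record.
rec = {
    "before_revision": "a b c",
    "after_revision": "a X c",
    "edit_actions": [{"type": "R", "before": "b", "after": "X",
                      "start_char_pos": 2, "end_char_pos": 3}],
}
assert apply_edit_actions(rec) == rec["after_revision"]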